1. Field of the Invention
The present invention relates to a display screen control method, a graphical user interface, an information processing apparatus, an information processing method, and a program.
2. Description of the Related Art
A technique for generating groups of data located close to each other in a feature space defined by a predetermined feature quantity is called clustering. Clustering is widely used in various fields. It is also widely practiced to generate a data structure having a tree structure by further classifying, into groups, the data included in each cluster generated by clustering.
The data structure thus generated is structured such that a higher level includes lower levels. Accordingly, it is used for the purpose of searching for desired data by selecting groups one by one, in order from a coarse-grained group to a fine-grained group, and for the purpose of grouping certain data at various granularities by changing levels (for example, see Japanese Patent Application Laid-Open No. 2007-122562).
When a user searches for data classified into groups, the data are often searched for by sequentially tracing, from the top in order, a hierarchical structure formed by clustering operation. Japanese Patent Application Laid-Open No. 2007-122562 indicates that a display screen allowing a user to intuitively understand a hierarchical structure is provided to allow the user to easily execute data search.
The search method such as the one described in Japanese Patent Application Laid-Open No. 2007-122562 is effective when data to be searched for are known. However, for example, when a user wants to search for a content similar to certain content data such as a picture, it is more convenient if the user can view and search for data based on data in question.
Accordingly, applications and services for displaying a list of contents based on a specified position have been recently developed.
The above-explained application for displaying a list of contents based on a specified position is configured to display all contents on a display screen. Therefore, there is an issue in that the display screen becomes complicated.
In view of the foregoing, it is desirable to provide a display screen control method and a graphical user interface capable of providing information about contents without making a display screen complicated.
In some cases, it may be desired to classify data into groups as follows: a certain position is used as a reference, and data located closer to the reference position are divided with a fine granularity, whereas data located farther are grouped with a coarse granularity. This kind of grouping can be achieved by performing clustering operation in view of not only absolute positions of data in a feature space but also distances from a particular position to data.
However, when the amount of data is particularly large, the clustering requires a correspondingly large amount of calculation. Accordingly, when data are classified into groups according to a specified position that changes from time to time, it is necessary to execute clustering again every time a position is specified. Therefore, there is an issue in that a heavy load is imposed upon an apparatus performing the clustering operation.
Further, in view of the foregoing, it is desirable to provide an information processing apparatus, an information processing method, and a program capable of performing clustering operation for changing a cluster granularity based on a distance from a particular position in a feature space while suppressing a load necessary for the clustering.
According to an embodiment of the present invention, there is provided a display screen control method including the steps of generating a tree structure in which a leaf node represents content data associated with position information, serving as metadata, representing a location of a feature space defined based on a predetermined feature quantity, and a set of nodes whose distance in the feature space satisfies a predetermined condition is defined as a parent node of the nodes satisfying the predetermined condition, when any position information serving as a reference is specified, identifying a node in the tree structure to which the specified position information belongs, extracting, according to a position of the identified node in the tree structure, one or a plurality of nodes, other than the node to which the reference position information belongs, from among the nodes in the tree structure, and using a node extraction result obtained in the step of extracting the node to display an object corresponding to the content data at a position in a display screen according to the position information. In the step of identifying the node and in the step of extracting the node, a position corresponding to a center of the display screen is used as the position information serving as the reference. In the step of displaying the object, in a case where there is content data in which a position corresponding to the position information is out of a range displayed in the display screen, a node including the content data located out of the range is selected from the extraction result, and an object corresponding to the selected node is displayed as an object of the content data located out of the range.
In the step of displaying the object, in a case where the object corresponding to the node is displayed, a direction instruction object may be displayed together with the object corresponding to the node, the direction instruction object indicating a direction of a position corresponding to the position information associated with the node.
In the step of displaying the object, in a case where the direction instruction object is selected by user operation, the display screen may be changed so that a central position of the node corresponding to the direction instruction object or a position of the content data located at a position closest to the central position of the node is arranged in the center of the display screen.
In the step of displaying the object, a size of a region displayed in the display screen may be determined so that other nodes or content data included in the node are all displayed within the display screen.
In the step of displaying the object, the node selected from among the extraction result may be changed according to a size of a region displayed in the display screen.
Sizes of the direction instruction object and the object corresponding to the node may be determined according to a distance between the node and a position corresponding to the center of the display screen or the number of content data or other nodes included in the node.
According to another embodiment of the present invention, there is provided a graphical user interface including a display region for displaying an execution screen of an application for displaying, at a display position corresponding to position information, an object corresponding to content data associated with the position information, serving as metadata, representing a location of a feature space defined based on a predetermined feature quantity. The content data are clustered into one or a plurality of groups based on the position information in advance, and a display state of the object in the execution screen changes according to a clustering result and a distance between a position corresponding to the position information and a central position of the execution screen.
According to another embodiment of the present invention, there is provided an information processing apparatus including a tree structure generation unit that generates a tree structure in which a leaf node represents content data associated with position information, serving as metadata, representing a location of a feature space defined based on a predetermined feature quantity, and a set of nodes whose distance in the feature space satisfies a predetermined condition is defined as a parent node of the nodes satisfying the predetermined condition, and a node extraction unit that, when any position information is specified, identifies a node in the tree structure to which the specified position information belongs, and extracts, according to a position of the identified node in the tree structure, one or a plurality of nodes, other than the node to which the specified position information belongs, from among the nodes in the tree structure.
The node extraction unit preferably extracts, from among the nodes in the tree structure, all child nodes of the identified node and nodes, other than the identified node, branching from a parent node of the identified node.
The node extraction unit may newly adopt, as a target node, the parent node of the identified node, and further extract a node, other than the target node, branching from a parent node of the target node.
The node extraction unit may repeat node extraction until the target node becomes a root node.
In a case where the specified position information belongs to a plurality of nodes in the tree structure, the node extraction unit may adopt, as a node to which the specified position information belongs, a node located at a deepest position with respect to the root node from among the plurality of nodes.
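The extraction procedure described above (collect the children of the identified node, then the siblings of each node on the path up to the root) can be sketched as follows. This is an illustrative sketch only; the class and function names (`Node`, `extract_nodes`) are assumptions for illustration, not part of the claims:

```python
class Node:
    """Minimal tree node for illustration (not part of the claims)."""
    def __init__(self, name, children=()):
        self.name = name
        self.children = list(children)
        self.parent = None
        for child in self.children:
            child.parent = self

def extract_nodes(identified):
    """Extract all children of the identified node, then the siblings of
    each node on the path from the identified node up to the root."""
    extracted = list(identified.children)
    target = identified
    while target.parent is not None:   # repeat until the target is the root
        extracted.extend(n for n in target.parent.children if n is not target)
        target = target.parent
    return extracted
```

For a tree in which the root has children A, B, and C and the node B has two leaves, extracting for node B yields B's leaves followed by the siblings A and C.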
In a case where the specified position information further includes information for specifying a region in the feature space, the node extraction unit may change an extracted node according to a size of an area of the region.
The feature space may be a space representing a location on a surface of a sphere defined by a latitude and a longitude.
The feature space may be a space defined based on a feature quantity for specifying a location on a plane.
The feature space may be a space defined based on a feature quantity for specifying a time.
According to another embodiment of the present invention, there is provided an information processing method, including the steps of generating a tree structure in which a leaf node represents content data associated with position information, serving as metadata, representing a location of a feature space defined based on a predetermined feature quantity, and a set of nodes whose distance in the feature space satisfies a predetermined condition is defined as a parent node of the nodes satisfying the predetermined condition, identifying a node in the tree structure to which any specified position information belongs, and extracting, according to a position of the identified node in the tree structure, one or a plurality of nodes, other than the node to which the specified position information belongs, from among the nodes in the tree structure.
According to another embodiment of the present invention, there is provided a program for causing a computer to realize a tree structure generation function for generating a tree structure in which a leaf node represents content data associated with position information, serving as metadata, representing a location of a feature space defined based on a predetermined feature quantity, and a set of nodes whose distance in the feature space satisfies a predetermined condition is defined as a parent node of the nodes satisfying the predetermined condition, and a node extraction function for, when any position information is specified, identifying a node in the tree structure to which the specified position information belongs, and extracting, according to a position of the identified node in the tree structure, one or a plurality of nodes, other than the node to which the specified position information belongs, from among the nodes in the tree structure.
As explained above, according to the present invention, information about contents can be provided without making a display screen complicated.
Further, according to the present invention, clustering for changing a cluster granularity can be performed based on a distance from a particular position in a feature space while suppressing a load necessary for the clustering.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
The following explanation will be made in an order described below.
(1) Tree structure
(2) First Embodiment
(3) Hardware configuration of information processing apparatus according to embodiment of the present invention
First, terms relating to a tree structure used in this specification will be briefly explained with reference to
For example, as shown in
Now, attention is paid to a node “B” shown in
Whether a node is called a parent node or a child node is determined in a relative manner, and when attention is paid to a different node, the way it is called changes accordingly. For example, from the viewpoint of the leaf 3 or the leaf 4, the node B is a parent node. However, from the viewpoint of the root node, the node B is a child node.
The tree structure has a multi-level structure as shown in
A child node, other than a target node, branching from the same parent node is referred to as a sibling node. For example, when attention is paid to the node B, a node A and a node C are referred to as sibling nodes. For example, in
In the example shown in
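The terminology above can be illustrated with a small sketch. The tree (a root with children A, B, and C, where B has leaves 3 and 4) follows the example discussed above; the dictionary representation and helper names are assumptions for illustration:

```python
# Hypothetical encoding of the tree discussed above:
# the root has children A, B, and C, and the node B has leaves 3 and 4.
tree = {"root": ["A", "B", "C"], "B": ["leaf3", "leaf4"]}

def parent_of(node):
    """Return the parent of a node, or None for the root node."""
    for parent, children in tree.items():
        if node in children:
            return parent
    return None

def siblings_of(node):
    """Child nodes branching from the same parent, excluding the node itself."""
    parent = parent_of(node)
    return [] if parent is None else [n for n in tree[parent] if n != node]

# Whether B is called a parent or a child depends on the point of view:
assert parent_of("leaf3") == "B"        # from leaf 3's view, B is a parent node
assert parent_of("B") == "root"         # from the root's view, B is a child node
assert siblings_of("B") == ["A", "C"]   # A and C are sibling nodes of B
```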
First, an overview of the clustering achieved by an information processing apparatus according to the first embodiment of the present invention will be briefly explained with reference to
As explained above, in some cases, it may be desired to classify data into groups (clustering) as follows: a certain position is used as a reference, and data located closer to the reference position are divided with a fine granularity, whereas data located farther are grouped with a coarse granularity.
For example, an apparatus for displaying recommended spots around a current location on a map will be considered. In this case, spots located in proximity to the current location are displayed without being classified into groups (alternatively, they are classified into groups in such a manner that one piece of data corresponds to one group). Spots somewhat away from the current location are displayed in such a manner that they are classified into groups by municipality. Spots in faraway foreign countries are displayed in such a manner that the spots are classified into groups by country.
In the example shown in
When this kind of display is provided by the apparatus, the user can easily obtain a rough understanding of the arrangement of the displayed clusters. Therefore, if the above-explained apparatus can be realized, the convenience for the user can be improved as a result.
When it is desired that the sizes of groups vary according to distances from a specified position as in the above example, such grouping can be achieved by performing the clustering operation in view of not only the absolute positions of data in the feature space but also the distances from the particular position to the data.
However, when the amount of data is particularly large, a heavy calculation load is imposed by the clustering. Therefore, when spots are classified into groups according to the current location as in the above example, the system is forced to bear a heavy load because clustering is re-executed every time the current location changes.
In a case of clustering based on an actual current location, the reference position cannot move very fast in the real world, so the operation may be performed such that, for example, the current location is updated every minute. In contrast, when the same processing is performed in a virtual world, it is difficult to predict when and by how much the particular position changes. In this case, it is difficult to achieve such clustering.
Accordingly, in an information processing apparatus according to the present embodiment explained below, clustering is performed to generate a multi-level cluster structure having different cluster granularities, and a tree structure representing the cluster structure is generated. Further, when a certain position is specified in a feature space defining the cluster structure, the specified position and the generated cluster structure are used to extract a desired cluster from various levels. Therefore, the information processing apparatus according to the present embodiment can perform clustering for changing a cluster granularity while suppressing a load imposed on clustering, based on a distance from the particular position in the feature space.
Subsequently, a configuration of the information processing apparatus according to the first embodiment of the present invention will be explained in detail with reference to
Examples of content data handled by the information processing apparatus 10 according to the present embodiment include image contents such as still picture contents and motion picture contents, and various kinds of text information, image information, and the like which are registered to servers and the like for sharing various kinds of information with users. In addition to the above data, the information processing apparatus 10 can be applied to contents such as mails, music, schedule, electronic money use history, telephone history, content viewing history, sightseeing information, local information, news, weather forecast, and ringtone mode history.
In the explanation below, image contents such as still picture contents and motion picture contents are explained, for example. However, the information processing apparatus 10 according to the present embodiment can handle any information and content data as long as position information representing a location in a feature space is attached to the data as metadata.
Preferably, the content data and data representing various kinds of information are stored in the information processing apparatus 10. Alternatively, main data may be stored in an apparatus such as a server arranged outside the information processing apparatus 10, and metadata corresponding to the main data may be stored in the information processing apparatus 10. For example, in the explanation below, the information processing apparatus 10 stores content data and data representing various kinds of information together with metadata.
For example, as shown in
The tree structure generation unit 101 is realized with, for example, a CPU (Central Processing Unit), ROM (Read Only Memory), RAM (Random Access Memory), and the like. The tree structure generation unit 101 generates a tree structure in which a leaf node represents content data associated with position information, serving as metadata, representing a location of a feature space defined based on a predetermined feature quantity, and a set of nodes satisfying a predetermined condition in the feature space is defined as a parent node of the nodes satisfying the predetermined condition.
First, the position information associated with the content data will be explained.
The tree structure generation unit 101 according to the present embodiment assumes the feature space defined by the predetermined feature quantity, based on the predetermined feature quantity described in the metadata associated with the content data. Examples of predetermined feature quantities described in metadata include information about latitude/longitude for identifying a location where a content is generated, information about a time when a content is generated, and information about an address representing a location where a content is generated. The metadata of the above predetermined feature quantities may be stored in, for example, an Exif (Exchangeable Image File Format) tag and the like associated with the content data.
The information about latitude/longitude for identifying a location is information which can be obtained by obtaining and analyzing a GPS signal, for example. Position information such as latitude/longitude is a feature quantity for identifying a position on the surface of a spherical object, namely the earth (a position on a surface of a sphere). Accordingly, the feature space defined based on the information about the latitude/longitude is a space representing a position on the surface of the sphere called earth. Naturally, a position in this feature space can be defined by specifying each of a latitude and a longitude. Further, a distance between two positions in the feature space can be defined using a so-called ground distance.
Further, even in a case where information representing a location on the surface of the earth is used as a feature quantity, the surface of the sphere can be approximated as a flat surface when a local region is considered. Therefore, by adopting a latitude as an x coordinate and a longitude as a y coordinate, a feature space can also be defined. In this case, the feature space is a planar space (Euclidean space) defined by a two-dimensional vector such as (x, y), and a distance between two positions in the feature space can be defined by using a so-called Euclidean distance.
On the other hand, when information about a time when a content is generated is used as a feature quantity, the feature space is defined based on one-dimensional information, i.e., time. Therefore, in such case, the feature space is defined by time, i.e., scalar quantity, and a distance between two positions in the feature space is defined by a time difference.
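The three feature spaces mentioned above each come with a natural distance. The sketch below is illustrative; the haversine formula is one common way to compute a great-circle ("ground") distance, and the earth radius of 6371 km is an assumed constant:

```python
import math

def ground_distance_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on the surface of a sphere (haversine formula)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return radius_km * 2 * math.asin(math.sqrt(a))

def planar_distance(x1, y1, x2, y2):
    """Euclidean distance when a local region is approximated as a flat plane."""
    return math.hypot(x2 - x1, y2 - y1)

def time_distance(t1, t2):
    """Distance in the one-dimensional feature space defined by time."""
    return abs(t2 - t1)
```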
The tree structure generation unit 101 assumes the feature space defined using the feature quantity as described above and generates a tree structure representing a clustering result of contents according to the degree of distribution of the contents within this feature space.
The tree structure generated by the tree structure generation unit 101 has the following features.
(1) Content data correspond to leaf nodes.
(2) Data located close to each other in the feature space are included in the same node.
(3) When a node including data located close to each other is itself present close to another node, these nodes are included in the same node.
(4) Nodes in the same level are comparable in terms of node size.
Further, the tree structure generated by the tree structure generation unit 101 may have the following feature in addition to the above features (1) to (4).
(5) A region of a certain node in a feature space and a region of another node in the feature space do not overlap unless the nodes are in parent-child relationship.
For example, the tree structure generation unit 101 generates the above-explained tree structure as follows.
First, the tree structure generation unit 101 references metadata associated with content data that can be used by the information processing apparatus 10, and arranges the content data on a plane surface in the feature space, based on position information described in the metadata. It should be noted that the arrangement of these contents is purely virtual.
Subsequently, the tree structure generation unit 101 calculates distances between data for the set of content data in the plane. Subsequently, the tree structure generation unit 101 performs grouping (classification) by making a plurality of data located close to each other into groups. The grouping processing carried out by the tree structure generation unit 101 can be called clustering. Further, each group made by this grouping processing (clustering) will be referred to as a cluster.
The tree structure generation unit 101 classifies contents that can be used by the information processing apparatus 10 into a plurality of clusters by way of joining operation or separating operation of the clusters, thus generating a multi-level tree structure in which content data are represented by leaf nodes and the clusters are represented by nodes.
In the explanation below, a clustering method carried out by the tree structure generation unit 101 will be briefly explained with reference to
The clustering method carried out by the tree structure generation unit 101 according to the present embodiment is performed according to a flow shown in
First, processing for generating the internal tree will be explained.
It should be noted that the cluster c2 shown in
Each cluster generated by clustering a plurality of contents is a circular region, which has the central position (central point) and the radius of the circle as attribute values. As described above, a circular cluster region defined by a central point and a radius includes the contents which belong to the cluster.
For example, as shown in
For example, as shown in
In clustering, a distance between contents is calculated in order to obtain a distance between clusters each having only one content. For example, a distance between a position of a content belonging to the cluster c3 and a position of a content belonging to the cluster c4 is calculated in order to obtain a distance between the clusters c3 and c4.
For example, explained below is a case where at least four contents belong to the cluster c5 as shown in
In clustering, a shortest distance between the peripheries of the circles of the clusters is calculated in order to obtain a distance between clusters to which a plurality of contents belong. For example, a distance between the clusters c6 and c7 is a distance d shown in the figure. Where the radius of the cluster c6 is A2, the radius of the cluster c7 is A3, and the radius of the cluster c5 is A4, the distance d between the clusters c6 and c7 is 2(A4−A2−A3).
A method for calculating a distance between clusters used by the tree structure generation unit 101 according to the present embodiment is not limited to the above method, and may be any method such as centroid method, shortest distance method, longest distance method, inter-group average distance method, and Ward method.
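As one illustration of the shortest-periphery distance described above (the function name and the tuple representation of central points are assumptions), the distance is the gap between the two circle boundaries, clipped at zero for overlapping clusters:

```python
import math

def intercluster_distance(center1, radius1, center2, radius2):
    """Shortest distance between the peripheries of two circular clusters.

    For clusters each holding a single content (radius 0), this reduces to
    the distance between the contents themselves; overlapping circles give 0.
    """
    center_gap = math.dist(center1, center2)
    return max(0.0, center_gap - radius1 - radius2)
```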
Subsequently, a specific example of clustering processing performed by the tree structure generation unit 101 will be explained with reference to
First, the tree structure generation unit 101 references position information associated with the five contents C11 to C15, and arranges these contents on a plane in a feature space (
Likewise, the tree structure generation unit 101 performs processing to make a cluster c22 including a content C14 and a content C15, the distance between which is the second shortest among the distances between the contents, by making the content C14 and the content C15 into one group (
Subsequently, the tree structure generation unit 101 calculates the respective distances between each of the two generated clusters c21 and c22 and the remaining content C13. In the case shown in
Finally, the tree structure generation unit 101 makes the remaining two clusters c22 and c23 into one group to make a cluster c24 (
As described above, the tree structure generation unit 101 successively clusters the contents C11 to C15, thereby generating the clusters c21 to c24. Further, the tree structure generation unit 101 generates a tree structure (clustering tree diagram) based on the generated clusters c21 to c24.
When the contents C11 to C15 are treated as leaf nodes, the clusters generated by the tree structure generation unit 101 form the tree structure as shown in
As is evident from
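The successive merging of the nearest pair, as illustrated with the contents C11 to C15, is a form of agglomerative hierarchical clustering. The following is a minimal sketch under the shortest distance method named above, not the apparatus's actual implementation; the point coordinates and names are illustrative:

```python
import math
from itertools import combinations

def agglomerate(points):
    """Merge the closest pair of clusters repeatedly until one cluster
    remains, returning a nested tuple whose leaves are point indices."""
    # Each working cluster is (subtree, list of member coordinates).
    clusters = [(i, [p]) for i, p in enumerate(points)]
    while len(clusters) > 1:
        # Closest pair under the shortest distance method: the minimum
        # distance between any member of one cluster and any of the other.
        a, b = min(
            combinations(range(len(clusters)), 2),
            key=lambda ij: min(
                math.dist(p, q)
                for p in clusters[ij[0]][1]
                for q in clusters[ij[1]][1]
            ),
        )
        merged = ((clusters[a][0], clusters[b][0]),
                  clusters[a][1] + clusters[b][1])
        clusters = [c for k, c in enumerate(clusters) if k not in (a, b)]
        clusters.append(merged)
    return clusters[0][0]
```

With five points laid out like the contents C11 to C15 (two close pairs and one point in between), the two nearest pairs merge first and the resulting clusters are merged last, yielding a multi-level tree.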
The generation processing of the internal tree carried out by the tree structure generation unit 101 has been hereinabove explained using the specific example.
When the tree structure generation unit 101 terminates the generation processing of the internal tree, the tree structure generation unit 101 subsequently performs a generation processing of a cluster tree as explained below.
When the generation processing of the internal tree as shown in
For example, when there are n pieces of content data in total, the tree structure generation unit 101 sets clusters such that each piece of data belongs, as one element, to one cluster, thus generating n clusters in total. It should be noted that each cluster has a central point C and a radius r as attribute values. The initial value of the central point C is the coordinate value of the data. The initial value of the radius r is 0.
Subsequently, the tree structure generation unit 101 determines a cluster center C and a radius r such that a distance between the cluster center C and each of all the elements of the cluster is equal to or less than the radius r. Therefore, all the elements of the cluster are included in a sphere defined by the central point C and the radius r.
Subsequently, for example, the tree structure generation unit 101 determines the distances between the clusters as follows.
When a cluster k is generated by combining a cluster i and a cluster j, the tree structure generation unit 101 can calculate a distance d (i, j) between the cluster i and the cluster j using the following expressions 101 and 102.
d(i,j) = r(k)−r(i)−r(j)   (when r(k) ≧ r(i)+r(j))   (Expression 101)
d(i,j) = 0   (when r(k) < r(i)+r(j))   (Expression 102)
In the above expressions 101 and 102, r(i) represents a radius of the cluster i. As is evident from the above expressions 101 and 102, the distance d between the clusters corresponds to an increment of radius when the clusters are combined.
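Expressions 101 and 102 can be written directly as a small function (an illustrative sketch; the function name is an assumption):

```python
def cluster_distance(r_i, r_j, r_k):
    """d(i, j) per expressions 101 and 102: the increment of radius
    obtained when clusters i and j are combined into cluster k."""
    if r_k >= r_i + r_j:
        return r_k - r_i - r_j   # Expression 101
    return 0.0                   # Expression 102
```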
Subsequently, a method for calculating a central point and a radius of a combined cluster made by combining two clusters will be hereinafter briefly explained with reference to
When two clusters are combined, the tree structure generation unit 101 distinguishes the following three patterns according to the inclusion relation of the elements which belong to each cluster.
(a) m(i)⊃m(j)
(b) m(j)⊃m(i)
(c) Other than the above
It should be noted that m(i) represents a set of all elements which belong to the cluster i, and m(j) represents a set of all elements which belong to the cluster j.
The situation shown in the above (a) is a case where all the elements of the cluster j belong to the cluster i as shown in
The tree structure generation unit 101 determines the above cases (a) to (c) based on a coordinate of each central point and each radius of the cluster i and the cluster j.
For example, when a sphere having a radius r(i) and a coordinate C(i) of a central point of the cluster i includes all of the cluster j made of a sphere having a radius r(j) and a coordinate C(j) of a central point, the tree structure generation unit 101 determines that the situation (a) as shown in
In other words, in a case where r(i) ≧ r(j) + l(i,j) holds, the tree structure generation unit 101 determines that the relationship (a) is satisfied. In this example, l(i,j) is the Euclidean distance between the central points of the cluster i and the cluster j as shown in the following expression 103.
l(i,j)=|C(i)−C(j)| (Expression 103)
In this case, where the dimension of the data is dim, l(i,j) can be represented by the following expression 104, in which c(i,k) means the k-th value of the attribute representing the central position of the cluster i.
l(i,j) = √( Σ_{k=1}^{dim} (c(i,k)−c(j,k))^2 )   (Expression 104)
In a case where the situation (a) is satisfied, the tree structure generation unit 101 uses the central point and the radius of the cluster i as a central point and a radius of the combined cluster k.
Since the case (b) is obtained by swapping “i” and “j” of the case (a), the tree structure generation unit 101 can perform the same processing as the case (a).
When the situation (c) is satisfied, the tree structure generation unit 101 generates the cluster k as the smallest sphere including the sphere of the cluster i and the sphere of the cluster j as shown in
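The three cases (a) to (c) can be sketched as follows. In case (c), the smallest enclosing circle has radius (l + r(i) + r(j))/2, and its center lies on the segment joining the two central points, at distance r(k) − r(i) from C(i). The function name and tuple representation are assumptions for illustration:

```python
import math

def combine_clusters(c_i, r_i, c_j, r_j):
    """Central point and radius of the combined cluster k for cases (a)-(c)."""
    l = math.dist(c_i, c_j)          # Expression 103: l(i, j) = |C(i) - C(j)|
    if r_i >= r_j + l:               # case (a): cluster i contains cluster j
        return c_i, r_i
    if r_j >= r_i + l:               # case (b): roles of i and j swapped
        return c_j, r_j
    # Case (c): smallest sphere enclosing the spheres of clusters i and j.
    r_k = (l + r_i + r_j) / 2.0
    t = (r_k - r_i) / l              # new center sits on the segment C(i)-C(j)
    c_k = tuple(a + t * (b - a) for a, b in zip(c_i, c_j))
    return c_k, r_k
```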
By using the above-explained method, the tree structure generation unit 101 can determine the inter-cluster distance and the central point of the cluster.
The tree structure generation unit 101 adopts the central point (central position) and the radius of the cluster thus calculated as attribute values unique to the cluster constituting cluster data. The tree structure generation unit 101 uses these attribute values unique to each cluster constituting the internal tree to execute the generation processing of the cluster tree as explained below. Further, the later-explained node extraction unit 105 can easily determine whether a certain point is included in a cluster by comparing the attribute values of each cluster constituting the cluster tree with the position information corresponding to the point in question. A certain cluster region is included in the cluster region of its parent cluster, and the attribute values of the cluster (a central position and a radius) represent the range of elements included in the cluster. Therefore, the node extraction unit 105 and the display control unit 107, which are explained later, can easily associate the elements and clusters displayed on a display screen.
Subsequently, the generation processing of the cluster tree carried out by the tree structure generation unit 101 will be briefly explained with reference to
The generation processing of the cluster tree based on the internal tree will be carried out with the parameters shown in
In this case, the tree structure generation unit 101 searches the tree structure of the generated internal tree, one by one in order from the root node, and identifies nodes satisfying a condition about the first node. Then, the tree structure generation unit 101 adopts the uppermost node, satisfying the condition, of each branch including an identified node as a node of the first level. As a result, in the example shown in
Likewise, the tree structure generation unit 101 searches the tree structure of the generated internal tree, one by one in order from the root node, and identifies nodes satisfying a condition about the second node. Then, the tree structure generation unit 101 adopts the uppermost node, satisfying the condition, of each branch including an identified node as a node of the second level. As a result, in the example shown in
By performing the above processing, the tree structure generation unit 101 generates the cluster tree as shown at the right side of
When the tree structure generation unit 101 terminates generation of the cluster tree for the contents that can be used by the information processing apparatus 10, the tree structure generation unit 101 associates metadata as shown in
The cluster data are information unique to each generated cluster. For example, as shown in
The cluster ID is identification information unique to a cluster corresponding to cluster data. For example, the cluster ID includes a four-digit integer value. The cluster central position is data representing the central position of the cluster corresponding to the cluster data, and includes information for specifying a position in the feature space (for example, information representing a latitude and a longitude corresponding to the central position of the cluster). The cluster radius is data representing the radius of the cluster corresponding to the cluster data; for example, a value in units of meters (m) is recorded, in any format suitable for representing the feature quantity defining the feature space. The number of contents is data representing the number of contents included in the region of the cluster corresponding to the cluster data. The content data list is data representing IDs of contents included in the region of the cluster corresponding to the cluster data (represented as an integer value in
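As an illustrative sketch (the class name, field names, and types are assumptions, not part of the original disclosure), the cluster data described above can be modeled as a record holding these five fields:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ClusterData:
    cluster_id: str                  # e.g. a four-digit identifier such as "0042"
    center: Tuple[float, float]      # cluster central position (e.g. latitude, longitude)
    radius: float                    # cluster radius (e.g. in meters)
    num_contents: int                # number of contents in the cluster region
    content_ids: List[int] = field(default_factory=list)  # content data list (content IDs)
```

One such record would be associated with each node of the cluster tree, so that the node extraction unit 105 and the display control unit 107 can look up the region and the contained contents of any cluster.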
When the tree structure generation unit 101 terminates clustering processing, and associates cluster data with each generated cluster, the tree structure generation unit 101 stores the tree structure data and the cluster data representing the generated tree structure in the later-explained storage unit 115 and the like.
The tree structure generation unit 101 of the information processing apparatus 10 according to the present embodiment has been explained. Subsequently, the extraction condition setting unit 103 of the information processing apparatus 10 according to the present embodiment will be explained.
The extraction condition setting unit 103 is realized with, for example, a CPU, a ROM, a RAM, and the like. The extraction condition setting unit 103 sets, based on information notified by the GPS signal processing unit 113 or the input unit 111 explained later, an extraction condition which is used when the later-explained node extraction unit 105 extracts a certain node using the tree structure generated by the tree structure generation unit 101.
More specifically, the extraction condition setting unit 103 generates, based on information notified by the GPS signal processing unit 113 or the input unit 111, information about a position used as a reference when the later-explained node extraction unit 105 performs node extraction processing, and adopts the generated position information as an extraction condition.
The position information set by the extraction condition setting unit 103 corresponds to a type of a feature space set by the tree structure generation unit 101. For example, when the feature space is defined by a feature quantity representing a position on a surface of a sphere such as latitude/longitude, the extraction condition setting unit 103 sets, as an extraction condition, position information described with a feature quantity such as latitude/longitude. Alternatively, when the feature space is a planar space defined with a two-dimensional vector, the extraction condition setting unit 103 sets, as an extraction condition, position information described with a predetermined two-dimensional vector. Alternatively, when the feature space is a one-dimensional space defined with a scalar quantity such as a time, the extraction condition setting unit 103 sets, as an extraction condition, position information described with a predetermined scalar quantity.
The extraction condition setting unit 103 outputs the set position information to the later-explained node extraction unit 105.
The node extraction unit 105 is realized with, for example, a CPU, a ROM, a RAM, and the like. The node extraction unit 105 uses the tree structure generated by the tree structure generation unit 101 to extract one or a plurality of nodes from among the nodes constituting the tree structure, based on the extraction condition set by the extraction condition setting unit 103.
More specifically, when the extraction condition setting unit 103 specifies any position information as an extraction condition, the node extraction unit 105 references cluster data associated with a node of the tree structure to which the specified position information belongs, and determines to which node the specified position information belongs. Further, the node extraction unit 105 extracts one or a plurality of nodes, other than the node to which the specified position information belongs, from among the nodes in the tree structure according to the position of the specified node in the tree structure.
In this case, the node extraction unit 105 extracts (i) all child nodes of the identified node and (ii) nodes, other than the identified node, branching from a parent node of the identified node (in other words, sibling nodes) from among the nodes (i.e., clusters) in the tree structure. Further, the node extraction unit 105 adopts, as a new target node, a parent node of the identified node and a sibling node of the identified node, and further extracts a node, other than the target node, branching from a parent node of the target node (i.e., sibling node of the target node). The node extraction unit 105 repeats this node extraction processing until the target node corresponds to the root node.
In some tree structures generated by the tree structure generation unit 101, position information set by the extraction condition setting unit 103 may belong to a plurality of nodes in the tree structure (in other words, the position information belongs to a plurality of clusters). In this case, the node extraction unit 105 preferably adopts, as the node to which the specified position information belongs, the node located at the deepest position from the root node from among the plurality of nodes to which the set position information belongs.
In the explanation below, the node extraction processing carried out by the above-explained node extraction unit 105 will be explained in a more specific manner with reference to
In the explanation below, it is assumed that the feature space is a positional space on a surface of a sphere representing a position on the surface of the earth, and any position in a feature space is defined by a latitude and a longitude. It is assumed that a distance between data in the feature space is defined by a so-called ground distance as shown in
A ground distance represents a distance between two locations on a sphere, and corresponds to a length of a curve d shown in
[Numerical Expression 2]
d = cos⁻¹{sin(lat1)·sin(lat2) + cos(lat1)·cos(lat2)·cos(long2 − long1)}  (Expression 107)
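As a hedged sketch, Expression 107 can be implemented as follows. The function name, the degree-to-radian conversion, and the mean Earth radius used to convert the central angle into meters are assumptions for illustration, not part of the original text.

```python
import math

def ground_distance(lat1, long1, lat2, long2, radius=6371000.0):
    """Ground distance between two points given in degrees (Expression 107).

    Expression 107 yields the central angle d in radians; multiplying by
    the sphere radius (here an assumed mean Earth radius in meters)
    converts it to a length along the surface.
    """
    la1, la2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(long2 - long1)
    # Clamp to [-1, 1] to guard against floating-point rounding.
    cos_d = min(1.0, max(-1.0, math.sin(la1) * math.sin(la2)
                         + math.cos(la1) * math.cos(la2) * math.cos(dl)))
    return radius * math.acos(cos_d)
```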
In
As explained in
In this example, names given to clusters located from the 0th level to the 3rd level are prepared only for the purpose of explanation, and clusters generated by the tree structure generation unit 101 may not be given names characterizing regions in a real space represented by the clusters. When clusters are to be presented to users, the tree structure generation unit 101 may reference information representing addresses described in metadata of contents and various kinds of information input by users, and give specific names to the clusters.
In this example, attention is paid to leaf nodes “j”, “k”, “l” located in the 4th level as shown in
In
Cluster regions of nodes in parent-child relationship are overlapped. For example, the node “Tokyo Observation Deck” is included in the node “Tokyo”. In contrast, cluster regions of nodes other than the above are not overlapped. For example, cluster regions of nodes “Tokyo” and “Nagoya” are not overlapped. In other words, this tree structure is a tree structure having all the five features (1) to (5) of the tree structure as described above.
Processing performed by the node extraction unit 105 will be explained in a more specific manner with reference to
In such case, first, the node extraction unit 105 queries the tree structure generation unit 101 as to whether any tree structure has been generated, and obtains tree structure data about the tree structure (cluster tree) as shown in
In the example shown in
Subsequently, the node extraction unit 105 selects a node located in the lowermost level from among the nodes to which the notified position information belongs. In the example shown in
Subsequently, the node extraction unit 105 extracts the leaf node j, the leaf node k, and the leaf node l, i.e., all child nodes of the start node “Tokyo Observation Deck”. Further, the node extraction unit 105 extracts the node “Shinjuku Garden” which is a sibling node of the start node “Tokyo Observation Deck”.
Subsequently, the node extraction unit 105 adopts, as a target node, the node “Tokyo”, i.e., a parent node of the node “Tokyo Observation Deck” and the node “Shinjuku Garden”, and extracts the node “Chiba”, i.e., a sibling node of the target node “Tokyo”.
Subsequently, the node extraction unit 105 adopts, as a new target node, the node “Tokyo Metropolitan Area”, i.e., a parent node of the extracted node “Chiba” and the target node “Tokyo”, and the node “Nagoya Metropolitan Area”, i.e., a sibling node of the target node “Tokyo Metropolitan Area”.
Subsequently, the node extraction unit 105 adopts, as a new target node, the node “Japan”, i.e., a parent node of the extracted node “Nagoya Metropolitan Area” and the target node “Tokyo Metropolitan Area”. In this example, in the tree structure shown in
As a result of the extraction processing as described above, the node extraction unit 105 extracts the leaf nodes j to l, the node “Shinjuku Garden”, the node “Chiba”, and the node “Nagoya Metropolitan Area” from among the nodes in the tree structure as a result of clustering based on a specified position.
In the example shown in
Subsequently, the node extraction unit 105 extracts a node “Chiba Amusement Park” and a node “Chiba Exhibition Hall”, i.e., child nodes of the node “Chiba”. Further, the node extraction unit 105 extracts the node “Tokyo”, i.e., a sibling node of the start node “Chiba”.
Subsequently, the node extraction unit 105 adopts, as a target node, the node “Tokyo Metropolitan Area”, i.e., a parent node of the node “Tokyo” and the node “Chiba”, and extracts the node “Nagoya Metropolitan Area”, i.e., a sibling node of the target node “Tokyo Metropolitan Area”.
Subsequently, the node extraction unit 105 adopts, as a new target node, the node “Japan”, i.e., a parent node of the extracted node “Nagoya Metropolitan Area” and the target node “Tokyo Metropolitan Area”. In this example, in the tree structure shown in
As a result of the extraction processing as described above, the node extraction unit 105 extracts the node “Chiba Amusement Park”, the node “Chiba Exhibition Hall”, the node “Tokyo”, and the node “Nagoya Metropolitan Area” from among the nodes in the tree structure as a result of clustering based on a specified position.
In the example shown in
When the start node of the node extraction processing is the root node in the tree structure, the node extraction unit 105 extracts all the child nodes of the root node (in other words, all the nodes of the 1st level), and terminates the node extraction processing. Therefore, in the example shown in
In some cases, position information notified from the extraction condition setting unit 103 is not included in the root node of the tree structure obtained from the tree structure generation unit 101. In such case, the node extraction unit 105 extracts the root node of a tree structure, and terminates the processing. For example, in the tree structure shown in
Subsequently, node extraction processing will be explained with reference to
In the tree structure shown in
The node extraction unit 105 references the tree structure obtained from the tree structure generation unit 101 to recognize that there is an overlapping region of nodes without parent-child relationship. Then, the node extraction unit 105 performs the processing explained below.
First, for each branch branched from the root node, the node extraction unit 105 determines which node includes the notified position information. In the example shown in
When a plurality of nodes including the specified position information are identified, the node extraction unit 105 subsequently determines which of the plurality of nodes is located in the lowermost level, and selects the node located in the lowermost level as a start node of node extraction processing. In the example shown in
On the other hand, the example shown in
When a plurality of nodes including the specified position information are identified, the node extraction unit 105 recognizes that the node D and the node E are candidates for the start node. Subsequently, the node extraction unit 105 determines which node is located in a lower level based on the tree structure obtained from the tree structure generation unit 101. In the present example, the node extraction unit 105 recognizes that both of the two nodes belong to the same level. When the plurality of nodes serving as candidates for the start node belong to the same level as described above, the node extraction unit 105 treats each of the plurality of nodes in the same level as the start node. In the present example, the node extraction unit 105 selects the node D and the node E as the start nodes of the node extraction processing.
Subsequently, the node extraction unit 105 extracts all child nodes of the start node. In the example shown in
Subsequently, the node extraction unit 105 adopts, as a target node, a parent node of each start node, and continues node extraction. In the present example, both the parent node of the start node D and the parent node of the start node E are the node C. Therefore, the node extraction unit 105 merges these two selection states into one, adopts only the node C as the target node, and continues the processing.
The node extraction unit 105 repeats the processing until the target node no longer has any parent node. As a result, in the example shown in
Since there is only one root node in the tree structure, a plurality of selection states are ultimately combined into one in the root node.
In the above example, the processing performed by the node extraction unit 105 in a case where the extraction condition setting unit 103 specifies a point in a feature space has been described. In the explanation below, processing will be explained, where not only a position but also a region having a range in a feature space is specified.
This processing can be performed, for example, when a clustering result is displayed and the clustering depends on the current view (displayable region). For example, suppose a map with a scale displaying the entire Japan is displayed on the display screen of the display unit 109 of the information processing apparatus 10. In this example, the extraction condition setting unit 103 notifies, as an extraction condition, a region represented by a circle having a center at a certain point.
In this example, when position information notified by the extraction condition setting unit 103 is a location around the landmark “Tokyo Observation Deck” shown in
Accordingly, in order to cope with such case, the node extraction unit 105 previously stores, in the later-explained storage unit 115 and the like, a correspondence between a lower limit corresponding to a level in a tree structure as shown in
In the example shown in
For example, the following case is considered: the nodes “j”, “k”, “l” can be extracted when only a position is specified as a condition setting. In such case, when the lower limit of the level is three, the node extraction unit 105 extracts the node “Tokyo Observation Deck” instead of these three nodes.
In the above explanation, the specified region is the circle having the center at the certain point. Alternatively, this specified region may be a rectangular region represented as an oblong. In this case, half of a shorter side of the oblong or half of an average of a shorter side and a longer side may be used in place of the above-explained specified radius.
Alternatively, instead of a circular shape and a rectangular shape, any shape may be specified as a region. In this case, a square root of an area of the region (in a case of an n-dimensional space, the (1/n)-th power of a volume) may be used in place of the above-explained specified radius.
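A minimal sketch of the region-to-radius conversion described above (the dictionary layout and the function name are illustrative assumptions, not part of the original disclosure):

```python
import math

def representative_radius(region):
    """Map a specified region to a representative radius.

    `region` is a dict (hypothetical layout):
      {"shape": "circle", "radius": r},
      {"shape": "rect", "short": s, "long": l}, or
      {"shape": "any", "area": a} for an arbitrary shape.
    """
    shape = region["shape"]
    if shape == "circle":
        return region["radius"]
    if shape == "rect":
        # Half of the shorter side (half of the average of the two
        # sides may also be used, per the text above).
        return region["short"] / 2.0
    if shape == "any":
        # Square root of the area stands in for the radius.
        return math.sqrt(region["area"])
    raise ValueError(f"unknown shape: {shape}")
```

The resulting radius would then be looked up against the stored correspondence table to obtain the lower limit of the displayed level.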
In the above example, the lower limit of the displayed level is determined according to the size of the specified region. Alternatively, the upper limit of a displayed level may be determined according to the size of the specified region.
The node extraction unit 105 may automatically generate the correspondence according to the data structure, instead of previously generating a correspondence table as shown in
Even in a case where a position on the surface of the earth is represented as in the above example, the surface of the sphere can be approximated as a flat surface when data exist locally. Therefore, a two-dimensional feature plane having a latitude x and a longitude y may be considered, and a data structure (tree structure) generated by approximating a distance with a Euclidean distance may be used. Even in such case, the same results can be obtained by performing the same method as the above-explained method.
Further, the feature space may be one-dimensional time space. In such case, a position in a feature space is defined by a time, i.e., scalar quantity, and a distance between data in the feature space is defined by a time difference. By performing the above-explained processing on the feature space, grouping based on a particular time can be achieved.
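In such a one-dimensional time feature space, the inter-data distance reduces to an absolute time difference; a minimal sketch (function name is an assumption):

```python
from datetime import datetime

def time_distance(t1, t2):
    """Distance between two data points in a one-dimensional time
    feature space: the absolute time difference in seconds."""
    return abs((t2 - t1).total_seconds())
```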
For example, a case where a current time is specified as a particular time will be considered. In this case, data represent times when pictures were taken. In this case, pictures taken more recently are clustered with finer granularities, and older pictures taken in the past are clustered with coarser granularities. Therefore, the following effects can be obtained. For example, recent pictures are clustered with a granularity in units of days, and on the other hand, pictures taken several months ago are clustered with a granularity in units of months. Further, pictures taken several years ago are clustered in units of years.
As explained above, the node extraction unit 105 according to the present embodiment does not perform clustering upon structuring a tree structure every time position information is specified. Instead, the node extraction unit 105 uses a tree structure (cluster tree) previously structured based on distances between data in a feature space to extract nodes while determining which node of the tree structure the specified position information belongs to. Therefore, even when the specified position information changes from time to time, it is not necessary to re-execute clustering on every such occasion. Clustering can be performed to change a cluster granularity based on a distance from a particular position in a feature space, while a load necessary for clustering is suppressed.
The functions of the node extraction unit 105 according to the present embodiment have been hereinabove explained in detail.
Subsequently, the display control unit 107 according to the present embodiment will be explained with reference back to
The display control unit 107 is realized with, for example, a CPU, a ROM, a RAM, and the like. When the display control unit 107 receives from the later-explained input unit 111 a notification indicating that user operation for instructing viewing of clusters has been made, the display control unit 107 obtains contents stored in the later-explained storage unit 115 and the like, based on nodes extracted by the node extraction unit 105 (in other words, clusters). Thereafter, the display control unit 107 structures a view by grouping the obtained image contents based on extracted clusters, and performs display control so that the later-explained display unit 109 displays this view.
As necessary, the display control unit 107 may request the tree structure generation unit 101 or the node extraction unit 105 to transmit the tree structure data. As necessary, the display control unit 107 may request the tree structure generation unit 101 or the node extraction unit 105 to give the tree structure or a parent node, child nodes, sibling nodes of a certain node, and the like.
A display control method of the display unit 109 carried out by the display control unit 107 will be explained in detail later.
The display unit 109 is an example of a display device of the information processing apparatus 10 according to the present embodiment. The display unit 109 is a display unit for displaying an execution screen and the like of various applications and various contents that can be executed by the information processing apparatus 10. Further, the display unit 109 may display various objects used for operating execution situations of various applications, operations of various contents, and the like.
Various kinds of information are displayed in the display screen of the display unit 109 under the control of the display control unit 107. An example of a display screen displayed on the display unit 109 will be hereinafter explained in detail again.
The input unit 111 is an example of an input device of the information processing apparatus 10 according to the present embodiment. This input unit 111 is realized with, for example, a CPU, a ROM, a RAM, an input device, and the like. The input unit 111 converts user operation performed on a keyboard, a mouse, a touch panel, and the like of the information processing apparatus 10 into an electric signal corresponding to the user operation, and notifies the user operation to the extraction condition setting unit 103 and the display control unit 107. For example, when a user performs operation for specifying a location of the display screen or operation for specifying a region having a center at a certain location of the display screen, the input unit 111 generates information representing the location or the region, and outputs the information to the extraction condition setting unit 103 and the like.
The GPS signal processing unit 113 is realized with, for example, a CPU, a ROM, a RAM, a communication device, and the like. The GPS signal processing unit 113 calculates position information of a location where the information processing apparatus 10 is located (more specifically, a location where a GPS signal is received) based on a GPS signal received by a GPS receiver antenna (not shown). The GPS signal processing unit 113 outputs the calculated position information to the extraction condition setting unit 103. This calculated position information includes various kinds of metadata such as a latitude, a longitude, and an altitude.
The storage unit 115 is an example of a storage device of the information processing apparatus 10 according to the present embodiment. This storage unit 115 may store various content data of the information processing apparatus 10, metadata associated with the content data, and the like. Further, the storage unit 115 may store tree structure data corresponding to a tree structure generated by the tree structure generation unit 101. Further, the storage unit 115 may store execution data corresponding to various applications which are used by the display control unit 107 to display various kinds of information on the display unit 109. Further, this storage unit 115 may store various parameters or progress of processing that are necessary to be stored while the information processing apparatus 10 performs certain processing, and may store various kinds of databases and the like as necessary. This storage unit 115 can be freely read and written by each processing unit of the information processing apparatus 10 according to the present embodiment.
It should be noted that the information processing apparatus 10 according to the present embodiment may be any apparatus as long as it has a function of obtaining position information and a generation time of a content from the content and an attached data file. Examples of applicable apparatuses include imaging apparatuses such as a digital still camera and a digital video camera, a multimedia content viewer with a built-in storage device, a personal digital assistant capable of recording, storing, and viewing a content, a content management viewing service working in synchronization with an online map service, application software for a personal computer, a portable game terminal having a picture data management function, a mobile phone with a camera having a storage device, and a digital household electrical appliance and a game device having a storage device and a picture data management function. The effect of grouping can be obtained more significantly when the capacity of a storage device is large. However, regardless of the storage capacity, the function according to the present embodiment can be applied.
An example of functions of the information processing apparatus 10 according to the present embodiment has been hereinabove explained. Each of the above constituent elements may be made with a generally-used member and circuit, or may be made with hardware dedicated for the function of each constituent element. Alternatively, all of the functions of the constituent elements may be performed by a CPU and the like. Therefore, the used configuration may be changed as necessary in accordance with the state of the art at the time when the present embodiment is carried out.
It is possible to create a computer program for realizing the functions of the above-described information processing apparatus according to the present embodiment, and the computer program can be implemented on a personal computer and the like. Further, a computer-readable recording medium storing such computer program can be provided. Examples of recording media include a magnetic disk, an optical disk, a magneto-optical disk, and a flash memory. Further, for example, the above computer program may be distributed through a network, without using any recording medium.
In the above explanation, each node of the tree structure is a hypersphere. However, it is to be understood that each node of the tree structure is not limited to the above example. A node region of the tree structure may be represented using, for example, a method for representing a node region with an oblong (R-Tree method), a method for representing a node region with a combination of an oblong and a circle (SR-Tree method), and a method for representing a node region with a polygon.
Subsequently, an information processing method carried out by the information processing apparatus 10 according to the present embodiment (more specifically, node extraction method) will be briefly explained with reference to
It is assumed that, before the following explanation, the tree structure generation unit 101 has generated the above-explained tree structure (cluster tree) about the content data that can be used by the information processing apparatus 10, and the node extraction unit 105 has obtained tree structure data corresponding to the tree structure from the tree structure generation unit 101.
First, a node extraction method using a tree structure in which there is an overlapping region only in nodes in parent-child relationship (in other words, tree structure having all of the above-explained five features (1) to (5) of the tree structure) will be briefly explained with reference to
First, when the node extraction unit 105 receives from the extraction condition setting unit 103 position information about a position serving as a reference for node extraction processing, the node extraction unit 105 identifies which position in a feature space related to a tree structure the specified position information corresponds to (step S101). Subsequently, the node extraction unit 105 compares a region in the feature space occupied by a node in the tree structure with a position in the feature space of the specified position information, thereby determining whether the specified position is included in a node, one by one in order from the root node (step S103). Subsequently, the node extraction unit 105 selects, as a start node of node extraction processing, a node in the lowermost level including the specified position specified by the extraction condition setting unit 103 (step S105).
Subsequently, the node extraction unit 105 sets a parameter P to identification information representing the selected node (step S107). Subsequently, the node extraction unit 105 initializes a parameter C, representing nodes having been subjected to extraction processing, to empty data (null) (step S109).
Thereafter, the node extraction unit 105 repeats step S113 and step S115 explained below while the parameter P is not empty data (step S111).
In step S113, the node extraction unit 105 extracts all child nodes of the node represented in the parameter P except for those described in the parameter C while referencing the tree structure data obtained from the tree structure generation unit 101.
In step S115, the parameters are updated. In other words, the node extraction unit 105 sets the parameter C to the content currently described in the parameter P. Further, the node extraction unit 105 sets the parameter P to a parent node of the node represented in the newly set parameter C.
The node extraction unit 105 can execute the node extraction processing as illustrated in
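The loop of steps S107 to S115 can be sketched as follows; the callback-based tree representation is an assumption for illustration, not the original implementation.

```python
def extract_nodes(start, children_of, parent_of):
    """Node extraction following steps S107 to S115 (a sketch).

    `children_of(n)` returns the list of child nodes of n;
    `parent_of(n)` returns the parent node of n, or None for the root.
    """
    extracted = []
    p = start          # parameter P: current node (step S107)
    c = None           # parameter C: previously processed node (step S109)
    while p is not None:                                     # step S111
        # Step S113: all children of P except the one recorded in C.
        extracted.extend(ch for ch in children_of(p) if ch != c)
        # Step S115: record the visited node in C and climb one level.
        c, p = p, parent_of(p)
    return extracted
```

Applied to the tree of the earlier example with the start node “Tokyo Observation Deck”, this sketch yields the leaf nodes j, k, l, followed by the nodes “Shinjuku Garden”, “Chiba”, and “Nagoya Metropolitan Area”, matching the extraction result described above.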
Subsequently, a node extraction method using a tree structure in which there is an overlapping region in nodes other than nodes in parent-child relationship (in other words, one without the feature (5) of the above-explained five features of the tree structure) will be briefly explained with reference to
First, when the node extraction unit 105 receives from the extraction condition setting unit 103 position information about a position serving as a reference for node extraction processing, the node extraction unit 105 identifies which position in a feature space related to a tree structure the specified position information corresponds to (step S201). Subsequently, the node extraction unit 105 compares a region in the feature space occupied by a node in the tree structure with a position in the feature space of the specified position information, thereby determining whether the specified position is included in a node, one by one in order from the root node (step S203). Subsequently, the node extraction unit 105 selects, as a start node of node extraction processing, a node Pi in the lowermost level including the specified position specified by the extraction condition setting unit 103, and inputs the node Pi into a list L (step S205).
Subsequently, the node extraction unit 105 sets various parameters. In other words, the node extraction unit 105 sets a parameter Pi.ptr to a pointer pointing to the selected node, and sets a parameter Pi.ignore_list to empty data (step S207).
Subsequently, the node extraction unit 105 repeats step S211 to step S219 explained below while the parameter P0.ptr is not empty data (step S209).
In step S211, the node extraction unit 105 extracts all child nodes of the node represented in the parameter Pi.ptr except for those described in the parameter Pi.ignore_list while referencing the tree structure data obtained from the tree structure generation unit 101.
In step S213, the parameters are updated. In other words, the node extraction unit 105 inputs the pointer currently described in the parameter Pi.ptr to the parameter Pi.ignore_list. Further, the node extraction unit 105 sets the parameter Pi.ptr to a parent node of the node represented in Pi.ptr.
In step S215, the node extraction unit 105 determines whether the list L contains a combination of nodes Pi, Pj whose pointers Pi.ptr and Pj.ptr point to the same node. When it is determined in the determination step in step S217 that there is such a combination (i, j), the node extraction unit 105 executes the following processing. In other words, the node extraction unit 105 combines Pi.ignore_list and Pj.ignore_list to make a new Pi.ignore_list, and deletes Pj from the list L (step S219). On the other hand, when it is determined that there is no such combination (i, j), the node extraction unit 105 does not execute the processing in step S219.
The node extraction unit 105 can execute the node extraction processing as illustrated in
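The loop of steps S209 to S219 can likewise be sketched in Python. Here each entry of the list L carries a pointer and an ignore_list, and entries whose pointers coincide are merged; the `Node` class and the representation of entries as dictionaries are illustrative assumptions of this sketch.

```python
class Node:
    """Illustrative tree node; 'parent' is None for the root."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)

def extract_nodes_overlapping(start_nodes):
    """Node extraction for a tree whose non-parent-child nodes may overlap."""
    # Steps S205-S207: each entry of L holds Pi.ptr and Pi.ignore_list
    L = [{"ptr": n, "ignore": set()} for n in start_nodes]
    extracted = list(start_nodes)          # the selected start nodes themselves
    while L[0]["ptr"] is not None:         # step S209: while P0.ptr is not empty
        for entry in L:                    # step S211: children not in ignore_list
            for child in entry["ptr"].children:
                if child not in entry["ignore"]:
                    extracted.append(child)
        for entry in L:                    # step S213: record ptr, move up a level
            entry["ignore"].add(entry["ptr"])
            entry["ptr"] = entry["ptr"].parent
        merged = []                        # steps S215-S219: merge coinciding pointers
        for entry in L:
            dup = next((m for m in merged if m["ptr"] is entry["ptr"]), None)
            if dup is None:
                merged.append(entry)
            else:
                dup["ignore"] |= entry["ignore"]   # combine ignore_lists, drop Pj
        L = merged
    return extracted
```

Merging entries whose pointers meet prevents the same region of the tree from being walked twice once two upward paths converge.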
Since the node extraction method shown in
The node extraction method carried out by the information processing apparatus 10 according to the present embodiment has been hereinabove explained briefly. Subsequently, an example of a display screen of the display unit 109 and a display control method carried out by the display control unit 107 according to the present embodiment will be explained in detail with reference to
First, an example of a display screen displayed on the display unit 109 controlled by the display control unit 107 according to the present embodiment will be explained in detail with reference to
In the explanation below, the display control unit 107 executes an application for displaying objects such as thumbnails and icons corresponding to content data on a display position corresponding to position information associated with the content data. In an application explained below, objects corresponding to image contents such as still picture contents and motion picture contents are displayed using a map application for displaying a map around a specified position.
It is assumed that, before the following explanation, a tree structure (cluster tree) about contents that can be executed by the information processing apparatus 10 has been structured in advance.
When a user performs an operation to start the map application, and an operation signal corresponding to this user's operation is notified from the input unit 111 to the display control unit 107, the display control unit 107 obtains a corresponding program main body of the map application from the storage unit 115 and the like and executes the program main body. Accordingly, a map around a predetermined position is displayed in the display screen of the display unit 109. In this example, the position initially displayed in the display screen may be a current position notified by the GPS signal processing unit 113 or may be a position specified by a user and notified by the input unit 111.
In this example, when the display control unit 107 generates an execution screen by executing this map application, the display control unit 107 performs adjustment so that the position specified by the input unit 111 or the GPS signal processing unit 113 is positioned in the center of the execution screen.
On the other hand, information about the position specified by the input unit 111 or the GPS signal processing unit 113 is also notified to the node extraction unit 105 via the extraction condition setting unit 103. The node extraction unit 105 extracts one or a plurality of nodes from among nodes (clusters) included in the previously structured tree structure by performing the processing as explained above, and outputs the nodes to the display control unit 107.
In this example, when the display control unit 107 displays, on the execution screen, a list of contents that can be used by the information processing apparatus 10, the display control unit 107 changes an object of a content displayed in the display screen according to a distance between the center position of the execution screen and a position represented by position information corresponding to the content.
More specifically, when a content is included in a region displayed in the display screen as the execution screen (when the position information of the content indicates a position in the display region), the display control unit 107 displays objects such as a thumbnail image of the corresponding content. In other words, the display control unit 107 considers a cluster represented as a parent node of a leaf node corresponding to content data, and in a case where at least a portion of a cluster region is included in the region displayed as the execution screen, the display control unit 107 displays, on the display screen, a thumbnail image and the like of the corresponding content data.
In some cases, a position represented by position information corresponding to a content may not be included in a region displayed on the display screen. In such a case, the display control unit 107 uses a node (cluster) including the corresponding content among the nodes notified by the node extraction unit 105 to display an object corresponding to this cluster on the display screen. At this time, a name given to the cluster is preferably used as the object corresponding to the cluster.
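The rule described above can be sketched as a small selection function: a content inside the display region is shown as a thumbnail, and a content outside it is represented by the extracted cluster that contains it. The `Region`, `Content`, and `Cluster` classes below are illustrative assumptions, not structures defined by the present embodiment.

```python
from dataclasses import dataclass, field

@dataclass
class Region:
    """Rectangular display region in the feature (map) space."""
    x0: float
    y0: float
    x1: float
    y1: float
    def contains(self, pos):
        x, y = pos
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

@dataclass
class Content:
    name: str
    position: tuple   # position information associated with the content

@dataclass
class Cluster:
    name: str
    members: list = field(default_factory=list)

def object_for_content(content, region, extracted_clusters):
    """Thumbnail when the content's position lies in the display region;
    otherwise the extracted cluster (node) that includes the content."""
    if region.contains(content.position):
        return ("thumbnail", content.name)
    for cluster in extracted_clusters:
        if content in cluster.members:
            return ("cluster", cluster.name)
    return None
```
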
For example, explanation will be made using the example as shown in
In such a case, position information of contents corresponding to leaf nodes j to l is included in a region displayed in the display screen. Therefore, the display control unit 107 uses objects such as thumbnail images of the contents corresponding to the leaf nodes j to l to display the objects on the display screen.
On the other hand, position information of contents corresponding to leaf nodes g to i is not included in the region displayed in the display screen. Therefore, the display control unit 107 uses the node “Shinjuku Garden” extracted by the node extraction unit 105 and including these leaf nodes to display an object corresponding to this node.
Likewise, position information of contents corresponding to leaf nodes m to r is not included in the region displayed in the display screen. Therefore, the display control unit 107 uses the node “Chiba” extracted by the node extraction unit 105 and including these leaf nodes to display an object corresponding to this node.
On the other hand, position information of contents corresponding to leaf nodes a to f is not included in the region displayed in the display screen. Therefore, the display control unit 107 uses the node “Nagoya Metropolitan Area” extracted by the node extraction unit 105 and including these leaf nodes to display an object corresponding to this node.
The display control unit 107 can present, to a user, a list of contents that can be executed by the information processing apparatus 10 by performing the above display control, so that each content is displayed with a clustering granularity according to a distance from the central position of the display screen.
As shown in the figure shown in the center of
In the case of the figure shown in the center of
Further, the cluster object 303, i.e., the object representing the cluster, is arranged with a direction instruction object 305 such as an arrow as shown in
In some cases, a plurality of cluster objects 303 may be arranged in the display screen. In this case, the display control unit 107 preferably adjusts display positions of the cluster object 303 and the direction instruction object 305 in such a manner that the cluster object 303 and the direction instruction object 305 do not overlap with each other.
This direction instruction object 305 is displayed in the display screen in such a manner that the end of the direction instruction object 305 points to the central position of the corresponding cluster object 303. A drawing method of the direction instruction object 305 will be briefly explained with reference to
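The actual drawing method is illustrated in a figure that is not reproduced here. One plausible geometric sketch, under the assumption that the arrow is anchored where the line from the screen center toward the off-screen cluster center crosses the screen border, is the following; the function name and parameters are illustrative.

```python
import math

def arrow_placement(screen_w, screen_h, cluster_x, cluster_y):
    """Orientation and border anchor for a direction-instruction arrow
    pointing toward an off-screen cluster center (illustrative geometry;
    assumes the cluster center lies outside the screen rectangle)."""
    cx, cy = screen_w / 2.0, screen_h / 2.0
    dx, dy = cluster_x - cx, cluster_y - cy
    angle = math.atan2(dy, dx)                     # arrow orientation
    # scale the direction vector so the anchor lies on the screen border
    t = min((screen_w / 2.0) / abs(dx) if dx else math.inf,
            (screen_h / 2.0) / abs(dy) if dy else math.inf)
    anchor = (cx + dx * t, cy + dy * t)
    return angle, anchor
```
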
Further, as shown in
Further, as shown in
In
For example, as shown in
On the other hand, when the cluster object 303 as shown in
In this case, the display control unit 107 can use any method to determine the specific sizes of the cluster objects 303 and the direction instruction objects 305. For example, the display control unit 107 may use a function as shown in
In the function shown in
In this example, the display control unit 107 determines a display magnification rate Y according to the expression 151 and the expression 152 as follows.
As is evident from the above expressions, in a case where the distance from the center of the cluster is less than a predetermined threshold value (MIN_DIST), the display control unit 107 sets the display magnification rate to a maximum value (MAX_SCALE). In a case where the distance is equal to or more than the predetermined threshold value, the display control unit 107 sets the display magnification rate to 1/X of the maximum value.
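Although expression 151 and expression 152 themselves are not reproduced here, the behavior just described can be sketched as follows. The constant values (MIN_DIST, MAX_SCALE, X) are illustrative assumptions; only the two-branch rule is taken from the description above.

```python
MIN_DIST = 100.0    # threshold distance (illustrative units and value)
MAX_SCALE = 2.0     # maximum display magnification rate (illustrative)
X = 4.0             # attenuation divisor applied beyond the threshold

def display_scale_by_distance(dist):
    """Display magnification rate Y: the maximum value below the threshold
    (expression 151), 1/X of the maximum at or beyond it (expression 152)."""
    if dist < MIN_DIST:
        return MAX_SCALE
    return MAX_SCALE / X
```
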
Further, the display control unit 107 may determine the specific size of the cluster object 303 and the direction instruction object 305 according to the number of contents included in a cluster. In this case, the display control unit 107 may determine the specific size using the function as shown in
In the function shown in
In this example, the display control unit 107 determines a display magnification rate Y according to the expression 153 and the expression 154 as follows.
In this example, a parameter k in the above expression 153 is a coefficient determining an inclination of the function. The parameter k may be set to any value according to the environment to which this method is applied. As is evident from the above expression, in a case where the number of contents included in the cluster is one, the display control unit 107 sets the display magnification rate to a minimum value (MIN_SCALE), and changes the display magnification rate based on the above expression 153 according to an increase in the number of contents included in the cluster.
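Expression 153 and expression 154 are likewise not reproduced here. A linear growth with the number of contents is one plausible form consistent with the description (minimum value at one content, inclination governed by the coefficient k); the linear shape, the cap at a maximum value, and all constants below are assumptions of this sketch.

```python
MIN_SCALE = 0.5   # magnification for a cluster with a single content (illustrative)
MAX_SCALE = 2.0   # assumed upper cap on the magnification (illustrative)
K = 0.1           # coefficient determining the inclination (the parameter k)

def display_scale_by_count(n):
    """Display magnification rate Y growing with the number of contents n."""
    if n <= 1:
        return MIN_SCALE
    return min(MAX_SCALE, MIN_SCALE + K * (n - 1))
```
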
An example of a display screen will be explained with reference back to
Depending on the number of contents displayed in the display screen, many objects are displayed in the display screen, and the screen becomes complicated in some cases. Accordingly, when the number of objects displayed in the display screen increases, and the display control unit 107 determines that the display screen has become complicated, the display control unit 107 may further select the objects displayed on the display screen.
For example, the display control unit 107 can further select the objects according to a distance from a central position of a display screen, a size of a content, the number of contents included in a cluster, history information of a user regarding content viewing, presence/non-presence of various kinds of information associated with a content and an order thereof, and the like.
When the display screen becomes complicated due to the number of cluster objects 303 displayed therein, the display control unit 107 may combine a plurality of cluster objects 303 corresponding to clusters in the same level into one cluster object 303 and display the combined cluster object 303.
A determination as to whether the display screen has become complicated is made by any method. For example, the display control unit 107 may make a determination based on whether the number of objects displayed in the display screen is more than the predetermined threshold value.
In some cases, a user who sees the display screen selects a thumbnail image 301 of a displayed content by clicking or tapping the thumbnail image 301. In such a case, the display control unit 107 may switch the display screen in order to display metadata, such as an explanatory text, associated with the selected content. On the other hand, when the selected content is a reproducible content such as a motion picture content, the content may be reproduced.
In some cases, a user may enlarge or reduce a display region without changing a central position of a display screen. For example, when the user performs zoom-out processing, the display control unit 107 displays, on the display screen, a thumbnail image 301 of a content coming into the display region. With this processing, the sizes of the cluster objects 303 and the direction instruction objects 305 are changed according to the zoom level.
On the contrary, in some cases, a user may perform zoom-in processing. In this case, in response to the zoom-in processing, the display control unit 107 changes, from the thumbnail image 301 to the cluster object 303, the object of a content whose position indicated by the position information is no longer included in the new display region.
Further, the display control unit 107 may change a granularity of a cluster displayed as the cluster object 303 in response to enlarging/reducing processing. Accordingly, it is possible to let the user know that a large change occurs in a distance to a cluster in response to enlarging/reducing processing.
For example, when zoom-out processing is performed in the figure shown in the center of
On the other hand, when zoom-in processing is performed in the figure shown in the center of
A determination as to whether the granularities of clusters are to be changed or not can be made by any method. For example, the display control unit 107 may determine whether the granularities of clusters are to be changed or not according to the following method.
For example, as shown in
On the other hand, in some cases, a user may select a direction instruction object 305 displayed in the display screen. In this case, first, the display control unit 107 identifies which cluster corresponds to the selected direction instruction object 305. Subsequently, the display control unit 107 identifies a cluster central position of the identified cluster based on cluster data, and changes the screen so that such position is arranged in the center of the display screen. Alternatively, the display control unit 107 may change the screen so that the central position of the cluster is not arranged in the center of the display screen but a position of a content closest to the cluster central position is arranged in the center of the display screen. When the screen is changed as above, the display control unit 107 preferably determines a scale of an execution screen (for example, a map) so that all clusters (or contents) included in the new cluster are displayed within the display screen.
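The final step above, determining a scale so that all clusters (or contents) included in the new cluster fit within the display screen, can be sketched as follows. The bounding-box fit, the margin factor, and the function name are assumptions of this sketch rather than the embodiment's prescribed method.

```python
def fit_view(positions, screen_w, screen_h, margin=1.1):
    """Choose a view center and scale so all given positions fit on screen.
    'positions' is a list of (x, y) pairs; 'margin' leaves a border (assumed)."""
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    center = ((min(xs) + max(xs)) / 2.0, (min(ys) + max(ys)) / 2.0)
    span_x = max(max(xs) - min(xs), 1e-9)   # guard against a zero-size span
    span_y = max(max(ys) - min(ys), 1e-9)
    scale = min(screen_w / (span_x * margin), screen_h / (span_y * margin))
    return center, scale
```
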
Alternatively, when the screen is changed as above, the display control unit 107 may request the node extraction unit 105 to perform node extraction processing again so as to display representing images in the display screen based on newly extracted nodes. In this case, examples of representing images include an image close to a central position of a cluster, an image close to a barycenter of content distribution within a cluster, and the like.
For example, as shown in
As hereinabove explained, the display control unit 107 according to the present embodiment uses the extraction result provided by the node extraction unit 105 to cluster and display closely located contents as a clustering result, thus solving the issue of a complicated display screen. Further, the display control unit 107 displays contents on the display screen such that the closer a content is located to the specified position, the finer the granularity with which the content is displayed. Accordingly, information about contents located close to the specified position can be displayed in detail.
In the above explanation, when the display control unit 107 displays the contents in the display screen, the thumbnail images of the contents are displayed. However, the display control unit 107 may display, on the display screen, objects such as pins representing positions of contents, instead of thumbnail images of contents.
Subsequently, a flow of a display screen control method according to the present embodiment will be briefly explained with reference to
It is assumed that, before the following explanation, the tree structure generation unit 101 has generated a tree structure about contents that can be used by the information processing apparatus 10.
When a user performs operation for requesting start of a predetermined application, the display control unit 107 of the information processing apparatus 10 starts the specified application (step S301). Further, the extraction condition setting unit 103 sets an extraction condition used in node extraction processing based on various kinds of information notified by the input unit 111 or the GPS signal processing unit 113, and notifies the extraction condition to the node extraction unit 105. Subsequently, the node extraction unit 105 carries out the above-explained node extraction processing based on the notified extraction condition (step S303), and notifies the information about the extracted nodes to the display control unit 107.
Subsequently, the display control unit 107 uses the information about the extracted nodes to generate a display screen displayed on the display unit 109 (step S305), and displays the generated display screen in a predetermined region of the display unit 109.
Subsequently, the information processing apparatus 10 determines whether the user has performed a termination operation of the application (step S307). When the user has performed the termination operation, the information processing apparatus 10 terminates execution of the application.
On the other hand, when the user has not performed the termination operation, the information processing apparatus 10 determines whether the user has performed operation for changing the state of the display screen (step S309).
For example, in a case where the user has performed an operation to select a certain cluster (cluster object), the display control unit 107 generates a display screen for displaying the contents of the selected cluster (step S311), and displays it in a predetermined region of the display unit 109. Thereafter, the information processing apparatus 10 returns back to step S307 to continue processing.
In a case where the user has performed an operation for changing the display region, the display control unit 107 generates a display screen based on the changed display region (step S313), and displays it in a predetermined region of the display unit 109. Thereafter, the information processing apparatus 10 returns back to step S307 to continue processing.
In a case where the user selects a certain content, the display control unit 107 performs processing for displaying, on the display screen, information such as explanatory texts corresponding to the selected content (step S315). Thereafter, the information processing apparatus 10 returns back to step S307 to continue processing.
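The control flow of steps S301 to S315 can be sketched as a simple event loop. The `ui` object, the `Event` class, and all method names are hypothetical stand-ins bundling the units described above; they do not appear in the present embodiment.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """Hypothetical user-operation event."""
    kind: str
    target: object = None
    region: object = None

def run_display_loop(ui):
    """Steps S301-S315 as an event loop over a hypothetical 'ui' object."""
    ui.start_application()                      # step S301: start the application
    condition = ui.set_extraction_condition()   # extraction condition setting
    nodes = ui.extract_nodes(condition)         # step S303: node extraction
    ui.render(nodes)                            # step S305: generate the screen
    while True:
        event = ui.next_event()
        if event.kind == "terminate":           # step S307: termination operation
            break
        elif event.kind == "select_cluster":    # step S311: cluster selected
            ui.render_cluster(event.target)
        elif event.kind == "change_region":     # step S313: display region changed
            ui.render_region(event.region)
        elif event.kind == "select_content":    # step S315: content selected
            ui.show_content_info(event.target)
```
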
By performing the above processing, the information processing apparatus 10 according to the present embodiment can display contents on the display screen so as not to make the display screen complicated.
Next, the hardware configuration of the information processing apparatus 10 according to the embodiment of the present invention will be described in detail with reference to
The information processing apparatus 10 mainly includes a CPU 901, a ROM 903, and a RAM 905. Furthermore, the information processing apparatus 10 also includes a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
The CPU 901 serves as an arithmetic processing apparatus and a control device, and controls the overall operation or a part of the operation of the information processing apparatus 10 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927. The ROM 903 stores programs, operation parameters, and the like used by the CPU 901. The RAM 905 primarily stores programs used in execution of the CPU 901 and parameters and the like varying as appropriate during the execution. These are connected with each other via the host bus 907 configured from an internal bus such as a CPU bus or the like.
The host bus 907 is connected to the external bus 911 such as a PCI (Peripheral Component Interconnect/Interface) bus via the bridge 909.
The input device 915 is an operation means operated by a user, such as a mouse, a keyboard, a touch panel, buttons, a switch, and a lever. Also, the input device 915 may be a remote control means (a so-called remote control) using, for example, infrared light or other radio waves, or may be an externally connected device 929 such as a mobile phone or a PDA conforming to the operation of the information processing apparatus 10. Furthermore, the input device 915 generates an input signal based on, for example, information which is input by a user with the above operation means, and is configured from an input control circuit for outputting the input signal to the CPU 901. The user of the information processing apparatus 10 can input various data to the information processing apparatus 10 and can instruct the information processing apparatus 10 to perform processing by operating this input device 915.
The output device 917 is configured from a device capable of visually or audibly notifying acquired information to a user. Examples of such a device include display devices such as a CRT display device, a liquid crystal display device, a plasma display device, an EL display device and lamps, audio output devices such as a speaker and a headphone, a printer, a mobile phone, a facsimile machine, and the like. For example, the output device 917 outputs results obtained by various processes performed by the information processing apparatus 10. More specifically, the display device displays, in the form of texts or images, a result obtained by various processes performed by the information processing apparatus 10. On the other hand, the audio output device converts an audio signal such as reproduced audio data and sound data into an analog signal, and outputs the analog signal.
The storage device 919 is a device for storing data configured as an example of a storage unit of the information processing apparatus 10 and is used to store data. The storage device 919 is configured from, for example, a magnetic storage device such as a HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. This storage device 919 stores programs to be executed by the CPU 901, various data, and various data obtained from the outside.
The drive 921 is a reader/writer for recording medium, and is embedded in the information processing apparatus 10 or attached externally thereto. The drive 921 reads information recorded in the attached removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the read information to the RAM 905. Furthermore, the drive 921 can write information to the attached removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory. The removable recording medium 927 is, for example, a DVD medium, an HD-DVD medium, or a Blu-ray medium. The removable recording medium 927 may be a CompactFlash (CF; registered trademark), a flash memory, an SD memory card (Secure Digital Memory Card), or the like. Alternatively, the removable recording medium 927 may be, for example, an IC card (Integrated Circuit Card) equipped with a non-contact IC chip or an electronic appliance.
The connection port 923 is a port for allowing devices to directly connect to the information processing apparatus 10. Examples of the connection port 923 include a USB (Universal Serial Bus) port, an IEEE1394 port, a SCSI (Small Computer System Interface) port, and the like. Other examples of the connection port 923 include an RS-232C port, an optical audio terminal, an HDMI (High-Definition Multimedia Interface) port, and the like. By the externally connected apparatus 929 connecting to this connection port 923, the information processing apparatus 10 directly obtains various data from the externally connected apparatus 929 and provides various data to the externally connected apparatus 929.
The communication device 925 is a communication interface configured from, for example, a communication device for connecting to a communication network 931. The communication device 925 is, for example, a wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), a communication card for WUSB (Wireless USB), or the like. Alternatively, the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various communications, or the like. This communication device 925 can transmit and receive signals and the like to and from the Internet and other communication devices in accordance with a predetermined protocol such as TCP/IP, for example. The communication network 931 connected to the communication device 925 is configured from a network and the like, which is connected via wire or wirelessly, and may be, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
Heretofore, an example of the hardware configuration capable of realizing the functions of the information processing apparatus 10 according to the embodiment of the present invention has been shown. Each of the structural elements described above may be configured using a general-purpose material, or may be configured from hardware dedicated to the function of each structural element. Accordingly, the hardware configuration to be used can be changed as appropriate according to the technical level at the time of carrying out the present embodiment.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
The present application contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2009-277082 filed in the Japan Patent Office on Dec. 4, 2009 and Japanese Priority Patent Application JP 2009-277081 filed in the Japan Patent Office on Dec. 4, 2009, the entire contents of which are hereby incorporated by reference.
Number | Date | Country | Kind
---|---|---|---
P2009-277081 | Dec 2009 | JP | national
P2009-277082 | Dec 2009 | JP | national