This application is based on Japanese Patent Application No. 2009-217709 filed on Sep. 18, 2009 and No. 2010-169704 filed on Jul. 28, 2010, the contents of which are hereby incorporated by reference.
1. Field of the Invention
The present invention relates to an image display device which displays an image and an image display method.
2. Description of Related Art
In recent years, digital imaging devices, which record taken images (including moving images and still images) as data on a recording medium instead of recording them on film, have become widespread. This type of imaging device is capable of recording a large amount of image data obtained by taking images and capable of deleting any recorded image data item. The device thus allows a user to casually take as many images as he/she likes until a satisfactory image is obtained.
The user uses an image display device attached to the imaging device, typically, a viewer such as a monitor or a photo frame, to reproduce image data (to play moving images or to display still images). The user selects image data to be reproduced with the use of an operation unit constituted of, for example, a touch panel. Some image display devices facilitate the selection process by displaying images corresponding to image data items (hereinafter referred to as corresponding images) such as thumbnail images.
However, clearly displaying a large number of corresponding images at once is difficult because of the limited display screen size of the attached image display device. The user therefore needs to operate the operation unit to switch displayed corresponding images one after another until the corresponding image of desired image data is displayed.
When the number of image data items is large in this case, the user may need to perform the switching many times to find and select desired image data. Specifically, the user may need to go forward (to display the next corresponding image), or go backward (to display the preceding corresponding image), through corresponding images many times. This makes the operation of selecting desired image data laborious, which is a problem.
In the case where there is a plurality of image data items obtained by taking images from the same angle, in the same location, at the same time of day, or the like, and the user decides that the desired image data is not among this set of image data items, the user will feel no need to examine the corresponding images of this set carefully. The user is then likely to fast forward or fast back (to go forward or backward through many corresponding images in one operation) through the set of image data items in the hope of quickly finding and selecting the desired image data. The user may then accidentally go past the displayed corresponding image of the desired image data (“forward overshoot” or “backward overshoot”) and take longer to select the desired image data, which is another problem.
To address the problems, an image display device has been proposed which allows a user to select desired image data quickly in just a few operations by displaying reduced images of image data items that resemble a selected image data item in the order of similarity, and subsequently displaying only reduced images of image data items relevant to an image data item that is selected from among the first displayed reduced images.
In another image display device that has been proposed, a representative image data item is determined for each group of a plurality of image data items, and reduced images of the determined image data items are displayed side by side. When a user selects an image data item that is the representative of one group, this image display device displays reduced images of image data items that belong to the same group as the selected image data item.
A drawback of the former image display device is that searching for an image data item to which only a few image data items are similar or relevant is difficult because their reduced images cannot be displayed preferentially. A drawback of the latter image display device is that an image data item is difficult to search for when it is unknown to which group the image data item belongs. A user trying to select such an image data item has no choice but to search for the image to be selected by going forward or backward through many corresponding images in the same way as in conventional devices. The problem of laborious operation and the problem of prolonged image data selection due to “forward overshoot” or “backward overshoot” arise as a result.
According to the present invention, there is provided an image display device, including:
a display unit which displays at least one of corresponding images which are in a given order;
an input unit to which a switching instruction is input to switch the at least one corresponding image displayed on the display unit in the given order; and
a switching control unit which switches the at least one corresponding image displayed on the display unit in accordance with the switching instruction,
in which, when the at least one corresponding image displayed on the display unit does not have a close correlation with at least a corresponding image to be displayed next, the switching control unit switches the corresponding images by a first switching amount, which is determined based on the switching instruction, and
in which, when the correlation is close, the switching control unit switches the corresponding images by a second switching amount, which is determined by a method different from that of the first switching amount.
According to the present invention, there is also provided an image display method, including:
a first step of displaying at least one of corresponding images which are in a given order;
a second step of inputting a switching instruction which switches the at least one corresponding image displayed in the first step in the given order; and
a third step of switching the at least one corresponding image displayed in the first step,
in which, when the at least one corresponding image displayed in the first step does not have a close correlation with at least a corresponding image to be displayed next, the corresponding images are switched in the third step by a first switching amount, which is determined based on the switching instruction input in the second step, and
in which, when the correlation is close, the corresponding images are switched in the third step by a second switching amount, which is determined by a method different from that of the first switching amount.
The significance and effects of the present invention are clarified by the following description of an embodiment. However, the embodiment given below is merely one of possible embodiments of the present invention, and terms describing the present invention and its components are not limited to the meaning written in the following embodiment.
<<Overview of an Image Display Device>>
An embodiment of the present invention is described below with reference to the drawings. First, an overview of an image display device according to the present invention is given with reference to
As illustrated in
The image display device 1 causes the user to select an image data item to be reproduced from among image data items recorded in the image recording unit 2. To this end, the display control unit 5 causes the display unit 4 to display at least one corresponding image of an image data item (for example, a thumbnail image attached as one of the contents of the image data item, or an image obtained by the display control unit 5 by adjusting the image data item (e.g., a reduced image of a still image or a reduced image of one frame contained in a moving image)).
Image data items recorded in the image recording unit 2 are in a given order. The given order can be any order, for example, the order of image taking date/time, the order of image taking, the order of image data names, the order of file formats, an arbitrary order set by the user, or a combination of those orders. A corresponding image which corresponds to an image data item can be interpreted as occupying the same place in the order as its image data item. In addition to the image data item's place in the order, the corresponding image is interpreted as having the various relations that its image data item has (for example, a correlation and a category). The following description is simplified by assuming that a corresponding image occupies the same place in the order and has the same relations as its image data item.
In a search for image data to be reproduced, display by the display unit 4 looks, for example, as illustrated in
As illustrated in
When the user operates the operation unit 3 to input a “selection instruction,” an image data item to which the reproduction candidate image C10 corresponds is reproduced. As illustrated in
The reproduction candidate images C10 to C12 are preferably distinguishable from other corresponding images such as the preceding candidate images B10 to B12 and the next candidate images A10 to A12. In the example of
The image display device 1 may be a part of another device (for example, the image display device 1 may be a monitor of an imaging device). The components illustrated in
In
While
<<Switching Control of Corresponding Images>>
The description given next with reference to the drawings is about details of corresponding image switching control in a search for image data to be reproduced.
<Switching Control: Basics>
The basics of switching control are described first with reference to the drawings.
The “instruction amount” is the signal value of a switching instruction which is input to the display control unit 5 by the user by operating the operation unit 3. In principle, the instruction amount increases as the amount of the user's operation of the operation unit 3 at a time (or per unit time) is increased or the length of operation at a time is prolonged. For example, in the case where the operation unit 3 is a touch panel, the instruction amount is larger when the user slides a finger, a stylus, or the like on the touch panel (strokes the touch panel) in one direction for a longer distance, or at a higher speed, at a time. To give another example, in the case where the operation unit 3 is a set of keys, the instruction amount is larger when the user keeps pressing one key for a longer period of time. In still another example where the operation unit 3 is a tracking ball, the instruction amount is larger when the user causes the tracking ball to rotate in one direction faster. Those are merely examples and the instruction amount can be set in any way.
The “switching amount” is a value set by the display control unit 5 based on the instruction amount which is input from the operation unit 3, and indicates the amount of corresponding images displayed on the display unit 4 that are to be switched. To give a concrete example, the switching amount can be defined as the number of corresponding images that are switched per unit action or per unit time. The following description is made concrete by defining the switching amount as the number of corresponding images that are switched in one switching action and as an integer (which means that the screen scrolls forward or backward through corresponding images on an image basis). Defined as this, the switching amount in the switch from
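As a rough illustration of how a switching amount might be derived from an instruction amount, the following is a minimal Python sketch of one possible stepped, non-decreasing relation between the two; the thresholds and step values are illustrative assumptions, not values taken from this description.

```python
# A minimal sketch, not the claimed implementation, of a stepped relation
# mapping an instruction amount D to a switching amount E.
BASIC_RELATION = [          # (upper bound of D, switching amount E)
    (10.0, 1),              # small stroke / short press: go forward or back one image
    (30.0, 3),
    (60.0, 8),
    (float("inf"), 20),     # very large instruction amount
]

def basic_switching_amount(instruction_amount):
    """Return the number of corresponding images to switch for one instruction."""
    for upper_bound, amount in BASIC_RELATION:
        if instruction_amount <= upper_bound:
            return amount
```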
The graphs of
The image display device 1 may switch corresponding images such that the user can view not only the display before and after the switching but also the process of the switching (the dynamic sequence of corresponding images being switched to go forward or backward). This structure allows the user to recognize the specifics of the switching with ease, and therefore is preferred.
<Switching Control: Corresponding Images Having a Close Correlation>
Corresponding images are switched, in principle, by a switching amount set in accordance with a basic relation as shown in
[Calculation of the Degree of Correlation]
An example of how to calculate the degree of correlation is described first. The degree of correlation is calculated by comparing various types of information on a plurality of image data items. The degree of correlation may be calculated from two image data items that are consecutive in the order, or from three or more image data items that are consecutive in the order.
The calculation of the degree of correlation can use various types of information on image data, including the image taking date/time and image taking location of image data, the degree of similarity between images composed from image data items (for example, a still image or one frame contained in a moving image, which hereinafter may simply be referred to as image), settings set by the user, and a result of comprehensively weighing those points. To give a concrete example, the degree of correlation is higher when compared image data items have image taking dates/times closer to each other, have image taking locations closer to each other, and create images more similar to each other (e.g., images have a greater degree of similarity to each other).
In the case where the degree of correlation is calculated from the image taking date/time of image data, the image taking date/time used for the calculation is, for example, one that is recorded as part of image data when an image is taken. The calculation of the degree of correlation may be weighted such that compared image data items have a particularly high degree of correlation when the time difference between the image taking dates/times of the image data items is smaller than a reference time, which is a given length of time.
In the case where the degree of correlation is calculated from the location of image taking, the image taking location used for the calculation is, for example, one recorded as part of image data when an image is taken by an imaging device equipped with a global positioning system (GPS). The calculation of the degree of correlation may be weighted such that compared image data items have a particularly high degree of correlation when the distance difference between the image taking locations of the image data items is smaller than a reference distance, which is a given distance.
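As a rough illustration of the weighting described above, the following sketch scores the degree of correlation of two image data items from their image taking dates/times and locations. The field names, the reference time of 60 seconds, the reference distance of 50 meters, and the planar treatment of the positions are all illustrative assumptions.

```python
# A minimal sketch: a higher degree of correlation for image data items taken
# closer together in time and space, with extra weight when the differences
# fall under the reference time and the reference distance.
from math import hypot

REFERENCE_SECONDS = 60.0    # assumed "reference time"
REFERENCE_METERS = 50.0     # assumed "reference distance"

def correlation_degree(item_a, item_b):
    """item_a/item_b: dicts with 'taken_at' (datetime) and 'position' ((x, y) in meters)."""
    dt = abs((item_a["taken_at"] - item_b["taken_at"]).total_seconds())
    dd = hypot(item_a["position"][0] - item_b["position"][0],
               item_a["position"][1] - item_b["position"][1])
    score = 1.0 / (1.0 + dt) + 1.0 / (1.0 + dd)   # closer in time/space -> higher degree
    if dt < REFERENCE_SECONDS:                    # particularly high degree of correlation
        score += 1.0                              # when under the reference time
    if dd < REFERENCE_METERS:                     # ... and under the reference distance
        score += 1.0
    return score
```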
Methods of calculating the degree of similarity between images are described below. The degree of similarity between images can be calculated from various aspects. Three different methods of calculating the degree of similarity which are referred to as first method, second method, and third method are discussed in the following description. The calculation of the degree of similarity may use any of the first to third methods or a combination thereof.
Described first as the first method is a method of calculating the degree of similarity based on the number of people in each image. In this method of calculating the degree of similarity, the number of people is calculated for each compared image by performing face detection on each image and counting the number of faces detected in the image. When the number of people calculated for one image and the number of people calculated for another image are substantially equal to each other, the degree of similarity between the images is set high. The degree of similarity is set high also when the number of people calculated for one image and the number of people calculated for another image are both zero.
The first method can employ various known technologies for face detection. For example, AdaBoost (Yoav Freund, Robert E. Schapire, “A decision-theoretic generalization of on-line learning and an application to boosting,” European Conference on Computational Learning Theory, Sep. 20, 1995) may be used. In AdaBoost, a plurality of weak classifiers weighted by classifying a large number of training samples (face and non-face sample images) sequentially classifies parts of a frame of a moving image, to thereby detect a face.
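A minimal sketch of the first method is given below, using OpenCV's Haar cascade face detector merely as a stand-in for the AdaBoost-based detection cited above; the tolerance on the face counts is an illustrative assumption.

```python
import cv2

# Stand-in face detector (Haar cascade shipped with OpenCV).
_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_count(bgr_image):
    """Count faces detected in a BGR image (NumPy array as returned by cv2.imread)."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    return len(_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5))

def similarity_by_face_count(image_a, image_b, tolerance=1):
    """High similarity when the two images contain roughly the same number of people."""
    a, b = face_count(image_a), face_count(image_b)
    if a == 0 and b == 0:                 # both images contain no people: also similar
        return 1.0
    return 1.0 if abs(a - b) <= tolerance else 0.0
```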
Described next as the second method is a method of calculating the degree of similarity based on whether or not persons contained in images are the same person. In this method of calculating the degree of similarity, face recognition is performed on each compared image to determine whether or not the same person is detected in the compared images. When the same person is detected in the compared images, the degree of similarity between the images is set high.
The second method can employ various known technologies for face recognition. For example, the face of a person which is detected in an image through face detection may be compared with a sample image of a specific person which is recorded in advance. To give another example, a person's face detected in one image and a person's face detected in another image may be compared with each other.
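The second method could be sketched as follows, assuming a hypothetical face_embedding() step (supplied by any face recognition backend) that turns each detected face into a fixed-length vector; the distance threshold is an illustrative assumption.

```python
import numpy as np

def same_person_detected(embeddings_a, embeddings_b, threshold=0.6):
    """embeddings_a/embeddings_b: face embedding vectors from the two compared images
    (produced by a hypothetical face_embedding() helper, not specified here)."""
    for ea in embeddings_a:
        for eb in embeddings_b:
            if np.linalg.norm(np.asarray(ea) - np.asarray(eb)) < threshold:
                return True               # the same person appears in both images
    return False

def similarity_by_person(embeddings_a, embeddings_b):
    """Set the degree of similarity high when the same person is detected in both images."""
    return 1.0 if same_person_detected(embeddings_a, embeddings_b) else 0.0
```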
As the third method, a method of calculating the degree of similarity by utilizing a “feature vector” which indicates the feature amount of an image is described with reference to the drawings. The following description takes as an example a feature vector calculation method that uses the feature vector of a background region, which is a region remaining after a person region is removed from the whole image. A person region can be calculated by estimating which region contains a person based on the location and size of a face region that is detected by, for example, face detection as described above. In the case of an image that does not contain a person, the entire image can be a background region.
The filters 111, 112, 113, and 114 extract edges extending in the horizontal direction, the vertical direction, a right oblique direction, and a left oblique direction of the image 100, respectively, and output filter output values indicating intensity of the extracted edges. The filter 115 extracts an edge extending in a direction not classified in the horizontal direction, the vertical direction, the right oblique direction, and the left oblique direction, and outputs a filter output value indicating intensity of the extracted edge.
The intensity of the edge represents a gradient magnitude of a pixel signal (for example, luminance signal). For example, when there is an edge extending in the horizontal direction of the image 100, a relatively large gradient occurs in the pixel signal in the vertical direction which is orthogonal to the horizontal direction. Further, for example, when spatial filtering is performed by causing the filter 111 to function on the small region having the focused pixel 101 at the center thereof, the gradient magnitude of the pixel signal along the vertical direction of the small region having the focused pixel 101 at the center thereof is obtained as the filter output value. Note that, this is common to the filters 112 to 115.
In a state in which a certain pixel in the image 100 is determined as the focused pixel 101, the filters 111 to 115 are caused to function on the small region having the focused pixel 101 at the center thereof, to thereby obtain five filter output values. Among the five filter output values, the maximum filter output value is extracted as an adopted filter value. When the maximum filter output value is the filter output value obtained from one of the filters 111 to 115, the adopted filter value is called one of a first adopted filter value to a fifth adopted filter value. Therefore, for example, when the maximum filter output value is the filter output value from the filter 111, the adopted filter value is the first adopted filter value, and when the maximum filter output value is the filter output value from the filter 112, the adopted filter value is the second adopted filter value.
The position of the focused pixel 101 is caused to move from one pixel to another in the horizontal direction and the vertical direction in the background region of the image 100, for example. In each movement, the filter output values of the filters 111 to 115 are obtained, to thereby determine the adopted filter value. After the adopted filter values with respect to all the pixels in the background region of the image 100 are determined, histograms 121 to 125 of the first to fifth adopted filter values as illustrated in
The histogram 121 of the first adopted filter value is a histogram of the first adopted filter value obtained from the image 100. In the example illustrated in
In addition, color histograms representing a state of color in the background region of the image 100 are created. For example, when pixel signals in each pixel forming the image 100 include an R signal representing intensity of red color, a G signal representing intensity of green color, and a B signal representing intensity of blue color, a histogram HSTR of an R signal value, a histogram HSTG of a G signal value, and a histogram HSTB of a B signal value in the background region of the image 100 are created as the color histograms of the image 100. For example, when the number of bins of each color histogram is 16, 48 frequency data items may be obtained from the color histograms HSTR, HSTG, and HSTB. A vector (for example, 48-dimensional vector) having the frequency data items obtained from the color histograms as elements thereof is obtained as a color vector HC.
When the feature vector of the image 100 is expressed by H, the feature vector H is obtained by an expression “H=kC×HC+kE×HE”, where HE denotes an edge vector having the frequency data items obtained from the histograms 121 to 125 as its elements, and kC and kE denote predetermined coefficients (note that kC≠0 and kE≠0). Therefore, the feature vector H of the image 100 represents the image feature amounts in accordance with a shape and color of an object in the image 100.
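The third method could be sketched as follows. Five 3×3 kernels stand in for the filters 111 to 115, the per-pixel winners among the five filter responses (the adopted filter values) are counted into an edge part HE, and 16-bin R/G/B histograms form the 48-dimensional color part HC. The kernels, the bin counts, and the weighted concatenation of the two parts (used here instead of the element-wise sum so that their dimensions need not match) are illustrative assumptions, not the claimed implementation.

```python
import numpy as np
from scipy.ndimage import convolve

# Assumed 3x3 kernels standing in for the filters 111 to 115 (horizontal,
# vertical, right-oblique, left-oblique, and other-direction edges).
KERNELS = [
    np.array([[ 1,  1,  1], [ 0,  0,  0], [-1, -1, -1]], float),  # horizontal edge
    np.array([[ 1,  0, -1], [ 1,  0, -1], [ 1,  0, -1]], float),  # vertical edge
    np.array([[ 0,  1,  1], [-1,  0,  1], [-1, -1,  0]], float),  # right oblique edge
    np.array([[ 1,  1,  0], [ 1,  0, -1], [ 0, -1, -1]], float),  # left oblique edge
    np.array([[-1,  0,  1], [ 0,  0,  0], [ 1,  0, -1]], float),  # other directions
]

def feature_vector(rgb, background_mask=None, k_c=1.0, k_e=1.0, bins=16):
    """Return a feature vector combining a color part HC and an edge part HE."""
    if background_mask is None:                    # no person region: whole image is background
        background_mask = np.ones(rgb.shape[:2], bool)
    luma = rgb.astype(float).mean(axis=2)          # stand-in for a luminance signal

    # Edge part: per background pixel, keep the index of the strongest of the
    # five filter responses (the adopted filter value) and histogram the winners.
    responses = np.stack([np.abs(convolve(luma, k)) for k in KERNELS])
    winners = responses.argmax(axis=0)[background_mask]
    he = np.bincount(winners, minlength=len(KERNELS)).astype(float)

    # Color part: 16-bin histograms of the R, G, and B signals over the
    # background region, concatenated into a 48-dimensional color vector HC.
    hc = np.concatenate([
        np.histogram(rgb[..., ch][background_mask], bins=bins, range=(0, 255))[0]
        for ch in range(3)
    ]).astype(float)

    # Weighted concatenation of the two parts (a simplification of H = kC*HC + kE*HE).
    return np.concatenate([k_c * hc, k_e * he])
```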
A method of calculating the degree of similarity by using the feature vector H which is calculated in the manner described above is now described. To calculate the degree of similarity between two images, for example, feature vectors H1 and H2 of the respective images are calculated first. The feature vectors H1 and H2 are placed into a space where the feature vector H is to be defined. The start points of the feature vectors H1 and H2 are placed at the origin, and the distance (Euclidean distance) between the end point of the feature vector H1 and the end point of the feature vector H2 in the feature space is calculated. The calculation of the degree of similarity is then performed so that the degree of similarity is larger when this distance is shorter. The calculation of the degree of similarity may be performed such that the degree of similarity is particularly high when this distance is shorter than a reference distance, which is a given distance.
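The distance-to-similarity conversion described above might be sketched as follows; the specific decreasing function, the boost factor, and the reference distance value are illustrative assumptions.

```python
import numpy as np

def similarity_from_features(h1, h2, reference_distance=100.0, boost=2.0):
    """Higher similarity for a shorter Euclidean distance between feature vectors H1 and H2."""
    d = float(np.linalg.norm(np.asarray(h1, float) - np.asarray(h2, float)))
    s = 1.0 / (1.0 + d)                   # monotonically decreasing in the distance
    return s * boost if d < reference_distance else s   # particularly high under the reference distance
```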
Note that, in a moving picture experts group (MPEG) 7, the derivation of the feature vector H (feature amount) of the image is performed by using five edge extracting filters. Moreover, the five edge extracting filters may be applied to the filters 111 to 115. In addition, the feature vector H (feature amount) of the image 100 may be derived by applying a method standardized in MPEG 7 to the image 100. Further, the feature vector H may be calculated by using only one of the feature amounts of a shape and color.
The degree of correlation between image data items recorded in the image recording unit 2 may be calculated in advance. Alternatively, the display control unit 5 may calculate the degree of correlation at the time corresponding images displayed on the display unit 4 are switched. Details of when to calculate the degree of correlation (when to determine whether there is a close correlation or not) are described later.
[Switching Amount]
Described next with reference to the drawings is the switching amount that is set when corresponding images to be switched have a close correlation.
The case where corresponding images to be switched have a close correlation is, for example, a case where the correlation is close between an image data item to which the current reproduction candidate image corresponds and an image data item that precedes (when going backward) or follows (when going forward) this image data item in the order (i.e., the image data item of a corresponding image that turns into a reproduction candidate image at least next to the current reproduction candidate image). To give a concrete example, corresponding images to be switched have a close correlation when the correlation is close at least between an image data item to which the current reproduction candidate image corresponds and an image data item that is immediately before or after this image data item in the order. Corresponding images to be switched do not have a close correlation when the correlation between those image data items is not close.
To switch corresponding images that do not have a close correlation, the switching amount is set in accordance with the basic relation described above. Specifically, switching amounts E1 and E2, for example, are set with respect to an instruction amount D as shown in
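The rule might be sketched as follows: the amount obtained from the basic relation (E1, E2, and the like) is kept when the corresponding images to be switched do not have a close correlation, and a separately determined amount EC is substituted when they do. Treating EC as a fixed multiple of the basic amount is purely an illustrative assumption.

```python
def effective_switching_amount(basic_amount, closely_correlated, ec=None):
    """basic_amount: the amount given by the basic relation for the input instruction amount."""
    if not closely_correlated:
        return basic_amount               # first switching amount (follows the basic relation)
    # Second switching amount EC, determined by a different method; here it is
    # simply assumed to be larger than the basic amount.
    return ec if ec is not None else basic_amount * 3
```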
The image display device 1 structured as above can vary how corresponding images are switched depending on whether the correlation between the corresponding images is close or not. Switching suited to each specific set of corresponding images is thus executed. In particular, corresponding images that have a close correlation can be switched quickly by setting a large switching amount for the switching of closely correlated corresponding images. This allows the user to easily and quickly switch corresponding images that are not wanted at the moment, with the result that desired corresponding images are displayed easily and quickly. The user can accordingly select desired image data easily and quickly.
[Display Example]
Concrete examples of how display looks when corresponding images displayed on the display unit 4 are switched (first to fourth display examples) are described next with reference to the drawings.
The corresponding image 203 and the corresponding image 204 can be determined as having a close correlation. Similarly, the corresponding image 204 and the corresponding image 205 can be determined as having a close correlation. It can therefore be determined that the corresponding images 203 to 205 have a close correlation. The determination that the corresponding images 203 to 205 have a close correlation may be made as a result of directly comparing image data items to which the corresponding images 203 to 205 respectively correspond.
In the first to fourth display examples described below, the corresponding images 201 to 207 are each checked in advance before switching (for example, at the time of image taking or transferring; details are described later) for whether or not the corresponding image has a close correlation with other corresponding images, unless otherwise stated.
{First Display Example}
In
The corresponding image 202 (reproduction candidate image C20) and the corresponding image 203 (next candidate image A20) do not have a close correlation as described above. Therefore, when a switching instruction having the instruction amount D is input in
As described above, the corresponding image 203 (reproduction candidate image C21), the corresponding image 204 (next candidate image A21), and the corresponding image 205 have a close correlation. Therefore, when a switching instruction having the instruction amount D is input in
This switching brings
When images are displayed and switched in the manner described above, it seems to the user as if the corresponding images 203 to 205 are switched at high speed.
In the case where the determination of whether there is a close correlation or not is performed at the time of switching in this example, executing the determination can be difficult, in terms of calculation amount and calculation speed, for correlations other than the correlation between the current reproduction candidate image and a few corresponding images preceding (in the case of going backward) or following (in the case of going forward) the reproduction candidate image in the order. A countermeasure is, for example, to set the switching amount EC to a value based on the instruction amount D or to a given value. The switching amount EC may also be set to a value based on the instruction amount D or to a given value when whether there is a close correlation or not is determined in advance.
Setting the switching amount EC to a value based on the instruction amount D or to a given value may cause a situation where the corresponding images 203 to 205 which have a close correlation are not switched at once to go forward (in other words, one of the states of
{Second Display Example}
In
The corresponding image 202 (reproduction candidate image C30) and the corresponding images 203 to 205 (next candidate image A30) do not have a close correlation as described above. Therefore, when a switching instruction having the instruction amount D is input in
As described above, the corresponding images 203 to 205 (reproduction candidate image C31) have a close correlation. Therefore, when a switching instruction having the instruction amount D is input in
This switching brings
When images are displayed and switched in the manner described above, it seems to the user as if the corresponding images 203 to 205 are switched forward together as one unit.
In this example, one representative corresponding image (corresponding image 205) selected out of the corresponding images 203 to 205 which are overlaid on top of one another to be grouped together into one stack is displayed in the same way as other corresponding images. The representative corresponding image can be any of the corresponding images 203 to 205 and, in this example, the corresponding image 205 which is the last of the three corresponding images in the order serves as the representative corresponding image. In the case where corresponding images are arranged in the order of image taking time or the order of image taking, for example, an image data item that is the last in the order is likely to be one with which the person who took the image is satisfied, and is not likely to be image data obtained from failed image taking. Therefore, selecting a corresponding image that is the last in the order as the representative corresponding image enables the user to grasp the entire set of corresponding images that have a close correlation when the corresponding images are thumbnail images or reduced images.
The stack of corresponding images obtained by overlaying the corresponding images 203 to 205 on top of one another may always be displayed as a stack irrespective of whether the switching of corresponding images is to be executed or not. Alternatively, the corresponding images 203 to 205 may be displayed individually instead of as a stack when the switching of images is not planned.
{Third Display Example}
In
The corresponding image 202 (reproduction candidate image C40) and the corresponding image 203 (next candidate image A40) do not have a close correlation as described above. Therefore, when a switching instruction having the instruction amount D is input in
As described above, the corresponding image 203 (reproduction candidate image C41), and the corresponding image 204 and the corresponding image 205 (next candidate image A41) have a close correlation. Therefore, when a switching instruction having the instruction amount D is input in
This switching brings
When images are displayed and switched in the manner described above, it seems to the user as if the corresponding images 203 to 205 are switched forward together as one unit.
In this display example, the corresponding image 203, which is the first in order among the corresponding images 203 to 205 having a close correlation to one another, is displayed independently without being grouped together. The corresponding images 204 and 205 are displayed as a combined image obtained by grouping the corresponding images 204 and 205 into one group. Displaying the corresponding image 203, which is the first in the order, independently enables the user to easily grasp that the corresponding images grouped together relate to the preceding corresponding image 203.
The combined corresponding image obtained by grouping the corresponding images 204 and 205 into one group may always be displayed as a group irrespective of whether the switching of corresponding images is to be executed or not. Alternatively, the corresponding images 204 and 205 may be displayed individually instead of as a group when the switching of images is not planned.
The corresponding images that are combined into one corresponding image may be the corresponding images 203 to 205, as in the second display example described above, or may be the corresponding images 204 and 205 overlaid on each other to form one corresponding image. Alternatively, the corresponding image that is displayed independently instead of as part of a combined image may be the corresponding image 205, while the corresponding images 203 and 204 are combined into one corresponding image.
{Fourth Display Example}
In
The corresponding image 202 (reproduction candidate image C50) and the corresponding images 203 to 205 (next candidate image A50) do not have a close correlation as described above. Therefore, when a switching instruction having the instruction amount D is input in
As described above, the corresponding images 203 to 205 (reproduction candidate image C51) have a close correlation. Therefore, when a switching instruction having the instruction amount D is input in
This switching brings
When images are displayed and switched in the manner described above, it seems to the user as if the corresponding images 203 to 205 are switched forward together as one unit.
In this example, the representative corresponding image that is displayed on behalf of the grouped corresponding images 203 to 205 (only one of which is displayed) is the corresponding image 205. The representative corresponding image can be any of the corresponding images 203 to 205. However, as described in the second display example, selecting a corresponding image that is the last in the order as the representative corresponding image enables the user to grasp the entire set of corresponding images that have a close correlation.
The grouped corresponding image of this display example, which is obtained by displaying only one of the corresponding images 203 to 205, is difficult to distinguish from other corresponding images if displayed as it is. The grouped corresponding image therefore preferably announces itself as a grouped corresponding image in some way. An image can be announced as a grouped corresponding image by, for example, displaying the image in a frame wider than that of other corresponding images as illustrated in
The grouped corresponding image obtained by displaying only one of the corresponding images 203 to 205 may always be displayed as a group irrespective of whether the switching of corresponding images is to be executed or not. Alternatively, the corresponding images 203 to 205 may be displayed individually instead of as a group when the switching of images is not planned.
The announcement of a grouped corresponding image described above is not limited to the fourth display example and may be employed in the first to third display examples, where the grouped corresponding image displays the closely correlated corresponding images 203 to 205 directly (first display example) or indirectly (second and third display examples).
<Modification Example of Switching Control>
A description is given below with reference to the drawings on a modification example of switching control that is executed when corresponding images having a close correlation are switched.
As shown in
When the input instruction amount is an instruction amount D2 which is larger than the threshold Dth1 and smaller than a threshold Dth2, the same switching control as in the examples of
In this modification example, the screen jumps to an image in the next category when an instruction amount D3 which is equal to or larger than the threshold Dth2 is input. A “category” is, for example, a group of image data items that have the same image taking date, the same image taking location, the same event where image taking took place, or the like. A “jump” is a switch to a corresponding image that belongs to the next category (for example, a corresponding image that is the first in the order within the category). A jump is executed by, for example, setting a switching amount (y−x+1), which is based on the order (x) of the current reproduction candidate image in a category and the number (y) of corresponding images belonging to this category.
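Putting the three ranges together, a minimal sketch of the modified control might look as follows; the threshold values Dth1 and Dth2 and the function signature are illustrative assumptions.

```python
DTH1, DTH2 = 10.0, 60.0      # assumed thresholds Dth1 and Dth2

def modified_switching_amount(d, basic_amount, closely_correlated, ec,
                              position_in_category, category_size):
    """d: instruction amount; position_in_category: x (1-based); category_size: y."""
    if d >= DTH2:
        # Jump to the first corresponding image of the next category.
        return category_size - position_in_category + 1          # y - x + 1
    if d > DTH1 and closely_correlated:
        return ec                                                 # EC (outside the basic relation)
    return basic_amount                                           # basic relation (E11, E21, E12, E22, ...)
```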
A description is given below with reference to the drawing on a concrete example of how corresponding images are switched when the instruction amounts D1 to D3 of
In
The corresponding images 301 and 302 belong to the same category. The corresponding image 303 belongs to a category of its own. The corresponding images 304 to 307 belong to the same category. The corresponding image 308 belongs to a category of its own. In
When the input instruction amount is D1, the switching amounts E11 and E21 which satisfy the basic relation are set. In the example of
In this example, although the corresponding image 304 and the corresponding image 305 which follows the corresponding image 304 in the order have a close correlation, the switching amounts E11 and E21 which satisfy the basic relation are set as described above. The corresponding images 304 and 305 are accordingly switched to go forward one at a time (the same is true when the corresponding images 306 and 307 are switched to go forward).
When the instruction amount D2 is input to switch the corresponding image 301 to go forward, the switching amounts E12 and E22 which satisfy the basic relation are set because the corresponding image 301 and the corresponding image 302 which follows the corresponding image 301 in the order do not have a close correlation (the same is true when the corresponding image 303 is switched to go forward). In the example of
When the instruction amount D2 is input to switch the corresponding image 305 to go forward, the switching amount EC which does not satisfy the basic relation is set because the corresponding image 305 and the corresponding images 306 and 307 which follow the corresponding image 305 in the order have a close correlation. As in the first to fourth display examples, the switching amount EC in the example of
When the input instruction amount is D3, corresponding images are switched to go forward on a category basis. Each time the instruction amount D3 is input (each time the user operates the operation unit 3), corresponding images are switched to go forward and reach one that is the first in the order among corresponding images belonging to the next category. Specifically, in the example of
As described above, when the input instruction amount is D1 which is equal to or smaller than the threshold Dth1, the closely correlated corresponding images 304 to 307 can be switched separately (for example, in a manner that turns each into the reproduction candidate image separately) by setting the switching amounts E11 and E21, which satisfy the basic relation, irrespective of whether corresponding images to be switched have a close correlation or not. Therefore, in the case where an image data item to be selected is among image data items to which the corresponding images 304 to 307 correspond, the image data item of interest is easily selected by simply reducing the instruction amount (for example, by reducing the amount of the user's operation of the operation unit 3 at a time (or per unit time), or by shortening the length of operation at a time).
When the input instruction amount is D3 which is equal to or larger than the threshold Dth2 and corresponding images are switched on a category basis, one of the advantages is that candidates can be narrowed down at an early stage of a search for an image data item to be selected. The image data item is therefore selected easily and quickly.
The threshold Dth2 may not be provided. In this case, when the input instruction amount is larger than the threshold Dth1 and corresponding images to be switched do not have a close correlation, a switching amount that satisfies the basic relation may be set whereas the switching amount EC which does not satisfy the basic relation is set when the correlation is close.
The thresholds Dth1 and Dth2 can take any values. The thresholds Dth1 and Dth2 may be values that are different from the values at steps of the stepped basic relation as in
This modification example is applicable to the second to fourth display examples. In this case, a grouped corresponding image may be broken into its constituent corresponding images to be displayed and switched separately at least when, for example, the input instruction amount is equal to or smaller than the threshold Dth1.
<When to Calculate the Presence or Absence of Correlation>
When to execute the determination of whether corresponding images have a close correlation or not is described next with reference to the drawings.
[At the Time of Switching]
As illustrated in
Based on the instruction amount and the result of the determination in STEP 2, the display control unit 5 determines whether to switch the pre-switching corresponding image to the switching candidate corresponding image (STEP 3). This determination of whether to execute a switch can be made based on whether or not the switch to the switching candidate corresponding image is within the range of the switching amount described above. Specifically, when the switching of corresponding images is within the range of the switching amount, it is determined that the corresponding images are to be switched.
As described above, the number of corresponding images for which whether or not there is a close correlation can be determined is limited in some cases due to the calculation amount and the calculation speed. In such cases, in STEP 3, the switching amount is set in consideration of whether the determination of a close correlation can still be executed (that is, whether STEP 2 can be executed again), and whether to execute the switch is determined accordingly.
When it is determined that the corresponding images are not to be switched (STEP 3: NO), switching is ended. When it is determined that the corresponding images are to be switched (STEP 3: YES), on the other hand, the pre-switching corresponding image is switched to the switching candidate corresponding image (STEP 4). The corresponding images are switched at high speed as in the first display example. After the corresponding images are switched in STEP 4, the processing returns to STEP 2 to subsequently repeat STEP 3 and STEP 4.
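A minimal sketch of this loop (STEP 1 through STEP 4, with the correlation evaluated on the fly) is given below; the function names, the callable argument, and the way the allowed range is derived are illustrative assumptions.

```python
def switch_on_instruction(items, index, closely_correlated, basic_amount, ec, forward=True):
    """closely_correlated(a, b) -> bool compares two image data items adjacent in
    the order; basic_amount and ec are the two switching amounts."""
    step, moved = (1 if forward else -1), 0
    while True:
        candidate = index + step                                   # STEP 1: switching candidate
        if not (0 <= candidate < len(items)):
            break
        close = closely_correlated(items[index], items[candidate])  # STEP 2
        allowed = ec if close else basic_amount                    # range of the switching amount
        if moved >= allowed:                                       # STEP 3: outside the range -> stop
            break
        index, moved = candidate, moved + 1                        # STEP 4: execute the switch
    return index                                                   # new reproduction candidate position
```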
The switching control described above can thus be performed on corresponding images of any kind (for example, image data items taken by a plurality of imaging devices and image data items whose information, such as their order, has been changed) by executing the determination of whether corresponding images (image data items) have a close correlation or not at the time of switching.
[At the Time of Transfer]
As illustrated in
When the correlation between the image data item to be transferred and its preceding or following image data item is close (STEP 14: YES), the image data items are recorded as ones that have a close correlation in the image display device or the recording device (STEP 15). At this point, information indicating that the correlation is close may be recorded in a part of each image data item such as a header, or may be recorded in a system recording area of the image display device or the recording device.
When the correlation between the image data item to be transferred and its preceding or following image data item is not close (STEP 14: NO), the image data items are recorded as ones that do not have a close correlation in the image display device or the recording device (STEP 16). At this point, as in STEP 15, information indicating that the correlation is not close may be recorded in the header or in the system recording area. Alternatively, the distant correlation may be indicated by not recording information about the correlation.
After STEP 15 or STEP 16 is finished, the processing returns to STEP 12 to check whether or not another image data item is to be transferred, and the subsequent steps are repeated.
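A minimal sketch of STEP 11 through STEP 16 is given below, with the close/not-close result stored in a header-like field of each transferred item; the field names and the use of a list as the destination are illustrative assumptions.

```python
def transfer_with_correlation(items, closely_correlated, destination):
    """Transfer items in order, recording whether each one is closely correlated
    with the item that precedes it in the order."""
    previous = None
    for item in items:                                  # STEP 12: another item to transfer?
        if previous is not None:
            close = closely_correlated(previous, item)  # STEP 13/14: determine the correlation
            item["close_to_previous"] = close           # STEP 15/16: record the result in a
            previous["close_to_next"] = close           # header-like field of each item
        destination.append(item)                        # record in the display/recording device
        previous = item
    return destination
```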
With this structure, whether the correlation between image data items is close or not is determined prior to image data reproduction in the image display device 1. Accordingly, there is no need to execute the determination of whether or not there is a close correlation at the time of switching, and the switching control described above is completed quickly.
In STEP 15 and STEP 16, in which information indicating that the correlation is close is recorded, the degree of correlation may be recorded instead. The determination of whether there is a close correlation or not may be executed by the imaging device, or by the image display device or the recording device.
[At the Time of Image Taking]
As illustrated in
When the correlation is close between the image data item obtained by image taking and the image data item that precedes (or follows) the obtained image data item in the order (STEP 23: YES), the image data items are recorded as ones that have a close correlation in a recording unit of the imaging device (STEP 24). At this point, information indicating that the correlation is close may be recorded in a part of each image data item such as a header, or may be recorded in a system recording area of the imaging device.
When the correlation is not close between the image data item obtained by image taking and the image data item that precedes (or follows) the obtained image data item in the order (STEP 23: NO), the image data items are recorded as ones that do not have a close correlation in the recording unit of the imaging device (STEP 25). At this point, as in STEP 24, information indicating that the correlation is not close may be recorded in the header or in the system recording area. Alternatively, the distant correlation may be indicated by not recording information about the correlation. After STEP 24 or STEP 25 is finished, the action is ended.
With this structure, as in the case where the determination is made at the time of transfer in the manner described above, whether the correlation between image data items is close or not is determined prior to image data reproduction in the image display device 1. Accordingly, there is no need to execute the determination of whether or not there is a close correlation at the time of switching, and the switching control described above is completed quickly.
In STEP 24 and STEP 25, where information indicating that the correlation is close is recorded, the degree of correlation may be recorded instead.
[Switching of Corresponding Images for Which Whether the Correlation is Close or Not is Determined in Advance]
Described next with reference to
As illustrated in
Based on the instruction amount and the correlation checked in STEP 32, the display control unit 5 determines whether to switch the pre-switching corresponding image to the switching candidate corresponding image (STEP 33). This determination of whether to execute a switch can be made based on whether or not the switching of the switching candidate corresponding image is within the range of the switching amount described above. Specifically, when the switch to the switching candidate corresponding image is within the range of the switching amount, it is determined that the corresponding images are to be switched.
When it is determined that the corresponding images are not to be switched (STEP 33: NO), switching is ended. When it is determined that the corresponding images are to be switched (STEP 33: YES), on the other hand, the pre-switching corresponding image is switched to the switching candidate corresponding image (STEP 34). The corresponding image is switched at high speed as in the first display example, or switched together with other corresponding images as in the second to fourth display examples. After the corresponding images are switched in STEP 34, the processing returns to STEP 32 to subsequently repeat STEP 33 and STEP 34.
As described above, when whether a corresponding image has a close correlation or not is determined in advance, the image display device 1 only needs to check the correlation at the time of switching. This allows the image display device 1 to speed up switching control and to have a simpler structure.
<Examples of an Action Executed When the User Selects a Composite Corresponding Image>
In the description given above, image data reproduced is one to which a corresponding image selected by the user (by inputting a selection instruction via the operation unit 3) corresponds. However, the user may select one corresponding image in which at least two corresponding images are grouped together (see the second to fourth display examples,
Examples of an action executed when the image selected by the user is a composite corresponding image are described below. The following action examples, whether they be of the same kind or of different kinds, can be combined unless there is a contradiction. What follows are mainly examples of applying the action examples to the second display example (see
[Composite Corresponding Image Selection Detection Action]
Examples of an action executed by the display control unit 5 to detect that the user has selected a composite corresponding image are described first with reference to the drawings.
{Composite Corresponding Image Selection Detection Action: First Example}
The selection detection range S is not limited to the area where the reproduction candidate image is displayed, and may be set to the area where the preceding candidate image or the next candidate image is displayed. The selection detection range S may also be set to not one but a plurality of areas. For instance, the selection detection range S may be set to each of, or two of, the areas where the preceding candidate image, the reproduction candidate image, and the next candidate image are respectively displayed.
The state in this action example where a composite corresponding image is situated in the selection detection range S and an instruction from the user has not been input via the operation unit 3 for a given period of time or longer is similar to and interchangeable with a state in a fourth example of a post-composite corresponding image selection action which is described later.
{Composite Corresponding Image Selection Detection Action: Second Example}
{Composite Corresponding Image Selection Detection Action: Third Example}
In this action example, the display control unit 5 detects that a composite corresponding image has been selected when a composite corresponding image is situated in the selection detection range S described in the first example of the composite corresponding image selection detection action (see
Specifically, the display control unit 5 detects that a composite corresponding image has been selected when, for example, a composite corresponding image is the reproduction candidate image C6 and the user operates the operation unit 3 in a direction different from one for inputting a switching instruction (hereinafter referred to as switching instruction direction) (e.g., a direction perpendicular to the switching instruction direction or a direction between this direction and the switching instruction direction, which is hereinafter referred to as non-switching instruction direction).
For example, in the case where the operation unit 3 is a touch panel and a switching instruction is input when the user slides a finger, a stylus, or the like on the touch panel (strokes the touch panel) in a direction in which a preceding candidate image B6 (B7), the reproduction candidate image C6 (C7), and a next candidate image A6 (A7) are aligned on the display unit 4 (the left-right direction in the drawings, i.e., the switching instruction direction), the user operates the operation unit 3 in a non-switching instruction direction by sliding a finger, a stylus, or the like on the display unit 4 (strokes the touch panel) along a direction that is not the switching instruction direction (the top-bottom direction or oblique direction in the drawings, i.e., a non-switching instruction direction). To give another example, in the case where the operation unit 3 is a tracking ball or a set of keys and a switching instruction is input by rolling the tracking ball in the alignment direction (switching instruction direction) or by pressing a key that is allocated to the switching instruction direction, the user operates the operation unit 3 in a non-switching instruction direction by rolling the tracking ball along a direction that is not the switching instruction direction (non-switching instruction direction) or by pressing a key that is allocated to a non-switching instruction direction.
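Telling a switching instruction apart from an operation in a non-switching instruction direction could be sketched as follows for a touch panel on which the candidate images are aligned horizontally; the angle tolerance is an illustrative assumption.

```python
from math import atan2, degrees

def is_switching_stroke(dx, dy, tolerance_deg=30.0):
    """dx, dy: displacement of a stroke on the touch panel. A mostly horizontal
    stroke counts as a switching instruction; anything else is treated as an
    operation in a non-switching instruction direction."""
    angle = abs(degrees(atan2(dy, dx)))        # 0 = rightward, 180 = leftward
    return angle <= tolerance_deg or angle >= 180.0 - tolerance_deg
```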
An instruction that is not a switching instruction in this action example is similar to and interchangeable with an instruction in a third example of the post-composite corresponding image selection action which is described later.
The first to third examples of the composite corresponding image selection detection action are applicable not only to cases where a composite corresponding image is selected but also to cases where a corresponding image is selected.
[Post-composite Corresponding Image Selection Action]
Examples of an action executed by the display control unit 5 after the user selects a composite corresponding image are described next with reference to the drawings.
{Post-composite Corresponding Image Selection Action: First Example}
As illustrated in
While image data is being reproduced as illustrated in
Subsequently, the user further inputs an instruction that is not a switching instruction (for example, the user operates the operation unit 3 in a non-switching instruction direction), causing the display control unit 5 to sequentially switch the image data reproduced on the display unit 4. The image data reproduced is switched, for example, in the order of the degree of correlation, the order of the degree of similarity between images, the order of image taking date/time, or other orders described above.
This structure enables the user to easily reproduce and check image data items to which a plurality of closely correlated corresponding images respectively correspond.
In the case where the user's operation of the operation unit 3 in a non-switching instruction direction causes the switching of the image data reproduced on the display unit 4, image data may be switched in a given order which is determined depending on the direction of the operation (for example, whether the operation direction is the upward direction or the downward direction, or whether the operation direction is the oblique upward direction or the oblique downward direction). To give a concrete example, the image data reproduced on the display unit 4 may be switched in ascending order when the user operates the operation unit 3 in the upward direction, whereas the image data reproduced on the display unit 4 is switched in descending order when the user operates the operation unit 3 in the downward direction. This structure facilitates the user's search for desired image data.
The operation direction may also determine which order index (e.g., the image taking date/time or the degree of similarity between images composed from image data items) is to be used in switching. To give a concrete example, the image data reproduced on the display unit 4 may be switched in the order of image taking date/time when the user operates the operation unit 3 in the upward direction, whereas the image data reproduced is switched in the order of the degree of similarity between images when the user operates the operation unit 3 in the downward direction. This structure allows the user to select an arbitrary index in a search for desired image data.
The image data reproduced on the display unit 4 may be switched in different methods associated with different general operation directions (for example, the top-bottom direction and the oblique direction) in which the user operates the operation unit 3. To give a concrete example, the image data reproduced may be switched in a given order when the user operates the operation unit 3 in the top-bottom direction, whereas the image data reproduced is switched in an order of another index when the user operates the operation unit 3 in the oblique direction.
The general operation direction and the specific operation direction may respectively determine which order index is to be used and whether this order is an ascending order or a descending order. To give a concrete example, the image data reproduced on the display unit 4 may be switched in ascending order of image taking date/time when the user operates the operation unit 3 in the upward direction, whereas the image data reproduced on the display unit 4 is switched in descending order of image taking date/time when the user operates the operation unit 3 in the downward direction. The image data reproduced on the display unit 4 may be switched in ascending order of the degree of similarity between images when the user operates the operation unit 3 in the oblique upward direction, whereas the image data reproduced on the display unit 4 is switched in descending order of the degree of similarity between images when the user operates the operation unit 3 in the oblique downward direction.
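One way to sketch this mapping from the operation direction to the index and order used for switching is shown below; the direction labels and field names are illustrative assumptions.

```python
# Assumed mapping: (index used for ordering, descending?) per operation direction.
ORDERINGS = {
    "up":           ("taken_at",   False),   # ascending order of image taking date/time
    "down":         ("taken_at",   True),    # descending order of image taking date/time
    "oblique_up":   ("similarity", False),   # ascending order of the degree of similarity
    "oblique_down": ("similarity", True),    # descending order of the degree of similarity
}

def order_for_direction(items, direction):
    """Return the image data items sorted as dictated by the operation direction."""
    key, descending = ORDERINGS[direction]
    return sorted(items, key=lambda item: item[key], reverse=descending)
```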
When this action example is applied to the third display example, the same action as when this action example is applied to the second display example may be executed. In this case, the example may be modified such that image data items to which the corresponding images 204 and 205 (see
{Post-composite Corresponding Image Selection Action: Second Example}
The display unit 4 in this action example executes the same action as in the first example of the post-composite corresponding image selection action (see
In this action example, the image data reproduced on the display unit 4 is switched in a given order when the display control unit 5 detects that no instruction has been input from the user via the operation unit 3 for a given period of time or longer. The display control unit 5 can execute the switching repeatedly.
With this structure, the user can easily reproduce and check image data items to which a plurality of closely correlated corresponding images respectively correspond, without needing to perform a special operation.
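One possible sketch of this idle-timeout switching, with an assumed timeout value and assumed callback names, is shown below.

```python
# Hypothetical sketch: switch the reproduced image data automatically when no
# instruction has been input for a given period. Timeout and callbacks are assumed.
import time

IDLE_TIMEOUT_S = 5.0  # assumed "given period of time"

def auto_switch_loop(last_input_time, switch_to_next, should_stop):
    """Repeatedly advance to the next correlated image while the user is idle."""
    while not should_stop():
        if time.monotonic() - last_input_time() >= IDLE_TIMEOUT_S:
            switch_to_next()            # switch in the given order
            time.sleep(IDLE_TIMEOUT_S)  # and keep switching repeatedly
        else:
            time.sleep(0.1)             # poll for further inactivity
```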
{Post-composite Corresponding Image Selection Action: Third Example}
As illustrated in
While the composite corresponding image is displayed as illustrated in
Subsequently, the user further inputs an instruction that is not a switching instruction, causing the display control unit 5 to sequentially switch the representative corresponding image of the selected composite corresponding image. The representative corresponding image is switched, for example, in the order of the degree of correlation, the order of the degree of similarity between images, the order of image taking date/time, or other orders described above.
This structure enables the user to check a plurality of corresponding images constituting a composite corresponding image with ease.
The action described in the first example of the post-composite corresponding image selection action, in which the user's operation of the operation unit 3 in a non-switching instruction direction causes the switching of image data reproduced on the display unit 4, may be executed in this action example. However, the image switched in this action example is the representative corresponding image of the selected composite corresponding image.
When this action example is applied to the third display example, the same action as when this action example is applied to the second display example may be executed. In this case, the example may be modified such that how the corresponding images 204 and 205 (see
{Post-composite Corresponding Image Selection Action: Fourth Example}
The display unit 4 in this action example executes the same action as in the third example of the post-composite corresponding image selection action (see
In this action example, the representative corresponding image of a composite corresponding image is switched in a given order when the display control unit 5 detects that no instruction has been input from the user via the operation unit 3 for a given period of time or longer. The display control unit 5 can execute the switching repeatedly.
With this structure, the user can easily check a plurality of corresponding images constituting a composite corresponding image, without needing to perform a special operation.
{Post-composite Corresponding Image Selection Action: Fifth Example}
As illustrated in
While the view-all images D1221 to D1223 are being displayed as illustrated in
With this structure, corresponding images that have a close correlation with one another can easily be checked at once.
In the case where the user inputs a given instruction (for example, a selection instruction for selecting an arbitrary area on the display unit 4) while the display unit 4 is reproducing image data as in
In the case where the display unit 4 is as shown in
When this action example is applied to the third display example, the same action as when this action example is applied to the second display example may be executed. In this case, the closely correlated corresponding images 203 to 205 (see
[Modification Examples of the Actions]
{First Modification Example}
In the case where the action examples described above are applied to the third display example, the action executed in response to the user's selection of the corresponding image 203 (see
A concrete description of an example of this action is given below with reference to the drawings.
In this modification example, as illustrated in
With this structure, the user can easily check corresponding images that have a close correlation to one another at once by selecting at least one of the closely correlated corresponding images.
This modification example corresponds to the second example of the composite corresponding image selection detection action and the fifth example of the post-composite corresponding image selection action, but can be adapted so as to correspond to other action examples as well.
{Second Modification Example}
The corresponding image that is displayed preferentially may be variable. Examples of such a preferentially displayed image include the representative corresponding image in the second and fourth display examples and, in the third display example, a corresponding image that has a close correlation with the corresponding images constituting a composite corresponding image but is not displayed as part of that composite corresponding image. For instance, the corresponding image that is displayed preferentially may be one that has most recently been selected by the user, such as a corresponding image of image data most recently reproduced by the user, or a corresponding image most recently displayed preferentially in the third example of the post-composite corresponding image selection action.
This structure enables the image display device 1 to preferentially display a corresponding image that is likely to be selected by the user. Selecting a desired corresponding image is thus made easy for the user.
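As a hedged illustration of tracking the most recently selected corresponding image (the class and method names are assumptions, not part of the embodiment):

```python
# Hypothetical sketch: remember the corresponding image most recently selected by
# the user so that it can be displayed preferentially (e.g., as the representative).
class PreferentialImageTracker:
    def __init__(self, default_image_id):
        self._preferred = default_image_id  # fall back to a default representative

    def on_user_selected(self, image_id):
        """Record the corresponding image that the user selected most recently."""
        self._preferred = image_id

    def preferred(self):
        """Corresponding image to display preferentially."""
        return self._preferred
```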
{Third Modification Example}
The display control unit 5 may additionally display operation methods of the operation unit 3 on the display unit 4 when a corresponding image and a composite corresponding image are displayed, when view-all images are displayed, when image data is reproduced, or the like. An example of how the display on the display unit 4 looks in this case is described with reference to the drawings.
As illustrated in
As illustrated in
This structure allows the user to operate the operation unit 3 while looking at the images indicating operation methods that are displayed on the display unit 4. The user can thus easily bring the display unit 4 to a desired state.
The images indicating operation methods may be displayed on the display unit 4 all the time, or may be displayed only when a given condition is met, such as when the user operates the operation unit 3. This modification example corresponds to the second example of the composite corresponding image selection detection action and the first example of the post-composite corresponding image selection action, but can be adapted so as to correspond to other action examples as well. This modification example is applicable not only to cases where a composite corresponding image is displayed (e.g., the second to fourth display examples) but also to cases where a composite corresponding image is not displayed (e.g., the first display example). However, this modification example is more effective when applied to cases where a composite corresponding image is displayed and accordingly the operation of the operation unit 3 may become complicated (e.g., the second and third examples of the composite corresponding image selection detection action and the first, third, and fifth examples of the post-composite corresponding image selection action).
When the user inputs a switching instruction via the operation unit 3 while the display unit 4 is reproducing image data as in the display example of
<<Modification Example>>
In the description given above, the switching amount takes only an integer value so that corresponding images are switched on an image basis. Alternatively, the switching amount may be set to a decimal number so that corresponding images are switched on a partial-image basis. An example of this case is described with reference to the drawings.
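In addition to the drawing-based example, a rough Python sketch of a non-integer switching amount is given below; the function and its rendering interpretation are illustrative assumptions.

```python
# Hypothetical sketch: the integer part of the switching amount selects whole
# corresponding images, while the fractional part becomes a partial-image offset.
def apply_switching_amount(current_position, switching_amount):
    """Return (index of the leftmost corresponding image, fractional offset in [0, 1))."""
    new_position = current_position + switching_amount  # e.g., 2 + 1.5 = 3.5
    index = int(new_position)
    offset = new_position - index  # 0.5 -> half of the next corresponding image is shown
    return index, offset
```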
As shown in
The switching amount, which is defined in the description given above as the number of corresponding images switched per switching action, may instead be defined as the number of corresponding images switched per unit time. Defined in this way, the switching amount is interpreted as a speed at which corresponding images are switched, and images are switched faster as the instruction amount increases (for example, corresponding images appear to move quickly across the display unit 4 as they are switched). The switching speed during a switching action is not limited to a constant speed. For instance, when a plurality of corresponding images are to be switched at once, the switching speed may be increased for corresponding images that are switched nearer to the end of the switching action.
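A minimal sketch of interpreting the switching amount as a speed, with an assumed linear relation to the instruction amount and an assumed speed-up toward the end of the action, might be:

```python
# Hypothetical sketch: the instruction amount determines how fast corresponding
# images are switched, and the speed may rise toward the end of the action.
def switching_speed(instruction_amount, base_speed=2.0):
    """Images switched per unit time; assumed to grow linearly with the instruction amount."""
    return base_speed * instruction_amount

def speed_at_progress(base, progress, boost=2.0):
    """Increase the speed for images switched nearer to the end (progress in [0, 1])."""
    return base * (1.0 + boost * progress)
```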
In some cases, the switching amount per unit time is inevitably increased as a result of increasing the switching amount per unit action, or the switching amount per unit action is inevitably increased as a result of increasing the switching amount per unit time. Those cases include the first display example when the length of one switching action is limited to a given range, and the second to fourth display examples.
In the image display device 1 according to the embodiment of the present invention, the actions executed by the display control unit 5 and the actions of other components may be implemented by a control device such as a microcomputer. Further, all or some of the functions implemented by the control device may be written as a program so that they are implemented by running the program on a program executing device (e.g., a computer).
The image display device 1 of
The embodiment of the present invention has now been described. The present invention, however, is not limited thereto and can be carried out with various modifications, without departing from the spirit of the present invention.
The present invention is applicable to an image display device for displaying an image, typically, a display unit of an imaging device or a viewer, and to an image display method.