This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2009-114544, filed on May 11, 2009, the entire contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to an image search device and an image search method.
2. Description of the Related Art
Most modern digital cameras have large memory capacities that can store one thousand or more images so that the user can take images without worrying about the memory capacity. However, with the increase in the number of images storable in digital cameras, the user has to handle too many images during playback, which may increase the time taken to find a desired image amongst the taken images. Therefore, an efficient image search is needed.
An image search device according to an aspect of the present invention includes an image storage unit that stores therein a plurality of pieces of image data; an element-information storage unit that stores therein a plurality of pieces of element information indicative of element items assigned respectively to the pieces of the image data stored in the image storage unit; a target-image selecting unit that selects target image data from the image data stored in the image storage unit; a target-image displaying unit that displays the target image data selected by the target-image selecting unit on a display unit; a relevant-image selecting unit that selects a predetermined number of pieces of relevant image data that contain at least one of the pieces of element information identical to the piece of element information assigned to the target image data in such a manner that the predetermined number of pieces of relevant image data contain at least one image that has a different combination of the pieces of element information; and a relevant-image displaying unit that displays, on the display unit, the predetermined number of pieces of relevant image data selected by the relevant-image selecting unit.
An image search method according to another aspect of the present invention includes selecting target image data from image data stored; displaying the selected target image data; selecting, on the basis of a plurality of pieces of element information indicative of element items assigned respectively to the pieces of the image data, a predetermined number of pieces of relevant image data that contain at least one of the pieces of element information identical to the piece of element information assigned to the target image data in such a manner that the predetermined number of pieces of relevant image data contain at least one image that has a different combination of the pieces of element information; and displaying the predetermined number of pieces of relevant image data selected.
The above and other features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
Exemplary embodiments of the present invention are described in detail below with reference to the accompanying drawings. In the present embodiment, a digital camera is used as an image search device according to the present invention. It is noted that the present invention is not limited to these embodiments. The same parts are denoted with the same reference numerals in the drawings.
On the rear surface of the main body 3, as shown in
On the side surface of the main body 3, a card slot (not shown) is provided to connect the digital camera 1 to a portable recording medium (an external memory 22 shown in
When the digital camera 1 is powered on by operation of the power ON/OFF button 6 and the operation mode is switched to the shooting mode by operation of the mode dial 11, the digital camera 1 is ready to take an image. In the shooting mode, images of the subject formed by light coming through the imaging lens 4 are displayed frame by frame (for example, every 1/30 second) on the display unit 19 in real time as live preview images. The shooting mode includes various modes, such as a normal shooting mode, a sports-game mode, a landscape mode, a portrait mode, and a continuous shooting mode in which images of the subject are taken continuously for a short time. The user turns the mode dial 11 and selects a shooting mode according to the shooting scene. While checking the live preview images, the user presses the release button 5 and takes a still image or video images. While the power is on, the user can switch the operation mode to the playback mode by turning the mode dial 11 and enjoy the images taken by the digital camera 1 being displayed (replayed) on the display unit 19. When the user switches the operation mode to the search mode by turning the mode dial 11, the user can search for a desired image amongst the images taken at a previous time and display the desired image.
The configuration of the digital camera 1 is described below.
The imaging optical system 101 includes the imaging lens 4 (see
When, for example, the image data output from the A/D conversion circuit 104 is displayed on the display unit 19 as a frame to be displayed, the image buffer memory 17 stores therein the image data before the displaying. When data containing a taken image is read from the storage unit 20 (an internal memory 21 or the external memory 22) and then displayed on the display unit 19, the image buffer memory 17 stores therein the read image data before the displaying. When data containing a taken image output from the A/D conversion circuit 104 is stored in the storage unit 20, the image buffer memory 17 stores therein the image data before the storing.
Upon receiving the image data from the imaging unit 100, the image processing unit 15 processes the image data using various techniques and converts the processed image data into image data suitable for the storing or the displaying. For example, the image processing unit 15 converts a taken image into a small-size image for display (thumbnail), thereby creating data containing the thumbnail (thumbnail data).
The image processing unit 15 extracts feature information from data containing a taken image. For example, the image processing unit 15 extracts the feature information using a face detection technology disclosed in, for example, U.S. Pat. No. 7,324,670. During the face detection process, a region of a face (face region) is detected in the image data using, for example, pattern matching, and various facial parts, such as the right and left eyes, the nose, and the lips, are detected in the image data according to the result of the face region detection. The image processing unit 15 according to the first embodiment performs the face detection process, counts the detected faces in accordance with the result of the face detection, and extracts the number of the detected faces as the feature information. The image processing unit 15 also recognizes the landscape (background) in accordance with a result of the image processing, detects a mountain, a river, a sea, a building, etc., in the image, and extracts them as the feature information.
When data containing a taken image is stored in the storage unit 20, the compressing/decompressing unit 16 compresses the data using, for example, JPEG (Joint Photographic Experts Group) compression. When the image data is read from the storage unit 20 and then displayed on the display unit 19, the compressing/decompressing unit 16 decompresses the image data.
The displaying unit 18 displays the image data to be displayed on the display unit 19. For example, if the shooting mode is selected, the displaying unit 18 displays, continuously on the display unit 19 as live preview images, images that have been taken frame by frame by the imaging element 102 and then subjected to the image processing by the image processing unit 15. If the playback mode is selected, the displaying unit 18 displays, on the display unit 19, an image that has been read from the storage unit 20 and then subjected to the image processing by the image processing unit 15. The display unit 19 displays thereon not only taken images and live preview images but also setting screens that contain various setting information about the digital camera 1. The display unit 19 is, for example, a TFT LCD.
The wireless I/F 24 is an interface that connects the digital camera 1 to an external device so that image data is wirelessly transferred between them. The wireless I/F 24 is an interface compatible with transmission standards, for example, wireless USB (WUSB), ultra wideband (UWB) wireless transmission, optical wireless transmission based on Infrared Data Association (IrDA) specifications, etc.
The wired I/F 23 is an interface that connects the digital camera 1 to an external device so that image data is transferred between them via a communication cable. The wired I/F 23 is an interface compatible with, for example, USB standards and IEEE1394 standards.
The storage unit 20 includes the internal memory 21 and the external memory 22. The internal memory 21 and the external memory 22 are each made up of a recording medium, a reading/writing device that reads/writes from/to the recording medium, etc. The recording medium can be, for example, a semiconductor memory, such as an updatable flash memory or a ROM, an internal hard disk or an external hard disk connected to the digital camera 1 via a data communication terminal, or a memory card. The internal memory 21 and the external memory 22 are one or a combination of recording media that are selected as appropriate from the above-described types. The storage unit 20 works as an image storage unit, i.e., stores therein data containing images taken by the imaging unit 100. The storage unit 20 stores therein not only the image data but also sound data that is collected by the sound collecting unit 26 at the time of, for example, shooting.
The operation unit 25 receives various user instructions, such as an instruction specifying the time to take an image, an instruction to switch among the operation modes, which include the shooting modes, the playback mode, and the search mode, and an instruction to set shooting configurations, and sends a signal of the instruction to the CPU 27. The operation unit 25 is made up of, for example, button switches each assigned to a function, dials, and sensors. The operation unit 25 includes the release button 5, the power ON/OFF button 6 shown in
The sound collecting unit 26 is a sound collector (microphone) that collects sounds or voices of the subject and outputs a signal of the collected sounds to the CPU 27.
The CPU 27 reads a camera program from the flash ROM 29 in accordance with, for example, the instruction signal received from the operation unit 25, executes the read program, and sends instructions or data to the target units of the digital camera 1, thereby controlling the operation of the digital camera 1. For example, the CPU 27 performs a process for causing the imaging optical system 101 to perform AF control or zoom control and a process for causing the imaging unit 100 to create data containing a taken image. The CPU 27 assigns image numbers to the taken images in chronological shooting order and stores the created image data in the storage unit 20 in association with the image number or information indicative of the shooting time (shooting date and time). The CPU 27 performs a process for assigning element information to a taken image and storing the taken image and the element information in an element information database (DB) 291 of the flash ROM 29 and a process for, when a taken image is displayed as a target image on a screen, setting the image number of the target image as target-image log information 293. The CPU 27 performs a process for creating sound data from the sound signal collected by the sound collecting unit 26, a process for storing the created sound data, a process for setting various shooting conditions, and a process for displaying image data on the display unit 19. Moreover, the CPU 27 performs a process for outputting the sound data from a speaker (not shown), a process for inputting/outputting data via the wired I/F 23 or the wireless I/F 24, a process for controlling various operations of the digital camera 1, and a process for implementing various functions of the digital camera 1.
The flash ROM 29 is an electrically erasable nonvolatile memory and stores therein various camera programs and data used in the camera programs. The camera programs are used to activate the digital camera 1 and implement various functions of the digital camera 1. The flash ROM 29 works as an element-information storage unit, i.e., stores therein the element information DB 291. The element information DB 291 contains element information assigned to every taken image stored in the storage unit 20.
In the first embodiment, during creation of a taken image, four element items are assigned to the taken image: the shooting date, the shooting mode, the number of human figures (hereinafter, "No. of human figures"), and the type of the subject (hereinafter, "subject type"). The element items are expressed by values (element information). For example, the CPU 27 extracts the shooting date from the shooting date and time and sets the extracted shooting date as the element item "shooting date". The CPU 27 sets the shooting mode in which the image is taken, selected from the above-described shooting modes that include the normal shooting mode, the sports-game mode, the landscape mode, and the portrait mode, as the element item "shooting mode". The CPU 27 sets the feature information extracted by the image processing unit 15 as the "No. of human figures" or the "subject type". For example, the CPU 27 sets the number of faces detected in the image data as the element item "No. of human figures" in accordance with a result of the face detection process performed by the image processing unit 15. Moreover, the CPU 27 sets the type of the landscape as the element item "subject type" on the basis of the landscape recognized/detected by the image processing unit 15 in the image data.
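The element-information assignment described above can be pictured with a short sketch. The following Python fragment is a hypothetical illustration only; the embodiment does not specify a concrete data layout, so the function name, field names, and date format are assumptions.

```python
# Hypothetical sketch of building the element information assigned to one
# taken image. The four element items match the ones named in the text;
# the record layout itself is an assumption, not the embodiment's format.

def assign_element_info(shooting_datetime, shooting_mode, face_count, subject_type):
    """Build the element-information record for one taken image."""
    # The "shooting date" element item keeps only the date portion of
    # the shooting date and time.
    shooting_date = shooting_datetime.split(" ")[0]
    return {
        "shooting date": shooting_date,
        "shooting mode": shooting_mode,
        "No. of human figures": face_count,
        "subject type": subject_type,
    }

info = assign_element_info("2009-05-11 10:30", "portrait mode", 2, "mountain")
```

In this sketch, the face count supplied by the face detection process and the subject type supplied by the landscape recognition are passed in as already-extracted feature information.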
The flash ROM 29 works as a target-image log storage unit, i.e., stores therein the target-image log information 293 and manages logs of images that are set as the target image in the search mode. For example, the target-image log information 293 contains the image numbers of images that have been displayed on a screen as the target image in the search mode arranged in chronological search order.
The search mode, which is one of the operation modes of the digital camera 1, is described below.
While checking the image search screen, the user moves a cursor C11 to a desired thumbnail by operation of the arrow pad 8 and then presses the enter button 9, thereby selecting the desired image. For example, if the user presses the enter button 9 in the situation as shown in
While checking the target-image display screen, the user moves a cursor C21 to a desired thumbnail by operation of the arrow pad 8 and then presses the enter button 9, thereby selecting the desired on-screen relevant image. Suppose the case where the user presses the left arrow of the arrow pad 8 in the situation where the cursor C21 is on a left-most thumbnail I211 in the relevant-image display area E21 (the situation shown in
How the on-screen relevant images are selected according to the first embodiment is described below.
When the target image is selected, in the first embodiment, the process for selecting relevant images from all the taken images stored in the storage unit 20 and selecting on-screen relevant images from the relevant images (hereinafter, “on-screen relevant image setting process”) is performed. During the on-screen relevant image setting process, an image having at least one element item identical to the corresponding piece of the element information assigned to the target image is selected as a relevant image. After that, a group of relevant images is made on the basis of their element information and one image is selected from the group as a representative relevant image.
The representative relevant image, which has been selected from the group of the relevant images, the solo relevant image, and the target image are set as the on-screen relevant images.
After that, data containing the images that are set as the on-screen relevant images in the above-described manner is displayed on the relevant-image display area of the target-image display screen as thumbnail-size images, such that a set of a predetermined number (e.g., five) of images is displayed at a time in chronological shooting order. In the example shown in
When the user is looking at an image taken at a previous time, the image may remind the user of images that are relevant to the image, for example, images of the same subject or another subject alike and images taken in almost the same situations. If each element item, such as the “shooting date”, the “No. of human figures”, and the “subject type”, is set to each taken image, it is possible to select the relevant images in the above manner. However, in the case where the relevant images are selected on the basis of the element information assigned to each taken image, if every image having one or more pieces of identical element information is selected as the relevant images, the relevant images may include too many similar images.
In the first embodiment, an image having at least one element item identical to the element item of the target image is selected as a relevant image in the same manner as the above-described conventional technology. In contrast to the conventional manner, if two or more relevant images have the element items each identically set, then a group that contains the relevant images is made and only one image is selected for display from the group. In other words, a predetermined number of (e.g., five in the present embodiment) on-screen relevant images that are displayed on the relevant-image display area of the target-image display screen in the thumbnail-size images are selected in such a manner that not all on-screen relevant images have the combination of the element items identically set.
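The selection and grouping just described can be sketched as follows. This is a minimal illustration under assumed record and field names, not the embodiment's actual implementation; the representative of each group is taken to be the image with the latest shooting date and time, as stated later for Step b11.

```python
# A minimal sketch of the first embodiment's selection: every stored image
# sharing at least one element item with the target becomes a relevant
# image; relevant images whose element-item combinations are identical form
# a group, and only one representative (the latest-shot one) is kept for
# display together with the target image. Records are hypothetical dicts.

def select_on_screen_relevant_images(target, images):
    # An image is relevant if at least one element item matches the target.
    relevant = [
        img for img in images
        if img is not target
        and any(img["elements"][k] == target["elements"][k]
                for k in target["elements"])
    ]
    # Group relevant images by their full combination of element items and
    # keep the image with the latest shooting time as the representative.
    groups = {}
    for img in relevant:
        key = tuple(sorted(img["elements"].items()))
        if key not in groups or img["shot_at"] > groups[key]["shot_at"]:
            groups[key] = img
    # The representatives (and solo relevant images, which form groups of
    # one) are displayed together with the target image.
    return [target] + sorted(groups.values(), key=lambda i: i["shot_at"])
```

A solo relevant image needs no special handling here: it simply forms a group of one and is its own representative.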
If the search mode is selected as the operation mode (Step a3: Yes), the CPU 27 displays an image search screen on the display unit 19 where a list of a predetermined number of thumbnails of taken images are arranged in the chronological shooting order (Step a7). After that, the CPU 27 receives an instruction to select the target image. When one thumbnail is selected from the list of thumbnails and the instruction to select the target image is received (Step a9: Yes), the CPU 27 adds the image number of the selected target image to the target-image log information 293, thereby updating the target-image log information 293 (Step a11). After that, the CPU 27 performs an on-screen relevant image setting process (Step a13).
As shown in
After the relevant images are selected by checking the element information assigned to every taken image except the target image, the CPU 27 determines whether any relevant images have the combination of the element items identically set. If the determination is positive (Step b9: Yes), the CPU 27 makes a group of these relevant images and sets the relevant image having the latest shooting date and time as the representative relevant image (Step b11). The CPU 27 determines the representative relevant image set at Step b11, the solo relevant image that cannot make a group with another relevant image, and the target image to be the on-screen relevant images (Step b13). After that, the process control returns to Step a13 of
If the determination at Step b9 is negative (Step b9: No), the CPU 27 determines the relevant images selected at Step b7 and the target image to be the on-screen relevant images (Step b15). After that, the process control returns to Step a13 of
If the determination at Step b5 is negative (Step b5: No), the CPU 27 determines the target image to be the on-screen relevant image (Step b17). After that, the process control returns to Step a13 of
At Step a15, the CPU 27 displays the target-image display screen shown in
After that, the CPU 27 operates according to user instructions. As shown in
If the cursor is on either the left side or the right side of the relevant-image display area (Step a21: Yes), the CPU 27 selects a predetermined number of (e.g., five) new images to be displayed from the on-screen relevant images in accordance with the direction specified by the instruction received via the arrow pad 8 (right or left) (Step a25). The CPU 27 arranges the thumbnails of the on-screen relevant images to be displayed, thereby updating the display of the relevant-image display area (Step a27).
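Step a25 can be pictured as a simple paging computation over the list of on-screen relevant images. The sketch below is an assumption about the paging behavior: it takes the on-screen relevant images as a chronologically ordered list and advances or rewinds the displayed window by the predetermined number at a time; the function and its parameters are hypothetical.

```python
# Hypothetical sketch of selecting the next set of thumbnails for the
# relevant-image display area (Step a25). `on_screen` is the full ordered
# list of on-screen relevant images; `start` indexes the left-most
# displayed image; `direction` is +1 for right, -1 for left.

def next_window(on_screen, start, direction, n=5):
    """Return the new window of n images and its start index."""
    if direction > 0:
        # Page right, but never past the last full window.
        start = min(start + n, max(len(on_screen) - n, 0))
    else:
        # Page left, but never before the first image.
        start = max(start - n, 0)
    return on_screen[start:start + n], start
```

Clamping at both ends keeps the display area filled even when fewer than n images remain in the paging direction.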
If an instruction is received via the enter button 9 (Step a29: Yes), the CPU 27 adds the image number of the on-screen relevant image indicated by the cursor to the target-image log information 293, thereby updating the target-image log information 293 (Step a31). After that, the process control goes to Step a13 of
If a log display instruction is received (Step a33: Yes), the CPU 27 performs a log display process (Step a35). It is assumed that, for example, the menu button 7 is assigned to the log display function during the search mode. In this case, when an instruction is received via the menu button 7, the process control goes to Step a35. It is noted that a member other than the menu button 7 can be assigned to the log display function; it is also allowable to assign the log display function to an additional dedicated member.
As shown in
If the target-image log information 293 has the image numbers, i.e., one or more taken images have been displayed on a screen as the target image (Step c3: Yes), the CPU 27 displays a list of thumbnails of the taken images corresponding to the read image numbers arranged in chronological search order (Step c7). The cursor is on, for example, the thumbnail of an image having the latest shooting date and time.
After that, if an instruction is received via the arrow pad 8 (Step c9: Yes), the CPU 27 moves the cursor in the direction specified by the instruction received via the arrow pad 8, thereby updating the display (Step c11). The CPU 27 can receive an instruction to select a taken image. If one thumbnail is selected from the list of the thumbnails displayed at Step c7 and the instruction to select the taken image is received (Step c13: Yes), the process control returns to Step a35 of
If an instruction to complete the log display process is received (Step c15: Yes), the process control returns to Step a35 of
At Step a37 of
At Step a39, the CPU 27 determines whether an instruction to switch between operation modes is received. If the operation mode is switched to a mode other than the search mode, such as the shooting mode and the playback mode (Step a39: Yes), the process control goes to Step a5 of
At Step a41, the CPU 27 determines whether an instruction to power the digital camera 1 off is received via the power ON/OFF button 6. If an instruction to power the digital camera 1 off is received (Step a41: Yes), the process control goes to end. In this case, the power supply to the units of the digital camera 1 is stopped and the digital camera 1 shifts to the power-off state. If no instruction to power the digital camera 1 off is received (Step a41: No), the process control returns to Step a17 and the CPU 27 waits for a user instruction.
As described above, in the first embodiment, an image having at least one element item identical to the corresponding piece of the element information assigned to the target image is selected from all of the taken images except the target image as a relevant image. Moreover, if there are two or more relevant images that have the combination of the element items identically set, a group of these relevant images is made and one image (e.g., image having the latest shooting date and time) is selected from the group as the representative relevant image. The representative relevant image, the solo relevant image that cannot make a group with another relevant image, and the target image are set as the on-screen relevant images and they are displayed on a screen together with the target image. Therefore, the user can check the images relevant to the target image for search in an efficient manner.
A second embodiment is described below.
In the above-described first embodiment, when the target image is selected, images relevant to the target image are selected and on-screen relevant images are then selected. After that, in response to a target-image selecting instruction or an instruction via the arrow pad 8, on-screen relevant images to be displayed are selected from the images that are set as the on-screen relevant images in the above manner. In the second embodiment, in contrast, each time a target-image selecting instruction or an instruction via the arrow pad 8 is received, on-screen relevant images are selected and the selected on-screen relevant images are displayed. More particularly, during the on-screen relevant image setting process, the taken images are checked in a specified sequence and, if a checked image has at least one element item identical to the corresponding piece of the element information assigned to the target image (not shown), the checked image is selected as a relevant image. If the checked image selected as the relevant image has a combination of the element items different from that of every on-screen relevant image that has already been set as an image to be displayed on the updated screen, the checked image is set as an on-screen relevant image. If the checked image has the combination of the element items identically set, the checked image is set as a member of the group that contains the on-screen relevant image having the identical combination of the element items.
If the on-screen relevant image setting process is performed in response to an instruction to select the target image, the sequence is determined to be according to ascending chronological shooting order and the taken images are checked in sequence according to ascending chronological shooting order. On the other hand, if the on-screen relevant image setting process is performed in response to an instruction received via the arrow pad 8, the sequence is determined depending on the direction specified by the instruction received via the arrow pad 8 (right or left). If, for example, the direction specified by the instruction received via the arrow pad 8 is right, the taken images with their shooting date and time later than the shooting date and time of the on-screen relevant images currently appearing on the relevant-image display area are checked in sequence according to ascending chronological shooting order. If the direction specified by the instruction received via the arrow pad 8 is left, the taken images with their shooting date and time earlier than the shooting date and time of the on-screen relevant images currently appearing on the relevant-image display area are checked in sequence according to descending chronological shooting order.
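The incremental process of the second embodiment (Steps e5 to e17, described below) can be sketched as follows. The record layout and function name are hypothetical; the sequence argument stands for the taken images already ordered according to the rules above (ascending or descending chronological shooting order, depending on the triggering instruction).

```python
# A sketch of the second embodiment's setting process: images are checked
# one by one in the specified sequence, each match becomes a relevant
# image, and it is kept as an on-screen relevant image only if its
# element-item combination differs from every image already set; otherwise
# it joins the group of the matching on-screen image. Checking stops once
# n on-screen relevant images are set.

def set_on_screen_relevant_images(target, sequence, n=5):
    on_screen = []   # images set for display, in checked order
    groups = {}      # element-item combination -> grouped relevant images
    for img in sequence:
        # At least one element item must match the target (cf. Step e5).
        if not any(img["elements"][k] == target["elements"][k]
                   for k in target["elements"]):
            continue
        key = tuple(sorted(img["elements"].items()))
        if key in groups:
            # Same combination as an already-set image (cf. Step e11).
            groups[key].append(img)
        else:
            # New combination: set it as an on-screen image (cf. Step e13).
            groups[key] = [img]
            on_screen.append(img)
        if len(on_screen) == n:   # predetermined number reached (Step e15)
            break
    return on_screen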
Suppose there is a situation where, when the element information assigned to the taken images is checked in sequence, the taken image having the image number “147” shown in
It is assumed that, after the group G41 is made, the taken image having the image number “281” is selected as a relevant image Ir43. The combination of the element items is compared with the pieces of the element information assigned to the on-screen relevant image Id41. It is then found that element information E41 indicative of the “shooting mode” and element information E43 indicative of the “No. of human figures”, both surrounded by the dotted blocks shown in
As shown in
The CPU 27 determines, on the basis of the element information assigned to the target image selected at Step e3, whether the checked image has at least one element item identical to the corresponding piece of element information assigned to the target image that has been read at Step e1. If the checked image has no identical element item (Step e5: No), the process control goes to Step e15. If the checked image has an identical element item (Step e5: Yes), the CPU 27 selects the checked image as a relevant image (Step e7).
After that, if the CPU 27 determines that one or more images have already been set as on-screen relevant images to be displayed on the updated screen and the checked image that is selected at Step e7 as the relevant image has the combination of the element items the same as the combination of the element items assigned to any of the on-screen relevant images (Step e9: Yes), the checked image is set as a member of the group that contains the on-screen relevant image having the same combination of the element items (Step e11). The process control then goes to Step e15. If the checked image that is selected as the relevant image has a different combination of the element items (Step e9: No), the CPU 27 sets the checked image as an on-screen relevant image (Step e13) and the process control goes to Step e15.
At Step e15, the CPU 27 determines whether the number of the on-screen relevant images reaches a predetermined number n (e.g., n=5). If the determination is positive (Step e15: Yes), the process control returns to Step d13 of
If the determination is negative (Step e15: No), the CPU 27 determines whether there is any taken image to be checked. If taken images are selected in ascending chronological shooting order and the currently checked image is the image having the latest shooting date and time, there is not any more taken image to be checked (Step e17: No); therefore, the process control returns to Step d13 of
At Step d15, the CPU 27 displays an updated target-image display screen on the display unit 19 with the target image appearing on the target-image display area at the center and the thumbnails of the updated on-screen relevant images appearing on the relevant-image display area in accordance with the result of the on-screen relevant image setting process at Step d13.
After that, the CPU 27 performs a process corresponding to the user instruction. In the second embodiment, as shown in
If an instruction is received via the enter button 9 (Step a29: Yes), the CPU 27 adds the image number of the on-screen relevant image indicated by the cursor to the target-image log information 293, thereby updating the target-image log information 293 (Step a31). The process control then goes to Step d13 of
If a log display instruction is received (Step a33: Yes), the CPU 27 performs the log display process (Step a35). If an instruction to select a taken image is received during the log display process performed at Step a35 (Step a37: Yes), the process control goes to Step d13 of
As described above, in the second embodiment, when the target image is displayed, an image having at least one element item identical to the element item of the target image is selected as a relevant image. Moreover, if there are two or more relevant images that have the element items each identically set, a group of these relevant images is made and one image is selected from the group for display. Therefore, the same effects that have been obtained in the first embodiment can be obtained.
A third embodiment is described below.
In the third embodiment, when the on-screen relevant image setting process is performed in response to an instruction to select the target image or an instruction received via the arrow pad 8, successive n number of (e.g., n=5) relevant images are extracted from the relevant images arranged in specified sequence. If all the extracted relevant images have the combination of the element items identically set, the CPU 27 searches for an image that has the combination of the element items differently set amongst the relevant images after the last-extracted relevant image (n-th image: the fifth image on the right side shown in
Suppose, for example, that there is the situation shown in
When the relevant image Ir77, which has a combination of element items different from that assigned to the extracted relevant images, is found, the first to the third extracted relevant images Ir71 to Ir73 are set as on-screen relevant images Id71 to Id73. Moreover, the successive images from the fifth (n-th) extracted relevant image to the relevant image immediately before the search position (in the example shown in
Data containing the images set as the on-screen relevant images in this manner is displayed on the relevant-image display area of the target-image display screen as thumbnail-size images. As shown in
After that, the CPU 27 checks, in a specified sequence, every taken image whose image data is stored in the storage unit 20 and whose element information is stored in the element information DB 291 (Step g15). The CPU 27 selects, as relevant images, taken images having at least one element item identical to the corresponding piece of element information assigned to the target image read at Step g13 (Step g17). If the process at Step g15 is performed in response to the selection of the target image, the taken images are checked in ascending chronological shooting order. On the other hand, if the process at Step g15 is performed in response to a later-described instruction received via the arrow pad 8 at Step a17 of
After that, the CPU 27 performs an on-screen relevant image setting process (Step g19).
After that, the CPU 27 determines whether all the extracted relevant images have identical combinations of element items. If one or more of the extracted relevant images have a different combination of element items (Step h3: No), the process control goes to Step h15 and the CPU 27 sets the relevant images extracted at Step h1 as the on-screen relevant images. After that, the process control returns to Step g19 of
If all the extracted relevant images have identical combinations of element items (i.e., every element item is identically set) (Step h3: Yes), the CPU 27 checks, in sequence, the element information assigned to the relevant images after the last-extracted relevant image (n-th extracted relevant image) and searches for a relevant image having a combination of element items different from that assigned to the extracted relevant images (Step h5). If such an image is found (Step h7: Yes), the CPU 27 makes a group of the n−1-th extracted relevant image and the successive images from the n-th extracted relevant image to the relevant image immediately before the search position, and replaces the n−1-th extracted relevant image with, for example, the image having the latest shooting date and time selected from the group (Step h9). After that, the CPU 27 replaces the n-th extracted relevant image with the relevant image at the search position (Step h11).
After that, the process control goes to Step h15 and the CPU 27 sets the extracted relevant images, including the new n−1-th and n-th extracted relevant images, as the on-screen relevant images. The process control then returns to Step g19 of
If an image having different element information is not found (Step h7: No), the CPU 27 displays a message on the display unit 19 indicating that all the relevant images to be displayed (on-screen relevant images) have the same element information (Step h13). After that, the process control goes to Step h15 and the relevant images extracted at Step h1 are set as the on-screen relevant images. The process control then returns to Step g19 of
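Steps h1 to h15 above can be sketched as follows. This is a simplified Python sketch under assumed data structures: each relevant image is a dictionary with an `elements` mapping and a `shot_at` timestamp, both names being assumptions, and the list `relevant` is already arranged in the checking sequence.

```python
def set_onscreen(relevant, n=5):
    """Sketch of the on-screen relevant image setting process (Steps h1-h15)."""
    combo = lambda img: tuple(sorted(img["elements"].items()))
    extracted = relevant[:n]                      # Step h1: extract n successive images
    first = combo(extracted[0])
    if any(combo(img) != first for img in extracted):
        return extracted                          # Steps h3 (No) and h15
    # Step h5: search after the n-th image for a different combination.
    for pos in range(n, len(relevant)):
        if combo(relevant[pos]) != first:         # Step h7: Yes
            # Step h9: group the n-1-th image with the images up to the
            # position before the found image; keep the latest-shot one.
            group = relevant[n - 2:pos]
            extracted[n - 2] = max(group, key=lambda img: img["shot_at"])
            extracted[n - 1] = relevant[pos]      # Step h11
            return extracted                      # Step h15
    # Step h7: No; the caller displays the same-element-information message
    # (Step h13) and uses the images extracted at Step h1 (Step h15).
    return extracted
```

With n=5 and six identically-combined images followed by a differing one, the sketch returns the first three extracted images, the latest image of the group, and the differing image, mirroring the Ir71–Ir77 example.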
At Step g21, the CPU 27 displays the target-image display screen shown in
After that, the CPU 27 performs a process corresponding to the user instruction. In the third embodiment, as shown in
If an instruction is received via the enter button 9 (Step a29: Yes), the CPU 27 adds the image number of the on-screen relevant image indicated by the cursor to the target-image log information 293, thereby updating the target-image log information 293 (Step a31). The process control then goes to Step g13 of
If a log display instruction is received (Step a33: Yes), the CPU 27 performs the log display process (Step a35). If an instruction to select a taken image is received during the log display process performed at Step a35 (Step a37: Yes), the process control goes to Step g13 of
As described above, in the third embodiment, when the target image is displayed, images each having at least one element item identical to the corresponding element item of the target image are selected as relevant images. Even if the successive relevant images have identical combinations of element items, the images appearing on the relevant-image display area of the target-image display screen include at least one relevant image having a different combination of element items. Therefore, the same effects as in the first embodiment can be obtained.
A fourth embodiment is described below.
In the fourth embodiment, serial numbers are assigned to the individual continuous shooting images taken in the continuous shooting mode so that the continuous shooting images are identifiable. If a continuous shooting image with a serial number is selected as a relevant image, the continuous shooting images are grouped and one image (e.g., the image having the earliest shooting date and time) is selected from them as the representative relevant image. In the example shown in
In this case, an on-screen relevant image setting process shown in
During the on-screen relevant image setting process according to the fourth embodiment, after the CPU 27 selects, as shown in
On the other hand, if it is determined at Step i9 that no relevant image is a continuous shooting image, the CPU 27 sets the relevant images selected at Step b7 and the target image as the on-screen relevant images (Step i15).
As described above, in the fourth embodiment, when the target image is displayed, images each having at least one element item identical to the corresponding element item of the target image are selected as relevant images. If any of the relevant images is a continuous shooting image, the continuous shooting images are grouped and one image (e.g., the image having the earliest shooting date and time) is selected as the representative image from the group. Therefore, the same effects as in the first embodiment can be obtained.
In the above-described fourth embodiment, a group containing the set of four continuous shooting images taken in the continuous shooting mode is made, and one image is selected as the representative image from the group. Alternatively, it is allowable to determine, on the basis of information about the shooting date and time, that images taken at intervals of a predetermined time (e.g., three seconds) or shorter are a set of continuous shooting images, make a group of them, and select one image as the representative image from the group.
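The interval-based variant can be sketched as follows. This is an illustrative Python sketch, assuming each image record carries a numeric `shot_at` timestamp in seconds (the field name is an assumption); the earliest image of each detected burst is kept as the representative, following the fourth embodiment.

```python
def group_by_interval(images, max_gap=3.0):
    """Group images taken at intervals of max_gap seconds or shorter into
    continuous-shooting sets; return one representative (earliest) per set."""
    images = sorted(images, key=lambda img: img["shot_at"])
    representatives = []
    prev_time = None
    for img in images:
        # A gap longer than max_gap starts a new continuous-shooting group;
        # the first image of each group is its (earliest) representative.
        if prev_time is None or img["shot_at"] - prev_time > max_gap:
            representatives.append(img)
        prev_time = img["shot_at"]
    return representatives
```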
Although, in the above-described embodiments, the element items include the “shooting date”, the “shooting mode”, the “No. of human figures”, and the “subject type”, the types and the number of the element items can be changed appropriately.
For example, it is allowable to extract information about types of events, such as a sports festival, a birthday, or Christmas, from the image data in accordance with a result of image processing, the shooting date and time, or the like, and set the types of events as the element information. It is also allowable to estimate, using a well-known image recognition process, the sex, the facial expression, and the age of a human figure detected in the image data and set the estimated data as the element information. It is also allowable to register the face of a predetermined human figure, identify, using a well-known image recognition process, the human figure in the image data on the basis of the registered face, and set the identified human figure as the element information. It is also allowable to measure the degree of brightness (bright or dark) or the hue of each image on the basis of its pixel values, recognize the mood of each image, and set the mood as the element information.
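The brightness-based mood element mentioned above could be derived as in the following sketch. The threshold value and the representation of pixels as a flat list of luminance values are assumptions for illustration; the embodiments do not specify a particular method.

```python
def mood_element(luminances, threshold=128):
    """Derive a simple 'mood' element item from pixel luminance values:
    'bright' if the average luminance meets the assumed threshold, else 'dark'."""
    average = sum(luminances) / len(luminances)
    return "bright" if average >= threshold else "dark"
```

The returned string would then be stored in the element information DB like any other element item and used for relevant-image selection.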
It is also allowable to make the types of element items selectable according to, for example, a user instruction. When an instruction to select an element item is received from the user, information corresponding to the selected element item is extracted from each image stored in the storage unit 20 and the extracted information is set as the element information.
In the above-described embodiments, some relevant images are selected as on-screen relevant images by checking the relevant images in sequence according to chronological (shooting) order, and the on-screen relevant images are displayed on the relevant-image display area. However, it is also allowable to select a piece of important element information in accordance with, for example, a user instruction and check the taken images in sequence beginning with images having the important element information. In this case, the CPU 27 works as an importance setting unit, i.e., it sets a piece of element information as important element information in accordance with, for example, a user instruction and stores the important element information in the flash ROM 29. Suppose, for example, that the "portrait mode" of the element item "shooting mode" is set as the important element information. In this case, the CPU 27 checks the relevant images in sequence beginning with taken images whose shooting mode is "portrait mode". In another example, the mood of each image is set as a piece of element information in the same manner as in the above-described modification and "bright" is set as the important element information. In this case, the CPU 27 checks the relevant images in sequence beginning with taken images having "bright" as the element information. If the relevant images are checked in sequence according to chronological order in the same manner as in the above-described embodiments, the time taken to reach an image late in the chronological order may increase. In contrast, with the present modification, because a piece of element information can be set appropriately as the important element information, the user can search for images relevant to the target image efficiently.
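The importance-based checking order described above can be sketched as a stable reordering: images matching the important element information come first, and the original chronological order is preserved within each partition. This is an illustrative Python sketch; the `elements` field name is an assumption.

```python
def order_by_importance(images, important):
    """Reorder candidate images so those matching the important element
    information (an (item, value) pair, e.g. ("shooting_mode", "portrait"))
    are checked first; Python's stable sort keeps the chronological order
    within the matching and non-matching partitions."""
    item, value = important
    return sorted(images, key=lambda img: img["elements"].get(item) != value)
```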
The present invention is not limited to the embodiments described above. Another embodiment can be formed by appropriately combining components selected from the plurality of components disclosed in the embodiments. For example, one embodiment can be formed by eliminating some of the components disclosed in an above-described embodiment. Another embodiment can be formed by combining components appropriately selected from different embodiments.
Although, in the above-described embodiments, the digital camera is used as the image search device according to the present invention, some other devices can be used as the image search device, such as a camera cell-phone, a music player, a recording device, and a laptop personal computer. Moreover, the above-described technology can be used in an image search device that does not include the imaging unit 100 (camera), such as a personal computer that stores therein images and element information assigned to the images (element information DB).
According to the present invention, a predetermined number of images each having at least one piece of element information identical to the corresponding piece of element information assigned to a target image are selected as relevant images in such a manner that the relevant images include at least one relevant image having a combination of pieces of element information different from the combination assigned to the other relevant images. This enables an efficient search for images relevant to the target image.
Additional advantages and modifications will readily occur to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details and representative embodiments shown and described herein. Accordingly, various modifications may be made without departing from the spirit or scope of the general inventive concept as defined by the appended claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
2009114544 | May 2009 | JP | national