The present invention relates to a displaying program and a displaying apparatus.
Conventionally, various techniques have been examined with regard to image editing associated with digital images. For example, the invention of Patent Document 1 has disclosed a technique of performing image analysis on a target image of image editing, and based on the analysis result, prompting input of parameters or providing notification of appropriate parameters so that the user can select preferred image correcting parameters.
However, with the prior art, the user cannot select preferred parameters when the result of the image analysis is not correct. In addition, a sufficiently easy-to-understand and easy-to-use user interface is also required even for users lacking knowledge and experience.
It is a proposition of the present invention to provide, in displaying an image, a user interface which is easy for the user to understand and use.
A displaying program according to an aspect of the embodiment causes a computer to perform display control on a displaying apparatus, and includes a reading step of reading a target image to be displayed, a decision step of deciding, based on the target image, a content of a user interface for accepting various settings for the target image to be displayed on the displaying apparatus, and a displaying step of displaying the user interface on the displaying apparatus based on the content decided in the decision step.
In the decision step, the content of the user interface may be decided based on at least one of a factor being attribute information of the target image and a factor being a shooting condition when shooting the target image.
In addition, there may be further included an analyzing step performing an image analysis of the target image, in which in the decision step, the content of the user interface may be decided based on at least one of a factor being attribute information of the target image, a factor being a shooting condition when shooting the target image, and a factor being a result of the image analysis.
Additionally, in the decision step, the factor may be at least the result of the image analysis.
Furthermore, in the decision step, the content of the user interface for accepting various settings related to editing of the target image may be decided.
Moreover, the attribute information may include at least one of the data format of the target image and the data size of the target image.
In addition, the shooting condition may include at least one of a type of shooting mode being set when shooting the target image, an exposure condition when shooting the target image, a presence or an absence of flash light emission when shooting the target image, positional information when shooting the target image, and information of an external device being connected to the imaging apparatus when shooting the target image.
Additionally, in the analyzing step, at least one of a distribution analysis of brightness of the target image, a face detection detecting a face region included in the target image, a dust detection of the target image, a red-eye detection of the target image, a blur detection of the target image, a bokeh detection of the target image, a detection of tilt information of the target image, and a scene analysis of a subject included in the target image may be performed, and the result of the image analysis may include at least one of distribution information of the brightness, a result of the face detection, a result of the dust detection, a result of the red-eye detection, a result of the blur detection, a result of the bokeh detection, the tilt information, and a result of the scene analysis.
Furthermore, when deciding the content of the user interface based on a plurality of the factors in the decision step, the content of the user interface may be decided after appropriate weighting has been performed on the plurality of the factors.
Moreover, in the decision step, a content of a user interface having preferred items for the various settings which are visible to a user may be decided as the content of the user interface.
In addition, in the decision step, at least one of types of items related to the various settings, a display order of the items, an arrangement of the items, and a display format of the items may be decided as the content of the user interface.
Additionally, in the decision step, appropriate weighting may be performed on the items and at least one of types of items related to the various settings, a display order of the items, an arrangement of the items, and a display format of the items may be decided according to the weighting.
Additionally, in the decision step, the content of the user interface may be decided based on a content of the various settings being performed previously.
A displaying apparatus according to an aspect of the embodiment includes a displaying part displaying an image, a reading part reading a target image to be displayed on the displaying part, a decision part deciding a content of a user interface for accepting various settings for the target image to be displayed on the displaying part based on the target image, and a display controlling part displaying the user interface on the displaying part based on the content being decided by the decision part.
A computer 11 illustrated in
The data reading part 12 is used in reading, from the outside, data of the target image or the displaying program described above. For example, the data reading part 12 is constituted by a reading device (reading apparatus of optical disk, magnetic disk, magneto-optical disk, etc.) which obtains data from a removable storage medium, and a communication device (USB interface, LAN module, wireless LAN module, etc.) which communicates with an external device in conformity with a well-known communication standard.
The storage apparatus 13 is constituted by, for example, a storage medium such as a hard disk, a nonvolatile semiconductor memory, or the like. The storage apparatus 13 has recorded therein a displaying program and various data required for execution of the program. Meanwhile, the storage apparatus 13 can also store data of a target image which has been read from the data reading part 12.
The CPU 14 is a processor which controls each part of the computer 11 in an integrated manner. The CPU 14 functions as an image analyzing part 23, a UI decision part 24, and an image processing part 25 by execution of the displaying program described above (the respective operations of the image analyzing part 23, the UI decision part 24, and the image processing part 25 will be described later).
The memory 15 temporarily stores various results from the displaying program. The memory 15 includes, for example, a volatile SDRAM or the like.
In the following, an operation example of the displaying apparatus in the first embodiment will be described, referring to the flowchart of
Each process illustrated in
In the example of
In the example of
Additionally, in the display examples of
Meanwhile, the displays of
In the following, description will be provided by exemplifying a case where a target image of image editing has been selected by user operation via the input device 18, in a state where the viewer illustrated in
(Step S101)
The CPU 14 obtains, from the storage apparatus 13 via the data reading part 12, data of a target image specified by the user. Meanwhile, when data of the target image is recorded outside of the computer 11, the CPU 14 obtains, from outside of the computer 11 via the input-output I/F 16, data of the target image specified by the user.
(Step S102)
The CPU 14 obtains attribute information from the target image obtained at step S101. The attribute information is information relating to attributes of the target image itself, such as the data format (RAW, JPEG, etc.) of the target image and the data size of the target image, and is typically included in auxiliary information such as ordinary EXIF information.
(Step S103)
The CPU 14 obtains, from the target image obtained at step S101, shooting conditions when the target image was shot. The shooting conditions include information relating to: the type of shooting mode (portrait mode, scenery mode, etc.) which was set when the target image was shot; exposure conditions (shutter speed, diaphragm, sensitivity, etc.) when the target image was shot; the presence or absence of flash light emission when the target image was shot; positional information (GPS information) when the target image was shot; external devices (an external flash apparatus, communication equipment, etc.) connected to the imaging apparatus when the target image was shot; or the like. The shooting conditions may be included in auxiliary information such as EXIF information, or may be recorded as data other than the data of the target image.
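The flow of steps S102 and S103 can be sketched as follows, assuming the auxiliary information has already been parsed into a dictionary. This is only an illustration; the key names are hypothetical placeholders, not actual EXIF tag identifiers.

```python
# Sketch of steps S102/S103: separating attribute information from shooting
# conditions in already-parsed image metadata. Key names are hypothetical.

ATTRIBUTE_KEYS = ("data_format", "data_size")          # step S102
SHOOTING_KEYS = ("shooting_mode", "exposure",          # step S103
                 "flash_fired", "gps", "external_device")

def split_metadata(metadata):
    """Separate attribute information from shooting conditions."""
    attributes = {k: metadata[k] for k in ATTRIBUTE_KEYS if k in metadata}
    shooting = {k: metadata[k] for k in SHOOTING_KEYS if k in metadata}
    return attributes, shooting

meta = {
    "data_format": "RAW",
    "data_size": 24_000_000,
    "shooting_mode": "portrait",
    "flash_fired": False,
}
attrs, shoot = split_metadata(meta)
print(attrs)   # {'data_format': 'RAW', 'data_size': 24000000}
print(shoot)   # {'shooting_mode': 'portrait', 'flash_fired': False}
```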
(Step S104)
The CPU 14 performs, with the image analyzing part 23, image analysis of the target image obtained at step S101.
The image analysis performed by the image analyzing part 23 includes the following:
Meanwhile, these image analyses are exemplary, and the image analyzing part 23 may perform image analysis other than that described above. In addition, the target of image analysis need not be the target image itself obtained at step S101. For example, image analysis may be carried out using a reduced image (thumbnail image, quick view image, etc.) recorded together with the target image, or may be carried out using an image obtained by performing predetermined thinning processing or resizing processing on the target image.
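One of the analyses the image analyzing part 23 could perform at step S104, the distribution analysis of brightness, can be sketched as follows. The input is a flat list of 8-bit luminance values; as noted above, a reduced image (thumbnail) may be analyzed instead of the full target image. The thresholds are illustrative assumptions.

```python
# Sketch of a brightness-distribution analysis (one analysis of step S104).
# Threshold values 250 and 5 are illustrative assumptions, not values from
# the embodiment.

def brightness_stats(pixels, highlight=250, shadow=5):
    """Fractions of washed-out highlights / crushed shadows, plus tonal range."""
    n = len(pixels)
    return {
        "highlight_ratio": sum(1 for p in pixels if p >= highlight) / n,
        "shadow_ratio": sum(1 for p in pixels if p <= shadow) / n,
        "tonal_range": max(pixels) - min(pixels),
    }

# A mostly overexposed test "image": 80 of 100 pixels at full white.
stats = brightness_stats([255] * 80 + [128] * 20)
print(stats["highlight_ratio"])  # 0.8
```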
In addition, the image analysis by the image analyzing part 23 may be performed before the target image is obtained at step S101. That is, the image analysis may be preliminarily performed when the target image is shot, when the target image is recorded, or after having recorded the target image, at a timing when the processing by the image analyzing part 23 is not performed.
Although the example of
(Step S105)
The CPU 14 decides, with a UI decision part 24, the user interface to be displayed on the monitor 19.
The UI decision part 24 decides the contents of the user interface to be displayed on the monitor 19, according to the attribute information obtained at step S102, the shooting conditions obtained at step S103, and the result of the image analysis performed at step S104. The user interface is one preferable for editing the image selected as the target image by the user. The contents of the user interface include types of items, display order of items, arrangement of items, and display format (display size, display color, etc.) of items.
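As one hypothetical data model, the four aspects of the UI contents named here can be sketched as a single record; the field names are assumptions made for illustration only.

```python
# Sketch of the four aspects of the UI contents decided at step S105:
# which items appear, in what order, where, and in what display format.
from dataclasses import dataclass, field

@dataclass
class UIContents:
    item_types: list                                    # types of items
    display_order: list                                 # order of items
    arrangement: dict = field(default_factory=dict)     # item -> window/region
    display_format: dict = field(default_factory=dict)  # item -> size, color

ui = UIContents(
    item_types=["exposure correction", "white balance"],
    display_order=["white balance", "exposure correction"],
    arrangement={"white balance": "window W2"},
    display_format={"white balance": {"display_size": "large"}},
)
print(ui.display_order[0])  # white balance
```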
Several examples will be illustrated in the following.
When the target image is a RAW format image, image editing such as “exposure correction”, “white balance”, “noise reduction”, or the like is frequently performed and effective as well. Therefore, the UI decision part 24 selects a user interface capable of setting details of such items.
When the target image is an image having many washed-out highlights, image editing such as “highlight adjustment” or the like is frequently performed and effective as well. In addition, when the target image is an image having a large difference between the white level and the black level, image editing such as “contrast adjustment” is frequently performed and effective as well. Therefore, the UI decision part 24 selects a user interface capable of setting details of such items.
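The two selection rules above can be sketched as thresholds on simple brightness statistics. The thresholds (0.2 and 200) and the returned item names are illustrative assumptions, not values from the embodiment.

```python
# Sketch of selecting editing items from brightness statistics:
# many washed-out highlights -> "highlight adjustment",
# large white/black difference -> "contrast adjustment".

def select_items(highlight_ratio, tonal_range):
    """Return editing items whose details the UI should allow setting."""
    items = []
    if highlight_ratio > 0.2:          # many washed-out highlights
        items.append("highlight adjustment")
    if tonal_range > 200:              # large white/black difference
        items.append("contrast adjustment")
    return items

print(select_items(0.5, 120))   # ['highlight adjustment']
print(select_items(0.1, 230))   # ['contrast adjustment']
```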
Meanwhile, the example of
When the target image is an “image including a person”, image editing such as “saturation adjustment” is frequently performed and effective as well. Therefore, the UI decision part 24 selects a user interface capable of setting details of the item.
Meanwhile, the example of
In the following, a preferable user interface will be described (illustration being omitted) with regard to respective items of the attribute information obtained at step S102, the shooting conditions obtained at step S103, and the result of the image analysis performed at step S104.
<Concerning the Attribute Information Obtained at Step S102>
Data Format of the Target Image (RAW, JPEG, Etc.)
When the data format of the target image is RAW format, a user interface capable of setting details of an item suitable for an image before compression is selected from the menu displayed on the window W2. In contrast, when the data format of the target image is a format such as JPEG format, a user interface capable of setting details of an item suitable for an image after compression is selected from the menu displayed on the window W2.
Data Size of the Target Image
When the data size of the target image is relatively large, a user interface capable of setting details of an item with a light processing load is selected from the menu displayed on the window W2. In contrast, when the data size of the target image is relatively small, a user interface capable of setting details of an item with a heavy processing load is selected from the menu displayed on the window W2.
In addition, when the data size of the target image is relatively large, the selection may be performed similarly to the case where the data format of the target image described above is RAW format, and when the data size of the target image is relatively small, the selection may be performed similarly to the case where the data format of the target image described above is a format such as JPEG or the like.
<Concerning the Shooting Conditions Obtained at Step S103>
Type of Shooting Mode Set when the Target Image is Shot
For example, when the shooting mode described above is a night view mode, a user interface capable of setting details of an item such as “noise reduction” or “cross filter” is selected from the menu displayed on the window W2. In addition, when the shooting mode described above is a shooting mode related to shooting a person, such as the portrait mode, the selection is performed in the same way as the case described referring to
Exposure Conditions when the Target Image is Shot
When it can be determined based on exposure conditions that blurring or fuzziness is very likely to be generated in the target image, a user interface capable of setting details of an item such as “contrast adjustment” or “sharpness adjustment” is selected from the menu displayed on the window W2.
Presence or Absence of Flash Light Emission when the Target Image is Shot
When flash light emission is performed at the time of shooting, a user interface capable of setting details of an item such as “white balance” is selected from the menu displayed on the window W2.
Positional Information (GPS Information) when the Target Image is Shot
When there exists GPS information at the time of shooting, a user interface capable of setting details of displaying positional information or map information is selected (not illustrated in the menu displayed on the window W2).
External Device Connected to the Imaging Apparatus when the Target Image is Shot
When there exists information related to an external device connected to the imaging apparatus when the target image is shot, a user interface capable of setting details of a preferred item is selected from the menu displayed on the window W2, according to the type or operation state of the external device.
<Concerning the Result of the Image Analysis Performed at Step S104>
Analysis of Distribution with Regard to Brightness of the Target Image
Selection is performed in the same way as the case described referring to
Face Detection which Detects a Face Region Included in the Target Image
When the target image is an image including a person, as a result of performing face detection, selection is performed in the same way as the case described referring to
Dust Detection with Regard to the Target Image
As a result of performing dust detection, when it is determined that there exists influence of dust on the target image, a user interface capable of setting details with regard to “removal of dust” (automatic removal, inquiring the user of necessity of dust removal) is selected (not illustrated in the menu displayed on the window W2).
Red-Eye Detection with Regard to the Target Image
When it is determined that there exists a red-eye part in the target image as a result of performing red-eye detection, a user interface capable of setting details relating to “red-eye correction” (automatic correction, inquiring the user of necessity of correcting the red-eye) is selected (not illustrated in the menu displayed on the window W2).
Blur Detection and Bokeh Detection with Regard to the Target Image
When it is determined that there are blurring and bokeh in the target image as a result of performing blur detection and bokeh detection, a user interface capable of setting details of an item such as “sharpness” is selected from the menu displayed on the window W2. Alternatively, a user interface capable of setting details with regard to “blur correction” and “bokeh correction” (automatic correction, inquiring the user of the necessity of the respective corrections, or the like) is selected (not illustrated in the menu displayed on the window W2).
Detection of Tilt Information with Regard to the Target Image
When the target image is inclined as a result of performing detection of tilt information, a user interface capable of setting details of an item such as “tilt adjustment” (automatic correction, inquiring the user of necessity of tilt adjustment or the like) is selected from the menu displayed on the window W2.
Meanwhile, the same also applies to the case where tilt information of the imaging apparatus at the time of shooting is included in the attribute information obtained at step S102.
Scene Analysis with Regard to a Subject Included in the Target Image
It suffices to assume a main subject included in the target image based on the result of performing the scene analysis, and to select a user interface capable of setting details of an item preferable for the assumed main subject from the respective items described above.
Furthermore, the UI decision part 24 may decide contents of the user interface, based on the contents of image editing previously performed.
That is, every time image editing is performed, the CPU 14 records the contents as a history according to the target image. Then, the UI decision part 24 determines, in selecting a user interface capable of setting details of an item from the menu displayed on the window W2, whether or not there exists a history of image editing with regard to the item, and when there exists such a history, the UI decision part 24 decides the contents of the user interface based on the history.
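The history-based decision described here can be sketched as follows: each completed edit is recorded per image, and items with an editing history are surfaced ahead of never-edited ones. The names and the ordering policy are illustrative assumptions.

```python
# Sketch of the history-based UI decision: record each edit per image and
# put previously edited items ahead of never-edited ones.
from collections import defaultdict

edit_history = defaultdict(list)   # image id -> list of edited items

def record_edit(image_id, item):
    edit_history[image_id].append(item)

def order_items(image_id, candidate_items):
    """Put previously edited items first (sort is stable, so ties keep order)."""
    seen = set(edit_history[image_id])
    return sorted(candidate_items, key=lambda item: item not in seen)

record_edit("img001", "contrast adjustment")
print(order_items("img001", ["white balance", "contrast adjustment"]))
# ['contrast adjustment', 'white balance']
```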
In the example of
Meanwhile, when there is a plurality of factors (attribute information obtained at step S102, shooting conditions obtained at step S103, respective items of the result of the image analysis performed at step S104) for deciding the user interface, the UI decision part 24 decides the contents of the user interface after having performed weighting on the factors as appropriate. Weighting makes it possible to provide the plurality of factors with priority and to decide a user interface.
Weighting in this case may be predetermined, may be determined by the UI decision part 24 each time, or may be determined by the user. In performing weighting, the contents of the user interface may be decided based on the logical sum (OR) of a plurality of factors, or the contents of the user interface may be decided based on the logical product (AND).
In addition, among the plurality of factors, the contents of the user interface may be decided based on only some of the factors. On which factors the decision of the contents of the user interface is based may be decided according to the weighting and priority described above, may be decided by the UI decision part 24 each time, or may be decided by the user.
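The weighting and the logical-sum/logical-product combination of a plurality of factors described above can be sketched as follows. The factor names, weight values, and items are illustrative assumptions; a factor with zero weight is simply excluded.

```python
# Sketch of combining a plurality of factors at step S105. Each factor
# "votes" for a set of items; factors with positive weight are combined
# by logical sum (OR, union) or logical product (AND, intersection).

def combine_factors(votes, weights, mode="or"):
    """votes: {factor: set of items}, weights: {factor: float}."""
    active = [v for f, v in votes.items() if weights.get(f, 0) > 0]
    if mode == "or":
        result = set()
        for v in active:
            result |= v
        return result
    if mode == "and":
        return set.intersection(*active) if active else set()
    raise ValueError(mode)

votes = {"attribute": {"exposure correction", "white balance"},
         "analysis": {"white balance", "highlight adjustment"}}
weights = {"attribute": 1.0, "analysis": 0.5}
print(sorted(combine_factors(votes, weights, "and")))  # ['white balance']
```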
In addition, in deciding the contents of the user interface based on the plurality of factors, it is conceivable that settings of “respective items relating to image editing” described in
In the following, there will be described setting when the user performs weighting on “respective items relating to image editing”.
In the example of
The user performs a pull-down operation on a region A1 for the respective items in the window W5, and sets a priority (for example, a priority from 1 to 3). The priority can be set for each of the items displayed on the window W5. In addition, for “exposure correction” and “white balance” among the respective items displayed on the window W5, a check box is provided in a region A2 which allows the details of the item to always be set (its menu to always be opened) when the data format of the target image in the attribute information obtained at step S102 is RAW format. By checking the check box in the region A2, the user can set the menu of the item to always open when the data format of the target image is RAW format.
Furthermore, in order to support the setting by the user, information indicating on what kind of images the respective items relating to image editing act (that is, what effect each item has on what kind of image) may be displayed. For example, the following sentence may be displayed when the user brings the selecting cursor close to any of the respective items displayed on the window W5.
In addition, as described above, items relating to setting of weighting are displayed on the second window W6 at the lower part in the example of
Automatic open setting is a setting which allows (opens a menu for) automatic setting of details of any of the respective items displayed on the window W5, according to predetermined conditions. The automatic open setting is performed when the check box displayed in the region A3 is checked and a target priority is set by pull-down operation in the region A3. Predetermined conditions for automatic opening of each of the items displayed on the window W5 are described below.
Exposure Correction
White Balance
Noise Reduction
Highlight Adjustment
Shadow Adjustment
Contrast Adjustment
Brightness Adjustment
Saturation Adjustment
Sharpness Adjustment
Tilt Adjustment
Cross Filter
Meanwhile, among the respective items, those having a plurality of conditions may be decided using only particular conditions (some of the conditions), or may be decided using all the conditions.
In addition, the target priority set in the region A3 of the window W6 described above is compared with the priority described in the region A1 of the window W5 and, based on the result of the comparison, it is determined whether or not the menu of the items is opened. For example, when the target priority is set “equal to or higher than priority 2” by pull-down operation in the region A3, no menu will be opened for an item for which the priority described in the region A1 of the window W5 described above is three.
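The comparison between the target priority in the region A3 and the item priority in the region A1 can be sketched as follows, assuming (consistently with the "equal to or higher than priority 2" example) that a smaller number means a higher priority.

```python
# Sketch of the automatic-open decision: an item's menu opens only when the
# item's priority (region A1) meets the target priority (region A3).
# Convention assumed here: priority 1 is highest.

def should_open(item_priority, target_priority):
    """True when the item's priority is equal to or higher than the target."""
    return item_priority <= target_priority   # smaller number = higher priority

# Target "equal to or higher than priority 2": a priority-3 item stays closed.
print(should_open(1, 2))  # True
print(should_open(3, 2))  # False
```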
In addition, as illustrated in
Furthermore, as illustrated in
By deciding the user interface in this way, the user can easily know the contents of the previous image editing. Therefore, a highly convenient user interface can also be provided in performing fine adjustment or readjustment.
(Step S106)
Based on the user interface decided at step S105, the CPU 14 displays, on the monitor 19, a user interface according to the target image obtained at step S101.
The series of processing described above allows the CPU 14 to dynamically change the user interface displayed on the monitor 19, according to the target image.
Meanwhile, the processing by the CPU 14 after having dynamically changed the user interface is similar to a known technique. That is, the CPU 14 controls the image processing part 25 and performs image processing on the data of the target image obtained at step S101, based on editing instructions provided by user operation via the input device 18. Then, when the save button B3 displayed on the window W3 is selected, the CPU 14 stores, in the storage apparatus 13, data of the target image reflecting the result of the editing. In addition, the CPU 14 terminates image editing when the end button B4 displayed on the window W3 is selected.
As described above, according to the first embodiment, the contents of the user interface for accepting various settings for the target image to be displayed on the displaying apparatus are decided based on the target image to be displayed, and the user interface is displayed on the displaying apparatus based on the decided contents.
Therefore, according to the configuration of the first embodiment, it is possible to provide a user interface which is easy for the user to understand and use in displaying an image.
Particularly, according to the configuration of the first embodiment, it is possible to provide the user with a user interface including frequently-performed items or effective items, according to the target image, by dynamically changing the user interface according to the target image. Therefore, it is possible to implement a user interface which is sufficiently easy to understand and use even for users lacking knowledge and experience.
The imaging element 33 is an imaging device which captures an image of a subject formed by the imaging optical system 32, and generates an image signal of the captured image. Meanwhile, the image signal output from the imaging element 33 is input to the CPU 34 via an A/D conversion circuit (not illustrated).
The CPU 34 is a processor which controls the operation of the electronic camera 31 in an integrated manner. For example, the CPU 34 functions, by execution of a program, as a displaying apparatus (CPU 14, image analyzing part 23, UI decision part 24, image processing part 25) of the first embodiment described above.
The ROM 35 has stored therein a program to be executed by the CPU 34. In addition, the main memory 36 temporarily stores data of an image in the pre-processing or post-processing of the image processing.
The recording I/F 37 has a connector for connecting a nonvolatile storage medium 40 thereto. Then, the recording I/F 37 performs writing/reading of data to/from the storage medium 40 connected to the connector. The storage medium 40 described above includes a hard disk, a memory card having a built-in semiconductor memory, or the like. Meanwhile, in
The displaying part 39 displays the image data obtained from the CPU 34, as well as performing display described at step S106 of the first embodiment.
When a target image to be edited or displayed is selected by user operation via the operating part 38 in the electronic camera 31 of the second embodiment, the CPU 34 performs processing similar to that from step S101 to step S106 of the flowchart shown in
(1) Although description has been provided in the above-described respective embodiments on the assumption that a single image is selected as the target image, the present invention can be similarly applied to a case where a plurality of images is selected as target images. When a plurality of images is selected as target images, it is convenient to display the thumbnail list described in
When a plurality of images is selected as target images, the UI decision part 24 performs weighting as appropriate on the factors of the plurality of target images at step S105, or decides the contents of the user interface after having provided the plurality of the factors with priority. The weighting or priority in this case may be predetermined, may be decided by the UI decision part 24 each time, or may be decided by the user, similarly to the case described at step S105. In performing weighting as appropriate, the contents of the user interface may be decided based on the logical sum (OR) of the plurality of the factors, or the contents of the user interface may be decided based on the logical product (AND) thereof.
(2) The contents of the user interface described in the above-described respective embodiments are each one example, and the present invention is not limited to these examples. At step S105, the UI decision part 24 decides, as the contents of the user interface, at least one of: types of items relating to various settings; display order of items; arrangement of items; and display format of items. For example, the UI decision part 24 may select a user interface including only the items whose details can be set, or may select a user interface not including items whose details cannot be set. In addition, the UI decision part 24 may rearrange the display order of the respective items described for the window W2 of
Furthermore, the CPU 14 and the UI decision part 24 may be provided with a so-called learning function. Then, the UI decision part 24 may decide the user interface, based on frequency and experience with regard to setting of various items.
(3) Although description has been provided in the above-described respective embodiments by exemplifying a case of selecting an image to be edited as the target image, the present invention is not limited to this example. For example, the present invention can be applied similarly in selecting an image to be displayed as the target image.
In the exemplary display of
(4) The displaying apparatus of the present invention is not limited to the examples of the displaying apparatus, computer, and electronic camera of the embodiment described above. The displaying apparatus of the present invention may be an electronic device (e.g., photo viewer, digital photo frame, printing apparatus of photographs) having reproduction display function and retouch function of digital images. In addition, the imaging apparatus of the present invention may be implemented as a camera module of a cellular phone terminal.
(5) Although description has been provided in the respective embodiments described above for an example in which each of the processing of the image analyzing part 23, the UI decision part 24, and the image processing part 25 is implemented as software, each of the processing may of course be implemented as hardware by an ASIC or the like.
(6) Although description has been provided in the above-described respective embodiments by exemplifying a case where operation is performed only inside a computer or an electronic camera, on an image existing on the computer or the electronic camera as the target image, the present invention is not limited to this example. For example, the present invention can be applied similarly not only to a case where data of the target image is a file existing in a physical folder, but also to a case where the data of the target image is data in a virtual folder managed by a database. Furthermore, in a case where data of the target image exists in a server located on a network and shared by a plurality of persons, the present invention may be configured such that software operating in a local computer or an electronic camera accesses the data via the network. In addition, assuming the use of a network, the invention may be implemented as software operating on a WEB server as a WEB application via the Internet.
The many features and advantages of the embodiments are apparent from the detailed specification and, thus, it is intended by the appended claims to cover all such features and advantages of the embodiments that fall within the true spirit and scope thereof. Further, since numerous modifications and changes will readily occur to those skilled in the art, it is not desired to limit the inventive embodiments to the exact construction and operation illustrated and described, and accordingly all suitable modifications and equivalents may be resorted to, falling within the scope thereof.
Number | Date | Country | Kind |
---|---|---|---|
2011-137486 | Jun 2011 | JP | national |
This application is a U.S. national stage application of PCT/JP2012/003635 filed Jun. 1, 2012 and claims foreign priority benefit of Japanese Application No. 2011-137486 filed Jun. 21, 2011 in the Japanese Intellectual Property Office, the contents of both of which are incorporated herein by reference.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP2012/003635 | 6/1/2012 | WO | 00 | 11/26/2013 |