1. Field of the Invention
The present invention relates to a display control apparatus capable of displaying related information on a selected item among a plurality of displayed items and a control method thereof.
2. Description of the Related Art
Functions of electronic apparatuses have become diversified and complicated, and it is not easy for users to master functions of an electronic apparatus. In a case of selecting an item from a plurality of selectable items displayed on a display, if a user cannot understand details of each item, it is difficult for the user to select a desired item. Japanese Patent Application Laid-Open No. 2008-152345 discusses a technique for displaying a pop-up window of a lower layer menu of a focused menu item when a predetermined time period has elapsed since the focus for selection was stopped at any of a plurality of menu items.
In Japanese Patent Application Laid-Open No. 2008-152345, related information on a selected item among a plurality of displayed items is displayed, so that the user can easily judge whether the selected item is a desired item. Furthermore, the timing of displaying the related information is set such that the related information is displayed when a predetermined time period has elapsed since the focus stopped. Thus, the displayed related information can be prevented from switching frequently while the user feels that an operation is still in progress.
However, in a case where there is a plurality of operation units to perform a selection operation, if related information is displayed when a fixed time period has elapsed since a selection operation was performed through any operation unit, suitable information is not always displayed for an operation of each operation unit.
For example, in a case where a selection operation is performed by touching an item among a plurality of items displayed on a touch panel, it is assumed that the selection operation is finished when a user's finger is removed from the touch panel. If, nevertheless, related information on the selected item is displayed long after the removal of the finger from the touch panel, the user feels that the response is slow.
On the other hand, in a case where a selection operation is performed by operating a button and the selection operations are performed on a plurality of buttons one after another, the same operation button may be pressed several times as a series of operations. Therefore, even when pressing of the operation button is once finished, it cannot instantly be determined that the selection operation has been finished. Hence, when related information on the selected item is displayed shortly after pressing of the operation button is finished, information that is irrelevant to an item that the user desires may be displayed during a series of user operations. This is confusing.
The present invention is directed to a display control apparatus capable of displaying related information on a selected item at a timing comfortable for the user, based on the operation member used to select the item, in a case where item selection can be performed through a plurality of operation members.
In an aspect of the present invention, a display control apparatus includes: a display control unit configured to control such that a display unit displays a plurality of selectable items; a first operation acceptance unit configured to accept a first operation performed on the display unit; a second operation acceptance unit configured to accept a second operation; and a control unit configured to control such that when an item among the plurality of selectable items is selected in response to the second operation, related information on the item selected is displayed in response to an elapse of a first time period from performance of the second operation, and when an item among the plurality of selectable items is selected in response to the first operation, related information on the item selected is displayed before an elapse of the first time period from performance of the first operation.
Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
Various exemplary embodiments, features, and aspects of the invention will be described in detail below with reference to the drawings.
It is to be noted that the following exemplary embodiment is merely one example for implementing the present invention and can be appropriately modified or changed depending on individual constructions and various conditions of apparatuses to which the present invention is applied. Thus, the present invention is in no way limited to the following exemplary embodiment.
The memory 102 is configured of, for example, a random access memory (RAM) (a volatile memory using a semiconductor device). The CPU 101 controls each unit of the digital camera 100 according to a program stored in, for example, the nonvolatile memory 103, using the memory 102 as a work memory. The nonvolatile memory 103 stores image data, audio data, other data, various programs for the CPU 101 to operate, and the like. The nonvolatile memory 103 also records time periods t1 to t3, which will be described below. The nonvolatile memory 103 is configured of, for example, a hard disk (HD) or a read only memory (ROM).
Based on the control of the CPU 101, the image processing unit 104 performs various kinds of image processing on data such as image data stored in the recording medium 108, image data obtained via the external I/F 109 or the communication I/F 110, and image data captured by the image capturing unit 112. The image processing performed by the image processing unit 104 includes analog-to-digital (A/D) conversion processing, digital-to-analog (D/A) conversion processing, and encoding, compression, decoding, enlarging/reducing (resizing), noise reduction, and color conversion processing of image data. The image processing unit 104 may be configured of a dedicated circuit block for performing specific image processing. Some types of image processing can instead be performed by the CPU 101 according to a program, without using a dedicated circuit block.
Based on the control of the CPU 101, the display 105 displays an image and a graphical user interface (GUI) screen. The CPU 101 generates a display control signal according to a program and controls each unit of the digital camera 100 so that a video signal for displaying a video image on the display 105 is generated and output to the display 105. The display 105 displays a video image based on the input video signal. Alternatively, the digital camera 100 may include only an interface configured to output a video signal for displaying a video image, and the display 105 may be an external monitor (e.g., a television).
The operation unit 106 is an input device configured to accept a user operation. The operation unit 106 includes a touch panel 230, which is a pointing device, a right button 202, a left button 204, and an electronic dial 211. The operation unit 106 also includes a joystick, a touch sensor, a touchpad, a power switch, and a shutter button. The operation unit 106 may further include a text information input device, such as a keyboard, and a mouse (pointing device). The touch panel 230 is an input device that is superposed flatly on the display 105 and configured to output coordinate information corresponding to a touched position.
On the recording medium I/F 107, a recording medium 108 such as a memory card, a compact disc (CD), or a digital versatile disc (DVD) can be mounted. Based on the control of the CPU 101, the recording medium I/F 107 reads and writes data from/on the mounted recording medium 108. The external I/F 109 is an interface connected to an external apparatus via a wired cable or wirelessly to input and output a video signal and an audio signal. The communication I/F 110 is an interface configured to communicate with an external apparatus, the Internet 111, and the like to transmit and receive various kinds of data such as files and commands. The image capturing unit 112 includes at least an image sensor for converting an optical image into an electrical signal, such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) device, and includes optical members such as a zoom lens, a focus lens, a mirror, a diaphragm, and a shutter. The system timer 113 is a timer configured to perform time measurement for a clock function built in the digital camera 100 and to measure a control period of each control.
The touch panel 230 and the display 105 can be formed integrally. For example, the touch panel 230 is configured to have a light transmissivity that does not obstruct a display on the display 105, and is mounted on an upper layer of a display surface of the display 105. Then, input coordinates on the touch panel 230 are associated with display coordinates on the display 105. This can configure a GUI that allows a user to directly operate a screen displayed on the display 105. The CPU 101 is capable of detecting the following operations performed on the touch panel 230:
the touch panel 230 is touched with a finger or a pen (hereinafter, referred to as “touch-down”);
the touch panel 230 is being touched with a finger or a pen (hereinafter, referred to as “touch-on”);
the touch panel 230 is touched with a finger or a pen moving on the touch panel 230 (hereinafter, referred to as “touch-move”);
the touch panel 230 is touched with a finger or a pen then being removed from the touch panel 230 (hereinafter, referred to as “touch-up”); and
the touch panel 230 is not touched (hereinafter, referred to as “touch-off”).
The CPU 101 is notified via the internal bus 150 of the above operations and of the coordinates of the position on the touch panel 230 where a finger or a pen is touching. Based on the notified information, the CPU 101 determines what operation has been performed on the touch panel 230. For a touch-move operation, the vertical and horizontal components of the direction in which the finger or pen moves on the touch panel 230 can be determined based on the change in the position coordinates. Further, when a user performs touch-down, then a predetermined amount of touch-move, and then touch-up on the touch panel 230, the CPU 101 determines that a stroke has been drawn. An operation of drawing a stroke quickly is called a flick. A flick is an operation in which a finger is moved quickly over some distance while touching the touch panel 230 and is then removed; in other words, the user quickly flips the surface of the touch panel 230 with the finger. If the CPU 101 detects touch-move over a predetermined distance or longer at a predetermined speed or higher, followed by touch-up, the CPU 101 determines that a flick operation has been performed. If the CPU 101 detects touch-move over a predetermined distance or longer at a speed lower than the predetermined speed, the CPU 101 determines that a drag operation has been performed. The touch panel 230 may be of any type, such as a resistance film type, capacitance type, surface acoustic wave type, infrared-ray type, electromagnetic induction type, image recognition type, or optical sensor type.
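As a rough sketch, the flick/drag discrimination described above can be expressed as follows; the function name and the threshold values are hypothetical, not taken from the embodiment.

```python
# Hypothetical sketch of the flick/drag discrimination; thresholds illustrative.
FLICK_MIN_DISTANCE = 30.0   # pixels: minimum length to count as a stroke
FLICK_MIN_SPEED = 500.0     # pixels/second: minimum speed for a flick

def classify_stroke(distance_px: float, duration_s: float) -> str:
    """Classify a completed touch-move stroke that ended in touch-up."""
    if distance_px < FLICK_MIN_DISTANCE:
        return "tap"                       # too short to be a stroke
    speed = distance_px / duration_s if duration_s > 0 else float("inf")
    return "flick" if speed >= FLICK_MIN_SPEED else "drag"
```

Under these assumed thresholds, a 120-pixel stroke completed in 0.1 seconds would classify as a flick, while the same stroke spread over a full second would classify as a drag.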
The display 105 is a display unit configured to display an image and various kinds of information. The display 105 is formed integrally with the touch panel 230. The operation unit 106 illustrated in
Operations according to a first exemplary embodiment will be described below with reference to
In the present exemplary embodiment, a state will be described in which an item is selected from a plurality of items displayed as setting candidates on a screen for changing a white balance (hereinafter “WB”) setting. In the present exemplary embodiment, a period of time that is set to elapse until guide information on a selected WB setting is displayed is changed depending on whether the WB setting has been selected through a touching operation on the touch panel 230 or through an operation on the left/right key or the electronic dial 211.
In step S301, the CPU 101 displays an initial screen of the WB setting screen on the display 105.
In step S302, the CPU 101 determines whether the user has performed touch-down on the touch panel 230. If the user has performed touch-down (YES in step S302), then the processing proceeds to step S303 (first operation acceptance). If the user has not performed touch-down (NO in step S302), then the processing proceeds to step S319.
In step S303, the CPU 101 determines whether the position of the touch-down performed by the user in step S302 is within the setting item display area 403. If the position is within the setting item display area 403 (YES in step S303), then the processing proceeds to step S306. If the position is not within the setting item display area 403 (NO in step S303), then the processing proceeds to step S304.
In step S304, the CPU 101 determines whether the user has performed touch-move. If the user has performed touch-move (YES in step S304), then the processing proceeds to step S303 again. If the user has not performed touch-move (NO in step S304), then the processing proceeds to step S305.
In step S305, the CPU 101 determines whether the user has performed touch-up. If the user has performed touch-up (YES in step S305), then the processing proceeds to step S302 again. If the user has not performed touch-up (NO in step S305), then the processing proceeds to step S304.
In step S306, the CPU 101 displays the item at the touch position (the position being touched) in a display form with a color that indicates that touch-on is in progress.
In step S307, the CPU 101 determines whether the user has performed touch-move. If the user has performed touch-move (YES in step S307), then the processing proceeds to step S303 again. If the user has not performed touch-move (NO in step S307), then the processing proceeds to step S308.
In step S308, the CPU 101 determines whether the user has performed touch-up. If the user has performed touch-up (YES in step S308), then the processing proceeds to step S309. If the user has not performed touch-up (NO in step S308), then the processing proceeds to step S307 again.
In step S309, the CPU 101 changes the display form of the item at the position (touch-up position) that was touched immediately before the touch-up in step S308 from the display form during touch-on to the selected state display form 401, which is the display form during touch-off, thereby displaying the item in the selected state display form 401. Furthermore, the CPU 101 sets a setting value (WB setting) indicated by the item at the touch-up position to the digital camera 100.
In step S310, the CPU 101 starts time measurement (the timer is started). More specifically, the CPU 101 obtains time information at this start point from the system timer 113 and stores the time information in the memory 102. From a difference between the time information thus stored and time information obtained during the subsequent time measurement, a time period that has elapsed since the timer was started can be determined.
In step S311, the CPU 101 determines whether a time period t1 has elapsed since the timer was started. If the time period t1 has elapsed (YES in step S311), then the processing proceeds to step S314. If the time period t1 has not elapsed (NO in step S311), then the processing proceeds to step S312. The time period t1 is shorter than a time period t2, which will be described below (t1 < t2). The time period t1 is the time period from the touch-up from the touch panel 230 that finishes the selection operation to the display of the related information on the selected item, and is, for example, about 0.2 seconds. If this time period is excessively long, the user may feel that the response is slow. Since the time period t1 is only required to be shorter than the time period t2, it may be 0 seconds; however, to avoid giving the user an unnatural impression, the time period t1 may be set to a substantial non-zero value.
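The time measurement in steps S310 and S311 (storing a start time and comparing later readings against it) can be sketched as follows; `GuideTimer` and the use of `time.monotonic` are illustrative stand-ins for the system timer 113, not the embodiment's implementation.

```python
import time

class GuideTimer:
    """Illustrative stand-in for the time measurement via the system timer 113:
    store a start timestamp, then compare later readings against it."""

    def __init__(self) -> None:
        self.start = None

    def restart(self) -> None:
        # Corresponds to steps S310/S322: record the start time.
        self.start = time.monotonic()

    def elapsed(self, delay_s: float) -> bool:
        # Corresponds to steps S311/S323: has delay_s passed since the start?
        if self.start is None:
            return False
        return time.monotonic() - self.start >= delay_s
```

A monotonic clock is used in the sketch because elapsed-time comparisons should not be affected by adjustments to the wall-clock time.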
In step S312, the CPU 101 determines whether the user has performed touch-down. If the user has performed touch-down (YES in step S312), then the processing proceeds to step S303 again. If the user has not performed touch-down (NO in step S312), then the processing proceeds to step S313.
In step S313, the CPU 101 determines whether the user has pressed the left/right key (either one of the right button 202 and the left button 204) or rotated the electronic dial 211. If the user has pressed the left/right key or rotated the electronic dial 211 (YES in step S313), then the processing proceeds to step S320. If the user has neither pressed the left/right key nor rotated the electronic dial 211 (NO in step S313), then the processing proceeds to step S311.
In step S314, the CPU 101 displays a guide 405 of the selected item as related information on the selected item on the display 105. The guide is displayed after touch-up but not during touch-on.
In step S315, the CPU 101 determines whether the user has performed touch-down. If the user has performed touch-down (YES in step S315), then the processing proceeds to step S316. If the user has not performed touch-down (NO in step S315), then the processing proceeds to step S317.
In step S316, the CPU 101 deletes the guide 405 displayed in step S314 (the guide 405 is not displayed), and the processing proceeds to step S303.
In step S317, the CPU 101 determines whether the user has pressed the left/right key (either one of the right button 202 and the left button 204) or rotated the electronic dial 211. If the user has pressed the left/right key or rotated the electronic dial 211 (YES in step S317), then the processing proceeds to step S318. If the user has neither pressed the left/right key nor rotated the electronic dial 211 (NO in step S317), then the processing proceeds to step S315 again.
In step S318, the CPU 101 deletes the guide 405 displayed in step S314 (the guide 405 is not displayed), and the processing proceeds to step S320.
On the other hand, in step S319, the CPU 101 determines whether the user has pressed the left/right key (either one of the right button 202 and the left button 204) or rotated the electronic dial 211. If the user has pressed the left/right key or rotated the electronic dial 211 (if a physical operation has been accepted) (YES in step S319), then the processing proceeds to step S320 (second operation acceptance). If the user has neither pressed the left/right key nor rotated the electronic dial 211 (NO in step S319), then the processing proceeds to step S302 again.
In step S320, the CPU 101 changes an item to be selected according to whether the user has pressed the left/right key or rotated the electronic dial 211. More specifically, if the user has pressed the right button 202, the CPU 101 selects an adjacent item on the right side of the item that was selected before the pressing. If the user has pressed the left button 204, the CPU 101 selects an adjacent item on the left side of the item that was selected before the pressing. If the user has rotated the electronic dial 211 clockwise, the CPU 101 moves the selection frame to the right according to the amount of rotation. If the user has rotated the electronic dial 211 anticlockwise, the CPU 101 moves the selection frame to the left according to the amount of rotation.
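The selection movement in step S320 can be sketched as follows; the helper name and the clamping at the ends of the row are assumptions for illustration, since the embodiment does not state how the edges of the item row are handled.

```python
def move_selection(index: int, num_items: int, op: str, amount: int = 1) -> int:
    """Move the selection frame as in step S320 (hypothetical helper).
    'right' (right button) and 'cw' (clockwise dial rotation) move right;
    'left' and 'ccw' move left. Clamping at the row ends is an assumption."""
    step = amount if op in ("right", "cw") else -amount
    return max(0, min(num_items - 1, index + step))
```

For a dial rotation, `amount` would correspond to the amount of rotation, moving the frame by several items at once.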
In step S321, the CPU 101 changes the display form of the selected item thus changed to the selected state display form 401, which is the display form during touch-off, thereby displaying the selected item in the selected state display form 401. Furthermore, the CPU 101 sets the setting value (WB setting) of the selected item thus changed to the digital camera 100.
In step S322, the CPU 101 starts time measurement (the timer is started). More specifically, the CPU 101 obtains time information at this start point from the system timer 113 and stores the time information in the memory 102. From a difference between the time information thus stored and time information obtained during the subsequent time measurement, a time period that has elapsed since the timer was started can be determined.
In step S323, the CPU 101 determines whether the time period t2 has elapsed since the timer was started. If the time period t2 has elapsed (YES in step S323), then the processing proceeds to step S314. If the time period t2 has not elapsed (NO in step S323), then the processing proceeds to step S324. The time period t2 (first time period) is longer than the time period t1 (second time period) described above (t1 < t2). The time period t2 is the time period from the change of the selected item in response to the pressing of the left/right key or the rotation of the electronic dial 211 to the display of the related information on the selected item, and is, for example, about 0.8 seconds. If this time period is excessively short, the content of the guide 405 switches repeatedly during a user operation, which may annoy the user. With about 0.8 seconds, even a continuous operation is likely to be almost finished, so the user does not feel that an operation is still in progress.
In step S324, the CPU 101 determines whether the user has performed touch-down. If the user has performed touch-down (YES in step S324), then the processing proceeds to step S303. If the user has not performed touch-down (NO in step S324), then the processing proceeds to step S325.
In step S325, the CPU 101 determines whether the user has pressed the left/right key (either one of the right button 202 and the left button 204) or rotated the electronic dial 211. If the user has pressed the left/right key or rotated the electronic dial 211 (YES in step S325), then the processing proceeds to step S320. If the user has neither pressed the left/right key nor rotated the electronic dial 211 (NO in step S325), then the processing proceeds to step S323 again.
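The wait loops of steps S311 to S313 and steps S323 to S325 share one pattern: wait for the delay to elapse, but abort if a new operation arrives first. A minimal sketch of that pattern, with a hypothetical `poll_input` callback standing in for the touch-down and key/dial checks:

```python
import time

def wait_for_guide(delay_s, poll_input):
    """Sketch of the wait loops in steps S311-S313 / S323-S325 (hypothetical
    helper): report that the guide should be shown once delay_s has elapsed
    since the selection, unless another operation arrives first.
    poll_input() returns an event name such as 'touch' or 'key_or_dial',
    or None when no operation has occurred."""
    start = time.monotonic()
    while time.monotonic() - start < delay_s:
        event = poll_input()
        if event is not None:
            return event       # a new operation interrupts the wait
        time.sleep(0.005)      # polling interval (illustrative)
    return "show_guide"
```

In the flowchart, the interrupting event determines where processing resumes: a touch-down returns to step S303, while a key or dial operation returns to step S320.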
In the processing illustrated in
In the present exemplary embodiment, the guide 405 is displayed when the time period t2 has elapsed (after YES in step S323) since the operation of the left/right key or the electronic dial 211 was finished. The guide 405 may instead be displayed after the elapse of different time periods depending on whether the user operation has been performed through the left/right key or the electronic dial 211. For example, when the operation has been performed through the left/right key (first operation member), the guide 405 is displayed after the time period t2 (first time period: for example, 0.8 seconds) has elapsed. On the other hand, when the operation has been performed through the electronic dial 211 (second operation member), the guide 405 is displayed after a time period t3 (third time period: for example, 0.5 seconds), which is shorter than the time period t2 and longer than the time period t1, has elapsed. In other words, the time period t1 (from touch-up to guide display) < the time period t3 (from a dial operation to guide display) < the time period t2 (from a button operation to guide display). This relation is set because the interval between continuous rotation operations of the electronic dial 211 is presumed to be shorter than the interval between continuous pressing operations of the left/right key. When the interval between continuous operations is short, the guide 405 is unlikely to be displayed frequently during the operations even if the time period before the guide 405 is displayed is set relatively short. Accordingly, by setting the time period from the selection operation to the display of the related information appropriately for each operation member through which the user performs the selection operation, the related information can be displayed at a timing that the user finds comfortable.
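The relation t1 < t3 < t2 can be captured in a small lookup; the numeric values below are the examples given above, while the member names and function name are hypothetical.

```python
# Values are the examples given in the embodiment; member names are hypothetical.
T1_TOUCH = 0.2   # seconds after touch-up (time period t1)
T3_DIAL = 0.5    # seconds after a dial rotation (time period t3)
T2_KEY = 0.8     # seconds after a left/right key press (time period t2)

def guide_delay(member: str) -> float:
    """Return the delay before the guide 405 is displayed for each member."""
    return {"touch": T1_TOUCH, "dial": T3_DIAL, "key": T2_KEY}[member]
```

The shorter the expected interval between continuous operations on a member, the shorter its delay can be without the guide appearing mid-operation.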
Further, as illustrated in
In the first exemplary embodiment described above, the present invention is applied to the WB setting screen of the digital camera 100 in the image capturing mode. The plurality of selectable items is applied to the setting candidates settable as the WB setting, and the related information on the selected item is applied to the guide display of the selected WB setting item. However, the present invention is not limited thereto. In a second exemplary embodiment, the present invention is applied to a multi-display screen where multiple images are displayed on a single screen. The plurality of selectable items is applied to a plurality of displayed images, and the related information on the selected item is applied to an enlarged image of the selected image. The multi-display screen is displayed when the digital camera 100 is in a playback mode.
When the user presses arrow keys or rotates the electronic dial 211 in the state illustrated in
According to the second exemplary embodiment, the enlarged image can be displayed as related information on the selected image at a timing that the user finds comfortable, according to the operation unit through which the user has selected the item from the plurality of images. Thus, the user can operate comfortably. It is to be noted that although the example is described in which the enlarged image of the selected image is displayed as related information on the selected item, the present invention is not limited thereto. The related information may be attribute information on the selected image (e.g., image capturing setting information and image capturing position information included in header information) or information on other files associated with the selected image.
Furthermore, the operation member is not limited to the touch panel 230, and any operation member that allows a user to directly designate and select a desired item instead of sequential selection can be used. An operation member such as a mouse and a touchpad can also be used in a similar manner to the touch panel 230 in the exemplary embodiments described above.
The control performed by the CPU 101 in the above exemplary embodiments is not limited to control performed by a single hardware component. The control of the entire apparatus may be performed by a plurality of hardware components sharing the processing.
The above-described exemplary embodiments are merely examples for implementing the present invention and can be appropriately modified or changed depending on individual constructions and various conditions of apparatuses to which the present invention is applied. Thus, the present invention is in no way limited to the foregoing exemplary embodiments.
In the above-described exemplary embodiments, the examples are described in which the present invention is applied to the digital camera 100. However, the present invention is not limited to these examples. The present invention is also applicable to any display control apparatus that allows a user to select an item from a plurality of selectable items and is capable of displaying related information on the selected item. More specifically, the present invention is applicable to personal computers, personal digital assistants (PDAs), mobile phone terminals, portable image viewers, printer apparatuses including a display, digital photo frames, music players, game apparatuses, electronic book readers, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.
This application claims priority from Japanese Patent Application No. 2012-130096 filed Jun. 7, 2012, which is hereby incorporated by reference herein in its entirety.