The present invention relates to a graphical user interface (GUI).
Some electronic devices having a touch screen are provided with a GUI in which icons are aligned and whose appearance changes in response to a user's touch of the screen. JP-A-2009-15182 discloses a portable electronic device in which a tray 216 accommodating icons for functionalities frequently used by a user and a tray 214 accommodating icons for functionalities activated infrequently by the user are displayed, and in which the icons can be moved from tray 216 to tray 214 in response to a user's operation. Generally, such icons can be either added or deleted.
When the number of icons displayed in a list of icons is large, it may be difficult for a user to locate a required icon. For example, when icons are grouped by folders, directories or other schemes used for control of display of icons collectively, a user is required to open such folders one by one to find a required icon. In view of the foregoing, an object of the present invention is to change the appearances of a group of icons associated with each other by an intuitive operation that is different from operations employed in the prior art.
According to an aspect of the present invention, there is provided a display apparatus including: a display having a screen on which an image is displayed; an input unit having a surface on which a user's touch is sensed; a display controller that displays a plurality of individual icons and one or more category icons, each of which represents a category of one or more individual icons; and a motion recognition unit that detects a pinching-out motion for widening a distance between two points of touch on the surface, based on the user's touch sensed by the input unit, wherein, upon detection of the pinching-out motion by the motion recognition unit, the display controller changes a state of one or more individual icons associated with at least one category from an enfolded state in which the one or more individual icons are not displayed to an unfolded state in which the one or more individual icons are displayed.
In a preferred embodiment, the display controller changes a number of categories within which one or more individual icons are displayed in the unfolded state, based on an amount of movement or a velocity of the two points of touch in the pinching-out motion detected by the motion recognition unit.
In another preferred embodiment, the display controller changes the one or more categories within which the one or more individual icons are displayed in the unfolded state based on the positions of the two points of touch detected by the motion recognition unit.
In yet another preferred embodiment, the display controller changes the one or more categories within which the one or more individual icons are displayed in the unfolded state based on a difference in amounts of movement of the two points of touch in the pinching-out motion detected by the motion recognition unit.
In yet another preferred embodiment, the motion recognition unit detects a pinching-in motion in which two points of touch are moved to narrow a distance between the two points, based on the user's touch sensed by the input unit, and upon detection of the pinching-in motion by the motion recognition unit, the display controller changes the state of one or more individual icons associated with at least one of the one or more categories from the unfolded state to the enfolded state.
Preferably, the display controller changes a number of the one or more categories within which the one or more individual icons are not displayed, based on an amount of movement or velocity of the two points of touch in the pinching-in motion detected by the motion recognition unit.
Preferably, the display controller changes the one or more categories within which the one or more individual icons are not displayed in the enfolded state, based on the positions of the two points of touch in the pinching-in motion detected by the motion recognition unit.
Preferably, the display controller changes the one or more categories within which the one or more individual icons are not displayed in the enfolded state based on a difference in amounts of movement of the two points of touch in the pinching-in motion detected by the motion recognition unit.
In yet another aspect of the present invention, there is provided a method of generating a user interface at a display apparatus in which a display that displays, on a screen, a plurality of individual icons and one or more category icons each of which represents a category of one or more individual icons, and an input unit having a surface on which a user's touch is sensed, are provided, the method including: a first step of detecting a pinching-out motion to widen a distance between two points touching the surface when one or more category icons are displayed on the screen; and a second step of changing, upon detection of the pinching-out motion, a state of one or more individual icons associated with at least one category from an enfolded state in which the one or more individual icons are not displayed to an unfolded state in which the one or more individual icons are displayed.
In yet another aspect of the present invention, there is provided a display apparatus including: a display having a screen on which an image is displayed; an input unit having a surface on which a user's touch is sensed; a display controller that displays a plurality of individual icons and one or more category icons each of which represents a category of one or more individual icons; and a motion recognition unit that detects a motion by which a distance between two points of touch changes, based on the user's touch sensed by the input unit, wherein the display controller changes a state of one or more individual icons associated with at least one category between an enfolded state in which the one or more individual icons are not displayed and an unfolded state in which the one or more individual icons are displayed.
In yet another aspect of the present invention, there is provided a program that causes a computer of a display apparatus, in which a display that displays a plurality of individual icons and one or more category icons each of which represents a category of one or more individual icons on a screen and an input unit having a surface on which a user's touch is sensed are provided, to execute: a first step of detecting a pinching-out motion to widen a distance between two points touching the surface when one or more category icons are displayed on the screen; and a second step of changing, upon detection of the pinching-out motion, a state of one or more individual icons associated with at least one category from an enfolded state in which the one or more individual icons are not displayed to an unfolded state in which the one or more individual icons are displayed.
According to the present invention, appearances of icons associated with each other are changed by an intuitive operation that is different from operations employed in the prior art.
FIGS. 6a, 6b, and 6c are examples of a screen transition according to the first embodiment;
FIGS. 12a and 12b show an example of amounts of movement d1 and d2 in detail.
The size of display apparatus 100 is suitable for a user to input instructions onto screen 101 with a finger or fingers. For example, display apparatus 100 is a mobile phone (including a smartphone), a tablet PC, a slate PC, or a Personal Digital Assistant (PDA). The size of display apparatus 100 may be suitable for being hand-held. Alternatively, display apparatus 100 may be configured for desk use or for attachment to a holder. Also, display apparatus 100 need not have a plate-like shape.
Main controller 110 controls all the elements of display apparatus 100. Main controller 110 includes a processor such as a Central Processing Unit (CPU) and a storage unit such as a Read Only Memory (ROM), and Random Access Memory (RAM). Main controller 110 generates a GUI of the present invention by executing a program stored in a ROM or memory 120. Also, main controller 110 is configured to execute a plurality of application software (hereinafter referred to as applications) to implement functionalities of the applications in the display apparatus 100. Main controller 110 may be capable of performing a multi-tasking operation in which two or more tasks or processes are executed in parallel. A multi-core hardware configuration may be adopted for performing the multi-tasking.
Memory 120 stores data. Memory 120 may be a hard drive, flash memory or other storage medium used to store data for access by main controller 110. Memory 120 may be of a removable type that can be attached to and detached from display apparatus 100. Programs executable by main controller 110 and image data for display on screen 101 can be stored in memory 120. It is possible to store an identifier for identifying a user in memory 120, when a single user uses two or more display apparatuses 100 or a single display apparatus 100 is used by two or more users.
Touch screen 130 displays an image and receives an input from a user. Specifically, touch screen 130 includes a display 131 that displays an image and an input unit 132 that receives an input from the user.
Display 131 includes a display panel employing liquid crystal or organic electroluminescence elements for displaying an image, and a driving circuit for driving the display panel. As a result, an image according to image data supplied from main controller 110 is displayed on screen 101. Input unit 132 is provided over screen 101, covering screen 101. A two-dimensional sensor for sensing a touch by a finger or fingers on screen 101 is provided in input unit 132. Input unit 132 outputs to main controller 110 operation information representing a position (hereinafter referred to as “a point of touch”) at which a finger touches. Input unit 132 supports multi-touch functionality, in which two or more touches performed at the same time can be detected.
Communications unit 140 transmits and receives data. Communications unit 140 may be a network interface for connecting to a network such as a mobile communications network or the Internet. Alternatively, communications unit 140 can be configured to communicate with other electronic devices without using a network. For example, communications unit 140 may wirelessly communicate with other devices based on a Near Field Communication (NFC) standard. The data transmitted or received by communications unit 140 may include electronic money, an electronic coupon, or other information representing electronically exchangeable value.
This completes the explanation of the hardware configuration of display apparatus 100. With the stated configuration, display apparatus 100 executes various applications. Functionalities of the applications executed in display apparatus 100 may include displaying news, weather forecasts, and images (including static and moving images), reproduction of music, and enabling a user to play a game or read an electronic book. In addition, the applications may include a mailer or web browser. The applications include an application that can be executed in parallel with another and an application that can be executed as a background application. The applications may be pre-installed in display apparatus 100. Alternatively, the user may buy an application from a content provider and download it via communications unit 140.
Display apparatus 100 executes an application for displaying icons of applications. Hereinafter, this application is referred to as “an icon management application,” and an application associated with an icon is referred to as “a target application.” The target applications may include all applications executable by display apparatus 100, except for the icon management application, or only a part of those applications. In other words, at least a part of the executable applications can be a subject of management performed by the icon management application.
The icon management application enables a user to manage or execute target applications. Specifically, functionalities of the icon management application include classifying target applications into a predetermined category and defining a new category for a target application. Each category of a target application has attributes represented by a “game” or “sound,” or the like, which makes it easy to understand a type of a target application. The attributes assigned to each category may represent a frequency or history with respect to a usage of a target application. For example, one of the attributes is defined as a “frequently used target application.” The icon management application may determine a category for a target application based on the attributes (frequency of use, for example) of the target application or on an instruction input by a user's finger(s).
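By way of illustration only, the frequency-based classification described above might be sketched as follows in Python; the threshold value, the category names, and the function name are assumptions introduced for this sketch and do not appear in the embodiments.

```python
def assign_category(declared_category: str, launch_count: int,
                    frequent_threshold: int = 10) -> str:
    """Return the category under which a target application's icon is filed.

    A usage-frequency attribute can override the declared type of the
    target application, as described in the embodiment.
    """
    if launch_count >= frequent_threshold:
        # The frequency attribute takes precedence over the declared category.
        return "frequently used"
    return declared_category

# A "game" launched 12 times is filed under the frequency-based category;
# a rarely used "sound" application keeps its declared category.
print(assign_category("game", 12))
print(assign_category("sound", 3))
```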
A category icon represents a category associated with one or more individual icons. Thus, individual icons associated with a category icon belong to a single category represented by the category icon. In this embodiment, when an individual icon(s) is displayed, a category icon is allocated above the individual icon(s).
Display of individual icons is controlled on a category basis, as shown in the example of the figure.
The icon list is not necessarily displayed on screen 101 in its entirety. In other words, a user can view the entire icon list by scrolling the list. A vertical length of the icon list is variable depending on a number of unfolded icons.
Sensing unit 111 obtains operation information. Specifically, sensing unit 111 obtains operation information from input unit 132 of touch screen 130. Operation information represents one or more coordinates of points of touch on screen 101 in a two-dimensional orthogonal coordinate system with a predetermined position (for example, the center or a corner of screen 101) defined as the origin. Operation information changes in response to a movement of the user's finger on screen 101, resulting in a change of a sensed point of touch.
Motion recognition unit 112 detects a type of motion made by the user, based on operation information obtained by sensing unit 111. In this embodiment, motion recognition unit 112 detects at least three types of motion, that is, a tapping, pinching-in motion, and pinching-out motion. Motion recognition unit 112 may detect a dragging, flicking, and double tapping, in which tappings are performed twice in succession, and other types of motions.
Tapping is a motion of touching a point on screen 101. A pinching-in motion and a pinching-out motion are motions of touching two points on screen 101 at the same time. The pinching-in motion is a motion of touching two points on screen 101 and then moving the points of touch to narrow a distance between the two points of touch. The pinching-in motion is also referred to as “pinch closing.” The pinching-out motion is a motion of first touching two points on screen 101 and then moving the two points of touch to expand a distance between the two points of touch. The pinching-out motion is also referred to as “pinch opening.” In the pinching-in motion and the pinching-out motion, a series of motions starting with fingers touching screen 101 and ending with the fingers disengaging from screen 101 is recognized as a single operation. It is noted that each of the pinching-in motion and the pinching-out motion may involve finger motion in both a horizontal direction and a vertical direction with respect to screen 101.
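The distinction among a tapping, a pinching-in motion, and a pinching-out motion can be illustrated by a minimal Python sketch; the pixel tolerances and the representation of points of touch as (x, y) tuples are assumptions made for illustration, not part of the embodiments.

```python
import math

TAP_MOVE_LIMIT = 10.0    # assumed tolerance (in pixels) for a tap
PINCH_MIN_CHANGE = 20.0  # assumed minimum distance change for a pinch

def _dist(a, b):
    # Euclidean distance between two points of touch
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify_motion(start_points, end_points):
    """Classify a touch sequence as 'tap', 'pinch-in', 'pinch-out', or 'other'.

    start_points / end_points: lists of (x, y) tuples, one per finger,
    taken at the start and at the end of the series of motions.
    """
    if len(start_points) == 1:
        moved = _dist(start_points[0], end_points[0])
        return "tap" if moved <= TAP_MOVE_LIMIT else "other"
    if len(start_points) == 2:
        change = _dist(*end_points) - _dist(*start_points)
        if change >= PINCH_MIN_CHANGE:
            return "pinch-out"   # distance between the two points widened
        if change <= -PINCH_MIN_CHANGE:
            return "pinch-in"    # distance between the two points narrowed
    return "other"
```

A dragging, flicking, or double tapping would require additional timing information not modeled in this sketch.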
Display controller 113 controls display of an image in display 131. Display controller 113 has at least functionalities of unfolding and enfolding individual icons in response to a user's operation when motion recognition unit 112 determines that the operation is a tapping, a pinching-in motion, or a pinching-out motion. Upon detection of a user's tapping on a particular individual icon, display controller 113 displays an image of the target application associated with the particular individual icon.
In the configuration described above, display apparatus 100 executes the icon management application to control display of individual icons. After the icon management application is executed, at least one category icon is always displayed on screen 101, but an individual icon(s) is not necessarily displayed. For example, display apparatus 100 may display individual icons such that the displayed individual icons do not differ from those displayed when the icon management application was most recently executed.
In this embodiment, display apparatus 100 provides a user with a GUI in which unfolding and enfolding are performed differently depending on whether a user's operation is a tapping, a pinching-in motion, or a pinching-out motion. Specifically, display apparatus 100 enfolds or unfolds only the individual icons associated with a particular category icon(s) in response to a tapping, and enfolds or unfolds all of the individual icons associated with all of the category icons displayed in the icon list in response to a pinching-in motion or a pinching-out motion.
Details of processing of control of a display performed in display apparatus 100 will now be provided.
In step S1, main controller 110 obtains operation information. Main controller 110 determines a coordinate(s) of one or more points of touch and a change in the coordinate(s) based on the obtained operation information. Next, main controller 110, based on the operation information, recognizes the user's operation (motion) (step S2). In step S2, main controller 110 determines a position to which the operation is pointing as well as a type of the operation. The amount of movement of a point of touch and a velocity of the movement can also be detected by main controller 110, as will be described later.
Next, main controller 110, based on the detection result obtained in step S2, determines whether the user's operation is intended to designate a particular category icon (step S3). A motion for designating a particular category icon is a tapping on an area of screen 101 where the particular category icon is displayed. Thus, in step S3, main controller 110 determines whether the user's operation is a tapping and whether the tapped point is in a category icon. This operation is one example of motions for selecting a category icon(s) employed in the present invention.
When it is determined that the selecting operation is directed to a particular category icon, main controller 110 determines whether an individual icon(s) associated with the category icon is in the enfolded state (step S4). When the individual icon(s) is in the enfolded state, main controller 110 unfolds the individual icon(s) (step S5). When the individual icon(s) is not enfolded (i.e., is in the unfolded state), main controller 110 enfolds the individual icon(s) (step S6). By doing so, main controller 110 switches the state of the individual icons associated with a particular category icon between the unfolded state and the enfolded state in response to a user's selection of the particular category icon.
When an operation different from the operation for selecting a particular category icon(s) is detected, main controller 110 determines whether the detected operation is a pinching-out motion or pinching-in motion (steps S7 and S9, respectively). When a pinching-out motion is detected, main controller 110 unfolds all of the individual icons associated with all of the displayed category icons collectively (step S8). When a pinching-in motion is detected, main controller 110 enfolds all of the individual icons associated with all of the displayed category icons collectively (step S10). Hereinafter, processing of steps S8 and S10 are referred to as “a collective unfolding” and “a collective folding”, respectively. It is noted that steps S7 and S9 can be performed in a reverse order.
In step S8, main controller 110 keeps the individual icon(s) unfolded for a category within which the individual icon(s) is already in the unfolded state. Similarly, in step S10, main controller 110 keeps the individual icon(s) enfolded for a category within which the individual icon(s) is already in the enfolded state.
When main controller 110 determines in steps S7 and S9 that a detected operation is neither a pinching-out motion nor a pinching-in motion, main controller 110 initiates exceptional processing according to the user's operation (step S11). For example, when an individual icon has been tapped, main controller 110 executes the target application associated with the individual icon to display content of the target application. The exceptional processing may include a process in which no particular processing is performed; in other words, no response to the user's operation is generated.
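The branch structure of steps S2 through S10 described above may be summarized by the following Python sketch; the dictionary representation of enfolded/unfolded states and all identifiers are illustrative assumptions, and the exceptional processing of step S11 is omitted.

```python
def handle_motion(motion, tapped_category, enfolded):
    """One pass of steps S2-S10.

    enfolded: mapping of category name -> True when the category's
    individual icons are in the enfolded state. Returns the updated mapping.
    """
    enfolded = dict(enfolded)  # leave the caller's state untouched
    if motion == "tap" and tapped_category in enfolded:
        # Steps S3-S6: toggle only the tapped category.
        enfolded[tapped_category] = not enfolded[tapped_category]
    elif motion == "pinch-out":
        # Step S8: collective unfolding of every displayed category;
        # already-unfolded categories simply remain unfolded.
        enfolded = {c: False for c in enfolded}
    elif motion == "pinch-in":
        # Step S10: collective folding of every displayed category.
        enfolded = {c: True for c in enfolded}
    # Step S11 (exceptional processing) is not modeled here.
    return enfolded
```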
FIGS. 6a, 6b, and 6c show a screen transition of the icon list according to the display control of this embodiment.
In the display control of this embodiment, the user can perform enfolding/unfolding for a particular category(s) or for all of the categories by a single action. Thus, a user can adjust the number of displayed individual icons by an operation suited to the situation. For example, a user performs a pinching-out motion to search all the executable target applications, whereas the user performs a tapping to designate a particular category so as to search for an intended target application when the user can estimate its category.
In the display control of this embodiment, a state of the individual icons changes from the enfolded state to the unfolded state by a pinching-out motion to expand two fingers (widen a distance therebetween), and changes from the unfolded state to the enfolded state by a pinching-in motion to close the fingers (narrow the distance). As individual icons that have been enfolded are re-displayed by the process, the process of activating the unfolded state has a conceptual similarity to a motion of expanding fingers. Similarly, the process of activating the enfolded state has a conceptual similarity to a motion of closing expanded fingers. In this regard, the display control according to the present embodiment enables the user to change the display status of individual icon(s) by an intuitive operation.
In this embodiment, a hardware configuration of display apparatus 100 is the same as in the first embodiment, but details of the display control are different from the first embodiment. Specifically, in this embodiment the number of categories in which individual icons are to be enfolded or unfolded changes depending on the position or the amount of movement of a point of touch. In other words, the number of displayed individual icons changes depending on the position or the amount. Since a configuration of a display of this embodiment is similar to that of display apparatus 100 of the first embodiment, like numerals are used with regard to the display of this embodiment and a detailed explanation thereof is therefore omitted.
In this embodiment, the process differs from the first embodiment in step S8 (collective unfolding) and step S10 (collective folding). Processing other than steps S8 and S10 is similar to the first embodiment. In this embodiment, collective unfolding and collective folding are performed when the determination performed in step S7 or S9 is “YES”.
If the amount of movement d is greater than the threshold Th1, main controller 110 collectively unfolds all of the individual icons associated with all of the categories displayed in the icon list (step Sa2). The unfolding process of individual icons is similar to that of step S8.
If the amount of movement d is equal to or smaller than the threshold Th1, main controller 110 restricts the number of categories in which individual icons are unfolded to m. The parameter m is an integer that is equal to or more than 2 and less than the number of all the categories (i.e., 5 in the example shown in the figure).
Main controller 110 identifies positions of the two points of touch in a pinching-out motion, and determines whether the identified positions are located in an upper half area of screen 101 (step Sa3). More specifically, it is determined whether a representative point between the two identified positions is located in the upper half area of screen 101. The representative point may be chosen as the middle point of a line segment formed by the two points. Other methods of determining the representative point can be employed; for example, the determination may be made based on either one of the points of touch.
Upon determination that the pinching-out motion is performed in the upper half area, main controller 110 unfolds individual icons associated with m category icons counted from the top (step Sa4). If it is determined that the pinching-out motion is performed in the lower half area, main controller 110 unfolds individual icons associated with m category icons counted from the bottom (step Sa5). This applies, for example, in a case where the number of category icons included in the icon list is 5, as shown in the figure.
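Steps Sa1 through Sa5 may be sketched as follows; the coordinate convention (y increasing downward, so smaller y means the upper half) and all identifiers are assumptions introduced for this illustration.

```python
def multi_unfold(categories, d, p1, p2, screen_height, th1, m):
    """Return the list of categories whose individual icons are unfolded.

    categories: category names ordered from top to bottom of the icon list.
    d: amount of movement of the pinching-out motion (step Sa1's input).
    p1, p2: the two points of touch as (x, y) tuples, y increasing downward.
    """
    if d > th1:
        # Step Sa2: unfold every category collectively.
        return list(categories)
    # Step Sa3: representative point chosen as the midpoint of the segment.
    mid_y = (p1[1] + p2[1]) / 2
    if mid_y < screen_height / 2:
        return list(categories[:m])    # Sa4: m categories from the top
    return list(categories[-m:])       # Sa5: m categories from the bottom
```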
In the display control according to this embodiment, the user can change the categories to be enfolded or unfolded according to details (a position or an amount of movement) of a pinching-out motion or pinching-in motion. In this embodiment, the greater the amount by which the user moves the fingers, the more categories are designated to be enfolded or unfolded. Also, the user performs an operation in the upper half of screen 101 to unfold categories displayed toward the top. In this regard, there is a sensory similarity between a user's motion and the folding/unfolding processing. This enables the user to operate display apparatus 100 intuitively.
It is possible to give display apparatus 100 a stronger relevance between a user's motion and the category or categories within which individual icons are enfolded or unfolded. For example, when the user performs a pinching-in motion vertically with regard to the screen, if one or more category icons are displayed between an initial upper point of touch and an initial lower point of touch, display apparatus 100 enfolds the individual icons associated with the one or more category icons, and does not enfold individual icons associated with other category icons, which include a category icon(s) located above the upper initial point of touch or below the lower initial point of touch. In this case, the number of categories within which individual icons are enfolded/unfolded changes depending on the positions of the fingers, regardless of the value of m.
It is possible to introduce levels in a threshold of the amount of movement d.
In this example, main controller 110 sets a value for the parameter m such that m increases in stages in accordance with an increase of the amount of movement d. For example, if N=5, main controller 110 collectively enfolds or unfolds individual icons associated with categories by two when d≦Th1a, by three when Th1a<d≦Th1b, by four when Th1b<d≦Th1c, and by five when Th1c<d. Alternatively, values of the thresholds and the parameter m may be defined by the user.
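The staged relation between the amount of movement d and the parameter m in this example can be sketched as follows; the concrete threshold values stand in for Th1a, Th1b, and Th1c and are illustrative assumptions.

```python
import bisect

def movement_to_m(d, thresholds=(30.0, 60.0, 90.0), base=2):
    """Map the amount of movement d to the parameter m in stages.

    With the assumed thresholds (Th1a, Th1b, Th1c) and N = 5 categories this
    reproduces the example in the text: m = 2 when d <= Th1a, 3 when
    Th1a < d <= Th1b, 4 when Th1b < d <= Th1c, and 5 when Th1c < d.
    """
    # bisect_left counts how many thresholds d strictly exceeds,
    # so d equal to a threshold stays in the lower stage.
    return base + bisect.bisect_left(thresholds, d)
```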
In this embodiment, a configuration of a display is the same as that of the first and second embodiments, but details of the collective unfolding and collective folding included in the display control are different from the first and second embodiments. Specifically, in this embodiment, the categories within which individual icons are enfolded/unfolded are determined based on the amounts of movement of the two points of touch in a pinching-in motion or pinching-out motion. Since a configuration of a display of this embodiment is similar to that of display apparatus 100 of the first embodiment, like numerals are used with regard to the display of this embodiment and detailed explanations thereof are therefore omitted.
In this embodiment, it is assumed that a pinching-in motion and pinching-out motion are performed vertically with respect to screen 101. Hereinafter, of the two points of touch, the upper one is referred to as “the first point of touch” and the lower one is referred to as “the second point of touch”. When the pinching-in motion or pinching-out motion is performed with a thumb and forefinger, normally, the position of the forefinger is the first point of touch and the position of the thumb is the second point of touch.
If a result of the determination in step Sc1 is YES, main controller 110 compares the amounts of movement d1 and d2 (step Sc2). If the amount of movement d1 is larger, main controller 110 unfolds individual icons associated with n category icons selected from the top of screen 101 (step Sc3). If the amount of movement d2 is larger, main controller 110 unfolds individual icons associated with n category icons selected from the bottom of screen 101 (step Sc4). Similarly to the parameter m described above, the parameter n is an integer more than two and less than the number of all categories.
If the determination in step Sc1 is NO, main controller 110 performs the multi-unfolding of the second embodiment (step Sc5). The processing of step Sc5 is similar to the series of processes starting from step Sa1 and ending with step Sa5, described above.
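Steps Sc2 through Sc4 may be sketched as follows; the identifiers are assumptions introduced for illustration, and the fallback to the second embodiment's processing (step Sc5) is omitted.

```python
def unfold_by_finger_movement(categories, d1, d2, n):
    """Choose the categories to unfold based on which finger moved more.

    categories: category names ordered from top to bottom of the icon list.
    d1: amount of movement of the upper (first) point of touch.
    d2: amount of movement of the lower (second) point of touch.
    """
    if d1 > d2:
        # Step Sc3: the upper finger moved more, so unfold from the top.
        return list(categories[:n])
    # Step Sc4: the lower finger moved more, so unfold from the bottom.
    return list(categories[-n:])
```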
FIGS. 12a and 12b show the amounts of movement d1 and d2. P1 and P2 represent the initial first point of touch and the initial second point of touch, respectively. A dashed line represents a trajectory of a point of touch.
In this embodiment, similarly to the second embodiment, a category icon(s) within which individual icons are to be enfolded/unfolded is determined based on a position of a point of touch. For example, when the second point of touch remains in the same position and movement of only the first point of touch is detected, as shown in the figure, main controller 110 unfolds individual icons associated with category icons selected from the top of screen 101.
The scope of the present invention is not limited to the embodiments described above, and the present invention can be implemented as other embodiments. For example, the present invention can be implemented based on the modified embodiments provided below. Also, the present invention can be implemented based on a combination of the modified embodiments. Characteristic features of the embodiments described above may be selectively combined to implement the present invention. As an example, unfolding of individual icons is performed based on the second embodiment and enfolding of individual icons is performed based on the third embodiment.
Although a screen for displaying an image and a sensing surface that senses a user's motion physically overlap in the above described embodiments, an input unit of the present invention does not necessarily include a sensor provided overlapping with the screen. For example, a touch pad, which is also referred to as a track pad or slide pad, may be employed to sense a user's motion.
Individual icons of the present invention do not necessarily represent functionalities of applications. For example, in the present invention, an individual icon may consist of a thumbnail-size image or the like representative of image data or audio data, and a category icon may be an image (such as a folder-shaped image) for categorizing data. An individual icon of the present invention may also represent the electronic money or electronic coupon described above. In short, an individual icon is an example of an image assigned to data for identifying the data.
A category icon of the present invention may be an image displayed in the background of individual icons, partially overlapping the individual icons. Also, a category icon of the present invention may disappear when the individual icons are in the unfolded state and be displayed again when the individual icons are in the enfolded state. Simply put, details of a category icon of the present invention are not restricted to the above embodiments, as long as a category icon represents a category of individual icons. The number of categories or category icons may be one. In this case, similarly to the embodiments described above, the user can change the display status of individual icons by an intuitive operation to enfold/unfold the individual icons in response to a pinching-out motion or pinching-in motion.
In the above-described embodiments, only a part of the icon list is displayed when the number of individual icons is large, without a size of the individual icons changing. It is also possible to display all the individual icons within the screen by changing a size of the individual icons or category icons.
In the present invention, instructions can be input by a stylus or other pointing device held by a user or worn on a hand, rather than by fingers. In this case, the input unit may detect a position of the pointing device described above by infrared light or ultrasonic waves. When a magnetic material is provided at the end of the pointing device, a position may be detected magnetically. Simply put, a touch screen is not necessarily provided with a display apparatus of the present invention. A pinching-in motion or pinching-out motion may be performed using a finger of one hand and a finger of the other hand.
A user's touch of a screen is not necessarily required to input operation information. For example, when detection by touch screen 130 of display apparatus 100 is performed based on a change in capacitance, the position of a finger may be detected while the finger is approaching screen 101.
In the present invention, a tapping is not necessarily used for selecting a particular category icon(s) from a plurality of category icons. Other motions can be used for the selection as long as a motion is distinctive from a pinching-out motion and pinching-in motion. For example, such a motion includes a double tapping.
The present invention can be adapted for a game console, audio player, e-book reader, or other electronic devices. It is also possible to implement the present invention by cooperation between an apparatus having a screen and another apparatus, provided independently of the display apparatus, that controls the display apparatus.
In this case, the other apparatus has at least the functionality shown in the figure.
Number | Date | Country | Kind |
---|---|---|---|
2011-108669 | May 2011 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP12/62148 | 5/11/2012 | WO | 00 | 9/4/2013 |