1. Field of the Invention
The present invention relates to an information terminal, a screen component display method, and the like.
2. Related Art of the Invention
Information terminals such as PDAs, smartphones, tablet PCs, and car navigation systems are becoming widely used. To reduce their size, such information terminals typically adopt a touch panel through which information is input by touching, with a stylus or a finger, an icon or another screen component of a GUI (Graphical User Interface) displayed on a display. With a touch panel, screen components including a plurality of icons are displayed on a display screen, and by touching an icon with a stylus or a finger, the icon is decided on and an application program assigned to the icon can be activated.
While such information terminals require that a touch panel be touched in order to decide on an icon, an information terminal is disclosed in which, by bringing a finger close to a touch panel, icons on the touch panel are gathered around the finger (for example, refer to Japanese Patent Laid-Open No. 2008-117371).
However, with the information terminal according to Japanese Patent Laid-Open No. 2008-117371 described above, since all displayed icons move so as to surround the finger, a problem exists in that a desired icon is difficult to find when a large number of icons are displayed.
In addition, since all displayed icons move at once, it is difficult to discern original positions of the icons prior to movement thereof.
The present invention is made in consideration of the problems existing in the conventional information terminal described above, and an object of the present invention is to provide an information terminal and a screen component display method that enable a desired screen component to be easily found and an original state of a screen component prior to movement thereof to be easily discerned.
To achieve the above object, the 1st aspect of the present invention is an information terminal comprising:
a display surface that displays a plurality of screen components;
a motion display detecting unit that detects information related to a distance from the display surface to a designating object that designates the screen components, and information related to a position of the designating object in a plane parallel to the display surface; and
a screen component movement rendering unit that motion-displays in sequence the screen components selected according to a selection rule, to such positions on the display surface obtained according to a display rule, based on the information related to the distance and the information related to the position.
The 2nd aspect of the present invention is the information terminal according to the 1st aspect of the present invention, wherein the display rule refers to a position directly underneath the detected position or a vicinity of the position directly underneath the detected position.
The 3rd aspect of the present invention is the information terminal according to the 1st aspect of the present invention, wherein the selection rule refers to selecting the screen components based on a genre or a decision history.
The 4th aspect of the present invention is the information terminal according to the 1st aspect of the present invention, wherein the sequence refers to an order determined based on a decision history.
The 5th aspect of the present invention is the information terminal according to the 1st aspect of the present invention, wherein when the motion display detecting unit detects that the designating object is separated from the display surface beyond a predetermined distance, the screen component movement rendering unit restores the motion-displayed screen components to respective original states before the motion display of the screen components.
The 6th aspect of the present invention is the information terminal according to the 1st aspect of the present invention, further comprising
a decided position judging unit which, when the motion display detecting unit detects that the designating object enters within a definite distance of the display surface, the definite distance being shorter than a predetermined distance, and detects a position of the designating object in a plane parallel to the display surface, judges the screen component displayed at a position on the display surface directly underneath the detected position of the designating object and detects that the judged screen component is decided on by the designating object.
The 7th aspect of the present invention is the information terminal according to the 6th aspect of the present invention, further comprising
a change screen component displaying unit which, when the motion display detecting unit detects that the designating object enters within the predetermined distance of the display surface and detects a position of the designating object in a plane parallel to the display surface, displays a change screen component for changing the motion-displayed screen components to other screen components, at a position directly underneath the detected position of the designating object or in a vicinity of the position directly underneath the detected position of the designating object, wherein
when the decided position judging unit detects that the change screen component is decided on by the designating object, in order to perform the changing, the screen component movement rendering unit restores the motion-displayed screen components to original states before the motion display of the screen components and motion-displays in sequence screen components selected based on a second selection rule that differs from the selection rule, to such positions on the display surface obtained according to a second display rule that differs from the display rule, based on the position of the designating object detected by the motion display detecting unit.
The 8th aspect of the present invention is the information terminal according to the 6th aspect of the present invention, further comprising
an addition screen component displaying unit which, when the motion display detecting unit detects that the designating object enters within the predetermined distance of the display surface and detects a position of the designating object in a plane parallel to the display surface, displays an addition screen component for adding another screen component to the motion-displayed screen components, at a position directly underneath the detected position of the designating object or in a vicinity of the position directly underneath the detected position of the designating object, wherein
when the decided position judging unit detects that the addition screen component is decided on by the designating object, in order to perform the adding, the screen component movement rendering unit does not restore the motion-displayed screen components to original states before the motion display of the screen components and motion-displays in sequence screen components selected based on a second selection rule that differs from the selection rule, to such positions on the display surface obtained according to a second display rule that differs from the display rule, based on the position of the designating object detected by the motion display detecting unit.
The 9th aspect of the present invention is the information terminal according to the 5th aspect of the present invention, wherein the screen component movement rendering unit restores the motion-displayed screen components in a determined sequence when restoring the motion-displayed screen components to original states before the motion display of the screen components.
The 10th aspect of the present invention is the information terminal according to the 7th aspect of the present invention, wherein when the change screen component is decided on by the designating object, the change screen component displaying unit erases the change screen component.
The 11th aspect of the present invention is the information terminal according to the 8th aspect of the present invention, wherein when the addition screen component is decided on by the designating object, the addition screen component displaying unit erases the addition screen component.
The 12th aspect of the present invention is the information terminal according to the 1st aspect of the present invention, wherein groups of the selected screen components to be motion-displayed at least partially differ from each other according to a position of the designating object detected by the motion display detecting unit.
The 13th aspect of the present invention is the information terminal according to the 1st aspect of the present invention, wherein the motion display detecting unit includes a capacitance panel arranged adjacent to the display surface in order to detect, by a capacitance method, the information related to a distance from the display surface to the designating object that designates the screen components, and the information related to a position of the designating object in a plane parallel to the display surface.
The 14th aspect of the present invention is the information terminal according to the 1st aspect of the present invention, wherein the information related to a distance from the display surface to the designating object designating the screen components refers to detection that the designating object designating the screen components enters within each of n types (where n is a natural number equal to or greater than 1) of predetermined distances of the display surface.
The 15th aspect of the present invention is a screen component display method comprising:
a display step of displaying a plurality of screen components on a display surface;
a motion display detecting step of detecting information related to a distance from the display surface to a designating object that designates the screen components, and information related to a position of the designating object in a plane parallel to the display surface; and
a screen component movement rendering step of motion-displaying in sequence the screen components selected according to a selection rule, to such positions on the display surface obtained according to a display rule, based on the information related to the distance and the information related to the position.
The 16th aspect of the present invention is a program embodied on a non-transitory computer-readable medium, the program causing a computer to execute the screen component display method according to the 15th aspect of the present invention.
The 17th aspect of the present invention is the information terminal according to the 7th aspect of the present invention, wherein the screen component movement rendering unit restores the motion-displayed screen components in a determined sequence also when restoring the motion-displayed screen components to original states before the motion display of the screen components.
According to the present invention, an information terminal and a screen component display method that enable a desired screen component to be easily found and an original state of a screen component prior to movement thereof to be easily discerned can be provided.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the drawings.
An information terminal according to a first embodiment of the present invention will now be described.
In addition, as illustrated in
In the present embodiment, positions higher than Z1 above the surface 15a are configured as a non-detection region 19, that is, a region in which the finger 50 is not detected even when existing in the region. A region within Z1 from the surface 15a is set as a detection region 16 in which the presence of the finger 50 is detected. The detection region 16 is further divided into two regions, namely, a decision region 18 from the surface 15a to Z2 and a motion display region 17 from Z2 to Z1. A plane parallel to the display surface 11a at Z1 is illustrated by a dotted line as plane P, and a plane parallel to the display surface 11a at Z2 is illustrated by a dotted line as plane Q. While a detailed description will be given later, a penetration of the finger 50 into the decision region 18 from the motion display region 17 means that the finger 50 decides on the icon 12 displayed directly underneath the finger 50.
In the present embodiment, detection points 15b are formed in a matrix on the contactless input unit 15 on the upper side of the display surface 11a. As the finger 50 approaches, the capacitance variation increases at detection points in the vicinity of the position directly underneath the finger 50. In addition, based on the capacitance variation at a detection point 15b where a maximum variation is detected, the position (z-axis position) to which the finger 50 has approached the detection point 15b is detected. In other words, the coming and going of the finger 50 between the non-detection region 19 and the motion display region 17 and between the motion display region 17 and the decision region 18 can be detected. Furthermore, by detecting the detection point 15b whose capacitance variation is maximum, a position (x-y coordinate) of the finger 50 on a plane parallel to the display surface 11a can be detected. In other words, while the finger 50 comes and goes between the non-detection region 19 and the motion display region 17, a position (x-y coordinate) on the plane P that is an interface between the non-detection region 19 and the motion display region 17 can be detected, and while the finger 50 comes and goes between the motion display region 17 and the decision region 18, a position (x-y coordinate) on the plane Q that is an interface between the motion display region 17 and the decision region 18 can be detected. Moreover, an example of a predetermined distance according to the present invention corresponds to a length that is a sum of Z1 and a thickness h (refer to
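As an informal illustration only, the following sketch shows how a grid of capacitance variations of this kind might be reduced to an x-y position and a z region; the threshold values, the variation-to-distance model, and all names are assumptions for the sketch, not values from the embodiment.

```python
# Hypothetical sketch: derive the finger position from a matrix of capacitance
# variations measured at the detection points 15b.

Z1 = 30.0            # assumed boundary between the non-detection region 19 and the motion display region 17
Z2 = 10.0            # assumed boundary between the motion display region 17 and the decision region 18
MIN_VARIATION = 5.0  # below this, the finger is treated as being in the non-detection region 19

def variation_to_distance(v):
    # Assumed monotone model: the variation grows as the finger approaches the surface 15a.
    return 60.0 / v

def locate_finger(variations):
    """variations: 2D list of capacitance changes, one per detection point 15b.
    Returns (x, y, region) or None when the finger is not detected."""
    max_var, x, y = max((v, ix, iy)
                        for iy, row in enumerate(variations)
                        for ix, v in enumerate(row))
    if max_var < MIN_VARIATION:
        return None                                   # non-detection region 19
    z = variation_to_distance(max_var)
    region = "decision" if z <= Z2 else "motion_display" if z <= Z1 else "none"
    return None if region == "none" else (x, y, region)
```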
Furthermore, while a capacitance method is used to detect a three-dimensional position of the finger 50 in the present embodiment, an infrared system may alternatively be used. In such a case, a three-dimensional position of a finger can be detected by, for example, providing a plurality of infrared irradiating units and light receiving units at an end of the display surface 11a and detecting the blocking of infrared rays by the finger.
In addition, also provided are: a third detecting unit 24 which detects that the finger 50 moves from the motion display region 17 to the non-detection region 19 and transmits the detection result to the change icon displaying unit 23 and the icon movement rendering unit 22; and a seventh detecting unit 33 that detects a position (an x-y coordinate position on the plane P) where the finger 50 entered the non-detection region 19. When a movement of the finger 50 to the non-detection region 19 in a space above the detection area 14 is detected by the third detecting unit 24 and the seventh detecting unit 33, the change icon displaying unit 23 erases the change icon 30 and the icon movement rendering unit 22 restores the icon 12 to an original state thereof.
Furthermore, also provided are: a fourth detecting unit 25 which detects that the finger 50 enters the decision region 18 from the motion display region 17; and a fifth detecting unit 26 which detects, when it is detected that the finger 50 enters the decision region 18, a position of the finger 50 on a plane parallel to the display surface 11a (the plane Q in
Moreover, provided are: a sixth detecting unit 31 which detects that the finger 50 moves from the decision region 18 to the motion display region 17; and an eighth detecting unit 34 that detects a position (an x-y coordinate position on the plane Q) where the finger 50 entered the motion display region 17. When a movement of the finger 50 to the motion display region 17 in a space above the detection area 14 is detected by the sixth detecting unit 31 and the eighth detecting unit 34, the change icon displaying unit 23 displays the change icon 30 on the display surface 11a.
The contactless input unit 15 illustrated in
In other words, the first detecting unit 20, the second detecting unit 21, the third detecting unit 24, the fourth detecting unit 25, the fifth detecting unit 26, the sixth detecting unit 31, the seventh detecting unit 33, and the eighth detecting unit 34 respectively include the contactless input unit 15 and a computing unit that computes a three-dimensional position of the finger from a capacitance value obtained from the contactless input unit 15.
In addition, an example of a motion display detecting unit according to the present invention corresponds to the first detecting unit 20, the second detecting unit 21, the third detecting unit 24, the fourth detecting unit 25, and the fifth detecting unit 26 according to the present embodiment.
Furthermore, an example of a screen component movement rendering unit according to the present invention corresponds to the icon movement rendering unit 22 according to the present embodiment. An example of a change screen component displaying unit according to the present invention corresponds to the change icon displaying unit 23 according to the present embodiment.
Next, operations performed by the information terminal 10 according to the present first embodiment will be described together with an example of the screen component display method according to the present invention.
At the same time the icons 12 are displayed, detection of the finger 50 in a space perpendicular to the display surface 11a (the z-axis direction illustrated in
First, a case will be described in which the finger 50 moves in the space above the detection area 14 of the display surface 11a from the non-detection region 19 to the motion display region 17.
In S10, when the first detecting unit 20 detects that the finger 50 enters the motion display region 17 from the non-detection region 19 in the space above the detection area 14 of the display surface 11a, the second detecting unit 21 detects a position where the finger 50 entered the motion display region 17 (an x-y coordinate position of the finger 50 passing through the plane P). The detection by the second detecting unit 21 doubles as a detection of a movement of the finger 50 at a position on the detection area 14. Moreover, in the present embodiment, the contactless input unit 15 employing a capacitance method simultaneously detects that the finger 50 enters the motion display region 17 from the non-detection region 19 and an x-y coordinate position on the plane P upon entry of the finger 50 to the motion display region 17.
Specifically, it is recognized that the finger 50 moves from the non-detection region 19 to the motion display region 17 when a position of the finger 50 is not detected at a given sampling time and the finger 50 is detected at a position in the motion display region 17 at a next sampling time. Furthermore, the position in the motion display region 17 where the finger 50 is detected at this point can be assumed to be the x-y coordinate position on the plane P of the finger 50 upon entry to the motion display region 17.
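The sampling logic described here can be pictured roughly as follows; the sample layout and region labels are illustrative assumptions, and the same comparison covers the other region transitions described later in this embodiment.

```python
# Hypothetical sketch of the sampling-based recognition: the region seen at the
# previous sampling time is compared with the region seen at the current one.

def classify_transition(prev_sample, curr_sample):
    """Each sample is None (finger not detected, i.e. in the non-detection region 19)
    or a tuple (x, y, region) with region in {"motion_display", "decision"}."""
    prev = prev_sample[2] if prev_sample else "none"
    curr = curr_sample[2] if curr_sample else "none"
    if prev == "none" and curr == "motion_display":
        # Entry into the motion display region 17; the x-y of the current sample
        # is taken as the position on the plane P.
        return "enter_motion_display", curr_sample[:2]
    if prev == "motion_display" and curr == "decision":
        return "enter_decision", curr_sample[:2]          # position on the plane Q
    if prev == "decision" and curr == "motion_display":
        return "return_to_motion_display", curr_sample[:2]
    if prev == "motion_display" and curr == "none":
        return "leave_to_non_detection", prev_sample[:2]  # last detected position
    return "no_transition", None
```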
In addition, an example of a motion display detecting step according to the present invention corresponds to the detection by the first detecting unit 20 of the finger 50 entering the motion display region 17 from the non-detection region 19 in the space above the detection area 14 of the display surface 11a and the detection by the second detecting unit 21 of the position where the finger 50 entered the motion display region 17 (the x-y coordinate position of the finger 50 passing through the plane P).
Next, in S11, the change icon displaying unit 23 displays the change icon 30 at a position on the display surface 11a directly underneath the position where the finger 50 entered the motion display region 17.
Next, in S12, as illustrated in
Next, in S13, the selected icons 12A, 12B, and 12C are motion-displayed and gathered one by one at staggered timings by the icon movement rendering unit 22 to predetermined positions in the periphery of the change icon 30 and control is completed.
In this case, motion display refers to displaying an icon at slightly shifted positions in succession, from its display position before movement to its display position after movement, so that the user perceives the icon as moving.
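Purely as an illustration of this notion of motion display, the sketch below draws an icon at a series of intermediate positions between its position before movement and its position after movement; the frame count, the linear interpolation, and the drawing callback are assumptions, not part of the disclosed method.

```python
# Hypothetical sketch of "motion display": the icon is drawn at a series of
# slightly shifted intermediate positions so that the user perceives it as moving.

def motion_display(icon, start, end, frames=12, draw=print):
    (x0, y0), (x1, y1) = start, end
    for i in range(1, frames + 1):
        t = i / frames                          # progresses toward 1 over the animation
        draw(f"{icon} at ({x0 + (x1 - x0) * t:.1f}, {y0 + (y1 - y0) * t:.1f})")

# e.g. motion_display("icon 12A", start=(300.0, 40.0), end=(120.0, 200.0))
# Gathering the icons "one by one at staggered timings" would amount to calling
# this for icon 12A first, then icon 12B, then icon 12C.
```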
A predetermined position in the periphery of the change icon 30 corresponds to an example of a position on a display surface obtained by a display rule according to the present invention. Moreover, S12 and S13 correspond to an example of a screen component movement rendering step according to the present invention.
In
Next, a case will be described in which the finger 50 moves in the space above the detection area 14 of the display surface 11a from the motion display region 17 to the decision region 18.
In S20, a transit of the finger 50 from the motion display region 17 to the decision region 18 is detected by the fourth detecting unit 25, and a position where the finger enters the decision region 18 from the motion display region 17 (an x-y coordinate position of the finger 50 passing through the plane Q) is detected by the fifth detecting unit 26. The detection by the fifth detecting unit 26 doubles as a detection of a movement of the finger 50 at a position above the detection area 14. Moreover, in the present embodiment, the contactless input unit 15 employing a capacitance method simultaneously detects a transit of the finger 50 from the motion display region 17 to the decision region 18 and an x-y coordinate position on the plane Q upon entry of the finger 50 to the decision region 18.
Specifically, it is recognized that the finger 50 moves from the motion display region 17 to the decision region 18 when a position of the finger 50 is detected at a given sampling time in the motion display region 17 and a position of the finger 50 is detected in the decision region 18 at a next sampling time. In addition, the position where the finger 50 is detected in the decision region 18 at this point can be assumed to be the position where the finger entered the decision region 18 (the x-y coordinate position of the finger 50 passing through the plane Q). Alternatively, the position of the finger 50 last detected in the motion display region 17 may be considered to be the position where the finger entered the decision region 18, or an intersection point of a line connecting the position of the finger 50 in the motion display region 17 and the position of the finger 50 in the decision region 18 at the two sampling times described above with the plane Q may be considered to be the position where the finger entered the decision region 18.
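The three ways of estimating the entry position mentioned above can be illustrated as follows; the coordinate representation and the use of a straight line between the two sampled positions are assumptions for the sketch.

```python
# Hypothetical sketch of the three estimates of where the finger crossed the
# plane Q between two sampling times (p_prev in the motion display region 17,
# p_curr in the decision region 18).

def entry_position(p_prev, p_curr, z_plane, method="intersection"):
    """p_prev, p_curr: (x, y, z) finger positions at consecutive sampling times."""
    if method == "first_in_new_region":   # position first detected in the decision region 18
        return p_curr[:2]
    if method == "last_in_old_region":    # position last detected in the motion display region 17
        return p_prev[:2]
    # Intersection of the segment p_prev -> p_curr with the plane Q (z = z_plane);
    # assumes the z values of the two samples differ.
    (x0, y0, z0), (x1, y1, z1) = p_prev, p_curr
    t = (z0 - z_plane) / (z0 - z1)        # fraction of the way from p_prev to p_curr
    return (x0 + (x1 - x0) * t, y0 + (y1 - y0) * t)
```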
Next, in S21, the decided position judging unit 27 judges which icon is displayed directly underneath the position of the finger 50 detected by the fifth detecting unit 26, and assumes that the finger 50 decides on the displayed icon 12.
Next, in S22, the change icon 30 is erased by the change icon displaying unit 23.
In addition, in S23, a judgment is made on whether or not the change icon 30 is decided on in S21 above, and if the change icon 30 is decided on, control proceeds to S24. If the change icon 30 is not decided on, control proceeds to S27.
If the change icon 30 is decided on, in S24, the motion-displayed icons 12A, 12B, and 12C are restored to their original states (the initial states illustrated in
Subsequently, in S25, an icon 12 to be motion-displayed next is selected by the moved icon selecting unit 29 based on a selection rule that differs from the initially-used selection rule. For example, the initially-used selection rule can be set so that the icons to be motion-displayed first are selected in descending order of the number of times they have been decided on, and a different rule, such as selecting the icons to be motion-displayed next in reverse chronological order of the decision history, can be set in advance. Alternatively, an icon related to a music genre may be selected as the icon to be motion-displayed first and an icon related to a movie genre may be selected as the icon to be motion-displayed next. As described, various switchovers of icons are conceivable when the change icon 30 is decided on, including a switchover of history types, a switchover of genres, a switchover of artists, a switchover of albums, and a switchover of playlists. Moreover, an example of the selection rule according to the present invention corresponds to the initially-used selection rule according to the present embodiment, and an example of a second selection rule according to the present invention corresponds to the selection rule that differs from the initially-used selection rule according to the present embodiment.
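As an informal illustration of such selection rules, the following sketch selects icons either by the number of past decisions or by how recently they were last decided on; the history record structure and the rule names are hypothetical.

```python
# Hypothetical sketch of two selection rules applied to a decision history.
# Each history record is assumed to look like
#   {"icon": "icon 12A", "decide_count": 7, "last_decided": 1625000000}.

def select_icons(history, rule, count=3):
    if rule == "most_decided":    # e.g. the initially-used rule: descending number of decisions
        ranked = sorted(history, key=lambda r: r["decide_count"], reverse=True)
    elif rule == "most_recent":   # e.g. the second rule: reverse chronological order of decisions
        ranked = sorted(history, key=lambda r: r["last_decided"], reverse=True)
    else:
        raise ValueError(f"unknown selection rule: {rule}")
    return [r["icon"] for r in ranked[:count]]

# Deciding on the change icon 30 would correspond to restoring the icons chosen
# with one rule and then motion-displaying the icons chosen with the other rule.
```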
In addition, a predetermined position in the periphery of the change icon 30 such as that illustrated in
Next, in S26, as illustrated in
On the other hand, when it is judged in S23 that the change icon 30 is not decided on, in S27, a judgment is made on whether or not any of the motion-displayed icons 12A, 12B, and 12C is decided on.
Subsequently, when it is judged in S27 that an icon among the motion-displayed icons 12A, 12B, and 12C is decided on, the application assigned to the decided icon is activated by the designating unit 28 in S28. For example, when the decided icon is an icon related to a game, the game is activated, and in the case of an icon related to music, an application for reproducing a music file is activated and music is reproduced.
In addition, when it is judged in S27 that none of the motion-displayed icons 12A, 12B, and 12C is decided on, the control is completed.
Next, a case will be described in which the finger 50 moves in the space above the detection area 14 of the display surface 11a from the decision region 18 to the motion display region 17.
When the finger 50 exists in the decision region 18 above the detection area 14, the change icon 30 is erased from the screen by the control of S22. In this state, in S30, a transit of the finger 50 from the decision region 18 to the motion display region 17 is detected by the sixth detecting unit 31, and a position where the finger 50 enters the motion display region 17 (an x-y coordinate position of the finger 50 passing through the plane Q) is detected by the eighth detecting unit 34. The x-y coordinate is detected by the eighth detecting unit 34 and indicates whether a movement of the finger 50 occurred in the space above the detection area 14. Moreover, in the present embodiment, the contactless input unit 15 employing a capacitance method simultaneously detects that the finger 50 enters the motion display region 17 from the decision region 18 and an x-y coordinate position on the plane Q upon entry of the finger 50 to the motion display region 17.
Specifically, it is recognized that the finger 50 moves from the decision region 18 to the motion display region 17 when a position of the finger 50 is detected at a given sampling time in the decision region 18 and a position of the finger 50 is detected in the motion display region 17 at a next sampling time. In addition, the position of the finger 50 detected in the motion display region 17 at this point can be assumed to be the position where the finger entered the motion display region 17 (the x-y coordinate position of the finger 50 passing through the plane Q). Alternatively, the position of the finger 50 last detected in the decision region 18 may be considered to be the position where the finger entered the motion display region 17, or an intersection point of a line connecting the position of the finger 50 in the motion display region 17 and the position of the finger 50 in the decision region 18 at the two sampling times described above with the plane Q may be considered to be the position where the finger entered the motion display region 17.
Subsequently, in S31, the change icon 30 is once again displayed at the originally-displayed position by the change icon displaying unit 23 and the control is completed.
Moreover, as illustrated in
Next, a case will be described in which the finger 50 moves in the space above the detection area 14 of the display surface 11a from the motion display region 17 to the non-detection region 19.
When the finger 50 exists in the motion display region 17 above the detection area 14, the change icon 30 is displayed on the screen by S11 and S31. In this state, in S40, a transit of the finger 50 from the motion display region 17 to the non-detection region 19 in the space above the detection area 14 is detected by the third detecting unit 24, and a position where the finger 50 enters the non-detection region 19 (an x-y coordinate position of the finger 50 passing through the plane P) is detected by the seventh detecting unit 33. The x-y coordinate of the finger 50 is detected by the seventh detecting unit 33 and indicates whether a movement of the finger 50 occurred in the space above the detection area 14. Moreover, in the present embodiment, the contactless input unit 15 employing a capacitance method simultaneously detects that the finger 50 enters the non-detection region 19 from the motion display region 17 and an x-y coordinate position on the plane P upon entry of the finger 50 to the non-detection region 19.
Specifically, it is recognized that the finger 50 moves from the motion display region 17 to the non-detection region 19 when a position of the finger 50 is detected at a given sampling time at a position in the motion display region 17 and a position of the finger 50 is not detected at a next sampling time. Furthermore, the position in the motion display region 17 where the finger 50 is detected at this point can be assumed to be the x-y coordinate position on the plane P of the finger 50 upon entry to the non-detection region 19.
Subsequently, in S41, the change icon displaying unit 23 erases the change icon 30.
Then, as illustrated in
Next, a case will be described in which the finger 50 moves in the space above the display area 13 of the display surface 11a from the motion display region 17 to the decision region 18.
In S50, when a transit of the finger 50 from the motion display region 17 to the decision region 18 is detected by the fourth detecting unit 25, a position where the finger 50 transited from the motion display region 17 to the decision region 18 is detected by the fifth detecting unit 26. The x-y coordinate of the finger 50 is detected by the fifth detecting unit 26 and indicates whether a movement of the finger 50 occurred in the space above the display area 13.
Next, in S51, the decided position judging unit 27 judges which icon is displayed directly underneath the position of the finger 50 detected by the fifth detecting unit 26, and assumes that the finger decided on the displayed icon 12.
Subsequently, when it is judged in S52 that any of the icons 12 not motion-displayed as illustrated in
Conversely, the control is also completed when it is judged that none of the icons is decided on.
As described, in the present embodiment, since only selected icons among a plurality of icons existing on the information terminal are moved to and displayed in a vicinity of a finger, a desired icon is easy to find even if a large number of icons exist on the information terminal.
In addition, in the present embodiment, since icons 12 to be motion-displayed are motion-displayed and gathered one by one, a user can identify original states of the icons. Therefore, since the user is able to learn the original states of the icons, when a finger is brought close to the display area 13 to directly decide on a desired icon 12, the icon 12 can now be promptly decided on without having to locate the position of the icon 12.
Furthermore, in the present embodiment, since the icons 12 are restored one at a time even when being restored to their original states, the user can further learn the original positions of the icons.
Moreover, in the present embodiment, when the finger 50 moves from the motion display region 17 to the decision region 18, by erasing the change icon 30 as described in S22, the user can be reminded that the finger 50 exists in the decision region 18. Therefore, when deciding on either the change icon 30 or a motion-displayed icon 12, the user can be reminded that the finger must be moved to the motion display region 17. In addition, when the finger 50 is moved from the decision region 18 to the motion display region 17, by displaying the change icon 30 as described in S31, the user can be reminded that the finger 50 exists in the motion display region 17.
The configuration described above can be realized by a computer; for example, it can be realized as a configuration of an information terminal such as that illustrated in
The information terminal illustrated in
Moreover, in the present embodiment, when a finger moves in a space above the detection area 14 from the non-detection region 19 to the motion display region 17, only the icons 12A, 12B, and 12C are motion-displayed to the periphery of the change icon 30 as illustrated in
In addition, an example of a change screen component according to the present invention corresponds to the change icon 30 according to the present embodiment, and an example of a change screen component displaying unit according to the present invention corresponds to the change icon displaying unit 23 according to the present embodiment. In the present embodiment, the change icon 30 is shown, which restores a motion-displayed icon selected according to a predetermined selection rule to an original state thereof and motion-displays an icon selected according to another predetermined selection rule (an icon of a different type). However, without restoring the motion-displayed icon to the original state, an icon of a different type may additionally be motion-displayed to the periphery of the motion-displayed icon. Specifically, when a finger moves from the non-detection region 19 to the motion display region 17, only the icons 12A, 12B, and 12C are motion-displayed to the periphery of an addition icon 32 as illustrated in
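The difference between the change icon 30 and the addition icon 32 described here can be sketched as follows; the function name, the restore callback, and the group representation are illustrative assumptions only.

```python
# Hypothetical sketch: deciding on the change icon 30 restores the currently
# motion-displayed group before the next group is shown, whereas deciding on
# the addition icon 32 keeps the current group and adds the next group to it.

def switch_icon_group(displayed_group, next_group, restore_to_initial, mode):
    if mode == "change":      # change icon 30
        restore_to_initial(displayed_group)
        return list(next_group)
    if mode == "add":         # addition icon 32
        return list(displayed_group) + [i for i in next_group if i not in displayed_group]
    raise ValueError(f"unknown mode: {mode}")
```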
Furthermore, while three icons 12 (icons 12A, 12B, and 12C) are selected according to a single predetermined selection rule in
Moreover, in the above description, while the icon 12 to be motion-displayed is displayed on the display surface 11a before movement thereof, an icon which is not displayed on the display surface 11a and which is displayed by scrolling the screen may be arranged so as to move to the periphery of the change icon 30. For example, supposing that the icon 12C among the selected icons 12A, 12B, and 12C is not displayed on the display surface 11a, as illustrated in
As shown, by first scrolling to cause the icon 12C to be displayed and then motion-displaying the icon 12C, even when an icon is not displayed on the display surface 11a, the user is able to learn an original state of the icon.
In addition, icon display need not be limited to that illustrated in
Next, an information terminal according to a second embodiment of the present invention will now be described.
While the information terminal according to the present second embodiment is basically configured the same as that according to the first embodiment, the information terminal according to the present second embodiment differs in that a detection area is divided into a left side and a right side. Therefore, a description will be given focusing on this difference. Moreover, like components to the first embodiment are designated by like reference characters.
With the information terminal 40 according to the present second embodiment, when a finger 50 moves from a non-detection region 19 to a motion display region 17 in a space above the first detection area 14a, a change icon 30 is displayed as illustrated in
On the other hand, when the finger 50 moves from the non-detection region 19 to the motion display region 17 in a space above the second detection area 14b, a change icon 30 is displayed as illustrated in
As described above, in the present second embodiment, by dividing the detection area 14 into the first detection area 14a and the second detection area 14b, a desired icon can be found more quickly by adopting a setting where, for example, a detection in the first detection area 14a causes an icon related to a game to be motion-displayed and a detection in the second detection area 14b causes an icon related to a net application to be motion-displayed.
Moreover, while groups of icons to be motion-displayed are completely different between the first detection area 14a and the second detection area 14b in the present second embodiment, a portion of the groups of icons may be overlapped. An example of a group of screen components according to the present invention corresponds to the icons 12A, 12B, and 12C or the icons 12D, 12E, and 12F.
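The area-dependent selection of the second embodiment can be sketched roughly as follows; the boundary coordinate, the area keys, and the group contents are assumptions used only to illustrate that the entry position of the finger selects the group of icons to be motion-displayed.

```python
# Hypothetical sketch for the second embodiment: the x coordinate at which the
# finger 50 enters the motion display region 17 selects the icon group.

AREA_GROUPS = {
    "first_detection_area_14a":  ["icon 12A", "icon 12B", "icon 12C"],  # e.g. game-related icons
    "second_detection_area_14b": ["icon 12D", "icon 12E", "icon 12F"],  # e.g. net-application icons
}

def group_for_entry(x, boundary_x=240):
    area = "first_detection_area_14a" if x < boundary_x else "second_detection_area_14b"
    return AREA_GROUPS[area]
```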
Next, an information terminal according to a third embodiment of the present invention will now be described.
While the information terminal according to the present third embodiment is basically configured the same as that according to the first embodiment, the information terminal according to the present third embodiment differs in that a motion display region 17 is divided in plurality in a direction parallel to a displaying unit 11 and that control is performed such that when a finger 50 approaches the displaying unit 11, a selected icon gradually approaches the finger 50. Therefore, a description will be given focusing on this difference.
Next, detection in a case where the finger 50 approaches the display surface 11a will be described.
A first detecting unit 20 detects that the finger 50 enters any motion display region 17k of the motion display regions 171 to 17n from a non-detection region 19, and a second detecting unit 21 detects a position on a plane Pk on an upper side of the motion display region 17k (where 1≦k≦n, k is a natural number) entered by the finger 50 where the finger 50 entered the motion display region 17k. Specifically, it is recognized that the finger 50 moves from the non-detection region 19 to any motion display region 17k of the motion display regions 171 to 17n when a position of the finger 50 is not detected at a given sampling time and the finger 50 is detected at a position in the motion display region 17k at a next sampling time. Furthermore, the position in the motion display region 17k where the finger 50 is detected at this point can be assumed to be an x-y coordinate position on the plane Pk of the finger 50 upon entry to the motion display region 17k.
In addition, the first detecting unit 20 detects that the finger 50 enters the motion display region 17k from any motion display region of the motion display regions 171 to 17k−1 and the second detecting unit 21 detects a position on a plane Pk on an upper side of the motion display region 17k where the finger 50 entered the motion display region 17k. Specifically, it is recognized that the finger 50 moves to the motion display region 17k when a position of the finger 50 is detected at a given sampling time in any motion display region of the motion display regions 171 to 17k−1 above the motion display region 17k and a position of the finger 50 is detected in the motion display region 17k at a next sampling time. In addition, the position of the finger 50 detected in the motion display region 17k at this point can be assumed to be the position where the finger 50 entered the motion display region 17k (an x-y coordinate position of the finger 50 passing through the plane Pk). Alternatively, the position of the finger 50 last detected in any region of the motion display regions 171 to 17k−1 may be considered to be the position where the finger 50 entered the motion display region 17k, or an intersection point of a line connecting the positions of the finger 50 at the two sampling times described above with the plane Pk may be considered to be the position where the finger entered the motion display region 17k.
A fourth detecting unit 25 detects that the finger 50 enters the decision region 18 from any motion display region 17k of the motion display regions 171 to 17n and a fifth detecting unit 26 detects a position on a plane Q on an upper side of the decision region 18 where the finger 50 entered the decision region 18. Specifically, it is recognized that the finger 50 moves from the motion display region 17k to the decision region 18 when a position of the finger 50 is detected at a given sampling time in the motion display region 17k and a position of the finger 50 is detected in the decision region 18 at a next sampling time. In addition, the position of the finger 50 detected in the decision region 18 at this point can be assumed to be the position where the finger 50 entered the decision region 18 (an x-y coordinate position of the finger 50 passing through the plane Q). Alternatively, the position of the finger 50 detected in the motion display region 17k may be considered to be the position where the finger entered the decision region 18, or an intersection point of a line connecting the positions of the finger 50 at the two sampling times described above with the plane Q may be considered to be the position where the finger entered the decision region 18.
Next, detection in a case where the finger 50 moves away from the display surface 11a will be described.
A sixth detecting unit 31 detects that the finger 50 entered any motion display region 17k of the motion display regions 171 to 17n from the decision region 18 and an eighth detecting unit 34 detects a position on a plane Pk+1 on a lower side of the motion display region 17k where the finger 50 entered the motion display region 17k. Specifically, it is recognized that the finger 50 moves from the decision region 18 to the motion display region 17k when a position of the finger 50 is detected at a given sampling time in the decision region 18 and a position of the finger 50 is detected in the motion display region 17k at a next sampling time. In addition, the position of the finger 50 detected in the motion display region 17k at this point can be assumed to be the position where the finger 50 entered the motion display region 17k (an x-y coordinate position of the finger 50 passing through the plane Pk+1). Alternatively, the position of the finger 50 last detected in the decision region 18 may be considered to be the position where the finger 50 entered the motion display region 17k, or an intersection point of a line connecting the positions of the finger 50 at the two sampling times described above with the plane Pk+1 may be considered to be the position where the finger 50 entered the motion display region 17k.
A third detecting unit 24 detects that the finger 50 enters the motion display region 17k (where 1≦k≦n, k is a natural number) from any of the motion display regions 17k+1 to 17n and a seventh detecting unit 33 detects a position on a plane Pk+1 on a lower side of the motion display region 17k where the finger 50 entered the motion display region 17k. Specifically, it is recognized that the finger 50 moves to the motion display region 17k when a position of the finger 50 is detected at a given sampling time in any motion display region of the motion display regions 17k+1 to 17n and a position of the finger 50 is detected in the motion display region 17k at a next sampling time. In addition, the position of the finger 50 detected in the motion display region 17k at this point can be assumed to be the position where the finger 50 entered the motion display region 17k (an x-y coordinate position of the finger 50 passing through the plane Pk+1). Alternatively, the position of the finger 50 last detected in any motion display region of the motion display regions 17k+1 to 17n may be considered to be the position where the finger 50 entered the motion display region 17k, or an intersection point of a line connecting the positions of the finger 50 at the two sampling times described above with the plane Pk+1 may be considered to be the position where the finger entered the motion display region 17k.
In addition, the third detecting unit 24 detects that the finger 50 enters the non-detection region 19 from any motion display region 17k of the motion display regions 171 to 17n and the seventh detecting unit 33 detects a position on a plane P1 where the finger 50 entered the non-detection region 19. Specifically, it is recognized that the finger 50 moves from the motion display region 17k to the non-detection region 19 when a position of the finger 50 is detected at a given sampling time at a position in the motion display region 17k and a position of the finger 50 is not detected at a next sampling time. Furthermore, the position in the motion display region 17k where the finger 50 is detected at this point can be assumed to be the x-y coordinate position on the plane P1 of the finger 50 upon entry to the non-detection region 19.
Moreover, an example of the n-number of types of predetermined distances according to the present invention corresponds to a length that is a sum of Z11 and a thickness h (refer to
Next, operations of the information terminal 60 according to the present third embodiment will be described using an example where n=9.
First, display positions of icons 12A, 12B, and 12C in a periphery of a change icon 30 will be described.
As illustrated in
The icon 12A is motion-displayed to a position one-third (approximately 0.33 times) the distance from a position in an initial state (refer to
Next, as illustrated in
Subsequently, the icon 12A is motion-displayed from the display position illustrated in
Next, as illustrated in
Subsequently, the icon 12A is motion-displayed from the display position illustrated in
Similarly, as the finger 50 approaches the display surface 11a, the icons 12B and 12C are also motion-displayed in sequence, and when the finger 50 enters the motion display region 179, the icons 12A, 12B, and 12C are to be displayed at predetermined arrival positions in the periphery of the change icon 30 as illustrated in
As shown, in the above example, the icon 12A is to be displayed at the arrival position thereof when the finger 50 moves to the motion display region 173, the icon 12B is to be displayed at the arrival position thereof when the finger 50 moves to the motion display region 176, and the icon 12C is to be displayed at the arrival position thereof when the finger 50 moves to the motion display region 179.
By performing control as described above, icons 12A, 12B, and 12C can be motion-displayed as though the icons gradually gather around the tip of the finger 50 of the user as the finger 50 approaches the display surface 11a. In addition, by increasing n, an appearance in which the icons gather more continuously can be achieved.
Moreover, while the icon 12A reaches the arrival position in the periphery of the change icon 30 when the finger 50 moves to the motion display region 173 in the display rule described above, the motion display region at the moment of arrival of the icon 12A may be arranged so as to be a different motion display region (for example, a motion display region 175). Furthermore, the motion-displayed positions at each display region may be changed. In this manner, settings of motion display regions when each icon is displayed at an arrival position and the motion-displayed positions of each icon in each motion display region can be arbitrarily changed.
In addition, while all of the icons 12A, 12B, and 12C are moved and displayed as the finger 50 enters the motion display region 171 in the above description, for example, a motion display of one of the icons may be arranged so as to start after another icon is motion-displayed to an arrival position. In this case, for example, with n=9, the motion display of the icon 12B is started after the icon 12A is motion-displayed to the arrival position at the motion display region 173, and the motion display of the icon 12C is started after the icon 12B is motion-displayed to the arrival position at the motion display region 176.
In essence, a position to where each icon is motion-displayed when the finger moves to each of the motion display regions 171 to 17n should be set so that icons with higher priority orders are more quickly displayed at respective arrival positions thereof in the periphery of the change icon 30.
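Stated numerically, the display rule used in this example can be sketched as follows: with n=9, the icon of priority p (p=1 for the icon 12A) reaches its arrival position when the finger 50 is in the motion display region of index 3p, and in the region of index k it has covered k/(3p) of the distance from its initial position, which matches the fractions (one-third, one-sixth, four-ninths, and so on) given above. The concrete formula, the linear interpolation, and the function name are illustrative assumptions.

```python
# Hypothetical sketch of the display rule of the third embodiment with n = 9:
# the icon of priority p arrives when the finger is in motion display region
# 17(3p), and in region 17k it has covered min(k / (3p), 1) of the distance.

def icon_position(priority, k, initial, arrival):
    """priority: 1 for icon 12A, 2 for 12B, 3 for 12C;
    k: index of the motion display region currently containing the finger
    (k = 1 is the region farthest from the display surface 11a)."""
    arrival_index = 3 * priority              # region 173 for 12A, 176 for 12B, 179 for 12C
    fraction = min(k / arrival_index, 1.0)    # the same formula applies while the finger recedes
    (x0, y0), (x1, y1) = initial, arrival
    return (x0 + (x1 - x0) * fraction, y0 + (y1 - y0) * fraction)

# e.g. icon_position(priority=2, k=5, initial=(320, 60), arrival=(140, 180))
# -> icon 12B displayed five-sixths of the way toward its arrival position,
#    i.e. one-sixth of the distance away from the arrival position.
```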
Next, from the state illustrated in
On the other hand, when the change icon 30 is decided on, in the same manner as in
Furthermore, when the finger 50 is separated from the display surface 11a in the motion display region 17 without deciding on any of the change icon 30 and the icons 12A, 12B, and 12C, the icons 12A, 12B, and 12C return to original states (initial positions) in an operation reverse to that illustrated in
In other words, when the finger 50 moves from a state where the finger 50 exists in the motion display region 179 (refer to
In addition, when the finger 50 moves from the motion display region 176 to the motion display region 175, the icon 12B is motion-displayed from a previous display position to a position one-sixth (approximately 0.17 times) the distance from the arrival position in the periphery of the change icon 30 toward the initial position, and the icon 12C is motion-displayed from a previous display position to a position four-ninths (approximately 0.44 times) the distance from the arrival position in the periphery of the change icon 30 toward the initial position.
Furthermore, when the finger 50 moves from the motion display region 173 to the motion display region 172, the icon 12A is motion-displayed from a previous display position to a position one-third (approximately 0.33 times) the distance from the arrival position in the periphery of the change icon 30 toward the initial position, the icon 12B is motion-displayed from a previous display position to a position four-sixths (approximately 0.67 times) the distance from the arrival position in the periphery of the change icon 30 toward the initial position, and the icon 12C is motion-displayed from a previous display position to a position seven-ninths (approximately 0.78 times) the distance from the arrival position in the periphery of the change icon 30 toward the initial position. Moreover, such a movement of the finger 50 in a direction away from the display surface 11a in the motion display region 17 as described above is detected by the third detecting unit 24 and the seventh detecting unit 33. In other words, the third detecting unit 24 detects a movement of the finger 50 from a lower-side region to an upper-side region and the seventh detecting unit 33 detects a position where the finger 50 entered the upper-side region.
As described above, when the finger 50 moves from the motion display region 171 to the non-detection region 19, all of the icons 12A, 12B, and 12C are returned to initial positions thereof. Moreover, in the control method described above, while all of the icons 12A, 12B, and 12C are simultaneously returned to initial positions thereof when the finger 50 moves from the motion display region 171 to the non-detection region 19, control may alternatively be performed to return the icons to initial positions one by one such that an icon starts to be returned to an initial position thereof after another icon is returned to an initial position thereof.
In addition, while the positions to where the icons 12A, 12B, and 12C are motion-displayed are obtained by computation in the description above, a position to where each of the icons 12A, 12B, and 12C is motion-displayed may alternatively be decided on in advance for each entry position of the finger 50 in each motion display region. In this case, a table of positions to where the icons are to be motion-displayed is stored in the memory and the icons are to be motion-displayed based on the table.
Moreover, while the finger 50 is consecutively detected for each of the motion display regions 171 to 179 in
Moreover, in the first to third embodiments described above, the decision region 18 is provided and, when a finger moves from the motion display region 17 to the decision region 18, an icon displayed directly underneath the finger is to be decided on, thereby making the decision region 18 a decision region for icon decision. However, the decision region 18 need not be provided. In other words, a definite distance according to the present invention may take a value of zero. In this case, by using a touch panel employing a capacitance method, a decision on an icon can be detected when the display surface 11a is touched.
In addition, when a current function of the information terminals 10, 40, and 60 in the first to third embodiments described above is a home screen function, the icons 12 displayed on the display surface 11a are shortcut icons for activating various applications.
Furthermore, when a current function of the information terminals 10, 40, and 60 in the first to third embodiments described above is a music reproducing function, the icons 12 displayed on the display surface 11a are icons representing music contents. In this case, reduced screens of music albums and the like can be used as the icons.
Moreover, when the current function of the information terminals 10, 40, and 60 in the first to third embodiments described above is a video reproducing function, the icons 12 displayed on the display surface 11a are icons representing video contents. In this case, thumbnail images can be used as the icons. In addition, when the current function of the information terminals 10, 40, and 60 in the first to third embodiments described above is a photograph displaying function, the icons 12 displayed on the display surface 11a are icons representing photograph contents. In this case, reduced screens or thumbnail images can be used as the icons.
Moreover, while the change icon 30 is arranged so as to be displayed on the display surface 11a when the finger 50 enters the motion display region 17 in the embodiments described above, the change icon 30 need not be displayed. By at least having the icons motion-displayed one by one, the user can confirm at which positions the icons are displayed in the initial states. In this case, any one of the icons to be moved may be displayed at a position on the display surface 11a directly underneath the finger 50.
In addition, the icons need not necessarily be motion-displayed one by one, and may be moved simultaneously if there are only a small number of icons.
Furthermore, in the embodiments described above, while the icons 12 are arranged so as to be restored one by one even when restoring the icons 12 to their original states, all icons may be simultaneously restored to their original states instead.
Moreover, while the change icon 30 or the addition icon 32 is arranged so as to be displayed at a position on the display surface 11a directly underneath a finger in the embodiments described above, the change icon 30 or the addition icon 32 need not necessarily be displayed at a position on the display surface 11a directly underneath the finger and may alternatively be displayed at a position on the display surface 11a in the vicinity of the position directly underneath the finger.
In addition, a position of an icon to be motion-displayed may either be a position on the display surface 11a directly underneath the finger or a position on the display surface 11a in the vicinity of the position directly underneath the finger.
Furthermore, while sizes of the icons 12 are arranged so as to be the same in the embodiments described above, sizes of icons to be motion-displayed may be arranged in a descending order from the icon with the highest priority. In addition, a group of icons of a high-priority rule may be displayed larger than a group of icons of a low-priority rule. For example, if a rule for selecting the icons 12A, 12B, and 12C illustrated in
Moreover, while a priority is determined for each icon and the icons 12 are motion-displayed in a descending order of priorities in the embodiments described above, the icons 12 may alternatively be motion-displayed in sequence regardless of the priority order. For example, when a rule for selecting icons to be motion-displayed is a rule related to music genres or the like, a priority need not be determined for each icon and the icons may be motion-displayed in an order of registration to the information terminal or an order of proximity to the change icon 30.
In addition, while the detection area 14 is provided under the display surface 11a, the detection area 14 is not limited to this position and may alternatively be provided at the center as is the case of Japanese Patent Laid-Open No. 2008-117371 or be provided at an upper edge portion or a left/right edge portion.
Furthermore, an entire area on the display surface 11a in which icons 12 are not displayed may be considered to be a detection area.
In addition, while an example of a designating object according to the present invention corresponds to the finger 50 in the embodiments described above, such an arrangement is not restrictive and a pointing device such as a stylus may be used instead.
Moreover, a program according to the present invention is a program which causes operations of respective steps of the aforementioned screen component display method according to the present invention to be executed by a computer and which operates in cooperation with the computer.
In addition, a recording medium according to the present invention is a recording medium on which is recorded a program for causing a computer to execute all of or a part of the operations of the respective steps of the aforementioned screen component display method according to the present invention, the recording medium being readable by the computer, whereby the read program performs the operations in collaboration with the computer.
Furthermore, the aforementioned "operations of the respective steps" of the present invention refer to all of or a part of the operations of the steps described above.
Moreover, one mode of use of the program of the present invention may be an aspect in which the program is recorded on a recording medium readable by a computer, such as a ROM, and operates in collaboration with the computer.
In addition, one mode of use of the program of the present invention may be an aspect in which the program is transmitted through a transmission medium such as the Internet, light, radio waves, or acoustic waves, is read by a computer, and operates in collaboration with the computer.
Furthermore, a computer according to the present invention described above is not limited to pure hardware such as a CPU and may be arranged to include firmware, an OS and, furthermore, peripheral devices.
Moreover, as described above, configurations of the present invention may either be realized through software or through hardware.
The information terminal and the screen component display method according to the present invention enable a desired screen component to be easily found and an original state of a screen component prior to movement thereof to be easily discerned, and are useful for information terminals such as smartphones, PDAs, and the like.
Number | Date | Country | Kind
---|---|---|---
2010-123328 | May 2010 | JP | national
2011-036176 | Feb 2011 | JP | national