The disclosure of Japanese Patent Application No. 2010-205846, filed on Sep. 14, 2010, is incorporated herein by reference.
1. Field of the Invention
The present invention relates to a display control process conducted when displaying display contents such as a selection object to be selected by a user and a content to be browsed by a user, and more specifically relates to a process conducted when touching and scrolling the selection object and the content.
2. Description of the Background Art
An information processing terminal for browsing contents that do not fit within one screen is conventionally known. For example, in a state where one part of a content having a size larger than a screen is displayed on the screen, a mobile phone disclosed in Japanese Laid-Open Patent Publication No. 2000-66801 enables moving the content by an operation on a numerical keypad of the mobile phone. Additionally, this mobile phone displays, in an area outside the display area of the content, information indicating the position of the currently displayed portion with respect to the whole content. For example, the ratio of the amount of content already displayed to the total amount of content is represented as a percentage. Therefore, when the display has moved to an end of the content, the user can recognize this by looking at the percentage display.
With the mobile phone described in Japanese Laid-Open Patent Publication No. 2000-66801, it is necessary to estimate where the content has been moved to by using the information displayed in an area outside the display area of the content. However, the user pays attention to the content itself while browsing it, and must therefore take his or her eyes off the content for a moment in order to confirm that information. As a result, when the user is performing an operation to move the content while paying attention to it, the user will try to further move the content even after reaching an end of the content, thereby generating a futile operation, and only afterwards shift his or her sight to the information displayed outside the content area and recognize that the end has been reached. It has therefore been difficult to intuitively understand that an end of the content has been reached.
Therefore, an object of the present invention is to provide a computer-readable storage medium having stored thereon a display control program which can improve usability for a user by allowing the user to intuitively understand that an end of a display object, such as a content, has been reached.
In order to achieve the above described object, the present invention has adopted the following configurations.
A first aspect is a computer-readable storage medium having stored thereon a display control program executed by a computer of a display control apparatus which displays, on a display device, a selection object selected in accordance with an operation by a user, the display control program causing the computer to operate as first movement control means and display control means. The first movement control means moves, relative to a display area of the display device, a plurality of selection objects having at least one part thereof displayed on the display area, based on an output signal outputted from an input device. The display control means displays, on the display area, objects having shapes identical or similar to one part of the plurality of selection objects displayed on the display area, when, among the plurality of selection objects moved by the first movement control means, an end-located selection object reaches a predetermined position of the display area.
The first aspect allows the user to intuitively understand that the plurality of selection objects have been moved to an end, without the need to narrow the area for displaying the plurality of selection objects, thereby further increasing usability for the user.
In a second aspect based on the first aspect, the computer is further caused to operate as second movement control means for moving, relative to the display area, the objects displayed by the display control means, based on an output signal outputted from the input device.
The second aspect allows the user to be notified, in a more easily comprehensible manner, that the plurality of selection objects are being moved and have reached an end, since the objects, which are displayed when the plurality of selection objects have been moved to an end, themselves move.
In a third aspect based on the first and second aspects, the computer is further caused to operate as object erasing means for erasing, from the display area, the objects displayed by the display control means, when the input device stops outputting an output signal.
The third aspect allows the user to easily understand the contents of the plurality of selection objects as soon as an operation for moving the plurality of selection objects (for example, an operation of scrolling) has been stopped, thereby increasing usability.
In a fourth aspect based on the second aspect, the second movement control means moves the objects after the first movement control means stops moving the selection objects.
The fourth aspect allows the user to intuitively understand that the plurality of selection objects cannot be moved further.
In a fifth aspect based on the second aspect, the second movement control means moves the objects in a moving direction determined based on an output signal outputted from the input device.
In a sixth aspect based on the fifth aspect, the computer is further caused to operate as object erasing means for erasing the objects from the display area by moving the objects in a direction opposite to the moving direction, when the input device stops outputting an output signal.
The fifth and sixth aspects allow the user to understand that the plurality of selection objects cannot be moved further, by moving the objects in a direction in accordance with an operation of moving a pointed position and erasing the objects when the operation is stopped.
In a seventh aspect based on the sixth aspect, the display control means displays the objects at a position such that one part of the plurality of selection objects displayed on the display area overlaps a display position of the object, when the end-located selection object reaches the predetermined position. Then, the second movement control means moves the objects by using, as a reference point position, the position at which the one part of the plurality of selection objects overlaps the display position of the object. Furthermore, the object erasing means erases the objects after the objects return to the reference point position.
The seventh aspect allows notification of the user in an easily comprehensible manner that further moving the selection objects is futile.
In an eighth aspect based on the seventh aspect, the display control means displays semi-transparent objects as the objects in a manner such that the semi-transparent objects are superimposed on front surfaces of the selection objects.
The eighth aspect allows the user, by means of the semi-transparent objects, to continue to understand the contents of the selection objects while intuitively understanding that the selection objects have been moved to an end.
In a ninth aspect based on the eighth aspect, the display control means changes the transparency of the objects in accordance with an amount of movement of the objects.
In a tenth aspect based on the ninth aspect, when the input device stops outputting an output signal, the object erasing means erases the objects from the display area by moving the objects in a direction opposite to the direction in which the second movement control means has moved the objects, while gradually restoring the transparency of the objects, which has been changed in accordance with the amount of movement.
The ninth and tenth aspects allow the user to intuitively understand that further movement is futile, since the objects move in conformity with the user's operation without causing any sense of discomfort, while the user can still understand the contents of the plurality of selection objects.
In an eleventh aspect based on the first aspect, the display control means displays, as the objects having shapes identical or similar to one part of the plurality of selection objects, objects having colors different from the one part of the plurality of selection objects displayed on the display area.
The eleventh aspect allows the user to intuitively understand that the plurality of selection objects have been moved to an end.
In a twelfth aspect based on the first aspect, the display control means displays, on the display area, an object having a shape identical or similar to a selection object existing at a position that is based on an output signal outputted from the input device.
The twelfth aspect allows the user to intuitively understand that the plurality of selection objects have been moved to an end.
A thirteenth aspect is a display control system which displays, on a display device, a selection object selected in accordance with an operation by a user, the display control system including first movement control means and display control means. The first movement control means moves, relative to a display area of the display device, a plurality of selection objects having at least one part thereof displayed on the display area, based on an output signal outputted from an input device. The display control means displays, on the display area, objects having shapes identical or similar to one part of the plurality of selection objects displayed on the display area, when, among the plurality of selection objects moved by the first movement control means, an end-located selection object reaches a predetermined position of the display area.
A fourteenth aspect is a display control apparatus which displays, on a display device, a selection object selected in accordance with an operation by a user, the display control apparatus including first movement control means and display control means. The first movement control means moves, relative to a display area of the display device, a plurality of selection objects having at least one part thereof displayed on the display area, based on an output signal outputted from an input device. The display control means displays, on the display area, objects having shapes identical or similar to one part of the plurality of selection objects displayed on the display area, when, among the plurality of selection objects moved by the first movement control means, an end-located selection object reaches a predetermined position of the display area.
A fifteenth aspect is a display control method for displaying, on a display device, a selection object selected in accordance with an operation by a user, the display control method including a first movement control step and a display control step. The first movement control step is a step of moving, relative to a display area of the display device, a plurality of selection objects having at least one part thereof displayed on the display area, based on an output signal outputted from an input device. The display control step is a step of displaying, on the display area, objects having shapes identical or similar to one part of the plurality of selection objects displayed on the display area, when, among the plurality of selection objects moved at the first movement control step, an end-located selection object reaches a predetermined position of the display area.
A sixteenth aspect is a computer-readable storage medium having stored thereon a display control program executed by a computer of a display control apparatus which displays, on a display device, a content to be browsed by a user, the display control program causing the computer to operate as first movement control means and display control means. The first movement control means moves, relative to a display area of the display device, a content having at least one part thereof displayed on the display area, based on an output signal outputted from an input device. The display control means displays, on the display area, an object having a shape identical or similar to one part of the content displayed on the display area, when an end of the content, which is moved by the first movement control means, reaches a predetermined position of the display area.
A seventeenth aspect is a computer-readable storage medium having stored thereon a display control program executed by a computer of a display control apparatus which displays, on a display device, a content to be browsed by a user, the display control program causing the computer to operate as first movement control means and display control means. The first movement control means moves, relative to a display area of the display device, a plurality of contents having at least one part thereof displayed on the display area, based on an output signal outputted from an input device. The display control means displays, on the display area, objects having shapes identical or similar to one part of the plurality of contents displayed on the display area, when, among the plurality of contents moved by the first movement control means, an end-located content reaches a predetermined position of the display area.
The thirteenth to seventeenth aspects can obtain the same advantageous effect as that of the first aspect.
In a case where an operation of moving a content or a plurality of selection objects is conducted (for example, an operation of scrolling), the present invention allows the user to intuitively understand that the content or the selection objects have been moved to an end.
These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
Hereinafter, an embodiment of the present invention will be described with reference to the drawings. Note that the present invention is not limited to this embodiment.
The game apparatus 1 includes a lower housing 11 and an upper housing 21. The lower housing 11 and the upper housing 21 are connected to each other so as to be capable of being opened or closed (foldable). In the example of
In the lower housing 11, a lower LCD (Liquid Crystal Display) 12 is provided. The lower LCD 12 has a horizontally long shape, and is located such that a long side thereof corresponds to a long side direction of the lower housing 11. Note that although an LCD is used as a display device built into the game apparatus 1 in the present embodiment, any other display device, such as a display device using EL (Electro Luminescence), may be used. In addition, the game apparatus 1 can use a display device of any resolution. Although details will be described below, the lower LCD 12 is used mainly for displaying an image taken by an inner camera 23 or an outer camera 25 in real time.
In the lower housing 11, operation buttons 14A to 14K and a touch panel 13 are provided as input devices. As shown in
Note that the operation buttons 14I to 14K are omitted in
The game apparatus 1 further includes the touch panel 13 as another input device in addition to the operation buttons 14A to 14K. The touch panel 13 is mounted on the lower LCD 12 so as to cover the screen of the lower LCD 12. In the present embodiment, the touch panel 13 is, for example, a resistive film type touch panel. However, the touch panel 13 is not limited to the resistive film type, and any press-type touch panel may be used. The touch panel 13 used in the present embodiment has the same resolution (detection accuracy) as that of the lower LCD 12. However, the resolution of the touch panel 13 and that of the lower LCD 12 need not necessarily be the same. In a right side surface of the lower housing 11, an insertion opening (indicated by a dashed line in
In the right side surface of the lower housing 11, an insertion opening (indicated by a two-dot chain line in
Further, in the upper surface of the lower housing 11, an insertion opening (indicated by a chain line in
Three LEDs 15A to 15C are mounted on a left side part of the connection portion where the lower housing 11 and the upper housing 21 are connected to each other. The first LED 15A is lit up while the power of the game apparatus 1 is ON. The second LED 15B is lit up while the game apparatus 1 is being charged. The game apparatus 1 is capable of performing wireless communication with another apparatus, and the third LED 15C is lit up while wireless communication is established. Thus, by the three LEDs 15A to 15C, a state of ON/OFF of the power of the game apparatus 1, a state of charge of the game apparatus 1, and a state of communication establishment of the game apparatus 1 can be notified to the user.
Meanwhile, in the upper housing 21, an upper LCD 22 is provided. The upper LCD 22 has a horizontally long shape, and is located such that a long side direction thereof corresponds to a long side direction of the upper housing 21. In a similar manner to that of the lower LCD 12, a display device of another type having any resolution may be used instead of the upper LCD 22. A touch panel may be provided so as to cover the upper LCD 22. On the upper LCD 22, for example, an operation explanation screen for teaching the user roles of the operation buttons 14A to 14K and the touch panel 13 is displayed.
In the upper housing 21, two cameras (the inner camera 23 and the outer camera 25) are provided. As shown in
In the inner main surface in the vicinity of the connection portion, a microphone (a microphone 42 shown in
In the outer main surface of the upper housing 21, a fourth LED 26 (indicated by a dashed line in
Sound holes 24 are formed in the inner main surface of the upper housing 21, on the left and right sides, respectively, of the upper LCD 22 provided in the vicinity of the center of the inner main surface of the upper housing 21. Speakers are accommodated in the upper housing 21 at the back of the sound holes 24. The sound holes 24 are for releasing sound from the speakers to the outside of the game apparatus 1.
As described above, the inner camera 23 and the outer camera 25 which are components for taking an image, and the upper LCD 22 which is display means for displaying, for example, an operation explanation screen at the time of photographing are provided in the upper housing 21. On the other hand, the input devices for performing an operation input on the game apparatus 1 (the touch panel 13 and the buttons 14A to 14K), and the lower LCD 12 which is display means for displaying the game screen are provided in the lower housing 11. Accordingly, when using the game apparatus 1, the user can hold the lower housing 11 and perform an input on the input device while seeing a taken image (an image taken by one of the cameras) displayed on the lower LCD 12.
Next, an internal configuration of the game apparatus 1 will be described with reference to
As shown in
The CPU 31 is information processing means for executing a predetermined program. Note that a program executed by the CPU 31 may be stored in advance in a memory within the game apparatus 1, may be obtained from the memory card 28 and/or the cartridge 29, or may be obtained from another apparatus by means of communication with said another apparatus. For example, a program may be obtained by means of download via the Internet from a predetermined server, or may be obtained by downloading a predetermined program stored in a stationary game apparatus through communication therewith.
The main memory 32, the memory control circuit 33, and the preset data memory 35 are connected to the CPU 31. The stored data memory 34 is connected to the memory control circuit 33. The main memory 32 is storage means used as a work area and a buffer area of the CPU 31. In the present embodiment, for example, a PSRAM (Pseudo-SRAM) is used as the main memory 32. The stored data memory 34 is storage means for storing a program executed by the CPU 31, data of images taken by the inner camera 23 and the outer camera 25, and the like. The stored data memory 34 is constructed of a nonvolatile storage medium, for example, a NAND flash memory, in the present embodiment. The memory control circuit 33 is a circuit for controlling reading of data from the stored data memory 34 or writing of data to the stored data memory 34 in accordance with an instruction from the CPU 31. The preset data memory 35 is storage means for storing, in the game apparatus 1, data (preset data) of various parameters and the like which are set in advance, and a later described menu processing program and the like. A flash memory connected to the CPU 31 via an SPI (Serial Peripheral Interface) bus can be used as the preset data memory 35.
The memory card I/F 36 is connected to the CPU 31. The memory card I/F 36 reads data from the memory card 28 mounted on the connector or writes data to the memory card 28 in accordance with an instruction from the CPU 31. In the present embodiment, data of images taken by the outer camera 25 is written to the memory card 28, and image data stored in the memory card 28 is read from the memory card 28 to be stored in the stored data memory 34.
The cartridge I/F 44 is connected to the CPU 31. The cartridge I/F 44 reads out data from the cartridge 29 mounted to the connector or writes data to the cartridge 29 in accordance with an instruction from the CPU 31.
The wireless communication module 37 functions to connect to a wireless LAN device, for example, by a method conforming to the IEEE 802.11b/g standard. The local communication module 38 functions to wirelessly communicate with a game apparatus of the same type by a predetermined communication method. The wireless communication module 37 and the local communication module 38 are connected to the CPU 31. The CPU 31 is capable of receiving data from and transmitting data to another apparatus via the Internet using the wireless communication module 37, and is capable of receiving data from and transmitting data to another game apparatus of the same type using the local communication module 38.
The RTC 39 and the power circuit 40 are connected to the CPU 31. The RTC 39 counts a time, and outputs the time to the CPU 31. For example, the CPU 31 is capable of calculating a current time (date) and the like based on the time counted by the RTC 39. The power circuit 40 controls electric power from a power supply (typically, a battery accommodated in the lower housing 11) of the game apparatus 1 to supply the electric power to each electronic component of the game apparatus 1.
The game apparatus 1 includes the microphone 42 and an amplifier 43. The microphone 42 and the amplifier 43 are connected to the I/F circuit 41. The microphone 42 detects voice produced by the user toward the game apparatus 1, and outputs a sound signal indicating the voice to the I/F circuit 41. The amplifier 43 amplifies the sound signal from the I/F circuit 41, and causes the speakers (not shown) to output the sound signal. The I/F circuit 41 is connected to the CPU 31.
The touch panel 13 is connected to the I/F circuit 41. The I/F circuit 41 includes a sound control circuit for controlling the microphone 42 and the amplifier 43 (the speakers), and a touch panel control circuit for controlling the touch panel 13. The sound control circuit performs A/D conversion or D/A conversion of the sound signal, and converts the sound signal into sound data in a predetermined format. The touch panel control circuit generates touch position data in a predetermined format based on a signal from the touch panel 13, and outputs the touch position data to the CPU 31. For example, the touch position data is data indicating coordinates of a position at which an input is performed on an input surface of the touch panel 13. The touch panel control circuit reads a signal from the touch panel 13 and generates touch position data every predetermined period of time. The CPU 31 is capable of recognizing a position at which an input is performed on the touch panel 13 by obtaining the touch position data.
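By way of illustration only, the touch position data described above could take a form such as the following C++ sketch; the structure layout, field types, and function names are assumptions made for illustration and are not the actual interface of the I/F circuit 41.

```cpp
#include <cstdint>
#include <optional>

// Hypothetical touch position data in the "predetermined format" mentioned
// above: screen coordinates on the input surface of the touch panel, sampled
// every predetermined period of time (e.g. once per frame).
struct TouchPositionData {
    int16_t x;  // horizontal coordinate on the touch panel input surface
    int16_t y;  // vertical coordinate on the touch panel input surface
};

// Raw signal from the touch panel (assumed interface for illustration).
struct TouchPanelSignal {
    bool touched;
    int16_t rawX;
    int16_t rawY;
};

// Converts the raw panel signal into touch position data, or no data when
// the panel is not touched during this sampling period.
std::optional<TouchPositionData> ReadTouchPosition(const TouchPanelSignal& signal) {
    if (!signal.touched) {
        return std::nullopt;   // no input this period
    }
    return TouchPositionData{signal.rawX, signal.rawY};
}
```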
An operation button 14 includes the above operation buttons 14A to 14K, and is connected to the CPU 31. The operation button 14 outputs operation data indicating an input state of each of the buttons 14A to 14K (whether or not each button is pressed) to the CPU 31. The CPU 31 obtains the operation data from the operation button 14, and performs processing in accordance with an input performed onto the operation button 14.
The inner camera 23 and the outer camera 25 are connected to the CPU 31. Each of the inner camera 23 and the outer camera 25 takes an image in accordance with an instruction from the CPU 31, and outputs data of the taken image to the CPU 31. In the present embodiment, the CPU 31 gives an imaging instruction to the inner camera 23 or the outer camera 25, and the camera which has received the imaging instruction takes an image and transmits image data to the CPU 31.
The lower LCD 12 and the upper LCD 22 are connected to the CPU 31. Each of the lower LCD 12 and the upper LCD 22 displays an image thereon in accordance with an instruction from the CPU 31.
Next, a general outline of a process envisioned by the present embodiment will be described. The process of the present embodiment envisions a situation in which a scroll object, such as a content having a size that cannot be displayed in a single screen, is browsed while being scrolled. Here, the scroll object is, for example, an electronic book content for an electronic book viewer, an electronic document for an electronic document viewer, or a browse object (contents such as an HTML document and a Web page, including a combination of documents and images) for various browsers such as an internet browser (HTML browser). Also included as the scroll object referred to here are objects including a plurality of objects which are to be selected by the user, which are viewable as a list, and which are browsed by using a scroll operation; examples include thumbnails of images on an image viewer, a screen displaying a list of possessed items in a game process, a screen displaying a plurality of buttons, and the like. Also categorized as the scroll object is a content of a menu in a menu screen of the game apparatus 1 (a group of contents including a plurality of contents shown as content icons 101 described later), and the content of the menu is used as the example in the specific description of the process of the present embodiment below. Hereinafter, this scroll object will be referred to simply as a content.
When browsing contents that cannot all be displayed on a single screen as described above, the user can browse all of the contents by performing an operation of scrolling the contents (hereinafter referred to as a scroll operation).
One example of the scroll operation described above is a so-called drag operation. For example, when one part of the contents is displayed on a screen with a touch panel (the lower LCD 12 in the present embodiment), by performing a touch-on to the touch panel 13 with the touch pen 27 and performing a slide movement in a predetermined direction, for example from left to right, with the touch pen 27, the displayed contents can be scrolled in the right direction. As a result, a portion of the contents desired for viewing can be moved within the display screen (display area). Envisioned here is a case where an end of the contents of the browse object has been reached (a case where an end of the contents is displayed in the screen) by repeatedly conducting the scroll operation by the drag operation. In this case, since an end of the contents has been reached, a further scroll operation would be a futile operation. In such a case, the present embodiment notifies the user in an intuitive manner that an end of the contents has been reached, by performing a representation process in coordination with the scroll operation. In the following, such a representation process for notifying that an end of the contents has been reached will be referred to as "scroll limit representation".
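As a rough, non-limiting sketch of how a drag operation might be translated into scrolling, the change in the touched position can be applied to a scroll offset that is clamped to the ends of the contents; in the following C++ sketch, the names, the sign convention, and the clamping are assumptions made for illustration and are not the actual implementation of the embodiment.

```cpp
#include <algorithm>

// Minimal sketch: translate the horizontal change of the touch position during
// a drag operation into a scroll offset of the content area, clamped so that
// the display area never moves past either end of the contents.
struct ScrollState {
    float offsetX;        // current scroll offset into the contents
    float contentWidth;   // total width of the contents
    float displayWidth;   // width of the display area
};

// deltaX = current touch x - previous touch x (one frame earlier).
// Dragging from left to right (deltaX > 0) moves the contents to the right,
// i.e. decreases the offset into the contents (assumed sign convention).
void ApplyDragScroll(ScrollState& state, float deltaX) {
    const float maxOffset = std::max(0.0f, state.contentWidth - state.displayWidth);
    state.offsetX = std::clamp(state.offsetX - deltaX, 0.0f, maxOffset);
}

// Reaching an end of the contents corresponds to the offset hitting a limit.
bool AtContentEnd(const ScrollState& state) {
    const float maxOffset = std::max(0.0f, state.contentWidth - state.displayWidth);
    return state.offsetX <= 0.0f || state.offsetX >= maxOffset;
}
```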
One specific example is a case where, as shown in
Note that, for ease of understanding, a case where the content icon 101d is touched-on is provided as an example; however, the above described scrolling is also possible when a portion of the content area 102 other than the content icons 101 is touched-on to perform the drag operation.
Here, in the state as shown in
Meanwhile, the content area 102 (including each of the content icons 101) is not scrolled but remains fixed during the above described operation. Therefore, by having the ghost objects 201 appear and move in accordance with the scroll operation, the user can intuitively understand that he or she has scrolled to an end of the contents. Then, if the user performs a touch-off in the state as shown in
As described above, in the present embodiment, when an end of the contents has been reached as a result of the scroll operation and a further scroll operation is performed, scrolling of the contents themselves is not performed; instead, the ghost objects 201, which are identical or similar to the display of the contents, are displayed and moved in coordination with the scroll operation. As a result, the user can intuitively recognize that he or she has scrolled to an end of the contents.
Furthermore, in the example of the scroll operation described above, only the drag operation has been provided as an example; however, in the present embodiment, a flick operation is also possible for scrolling the contents (an operation of performing a touch-on, moving a finger or the touch pen so as to lightly swipe the screen, and performing a touch-off; in other words, performing a touch-on and then an operation of flicking). The result is a scroll operation having inertia in accordance with the strength of the flick operation. When such a flick operation is performed, the scrolling continues for a short time even after the touch-off, due to an inertia force that is based on the strength of the flick operation. Note that the scrolling stops at the moment when an end of the contents is reached during the scrolling due to this inertia force (hereinafter referred to as inertia scrolling). Furthermore, when the flick operation is performed in a state where an end of the contents is already displayed (refer to
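By way of illustration only, inertia scrolling after a flick could be sketched as follows in C++; the decay factor and stop threshold are assumed values, not those of the embodiment.

```cpp
#include <algorithm>
#include <cmath>

// Sketch of inertia scrolling after a flick operation: the scroll continues
// for a short time after the touch-off with a velocity based on the strength
// of the flick, decays each frame, and stops as soon as an end of the
// contents is reached. Decay factor and stop threshold are illustrative.
struct InertiaScroll {
    float velocityX;   // pixels per frame, set from the flick strength
    bool active;
};

struct Viewport {
    float offsetX;
    float maxOffsetX;  // contentWidth - displayWidth
};

void StepInertiaScroll(InertiaScroll& inertia, Viewport& view) {
    if (!inertia.active) return;

    view.offsetX += inertia.velocityX;
    inertia.velocityX *= 0.92f;                        // per-frame decay (assumed)

    // Stop at the moment an end of the contents is reached.
    if (view.offsetX <= 0.0f || view.offsetX >= view.maxOffsetX) {
        view.offsetX = std::clamp(view.offsetX, 0.0f, view.maxOffsetX);
        inertia.active = false;
    } else if (std::fabs(inertia.velocityX) < 0.1f) {  // velocity exhausted
        inertia.active = false;
    }
}
```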
In the following, details of the various data and programs used in the present embodiment will be described by using
The program storage area 321 stores a menu processing program 322 and the like executed by the CPU 31.
Data such as scroll limit representation data 324, inertia representation data 325, a scroll limit representation flag 326, an inertia representation flag 327, operation data 328, lastly inputted coordinates data 329, second-from-lastly inputted coordinates data 330, and the like are stored in the data storage area 323.
The scroll limit representation data 324 is data used in the scroll limit representation for indicating an end of the contents when an end of the contents is displayed on the screen. In the present embodiment, data representing the ghost objects 201 is stored as the scroll limit representation data 324.
The inertia representation data 325 is data used for a process of the inertia scrolling as described above (hereinafter, referred to as inertia representation).
The scroll limit representation flag 326 is a flag for showing whether or not the scroll limit representation for indicating an end of the contents when an end of the contents is displayed on the screen is conducted. When the flag is set to be ON, this indicates that the scroll limit representation is being conducted.
The inertia representation flag 327 is a flag for indicating whether or not the process of inertia scrolling (inertia representation) is being executed. When the flag is set to be ON, this indicates being in the midst of executing the process of inertia scrolling.
The operation data 328 is data indicating an input state of each of the operation buttons 14A to 14K and an input state of the touch panel 13. Furthermore, when there is an input to the touch panel 13, data indicating the coordinates of the input is also included in the operation data 328.
The lastly inputted coordinates data 329 is data indicating the coordinates of an input to the touch panel in the process of the immediately preceding frame. If there was no input to the touch panel 13 in the process of the immediately preceding frame, the data is empty; if there was an input to the touch panel 13, the coordinates of that input are stored. Therefore, by referring to this data, a change in touch position (input coordinates) during the drag operation and the like, and in turn an amount of movement of the touch pen 27, can be calculated.
The second-from-lastly inputted coordinates data 330 is data indicating the input coordinates acquired immediately before the lastly inputted coordinates data described above; that is, the input coordinates detected in the process of the frame two frames before the current frame.
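By way of illustration only, the data described above for the data storage area 323 could be grouped as in the following C++ sketch; the field types and nested layouts are assumptions made for illustration.

```cpp
#include <cstdint>
#include <optional>

// Sketch of the data held in the data storage area 323 during the menu
// process. Field types and the Vec2/OperationData layouts are assumptions.
struct Vec2 { float x, y; };

struct OperationData {              // operation data 328
    uint32_t buttonBits;            // pressed/unpressed state of buttons 14A-14K
    std::optional<Vec2> touchPos;   // input coordinates when the panel is touched
};

struct MenuData {
    // scroll limit representation data 324 (e.g. ghost object parameters)
    struct { float alpha; float scale; Vec2 offset; } scrollLimitRepresentationData;
    // inertia representation data 325 (amount, velocity, duration of motion)
    struct { float amount; float velocity; int durationFrames; } inertiaRepresentationData;
    bool scrollLimitRepresentationFlag = false;   // flag 326
    bool inertiaRepresentationFlag = false;       // flag 327
    OperationData operationData;                  // operation data 328
    std::optional<Vec2> lastInputCoords;          // lastly inputted coordinates data 329
    std::optional<Vec2> secondLastInputCoords;    // second-from-lastly inputted data 330
};
```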
A flow of the menu process executed in the game apparatus 1 will be described next by using
First, in step S1, an initialization process for the data to be used in the following process is executed. Specifically, first, the contents (in the present embodiment, the content icons 101) are generated and arranged in a virtual space (in the present embodiment, the content area 102 allocated in the virtual space) (refer to
Note that the method of displaying the contents and the method of the scroll process described above are merely examples; the present invention is not limited thereto, and any processing method may be used as long as displaying and scrolling of the contents can be conducted.
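As one non-limiting illustration of step S1, the following C++ sketch arranges content icons in a row inside a content area placed in the virtual space; the sizes, the horizontal layout, and the structure names are assumptions for illustration only.

```cpp
#include <vector>

// One possible realization of step S1: content icons are laid out in a row
// inside a content area placed in the virtual space, and the display area is
// a window onto that area which the scroll process moves.
struct ContentIcon {
    int id;
    float x, y;          // position inside the content area
    float width, height;
};

struct ContentArea {
    std::vector<ContentIcon> icons;
    float width;         // total width of the content area
};

ContentArea InitializeContentArea(int iconCount, float iconSize, float spacing) {
    ContentArea area;
    for (int i = 0; i < iconCount; ++i) {
        const float x = spacing + i * (iconSize + spacing);
        area.icons.push_back(ContentIcon{i, x, spacing, iconSize, iconSize});
    }
    area.width = spacing + iconCount * (iconSize + spacing);
    return area;
}
```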
Subsequently, the menu process proceeds by having a process loop of steps S2 to S27 repeated in every single frame.
Next, at step S2, the operation data 328 is acquired. Then, at step S3, the acquired operation data 328 is referenced, and whether or not a touch input is performed on the touch panel 13 is determined. As a result, if it is determined that a touch input is conducted (YES at step S3), the coordinate value of the input is acquired, and whether or not the touch input is a continuous touch input is determined at the next step S4. This is determined from whether or not data is set in the lastly inputted coordinates data 329. As a result of the determination, if it is determined that a continuous touch input is not performed (NO at step S4), this means that an operation categorized as a so-called touch-on has been conducted. In this case, first, at step S5, it is determined whether or not the inertia representation is being conducted; that is, whether or not the inertia scrolling by the flick operation as described above is still continuing. As a result, if it is determined that the inertia representation is being conducted (YES at step S5), a process of cancelling the inertia representation is executed at step S6. On the other hand, if it is determined that the inertia representation is not being conducted (NO at step S5), the process at step S6 is skipped.
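By way of illustration only, the branching at steps S3 and S4 could be organized as in the following C++ sketch, which treats a touch with no coordinates stored from the previous frame as a touch-on; the type and function names are assumptions for illustration.

```cpp
#include <optional>

struct Vec2 { float x, y; };

// Sketch of the per-frame branch at steps S3-S4: a touch this frame with no
// coordinates stored from the previous frame is treated as a touch-on; a touch
// with stored previous coordinates is treated as a continuous touch (holding
// an identical position or dragging).
enum class TouchEvent { None, TouchOn, ContinuousTouch };

TouchEvent ClassifyTouch(const std::optional<Vec2>& currentTouch,
                         const std::optional<Vec2>& lastInputCoords) {
    if (!currentTouch) {
        return TouchEvent::None;          // handled by the touch-off branch (step S15)
    }
    if (!lastInputCoords) {
        return TouchEvent::TouchOn;       // first frame of a new touch
    }
    return TouchEvent::ContinuousTouch;   // identical position held or a drag operation
}
```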
Next, at step S7, a process to be conducted upon the touch-on is executed. In this process, a predetermined process in accordance with the input coordinates described above is executed as appropriate. For example, when a content icon 101 is touched-on, a process for displaying a description of an application corresponding to the content icon 101, or the like, is executed. Then, the process is advanced to step S14, which is described later.
On the other hand, as a result of the determination at step S4 described above, if it is determined that a continuous touch input is conducted (YES at step S4), then either an identical position is continuously being touched or a drag operation (scroll operation) is being conducted. In such a case, next, at step S8, whether or not an end of the contents has been reached is determined for the object displayed on the screen. That is, it is determined whether or not an end of the contents is within a predetermined position of the display area. For example, with regard to the above described example in
Note that, with regard to the method of determining whether or not an end of the contents has been reached, the processing method described above is merely one example and the present invention is not limited thereto; any processing method may be used as long as reaching an end of the contents can be distinguished.
On the other hand, as a result of the determination at step S8, if it is determined that an end of the contents is included in the display area (YES at step S8), next, at step S9, the type of operation is distinguished based on the operation data 328, and it is determined whether or not a scroll operation exceeding the end of the contents is performed. For example, with regard to the above described example in
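By way of illustration only, the determinations at steps S8 and S9 could be sketched as follows in C++; the assumption that the scroll offset hitting its limits corresponds to an end of the contents being displayed, and the sign convention for the drag, are illustrative only.

```cpp
// Sketch of the determinations at steps S8 and S9: whether an end of the
// contents is at the predetermined position of the display area, and whether
// the current drag tries to scroll past that end.
struct ScrollView {
    float offsetX;       // current scroll offset into the contents
    float maxOffsetX;    // contentWidth - displayWidth
};

// Step S8: is an end of the contents within the display area?
bool EndOfContentsDisplayed(const ScrollView& view) {
    return view.offsetX <= 0.0f || view.offsetX >= view.maxOffsetX;
}

// Step S9: does the drag (deltaX = current x - last x) exceed the end that is
// currently displayed?
bool ScrollExceedsEnd(const ScrollView& view, float deltaX) {
    const bool atLeftEnd  = view.offsetX <= 0.0f;
    const bool atRightEnd = view.offsetX >= view.maxOffsetX;
    // Dragging right (deltaX > 0) moves the contents right, i.e. asks for
    // material beyond the left end; dragging left asks for the right end.
    return (atLeftEnd && deltaX > 0.0f) || (atRightEnd && deltaX < 0.0f);
}
```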
On the other hand, as a result of the determination at step S9 described above, if it is determined that a scroll operation exceeding the end of the contents is performed (YES at step S9), a scroll limit representation process is executed at step S12.
On the other hand, if it is determined that the scroll limit representation is not being conducted (NO at step S41), the ghost objects 201 are generated at the next step S42, based on the portion of the contents currently displayed on the display area. For example, texture data is generated by copying an image of the portion of the content icons in the display area. Then, a polygon having a plate-like shape is generated as appropriate, and the texture is pasted onto it to generate the ghost objects 201.
Next, at step S43, the transparency, size, and color of the ghost objects 201 are set. That is, the transparency, size, and color of the ghost objects 201 at the time of their initial arrangement are set as appropriate. Here, the size is set to be identical to the size of the content icons 101. Furthermore, the color is set to a color that appears as a monochrome tone; and, when 100% is defined as a state of complete transparency, the transparency is set to 80%. Here, all of the transparency, size, and color of the ghost objects 201 are set, but the present invention is not limited thereto, and only one of the transparency, size, and color may be set.
Next, at step S44, the ghost objects 201 are arranged so as to be superimposed on the respective content icons 101 as shown in
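By way of illustration only, steps S42 to S44 could be sketched as follows in C++; the 80% initial transparency, the monochrome tone, and the superimposed placement follow the description above, but the texture handling and structure names are assumptions, not the actual implementation.

```cpp
#include <vector>

// Sketch of steps S42-S44: a ghost object is generated for each content icon
// currently in the display area by copying its image as a texture onto a
// plate-shaped polygon, given an initial transparency, size, and monochrome
// tint, and placed directly over the corresponding icon.
struct IconImage { int textureId; float x, y, w, h; };

struct GhostObject {
    int textureId;       // copy of the icon image pasted onto a flat polygon
    float x, y, w, h;    // initially identical to the source icon
    float alpha;         // 0.0 = opaque, 1.0 = fully transparent
    bool monochrome;     // drawn in a monochrome tone
};

std::vector<GhostObject> GenerateGhostObjects(const std::vector<IconImage>& visibleIcons) {
    std::vector<GhostObject> ghosts;
    for (const IconImage& icon : visibleIcons) {
        GhostObject g;
        g.textureId = icon.textureId;   // texture copied from the displayed portion (S42)
        g.x = icon.x; g.y = icon.y;     // superimposed on the corresponding icon (S44)
        g.w = icon.w; g.h = icon.h;     // size identical to the content icon (S43)
        g.alpha = 0.8f;                 // 80% transparent at the initial arrangement (S43)
        g.monochrome = true;            // monochrome tone (S43)
        ghosts.push_back(g);
    }
    return ghosts;
}
```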
Next, at step S46, the ghost objects 201 are moved as appropriate based on the input coordinates included in the operation data 328. Then, at step S47, the transparency and size of the ghost objects 201 are changed in accordance with the distance from a screen end on a limiting side of the scrolling (in the above described examples in
Other than the above, for example, instead of using the distance from the screen end, the size and transparency may be changed based on an amount of movement of the ghost objects 201 from their initial arrangement positions. Furthermore, it is not necessary to change both the transparency and the size; either one of them may be changed, or the process of step S47 may be omitted so that the transparency and size are not changed.
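By way of illustration only, steps S46 and S47 could be sketched as follows in C++; the direction and curves of the transparency and size changes are assumed for illustration, since the embodiment only specifies that they change in accordance with the distance from the screen end on the limiting side.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Sketch of steps S46-S47: the ghost objects follow the change of the input
// coordinates, and their transparency and size are changed according to the
// distance from the screen end on the limiting side of the scrolling.
struct Ghost { float x, y, scale, alpha; };

void MoveGhosts(std::vector<Ghost>& ghosts, float deltaX, float limitEdgeX) {
    for (Ghost& g : ghosts) {
        g.x += deltaX;   // follow the drag in the direction of the scroll operation (S46)

        // Distance from the screen end on the limiting side of the scroll (S47).
        const float distance = std::fabs(g.x - limitEdgeX);
        const float t = std::clamp(distance / 200.0f, 0.0f, 1.0f);

        g.alpha = 0.8f + 0.2f * t;   // fade further toward full transparency (assumed curve)
        g.scale = 1.0f - 0.3f * t;   // shrink slightly as the ghost is pulled away (assumed)
    }
}
```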
With this, the scroll limit representation process ends.
Returning to
Described next is the process conducted when it is determined, as a result of the determination at step S3 described above, that a touch input is not performed (NO at step S3). In this case, first, at step S15 in
As a result of the determination at step S16, if it is determined that the touch-off with inertia is performed (YES at step S16), next, at step S22, an inertia touch-off process is executed. This process is a process for conducting the inertia representation as described above.
On the other hand, as a result of the determination at step S61, if it is determined that an end of the contents has been reached (YES at step S61), at step S63, various parameters for conducting the scroll limit representation as described above are calculated. Specifically, the ghost objects 201 as described above are generated. Furthermore, an amount of movement, a velocity of motion, a duration of motion, and the like are calculated for the ghost objects 201 in accordance with the amount of change of the input coordinates indicated by the lastly inputted coordinates data 329 and the second-from-lastly inputted coordinates data 330. That is, the various parameters necessary for conducting the scroll limit representation with inertia force are calculated. Then, at step S64, the calculated parameters are stored as the inertia representation data 325.
Next, at step S65, the inertia representation flag 327 is set to be ON. At the following step S66, the inertia representation is initiated based on the inertia representation data 325. As a result, if an end of the contents has not been reached when the touch-off caused by the flick operation is performed, the above described inertia scrolling is displayed. Furthermore, if the touch-off caused by the flick operation is performed in a state where an end of the contents has been reached, the above described scroll limit representation based on inertia force is displayed, even though the user is no longer touching the touch panel 13. With this, the inertia touch-off process ends.
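By way of illustration only, the parameter calculation at steps S63 and S64 could be sketched as follows in C++, deriving a velocity from the last two input coordinates; the scaling factor and frame count are assumed values.

```cpp
#include <cmath>

// Sketch of steps S63-S64: when a flick ends in a state where an end of the
// contents has been reached, parameters for a scroll limit representation
// driven by inertia force are derived from the change between the last two
// input coordinates.
struct Vec2 { float x, y; };

struct InertiaRepresentationData {      // corresponds to inertia representation data 325
    float moveAmount;                   // how far the ghost objects travel
    float velocity;                     // initial speed of the motion
    int durationFrames;                 // how long the motion lasts
};

InertiaRepresentationData ComputeInertiaParameters(const Vec2& lastInput,
                                                   const Vec2& secondLastInput) {
    const float deltaX = lastInput.x - secondLastInput.x;  // change per frame at touch-off
    InertiaRepresentationData data;
    data.velocity = deltaX;                                // reflects the flick strength
    data.moveAmount = std::fabs(deltaX) * 6.0f;            // assumed scaling
    data.durationFrames = 20;                              // assumed duration
    return data;
}
```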
Returning to
On the other hand, as a result of the determination at step S16 described above, if it is determined that the touch-off with inertia is not conducted (i.e., a normal touch-off without the flick operation is performed) (NO at step S16), then, at step S17, it is determined whether or not the scroll limit representation is being conducted by referring to the scroll limit representation flag 326. As a result, if it is determined that the scroll limit representation is being conducted (YES at step S17), this means that the touch-off has been conducted in a state where the ghost objects 201 are displayed as shown in
Next, at step S19, the scroll limit representation flag 326 is set to be OFF.
On the other hand, as a result of the determination at step S17 described above, if it is determined that the scroll limit representation is not being conducted (NO at step S17), the processes at steps S18 and S19 described above are skipped.
Next, at step S20, various processes to be conducted upon a touch-off are executed. For example, if a touch-off is conducted in a state where a content icon 101 has been touched (i.e., if a tap operation is performed on the content icon 101), a process for starting up an application corresponding to the touched content icon 101, or the like, is executed. Note that when an application starts up, the menu process is suspended, and it restarts when the application ends.
Next, at step S21, in association with the touch-off operation, the lastly inputted coordinates data 329 and the second-from-lastly inputted coordinates data 330 are cleared. Then, the process is advanced to step S14, which is described above.
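By way of illustration only, the normal touch-off branch (steps S17 to S21) could be sketched as follows in C++, assuming the return-and-erase behavior of the ghost objects described for the seventh to tenth aspects above; the return speed and the per-frame structure are assumptions for illustration.

```cpp
#include <cmath>
#include <optional>
#include <vector>

// Sketch of the normal touch-off branch (steps S17-S21): while the scroll
// limit representation is active, the ghost objects are moved back toward
// their initial (reference point) positions while their transparency is
// restored toward its initial value, and they are erased once they arrive;
// the flag and the stored input coordinates are cleared. Called once per
// frame after the touch-off; the return speed of 0.3 is an assumed value.
struct Ghost { float x, originX, alpha; };
struct Vec2 { float x, y; };

struct TouchOffState {
    std::vector<Ghost> ghosts;
    bool scrollLimitRepresentationFlag = false;   // flag 326
    std::optional<Vec2> lastInputCoords;          // data 329
    std::optional<Vec2> secondLastInputCoords;    // data 330
};

void StepTouchOffReturn(TouchOffState& s) {
    if (s.scrollLimitRepresentationFlag) {
        bool allReturned = true;
        for (Ghost& g : s.ghosts) {
            g.x += (g.originX - g.x) * 0.3f;        // move back toward the reference point
            g.alpha += (0.8f - g.alpha) * 0.3f;     // restore toward the initial transparency
            if (std::fabs(g.x - g.originX) > 0.5f) allReturned = false;
        }
        if (allReturned) {
            s.ghosts.clear();                        // erase the ghost objects (step S18)
            s.scrollLimitRepresentationFlag = false; // step S19
        }
    }
    s.lastInputCoords.reset();                       // step S21 (idempotent when repeated)
    s.secondLastInputCoords.reset();
}
```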
Described next is the process conducted when it is determined, as a result of the determination at step S15 described above, that a touch-off is not performed (NO at step S15). In this case, it can be assumed that a state where the user is not touching the touch panel is continuing. In such a case, first, at step S23, the inertia representation flag 327 is referenced, and it is determined whether or not the inertia representation is being conducted. As a result, if it is determined that the inertia representation is not being conducted (NO at step S23), the process is advanced to step S27, which is described later.
On the other hand, if it is determined that the inertia representation is being conducted (YES at step S23), next, at step S24, the process of the inertia representation based on the inertia representation data 325 is continued.
Next, at step S25, whether or not an ending condition of the inertia representation is satisfied is determined. For example, whether or not the inertia representation should be ended is determined depending on whether or not the inertia scrolling has reached the amount indicated by the inertia representation data 325. In addition, the ending condition of the inertia representation is determined to be satisfied also when an end of the contents has been reached during the inertia scrolling. As a result of the determination at step S25, if it is determined that the ending condition of the inertia representation is not satisfied (NO at step S25), the process is advanced to step S27, which is described later. On the other hand, if it is determined that the ending condition of the inertia representation is satisfied (YES at step S25), at step S26, the inertia representation flag 327 is set to be OFF.
Next, at step S27, various processes other than the inertia representation described above are performed as appropriate. Descriptions of these processes are omitted since they are not directly related to the present embodiment. Then, the process is advanced to step S14 described above. This concludes the description of the menu process of the present embodiment.
As described above, in the present embodiment, when an end of the contents is within the display area and a scroll operation is performed in a situation where a further scroll operation would be futile, the ghost objects 201 appear and move in accordance with the scroll operation. This allows the user to intuitively understand that the contents have reached an end, without the need to narrow the area in which the contents are displayed.
In the embodiment described above, the case where the ghost objects 201a to 201d corresponding to all the content icons 101a to 101d in the display area are generated has been used as the example for the ghost objects 201. As another example, a ghost object 201 may be generated for only the touched content icon 101 as shown in
Furthermore, in the embodiment described above, the scrolling stops at the time point when reaching an end of the contents during inertia scrolling. Other than stopping, the scroll limit representation as described above may be conducted in accordance with remaining inertia force when reaching an end of the contents. More specifically, the ghost objects 201 may appear and move in accordance with remaining inertia force when reaching an end of the contents.
Furthermore, in the embodiment described above, described mainly as an example is an operation on a menu screen of a hand-held game apparatus capable of touch operation. However, the applicable apparatus of the present invention is not limited thereto, and the present invention is also applicable when scrolling contents by conducting the drag operation as described above by using a pointing device on various information processing terminals such as a stationary game apparatus, a personal computer, an electronic book reader, and the like. Other than the touch panel described above, the pointing device may be, for example: a mouse capable of pointing to an arbitrary position on a screen; a tablet which has no display screen and which is for designating an arbitrary position on an operation surface; or a pointing device, including imaging means for remotely imaging a display screen around whose periphery markers are arranged, that calculates coordinates on the display screen corresponding to a pointed position by using the positions of the markers within an image taken by pointing the device in the direction of the display screen.
Furthermore, with regard to the applications that can be used, as described above, various applications for browsing contents that cannot be displayed on a single screen while scrolling them, such as an electronic document viewer and an internet browser, can be used. Alternatively, the present invention is applicable to a general situation where a list of some information, for example, an item list in a game process, is displayed and where it is necessary to perform a scroll operation.
Furthermore, in the embodiment described above, horizontal scrolling is used as an example; however, the scrolling direction is not limited thereto, and the present invention is also applicable to vertical scrolling.
Furthermore, in the embodiment described above, the touch panel is used as an example of a device for detecting a position pointed to by the player in an operation area when conducting the scroll operation; however, a so-called pointing device which allows the player to designate a position within a predetermined area may be used, examples of which include: a mouse capable of pointing to an arbitrary position on a screen; a tablet which has no display screen and which is for designating an arbitrary position on an operation surface; and a pointing device, including imaging means for remotely imaging a display screen around whose periphery markers are arranged, that calculates coordinates on the display screen corresponding to a pointed position by using the positions of the markers within an image taken by pointing the device in the direction of the display screen. Furthermore, instead of a pointing device, the present invention is also applicable when the scrolling described above is conducted by an operation using buttons such as, for example, a cross key or cursor keys. When such a button operation is used, for example, when a scroll operation is performed by holding down the left button of the cross key and the left button continues to be held down after an end of the contents is reached, the scroll limit representation described above will be conducted.
Furthermore, with regard to the ghost objects 201, in the embodiment described above, the ghost objects are moved in the direction in which the user intends to scroll (the direction of change of the input coordinates). However, as an alternative example, a scroll limit representation in which the ghost objects 201 are moved toward the near side of the screen (in the depth direction within the virtual space) in accordance with the scroll operation may be conducted.
Furthermore, in the embodiment described above, a case has been described where a series of processes for conducting the scroll limit representation in accordance with the scroll operation are executed on a single apparatus (the game apparatus 1). However, in another embodiment, the series of processes may be executed on an information processing system including a plurality of information processing apparatuses. For example, in an information processing system which includes a terminal side apparatus and a server side apparatus that is capable of communicating with the terminal side apparatus via a network, one part of the processes among the series of processes may be executed on the server side apparatus. Further, in an information processing system which includes a terminal side apparatus and a server side apparatus that is capable of communicating with the terminal side apparatus via a network, main processes of the series of processes may be executed on the server side apparatus, and one part of the processes may be executed on the terminal side apparatus. Still further, in the information processing system described above, the system on the server side may be configured with a plurality of information processing apparatuses, and processes to be executed on the server side may be divided to be executed by the plurality of information processing apparatuses.
While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous other modifications and variations can be devised without departing from the scope of the invention.