1. Field
Methods and apparatuses consistent with exemplary embodiments of the present disclosure relate to a contents display method and a mobile terminal implementing the same.
2. Description of the Related Art
A mobile terminal provides various contents. The contents may be displayed on each of a plurality of pages. However, a contents display method and an apparatus thereof in the related art do not provide the user with a feeling of operating pages on the mobile terminal that is similar to the feeling of handling an actual paper book. According to a contents display method and an apparatus thereof of the related art, if a user provides input information associated with a page skip (e.g., a push of a next page button), the input is detected, and a currently displayed page is replaced with a next page. Such a replacement scheme does not actually skip the currently displayed page but simply displays a next page, similar to browsing to a next web page. Meanwhile, a recently developed mobile terminal may include a touch screen. Such a mobile terminal detects a gesture and skips pages in response to the detected gesture. When the user skips the pages, the mobile terminal according to the related art provides an animation which gradually folds a current page (that is, a front surface of the page) and shows a next page (that is, a back surface of the page) regardless of a touched point or a direction of a drag.
One or more exemplary embodiments provide a contents display method capable of achieving a realistic feeling for a user when the user operates a screen on which a page is displayed by a touch input device (e.g., finger or pen), and an apparatus thereof.
One or more exemplary embodiments also provide a contents display method in which an animation of pages being skipped provides a realistic feeling, and an apparatus thereof.
In accordance with an aspect of an exemplary embodiment, there is provided a method of displaying contents of pages displayed by a mobile terminal including a display unit in which a touch panel is installed, the method including: displaying a page; detecting movement of a touch input device with respect to the displayed page; and displaying the page so that the page is convexly deformed and skipped in response to the movement of the touch input device.
In accordance with an aspect of another exemplary embodiment, there is provided a mobile terminal including: a display unit in which a touch panel is installed and configured to display contents for each of a plurality of pages; a memory configured to store the pages; and a controller configured to control the display unit such that one of the pages is displayed, detect movement of a touch input device with respect to the displayed page, and control the display unit such that the page is displayed as convexly deformed and skipped in response to the movement of the touch input device.
In accordance with an aspect of another exemplary embodiment, there is provided a method to display pages, the method including: displaying a page on a device including a touch input unit; generating a page mesh corresponding to the displayed page, the page mesh including a plurality of nodes having respective weights; detecting, using the touch input unit, movement of a touch input device with respect to the displayed page; and changing an appearance of the page according to the detected movement and the page mesh.
The above and/or other aspects will be more apparent from the following detailed description of exemplary embodiments in conjunction with the accompanying drawings.
Exemplary embodiments are described with reference to the accompanying drawings in detail. The same reference numbers are used throughout the drawings to refer to the same or like parts. Detailed descriptions of well-known functions and structures incorporated herein may be omitted to avoid obscuring the subject matter of the exemplary embodiments.
The contents display method according to exemplary embodiments may be implemented by a mobile terminal, for example, a smart phone, a tablet PC, an e-book reader, a navigation device, or a moving image player. Hereinafter, the contents display method and the mobile terminal thereof will be described in detail.
As used herein, according to exemplary embodiments, the term ‘contents’ may refer to photographs, videos, audio, images, calendars, contacts, memos, documents, e-books, web pages, thumbnails, and icons, in addition to many other types of contents. The contents are displayed for each page. The pages according to exemplary embodiments may be convexly and stereoscopically deformed in response to a user gesture. Accordingly, when the user operates, with a touch input device (e.g., a finger or a pen), a screen on which pages are displayed, the user may feel as if the user is handling real paper pages.
As used herein, according to exemplary embodiments, the term ‘page mesh’ refers to geometrical information of a page. The page mesh includes a plurality of nodes and links connecting the nodes with each other. A weight is allocated to each node, and an elastic value is allocated to each link. The elastic value may be allocated differently according to properties of the pages in order to provide the user with a realistic feeling.
For example, when a page is set to be thick (that is, when a large weight is set), a large elastic value may be allocated. When a page is set to be relatively thin, a relatively small elastic value may be allocated. A large weight may be allocated to nodes located at an inner side (e.g., the spine) of the page. Because the location variation of relatively outer nodes (e.g., at the page edge) is greater than that of inner nodes, a small weight may be allocated to the relatively outer nodes. Alternatively, the same weight may be allocated to all nodes. According to exemplary embodiments, the weights of the nodes correspond to resistances of the nodes against being convexly deformed by the detected movement. According to exemplary embodiments, the weights of the nodes may decrease in a direction moving away from the spine, or may be equal throughout the displayed page.
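The page mesh described above can be represented as a grid of weighted nodes joined by elastic links. The following Python sketch illustrates one possible representation; the names (Node, Link, build_page_mesh), the spine-weighting rule, and the scaling of elasticity by thickness are illustrative assumptions, not details taken from the source.

```python
from dataclasses import dataclass

@dataclass
class Node:
    x: float                 # position on the virtual XY plane
    y: float
    z: float = 0.0           # height along the virtual Z axis
    weight: float = 1.0      # resistance against being deformed

@dataclass
class Link:
    a: int                   # indices of the two connected nodes
    b: int
    rest_length: float
    elasticity: float        # larger for thicker, stiffer pages

def build_page_mesh(cols, rows, width, height, thickness=1.0):
    """Builds a cols x rows grid (cols, rows >= 2). Nodes nearest the
    spine (assumed to be column 0) get the largest weights; weights
    decrease toward the outer page edge."""
    nodes, links = [], []
    for j in range(rows):
        for i in range(cols):
            nodes.append(Node(x=i * width / (cols - 1),
                              y=j * height / (rows - 1),
                              weight=1.0 + (cols - 1 - i)))
    for j in range(rows):
        for i in range(cols):
            idx = j * cols + i
            if i + 1 < cols:   # link to the right-hand neighbor
                links.append(Link(idx, idx + 1, width / (cols - 1),
                                  elasticity=10.0 * thickness))
            if j + 1 < rows:   # link to the neighbor below
                links.append(Link(idx, idx + cols, height / (rows - 1),
                                  elasticity=10.0 * thickness))
    return nodes, links
```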
A virtual force applied to each node may be classified into one of two types. The first type is an internal virtual force (hereinafter also referred to as an ‘internal force’), such as the elastic force of the links. The second type is an external virtual force (hereinafter also referred to as an ‘external force’), such as virtual gravity or a force applied by a person. The virtual gravity, an external force, is defined as a force pulling a node down. If a screen on which the pages are displayed is arranged in an XY plane and a user's viewpoint lies in the positive Z direction above the XY plane, the virtual gravity pulls a node down toward the XY plane. The Z axis is orthogonal to the XY plane. The Z axis is not an actual axis, but a virtual axis for stereoscopically controlling the virtual page. The virtual gravity may act equally on all nodes. Alternatively, gravity may have a different effect according to a property of a page in order to provide the user with a realistic feeling. For example, in a case where the user lifts and puts down a page of a real paper book, when the page is thin, the page falls down slowly, and when the page is relatively thick, the page falls down rapidly. Table 1 illustrates thicknesses by types of pages. For example, referring to Table 1, a pamphlet falls down faster than a leaflet. That is, a deformation degree of the page may be changed according to the thickness and material set for the displayed page.
An artificial force is a force which the user applies to the page. For example, a user gesture with respect to the screen may act as the artificial force. A target node touched by the touch input device is moved in the direction in which the touch input device is moved. In this case, the artificial force is transferred to the other nodes through the links. As a result, a sum of the internal force and the external forces is applied to each node. If the artificial force is applied to a displayed page, a controller of the mobile terminal calculates the force applied to each node based on the artificial force applied to the displayed page. The force may be obtained in various ways, for example, by using a moving distance and a speed of a target node to obtain an acceleration, and by multiplying the acceleration by the weight of the corresponding target node. The calculation of the force is generally known in the art, and thus a detailed description thereof is omitted. The mobile terminal reflects the deformed page mesh on a page to generate an animation. The procedure of generating the animation based on the applied forces is referred to as a physically-based simulation. The physically-based simulation may be executed by various components, such as, for example, an Application Processor (AP), a Central Processing Unit (CPU), or a Graphics Processing Unit (GPU).
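One step of such a physically-based simulation can be sketched as follows, using the mesh representation above. This is a minimal illustration assuming Hooke's-law springs and a crude explicit integration scheme; the source does not prescribe a particular scheme, and the helper names and constants are assumptions.

```python
GRAVITY = -9.8   # virtual gravity pulling nodes down along the Z axis

def simulation_step(nodes, links, dt, target=None, drag_force=(0.0, 0.0, 0.0)):
    """Advances the page mesh by one time step dt. 'target' is the index
    of the node under the touch input device, if any."""
    # External force: gravity acts on every node, scaled by its weight.
    forces = [[0.0, 0.0, n.weight * GRAVITY] for n in nodes]
    # Internal force: elastic forces transferred through the links.
    for ln in links:
        na, nb = nodes[ln.a], nodes[ln.b]
        dx, dy, dz = nb.x - na.x, nb.y - na.y, nb.z - na.z
        dist = (dx * dx + dy * dy + dz * dz) ** 0.5 or 1e-9
        f = ln.elasticity * (dist - ln.rest_length)      # Hooke's law
        for axis, d in enumerate((dx, dy, dz)):
            forces[ln.a][axis] += f * d / dist
            forces[ln.b][axis] -= f * d / dist
    # External force: the user's drag acts on the touched target node.
    if target is not None:
        for axis in range(3):
            forces[target][axis] += drag_force[axis]
    # Heavier nodes (e.g., near the spine) resist deformation more.
    for k, n in enumerate(nodes):
        n.x += forces[k][0] / n.weight * dt * dt
        n.y += forces[k][1] / n.weight * dt * dt
        n.z += forces[k][2] / n.weight * dt * dt
```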
The display unit 110 displays contents on a screen under the control of the controller 160. That is, when the controller 160 processes the contents (e.g., performs a decoding or resizing operation) and stores the contents in a buffer, the display unit 110 converts the contents stored in the buffer into an analog signal, and displays the resulting image on the screen. When power is supplied to the display unit 110, the display unit 110 displays a lock image (e.g., a login image) on the screen. If lock release information (e.g., a password) is detected while the lock image is displayed, the controller 160 releases the lock. That is, the display unit 110 terminates the displaying of the lock image, and displays another image, for example, a home image, under the control of the controller 160. The home image includes a background image and a plurality of icons displayed thereon. Each icon indicates an application or contents. If the user selects an icon, for example, an application icon (e.g., taps the icon), the controller 160 executes a corresponding application (e.g., a gallery), and controls the display unit 110 to display an execution image of the corresponding application (e.g., a page including a plurality of thumbnails).
The display unit 110 may be implemented in various forms, for example, as a Liquid Crystal Display (LCD), an Organic Light Emitting Diode (OLED) display, an Active Matrix Organic Light Emitting Diode (AMOLED) display, or a flexible display.
The touch panel 111 is installed on a screen of the display unit 110. For example, the touch panel 111 may be implemented as an add-on type located on the screen of the display unit 110, or as an on-cell type or an in-cell type embedded in the display unit 110.
The touch panel 111 generates an analog signal (e.g., a touch event) in response to a touch of a touch input device (e.g., a finger or a pen) with respect to the screen, and a touch IC 112 converts the analog signal into a digital signal and transfers the digital signal to the controller 160. The touch event includes a touch coordinate (x, y). For example, the touch IC 112 determines a representative coordinate among a plurality of touch coordinates, stores the determined touch coordinate in an internal memory of the touch IC 112, and transfers the stored touch coordinate to the controller 160 in response to a request of the controller 160. The touch coordinate may be in pixel units. For example, when the resolution of the screen is 640 (the number of horizontal pixels) × 480 (the number of vertical pixels), an X-axis coordinate lies in the range (0, 640), and a Y-axis coordinate lies in the range (0, 480).
When the touch coordinate is received from the touch IC 112, the controller 160 determines that the touch input device (e.g., a finger or a pen) touches the touch panel 111. When the touch coordinate is no longer received from the touch IC 112, the controller 160 determines that the touch of the touch input device is released. Further, for example, when the touch coordinate varies from (x0, y0) to (x1, y1) and the variation amount D of the touch coordinate, where D² = (x0 − x1)² + (y0 − y1)², exceeds a preset moving threshold (e.g., 1 mm), the controller 160 determines that the touch input device has moved.
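The movement test described above amounts to comparing the squared displacement against a squared threshold, which avoids a square-root operation. Below is a minimal sketch; the pixel value of the threshold is an assumed conversion of the 1 mm example and depends on the screen's pixel density.

```python
MOVE_THRESHOLD_PX = 10  # assumed pixel equivalent of the ~1 mm example

def has_moved(x0, y0, x1, y1):
    """True if the touch coordinate moved beyond the moving threshold."""
    d_squared = (x0 - x1) ** 2 + (y0 - y1) ** 2
    return d_squared > MOVE_THRESHOLD_PX ** 2
```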
The controller 160 computes a location variation amount (dx, dy) of the touch and a moving speed of the touch input device in response to movement of the touch input device. The controller 160 determines a user gesture as one of various types of gestures, for example, touch, multi-touch, tap, double tap, long tap, tap & touch, drag, flick, press, pinch in, and pinch out, based on whether the touch is released, whether the touch input device moves, the location variation amount of the touch input device, and the moving speed of the touch input device. The touch is a gesture in which the user makes the touch input device contact one point of the touch panel 111 on the screen. The multi-touch is a gesture in which the user makes a plurality of touch input devices (e.g., a thumb and an index finger) contact the touch panel 111. The tap is a gesture in which the user touches one point and then releases the touch at that point without movement. The double tap is a gesture in which the user continuously taps one point twice. The long tap is a gesture in which the user touches one point for longer than a tap and then releases the touch at that point without moving the touch input device. The tap & touch is a gesture in which the user touches one point of the screen again within a predetermined time (e.g., 0.5 seconds) after tapping it. The drag is a gesture in which the user moves the touch input device in a predetermined direction while maintaining the touch. The flick is a gesture in which the user moves the touch input device faster than in the drag and then releases the touch. The press is a gesture in which the user maintains the touch on one point without movement for a predetermined time (e.g., 2 seconds). The pinch in is a gesture in which the user narrows the interval between two touch input devices after simultaneously multi-touching two points with the two touch input devices. The pinch out is a gesture in which the user widens the interval between the touch input devices. That is, the touch is a gesture in which the user contacts the touch screen, and the other gestures refer to variations of the touch.
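The gesture determination can be illustrated with a small release-time classifier. The sketch below covers only single-touch gestures distinguished by movement, speed, and duration; the distance and speed cutoffs are assumptions, while the 0.5-second value echoes the example in the text.

```python
TAP_TIME = 0.5          # s; assumed upper bound for a tap
MOVE_THRESHOLD_PX = 10  # assumed, as in the earlier sketch
FLICK_SPEED = 1000.0    # px/s; assumed cutoff between drag and flick

def classify_on_release(duration, distance, speed):
    """Classifies a single-touch gesture at touch release. Multi-touch
    gestures (pinch in/out) and the press, which is detected while the
    touch is still held, are omitted for brevity."""
    if distance < MOVE_THRESHOLD_PX:        # stayed on one point
        return "long tap" if duration > TAP_TIME else "tap"
    return "flick" if speed > FLICK_SPEED else "drag"
```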
The touch panel 111 may be a converged touch panel including a hand touch panel which detects a hand gesture and a pen touch panel which detects a pen gesture. The hand touch panel may include a capacitive type touch panel. The hand touch panel may also include a resistive type touch panel, an infrared type touch panel, or an ultrasonic type touch panel. Further, the hand touch panel does not generate a touch event only in response to a hand gesture, but may also generate the touch event in response to other objects touching the touch panel 111 (e.g., a conductive material capable of causing a variation in capacitance). The pen touch panel may include an electromagnetic induction type touch panel. Accordingly, the pen touch panel generates a touch event using a specially manufactured touch pen 170 which forms a magnetic field. In particular, the touch event generated by the pen touch panel includes a value indicating a type of the touch together with a touch coordinate. For example, when a first voltage level is received from the pen touch panel, the controller 160 determines the touch of the touch input device to be an indirect touch (that is, hovering). When a second voltage level greater than the first voltage level is received from the touch panel 111, the controller 160 determines the touch of the touch input device to be a direct touch. The touch event generated by the pen touch panel may further include a value indicating whether a button installed on the pen 170 is pushed. For example, when the button installed on the pen 170 is pushed, the magnetic field generated by a coil of the pen 170 varies. In response to the variation in the magnetic field, the pen touch panel generates a third voltage level, and transfers the third voltage level to the controller 160. According to exemplary embodiments, the detecting of movement includes detecting a touch input device touching the touch input unit. According to exemplary embodiments, the detecting of movement includes detecting a voltage level of the touch input device.
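The mapping from voltage level to touch type can be sketched as a simple threshold comparison; the concrete level values below are placeholders, since the text only states that the second level is greater than the first and that a third level signals the pen button.

```python
LEVEL_DIRECT = 2.0   # placeholder voltage thresholds (assumptions)
LEVEL_BUTTON = 3.0

def pen_touch_type(voltage_level):
    """Interprets a pen touch panel voltage level as a touch type."""
    if voltage_level >= LEVEL_BUTTON:
        return "direct touch with pen button pushed"
    if voltage_level >= LEVEL_DIRECT:
        return "direct touch"
    return "indirect touch (hovering)"
```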
The key input unit 120 may include a plurality of input keys and function keys for receiving numeric or character information and setting various functions. The keys may include a menu loading key, a screen on/off key, a power on/off key, and a volume control key. The key input unit 120 generates a key event associated with user settings and function control of the mobile terminal 100, and transfers the key event to the controller 160. The key event may include a power on/off event, a volume control event, a screen on/off event, and a shutter event. The controller 160 controls the foregoing constituent elements in response to the key event. Meanwhile, a key of the key input unit 120 may be referred to as a hard key, and a virtual key displayed on the display unit 110 may be referred to as a soft key.
The secondary memory 130 may include various components, such as a disk, a RAM, a ROM, and a flash memory. The secondary memory 130 stores contents generated by the mobile terminal 100 or contents received from an external device (e.g., a server, a desktop PC, or a tablet PC) through the RF communication unit 140. The secondary memory 130 may temporarily store data copied by the user from a message, a photograph, a web page, or a document for a copy and paste operation. The secondary memory 130 stores various preset values (e.g., a screen brightness, whether to vibrate upon a touch, and whether to automatically rotate the screen). Further, the secondary memory 130 stores history information, for example, information associated with a most recently displayed page before the application is terminated.
The secondary memory 130 stores a booting program, at least one operating system, and applications (e.g., a gallery, an address book, a video player, a calendar, a note pad, an electronic book viewer, a music player, and a web browser). The operating system serves as an interface between hardware and applications, and between applications, and manages computer resources such as a CPU, a Graphic Processing Unit (GPU), a main memory, and the secondary memory 130. The applications may be classified into embedded applications and third party applications. For example, the embedded applications include a web browser, an e-mail program, and an instant messenger. If power of a battery is supplied to the controller 160 of the mobile terminal 100, the booting program is loaded into a main memory 161 of the controller 160. The booting program loads host and guest operating systems into the main memory 161. The operating systems load the applications into the main memory 161.
The RF communication unit 140 performs voice calls, image calls, and data communications with an external device through a network under the control of the controller 160. The RF communication unit 140 may include an RF transmitter which up-converts a frequency of a transmitted signal and amplifies the converted signal, and an RF receiver which low-noise-amplifies a received signal and down-converts its frequency. The RF communication unit 140 may include a mobile communication module (e.g., a 3rd-generation mobile communication module, a 3.5th-generation mobile communication module, a 4th-generation mobile communication module, etc.), a digital broadcasting module (e.g., a DMB module), and a near field communication module.
The audio processor 150 inputs and outputs audio signals (e.g., voice data) for voice recognition, voice recording, digital recording, and call operations. The audio processor 150 receives an audio signal from the controller 160, converts the received audio signal into an analog signal, amplifies the analog signal, and outputs the amplified analog signal through the speaker SPK. The audio processor 150 also converts an audio signal received from the microphone MIC into digital data, and provides the digital data to the controller 160. The speaker SPK converts an audio signal received from the audio processor 150 into a sound wave and outputs the sound wave. The microphone MIC converts a sound wave received from a person or another sound source into an audio signal.
The controller 160 controls overall operations and signal flows between internal constituent elements of the mobile terminal 100, processes data, and controls the supply of power from a battery to the constituent elements. The controller 160 includes at least one CPU. As is generally known in the art, the CPU is a core control unit of a computer system which performs computation and comparison of data, and interpretation and execution of commands. The CPU includes various registers which temporarily store data and commands. The controller 160 may also include at least one Graphic Processing Unit (GPU). The GPU is a graphic control unit which, in place of the CPU, performs computation and comparison of data associated with graphics, and interpretation and execution of the corresponding commands. Each of the CPU and the GPU may be configured by integrating at least two independent cores (e.g., a quad-core) into a single package (a single IC). The CPU and the GPU may be implemented as a System on Chip (SoC). The CPU and the GPU may also be packaged in a multi-layer structure. A configuration including the CPU and the GPU may be referred to as an Application Processor (AP).
The GPU of the controller 160 deforms the page mesh in response to a gesture (e.g., a drag) of the touch input device, and generates an animation by reflecting a page on the deformed page mesh. The GPU receives information associated with a touch gesture from the touch IC 112, and deforms the page mesh using the received information. If the touch of the touch input device is released from the screen, the GPU restores the page mesh to an original state. That is, the deformed page mesh is restored to the original state by the elastic force of the links and the gravity applied to each node. The GPU accesses the secondary memory 130 to read a page therefrom, and reflects deformation information of the page mesh on the read page to generate the animation. The deformation information of the page mesh includes the coordinates (x, y, z) of each node constituting the page mesh. In addition, the GPU controls the display unit 110 to display the animation. The animation may alternatively be generated by a CPU or an Application Processor (AP).
The controller 160 includes a main memory, for example, a RAM. The main memory may store various programs, for example, the booting program, a host operating system, guest operating systems, and applications loaded from the secondary memory 130. The CPUs and GPUs of the controller 160 access these programs to decode their commands and execute functions (e.g., generation of an animation) according to the interpretation results. In addition, the controller 160 temporarily stores data to be written to the secondary memory 130 and temporarily stores data read out from the secondary memory 130. A cache memory may further be provided as temporary data storage.
The pen 170 is an accessory of the mobile terminal 100 which can be separated from the mobile terminal 100, and may include, for example, a penholder, a nib disposed at an end of the penholder, a coil disposed inside the penholder adjacent to the nib to generate a magnetic field, and a button which varies the magnetic field. The coil of the pen 170 forms the magnetic field around the nib. The pen touch panel of the touch panel 111 detects the magnetic field, and generates a touch event corresponding to the magnetic field.
According to exemplary embodiments, the mobile terminal 100 may further include constituent elements which are not described above, such as a Global Positioning System (GPS) module, a vibration motor, and an acceleration sensor.
Alternatively, the same weight may be allocated to all nodes. In this case, the entire movement of the page mesh may be heavier than in the previous case in which different weights are allocated to the nodes. That is, a deformation degree of the page may be changed according to attribute information (e.g., thickness, weight, and material) set for the corresponding page. Further, the deformation degree of the page may be changed according to a calculated gradient. When an artificial force (that is, a gesture of the touch input device) is applied to the page, the controller 160, particularly the GPU of the controller 160, detects the gesture, deforms the page mesh in response to the detected gesture, and reflects the deformed page mesh on the page to generate an animation.
The controller 160 calculates a displacement of the moved target node. The displacement may be represented as a vector having a magnitude and a direction. In detail, the magnitude of the displacement may include a current location of the target node, a moved distance of the target node, a speed of the target node, or any combination thereof. The controller 160 calculates the force applied to each node using the calculated displacement. The force is a vector having a magnitude and a direction. As described above, the force is a sum of an elastic force, gravity, and an artificial force. The controller 160 calculates location values of the respective nodes using the calculated forces, deforms the page mesh according to the computed gradient and the calculated location values, and reflects the deformed page mesh on the page to generate the animation.
The controller 160 determines whether a touch is detected (operation 303). When the touch is not detected, the controller 160 determines whether a threshold time has elapsed (operation 304). The threshold time is set to automatically turn off the screen. If the touch is not detected by the time the threshold time elapses, the controller 160 turns off the screen (operation 305). The threshold time may be set to many different values, e.g., 1 minute, and may be changed according to a selection of the user. When the touch is detected, the controller 160 determines whether the touch input device is moved (e.g., a drag or a flick) (operation 306). When the touch input device is moved, the controller 160 controls the display unit 110 to display a convexly deformed page in response to the movement of the touch input device (operation 307). That is, the controller 160 deforms the page mesh in response to the movement of the touch input device, and reflects the deformed page mesh on the page to generate the animation. A detailed process of operation 307 is described below.
The controller 160 determines whether the touch of the touch input device is released from the screen (operation 308). If the touch of the touch input device is maintained without being released, the process returns to operation 306. Conversely, if the touch is released, the process proceeds to operation 309. The controller 160 determines whether the touch release is an event corresponding to a page skip (operation 309). That is, the controller 160 determines whether a page skip is generated based on at least one of a moving direction of the touch input device, a touch coordinate, and a speed before generation of the touch release. When the page skip is generated, the controller 160 controls the display unit 110 to skip a currently displayed page and to display another page (operation 310). When the page skip is not generated, the controller 160 maintains the displaying of the current page (operation 311). Next, the controller 160 determines whether execution of the application is terminated (operation 312). When the execution of the application is not terminated, the process returns to operation 303.
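The overall control flow of operations 303 through 312 can be summarized in a small event loop. The sketch below assumes hypothetical helper methods (detect_touch, display_convex_deformation, is_page_skip, and so on) that stand in for the controller behavior described above; none of these names come from the source.

```python
import time

SCREEN_OFF_THRESHOLD = 60.0  # s; the 1-minute example from the text

def run_page_viewer(controller):
    """Event loop mirroring operations 303-312 (hypothetical API)."""
    deadline = time.time() + SCREEN_OFF_THRESHOLD
    while not controller.app_terminated():           # operation 312
        touch = controller.detect_touch()            # operation 303
        if touch is None:
            if time.time() > deadline:               # operations 304-305
                controller.turn_off_screen()
            continue
        deadline = time.time() + SCREEN_OFF_THRESHOLD
        while not touch.released:                    # operation 308
            if touch.moved():                        # operation 306
                controller.display_convex_deformation(touch)  # operation 307
            touch.poll()
        if controller.is_page_skip(touch):           # operation 309
            controller.skip_to_other_page()          # operation 310
        else:
            controller.keep_current_page()           # operation 311
```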
The displacement is represented by a vector having a magnitude and a direction. In detail, the magnitude of the displacement may include a current location of the target node, a moved distance of the target node, a speed of the target node, or any combination thereof.
After calculating the displacement, the controller 160 calculates the forces applied to each node using the calculated displacement of the target node (operation 403). That is, the controller 160 calculates a magnitude of the force applied to each node and a direction in which the force is applied (operation 403); such force calculations are generally known in the art. Next, the controller 160 applies the calculated forces to each node to deform the page mesh (operation 404). That is, the controller 160 calculates location values of the respective nodes using the calculated forces (operation 404). Finally, the controller 160 applies the deformed page mesh to a page to generate an animation (operation 405). The generated animation is displayed such that the page is convexly deformed as the target node is moved in a direction perpendicular to gravity or in a determined direction of a gradient. According to exemplary embodiments, the changing of the appearance of the page includes turning the page over to a next page. According to exemplary embodiments, the displaying of the page includes displaying two pages connected at a spine, and the weights of the nodes corresponding to the spine are greater than the weights of other nodes of the two pages. According to exemplary embodiments, the displayed page corresponds to a type of material, and the weights of the nodes correspond to the type of material.
If the touch of the touch input device is released from the deformed page, the page is restored to an original state, that is, an open state. In this case, the page may be skipped, or may not be skipped and instead returned to an original position. Such a result is determined by the forces applied to the respective nodes of the page mesh. That is, if the artificial force disappears, only the elastic force and the gravity are applied to the page mesh. The controller 160 calculates a sum of the forces applied to the respective nodes of the page mesh, determines a moving direction of the page based on the sum of the forces, and moves the page along the determined direction. For example, the page is moved in a direction toward which a mass center of the page mesh faces. Alternatively, the moving direction of the page may be determined as the moving direction of the touch input device immediately before the touch input device is separated from the screen (that is, the page). A detailed example is described below.
The controller 160 determines whether the current touch coordinate is greater than the previous touch coordinate (operation 504). When the current touch coordinate is greater than the previous touch coordinate, the controller 160 determines the moving direction of the touch input device to be a ‘right direction’ (operation 505). When the current touch coordinate is less than or equal to the previous touch coordinate, the controller 160 determines the moving direction to be a ‘left direction’ (operation 506). After the moving direction is determined, the controller 160 sets the current touch coordinate as the previous touch coordinate (operation 507). Next, the controller 160 determines whether the touch of the touch input device is released from the screen (operation 508). When the touch of the touch input device is not released, the process returns to operation 502. Conversely, when the touch of the touch input device is released, the controller 160 determines whether the determined moving direction is the right direction (operation 509). When the moving direction is the right direction, the controller 160 moves the touched page in the right direction (operation 510). If the touched page is a left page, operation 510 corresponds to skipping back to a previous page. When the touched page is a right page, operation 510 corresponds to maintaining the display of the touched page without skipping to the next page. When the moving direction is the left direction, the controller 160 moves the touched page in the left direction (operation 511). If the touched page is the left page, operation 511 corresponds to maintaining the display of the touched page without skipping back. Conversely, if the touched page is the right page, operation 511 corresponds to skipping to a next page.
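The direction tracking and page-skip decision above reduce to comparing successive x coordinates and combining the final direction with which page (left or right) was touched. A minimal sketch, assuming x coordinates increase to the right; the function and parameter names are illustrative.

```python
def page_action_on_release(touch_xs, touched_page):
    """touch_xs: successive x coordinates of the touch (at least two);
    touched_page: 'left' or 'right'. Returns the resulting page action,
    mirroring operations 504-511."""
    direction = "right" if touch_xs[-1] > touch_xs[-2] else "left"
    if direction == "right":
        # Operation 510: a left page skips back; a right page stays put.
        return "skip to previous page" if touched_page == "left" else "stay"
    # Operation 511: a right page skips forward; a left page stays put.
    return "skip to next page" if touched_page == "right" else "stay"
```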
Hereinafter, exemplary embodiments will be described with reference to exemplary screen diagrams. A display mode of a screen according to the exemplary embodiments may be divided into a landscape mode and a portrait mode. In the landscape mode, the mobile terminal 100 displays two pages side by side in the left and right directions. In the portrait mode, the mobile terminal displays one page. However, the exemplary embodiments are not limited thereto. If the user rotates the mobile terminal 100, a sensor (e.g., an acceleration sensor) included in the mobile terminal 100 detects rotation information and transfers the rotation information to the controller 160. The controller 160 may determine the display mode of the mobile terminal 100 using the rotation information. The exemplary embodiments may use either the landscape mode or the portrait mode as the display mode.
The user may touch any part of the page, not only the touch coordinates described above.
According to exemplary embodiments, the pages may be arranged in chronological order. For example, when a shooting time of a first photograph is earlier than that of a second photograph, the first photograph is placed on an earlier page than the second photograph. Further, the pages may be arranged by place. For example, a first page is configured with photographs shot in Seoul, and a second page is configured with photographs shot in New York. If the arrangement scheme of the contents is changed from “time” to “place” or vice versa by the user, the controller 160 may reconfigure the pages, and accordingly, at least one of an order of the currently displayed pages and a total number of the pages may be changed.
Similarly, if a format of the page is changed, the controller 160 reconfigures the pages, and accordingly, at least one of an order of the currently displayed pages and a total number of the pages may be changed. In detail, the number of contents included in one page may be changed.
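Such a reconfiguration can be illustrated by regrouping a photo collection into pages. The sketch below assumes photo objects with shot_time and place attributes and a fixed number of contents per page for the chronological scheme; all names are illustrative.

```python
from itertools import groupby

def build_pages(photos, scheme="time", per_page=9):
    """Regroups photos into pages by shooting time or by place.
    Changing 'scheme' or 'per_page' changes the page order and the
    total number of pages, as described above."""
    if scheme == "time":
        ordered = sorted(photos, key=lambda p: p.shot_time)
        # Earlier photographs land on earlier pages.
        return [ordered[i:i + per_page]
                for i in range(0, len(ordered), per_page)]
    # scheme == "place": one page per shooting location (e.g., Seoul).
    ordered = sorted(photos, key=lambda p: p.place)
    return [list(group) for _, group in groupby(ordered, key=lambda p: p.place)]
```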
The foregoing method for displaying contents according to exemplary embodiments may be implemented in the form of executable program commands by various computer components, and may be recorded in a computer readable recording medium. According to exemplary embodiments, the computer readable recording medium may include a program command, a data file, and a data structure, individually or in combination. The program command recorded in the recording medium may be specially designed or configured for the exemplary embodiments, or may be known to and usable by a person having ordinary skill in the computer software field. The computer readable recording medium may include magnetic media such as a hard disk, a floppy disk, or a magnetic tape, optical media such as a Compact Disc Read Only Memory (CD-ROM) or a Digital Versatile Disc (DVD), magneto-optical media such as a floptical disk, and hardware devices, such as a ROM, a RAM, and a flash memory, which store and execute program commands. Further, the program command may include machine language code created by a compiler and high-level language code executable by a computer using an interpreter. The foregoing hardware device may be configured to operate as at least one software module to perform operations according to the exemplary embodiments.
As described above, the contents display method and the mobile terminal according to the exemplary embodiments provide a highly realistic feeling to a user when the user operates a screen on which pages are displayed.
Although exemplary embodiments have been described in detail hereinabove, it should be clearly understood that many variations and modifications of the basic inventive concepts disclosed herein which may appear to those skilled in the present art will still fall within the spirit and scope of the exemplary embodiments, as defined in the appended claims.
This application is a continuation-in-part of U.S. patent application Ser. No. 13/739,777 filed Jan. 11, 2013 in the U.S. Patent and Trademark Office, which claims priority from Korean Patent Application No. 10-2012-0010106 filed Jan. 31, 2012, and Korean Patent Application No. 10-2012-0021310 filed Feb. 29, 2012, and claims priority from Korean Patent Application No. 10-2013-0009788 filed Jan. 29, 2013 in the Korean Intellectual Property Office, the entire disclosures of which are hereby incorporated by reference in their entirety.