The present disclosure relates to Internet technologies and to a method and an apparatus of displaying data.
This section provides background information related to the present disclosure which is not necessarily prior art.
A scratch card is a small card, often made of thin paper-based card stock or plastic, on which one or more areas contain concealed information that can be revealed by scratching off an opaque covering. Applications include cards sold for gambling (especially lottery games and quizzes), free-of-charge cards for quizzes, and cards that conceal confidential information such as PINs (Personal Identification Numbers) for telephone calling cards and other prepaid services.
In the present disclosure, the concealed information contained in a scratch card is also referred to as target information.
This section provides a general summary of the disclosure, and is not a comprehensive disclosure of its full scope or all of its features.
Various examples of the present disclosure provide a method and an apparatus of displaying data to simulate covering target data and removing the covering from the target data in a terminal device.
According to various embodiments, a method of displaying data may include:
obtaining target data;
covering the target data with at least one covering layer;
displaying the target data being covered by the at least one covering layer; and
removing the at least one covering layer to reveal the target data after a trigger event for removing the at least one covering layer is detected.
According to various embodiments, an apparatus of displaying data may include:
an obtaining module, configured to obtain target data;
a covering module, configured to cover the target data by using at least one covering layer;
a displaying module, configured to display the target data being covered by the at least one covering layer; and
a covering removing module, configured to remove the at least one covering layer to reveal the target data after a trigger event for removing the at least one covering layer is detected.
According to various embodiments, a terminal device of displaying data may include the above apparatus.
According to various embodiments, a system of displaying data may include a server and a terminal device;
the terminal device may include the above apparatus;
the server is configured to send the target data to the apparatus.
Various embodiments of the present disclosure simulate covering target data and removing the covering over the target data.
Further areas of applicability will become apparent from the description provided herein. The description and various examples in this summary are intended for purposes of illustration and are not intended to limit the scope of the present disclosure.
The drawings described herein are for illustrative purposes of various embodiments and not all possible implementations, and are not intended to limit the scope of the present disclosure.
Features of the present disclosure are illustrated by way of example and are not limited in the following figures, in which like numerals indicate like elements.
Corresponding reference numerals indicate corresponding parts throughout the several views of the drawings.
Example embodiments will now be described more fully with reference to the accompanying drawings.
For simplicity and illustrative purposes, the present disclosure is described by referring mainly to an example thereof. In the following description, numerous details are set forth in order to provide a thorough understanding of the present disclosure. It will be readily apparent, however, that the present disclosure may be practiced without limitation to these details. In other instances, some methods and structures have not been described in detail so as not to unnecessarily obscure the present disclosure. As used herein, the term “includes” means includes but is not limited to, and the term “including” means including but not limited to. The term “based on” means based at least in part on. Quantities of an element, unless specifically mentioned, may be one, a plurality, or at least one.
According to various embodiments of the present disclosure, the user terminal device may be a computing device capable of executing the methods and software systems described herein.
The computing device 200 may vary in terms of capabilities or features. Claimed subject matter is intended to cover a wide range of potential variations. For example, the computing device 200 may include a keypad/keyboard 256. It may also comprise a display 254, such as a liquid crystal display (LCD), or a display with a high degree of functionality, such as a touch-sensitive color 2D or 3D display. As another example, a web-enabled computing device 200 may include one or more physical or virtual keyboards and a mass storage medium 230.
The computing device 200 may also include or may execute a variety of operating systems 241, such as a desktop operating system, e.g., Windows™ or Linux™, or a mobile operating system, e.g., iOS™, Android™, or Windows Mobile™. The computing device 200 may include or run various applications 242. An application 242 is capable of implementing the method of displaying data of various embodiments of the present disclosure.
Further, the computing device 200 may include one or more non-transitory processor-readable storage medium 230 and one or multiple processors 222 in communication with the non-transitory processor-readable storage medium 230. For example, the non-transitory processor-readable storage medium 230 may be a RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory storage medium known in the art. The one or more non-transitory processor-readable storage medium 230 may store sets of instructions, or units and/or modules that comprise the sets of instructions, for conducting operations described in the present disclosure. The one or multiple processors may be configured to execute the sets of instructions and perform the operations according to various embodiments of the present disclosure.
Various embodiments of the present disclosure implement virtual bearing of target data and simulate removing the covering over the target data to reveal the target data in a terminal device, so that the target data does not have to be borne on a physical object.
At block S31, target data, i.e., the information to be concealed, is obtained. According to various embodiments, a request may be sent to a background server. After receiving the request, the background server may calculate the target data by using a pre-defined data processing algorithm and return the target data obtained. Receiving the target data returned by the background server completes the procedure of obtaining the target data at block S31.
According to various embodiments, the target data returned may be encrypted to guarantee data safety. After the target data is received from the background server at block S31, the encrypted target data may be decrypted accordingly. The encryption and decryption may adopt existing algorithms, e.g., DES (Data Encryption Standard) or the like, and this is not limited in the present disclosure.
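By way of a non-limiting sketch, obtaining and decrypting the target data in a web application might look as follows. The endpoint name and response format are assumptions for illustration only, and AES-GCM is used in place of the DES example above because the browser Web Crypto API does not expose DES:

```typescript
// Sketch only: fetch encrypted target data from a hypothetical background
// server endpoint and decrypt it in the browser. AES-GCM stands in for the
// DES example above, since the Web Crypto API does not provide DES.
async function obtainTargetData(key: CryptoKey): Promise<string> {
  const res = await fetch("/api/target-data");          // hypothetical endpoint
  const { iv, ciphertext } = await res.json();          // assumed base64-encoded fields

  const toBytes = (b64: string): Uint8Array =>
    Uint8Array.from(atob(b64), (c) => c.charCodeAt(0));

  const plainBuf = await crypto.subtle.decrypt(
    { name: "AES-GCM", iv: toBytes(iv) },
    key,
    toBytes(ciphertext)
  );
  return new TextDecoder().decode(plainBuf);            // e.g. "You won a prize"
}
```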
When the method is applied to lottery services, the data processing algorithm according to various embodiments may be an algorithm for calculating a probability of winning the lottery based on the amount of prizes, and the target data obtained from the algorithm may indicate winning a prize, not winning a prize, the type or the name of the prize, and the like. When the method is applied to prepaid services, the target data may be a PIN (Product/Personal Identification Number), or the like. When the method is applied to bankcard services, the target data may be a bankcard number, an initial password, or the like.
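As an illustrative sketch of such a data processing algorithm on the background server side, the following chooses target data from a prize table; the prize names and probabilities are assumptions and not part of the disclosure:

```typescript
// Sketch only: choose the target data for a lottery service based on
// pre-defined winning probabilities. Prize names and probabilities are
// illustrative assumptions.
interface Prize {
  name: string;
  probability: number; // chance of winning this prize, in [0, 1]
}

function calculateTargetData(prizes: Prize[]): string {
  const r = Math.random();
  let cumulative = 0;
  for (const prize of prizes) {
    cumulative += prize.probability;
    if (r < cumulative) return `Winner: ${prize.name}`;
  }
  return "Not a winner"; // indicates not winning a prize
}

// Usage (hypothetical prize table):
// calculateTargetData([{ name: "Free ticket", probability: 0.1 },
//                      { name: "Grand prize", probability: 0.001 }]);
```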
At block S32, the target data is covered with at least one covering layer. According to various embodiments, a canvas may be placed over the target data and serve as the covering layer. The canvas is opaque. According to various embodiments, at least two covering layers may be placed over the target data. Each of the covering layers may be opaque or partly opaque. According to various embodiments, each of the at least two covering layers covers part of the area over the target data, and the at least two covering layers may be placed in a pre-defined manner so that the target data is completely concealed. The multiple covering layers may be arranged vertically or horizontally, and the arrangement is not limited in the present disclosure.
At block S33, the target data being covered by the at least one covering layer is displayed. According to various embodiments, when the method is implemented in a terminal device supporting web applications, the target data being covered by the at least one covering layer may be displayed in a webpage. The webpage may include various contents in addition to the covered target data, e.g., other page elements. Taking a lottery ticket as an example, the covered prize information may be displayed in the webpage together with such other elements.
Displaying the target data in a webpage may include defining a covering area as the area where the target data is displayed, initiating a command for loading the elements of the webpage other than the target data and the at least one covering layer, and displaying the target data being covered by the at least one covering layer together with the elements loaded.
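A minimal sketch of blocks S32 and S33 under the canvas approach described above is shown below; the element, color, and positioning details are illustrative assumptions:

```typescript
// Sketch only: cover the webpage element that displays the target data with
// an opaque canvas layer placed over the same area (the covering area).
function coverTargetData(targetEl: HTMLElement): HTMLCanvasElement {
  const rect = targetEl.getBoundingClientRect();

  const layer = document.createElement("canvas");
  layer.width = rect.width;
  layer.height = rect.height;
  layer.style.position = "absolute";
  layer.style.left = `${rect.left + window.scrollX}px`;
  layer.style.top = `${rect.top + window.scrollY}px`;

  // Paint the layer fully opaque so the target data underneath is concealed.
  const ctx = layer.getContext("2d")!;
  ctx.fillStyle = "#c0c0c0";
  ctx.fillRect(0, 0, layer.width, layer.height);

  document.body.appendChild(layer);
  return layer;
}

// Usage: const layer = coverTargetData(document.getElementById("prize")!);
```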
At block S34, the at least one covering layer is removed to reveal the target data after a trigger event for removing the at least one covering layer is detected. Taking a lottery application as an example, when the covering layer in the covering area is removed from the webpage, the prize information underneath is revealed.
The at least one covering layer may be removed in any of various manners, e.g., manners A to D described below. According to manner A, the process of removing the at least one covering layer may include the following procedures. In procedure A1, a “scratching” action performed on the covering layer is detected. The “scratching” action according to various embodiments is an imitation of a scratching action performed on a physical card to remove an opaque covering. This procedure involves monitoring whether a pre-defined action occurs in the covering area where the at least one covering layer is placed. The pre-defined action, referred to herein as the “scratching” action, may be defined before the procedure in block S34 is performed, and functions for detecting the pre-defined action are added to the apparatus or device implementing the method. Procedure A1 may determine that the pre-defined action occurs when an action satisfying a pre-defined condition is detected in real time.
In procedure A2, part of the at least one covering layer is removed according to a position, a gesture, and a speed of the action detected. For example, based on a “scratching” position, a “scratching” gesture, and a “scratching” speed of the detected “scratching” action, the covering layer is removed bit by bit until the at least one covering layer over the target data is completely removed. For example, if the “scratching” gesture is moving up and down, scratch marks are displayed at the positions where the “scratching” takes place, at a speed consistent with the “scratching” speed, until the covering layer is completely removed. In this example, the position of the “scratching” changes in real time until the covering layer is completely removed from the target data.
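The following sketch illustrates manner A under the assumption of an HTML5 canvas covering layer and pointer events; the scratch-mark width and its dependence on the detected speed are illustrative choices:

```typescript
// Sketch of manner A: monitor a simulated "scratching" action over the covering
// canvas and erase the layer bit by bit along the path of the action.
function enableScratching(layer: HTMLCanvasElement): void {
  const ctx = layer.getContext("2d")!;
  ctx.globalCompositeOperation = "destination-out"; // drawing now erases pixels
  let scratching = false;
  let lastX = 0, lastY = 0, lastTime = 0;

  layer.addEventListener("pointerdown", (e) => {
    scratching = true;
    lastX = e.offsetX; lastY = e.offsetY; lastTime = e.timeStamp;
  });

  layer.addEventListener("pointermove", (e) => {
    if (!scratching) return;
    // Approximate the "scratching" speed from successive positions (illustrative).
    const dt = Math.max(e.timeStamp - lastTime, 1);
    const dist = Math.hypot(e.offsetX - lastX, e.offsetY - lastY);
    const radius = 10 + Math.min((dist / dt) * 20, 20); // faster scratch, wider mark

    ctx.beginPath();
    ctx.lineWidth = radius * 2;
    ctx.lineCap = "round";
    ctx.moveTo(lastX, lastY);
    ctx.lineTo(e.offsetX, e.offsetY);
    ctx.stroke();

    lastX = e.offsetX; lastY = e.offsetY; lastTime = e.timeStamp;
  });

  layer.addEventListener("pointerup", () => { scratching = false; });
}
```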
Manner B is mainly applied to a device having a pressure sensitive surface, and may present different removing effects for different strengths of “scratching” actions perceived. According to manner B, the process of removing the at least one covering layer may include the following procedures. In procedure B1, a pre-defined action, e.g., a simulated “scratching” action, is detected within a covering area wherein the at least one covering layer is placed. The procedure B1 is similar to the above procedure A1, and thus, is not described further.
In procedure B2, a length of the to-be-removed part of the covering layer is determined according to the strength of the action. According to various embodiments, a relation between the strength of the “scratching” action and the length of the to-be-removed part may be pre-defined to facilitate this determination: if the strength is large, the part to be removed is relatively longer, and vice versa.
In procedure B3, part of the covering layer is removed based on the position where the action takes place, and the length of the part removed equals the length determined. For example, the covering layer may be removed by the determined length starting from the position where the “scratching” action takes place, or a portion of the covering layer located around a center may be removed by the determined length, where the center is the position where the action takes place. The position where the action takes place changes in real time until the covering layer is removed completely from the target data.
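A sketch of manner B is shown below, assuming a device that reports strength through PointerEvent.pressure; the linear mapping from strength to removed length and the shape of the removed part are illustrative assumptions:

```typescript
// Sketch of manner B: on a pressure-sensitive surface, erase a length of the
// covering layer proportional to the strength of the "scratching" action.
// The linear pressure-to-length mapping below is an illustrative assumption.
function enablePressureScratching(layer: HTMLCanvasElement): void {
  const ctx = layer.getContext("2d")!;
  ctx.globalCompositeOperation = "destination-out"; // drawing erases the layer

  layer.addEventListener("pointermove", (e) => {
    if (e.pressure === 0) return;               // no contact or no pressure data
    const length = 20 + e.pressure * 80;        // stronger press -> longer removed part
    // Remove a horizontal strip of the determined length centered on the action.
    ctx.fillRect(e.offsetX - length / 2, e.offsetY - 10, length, 20);
  });
}
```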
According to manner C, the process of removing the covering layer may include the following procedures.
At block S71, a dragging event is detected in an area where the covering layer is placed. Before the procedure in block S71 is performed, functions of detecting a dragging event may be added to the apparatus implementing the method, and the dragging event is thus detected at block S71 by the functions added.
At block S72, each position traversed by the dragging event is obtained and recorded when the dragging event is detected. The dragging event is an event in which the position changes dynamically. Taking a touch device as an example, after it is detected that a user controlled mark, e.g., a finger or a cursor, has changed its position within the covering area, it is determined that a dragging event occurs, and the positions traversed by the dragging event are obtained and recorded. According to various embodiments, the terminal device may also be a device without a touch screen; the mechanism is similar to that described above, and thus, is not elaborated further herein. Since the dragging event occurs in the covering area, each position traversed by the dragging event and recorded at block S72 is a position within the covering area.
At block S73, each of the recorded positions is converted into a pixel of the canvas. Since each of the positions recorded in block S72 is a position within the covering area, the procedure of block S73 converts each position in the covering area that is traversed in the dragging process into a pixel in the canvas.
At block S74, the transparency value of each of the pixels obtained at block S73 is modified to be 0, i.e., the alpha value of the pixel in the canvas is set to 0 so that the pixel becomes fully transparent. Setting the alpha value of a pixel in the canvas to 0 has the same effect as removing that pixel of the covering layer from the covering area.
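A minimal sketch of blocks S71 to S74 is shown below, assuming a canvas covering layer such as the one sketched earlier; the size of the region cleared around each traversed position and the event names are illustrative choices:

```typescript
// Sketch of manner C: record each position traversed by the dragging event,
// convert it into canvas pixels, and set the alpha of those pixels to 0.
function enableDragErasing(layer: HTMLCanvasElement): void {
  const ctx = layer.getContext("2d")!;
  let dragging = false;
  const traversed: Array<{ x: number; y: number }> = [];

  layer.addEventListener("pointerdown", () => { dragging = true; });
  layer.addEventListener("pointerup", () => { dragging = false; });

  layer.addEventListener("pointermove", (e) => {
    if (!dragging) return;
    // Block S72: record the position traversed by the dragging event.
    traversed.push({ x: Math.floor(e.offsetX), y: Math.floor(e.offsetY) });

    // Blocks S73/S74: convert the position into canvas pixels and set their
    // alpha to 0. A small square is cleared around each point so the effect
    // is visible (the square size is an illustrative choice).
    const size = 16;
    const x = Math.max(0, Math.floor(e.offsetX) - size / 2);
    const y = Math.max(0, Math.floor(e.offsetY) - size / 2);
    const img = ctx.getImageData(x, y, size, size);
    for (let i = 3; i < img.data.length; i += 4) {
      img.data[i] = 0; // alpha channel -> fully transparent
    }
    ctx.putImageData(img, x, y);
  });
}
```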
According to manner D, at least two covering layers are placed over the target data, and the process of removing the covering layers may include the following procedures.
At block S81, when a trigger event for removing the covering layers is detected, a position of a user controlled mark, e.g., a finger or a cursor, is obtained and recorded as a starting position. According to various embodiments, when it is detected that a user controlled mark, e.g., a finger or a cursor, is placed in a covering area where the at least two covering layers are placed, it is determined that a trigger event for removing the covering layers is detected, and the current position of the user controlled mark is obtained and recorded as a starting position.
At block S82, a dragging event triggered by the user controlled mark in the covering area is detected and positions traversed by the user controlled mark during the dragging event are obtained and used for updating an ending position. According to various embodiments, when it is detected that the position of the user controlled mark, e.g., a finger or a cursor, changes in the covering area, it is determined that a dragging event in the covering area is initiated by the user controlled mark and a current position of the user controlled mark is obtained and recorded as the ending position.
At block S83, one of the covering layers in the covering area is removed according to the starting position and the ending position. According to various embodiments, the procedure in block S83 may include: calculating a distance the dragging event has traversed in the covering area by using the starting position and the ending position; judging whether the distance is greater than a pre-defined threshold; removing the topmost covering layer among the at least two covering layers from the covering area if the distance is greater than the threshold; updating the starting position with the ending position, e.g., setting the value of the starting position to be the value of the ending position; obtaining a different position of the user controlled mark in the covering area and recording the different position as the ending position if it is detected that the dragging event continues; and repeating the above removing process to remove the current topmost covering layer from the covering area.
In the above process, it may be judged whether there is still a covering layer over the target data and the removing process may continue if there is the covering layer, or the removing process is ended if there is no remaining covering layer.
According to various embodiments, the procedure in block S83 may include the following procedures.
Procedure I: A dragging distance traversed by the user controlled mark in the covering area is calculated by using the ending position and the starting position, and it is judged whether the dragging distance is greater than a pre-defined threshold. If the dragging distance is greater than the pre-defined threshold, the topmost covering layer over the target data is removed and procedure II is performed; otherwise, procedure III is performed. According to various embodiments, the dragging distance may be calculated by calculating a difference between the ending position and the starting position and taking the difference obtained as the dragging distance traversed by the user controlled mark in the covering area.
The threshold in procedure I may be determined according to actual needs, e.g., it may be a value indicating the sensitivity of the covering area.
Procedure II: It is determined whether there is a covering layer over the target data. Procedure III is performed if there is a covering layer over the target data, or the removing process is terminated if there is no covering layer.
Procedure III: The starting position is updated to be the ending position, a current position of the user controlled mark is obtained and recorded as the ending position if it is detected that the user controlled mark is still performing the dragging action, and procedure I is then performed.
Since the position of the user controlled mark is changing dynamically when the user controlled mark is performing a dragging action, the starting position is updated to be the ending position in procedure III, and the ending position is then updated by obtaining the current position of the user controlled mark which keeps on performing the dragging in the covering area. Procedure I is then performed and the process may be repeated until all of the covering layers in the covering area are removed.
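A sketch of blocks S81 to S83 and procedures I to III is shown below, assuming the at least two covering layers are represented as a stack of canvas elements; the threshold value and event names are illustrative assumptions:

```typescript
// Sketch of manner D: at least two covering layers are stacked over the target
// data; each time the dragging distance exceeds a threshold, the topmost layer
// is removed and the starting position is updated to the ending position.
function enableLayerPeeling(
  coveringArea: HTMLElement,
  layers: HTMLCanvasElement[],   // topmost layer last; representation assumed
  threshold = 60                 // sensitivity of the covering area (assumed value)
): void {
  let start: { x: number; y: number } | null = null;

  // Block S81: record the starting position when the mark enters the area.
  coveringArea.addEventListener("pointerdown", (e) => {
    start = { x: e.clientX, y: e.clientY };
  });

  // Blocks S82/S83: update the ending position while dragging and remove the
  // topmost covering layer whenever the traversed distance exceeds the threshold.
  coveringArea.addEventListener("pointermove", (e) => {
    if (!start || layers.length === 0) return;
    const end = { x: e.clientX, y: e.clientY };
    const distance = Math.hypot(end.x - start.x, end.y - start.y);

    if (distance > threshold) {
      layers.pop()!.remove();    // procedure I: remove the topmost layer
      start = end;               // procedure III: starting position := ending position
    }
  });

  coveringArea.addEventListener("pointerup", () => { start = null; });
}
```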
In the above process, according to various embodiments, an ending event for terminating the removing process may be added for the covering area to dynamically terminate the monitoring of dragging events in the covering area. It may be monitored whether the ending event occurs; when the ending event is triggered by the user controlled mark, the monitoring of dragging events in the covering area is terminated.
Taking a touch device as an example of the terminal device, when it is detected that the user controlled mark leaves the covering area, it is determined that an ending event for terminating the removing process is triggered, and the monitoring of dragging events in the covering area is stopped.
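A brief sketch of such an ending event, assuming pointer events and illustrative handler names:

```typescript
// Sketch: terminate the monitoring of dragging events when the user controlled
// mark leaves the covering area (handler names are illustrative).
function watchEndingEvent(
  coveringArea: HTMLElement,
  onDrag: (e: PointerEvent) => void
): void {
  coveringArea.addEventListener("pointermove", onDrag);
  coveringArea.addEventListener("pointerleave", () => {
    // Ending event: the mark left the covering area, so stop monitoring drags.
    coveringArea.removeEventListener("pointermove", onDrag);
  });
}
```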
According to any of manners A, B, C, and D, a trigger event for removing the covering may be defined in advance, and it is then monitored whether the trigger event for removing the covering occurs. For example, in a touch device, it may be defined that when it is detected for the first time that a user controlled mark is placed in the covering area, the trigger event for removing the covering occurs.
According to various embodiments, the apparatus of displaying data may include an obtaining module configured to obtain target data, a covering module configured to cover the target data with at least one covering layer, a displaying module configured to display the target data being covered by the at least one covering layer, and a covering removing module 94 configured to remove the at least one covering layer to reveal the target data after a trigger event for removing the at least one covering layer is detected.
According to various embodiments, the covering removing module 94 may be implemented by any of the following structures.
According to one structure, corresponding to manner A, the covering removing module may include a first monitoring unit and a first removing unit 112. The first monitoring unit may detect a pre-defined action, e.g., a simulated “scratching” action, within the covering area where the at least one covering layer is placed.
The first removing unit 112 may remove part of the at least one covering layer according to a position, a gesture, and a speed of the action. For example, based on a “scratching” position, a “scratching” gesture, and a “scratching” speed of the “scratching” action, the first removing unit 112 may remove the at least one covering layer bit by bit until the at least one covering layer is completely removed from the target data.
Other structures of the covering removing module correspond to manner B and manner C described above. A structure corresponding to manner B may include units that detect the pre-defined action within the covering area, determine the length of the to-be-removed part of the covering layer according to the strength of the action, and remove part of the covering layer by the determined length based on the position where the action takes place. A structure corresponding to manner C may include units that detect a dragging event in the covering area, record each position traversed by the dragging event, convert each recorded position into a pixel of the canvas, and modify the transparency of the pixels as described at blocks S71 to S74.
According to yet another structure, corresponding to manner D, the covering removing module may include a fourth monitoring unit 141, a second recording unit 142, a fifth monitoring unit 143, and a fourth removing unit 144. The fourth monitoring unit 141 may detect, within the covering area, a trigger event for removing the at least two covering layers.
The second recording unit 142 may obtain a position of the user controlled mark within the covering area and record the position as a starting position when the fourth monitoring unit 141 detects the trigger event, obtain positions traversed by the user controlled mark during a dragging event when the fifth monitoring unit 143 detects the dragging event, and update an ending position with the positions obtained. According to various embodiments, the second recording unit 142 may trigger the fourth removing unit 144 to perform a removing process each time the ending position is updated.
The fifth monitoring unit 143 may detect a dragging event triggered by the user controlled mark in the covering area.
The fourth removing unit 144 may remove a covering layer of the at least one covering layer in the covering area by using the starting position and the ending position. According to various embodiments, after receiving a trigger from the second recording unit 142, the fourth removing unit 144 may remove the covering layer in the covering area by using the starting position and the ending position recorded by the second recording unit 142.
According to various embodiments, the fourth removing unit 144 may include:
a removing unit, which may calculate a dragging distance traversed by the user controlled mark in the covering area by using the starting position and the ending position recorded by the second recording unit 142, judge whether the dragging distance is greater than a pre-defined threshold, and remove the topmost covering layer over the target data if the dragging distance is greater than the pre-defined threshold;
an updating unit, which may update the starting position with the ending position recorded by the second recording unit 142.
According to various embodiments, the fourth removing unit 144 may include the following units.
A removing unit is capable of calculating a dragging distance traversed by the user controlled mark in the covering area by using the ending position and the starting position, and judging whether the dragging distance is greater than a pre-defined threshold. If the dragging distance is greater than the pre-defined threshold, the removing unit removes the topmost covering layer over the target data and sends a first instruction to a judging unit; otherwise, the removing unit sends a second instruction to the updating unit.
The judging unit is capable of receiving the first instruction, judging whether there is a covering layer over the target data, and sending a third instruction to the updating unit if there is a covering layer, or terminating the removing process if there is not.
The updating unit is capable of updating the starting position with the ending position after receiving the second instruction or the third instruction, obtaining a position of the user controlled mark in the covering area when it is detected that the dragging event is continuing, recording the position as the ending position in the second recording unit 142, and triggering the removing unit to perform the removing procedure.
Various examples also provide a terminal device of displaying data, which is capable of simulating the removing of covering over data. The terminal device includes the above apparatus, and will not be described further herein.
Various examples also provide a system of displaying data, which is capable of simulating removing of covering over data. The system may include a server and a terminal device. The terminal device includes the above apparatus. The server is capable of providing the target data for the apparatus.
Various embodiments cover target data with at least one covering layer, display the covered target data, and remove the covering over the target data to reveal the target data when a trigger event for removing the covering is detected, thereby simulating, in a terminal device, the covering of target data on a virtual bearer and the removing of the covering over the target data.
In the above processes and structures, not all of the procedures and modules are necessary. Certain procedures or modules may be omitted according to certain requirements. The order of the procedures is not fixed and can be adjusted according to the requirements. The modules are defined based on function simply to facilitate description. In implementation, a module may be implemented by multiple modules, and the functions of multiple modules may be implemented by the same module. The modules may reside in the same device or be distributed among different devices. The terms “first” and “second” in the above descriptions are merely for distinguishing two similar objects and have no substantial meaning.
According to various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. The decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry or in temporarily configured circuitry (e.g., configured by software), may be driven by cost and time considerations.
A machine-readable storage medium is also provided, which stores instructions to cause a machine to execute a method as described herein. A system or apparatus having a storage medium that stores machine-readable program codes for implementing functions of any of the above examples may cause the system or apparatus (or a CPU or MPU thereof) to read and execute the program codes stored in the storage medium. In addition, instructions of the program codes may cause an operating system running in a computer to implement part or all of the operations. In addition, the program codes read from the storage medium may be written into a storage device in an extension board inserted in the computer or into a storage unit in an extension unit connected to the computer. In this example, a CPU in the extension board or the extension unit executes at least part of the operations according to the instructions based on the program codes to realize the technical scheme of any of the above examples.
The storage medium for providing the program codes may include floppy disk, hard drive, magneto-optical disk, compact disk (such as CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-RAM, DVD-RW, DVD+RW), magnetic tape drive, Flash card, ROM and so on. Optionally, the program code may be downloaded from a server computer via a communication network.
The scope of the claims should not be limited by the various embodiments, but should be given the broadest interpretation consistent with the description as a whole.
The foregoing description of the embodiments has been provided for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure. Individual elements or features of a particular embodiment are generally not limited to that particular embodiment, but, where applicable, are interchangeable and can be used in a selected embodiment, even if not shown or described. The same may also be varied in many ways. Such variations are not to be regarded as a departure from the disclosure, and all such modifications are intended to be included within the scope of the disclosure.
Reference throughout this specification to “one embodiment,” “an embodiment,” “specific embodiment,” or the like in the singular or plural means that one or more particular features, structures, or characteristics described in connection with an embodiment is included in at least one embodiment of the present disclosure. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment,” “in a specific embodiment,” or the like in the singular or plural in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
This application is a continuation of International Application No. PCT/CN2014/070269, filed Jan. 8, 2014. This application claims the benefit and priority of Chinese Application No. 201310013886.9, filed Jan. 15, 2013. The entire disclosures of each of the above applications are incorporated herein by reference.