Method and terminal for displaying an animation

Information

  • Patent Grant
  • Patent Number
    9,684,990
  • Date Filed
    Monday, December 23, 2013
  • Date Issued
    Tuesday, June 20, 2017
Abstract
A method for a terminal to display an animation, including: generating one or more supplementary image frames on a moving path between first and second adjacent original image frames of an animation; and displaying the animation with the generated one or more supplementary image frames at a predetermined frame rate.
Description
TECHNICAL FIELD

The present disclosure relates generally to an image processing technology and, more particularly, to a method and a terminal for displaying an animation.


BACKGROUND

Most operating systems supporting touch-screen devices, such as the Android operating system, the iOS operating system, or the Windows Phone operating system, use a conventional refresh mode, i.e., single-frame rendering, for graphical interfaces displayed on screens. If a graphical interface is refreshed under the conventional refresh mode at a rate of 60 frames/second, there will be an interval of more than 15 ms between each two adjacent image frames of the graphical interface. When an interface element, such as an icon or text, on a touch screen moves at a relatively fast speed under control of a user's finger, the interface element will travel a relatively large distance, for example, up to 0.5-3 cm, during the interval between two frames. Due to persistence of vision, the user will observe several discrete images, including intermittent afterimages, so that the movement of the graphical interface does not appear smooth or realistic.


SUMMARY

According to a first aspect of the present disclosure, there is provided a method for a terminal to display an animation, comprising: generating one or more supplementary image frames on a moving path between first and second adjacent original image frames of an animation; and displaying the animation with the generated one or more supplementary image frames at a predetermined frame rate.


According to a second aspect of the present disclosure, there is provided a terminal for displaying an animation, comprising: a processor; a touch screen; and a memory for storing instructions executable by the processor; wherein the processor is configured to: generate one or more supplementary image frames on a moving path between first and second adjacent original image frames of an animation; and display, on the touch screen, the animation with the generated one or more supplementary image frames at a predetermined frame rate.


According to a third aspect of the present disclosure, there is provided a non-transitory computer-readable medium having stored therein instructions that, when executed by a processor in a terminal, cause the terminal to perform: generating one or more supplementary image frames on a moving path between first and second adjacent original image frames of an animation; and displaying the animation with the generated one or more supplementary image frames at a predetermined frame rate.


It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.





BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and, together with the description, serve to explain the principles of the invention.



FIG. 1 is a flowchart of a method for a terminal to display an animation, according to an exemplary embodiment.



FIG. 2 is a diagram showing two adjacent original image frames of an animation, according to an exemplary embodiment.



FIG. 3 is a diagram showing two adjacent original image frames of an animation, according to an exemplary embodiment.



FIG. 4A is a diagram showing an animation displayed on a screen of a terminal when a conventional method is used.



FIG. 4B is a diagram showing an animation displayed on a screen of a terminal, according to an exemplary embodiment.



FIG. 5 is a block diagram of a terminal for displaying an animation, according to an exemplary embodiment.



FIG. 6 is a block diagram of a terminal for displaying an animation, according to an exemplary embodiment.





DETAILED DESCRIPTION

Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise represented. The implementations set forth in the following description of exemplary embodiments do not represent all implementations consistent with the invention. Instead, they are merely examples of apparatuses and methods consistent with aspects related to the invention as recited in the appended claims.



FIG. 1 is a flowchart of a method 100 for a terminal to display an animation, according to an exemplary embodiment. For example, the terminal may be a mobile terminal. Referring to FIG. 1, the method 100 includes step 101 and step 102. In step 101, before displaying an animation at a predetermined frame rate, the terminal generates one or more supplementary image frames on a moving path between each two adjacent original image frames of the animation. For example, the terminal determines the moving path between two image frames based on a moving path of a graphical interface element, such as an icon or text, in the two image frames. In step 102, the terminal displays the animation with the generated supplementary image frames at the predetermined frame rate. By using the method 100, intermittent afterimages that would otherwise appear between two adjacent original image frames may be removed or reduced, so that a realistic and coherent visual experience may be achieved.
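In outline, steps 101 and 102 might be sketched as follows; the helper names and the integer stand-ins for image frames are illustrative assumptions, not the patented implementation:

```python
# Minimal sketch of method 100: step 101 generates supplementary frames
# between each pair of adjacent original frames; step 102 would present
# the resulting sequence at the predetermined frame rate.

def display_animation(original_frames, frame_rate, generate_supplementary):
    """Interleave generated supplementary frames between adjacent originals."""
    sequence = []
    for prev, nxt in zip(original_frames, original_frames[1:]):
        sequence.append(prev)
        # Step 101: supplementary frames on the moving path between prev and nxt.
        sequence.extend(generate_supplementary(prev, nxt))
    sequence.append(original_frames[-1])
    # Step 102: the caller presents `sequence` at `frame_rate`; here we
    # simply return it.
    return sequence

# Example with integers standing in for frames and a midpoint "interpolator":
frames = display_animation([0, 10, 20], 60, lambda a, b: [(a + b) // 2])
# frames == [0, 5, 10, 15, 20]
```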


In exemplary embodiments, the terminal generates supplementary image frames from original image frames based on a multiple-drawing method and/or a natural exposure imitation transformation, so that the generated supplementary image frames may cancel the intermittent afterimages generated on the moving path between two adjacent original image frames. However, the method 100 is not so limited, and can generate supplementary image frames according to different animation display performance and quality requirements.


In exemplary embodiments, based on the multiple-drawing method, the terminal renders multiple drawings of a previous original image frame along the moving path between the previous original image frame and a subsequent original image frame of the animation, according to a playback timing of each image frame of the animation. For example, the terminal determines a transparency for each supplementary image frame to be generated based on the previous original image frame, according to a length of the moving path between the previous original image frame and the subsequent original image frame and pixels per inch (PPIs) of a graphical interface, and renders multiple drawings of the previous original image frame along the moving path with the determined transparency. In one exemplary embodiment, a relatively smooth and coherent visual experience is achieved at a frame rate of 24 frames/second or more.
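A minimal sketch of the multiple-drawing idea, under the simplifying assumptions of a grayscale frame stored as rows of floats, a purely horizontal moving path, and a constant per-copy opacity (the disclosure derives the transparency from the path length and the screen's PPI):

```python
def multiple_drawing(frame, path_offsets, opacity):
    """Composite several translucent copies of `frame` (a list of pixel
    rows, grayscale floats in [0, 1]) shifted horizontally along the
    moving path onto one canvas."""
    h, w = len(frame), len(frame[0])
    width = w + max(path_offsets)
    canvas = [[0.0] * width for _ in range(h)]
    for dx in path_offsets:
        for y in range(h):
            for x in range(w):
                # "Over" compositing of each shifted, attenuated copy.
                canvas[y][dx + x] = ((1 - opacity) * canvas[y][dx + x]
                                     + opacity * frame[y][x])
    return canvas

# Draw a 1x3 white frame at offsets 0, 2 and 4 pixels with 50% opacity
# per copy, producing a motion trail that is densest where copies overlap.
trail = multiple_drawing([[1.0, 1.0, 1.0]], [0, 2, 4], 0.5)
```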


In exemplary embodiments, based on the natural exposure imitation transformation, the terminal performs an image transformation, which may be implemented using image matrices, on pixels on the moving path between a previous original image frame and a subsequent original image frame of the animation, to obtain supplementary image frames, also according to a playback timing of each image frame of the animation. For example, with regard to the pixels on the moving path between the previous original image frame and the subsequent original image frame of the animation, the terminal performs a plurality of one-dimensional compression transformations on RGB color information and opacity information of the pixels based on a length of the moving path and locations of the pixels. Accordingly, the terminal compresses, at a predetermined compression ratio, the pixels to a plurality of one-dimensional images each with a transparency, thereby generating the supplementary image frames.


In one exemplary embodiment, an image matrix M having a size of m×n may be used to represent the pixels on the moving path between the previous original image frame and the subsequent original image frame. Further, it is assumed that a represents the length of the moving path; P(i, j) represents a value of RGB color information of a pixel located at row i and column j in the image matrix M; and P1(x, j) represents the same parameter of a pixel located at row x and column j after the one-dimensional compression transformation. The one-dimensional compression transformation may then be expressed as follows:


When a>m,

$$P_1(x,j)=\frac{1}{a}\sum_{i=0}^{x}P(i,j)\quad(0<x<m);\qquad\text{equation (1)}$$

$$P_1(x,j)=\frac{1}{a}\sum_{i=0}^{m}P(i,j)\quad(m<x<a);\qquad\text{equation (2)}$$

$$P_1(x,j)=\frac{1}{a}\sum_{i=x-a}^{m}P(i,j)\quad(a<x<a+m);\qquad\text{equation (3)}$$

When a<m,

$$P_1(x,j)=\frac{1}{a}\sum_{i=0}^{x}P(i,j)\quad(0<x<a);\qquad\text{equation (4)}$$

$$P_1(x,j)=\frac{1}{a}\sum_{i=x-a}^{x}P(i,j)\quad(a<x<m);\qquad\text{equation (5)}$$

$$P_1(x,j)=\frac{1}{a}\sum_{i=x-a}^{m}P(i,j)\quad(m<x<a+m).\qquad\text{equation (6)}$$

In the above illustrated embodiment based on the natural exposure imitation transformation, the generated supplementary image frames can have the same effect as a natural exposure of an original image frame within the time length of one frame, so that the visual experience is enhanced.
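Under the assumption that the summation bounds in equations (1)-(6) are inclusive, the six cases collapse into one windowed sum over a sliding range of rows, which can be sketched for a single grayscale column as follows (the helper name and the use of one channel are illustrative):

```python
def compress_column(P_col, a):
    """One-dimensional compression of one column of the image matrix M.
    P_col[i] holds the pixel value at row i (0 <= i <= m); `a` is the
    length of the moving path in pixels.  Returns P1[x] for x = 0..a+m,
    where each output value is a windowed sum of input rows divided by
    `a` -- a unified form of the six piecewise cases."""
    m = len(P_col) - 1
    P1 = []
    for x in range(a + m + 1):
        lo = max(0, x - a)   # lower summation bound (rows already "exposed")
        hi = min(x, m)       # upper summation bound (rows reached so far)
        P1.append(sum(P_col[lo:hi + 1]) / a)
    return P1

# A 3-pixel column (m = 2) smeared over a path of a = 4 pixels: the
# interior samples (2 <= x <= 4) accumulate all three pixels, 3/4 each,
# while the trail fades toward both ends.
out = compress_column([1.0, 1.0, 1.0], 4)
```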


In exemplary embodiments, the terminal has a touch screen. A user's finger sliding over a graphical interface displayed on the touch screen results in a movement of the graphical interface. When detecting the movement of the graphical interface, the terminal generates supplementary image frames on a moving path between each two adjacent original image frames, and renders and outputs the supplementary image frames using the method 100. The generated supplementary image frames each have a transparency relating to a moving speed or a moving distance. The moving distance is calculated by multiplying the moving speed by a refresh time determined according to a refresh rate of the graphical interface.



FIG. 2 and FIG. 3 are diagrams each showing two adjacent original image frames of an animation, according to exemplary embodiments. In FIG. 2, it is assumed that a previous image frame M0(1) and a subsequent image frame M1(2) corresponding to a same graphical interface element, such as a rectangular icon or text, do not overlap on the screen and are separated by a moving distance a which is larger than an image matrix length m. In FIG. 3, it is assumed that a previous image frame M0(3) and a subsequent image frame M1(4) corresponding to a same graphical interface element, such as a rectangular icon or text, overlap on the screen and are separated by a moving distance a which is smaller than the image matrix length m.


In one exemplary embodiment, the terminal performs the one-dimensional compression transformation to generate supplementary image frames. For example, for locations x1, x2, and x3 in FIG. 2, the terminal uses equations (1), (2), and (3), respectively, to perform the one-dimensional compression transformation, to generate first, second, and third one-dimensional images corresponding to the locations x1, x2, and x3, respectively. Also for example, for locations x1, x2, and x3 in FIG. 3, the terminal uses equations (4), (5), and (6), respectively, to perform the one-dimensional compression transformation, to generate first, second, and third one-dimensional images corresponding to the locations x1, x2, and x3, respectively.


Specifically, in the illustrated embodiment, the terminal performs the one-dimensional compression transformation on four variables, i.e., each component of the RGB color information and the opacity information, based on equations (1)-(3) (FIG. 2) or equations (4)-(6) (FIG. 3). Accordingly, the terminal obtains a series of one-dimensional images from x=0 to x=a+m, thereby generating the supplementary image frames. For example, for the pixels in the area corresponding to 0<x<x1 and 0<y<n in FIG. 2, the terminal compresses those pixels to a one-dimensional image represented by a dotted line (5) at x=x1.


In one exemplary embodiment, the terminal renders multiple drawings of the previous original image frame along the moving path to generate supplementary image frames. Each generated supplementary image frame has a determined transparency relating to a moving speed of the graphical interface. The transparency of a supplementary image frame is inversely proportional to the moving speed of the graphical interface, and is also determined by pixels per inch (PPIs) of the screen and a frame sequence.


Also referring to FIG. 2 and FIG. 3, when the graphical interface is sliding, the terminal records coordinates passed by the previous image frame M0(1) or M0(3), and renders multiple drawings along the moving path from the coordinate of the previous image frame M0(1) or M0(3) to the coordinate of the subsequent image frame M1(2) or M1(4). In one exemplary embodiment, the terminal renders multiple drawings every k points (pixels) according to different PPIs and distances, and each generated supplementary image frame has a determined transparency relating to the moving speed of the graphical interface. A relationship between the transparency of a supplementary image frame and a moving distance of this frame is obtained by adjusting an inverse proportion curve of k/a, where k is a predetermined constant and a represents the moving distance. In one exemplary embodiment, a shape of the curve is adjusted according to actual tests, so that a final image after the processing is close to the image formed from the original image frame by a natural exposure in the time length of one frame.
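The inverse-proportion relationship can be sketched as follows; the value of the constant k and the clamp to at most full opacity are illustrative assumptions rather than values from the disclosure, which tunes the curve by testing:

```python
def supplementary_transparency(moving_distance, k=8.0):
    """Transparency of a supplementary frame as an inverse proportion
    curve k/a of the moving distance `a` (in pixels) covered during one
    frame.  `k` is a predetermined constant; the clamp keeps the value
    in a displayable range for very small distances."""
    return min(1.0, k / max(moving_distance, 1e-9))

# A larger per-frame moving distance (i.e., a faster slide) yields a
# smaller k/a value for each supplementary copy.
assert supplementary_transparency(8) == 1.0
assert supplementary_transparency(32) == 0.25
```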



FIG. 4A is a diagram showing an animation displayed on a screen 401 of a terminal 400 when the conventional method of single-frame rendering is used. Referring to FIG. 4A, an example of a camera icon on a graphical interface shows the animation of a sliding process. When the user's finger slides over the touch screen 401, assuming that the touch screen 401 performs single-frame rendering at a refresh rate of 60 frames/second, there will be an interval of more than 15 ms between each two adjacent image frames. If the camera icon has a moving speed of about 0.3 m/s, the moving distance between two adjacent image frames is about 0.5 cm. Due to persistence of vision, the user will see intermittent afterimages appearing between two adjacent image frames, as shown in FIG. 4A.
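The figures quoted above can be checked with a line of arithmetic:

```python
# At 60 frames/second the inter-frame interval is 1/60 s ~= 16.7 ms,
# i.e., more than 15 ms; an icon moving at 0.3 m/s therefore covers
# roughly 0.5 cm between two adjacent image frames.
refresh_rate = 60.0                          # frames per second
interval_ms = 1000.0 / refresh_rate          # ~16.7 ms per frame
speed_m_per_s = 0.3                          # icon moving speed
distance_cm = speed_m_per_s * (interval_ms / 1000.0) * 100.0

assert interval_ms > 15.0
assert round(distance_cm, 1) == 0.5
```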



FIG. 4B is a diagram showing an animation displayed on the screen 401 of the terminal 400 when the method 100 (FIG. 1) is used. Referring to FIG. 4B, an example of a camera icon on a graphical interface shows the animation of a sliding process. By using the method 100, the terminal generates supplementary image frames to cancel intermittent afterimages generated on the moving path of two original image frames, so that the user will not see intermittent afterimages, as shown in FIG. 4B. As a result, a smooth sliding experience may be achieved.



FIG. 5 illustrates a block diagram of a terminal 500, according to an exemplary embodiment. Referring to FIG. 5, the terminal 500 includes a pre-processing module 501 and a display module 502.


In exemplary embodiments, the pre-processing module 501 is configured to generate supplementary image frames on a moving path between each two adjacent original image frames of the animation and the display module 502 is configured to display the animation with the generated supplementary image frames at a predetermined frame rate.


In exemplary embodiments, the pre-processing module 501 includes first and second pre-processing units. The first pre-processing unit is configured to generate the supplementary image frames on the moving path between each two adjacent original image frames of the animation based on the multiple-drawing method described above. The second pre-processing unit is configured to generate the supplementary image frames on the moving path between each two adjacent original image frames of the animation based on the natural exposure imitation transformation described above.


One skilled in the art will understand that multiple modules in the exemplary embodiments may be combined into one module, and one module may be divided into multiple modules. Each module may be implemented with software, hardware, or a combination of software and hardware.



FIG. 6 illustrates a block diagram of a terminal 600 for displaying an animation, according to an exemplary embodiment. Referring to FIG. 6, the terminal 600 includes a processor 602 and a touch screen 604. The terminal 600 also includes memory resources, represented by a memory 606, for storing data as well as for storing program instructions and otherwise facilitating operation of the processor 602.


In exemplary embodiments, there is provided a non-transitory storage medium including instructions, such as included in the memory 606, executable by the processor 602 in the terminal 600, for performing the above described animation displaying methods.


Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed here. This application is intended to cover any variations, uses, or adaptations of the invention following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.


It will be appreciated that the present invention is not limited to the exact construction that has been described above and illustrated in the accompanying drawings, and that various modifications and changes can be made without departing from the scope thereof. It is intended that the scope of the invention only be limited by the appended claims.

Claims
  • 1. A method for a terminal to display an animation, comprising: generating, by the terminal, one or more supplementary image frames on a moving path between first and second adjacent original image frames of an animation, based on a natural exposure imitation transformation; and displaying, by the terminal, the animation with the generated one or more supplementary image frames at a predetermined frame rate, wherein generating the one or more supplementary image frames based on the natural exposure imitation transformation includes: performing an image transformation on pixels on the moving path between the first original image frame and the second original image frame, to generate the one or more supplementary image frames according to a playback timing of each image frame of the animation; wherein performing the image transformation includes: performing a plurality of one-dimensional compression transformations on RGB color information of the pixels based on a length of the moving path and locations of the pixels; and wherein performing the plurality of one-dimensional compression transformations includes: compressing, based on the length of the moving path and the locations of the pixels, the pixels to a plurality of one-dimensional images each with a transparency, thereby to generate the one or more supplementary images.
  • 2. The method according to claim 1, further comprising: generating one or more supplementary image frames based on an original image frame multiple-drawing method by rendering multiple drawings of the first original image frame along the moving path according to the playback timing of each image frame of the animation.
  • 3. The method according to claim 1, wherein an image matrix M having a size of m×n is used to represent the pixels, m being a first positive integer and representing a first number of pixels corresponding to a first side of the image matrix M, n being a second positive integer and representing a second number of pixels corresponding to a second side of the image matrix M, and wherein a is a third positive integer and represents a third number of pixels corresponding to the length of the moving path, the method further comprising: when a>m, performing a calculation as follows:
  • 4. A terminal for displaying an animation, comprising: a processor; a touch screen; and a memory for storing instructions executable by the processor; wherein the processor is configured to: generate one or more supplementary image frames on a moving path between first and second adjacent original image frames of an animation, based on a natural exposure imitation transformation; and display, on the touch screen, the animation with the generated one or more supplementary image frames at a predetermined frame rate, wherein the processor is further configured to generate the one or more supplementary image frames based on the natural exposure imitation transformation by performing an image transformation on pixels on the moving path between the first original image frame and the second original image frame, to generate the one or more supplementary image frames according to a playback timing of each image frame of the animation; wherein the processor is further configured to perform a plurality of one-dimensional compression transformations on RGB color information of the pixels based on a length of the moving path and locations of the pixels to perform the image transformation; and wherein the processor is further configured to compress, based on the length of the moving path and the locations of the pixels, the pixels to a plurality of one-dimensional images each with a transparency, thereby to generate the one or more supplementary images.
  • 5. The terminal according to claim 4, wherein the processor is further configured to: generate one or more supplementary image frames based on an original image frame multiple-drawing method by rendering multiple drawings of the first original image frame along the moving path according to the playback timing of each image frame of the animation.
  • 6. The terminal according to claim 4, wherein an image matrix M having a size of m×n is used to represent the pixels, m being a first positive integer and representing a first number of pixels corresponding to a first side of the image matrix M, n being a second positive integer and representing a second number of pixels corresponding to a second side of the image matrix M, and wherein a is a third positive integer and represents a third number of pixels corresponding to the length of the moving path, the processor being further configured to: when a>m, perform a calculation as follows:
  • 7. A non-transitory computer-readable medium having stored therein instructions that, when executed by a processor in a terminal, cause the terminal to perform a method for displaying an animation, the method comprising: generating one or more supplementary image frames on a moving path between first and second adjacent original image frames of an animation, based on a natural exposure imitation transformation; and displaying the animation with the generated one or more supplementary image frames at a predetermined frame rate, wherein generating the one or more supplementary image frames based on the natural exposure imitation transformation includes: performing an image transformation on pixels on the moving path between the first original image frame and the second original image frame, to generate the one or more supplementary image frames according to a playback timing of each image frame of the animation; wherein performing the image transformation includes: performing a plurality of one-dimensional compression transformations on RGB color information of the pixels based on a length of the moving path and locations of the pixels; and wherein performing the plurality of one-dimensional compression transformations includes: compressing, based on the length of the moving path and the locations of the pixels, the pixels to a plurality of one-dimensional images each with a transparency, thereby to generate the one or more supplementary images.
Priority Claims (2)
Number Date Country Kind
2012 1 0324327 Sep 2012 CN national
2012 1 0461080 Nov 2012 CN national
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of International Application No. PCT/CN2013/079301, filed Jul. 12, 2013, which is based upon and claims priority of Chinese Patent Application No. 201210324327.5, filed Sep. 4, 2012, and of Chinese Patent Application No. 201210461080.1, filed Nov. 15, 2012, the entire contents of which are incorporated herein by reference.

US Referenced Citations (24)
Number Name Date Kind
6414685 Takakura Jul 2002 B1
7898542 Yu Mar 2011 B1
20030026422 Gerheim et al. Feb 2003 A1
20040252230 Winder Dec 2004 A1
20070103585 Takeuchi May 2007 A1
20080068359 Yoshida Mar 2008 A1
20080181312 Kimura Jul 2008 A1
20080181463 Error Jul 2008 A1
20080232705 Sohn Sep 2008 A1
20090080789 Shoji Mar 2009 A1
20090179898 Bond Jul 2009 A1
20100026904 Higuchi Feb 2010 A1
20100118037 Sheikh May 2010 A1
20100162092 Albu Jun 2010 A1
20100231534 Chaudhri et al. Sep 2010 A1
20100231535 Chaudhri et al. Sep 2010 A1
20100231536 Chaudhri et al. Sep 2010 A1
20100231537 Pisula et al. Sep 2010 A1
20100302409 Matas Dec 2010 A1
20110018880 Whited et al. Jan 2011 A1
20120147012 Lau Jun 2012 A1
20130076758 Li et al. Mar 2013 A1
20130091409 Jeffery Apr 2013 A1
20130135339 Saini May 2013 A1
Foreign Referenced Citations (13)
Number Date Country
101727282 Jun 2010 CN
101833447 Sep 2010 CN
102169594 Aug 2011 CN
102385473 Mar 2012 CN
102629460 Aug 2012 CN
102637107 Aug 2012 CN
103021007 Apr 2013 CN
2005160022 Jun 2005 JP
2005236472 Sep 2005 JP
2007272388 Oct 2007 JP
2011049952 Mar 2011 JP
2387013 Apr 2010 RU
WO 2008136116 Nov 2008 WO
Non-Patent Literature Citations (4)
Entry
“Inbetweening—Wikipedia, the free encyclopedia”, dated Sep. 3, 2012; retrieved from the Internet: https://en.wikipedia.org/w/index.php?title=Inbetweening&oldid=510655716 on Dec. 16, 2015 (2 pages).
Kawagishi Yuya et al., “Cartoon Blur: Non-Photorealistic Motion Blur”, Saitama University dated Apr. 19, 2002 (6 pages).
Office Action mailed by the Russian Patent Office on Feb. 17, 2016, in counterpart Russian Application No. 2014153030 and English translation thereof.
European Search Report for EP 13834614.3 from the European Patent Office, mailed Jan. 5, 2016.
Related Publications (1)
Number Date Country
20140111524 A1 Apr 2014 US
Continuations (1)
Number Date Country
Parent PCT/CN2013/079301 Jul 2013 US
Child 14139479 US