Animation is a popular way to provide content, whether for artistic, commercial, or other purposes. Runtime applications, such as those executed using the Adobe® Flash® player (available from Adobe Systems Incorporated of San Jose, Calif.), are an effective option for distributing animated content via the Internet. For instance, a runtime application can comprise code that, when executed in a corresponding runtime environment, presents a desired animation sequence, such as one or more animated objects that move and/or otherwise change in a time-varying manner on a stage in an interface rendered by the runtime application. However, runtime-based animation may not always be available—for example, certain devices or platforms may not support use of a runtime environment. Developers may nonetheless wish to provide animated content for such platforms, such as animated content for use with rendering applications (e.g., browsers) that can provide animation but cannot work with the runtime environment.
A computerized device includes a hardware interconnect and a data processing hardware element (e.g., a processor and/or hardware logic) interfaced to the hardware interconnect. The data processing hardware element implements an animation coding engine to analyze timeline data defining an animation sequence and generate a code package representing the animation sequence as a set of visual assets and animation primitives supported by a rendering application, each visual asset associated with a corresponding animation primitive. The generated code package is stored via the hardware interconnect in local or remote storage. The code package is generated to include suitable code that, when processed by the rendering application, causes the rendering application (e.g., browser) to invoke the corresponding animation primitive for each visual asset to provide the animation sequence.
These illustrative embodiments are discussed not to limit the present subject matter, but to provide a brief introduction. Additional embodiments include computer-readable media embodying an application configured in accordance with aspects of the present subject matter to provide an animation coding engine. Embodiments also include computer-implemented methods for generating code packages that can be processed by a rendering application to provide animation based on invoking animation primitives native to the rendering application. These and other embodiments are described below in the Detailed Description. Objects and advantages of the present subject matter can be determined upon review of the specification and/or practice of an embodiment configured in accordance with one or more aspects taught herein.
A full and enabling disclosure is set forth more particularly in the remainder of the specification. The specification makes reference to the following appended figures.
Presently-disclosed embodiments include computing systems, methods, and computer-readable media embodying code. Turning to
The term “code package” is used to indicate that, although the animation sequence is the same, the output of the animation coding engine is not independently executable. Rather, the animation coding engine provides a code package comprising one or more files that invoke native animation capabilities of the rendering application when the rendering application interprets or otherwise acts based on the contents of the code package.
For example, the code package may comprise markup code 110 referencing visual assets of the animation sequence and a stylesheet 112 that defines animation primitives as styles applied to visual assets. When a rendering application renders the visual assets, animation capabilities of the rendering application are invoked according to the styles applied to the respective visual assets. As a particular example, the stylesheet may be a Cascading Style Sheet (CSS) 112 formatted according to CSS3 standards, with the stylesheet referenced by or included in an HTML file specifying elements such as bicycle 103's body 103A, rear wheel 103B, crank 103C, and front wheel 103D as separate image files.
Animation coding engine 104 is capable of decomposing complex animation sequences into an arrangement of animation primitives supported by the browser to thereby take advantage of the browser's native rendering capabilities without the need for the browser to itself support highly complex animation commands. Instead, the timing and arrangement of animation primitives is orchestrated by code package 108. Additionally, the structure of the HTML file can be used to drive the animation according to parameters specified in the style sheet. Because the HTML elements are handled directly by the rendering application's HTML parser, the resulting animated elements are handled in a stand-alone manner, which makes for smoother animations and easier editing as compared to other approaches, such as including animation metadata interpreted by JavaScript or other intermediate approaches.
For example, bicycle 103 can be represented using separate image files for each of components 103A, 103B, 103C, and 103D. Appropriate CSS3 animation parameters can be defined as a style applied to element 103A to translate body 103A from left to right. Styles applied to elements 103B and 103D can be used to rotate the wheels of the bicycle and translate the wheels in conjunction with body 103A. An additional style can be used to provide independent rotation of crank 103C while also translating crank 103C across the stage.
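The decomposition above can be sketched, purely for illustration, as a small generator that emits CSS3-style rules for the bicycle's visual assets. The class names, pixel distance, and durations below are assumptions introduced for this sketch and are not part of the specification:

```javascript
// Hypothetical sketch: emit CSS3 rules that translate the bicycle body
// across the stage and spin the wheels and crank while they translate.
// Class names, distances, and durations are illustrative assumptions.
function emitBicycleStyles(distancePx, durationS) {
  const translate = `@keyframes ride { from { transform: translateX(0); } to { transform: translateX(${distancePx}px); } }`;
  const spin = `@keyframes spin { from { transform: rotate(0deg); } to { transform: rotate(360deg); } }`;
  return [
    translate,
    spin,
    // Body 103A only translates.
    `.body-103A { animation: ride ${durationS}s linear; }`,
    // Wheels 103B/103D and crank 103C rotate independently while
    // translating in conjunction with the body.
    `.wheel-103B, .crank-103C, .wheel-103D { animation: ride ${durationS}s linear, spin 1s linear infinite; }`,
  ].join('\n');
}
```

A separate style per element, as here, is what lets each visual asset be animated individually while the shared `ride` keyframes keep the parts moving in concert.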
Reference will now be made in detail to various and alternative exemplary embodiments and to the accompanying drawings. Each example is provided by way of explanation, and not as a limitation. It will be apparent to those skilled in the art that modifications and variations can be made. For instance, features illustrated or described as part of one embodiment may be used on another embodiment to yield a still further embodiment.
In the following detailed description, numerous specific details are set forth to provide a thorough understanding of the subject matter. However, it will be understood by those skilled in the art that the subject matter may be practiced without these specific details. In other instances, methods, apparatuses or systems that would be known by one of ordinary skill have not been described in detail so as not to obscure the subject matter.
Generally speaking, computing device 202 features one or more data processing hardware elements implementing an animation coding engine 104. Animation coding engine 104 causes computing device 202 to analyze timeline data 106 defining animation sequence 102 and generate code package 108 representing the animation sequence 102. In this example, animation sequence 102 is movement of bicycle 103 across the stage, but in practice animation sequences may be much more complex. As noted above, the code package is generated so that, when the code is processed by the rendering application, native animation capabilities of the rendering application are invoked to provide the animation sequence by using the corresponding animation primitive for each visual asset 114.
Animation coding engine 104 can be implemented in hardware accessible by or as part of the data processing element (e.g., as an application-specific integrated circuit (ASIC) or a programmable logic device (e.g., a PLA, FPGA, etc.)). As another example, animation coding engine 104 can be implemented using software or firmware that configures the operation of a processor or processors.
In the example shown in
Computer-readable medium 206 may comprise RAM, ROM, or other memory and in this example embodies a development environment 214 and the animation coding engine 104. More generally, development environment 214 is provided as an example of one or more applications or processes utilizing or including animation coding engine 104. For example, development environment 214 may comprise a development application such as Adobe® Flash® Professional, available from Adobe Systems Incorporated, and suitably modified to include or utilize animation coding engine 104.
As shown here, development environment 214 provides a user interface 216 that includes a stage 218 and a timeline 220. Stage 218 can be used to arrange one or more objects that are to be animated, with timeline 220 providing a visual representation of timeline data 106 and usable to select time intervals. For example, timeline 220 may be used to select different frames or other time index units, with animated objects positioned at different locations at the different frames. In some implementations, development environment 214 may support tweening and other operations so that a user does not need to specify every intermediate position of an object—instead, the user may specify starting and ending locations in respective key frames and/or a desired path, with development environment 214 handling the details of correctly positioning the object(s) in between the key frames. For example, a user may specify a starting and ending location for bicycle 103 and development environment 214 handles the details of smoothly translating the bicycle across the stage. Although this example shows a stage and timeline, a development environment may support other methods for receiving input specifying an animation sequence. For example, a source code view may be provided, with the animation specified through suitable syntax for specifying the location and motion of objects.
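The tweening behavior described above—positioning an object at every intermediate frame between user-specified key frames—can be sketched as a simple linear interpolation. The frame numbering and coordinate form are illustrative assumptions, not details of development environment 214:

```javascript
// Minimal sketch of key-frame tweening: linearly interpolate an object's
// stage position for any frame between two key frames, so the user only
// specifies the starting and ending locations.
function tween(startFrame, endFrame, startPos, endPos, frame) {
  if (frame <= startFrame) return { ...startPos };
  if (frame >= endFrame) return { ...endPos };
  const t = (frame - startFrame) / (endFrame - startFrame);
  return {
    x: startPos.x + t * (endPos.x - startPos.x),
    y: startPos.y + t * (endPos.y - startPos.y),
  };
}
```

In practice a development environment may also support non-linear easing and path following, but the principle—deriving intermediate positions from key frames—is the same.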
In this example, animation coding engine 104 is shown as part of a client-side development environment. However, animation coding engine 104 could be deployed at a remote server. For example, a web service may host animation coding engine 104 and provide a client-side front end for providing a user interface whereby a user can define the animation sequence. As another example, animation coding engine 104 could be deployed as part of a web service that receives timeline data 106 from a client and returns code package 108 in response.
In any event, animation coding engine 104 is provided with access to timeline data 106 specifying the details of animation sequence 102 and uses that data to generate code that is processed by a rendering application to replicate the animation sequence.
As shown at 302, animation coding engine 104 begins from the timeline data 106 defining the animation sequence of one or more animated objects. For example, timeline data 106 may comprise source code and/or an object model of a runtime application, such as source code of a Flash® application. However, it will be understood that timeline data 106 can comprise another representation of an animation sequence.
As shown at 304, this data is analyzed to determine visual assets comprising the animated object(s) along with data identifying motion and/or other time-varying properties of the visual assets as the animation sequence occurs. An object as represented in timeline data 106 may, practically speaking, be animated using one visual asset or multiple visual assets manipulated in concert with one another. For example, as noted above, bicycle 103 of
By analyzing how the underlying visual assets move or otherwise vary over time, animation coding engine 104 provides a representation of the animation as a set of visual assets and corresponding animation primitives as shown at 306. As discussed below, animation coding engine 104 may analyze the animation sequence using a heuristic that searches for common design patterns that can be broken down into sets of animation primitives.
As shown at 308, code package 108 can be generated by selecting appropriate code statements for the rendering application that is to process code package 108. For example, in some implementations code package 108 is provided as an HTML file along with a CSS3-compliant style sheet. The HTML file can reference the visual assets along with statements to invoke styles defined in the style sheet. The style sheet can be populated with style definitions for the corresponding animation primitives. When a rendering application such as a browser processes code package 108, the rendering application's native animation capabilities can be invoked based on the style definitions.
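The generation step at 308 can be sketched as assembling two files from the analyzed assets: markup referencing each asset and invoking a style, plus a stylesheet defining the selected primitives. File names, class names, and the asset record shape are assumptions introduced for this sketch:

```javascript
// Hypothetical sketch of code-package generation: given visual assets and
// their selected animation primitives, emit an HTML file referencing each
// asset as a separate element and a companion stylesheet defining the
// primitives as styles applied to those elements.
function generatePackage(assets) {
  const html = [
    '<!DOCTYPE html>',
    '<html><head><link rel="stylesheet" href="animation.css"></head><body>',
    ...assets.map(a => `<img id="${a.id}" class="${a.id}" src="${a.file}">`),
    '</body></html>',
  ].join('\n');
  const css = assets
    .map(a => `.${a.id} { animation: ${a.primitive} ${a.durationS}s linear; }`)
    .join('\n');
  return { 'index.html': html, 'animation.css': css };
}
```

When a browser loads the emitted `index.html`, its native CSS animation support is what carries out the motion; no further code in the package performs the animation itself.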
Block 402 represents accessing data defining an animation sequence, the animation sequence depicting motion of at least one object over time. For instance, the animation sequence may be defined in terms of a location on a stage for each of one or more objects and corresponding time index values. As a particular example, a development environment may maintain a core object model and codebase for the application under development. The object model/codebase can include data representing the various application components, including the object(s) to be animated (e.g., bicycle 103 of
Block 404 represents accessing data identifying animation primitives supported by a markup language. For instance, animation coding engine 104 may be hard-coded to recognize some or all of a set of animation operations that can be invoked through use of a style sheet, such as a CSS3 stylesheet. As another example, animation coding engine 104 may selectively access different sets of primitives supported by different style sheet languages, different rendering applications, and the like based on a desired output format.
Block 406 represents analyzing the data defining the animation sequence to determine a set of visual assets and corresponding animation primitives representing the motion of the at least one object over time. For example, animation coding engine 104 may use the data describing object positions over time and/or data defining time-varying activity of the objects to identify one or more sequences of motion that can be matched to animation primitives. The animated object(s) can be broken down into one or more visual assets, with each visual asset animated individually in a way so that the original timing of the animation sequence is maintained along with the intended relative position of the animated object(s). For example, bicycle 103 can be broken into body 103A, wheel 103B, crank 103C, and wheel 103D.
As an example, the animated object(s) may be analyzed to determine a first position of a visual asset on the stage at a first time index and a second position of the visual asset on the stage at a second time index. Based on the first and second positions and the time indices, a suitable animation primitive can be selected. For example, if the asset translates across the stage, then the animation coding engine can select an animation primitive that, when processed by the rendering application, causes the rendering application to move the visual asset to a second position in the interface of the rendering application corresponding to the second position of the visual asset on the stage. As a particular example, the -webkit-transform primitive can be selected for use as a style applied to each element 103A, 103B, 103C, and 103D, with values determined to define the starting and ending locations and a desired rate for the translation. For assets such as elements 103B, 103C, and 103D that rotate, a rotation primitive can be selected through a similar analysis. Transitions (e.g., fade-in, fade-out), distortions, and other manipulations can be identified as well.
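The position-sampling analysis just described can be sketched as follows; the frame rate, coordinate form, and emitted property values are illustrative assumptions rather than the specification's method:

```javascript
// Sketch of primitive selection from two sampled positions: if an asset's
// stage position differs between two time indices, choose a translation
// primitive and derive its duration from the frame distance and frame rate.
function selectPrimitive(p1, t1, p2, t2, fps) {
  const durationS = (t2 - t1) / fps;
  if (p1.x !== p2.x || p1.y !== p2.y) {
    return {
      primitive: 'translate',
      // Values define the ending offset and the rate of the translation.
      style: `transition: transform ${durationS}s linear; transform: translate(${p2.x - p1.x}px, ${p2.y - p1.y}px);`,
    };
  }
  return { primitive: 'none', style: '' };
}
```

A rotation primitive could be selected through a parallel check on a sampled angle rather than position.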
Block 408 represents generating a package comprising markup code referencing the set of visual assets and a stylesheet defining the corresponding animation primitives as styles to be applied to the visual assets, the package generated so that, when the markup code is processed by a rendering application, the rendering application invokes the corresponding animation primitives to animate the visual assets. For example, a set of files can be provided, the set of files including markup code renderable by a browser and referencing a stylesheet. As noted above, the stylesheet can define the selected animation primitives as styles applied to the corresponding visual assets.
The markup code can specify the visual assets in a way that causes the rendering application to treat each visual asset as a separate element. As a result, each element can be animated in a stand-alone manner. This can enhance the resulting animation by allowing the markup code to drive the animation as set forth in the styles.
The styles can be defined using parameter values that preserve the relative timing of the animations of various assets and spatial relationship of the assets over the course of the animation. For example, certain animations may be delayed relative to others and/or repeated, with duration values for the animation primitives used to control relative speed between animations. Coordinate values can be included in the style definitions so that the arrangement of the visual assets remains true to the original animation sequence as assets are translated, rotated, distorted, and the like. The visual assets may themselves be included as files (e.g., raster image files, scalable vector graphics files) or may be defined as elements in the markup code.
For example, animation sequence 102 of
In some implementations, the package further comprises scripting code to be interpreted by the rendering application when the markup code is processed. For example, in some implementations, JavaScript is used to control the start and repetition (if desired) of the animation. For instance, a JavaScript file can be included to begin the animation when an HTML document is loaded and to repeat the animation after a specified time period. Thus, if crank 103C is rotated at a high rate, its animation may be repeated on an infinite loop while slower rotations for wheels 103B and 103D are used once during the entire animation sequence.
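The scripting component's role as an event driver can be sketched as a small scheduler; the start hook and period are injected here so the logic is independent of the DOM, and the names and parameters are assumptions for illustration only:

```javascript
// Hypothetical sketch of the scripting component: start the animation on
// load and repeat it after a fixed period, up to a repeat count (Infinity
// for an endless loop such as a fast-spinning crank).
function makeAnimationDriver(startFn, periodMs, repeatCount, setTimer = setTimeout) {
  let runs = 0;
  function run() {
    startFn(); // e.g., toggle a class that triggers the CSS animation
    runs += 1;
    if (repeatCount === Infinity || runs < repeatCount) {
      setTimer(run, periodMs);
    }
  }
  return { run, started: () => runs };
}
```

Note that the script merely schedules starts; the animation itself is still carried out by the rendering application's native capabilities.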
In some implementations, analysis block 406 operates according to a heuristic that decomposes the animation sequence based on common design patterns that can be used in selecting and arranging animation primitives.
Block 502 represents determining if the animation sequence includes parallel sub-sequences over the course of the animation. This may include multiple objects moving or otherwise varying in different ways during an animation. For example, in a basic case of two objects moving across the stage in parallel, a style can be defined for each object (e.g., a -webkit-transform translation). However, the parameters for each style can be set so that the motion of the objects remains true to that defined in timeline data 106.
For example, the animation sequence may define motion of a first object that occurs simultaneous to motion of a second object. The analysis of timeline data 106 can determine appropriate location and timing parameters to use in style definitions for the first and second objects. Non-parallel animations can be handled sequentially. For instance, if one of the objects moves after the other, then the analysis can determine an appropriate delay factor to include in the style definitions so that the proper timing of the animations is maintained.
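The delay-factor determination for sequential (non-parallel) animations can be sketched as follows; the frame-based inputs and emitted property are assumptions introduced for this sketch:

```javascript
// Sketch of timing analysis for non-parallel animations: if a second
// object's motion begins some frames after the first object's motion,
// emit an animation-delay so the original relative timing is maintained.
function delayStyle(firstStartFrame, secondStartFrame, fps) {
  const delayS = (secondStartFrame - firstStartFrame) / fps;
  return delayS > 0 ? `animation-delay: ${delayS}s;` : '';
}
```

Parallel motions, by contrast, simply receive no delay and run simultaneously.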
Block 504 represents determining whether the animation sequence includes a hierarchical object. Development environment 214 may allow a user to define a hierarchical object that itself comprises multiple different objects with animated behavior. For example, a Flash® application may be defined in terms of a main timeline that describes motion of an object over a time interval, with the object itself including additional objects that are animated according to their own nested timelines. As a particular example, motion of an animated creature may be defined in a main timeline and describe how the creature moves across the stage. The creature's eyes may each have their own timeline describing blinking, motion, or other effects. As another example, animation sequence 102 as defined in timeline data 106 may specify bicycle 103 as a hierarchical object including body 103A and wheel 103B, crank 103C, and wheel 103D each defining rotation in a nested timeline. In that case, translation may be defined with respect to the hierarchical object, with the translation of wheel 103B, crank 103C, and wheel 103D implied via the nested timelines.
Thus, at block 504, animation coding engine 104 dives into the nested timelines to determine all of the animated effects that are to occur at a given point in time, including animations inherited by nested objects. This can be used in the code generation process—returning to the “creature” example, a style can be defined to move the visual assets corresponding to the creature's body and a style can be defined for the creature's eyes. Due to the nested timeline, the style for the creature's eyes should define a translation corresponding to the translation of the body (assuming no intended eye wobble in the original animation). However, one or more additional styles can be defined for the eye-related effects (e.g., a webkit-transition primitive can be used to bring the creature's eyelids in and out of view for the blink animations).
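Flattening a nested timeline in this way can be sketched as combining inherited parent styles with a child's own effects; the property names are illustrative assumptions:

```javascript
// Sketch of nested-timeline flattening: a child asset's effective styles
// combine the motion inherited from its parent's main timeline (e.g., the
// body's translation) with the effects defined in its own nested timeline
// (e.g., a blink). Child entries win on conflict; inherited motion is kept
// when the child does not override it.
function flattenStyles(parentStyles, childStyles) {
  return { ...parentStyles, ...childStyles };
}
```

A fuller implementation would compose transforms rather than overwrite them, but the inheritance direction—parent motion applied to every nested asset—is the point of the sketch.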
Block 506 represents identifying a looping sequence in the animation sequence. A looping sequence can be identified based on repeated activity in a timeline, either expressly repeated or implied. As an example of an expressly repeated loop, a timeline may indicate that an object is to begin from a first position, move, and return to the first position a number of times. During the code generation process, this activity can be simplified into a single sequence repeated a number of times. For example, the analysis may determine how many times the sequence repeats and suitable JavaScript can be included in code package 108 to invoke the sequence the determined number of times. As another example, the desired loop count for the animation can be included in the stylesheet defining the animation. If the loop includes a delay (e.g., the object appears, moves, and disappears before reappearing), a delay factor can be included in the object's style definition.
A hierarchical object can imply a looping sequence. For example, as noted above bicycle 103 may be defined in a main timeline with nested timelines for the wheels and cranks. The wheel rotation may be defined as only a few frames which are automatically repeated while the main timeline extends over a much larger number of frames (e.g., as the bicycle object translates across the screen). If in animation sequence 102 bicycle 103 moves, stops, and moves again, then the rotation of wheels 103B/103D and crank 103C may be looped on a delay to correspond to the start and stop of the bicycle. Thus, block 506 can represent determining a number of repetitions of the wheel rotation animation for use over the course of the full animation.
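The repetition count for such an implied loop follows directly from the two timeline lengths; a minimal sketch, with frame counts as illustrative inputs:

```javascript
// Sketch of loop-count analysis for an implied loop: a nested timeline of
// a few frames repeats for as long as the enclosing main timeline runs, so
// the repetition count is the ratio of the two durations (rounded up so
// the nested animation covers the whole main timeline).
function loopCount(mainFrames, nestedFrames) {
  return Math.ceil(mainFrames / nestedFrames);
}
```

The resulting count could then be emitted, for example, as an iteration count in the stylesheet or as a repeat limit in the scripting component.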
By analyzing complex animations defined in timeline data 106 according to a heuristic, such as that of
By specifying the visual assets directly in markup to be processed by the HTML rendering engine according to CSS or other style-based animation definitions, performance can be enhanced as compared to solutions that use an intermediate layer of processing. In particular, the HTML file drives the animation sequence because the HTML file is what is parsed and rendered. The HTML parser and its event model, which are provided by functional modules compiled/configured in a way to efficiently invoke the hardware capabilities of the device providing the HTML parser, are used to provide the animations in accordance with the style definitions.
Thus, there is no delay due to intermediate processing while actually carrying out the animations, such as delay due to parsing and interpreting JavaScript code to move a <canvas> or other element. Of course, present embodiments can use JavaScript as an event driver (e.g., to start and repeat animation sequences), but such operations are less computationally intensive than using JavaScript to actually provide the animation.
Object model manager module 604 can access stored data representing a core object model and codebase 606 for the application under development. The object model/codebase can include data representing the various application components, including media elements, scripting components, and the like and is representative of timeline data 106 used by animation coding module 104. For example, module 604 can store vector, raster, or other graphics representing bicycle 103 along with data defining the location of bicycle 103 in various frames, along with desired motion effects as bicycle 103 changes position in the different frames.
As discussed above, animation coding module 104 can use timeline data 106 to decompose an animation sequence into a plurality of visual assets and select corresponding animation primitives to be applied to those visual assets in order to replicate the animation sequence by way of a code package 108 processed by a rendering application. Module 104 can, for example, carry out analysis according to
It will be understood that the present subject matter can be used regardless of the format or type of the application under development, and construction and use of appropriate compilers, linkers, and packaging components (e.g., for cross-platform compatibility) will be within the ability of one of skill in the art. This may allow a developer to generate an animation sequence once and then output the sequence in multiple different formats (e.g., in HTML/CSS3 and as a Flash® application).
Animation coding engine 104 is shown integrated into development environment 214 in this example. It will be understood that animation coding engine 104 can operate independently of development environment 214. For example, animation coding engine 104 could be implemented with its own UI module used to select files containing timeline data 106 and convert those files into a code package 108.
Some portions of the detailed description were presented in terms of algorithms or symbolic representations of operations on data bits or binary digital signals stored within a computing system memory, such as a computer memory. These algorithmic descriptions or representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. An algorithm is here, and generally, considered to be a self-consistent sequence of operations or similar processing leading to a desired result. In this context, operations or processing involve physical manipulation of physical quantities.
Typically, although not necessarily, such quantities may take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared or otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to such signals as bits, data, values, elements, symbols, characters, terms, numbers, numerals or the like. It should be understood, however, that all of these and similar terms are to be associated with appropriate physical quantities and are merely convenient labels.
Unless specifically stated otherwise, as apparent from the foregoing discussion, it is appreciated that throughout this specification discussions utilizing terms such as “processing,” “computing,” “calculating,” “determining” or the like refer to actions or processes of a computing platform, such as one or more computers and/or a similar electronic computing device or devices, that manipulate or transform data represented as physical electronic or magnetic quantities within memories, registers, or other information storage devices, transmission devices, or display devices of the computing platform.
Although several examples featured mobile devices, the various systems discussed herein are not limited to any particular hardware architecture or configuration. A computing device can include any suitable arrangement of components that provide a result conditioned on one or more inputs. Suitable computing devices include multipurpose microprocessor-based computer systems accessing stored software that programs or configures the computing system from a general-purpose computing apparatus to a specialized computing apparatus implementing one or more embodiments of the present subject matter. Any suitable programming, scripting, or other type of language or combinations of languages may be used to implement the teachings contained herein in software to be used in programming or configuring a computing device.
A computing device may access one or more non-transitory computer-readable media that embody computer-readable instructions which, when executed by at least one computer, cause the at least one computer to implement one or more embodiments of the present subject matter. When software is utilized, the software may comprise one or more components, processes, and/or applications. Additionally or alternatively to software, the computing device(s) may comprise circuitry that renders the device(s) operative to implement one or more of the methods of the present subject matter.
Examples of computing devices include, but are not limited to, servers, personal computers, mobile devices (e.g., tablets, smartphones, personal digital assistants (PDAs), etc.), televisions, television set-top boxes, portable music players, and consumer electronic devices such as cameras and camcorders. Computing devices may be integrated into other devices, e.g. “smart” appliances, automobiles, kiosks, and the like.
Embodiments of the methods disclosed herein may be performed in the operation of computing devices. The order of the blocks presented in the examples above can be varied—for example, blocks can be re-ordered, combined, and/or broken into sub-blocks. Certain blocks or processes can be performed in parallel.
Any suitable non-transitory computer-readable medium or media may be used to implement or practice the presently-disclosed subject matter, including, but not limited to, diskettes, drives, magnetic-based storage media, optical storage media (e.g., CD-ROMS, DVD-ROMS, and variants thereof), flash, RAM, ROM, and other memory devices.
The use of “adapted to” or “configured to” herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. Additionally, the use of “based on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting.
While the present subject matter has been described in detail with respect to specific embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation, and does not preclude inclusion of such modifications, variations and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.