In general, computer-based drawing applications enable a user to generate structures, graphics or illustrations as static objects that are then output to a display. In some cases, those structures, graphics or illustrations can be animated by generating copies of the original objects, applying geometric transformations (such as translating, rotating and scaling, among others) to the copied objects, and displaying the transformed objects sequentially in time.
This specification describes technologies relating to transforming time-based drawings. In general, one aspect of the subject matter described in this specification can be embodied in a method performed by a data processing apparatus, in which the method includes: rendering a first object on a display, the first object having first location coordinates and first temporal coordinates, in which the first location coordinates define a first drawing and each first location coordinate is associated with a respective first temporal coordinate; receiving input defining a second object, the second object having second location coordinates and second temporal coordinates, in which the second location coordinates define a second drawing and each second location coordinate is associated with a respective second temporal coordinate; applying a transformation to one or more of the first location coordinates responsive to receiving each second location coordinate, in which the transformation is based on a most recently received second location coordinate; and generating an animation by rendering the one or more transformed first location coordinates on the display according to the respective first temporal coordinates. Other embodiments of this aspect include corresponding computing platforms and computer program products.
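The following minimal Python sketch illustrates the shape of this method under stated assumptions; the Stroke type, the translation-based transform and the function names are hypothetical stand-ins for the claimed location/temporal coordinate pairs and transformation, not the claimed implementation itself:

    from dataclasses import dataclass

    @dataclass
    class Stroke:
        points: list  # image location coordinates [(x, y), ...]
        times: list   # temporal coordinate for each point [t0, t1, ...]

    def transform_first(first, second_point, second_origin):
        # Translate every first location coordinate by the offset of the most
        # recently received second coordinate from the second object's start;
        # called once per received second coordinate.
        dx = second_point[0] - second_origin[0]
        dy = second_point[1] - second_origin[1]
        return Stroke([(x + dx, y + dy) for x, y in first.points],
                      list(first.times))

    def animation_frame(stroke, elapsed):
        # Render only the coordinates whose temporal coordinate has been
        # reached, so the stroke replays over time.
        return [p for p, t in zip(stroke.points, stroke.times) if t <= elapsed]

In this sketch the animation is regenerated after each call to transform_first, mirroring the per-coordinate update described above.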
These and other embodiments can optionally include one or more of the following features. In some embodiments, the transformation is applied to each of the one or more first location coordinates based on the most recently received second location coordinate. The animation can be generated after each transformation is applied.
In some embodiments, the transformation is applied to the one or more first location coordinates based on whether a second temporal coordinate associated with the most recently received second location coordinate corresponds to a first temporal coordinate associated with the one or more first location coordinates. The transformation can be applied to the one or more first location coordinates based on whether the second temporal coordinate associated with the most recently received second location coordinate equals the first temporal coordinate associated with the one or more first location coordinates.
In some embodiments, the transformation includes a vector translation of the one or more first location coordinates. In some implementations, the method can further include receiving an input defining an animation period, in which the animation is generated periodically based on the animation period.
Particular embodiments of the subject matter described in this specification can be implemented so as to realize one or more of the following advantages. For example, the application allows, in some cases, a user to interact with an animation as it is displayed without a visual abstraction, such as a timeline, scripting window or user interface icon, interfering with the animation or being visible while it plays. Accordingly, a user can observe instantaneous visual feedback as the appearance of an animated object is altered. In some implementations, the application enables a user to generate animations that are tied to rhythmic relationships in a corresponding musical soundtrack.
The details of one or more embodiments of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the implementations will become apparent from the description, the drawings, and the claims.
In general, computer-generated animations are produced by applying the geometric transformations off-line, i.e., the intended client or audience does not observe the production of the animated feature. Nor does the act of producing the animation typically correspond to the finalized end product that will be viewed by an audience.
In some implementations, a user or artist may be interested in producing a visual performance in which the temporal aspects of the objects, such as frequency, periodicity or phase, are altered as the objects are created. In addition, the user or artist may be interested in synchronizing such animations with a particular tempo, rhythm, soundtrack or music as part of the visual performance.
As object 10 is output to drawing space 8, the drawing application captures the image location coordinates of object 10 on drawing space 8 as well as temporal coordinates that correspond to the time at which the respective image location coordinates are captured. The rate at which the temporal and image location coordinates are captured can be synchronized with a particular tempo/rhythm established by user 2, extracted from a file, or extracted from another software application, such as a video player or audio player. In the present implementation, the captured image location coordinates and temporal coordinates of object 10 are stored in computer-readable memory as a dataset. Once the image location coordinates and corresponding temporal coordinates are stored, multiple instances of object 10 can be rendered again within drawing space 8. For example, in some cases, object 10 is rendered within drawing space 8 periodically based on a time period set by the drawing application or specified by user input to the drawing application. In some implementations, rendering object 10 includes animating object 10. That is, the image location coordinates are displayed within drawing space 8 in time according to their corresponding temporal coordinates.
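A minimal sketch of such tempo-locked capture, assuming a hypothetical get_pointer() callback that reports the current input-device position:

    import time

    def capture_stroke(get_pointer, duration_s, bpm=120.0, samples_per_beat=8):
        # One beat lasts 60/bpm seconds; subdivide it into samples_per_beat
        # capture points to lock the sampling rate to the tempo.
        interval = 60.0 / bpm / samples_per_beat
        points, times = [], []
        start = time.monotonic()
        while (now := time.monotonic() - start) < duration_s:
            points.append(get_pointer())  # image location coordinate
            times.append(now)             # corresponding temporal coordinate
            time.sleep(interval)
        return points, times              # the stored dataset for the object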
The user can provide additional input to the drawing application in order to modify the appearance or animation of object 10 within drawing space 8. In some implementations, the user can employ input device 4 to modify features of object 10 based on the input device motion. For example, the time-based drawing application can transform the image location coordinates or the temporal coordinates of the first object 10 based on a position of a mouse, trackball or other input device. In some cases, the transformation can also be based, in part, on the time at which the position of the user input device was determined. In some implementations, the transformation is based on the image location coordinates or temporal coordinates of a second object drawn within drawing space 8.
The transformed first object can be re-drawn, one or more times, by the drawing application as a new object 12. In some cases, the new object 12 is re-drawn while the image location coordinates or temporal coordinates of the first object are being transformed. Alternatively, or in addition, the new object 12 is re-drawn to drawing space 8 after the transformation of the image coordinates or temporal coordinates of the first object 10. User 2 can modify the new object's phase, rate, visibility or periodic attributes to provide a performance-based mechanism for creating artwork.
The time-based drawing application 206 can be an image processing application or a portion thereof. As used herein, an application refers to a computer program that the user perceives as a distinct computer tool used for a defined purpose. An application can be built entirely into the OS of the data processing apparatus 204, or an application can have different components located in different locations (e.g., one portion in the OS and one portion in a remote server connected to the platform 200), and an application can be built on a runtime library serving as a software platform of the data processing apparatus 204. The time-based drawing application 206 can include image editing software, digital publishing software, video editing software, presentation and learning software, and graphical/text editing software (e.g., Adobe® Photoshop® software, Adobe® InDesign® software, Adobe® Captivate® software, Adobe® AfterEffects® software, Adobe® Premiere®, Adobe® Flash Pro® and Adobe® Illustrator® software, available from Adobe Systems Incorporated of San Jose, Calif.). The user input device(s) 202 can include, for example, keyboard(s) and a pointing device, such as a mouse, trackball, stylus, or any combination thereof. The display device(s) 214 can include a display monitor capable of producing color or gray scale pixels on a display screen. For example, the display device(s) can include a cathode ray tube (CRT) or liquid crystal display (LCD) monitor for displaying information to the user. The computer platform 200, the input device 202 and the display device 214 can together be included in a single system or device, such as a personal computer, a mobile telephone, a personal digital assistant (PDA) or a mobile audio player, to name just a few.
Based on the image location coordinates and/or temporal coordinates generated by the object input module 208, the object transformation module 210 applies a transformation to the image location coordinates and/or temporal coordinates of a pre-existing object that has been rendered to a display by the time-based drawing application. The transformation can include a transformation applied to each coordinate of the pre-existing object based on a single value obtained from the object input module 208. In some cases, the transformation is applied to each coordinate of the pre-existing object based on the most recent coordinate generated by the object input module. In some implementations, the transformation is applied to the coordinates of the pre-existing object based on an image location coordinate generated by the object input module, in which the image location coordinate corresponds temporally to the coordinate of the pre-existing object.
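The first of these cases reduces to a uniform translation of the pre-existing object; a sketch, with the (dx, dy) offset convention assumed rather than specified by the text:

    def transform_global(points, latest_offset):
        # Apply the single most recent value obtained from the object input
        # module to every coordinate of the pre-existing object.
        dx, dy = latest_offset
        return [(x + dx, y + dy) for x, y in points]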
Once the transformation has been applied, the transformed image location coordinates and/or temporal coordinates are transferred to the image generation module 212. Image generation module 212 produces an image or animation based on the transformed image location coordinates and/or temporal coordinates and outputs the image to the display device 214. Alternatively, or in addition, image generation module 212 produces an image or animation based on image location coordinates and/or temporal coordinates provided by object input module 208.
The object input module 208 then receives (303) user input defining a second object, in which the second object includes multiple second image location coordinates and second temporal coordinates respectively corresponding to the second image location coordinates. The second image location coordinates and the second temporal coordinates are then provided to the object transformation module 210. Upon receiving each second image location coordinate, the object transformation module 210 applies a transformation to one or more of the first image location coordinates, based on the most recently received second image location coordinate.
The transformed image location coordinates are then transferred to the image generation module 212, which generates (305) an animation by rendering the one or more transformed first image coordinates on the display according to the respective first temporal coordinates. The type of transformation applied to the image location coordinates can be determined by the user or applied automatically by the drawing application.
The animations produced by time-based drawing application 206 also can be rendered in a repeated (i.e., periodic) manner. For the purposes of this disclosure, the period of repetition will be referred to as a measure. The measure can be specified by the user. For example, the user can enter the period as a numeric value measured in microseconds, milliseconds, seconds or minutes. Other units of time-based measurement may be used as well. In some cases, the period can be extracted by the time-based drawing application 206 from a separate file or another software application. For example, in some cases, the application 206 defines a measure by analyzing an associated audio file to determine the length of the measure at a specified tempo. Drawing space 8, on which objects are formed, can be a blank image space produced on display 9 by time-based drawing application 206. A user can select various drawing tools, such as a line tool, brush tool, or shape tool, among others, to draw objects in drawing space 8.
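When the measure is derived from an associated audio file, its length follows directly from the tempo and the time signature; a sketch under those assumptions:

    def measure_length_s(bpm, beats_per_measure=4):
        # One beat lasts 60/bpm seconds; a 4/4 measure at 120 bpm is
        # 4 * 60 / 120 = 2.0 seconds long.
        return beats_per_measure * 60.0 / bpm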
Stroke A can be rendered in the drawing application as a static object in which the image location coordinates are displayed concurrently at one time within drawing space 8. Alternatively, or in addition, stroke A can be rendered as a static object repeatedly within drawing space 8. For example, in some cases, the entirety of stroke A is rendered periodically based on a specified period of time.
In some implementations, a single instance of stroke A can be animated within drawing space 8 or, alternatively, multiple instances of stroke A can be animated within drawing space 8. In some cases, the animations occur within a single period. The length of the animation depends on the temporal coordinates associated with stroke A. For example, the animation of stroke A can occur over a length of time that is less than one period, equal to one period, or greater than one period. The user 2 can initiate, through the input device 202, the periodic rendering of stroke A or, alternatively, the repeated and/or periodic rendering of stroke A is applied automatically by the drawing application.
In some implementations, stroke A is animated such that each image location coordinate of stroke A is rendered to drawing space 8 over time based on a corresponding temporal coordinate, so that stroke A appears as if it is being drawn on the display.
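One way to express this replay, reusing the hypothetical Stroke type sketched earlier; the modulo makes the animation repeat every measure when a period is supplied:

    def visible_points(stroke, elapsed, period=None):
        # Coordinates to draw 'elapsed' seconds into the animation; with a
        # period set, the animation restarts at each measure boundary.
        t = elapsed % period if period else elapsed
        return [p for p, tc in zip(stroke.points, stroke.times) if tc <= t]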
After stroke A is rendered, user 2 can employ input device 202 to draw a second stroke B, represented by a dashed line.
In some implementations, stroke B is used to modify the appearance of stroke A.
In some implementations, the time-based drawing application renders objects periodically within drawing space 8. For example, if an animation period is set equal to a time t=M1, where M1 represents the length of a measure, stroke A may be rendered in drawing space 8 at the beginning of every period, i.e., at t=M1, 2*M1, 3*M1 and so forth. Accordingly, if modifying stroke B is applied to stroke A, the application may render new stroke C at the beginning of each period, such that both strokes A and C are visible within drawing space 8.
In some cases, portions of strokes A and/or C may disappear from drawing space 8 over time. For example, if either stroke A or C is rendered to drawing space in the first half of a period, the strokes may begin to disappear in the second half of the period. Alternatively, if either stroke A or C is rendered to drawing space over a length of time equal to one period, then the strokes may begin to disappear at the start of the second period. The disappearance of the objects in drawing space 8 can occur according to the temporal coordinates. For example, the objects can begin to disappear starting with the image location coordinates associated with the earliest temporal coordinate. Other implementations for removing objects from drawing space 8 can be employed as well.
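One such removal policy, sketched below, drops each coordinate a fixed lifetime after its own temporal coordinate, so the earliest-drawn portions vanish first; the lifetime parameter is an assumption, since the text leaves the exact policy open:

    def surviving_points(stroke, elapsed, lifetime):
        # A coordinate disappears once 'lifetime' seconds have elapsed since
        # its temporal coordinate, earliest coordinates first.
        return [p for p, tc in zip(stroke.points, stroke.times)
                if elapsed - tc < lifetime]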
In some implementations, a series of values from the modifying object can be applied globally to a previously rendered object.
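Sketched in the same terms as above: each value of the modifying stroke, as it arrives, re-translates the entire original stroke, so the whole drawing follows the modifying gesture (the offset-from-origin convention is again an assumption):

    def apply_series_globally(first, second_points, second_origin):
        # One transformed copy of the original stroke per received value of
        # the modifying stroke; the entire stroke shifts with each value.
        frames = []
        for px, py in second_points:
            dx, dy = px - second_origin[0], py - second_origin[1]
            frames.append([(x + dx, y + dy) for x, y in first.points])
        return frames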
In some implementations, the modifying stroke is drawn over a length of time that is greater than one period as defined by the time-based drawing application. For example, FIG. 5C illustrates an initial stroke A rendered in drawing space 8 and a modifying stroke B, which may or may not be rendered in drawing space 8. Stroke A includes position coordinates pA and corresponding temporal coordinates tA. Stroke B includes position coordinates pB and corresponding temporal coordinates tB in which stroke B is drawn over a time period greater than one measure M1.
In some implementations, a single value from the modifying object can be applied incrementally to the original object. That is, each coordinate of the original object can be modified once by a value that corresponds temporally in the modifying object.
The skewing is a result of the change in image location values of stroke B applied to stroke A as stroke B is drawn in the time-based drawing application.
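A sketch of this temporally corresponding lookup, reusing the hypothetical Stroke type; the nearest-preceding-sample rule is an assumed matching policy, not one the text specifies:

    import bisect

    def apply_pointwise(first, second_times, second_offsets):
        # Each original coordinate is modified exactly once, by the modifying
        # value whose temporal coordinate corresponds to its own; unequal
        # offsets along the stroke are what produce the skew.
        new_points = []
        for (x, y), t in zip(first.points, first.times):
            i = bisect.bisect_right(second_times, t) - 1
            dx, dy = second_offsets[i] if i >= 0 else (0.0, 0.0)
            new_points.append((x + dx, y + dy))
        return Stroke(new_points, list(first.times))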
In some implementations, the user draws a modifying stroke B which extends over a time period that is less than one measure and which also is less than the duration of stroke A. Accordingly, one or more of the image location coordinates in the original stroke A will not be transformed by a temporally corresponding value in the modifying object. Instead, in some cases, the new stroke simply incorporates the remaining coordinates of the original stroke A without applying a transformation to those coordinates. Thus, in certain implementations, the new stroke may appear to have a discontinuity between the modified image location coordinates and the non-modified image location coordinates. Alternatively, in some cases, the one or more image location coordinates may be transformed by the last value of the modifying stroke.
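Both alternatives can be expressed as a boundary policy in the lookup sketched above; extend_last=False leaves the tail untransformed (the possible discontinuity), while extend_last=True carries the modifying stroke's final value forward:

    import bisect

    def offset_for(t, second_times, second_offsets, extend_last=True):
        # Policy for original coordinates later than the end of the modifying
        # stroke: keep them untransformed, or reuse the last value.
        if not second_times:
            return (0.0, 0.0)
        if t > second_times[-1] and not extend_last:
            return (0.0, 0.0)
        i = bisect.bisect_right(second_times, t) - 1
        return second_offsets[i] if i >= 0 else (0.0, 0.0)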
In some implementations, the modifying object extends over a time period that is longer than one full measure. In such cases, each new object that is rendered in subsequent measures may accumulate a translation based on the temporally corresponding portion of the modifying object rendered in the first measure and the temporally corresponding portion of the modifying object rendered in subsequent measures.
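Sketched abstractly, with offset_in_measure as a hypothetical lookup returning the modifying stroke's temporally corresponding (dx, dy) value within a given measure:

    def accumulated_offset(t_in_measure, measure_index, offset_in_measure):
        # The object rendered in measure k accumulates the modifying stroke's
        # contributions at the corresponding time of measures 0 through k.
        dx = dy = 0.0
        for m in range(measure_index + 1):
            ox, oy = offset_in_measure(m, t_in_measure)
            dx += ox
            dy += oy
        return dx, dy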
In some implementations, the modifying object is drawn at a rate that is faster than the rate at which the initial object is rendered in the drawing space.
Such transformations also allow a user, in some cases, to adjust the time evolution of a drawing so that it synchronizes with an associated musical soundtrack. For example, the phase of one or more modifying objects can be adjusted to synchronize the initial position of those objects with a semantically meaningful moment in the associated soundtrack, such as the downbeat of a measure. In some cases, a user can modify or refine the evolution of drawings or objects by changing their specific rate, position or appearance relative to the evolution of other objects being displayed within the application. A user can add additional objects to build up a collection of objects that are displayed and evolve in time in the application.
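Such a phase adjustment amounts to shifting an object's temporal coordinates modulo the measure; a sketch, again reusing the hypothetical Stroke type:

    def shift_phase(stroke, phase_s, measure_s):
        # Shift every temporal coordinate by phase_s, wrapping at the measure
        # boundary, so that, e.g., the stroke's first coordinate lands on the
        # downbeat of the associated soundtrack.
        return Stroke(list(stroke.points),
                      [(t + phase_s) % measure_s for t in stroke.times])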
Embodiments of the subject matter and the operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on a computer storage medium for execution by, or to control the operation of, data processing apparatus. The computer storage medium can be, or be included in, a computer-readable storage device, a computer-readable storage substrate, a random or serial access memory array or device, or a combination of one or more of them. The computer storage medium can also be, or be included in, one or more separate physical components or media (e.g., multiple CDs, disks, or other storage devices).
The operations described in this specification can be implemented as operations performed by a data processing apparatus on data stored on one or more computer-readable storage devices or received from other sources.
The term “data processing apparatus” encompasses all kinds of apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, a system on a chip, or multiple ones, or combinations, of the foregoing. The apparatus can include special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit). The apparatus can also include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, a cross-platform runtime environment, a virtual machine, or a combination of one or more of them. The apparatus and execution environment can realize various different computing model infrastructures, such as web services, distributed computing and grid computing infrastructures.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment. A computer program may, but need not, correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for performing actions in accordance with instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio or video player, a game console, a Global Positioning System (GPS) receiver, or a portable storage device (e.g., a universal serial bus (USB) flash drive), to name just a few. Devices suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, embodiments of the subject matter described in this specification can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
Embodiments of the subject matter described in this specification can be implemented in a computing system that includes a back-end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front-end component, e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the subject matter described in this specification, or any combination of one or more such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), an inter-network (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks).
The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In some embodiments, a server transmits data (e.g., an HTML page) to a client device (e.g., for purposes of displaying data to and receiving user input from a user interacting with the client device). Data generated at the client device (e.g., a result of the user interaction) can be received from the client device at the server.
While this specification contains many specific implementation details, these should not be construed as limitations on the scope of what may be claimed, but rather as descriptions of features specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
Thus, particular embodiments have been described. Other embodiments are within the scope of the following claims. For example, multiple users could interact with the application at the same time. In some cases, multiple users could collaborate to modify animations and transform time-based drawings in a shared drawing space. Each user could employ a separate input device represented in the drawing space with a particular pointer icon. Alternatively, in some cases, multiple users may interact with the drawing application in separate drawing spaces that are simultaneously visible on a display. The users could interact with the drawing application in the same location or interact remotely with the drawing application over a network from separate areas. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous.