A variety of computer software programs are available for defining and manipulating objects in a virtual three-dimensional (3D) world. Examples include “3DSMax” from Autodesk, Inc. and Solidworks. These programs provide an assortment of tools in a convenient graphical user interface (GUI) for manipulating and editing 3D virtual objects. Programs for computer display screensavers also permit manipulation of moving images.
Also having great popularity are computer software programs for manipulation of video clips, multimedia clips, and the like. Such programs include Max, Aperture, and ArKaos.
Another popular medium that supports creativity with computers is the set of computer software applications that involve the musical instrument digital interface (MIDI) standard. The MIDI standard permits connection of musical instruments with digital output to related digital sound processing devices, including computers with sound cards and sound editing applications, sound boards, broadcast equipment, and the like.
Since the introduction of MIDI in approximately 1985, music has commonly been performed with instruments that send digital MIDI data. MIDI provides a flexible set of instructions that are sent via a serial data link from a controller to a receiver that processes those commands in a variety of ways pertaining to the output functions of the receiving device. The data and instructions most commonly pertain to sounds and music, but can also include instructions for machine control and lighting control devices.
A separate branch of technology is the development of computer video graphics: the digital electronic representation and manipulation of virtual worlds composed of three-dimensional objects in a 3D space, with applications in many fields, from microscopic imaging to galactic modeling, and notably in computer graphics for films and gaming environments.
There have been a few attempts to associate the direct performance of music with computer video and graphics in an effort to create new art forms. One program, Bliss Paint for the Macintosh, used MIDI input to change colors on an evolving rendering of a fractal image. Another program, ArKaos, uses MIDI commands to play video clips in a DJ-like process. Yet another program, MaxMSP, uses MIDI commands in a flexible environment to drive video clips, audio clips, and external events.
There are many computer programs that control sound in various ways in response to a MIDI command stream. The “3DMIDI” program appears to be unsupported, and it is not clear whether the software works or ever worked. The available documentation describes a set of separate programs, each of which performs a prescribed set of transformations on an embedded set of objects in response to MIDI. Each performance is loaded and executed separately, and has its own unique tool set for making specific adjustments particular to the objects in that scene. An API is provided that invites others to develop their own performances, each with its own unique set of objects and tools, which cannot be edited thereafter.
Unfortunately, there is no convenient user interface available for interacting computer graphics with musical instrument digital data. Conventional methods generally require cumbersome specification of input sources, scene description parameters and data objects, and linking of input sources and scene description objects. As a result, a relatively high level of computer skill is necessary for creating graphical presentations in conjunction with music input. Creative output would be improved if users could create scenes with objects and change both the objects and the nature of the interaction between the video graphics and MIDI music data.
As a result of these difficulties and increased complexity, there is a need for a graphical user interface that supports integration with digital musical instruments. The present invention satisfies this need.
In accordance with embodiments of the invention, a graphical presentation is produced at a display of a host computer such that a scene description is rendered and updated by a received digital music input, wherein the digital music input is matched to trigger events of the scene description and the actions of each matched trigger event are executed in accordance with action processes of the scene description, thereby updating the scene description with respect to the objects depicted in the scene on which the actions are executed. The updated scene description is then rendered. Thus, the invention provides a means for connecting a graphics API to a musical instrument digital interface (e.g., MIDI) data stream and producing a presentation. In this way, digital musical instruments can be integrated with graphical presentation techniques for an entertaining user experience.
In one embodiment, a computer system includes two active display windows, an Editor Application window and a Render (display) window. The Editor Application can be used for generation of scene descriptions with user input panels for defining parameters relating to six characteristics of the scene description: Triggers, Actions, Objects, Links, Splines, and Play control. A wide variety of playback and scene description controls are provided for user specification through the Editor Application. The scene description itself can be edited to specify scene parameters such as groups and combinations of triggers, scene objects, and actions, as well as linkages and play control. The triggers can comprise selected MIDI commands, and the actions can comprise transformations performed on scene objects. The objects themselves can be defined in imported scene descriptions, including 3D graphics files, which may be rendered on the computer screen. The scene description can also include spline nodes that define an animation path. The Render window displays the results of the rendering operation to show the scene description file as processed with digital musical input.
Other features and advantages of the present invention should be apparent from the following description of the preferred embodiment, which illustrates, by way of example, the principles of the invention.
A system constructed in accordance with the present invention produces a graphical presentation at a display of a host computer. The host computer executes a presentation application constructed in accordance with the present invention. The graphical presentation comprises a rendered audiovisual presentation at the display of the host computer according to a scene description. The presentation is updated in response to a digital music input. In updating the presentation and rendering it, the system evaluates trigger events in the scene description and matches the digital music events to trigger events of the scene description, then processes the action lists of the matched events to modify objects in the scene, thereby updating the scene description and the display presentation.
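The match-and-execute step described above can be sketched in simplified form. The class names, the four-field event tuple, and the action-callable interface below are illustrative assumptions for the sketch, not structures taken from the actual product.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

# A digital music event is modeled here as (command, channel, note, velocity).
MusicEvent = Tuple[str, int, int, int]

@dataclass
class Trigger:
    command: str    # e.g. "note_on"
    channel: int    # MIDI channel the trigger listens on
    actions: List[Callable[["Scene"], None]] = field(default_factory=list)

    def matches(self, event: MusicEvent) -> bool:
        return event[0] == self.command and event[1] == self.channel

@dataclass
class Scene:
    objects: Dict[str, dict] = field(default_factory=dict)
    triggers: List[Trigger] = field(default_factory=list)

    def update(self, events: List[MusicEvent]) -> None:
        # Match each incoming music event against the scene's trigger
        # events, and run the action list of every trigger that matches,
        # modifying objects in the scene.
        for event in events:
            for trigger in self.triggers:
                if trigger.matches(event):
                    for action in trigger.actions:
                        action(self)
```

For example, a trigger on Note-On events of channel 0 could carry an action that rotates a cube object; each matching note then updates the scene before the next render pass.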
The processing is illustrated by the flow diagram of
The processed UI events from the user (at box 104 of
The digital music input (at box 106 of
After the user UI events and MIDI port events are processed, the system updates the scene (box 108). Next, at box 109, the scene is rendered, meaning that applicable video and audio output is generated. Lastly, if no halt instruction or the like is received at box 110, execution continues by returning to listening for, and processing, input from the user (box 104) and the musical instrument (box 106).
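The box 104–110 processing loop can be sketched as follows. The queue shapes, the `"halt"` sentinel, and the dict-based scene are assumptions made for the sketch; the render step is stubbed out as a callback.

```python
from collections import deque

def drain(queue: deque) -> list:
    """Remove and return all events currently pending on a queue."""
    events = []
    while queue:
        events.append(queue.popleft())
    return events

def run(scene: dict, ui_events: deque, midi_events: deque, render) -> int:
    """Event loop mirroring boxes 104-110: process UI input, process MIDI
    input, update the scene, render, and repeat until a halt UI event.
    Returns the number of frames rendered."""
    frames = 0
    while True:
        halt = False
        for ev in drain(ui_events):        # box 104: user input
            if ev == "halt":
                halt = True
        midi = drain(midi_events)          # box 106: instrument input
        scene["last_midi"] = midi          # box 108: update the scene
        render(scene)                      # box 109: render (stubbed)
        frames += 1
        if halt:                           # box 110: check for halt
            return frames
```

Note that, as in the flow diagram, the loop runs indefinitely until a halt instruction arrives on the user-input side.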
User Input
Digital Music Input
A variety of actions associated with the process functions may be carried out. For example, the actions may specify collisions between two or more objects of the scene description, and can include explosion of one or more objects of the scene description, or other movements of the objects in the scene description. The actions can be specified by user input so as to permit changes in speed, size, movement, color, and behavior of the scene objects.
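Actions of the kind described (changing speed, size, color, or triggering an explosion) can be modeled as small functions that each mutate one property of a scene object. The dict-based object representation below is a hypothetical simplification for illustration.

```python
def scale(obj: dict, factor: float) -> None:
    """Multiply the object's size, e.g. in response to note velocity."""
    obj["size"] = obj.get("size", 1.0) * factor

def recolor(obj: dict, rgb: tuple) -> None:
    """Set the object's color to an (r, g, b) value."""
    obj["color"] = rgb

def explode(obj: dict) -> None:
    """Mark the object as exploded; a real renderer would spawn fragments
    and animate them outward."""
    obj["exploded"] = True
```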
Scene Description Data File
In the illustrated embodiment, the scene description can be recorded in a data file stored in a scene description format recognized by the playback (rendering) application of the system. One such scene description format is the “MIDI Paint Box” format (file extension of “.mpb”) by Starr Laboratories, Inc., the assignee of the present invention. Additional formats can be imported into the rendering application tool described herein, including 3D description formats such as those of “3D Studio Max” (*.3DS) from Autodesk, Inc. of San Rafael, Calif. USA and “X” graphics format files (*.X) of the DirectX 3D format supported by the DirectX API of Microsoft Corporation. Those skilled in the art will know of additional formats for 3D scene descriptions. The recorded scene description can be stored in, and loaded from, auxiliary storage of the host computer, for example. The scene description can be read from or obtained from a variety of data repositories, including memory of the host computer and external media such as disc storage, flash memory, network accessible storage, and the like.
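Recognizing which importer to use for a scene description file can be done by dispatching on the file extension. The loader names below are placeholders; only the extensions (.mpb, .3ds, .x) come from the formats listed above.

```python
from pathlib import Path

def choose_loader(path: str) -> str:
    """Map a scene-description file extension to the name of its loader.
    Extensions follow the formats named above; loader names are illustrative."""
    loaders = {
        ".mpb": "midi_paint_box",   # native MIDI Paint Box format
        ".3ds": "3d_studio_max",    # 3D Studio Max import
        ".x":   "directx",          # DirectX "X" graphics format
    }
    ext = Path(path).suffix.lower()
    try:
        return loaders[ext]
    except KeyError:
        raise ValueError(f"unsupported scene format: {ext}")
```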
As noted above, it is not necessary that the digital music input be received directly (live) from a musical instrument connected to the host computer. Similarly to the scene description, the digital music input can be received as a data file in a digital music file format, such as the Standard MIDI File (SMF) format. Thus, the digital music input format can be in accordance with the MIDI format of the MMA, referred to above, or can be implemented according to a different data format.
Editor Application
The illustrated embodiment includes a scene description editor application, which provides a graphical user interface (GUI) through which a user can specify the scene description, edit existing scene descriptions, and control rendering and playback of the scene description that is loaded in the host computer. In the present discussion, the editor application will be referred to as the MidiPaintBox.
The editor window Scene panel 500 shows objects and elements of the scene description. Scene objects are listed in the left frame, and scene triggers are listed in the right frame. A user may add objects and specify their characteristics and parameters through the tool bar in
Triggers can be associated with channels of MIDI input. The Trigger panel can be used to set a note filter and to set a velocity filter. The note filter processing responds to the note number and channel, note range, musical scale and chordal tonalities, and/or note density of the musical instrument digital interface input. The velocity filter processes notes of the musical instrument digital interface input so as to adjust the note velocity from what is otherwise specified by the input.
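The two filters can be sketched as simple functions: a note filter that passes or rejects notes by range and channel, and a velocity filter that rescales and clamps velocity to the 0–127 MIDI range. The parameter names and the scale-then-clamp behavior are assumptions made for the sketch.

```python
def velocity_filter(velocity: int, floor: int = 0, ceiling: int = 127,
                    scale: float = 1.0) -> int:
    """Adjust a note's velocity from what the MIDI input specified:
    scale it, then clamp it to the [floor, ceiling] range."""
    v = int(velocity * scale)
    return max(floor, min(ceiling, v))

def note_filter(note: int, channel: int,
                note_range: tuple = (0, 127), channels: set = None) -> bool:
    """Pass a note only if it falls within the configured note range and,
    when a channel set is given, arrives on one of those channels."""
    lo, hi = note_range
    if not (lo <= note <= hi):
        return False
    return channels is None or channel in channels
```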
The musical instrument digital interface input can be compatible with the MIDI interface specified by the MIDI (Musical Instrument Digital Interface) Manufacturers Association (MMA) protocol specification. The illustrated embodiment operates in accordance with the MMA MIDI standard, and therefore MIDI commands, such as Note-On, Note-Off, and Control Change, are used to initiate actions pursuant to the Trigger events, to provide the various graphic behaviors and transformations to the scene objects.
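Per the MIDI 1.0 specification, the commands named above are identified by the high nibble of a status byte, with the low nibble carrying the channel. A minimal decoder for the three command types mentioned:

```python
def decode_status(status_byte: int) -> tuple:
    """Decode a MIDI status byte into (message_type, channel).
    The high nibble selects the message (0x8 Note-Off, 0x9 Note-On,
    0xB Control Change); the low nibble is the channel (0-15)."""
    kinds = {0x8: "note_off", 0x9: "note_on", 0xB: "control_change"}
    kind = kinds.get(status_byte >> 4, "other")
    return kind, status_byte & 0x0F
```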
Spline nodes are automatically given an initial position in the scene description based on the camera position, just as with regular scene objects. The initial position can be conveniently modified in a graphical interface by clicking the node on the display and dragging it with the computer mouse. The spline currently being edited appears in the playback window 402 as a segmented white line punctuated by balls. The balls are the spline nodes and indicate the actual positions in the scene that will be occupied by the object as it travels the spline path. When a spline motion is rendered the indexing balls are hidden from view. An object may also be positioned anywhere along the length of a spline rather than placing it at the aforementioned spline generation nodes.
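The object's travel along the node-defined path can be sketched as a position lookup parameterized from 0 to 1. For brevity, the sketch interpolates linearly between nodes; an actual spline renderer would interpolate smoothly (e.g., with Catmull-Rom segments) through the same node positions.

```python
def point_on_path(nodes: list, t: float) -> tuple:
    """Return the (x, y, z) position at parameter t in [0, 1] along a
    piecewise-linear path through the given spline node positions."""
    if t <= 0.0:
        return nodes[0]
    if t >= 1.0:
        return nodes[-1]
    segments = len(nodes) - 1
    s = t * segments          # which segment, and how far along it
    i = int(s)
    frac = s - i
    (x0, y0, z0), (x1, y1, z1) = nodes[i], nodes[i + 1]
    return (x0 + frac * (x1 - x0),
            y0 + frac * (y1 - y0),
            z0 + frac * (z1 - z0))
```

This also illustrates the last point above: an object can be positioned anywhere along the path's length by choosing t, not only at the node positions themselves.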
The illustrated embodiment provides an environment in which a digital music data stream interfaces with a graphics API or function library, allowing 3D video graphics objects to be manipulated in real time by a digital musical instrument, such as a MIDI-capable instrument, or a music playback file (e.g., *.WAV or *.AIFF). In this way, the described technique receives a musical instrument input stream and uses it to provide the kinetic on-screen activity for a 3D scene description, such that when the musical input stream stops, the scene activity on the display screen stops. Autonomous moving environments can be created that are similar to many video games that dictate the action and force the user's responses for navigation, avoidance, and targeting purposes.
The processor 1202 also responds to input devices 1206 that receive user input, including such input devices as a computer keyboard, mouse, and other similar devices. The computer includes memory 1208, typically provided as volatile (dynamic) memory for storing program instructions, operating data, and so forth. The datastore 1210 is typically non-volatile memory, such as data disks or disk arrays. The computer can also include a program product reader 1212 that receives externally accessible media 1214 such as flash drives, optical media discs, and the like. Such media 1214 can include program instructions, comprising program products, that can be read by the reader 1212 and executed by the processor 1202 to provide the operation described herein. The processor uses a graphics or video card 1216 to visually render the objects in a scene description according to the digital music input received through the sound card 1204. The visually rendered graphics output can be viewed at a display device 1218, such as a computer monitor or other visual display device. The sound output 1205 and rendered graphics output 1218 together comprise the rendered scene output, providing a multimedia presentation.
The system 1200 receives digital music in the form of a MIDI data stream that is interpreted by the processor 1202 and is sent as instructions to a sound card to play effects and musical notes, which may be stored as *.WAV files or in other suitable formats, or to be synthesized directly by the music application software. The same MIDI stream, which may be delivered from a live performance or from a stored MIDI file, is also processed by the software described herein and is delivered to the computer's video processor for output to the video display monitor as the rendered scene. It should be understood that the computer sound card 1204 could be replaced by a dedicated hardware music synthesizer.
The scene description described above can be stored as a data file in the computer memory 1208 or can be stored as a data file in the datastore 1210. As noted above, the datastore can comprise internal storage such as a hard disk, or can comprise external or auxiliary storage, such as removable disk media or flash drives or network connected datastore devices. The processor 1202 can execute instructions to provide a graphics engine capability for rendering the scene description, or can provide such a graphics rendering engine in conjunction with processing by the graphics card 1216.
In this way, the scene description file 1304 provides a means of encapsulating scene description data in a single, unitary data file that can be recognized by the operating system of the host computer so that the MidiPainter application can receive all the data necessary for rendering a scene and can automatically perform rendering in response to digital music input, in accordance with the scene description data. Thus, a user can work through a convenient user interface for editing a scene description using the MidiPaintBox module, and the MidiPainter module can process the self-contained scene description file and automatically utilize system resources and perform internal data transfers necessary to render the scene in a fashion that is transparent to the user. That is, users need not concern themselves with linking various data files and specifying system resources to be used for rendering, and edits to the scene are easily accomplished through the MidiPaintBox module and stored in the scene description data file for processing by the MidiPainter module.
The present invention has been described above in terms of a presently preferred embodiment so that an understanding of the present invention can be conveyed. There are, however, many configurations for graphics-musical interface interaction not specifically described herein but with which the present invention is applicable. The present invention should therefore not be seen as limited to the particular embodiments described herein, but rather, it should be understood that the present invention has wide applicability with respect to graphics-musical interface interaction systems generally. All modifications, variations, or equivalent arrangements and implementations that are within the scope of the attached claims should therefore be considered within the scope of the invention.
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 60/912,654 entitled “Digital Music Input Rendering For Graphical Presentations” by Harvey W. Starr et al., filed Apr. 18, 2007. Priority of the filing date is hereby claimed, and the disclosure of the application is hereby incorporated by reference.