The present application relates generally to playing media and, more specifically, to playing multilayered media.
Music includes several layers, such as vocals, guitar, drums, etc. Each layer can have a unique sound while sharing a similar tempo and pace with the other layers. Combined, the layers of music form a musical composition.
Playback applications allow digital devices to play music and videos. Playback applications generally play entire compositions that include several layers of music. As playback applications grow more complex, there is a need for controlling the playback of individual layers of music.
A method of operating a device for playback of a multilayered media file is provided. The method comprises receiving one or more attributes of a musical program of a multilayered media file comprising a plurality of musical programs. The method also comprises receiving a command related to the musical program of the multilayered media file. The method also comprises outputting the multilayered media file based on the attributes and the command.
An apparatus configured for playback of a multilayered media file is provided. The apparatus comprises a speaker, a display, and one or more processors. The one or more processors are configured to receive one or more attributes of a musical program of a multilayered media file comprising a plurality of musical programs, receive a command related to the musical program of the multilayered media file, and cause the speaker and the display to output the multilayered media file based on the attributes and the command.
A computer readable medium configured to store program instructions for playback of a multilayered media file is provided. The program instructions are configured to cause one or more processors to receive one or more attributes of a musical program of a multilayered media file comprising a plurality of musical programs, receive a command related to the musical program of the multilayered media file, and cause a speaker and a display to output the multilayered media file based on the attributes and the command.
Before undertaking the DETAILED DESCRIPTION below, it may be advantageous to set forth definitions of certain words and phrases used throughout this patent document: the terms “include” and “comprise,” as well as derivatives thereof, mean inclusion without limitation; the term “or” is inclusive, meaning and/or; the phrases “associated with” and “associated therewith,” as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like; and the term “controller” means any device, system or part thereof that controls at least one operation; such a device may be implemented in hardware, firmware or software, or some combination of at least two of the same. It should be noted that the functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. Definitions for certain words and phrases are provided throughout this patent document, and those of ordinary skill in the art should understand that in many, if not most instances, such definitions apply to prior, as well as future, uses of such defined words and phrases.
For a more complete understanding of the present disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts:
The electronic device 102 can be a standalone device and includes an antenna 105, a radio frequency (RF) transceiver 110, transmit (TX) processing circuitry 115, a microphone 120, and receive (RX) processing circuitry 125. The electronic device 102 also includes a speaker 130, a processing unit 140, an input/output (I/O) interface (IF) 145, a keypad 150, a display 155, and a memory 160. The electronic device 102 could include any number of each of these components.
The processing unit 140 includes processing circuitry configured to execute instructions, such as instructions stored in the memory 160 or internally within the processing unit 140. The memory 160 includes a basic operating system (OS) program 161 and one or more applications 162. The electronic device 102 could represent any suitable device. In particular embodiments, the electronic device 102 represents a mobile telephone, smartphone, personal digital assistant, tablet computer, a touchscreen computer, and the like. The electronic device 102 plays multilayered media.
The RF transceiver 110 receives, from the antenna 105, an incoming RF signal transmitted by a base station or other device in a wireless network. The RF transceiver 110 down-converts the incoming RF signal to produce an intermediate frequency (IF) or baseband signal. The IF or baseband signal is sent to the RX processing circuitry 125, which produces a processed baseband signal (such as by filtering, decoding, and/or digitizing the baseband or IF signal). The RX processing circuitry 125 can provide the processed baseband signal to the speaker 130 (for voice data) or to the processing unit 140 for further processing (such as for web browsing or other data). The RF transceiver 110 can include one or more of a Bluetooth transceiver, a Wi-Fi transceiver, an infrared (IR) transceiver, and so on; no limitation on the type of transceiver is to be inferred.
The TX processing circuitry 115 receives analog or digital voice data from the microphone 120 or other outgoing baseband data (such as web data, e-mail, or interactive video game data) from the processing unit 140. The TX processing circuitry 115 encodes, multiplexes, and/or digitizes the outgoing baseband data to produce a processed baseband or IF signal. The RF transceiver 110 receives the outgoing processed baseband or IF signal from the TX processing circuitry 115 and up-converts the baseband or IF signal to an RF signal that is transmitted via the antenna 105.
In some embodiments, the processing unit 140 includes one or more processors, such as central processing unit (CPU) 142 and graphics processing unit (GPU) 144, embodied in one or more discrete devices. In some embodiments, the CPU 142 and the GPU 144 are implemented as one or more integrated circuits disposed on one or more printed circuit boards. The memory 160 is coupled to the processing unit 140. In some embodiments, part of the memory 160 represents a random access memory (RAM), and another part of the memory 160 represents a Flash memory acting as a read-only memory (ROM).
In some embodiments, the memory 160 is a computer readable medium that stores program instructions to play multilayered media. When the program instructions are executed by the processing unit 140, the program instructions cause one or more of the processing unit 140, CPU 142, and GPU 144 to execute various functions and programs in accordance with embodiments of this disclosure.
The processing unit 140 executes the basic OS program 161 stored in the memory 160 in order to control the overall operation of electronic device 102. For example, the processing unit 140 can control the RF transceiver 110, RX processing circuitry 125, and TX processing circuitry 115 in accordance with well-known principles to control the reception of forward channel signals and the transmission of reverse channel signals.
The processing unit 140 is also capable of executing other processes and programs resident in the memory 160, such as operations for playing multilayered media as described in more detail below. The processing unit 140 can also move data into or out of the memory 160 as required by an executing process. In some embodiments, the processing unit 140 is configured to execute a plurality of applications 162. The processing unit 140 can operate the applications 162 based on the OS program 161 or in response to a signal received from a base station. The processing unit 140 is coupled to the I/O interface 145, which provides electronic device 102 with the ability to connect to other devices, such as laptop computers, handheld computers, and server computers. The I/O interface 145 is the communication path between these accessories and the processing unit 140.
The processing unit 140 is also optionally coupled to the keypad 150 and the display 155. An operator of electronic device 102 uses the keypad 150 to enter data into electronic device 102. The display 155 may be a liquid crystal display, light emitting diode (LED) display, or other display capable of rendering text and/or at least limited graphics from web sites. The display 155 may be a touchscreen which displays the keypad 150. Alternate embodiments may use other types of input/output devices and displays.
Gesture inputs 212 include one or more touch gestures that indicate when and how touchscreen 202 is being touched. Gesture inputs 212 include a tap gesture, a long press gesture, and a drag gesture. With a tap gesture or a long press gesture, a touch starts and ends at substantially the same point on touchscreen 202 on display 155 of electronic device 102. With a tap gesture, the touch is held at substantially the same point on touchscreen 202 on display 155 for a substantially short period of time, such as with a threshold for the short period of time of 0.5 seconds or less. With a long press gesture, the touch is held at substantially the same point on touchscreen 202 on display 155 for a longer period of time, such as with a threshold for the longer period of time of 0.5 seconds or more. Additional thresholds may be used for a long press gesture, with each threshold associated with a different action to be taken by the application 162. With a drag gesture, the touch is at least partially moved while it is being held on touchscreen 202 of display 155 of electronic device 102 and is held until the touch is released.
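The tap, long press, and drag distinction described above can be sketched as a simple classifier. The 0.5-second threshold follows the text; the small movement tolerance (in pixels) used to decide whether the touch "stayed at substantially the same point" is an assumption for illustration.

```python
TAP_THRESHOLD_S = 0.5    # held 0.5 s or less at the same point -> tap
MOVE_TOLERANCE_PX = 10   # movement beyond this -> drag (assumed value)

def classify_gesture(start_xy, end_xy, duration_s):
    """Classify a single touch by how far it moved and how long it was held."""
    dx = end_xy[0] - start_xy[0]
    dy = end_xy[1] - start_xy[1]
    moved = (dx * dx + dy * dy) ** 0.5 > MOVE_TOLERANCE_PX
    if moved:
        return "drag"
    return "tap" if duration_s <= TAP_THRESHOLD_S else "long_press"
```

A real touch framework reports these events itself; the sketch only shows how the thresholds partition the gesture space.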
Output from application engine 206 is displayed on the display 155 and output from sound engine 220 is played on speaker 130. The combination of application engine 206 and sound engine 220 forms an application, such as application 162. Display 155 comprises touchscreen 202. When displayed, output from application engine 206 can be shown to simulate beam break hardware 204 on display 155.
Multilayered media file 216 comprises a plurality of musical programs, such as media files 210, that each comprises one or more audio files and video files. Multilayered media file 216 includes definition file 208. Each of the musical programs comprises a subset of a predetermined musical composition, shown in
Definition file 208 describes media files 210 and one or more beam layouts for application engine 206 and sound engine 220. Based on the information of definition file 208, application engine 206 and sound engine 220 determine specific timings for when media files 210 are played based on one or more of gesture inputs 212 and beam break inputs 214.
Definition file 208 is illustrated as an extensible markup language (XML) file comprising one or more elements described using one or more tags and attributes. An alternative file format can be used without departing from the scope of the present disclosure.
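A definition file of the kind described above can be read with any standard XML parser. The following sketch parses a hypothetical fragment; the element and attribute names (Program, Beams, Beam, Sections, Section) follow the text, while the specific values are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment in the style of definition file 208.
DEFINITION_XML = """\
<Program Name="Cool Jazz" Artist="Beamz Original">
  <Beams>
    <Beam Title="Crash" Trigger="OneShot" LoopInterval="1"/>
  </Beams>
  <Sections>
    <Section Name="Intro" Volume="37"/>
  </Sections>
</Program>
"""

program = ET.fromstring(DEFINITION_XML)
beams = program.findall("Beams/Beam")

song_name = program.get("Name")          # "Cool Jazz"
trigger_type = beams[0].get("Trigger")   # "OneShot"
```

An application engine and sound engine would walk this tree to build the beam layout and playback timings.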
Definition file 208 includes comment 302, which states “<!--Program-->”. Comment 302 indicates that the definition file includes a program.
Definition file 208 also includes “Program” element 304 comprising a plurality of attributes and additional elements. The attributes comprise name-value pairs that describe multilayered media file 216. The attributes include:
Program element 304 includes a “Beams” element 306. Beams element 306 includes one or more “Beam” elements 308, which is further described in
Program element 304 includes a “BeamAssignmentsU4” element 310. BeamAssignmentsU4 element 310 includes one or more “Assign” elements 312. Assign element 312 includes attributes that define a beam assignment. The attributes of Assign element 312 include:
Program element 304 includes a “TriggerVolumesU4” element 314. TriggerVolumesU4 element 314 includes one or more “Volume” elements 316. Volume element 316 includes attributes that define volume of a beam relative to a master volume of multilayered media file 216. The attributes of Volume element 316 include:
Program element 304 includes a “Sections” element 318. Sections element 318 includes one or more “Section” elements 320. Section element 320 includes attributes that define sections of an audio file associated with a musical program or layer of multilayered media file 216. The attributes of Section element 320 include:
“Volume=“37””, which indicates an amount by which to adjust the volume of the section of multilayered media file 216.
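The relative-volume behavior described for the Volume attributes above can be sketched as a simple scaling computation. The 0-100 scale for both values is an assumption for illustration; the document does not prescribe a range.

```python
def effective_volume(master_volume, relative_volume):
    """Scale a beam's or section's relative volume (0-100) by the master
    volume (0-100) of the multilayered media file."""
    return master_volume * relative_volume / 100.0
```

For example, a section with Volume="37" plays at 37% of whatever the master volume currently is.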
Definition file 208 includes at least one “Beam” element 308 comprising a plurality of attributes and additional elements. The attributes comprise name-value pairs that describe a beam or trigger of multilayered media file 216. The attributes include:
“LoopInterval=“1””, which indicates a loop interval of Beam element 308 comprises a value of 1;
“Trigger=“OneShot””, which is a categorical value that indicates a type of trigger as one of a “OneShot” trigger, a “StartStop” trigger, a “Pulsed” trigger, and a swap sounds trigger, Beam element 308 being a “OneShot” trigger;
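The four trigger types named above can be sketched as a dispatch on the Trigger attribute. The state returned is a placeholder; a real sound engine would start, stop, pulse, or swap the underlying audio layer. The behavior assigned to each branch is an assumption based on the trigger names.

```python
def handle_trigger(trigger_type, layer_playing):
    """Return a layer's next playing state for a given trigger type."""
    if trigger_type == "OneShot":
        return True                 # play the layer once through
    if trigger_type == "StartStop":
        return not layer_playing    # toggle the layer on each trigger
    if trigger_type == "Pulsed":
        return True                 # retrigger at the configured pulse rate
    if trigger_type == "SwapSounds":
        return layer_playing        # keep state; swap the assigned sound
    raise ValueError(f"unknown trigger type: {trigger_type}")
```

A definition file's Trigger attribute value would select the branch taken for each beam.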
Beam element 308 includes a “Regions” element 402. Regions element 402 includes one or more “Region” elements 404, 408, 416.
Region element 404 includes a “Name” attribute with a value of “Ending” that indicates a name of Region 404. Region element 404 includes a “Title” attribute with a value of “Crash” indicating a title of Region element 404. Region element 404 includes an empty “Segments” element 406.
Region element 408 includes a “Name” attribute with a value of “Default” that indicates a name of Region 408. Region element 408 includes a “Title” attribute with a value of “Crash” indicating a title of Region element 408. Region element 408 includes a “Segments” element 410, which comprises “Segment” elements 412 and 414. Attributes of Segment element 412 include:
GUI 502 includes several user interface (UI) elements to manipulate multilayered media playback. GUI 502 is displayed on touchscreen 202 to allow a user to interact with the UI elements of GUI 502.
Text elements 504 and 506 provide information about current multilayered media file 216. Text element 504 indicates a song name of multilayered media file 216 is “Cool Jazz”, as specified in the Name attribute of Program element 304 of definition file 208 of multilayered media file 216. Text element 506 indicates a name of an artist of multilayered media file 216 is “Beamz Original”, as specified in the Artist attribute of Program element 304 of definition file 208 of multilayered media file 216.
Display of beam 512 on GUI 502 includes text elements 508 and 510. Beam 512 is defined by Beam element 308 of
At step 602, processing unit 140 receives attributes of a musical program of multilayered media file 216, which comprises a plurality of musical programs. The attributes of the musical program of the multilayered media file comprise one or more values each related to one of: a description of the musical program, a pulse rate of the musical program, a pulse delay of the musical program, a trigger of the musical program, a volume of the musical program, and a time shift of the musical program. The values related to the trigger comprise an indication of a type of the trigger and a debounce value of the trigger. The type of the trigger comprises one of a one shot trigger, a start stop trigger, a pulsed trigger, and a swap sounds trigger. The value related to the volume of the musical program is relative to a volume of the multilayered media file. The value related to the time shift of the musical program is a time shift relative to playback of the multilayered media file.
At step 604, processing unit 140 receives a command related to a musical program of multilayered media file 216. The command is a trigger that controls the musical program of the multilayered media file.
At step 606, processing unit 140 outputs multilayered media file 216 based on the one or more attributes and the command. Each of the musical programs of multilayered media file 216 comprises a subset of a predetermined musical composition, and the musical programs are correlated with each other. Each of the musical programs of multilayered media file 216 comprises sound elements configured to generate sympathetic musical sounds.
At step 608, processing unit 140 displays a description of a musical program of multilayered media file 216. The description is defined in definition file 208 of multilayered media file 216.
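Steps 602 through 608 can be sketched end to end as a small function: receive the attributes of a musical program, receive a trigger command, and produce an output state including the description to be displayed. The dict-based representation and the "trigger" command value are assumptions for illustration; the document does not prescribe a data structure.

```python
def play_multilayered(attributes, command):
    """Combine a musical program's attributes with a received command
    into a playback output state (steps 602-608)."""
    return {
        "playing": command == "trigger",            # step 604/606: command controls playback
        "volume": attributes.get("Volume", 100),    # step 602: relative volume attribute
        "description": attributes.get("Description", ""),  # step 608: shown on the display
    }
```

A full implementation would drive the speaker and display from this state rather than returning it.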
Although the present disclosure has been described with an exemplary embodiment, various changes and modifications may be suggested to one skilled in the art. It is intended that the present disclosure encompass such changes and modifications as fall within the scope of the appended claims.
The present application is a continuation-in-part of U.S. patent application Ser. No. 14/088,178, filed Nov. 22, 2013, entitled “APPARATUS AND METHOD FOR MULTILAYERED MUSIC PLAYBACK”. The present application claims priority to U.S. Provisional Patent Application Ser. No. 61/863,824, filed Aug. 8, 2013, entitled “APPARATUS AND METHOD FOR LAYERED MUSIC PLAYBACK”; the contents of the above-identified patent documents are incorporated herein by reference.
Number | Date | Country
---|---|---
61863824 | Aug 2013 | US
 | Number | Date | Country
---|---|---|---
Parent | 14088178 | Nov 2013 | US
Child | 14165449 | | US