The disclosed embodiments relate generally to music-related processing, and more particularly to techniques that enable a group of users to create and participate in a collaborative music session.
Advances in recording devices and virtual instruments have allowed users to more easily create, record, and edit music in the digital realm. The proliferation of computers in various forms has made both the creation and playback of music recordings accessible to users, musicians and non-musicians alike, without the need for music studios, expensive equipment, and the like. The rising popularity of mobile devices, such as portable laptops and smartphones, which can function as virtual musical instruments, has enabled users to make music with ease.
In the following description, for the purposes of explanation, specific details are set forth in order to provide a thorough understanding of embodiments of the invention. However, it will be apparent that various embodiments may be practiced without these specific details.
Certain embodiments of the invention allow two or more users to form a band, jam together, and save a musical memento of the jam session. Two or more computing devices (e.g., tablet computers, laptops, desktops, etc.) operating music creation and recording software can be communicatively coupled together (e.g., wirelessly, hardwired, etc.) to provide a synchronized real-time jamming experience. Each jam session can have a band leader, who may be the user operating a host device that creates the jam session, and one or more band members, who may operate client devices to join the jam session. Tasks performed by the band leader may include creating a jam session, selecting one or more songs for the jam session, operating playback and recording controls (e.g., playback, record, rewind, fast forward functions, and the like), verifying song architecture uniformity (e.g., tempo, time signature, key signature, etc.), and collecting the recordings from the devices involved in the jam session after the session is complete. In some embodiments, if a jam session is interrupted (e.g., school break is over, network failure, etc.), the participants can pick up the session and continue where they left off. Once the band members of the jam session are satisfied with the result, the band leader (i.e., host) can either manually or automatically collect the recordings of each band member via the communicative coupling means (e.g., wireless coupling) and archive a complete recording of the session for subsequent playback, editing, or further jam sessions (i.e., “sessioning”).
System Architecture
It should be appreciated that musical performance system 1000 as shown in
In some embodiments, display subsystem 1005 can provide an interface that allows a user to interact with musical performance system 1000. The display subsystem 1005 may be a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), a projection device, a touch screen, and the like. In general, use of the term “output device” is intended to include all possible types of devices and mechanisms for outputting information from musical performance system 1000. For example, a software keyboard may be displayed using a flat-panel screen. In some embodiments, the display subsystem 1005 can be a touch-sensitive interface (also sometimes referred to as a touch screen), where the display serves both as an interface for outputting information to a user of the device and as an interface for receiving inputs. In other embodiments, there may be separate input and output subsystems. Through the display subsystem 1005, the user can view and interact with a GUI (Graphical User Interface) 1020 of musical performance system 1000. Processing unit(s) 1010 can include one or more processors that each have one or more cores. In some embodiments, processing unit(s) 1010 can execute instructions stored in storage subsystem 1015.
Communications system 1060 can include various hardware, firmware, and software components to enable electronic communication between multiple computing devices. Communications system 1060 or components thereof can communicate with other devices via Wi-Fi, Bluetooth, infra-red, or any other suitable communications protocol that can provide sufficiently fast and reliable data rates to support the real-time jam session functionality described herein.
Storage subsystem 1015 can include various memory units such as a system memory 1030, a read-only memory (ROM) 1040, and a non-volatile storage device 1050. The system memory can be a read-and-write memory device or a volatile read-and-write memory, such as dynamic random access memory. System memory 1030 can store some or all of the instructions and data that the processor(s) or processing unit(s) need at runtime. ROM 1040 can store static data and instructions that are used by processing unit(s) 1010 and other modules of system 1000. Non-volatile storage device 1050 can be a read-and-write capable memory device. Embodiments of the invention can use a mass-storage device (such as a magnetic or optical disk or flash memory) as a permanent storage device. Other embodiments can use a removable storage device (e.g., a floppy disk, a flash drive) as a non-volatile (e.g., permanent) storage device.
Storage subsystem 1015 can store MIDI (Musical Instrument Digital Interface) data 1034 relating to music played on virtual instruments of the musical performance system 1000, song architecture data 1032 to store song architecture parameters (which may be a subset of general song data) for each jam session, and collected recordings 1036 for storing collected tracks after each jam session. Storage subsystem 1015 may also store audio recording data and general song data (e.g., with track and instrument data). For MIDI-based tracks, MIDI data may be stored. Similarly, for audio-based tracks, audio data can be stored (e.g., audio files such as .wav, .mp3, and the like). Further details regarding the system architecture and its auxiliary components (e.g., input/output controllers, memory controllers, etc.) are not discussed at length so as not to obscure the focus of the invention and would be understood by those of ordinary skill in the art.
Jam Session Interface
Certain embodiments described herein can be implemented by any suitable electronic device with music creation and recording capabilities. An example of such an electronic device is a device that is capable of executing a music creation and recording application (referred to herein as a “music application”). An example of such an application is the GarageBand™ application provided by Apple Inc. A variety of different devices, including but not limited to tablet computers, smart phones, laptops, PDAs, and the like, may provide this functionality.
According to certain embodiments, Jam Session functionality can be provided as a feature by the music application. For example, the GarageBand™ application may provide a user-selectable option (e.g., a UI icon or control button in a control bar of GarageBand™), which when selected by a user, invokes the jam session functionality described herein. When a host establishes a jam session on a host device, the icon can indicate that a jam session is active. For example, the icon can change color, appear illuminated, flash, or perform some other visual cue to alert the host that a jam session is active. In other embodiments, other indicators can be used to provide the user with a visual notification that a Jam Session is in progress. As each new band member (i.e., client and client device) joins the Jam Session, their UI icons can also indicate that they have joined the host's Jam Session. In one embodiment, a client can join a session while a Jam Session pop-up menu is open on the host device. If the pop-up menu is closed, subsequent join requests may be automatically denied.
A user of a musical performance system 1000 may create a new jam session. As the creator of the jam session, the user may also be referred to as the band leader or host. For a jam session created by the band leader, the band leader can control the permissions and privileges associated with that jam session. For example, in certain embodiments, the band leader can limit the functionalities provided to other participants in that jam session. In some embodiments, the music application may provide a “bandleader control” feature 820 that allows a band leader to set certain permissions for participants in the jam session. For example, the jam session application may provide transport or playback controls including Stop, Play, Record, Fast Forward, Rewind, and the like. A band leader can limit access to the transport controls to himself/herself or may alternatively share the controls between a number of clients participating in a jam session. In one embodiment, if the bandleader control is turned off, all parties to the jam session may have transport control access. In cases where multiple clients initiate transport control commands, the last change may be applied to the jam session. For example, if three band members initiated three different transport controls, the last initiated transport control would apply to the jam session.
In some embodiments, the play head position (i.e., the time position in a song) of the device that initiated play or record can be transmitted to all other peer devices to initiate a “Play” command. Peer devices can include all devices in the jam session, particularly when a bandleader-client hierarchical relationship is not established (i.e., when the bandleader control is turned off), such that each device included in a jam session has equal functionality (e.g., each peer can perform bandleader-like functions such as recording, collecting tracks, controlling the transport controls of other peers, etc.). For example, as Play or Record functions are initiated, each peer device can see the same arrange area (and thus song section) as the peer who initiated the Play or Record function. In some cases, if a peer (e.g., host or client) presses Record, the peer's device begins recording while the other peer devices of the current Jam Session are placed in Play mode. As such, with the bandleader control off, any peer can record (e.g., punch in or punch out) at any time.
If the bandleader control is turned on, transport control access is limited to the host. In some embodiments, the transport controls are disabled on client devices and may appear “grayed out” or include some other visual cue to indicate that access is presently denied. In this mode, the host alone can operate the transport controls on all devices in the Jam Session simultaneously. For example, if Play or Record is pressed on the host device, the play head position of the host device is transmitted to all other devices and a Play command is executed. In other words, as Play or Record functions are initiated, all peer devices should see the same arrange area (and thus song section) as the peer who initiated the Play or Record function.
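As a purely illustrative sketch of the access model described in the preceding paragraphs (not the actual application code, with type and member names such as TransportCommand and JamSessionState assumed for illustration), the permission gating and last-command-wins behavior might look like the following in Swift:

```swift
import Foundation

// Hypothetical transport commands available in a jam session.
enum TransportCommand {
    case play, stop, record, fastForward, rewind
}

// Minimal session state: whether bandleader control is on and who the host is.
struct JamSessionState {
    var bandleaderControlEnabled: Bool
    var hostID: String
}

// Decide whether a peer's transport command should be applied.
// With bandleader control on, only the host may operate the transport;
// with it off, any peer may, and the most recent accepted command simply wins.
func shouldApply(_ command: TransportCommand,
                 from peerID: String,
                 in session: JamSessionState) -> Bool {
    guard session.bandleaderControlEnabled else {
        return true   // all peers share the transport controls
    }
    return peerID == session.hostID
}

// Example: three peers issue commands; the last accepted one is applied.
let session = JamSessionState(bandleaderControlEnabled: false, hostID: "host-device")
let requests: [(String, TransportCommand)] = [
    ("client-a", .play), ("client-b", .rewind), ("client-c", .record)
]
let applied = requests.filter { shouldApply($0.1, from: $0.0, in: session) }.last
print("Applied command: \(String(describing: applied))")
```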
Some of the embodiments described herein incorporate “slide-out” notifications, which may be displayed in response to various Jam Session events (e.g., a client leaves a session). For example, when a client device disconnects from a session, a slide-out notification may be displayed on the host device and/or the remaining client devices in the jam session, informing them that a client device has left the jam session. In certain cases, a slide-out notification can be displayed such that it is semi-transparent and overlaid on the window. The semi-transparency of the notification allows a user to read the information conveyed by the notification but also enables the user to see through the notification to the underlying layer (e.g., virtual instrument controls) and to touch “through” to the controls covered by the notification. For example, the transport controls underneath the slide-out notification can still be touched while the notification is displayed.
Synchronization and Transport Control
To create and maintain a Jam Session between multiple peers, the host and client devices within the Jam Session should be synchronized with one another to ensure that transport control operations are aligned. In certain embodiments, each participating device in the Jam Session establishes a common time base (i.e., absolute time) to synchronize operations. A Transport Control State Machine (TCSM), which is operated by the host device, receives transport control requests from both the host and client devices. Transport control requests can include play, pause, rewind, fast forward, record, and the like. The TCSM processes the transport control requests and sends the corresponding actions (e.g., play, record, etc.) to all participating devices in the Jam Session. In some embodiments, a command (e.g., a play command) from the TCSM first passes through the various layers of the host device operating system (OS), is transmitted via a wireless network to one or more client devices, and then passes through the various layers of the one or more client devices' OS to finally be executed on the target devices (i.e., the client device executes the play command). In some cases, each of these stages (e.g., network, operating system, and execution) can add a delay to the transport control information. Depending on the devices used and the various operating system operations therein, the total delay time may differ from one device to the next. Thus, short time delays and good synchronization between peer devices help ensure that users participating in a Jam Session experience what would appear, for all practical purposes, to be synchronized and simultaneous operation between devices.
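One way to picture the TCSM's fan-out of actions, as a minimal sketch with assumed names (TransportControlStateMachine, JamSessionPeer, etc.) rather than the actual implementation, is a small state machine that accepts requests from any peer and broadcasts the resulting action to every participant:

```swift
import Foundation

// Illustrative transport actions handled by the state machine.
enum TransportAction {
    case play, pause, record, rewind, fastForward
}

// A participant that can receive and execute transport actions.
// In a real system this would sit behind the network layer.
protocol JamSessionPeer {
    var name: String { get }
    func execute(_ action: TransportAction)
}

struct LoggingPeer: JamSessionPeer {
    let name: String
    func execute(_ action: TransportAction) {
        print("\(name) executes \(action)")
    }
}

// A minimal Transport Control State Machine: it accepts requests from any
// peer (host or client) and sends the resulting action to every participant.
final class TransportControlStateMachine {
    private var peers: [JamSessionPeer] = []
    private(set) var currentAction: TransportAction = .pause

    func add(peer: JamSessionPeer) { peers.append(peer) }

    func handleRequest(_ action: TransportAction, from requester: String) {
        currentAction = action
        // Broadcast the corresponding action to all participating devices.
        for peer in peers { peer.execute(action) }
    }
}

// Example: the host's state machine fans a record request out to everyone.
let tcsm = TransportControlStateMachine()
tcsm.add(peer: LoggingPeer(name: "host"))
tcsm.add(peer: LoggingPeer(name: "client-1"))
tcsm.handleRequest(.record, from: "client-1")
```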
Time stamping can be used to address the time delay and synchronization issues discussed above. In some embodiments, the Transport Control State Machine attaches a timestamp to a transport control command. In some embodiments, the timestamp can be the host device's absolute time plus an offset that is larger than the longest latency (e.g., network latency plus device latency) that may potentially occur between devices. The offset can be a predetermined static value or may be dynamically optimized at runtime. In some cases, if the client/host device knows its associated latencies (e.g., device and/or network latency), these values can be subtracted from the time-stamped transport command to more accurately determine when to execute the transport command.
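A minimal sketch of the timestamping idea follows, assuming a shared absolute clock and a known per-device latency; the type names, function names, and numbers are hypothetical and only illustrate one possible reading of the scheme above:

```swift
import Foundation

// Schedule a transport command relative to a shared absolute time base.
// The host stamps the command with "now + safetyOffset", where the offset is
// chosen to exceed the worst-case network-plus-device latency between peers.
struct TimestampedCommand {
    let name: String
    let executeAt: TimeInterval   // seconds on the shared absolute clock
}

func stamp(command name: String,
           hostAbsoluteTime: TimeInterval,
           safetyOffset: TimeInterval) -> TimestampedCommand {
    TimestampedCommand(name: name, executeAt: hostAbsoluteTime + safetyOffset)
}

// A client that knows its own network and device latency can subtract that
// value to decide how long to wait before executing the stamped command.
func waitTime(for command: TimestampedCommand,
              clientAbsoluteTime: TimeInterval,
              knownLatency: TimeInterval) -> TimeInterval {
    max(0, command.executeAt - clientAbsoluteTime - knownLatency)
}

// Example with made-up numbers: a 250 ms safety offset and 60 ms known latency.
let cmd = stamp(command: "play", hostAbsoluteTime: 100.000, safetyOffset: 0.250)
let delay = waitTime(for: cmd, clientAbsoluteTime: 100.020, knownLatency: 0.060)
print("Client should wait \(delay) s before executing \(cmd.name)")
```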
In certain embodiments, each device in a Jam Session can determine device latency by measuring the time between executing a start command on a host device and executing the start command on the client device. An average value can then be generated from a number of such time measurements during the session. In certain embodiments, the time measurements used to determine the latency between devices in a Jam Session can be taken before the users (e.g., host and clients) begin the Jam Session. For example, the first measurement can be generated on the first start of the application, such as during audio engine initialization due to a restart, an audio route change, a background-to-foreground transition, and the like. In other words, the inherent latency in a Jam Session can be determined quickly enough that the latency determination process is practically imperceptible from a user's perspective. Latency can be determined in any number of ways that would be appreciated by one of ordinary skill in the art with the benefit of this disclosure.
Referring to
Round trip calculations can be performed using single round-trip signals or burst signals that include a number of round-trip signals. In some embodiments, the quality or reliability of the latency measurements is judged by their round-trip times. For example, calculations to determine latency may consider that the shorter the round-trip time, the smaller the error associated with the round-trip measurement. In such cases, shorter round-trip times are weighted more heavily than longer ones in latency calculations, particularly in burst signal measurements. In other words, some embodiments may employ a weighted average to determine latency, with more weight given to the shorter measurements. Latencies can be determined in a variety of ways, including the methods discussed herein, various permutations of those methods, and any other suitable method known by those skilled in the art that can synchronize the operation of peer devices in a jam session.
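The weighting described above could, for example, be realized as an inverse-round-trip weighted average; the sketch below is one possible reading under that assumption, not the actual method, and all names are illustrative:

```swift
import Foundation

// One round-trip measurement from a burst: the total round-trip time and the
// one-way latency estimate derived from it (here, half the round trip).
struct RoundTripSample {
    let roundTrip: TimeInterval
    var oneWayEstimate: TimeInterval { roundTrip / 2 }
}

// Weighted average of one-way estimates, with shorter round trips weighted
// more heavily (weight = 1 / roundTrip), reflecting their smaller error.
func estimatedLatency(from samples: [RoundTripSample]) -> TimeInterval? {
    guard !samples.isEmpty else { return nil }
    let weights = samples.map { 1.0 / $0.roundTrip }
    let weightedSum = zip(samples, weights)
        .map { sample, weight in sample.oneWayEstimate * weight }
        .reduce(0, +)
    return weightedSum / weights.reduce(0, +)
}

// Example burst of three measurements (seconds).
let burst = [RoundTripSample(roundTrip: 0.018),
             RoundTripSample(roundTrip: 0.042),
             RoundTripSample(roundTrip: 0.025)]
print("Estimated one-way latency: \(estimatedLatency(from: burst) ?? 0) s")
```

Other weighting schemes (e.g., discarding round trips above a threshold) would fit the same framework.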
According to certain embodiments, each song created during a Jam Session includes a number of song architecture parameters. The Jam Session software checks certain song architectural parameters to determine if the songs in each peer device are similar enough to be combined (e.g., during track collection) and/or used together in a Jam Session. In some embodiments, these parameters are aligned across all devices participating in the Jam Session to ensure that the host song is aligned with each client song.
Song architecture parameters can include song section data, time signature data, tempo data, key signature data, custom chord data, master effects preset selection data, count-in data, and fadeout data. The song architecture parameters can be divided into primary and secondary parameters. The primary parameters can include song section data and time signature data. The secondary parameters may include tempo data, key signature data, custom chord data, master effects preset selection data, count-in data, and fade out data.
In some embodiments where Song Sections and/or Time Signature (i.e., primary parameters) differ between a host song on a host device and a client song on a client device, a new empty song may be created on the client device, and the time signature and song section values of the new song are adapted (e.g., automatically) to match the values of the host song parameters. The other secondary parameters of the client song may also be matched to the values of the host song parameters. In some embodiments where both Song Sections and Time Signature are the same on the client and host devices, but the other song architecture parameter values differ, the client may continue with the current song, but the secondary parameter values are matched to those of the host song. In some embodiments where all song architecture parameter values are the same on the client and the host device, the client may continue with the current client song without any changes. It should be noted that matching song architecture parameters between host and client devices as described can ensure that in case of an unintentional Jam Session shutdown (e.g., network failure, etc.), Jam Session peers can pick up the Jam Session where they left off provided that the primary parameters were not changed during Jam Session shutdown.
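The three outcomes described above can be summarized as a small decision function; the type and field names below (SongArchitecture, ReconciliationAction, etc.) are assumptions for illustration only and simplify the parameter set:

```swift
import Foundation

// Song architecture parameters, split into primary and secondary groups as
// described above. The field names and types here are illustrative.
struct SongArchitecture: Equatable {
    // Primary parameters
    var songSections: [Int]       // e.g., section lengths in bars
    var timeSignature: String     // e.g., "4/4"
    // Secondary parameters
    var tempo: Double
    var keySignature: String
    var countIn: Bool
    var fadeOut: Bool
}

enum ReconciliationAction {
    case keepClientSong                 // everything already matches
    case matchSecondaryParameters       // primaries match; adopt host secondaries
    case createNewSongFromHost          // primaries differ; start an empty song
}

// Decide how a client song should be reconciled with the host song.
func reconcile(client: SongArchitecture, host: SongArchitecture) -> ReconciliationAction {
    let primariesMatch = client.songSections == host.songSections
        && client.timeSignature == host.timeSignature
    if !primariesMatch {
        return .createNewSongFromHost
    }
    return client == host ? .keepClientSong : .matchSecondaryParameters
}

// Example: same sections and time signature, but a different tempo.
let host = SongArchitecture(songSections: [8, 8, 16], timeSignature: "4/4",
                            tempo: 120, keySignature: "C", countIn: true, fadeOut: false)
var client = host
client.tempo = 96
print(reconcile(client: client, host: host))   // matchSecondaryParameters
```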
Starting a Jam Session
Referring to
Referring to
If it is determined in 130 that the client song's song section does not match the host song's song section, then at 145, the client device determines whether the client's song is new or unsaved and not dirty. A song is considered “unsaved and not dirty” if the song is a newly created song and does not contain any significant user input (i.e., the user input is insignificant insofar as altering or removing such user input would not be perceived as data loss by the user). For example, a recorded audio track may be significant, while only changing a secondary parameter with no recorded audio may be insignificant. If it is determined in 145 that the client's song is new or unsaved and not dirty, then at 154, the client device alerts the client that the device is ready to begin the jam session. In some cases, at 154, the client device may optionally alert the client that the host device controls the tempo, length, chords, and other settings for the jam session if the conductor mode of the client device is off. Alternatively, in 154, the client device may alert the client that the host device controls the transport, tempo, length, chords, and other settings for the jam session, if the conductor mode is turned on. In 154, the client device may prompt the client to proceed (e.g., press “OK”) and the method then proceeds to F in
If it is determined in 145 that the client song is neither new nor unsaved and not dirty, then at 156, the client device can alert the client that a new song is being created. As part of 156, in cases when the conductor mode is off, the client device can alert the client that a new song is needed for the jam session and provide an option to save the current song. The client device can further alert the client that the host device controls the tempo, length, and other song settings for the jam session. In cases when the conductor mode is on, the client device can alert the client that a new song is needed for the jam session and provide an option to save the current song. In addition, the client device can further alert the client that the host device controls the transport, tempo, length, and other song settings for the jam session. The client device may then prompt the client to either continue (e.g., press “OK”) or cancel. If the client continues, the method then proceeds to G in
Referring back to 130, if it is determined that the client song's song sections match the host song's song sections, the client device at 140 determines if the client song's time signature matches the host song's time signature. If it is determined in 140 that the client song's time signature does not match the host song's time signature, then processing continues with 145, as described above, where it is determined if the client song is new or unsaved and not dirty. If it is determined in 140 that the client and host songs do share the same time signature, then in 150, the client device determines if the client song's other song architecture parameters (e.g., tempo, custom chords, etc.) match the host song's other song architecture parameters. If they do not match, the method proceeds to 154, as described above. If they do match, at 152, the client device can display an alert that the device is ready to begin the jam session. In some embodiments, as part of 152, the client device may alert the client that the host device controls the tempo, length, chords, and other settings for the jam session if the conductor mode of the client device is off. Alternatively, the client device may alert the client that the host device controls the transport, tempo, length, chords, and other settings for the jam session, if the conductor mode is turned on. The client device prompts the client to proceed (e.g., press “OK”) and the method proceeds to E, as depicted in
Referring to point E of
Referring to point F, the client device at 162 adapts the secondary parameters of the current client song to the secondary parameters of the host song architecture. As described above, further song architectural changes can be initiated by the host device at 170 and the song architecture controls are disabled on the client devices at 175. At 180, if another user wants to join the host initiated jam session, the method returns to 104 of
Referring to point G, if the client song is neither new nor unsaved and not dirty at 156, the client device, at 164, saves the current song and creates a new song. The client device then adapts the current song to the host song architecture at 166, including the primary and secondary parameters. At this stage, further song architectural changes can be initiated by the host device at 170 and the song architecture controls are disabled on the client devices at 175. At 180, if another user wants to join the host initiated jam session, the method returns to 104 of
It should be appreciated that the specific steps illustrated in
Song Architecture Changes Made During a Jam Session
In certain embodiments, if the host or client attempts to alter song architecture parameters during or after a jam session, the song may be rendered incompatible between peers. Alerts may be set up to inform the host and/or clients of these situations and help prevent such issues. In one example, if one or more clients (i.e., client devices) leave a Jam Session, and the host attempts to change a primary parameter (e.g., song sections or time signature), an alert may pop up that informs the host that such changes may render the song incompatible for the missing band members in the event that they wish to rejoin the jam session at a later time. For example, if a client device is momentarily disconnected from a jam session and the host device changes the time signature during the absence of the client device, the client device may not be able to rejoin the jam session because a primary parameter was changed, thus making the host device song and client device song incompatible. It should be noted that the use of alerts, the frequency of their use, and their application to different scenarios can be customized according to each host's or client's preferences. Similarly, if a client device that was previously a participant in a jam session tries to change primary parameters of a song used in the Jam Session while offline, the client device may display a similar alert that changes may render the song incompatible for the jam session.
Referring to
In some embodiments, if the host device changes the song section or time signature at 230, but a client device has not left the jam session since the last song architecture change at 240, the host device can transmit all host song architecture parameters to the one or more clients in the current jam session at 270 and proceed as described above. If a client device has left the session since the last song architecture change, at 240, and the host has already been alerted that changing the song architecture will prevent the former clients from rejoining the session without automatically loading a new song, at 250, then the host device can transmit all host song architecture parameters to the one or more client devices in the current jam session, at 270, and proceed as described above. If the time signature or song sections change at 230, a peer (e.g., client device) has left the jam session since the last song architecture change, at 240, and the host device has not been alerted that a band member is offline, at 250, then the host device alerts the host (e.g., the host device user), at 260, that a band member is offline and that the band member will start with a new song if the song sections or time signature is changed. If the host changes the song sections or time signature on the host device despite the alert, then, at 280, the host device transmits all host song architecture parameters to the one or more clients in the current jam session, at 270, and proceeds as described above. If the host (using the host device) does not change the song sections or time signature, at 280, after receiving the alert, at 260, method 200 can return to 220.
It should be appreciated that the specific steps illustrated in
Song Architecture Changes Made Offline
Referring to
At 320, if the peer (using a host or client device) tries to edit the song sections or the time signature (i.e., primary song architecture parameters), an alert may be displayed on the peer device (e.g., client device) prompting the user to confirm whether they intend to modify the jam session song. The alert may notify the peer (on the peer device) that changing song sections or time signatures can prevent them from re-joining the original jam session. In some cases, the alert may only be shown the first time. Alerts may be enabled, disabled, or configured by host or client devices as needed. If, at 340, the peer device follows through and edits the song sections or changes the time signature of the jam session song, then, at 380, a new song is created on the peer device when rejoining the jam session. If, at 340, the peer does not edit the primary song architecture parameters, but they are changed in the jam session while the peer is offline, at 350, a new song is created on the peer device at 380 when rejoining the jam session. If the primary song architecture parameters are not changed by the peer device, at 340, or in the jam session, at 350, but the peer loads a different song instead, at 360, a new song is created on the peer device when rejoining the jam session at 380. If none of the conditions at 340, 350, or 360 applies, the current song is kept when rejoining the jam session at 390.
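As a sketch of the rejoin decision at 340, 350, 360, and 390 above (with hypothetical names, and simplified into a single function):

```swift
import Foundation

// What a peer finds when it rejoins a jam session after being offline.
// Field names are illustrative, mirroring the checks at 340, 350, and 360.
struct RejoinContext {
    var peerEditedPrimaryParameters: Bool     // sections or time signature edited offline
    var sessionChangedPrimaryParameters: Bool // primaries changed in the session meanwhile
    var peerLoadedDifferentSong: Bool
}

enum RejoinOutcome {
    case createNewSong
    case keepCurrentSong
}

func outcome(for context: RejoinContext) -> RejoinOutcome {
    if context.peerEditedPrimaryParameters
        || context.sessionChangedPrimaryParameters
        || context.peerLoadedDifferentSong {
        return .createNewSong
    }
    return .keepCurrentSong
}

// Example: the peer only tweaked a secondary parameter while offline.
let ctx = RejoinContext(peerEditedPrimaryParameters: false,
                        sessionChangedPrimaryParameters: false,
                        peerLoadedDifferentSong: false)
print(outcome(for: ctx))   // keepCurrentSong
```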
It should be appreciated that the specific steps illustrated in
Accessing Host/Client Songs during Jam Session
Referring to
At 420, if the peer accessing its song browser is a client, the client device displays an alert requesting confirmation to change the song. In some cases, the alert informs the client that they have to leave the current jam session in order to access the song browser. It should be noted that alerts may be optional and can be enabled, disabled, or modified as required. At 435, if the client cancels the song change request, method 400 returns to the ongoing jam session at 405. If the client continues with the song change, the client disconnects from the jam session at 445. Following disconnection of the client, the host device can receive a notification that the client device left the jam session, at 475, and the method returns to the ongoing jam session at 405.
It should be appreciated that the specific steps illustrated in
It should be noted that when a Jam Session is created on a host device, the Jam Session can be assigned a Universally Unique Identifier (UUID). The UUID is typically transmitted during the initial client configuration and can be used to trigger alerts if a user (utilizing a host or client device) tries to change song architecture parameters while offline. The host can optionally use the UUID to automatically access the song directories of clients to find and upload Jam Session songs with a UUID that matches the Jam Session song UUID on the host. This may be useful in cases in which former participants of a Jam Session reconnect and wish to continue working on the song of that particular Jam Session but loaded a different song while offline.
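A minimal sketch of the UUID matching step, assuming a hypothetical SavedSong record that carries the UUID of the jam session it belongs to:

```swift
import Foundation

// A saved song along with the UUID of the jam session it belongs to.
// The structure is hypothetical; it only illustrates the matching step.
struct SavedSong {
    let title: String
    let jamSessionUUID: UUID
}

// Given the host's jam session UUID, pick out the client songs that belong
// to that session so they can be located and uploaded automatically.
func songsMatching(sessionUUID: UUID, in directory: [SavedSong]) -> [SavedSong] {
    directory.filter { $0.jamSessionUUID == sessionUUID }
}

// Example: the session UUID assigned when the host created the jam session.
let sessionUUID = UUID()
let clientDirectory = [
    SavedSong(title: "Tuesday Jam", jamSessionUUID: sessionUUID),
    SavedSong(title: "Solo Sketch", jamSessionUUID: UUID())
]
let matches = songsMatching(sessionUUID: sessionUUID, in: clientDirectory)
print(matches.map(\.title))   // ["Tuesday Jam"]
```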
Leaving a Jam Session
Referring to
Referring to
It should be appreciated that the specific steps illustrated in
Collecting Recordings After a Jam Session
At the end of a jam session, the band leader (i.e., host device) can collect all recordings from each peer device in the jam session. The host can manually collect recordings from one or more client devices, or set up an automated collection process. The jam session control user interface can provide a list of peers (e.g., client devices) connected to a current jam session to allow the band leader to identify which client devices to retrieve recordings from. In some embodiments, the “auto-collect recordings” and “bandleader control” features control the recording collection process.
With the auto-collect and bandleader controls on, the bandleader (via the host device) can automatically collect unmuted and/or soloed tracks from band members (e.g., client devices) after recording stops. Tracks collected from band members may be flagged as “band member tracks” on the bandleader device. In some cases, all “band member tracks” are automatically muted as they are collected by the bandleader device (e.g., host). The bandleader can optionally unmute collected tracks after collection. In some embodiments, the host device automatically deletes muted “band member tracks” when a recording is initiated and saves unmuted tracks. Typically, tracks that are both muted and flagged as a band member track are deleted when starting a new recording. The auto-collect and bandleader controls are typically enabled by default, but can be customized to user preference.
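The collect-and-flag behavior described above might be sketched as follows; the Track structure and function names are illustrative assumptions, not the application's actual data model:

```swift
import Foundation

// A simplified track model for the bandleader's arrange area.
struct Track {
    var name: String
    var isMuted: Bool
    var isBandMemberTrack: Bool   // flag applied to tracks collected from peers
}

// Auto-collect: take unmuted tracks from a band member, flag them as band
// member tracks, and mute them as they land in the host arrange.
func collect(from memberTracks: [Track], into hostArrange: inout [Track]) {
    for track in memberTracks where !track.isMuted {
        var collected = track
        collected.isBandMemberTrack = true
        collected.isMuted = true
        hostArrange.append(collected)
    }
}

// When a new recording starts, muted band member tracks are removed;
// unmuted ones (those the bandleader chose to keep) are preserved.
func prepareForNewRecording(hostArrange: inout [Track]) {
    hostArrange.removeAll { $0.isBandMemberTrack && $0.isMuted }
}

// Example: one client track is collected, then dropped on the next recording
// because the bandleader never unmuted it.
var hostArrange = [Track(name: "Host Drums", isMuted: false, isBandMemberTrack: false)]
collect(from: [Track(name: "Client Bass", isMuted: false, isBandMemberTrack: false)],
        into: &hostArrange)
prepareForNewRecording(hostArrange: &hostArrange)
print(hostArrange.map(\.name))   // ["Host Drums"]
```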
With the auto-collect or bandleader controls turned off, the bandleader has to manually collect recordings from each client device. In certain embodiments, if the band leader ends the session (e.g., terminates session by pressing “Stop Session” in a Jam Session pop-up menu), the “band member track” flag is removed from all “band member tracks,” and any muted “peer tracks” (automatically or manually muted) are unmuted.
Referring to
At 610, if the band leader control is turned on, the host device determines whether an auto-collect mode is enabled at 615. The auto-collect mode can automatically collect all recordings from each client device at the end of a session. At 615, if the auto-collect mode is turned off, the host device at 624 can collect the recordings by manual selection and method 600 continues as described above. If the auto-collect mode is turned on, the host can press a record button on a transport control of the host device at 620, and the host device determines how many unmuted tracks are in the host arrange at 625. If eight tracks are currently in the host arrange, at 630 the host device can alert the host that there are not enough tracks available to collect any additional recordings as described above. At 625, if there are less than eight tracks in the host arrange, the host device determines at 650 if there is at least one muted “band member track” already stored in the host arrange. If there is at least one band member track muted in the host arrange, the method continues to B. If there is not at least one band member track muted in the host arrange, the method continues to C. Both B and C are discussed below with respect to
Referring to
Referring to
Referring to
Referring to
It should be appreciated that the specific steps illustrated in
In some embodiments, track UUIDs can be used for marking tracks as client tracks to preserve a reordering of tracks and, if desired, to prioritize already imported client tracks over newly created tracks. UUIDs may also be assigned to audio and sampler files to avoid duplicated transmission of data already existing on the host.
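A brief sketch of UUID-based deduplication of media files, with a hypothetical MediaFile type; this only illustrates the idea of skipping data the host already holds:

```swift
import Foundation

// An audio or sampler file referenced by a collected track, identified by UUID.
struct MediaFile: Hashable {
    let uuid: UUID
    let fileName: String
}

// Before transmitting media from a client, drop anything the host already has,
// so data that exists on the host is not sent twice.
func filesToTransmit(clientFiles: [MediaFile], hostFileUUIDs: Set<UUID>) -> [MediaFile] {
    clientFiles.filter { !hostFileUUIDs.contains($0.uuid) }
}

// Example: the host already holds one of the two files.
let shared = MediaFile(uuid: UUID(), fileName: "bass_take3.wav")
let newFile = MediaFile(uuid: UUID(), fileName: "sampler_kit.aif")
let toSend = filesToTransmit(clientFiles: [shared, newFile],
                             hostFileUUIDs: [shared.uuid])
print(toSend.map(\.fileName))   // ["sampler_kit.aif"]
```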
In further embodiments, the host (i.e., via the host device) can mark tracks received from the client device as client tracks. The host may delete muted client tracks when the Collect Recordings function is initiated. The client track flag is typically removed upon editing a track or opening the Touch Instrument of a track. In some cases, when the Jam Session is disconnected from the host side, muted client tracks are unmuted, and the client flags can be removed from the client tracks.
Processing unit(s) 1105 can include a single processor, which can have one or more cores, or multiple processors. In some embodiments, processing unit(s) 1105 can include a general purpose primary processor as well as one or more special purpose co-processors such as graphics processors, digital signal processors, or the like. In some embodiments, some or all processing units 1105 can be implemented using customized circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). In some embodiments, such integrated circuits execute instructions that are stored on the circuit itself. In other embodiments, processing unit(s) 1105 can execute instructions stored in storage subsystem 1110.
Storage subsystem 1110 can include various memory units such as a system memory, a read-only memory (ROM), and a permanent storage device. The ROM can store static data and instructions that are needed by processing unit(s) 1105 and other modules of electronic device 1100. The permanent storage device can be a read-and-write memory device. This permanent storage device can be a non-volatile memory unit that stores instructions and data even when computer system 1100 is powered down. Some embodiments of the invention can use a mass-storage device (such as a magnetic or optical disk or flash memory) as a permanent storage device. Other embodiments can use a removable storage device (e.g., a floppy disk, a flash drive) as a permanent storage device. The system memory can be a read-and-write memory device or a volatile read-and-write memory, such as dynamic random access memory. The system memory can store some or all of the instructions and data that the processor needs at runtime.
Storage subsystem 1110 can include any combination of computer readable storage media including semiconductor memory chips of various types (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory) and so on. Magnetic and/or optical disks can also be used. In some embodiments, storage subsystem 1110 can include removable storage media that can be readable and/or writeable; examples of such media include compact disc (CD), read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), read-only and recordable Blue-Ray® disks, ultra density optical disks, flash memory cards (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic “floppy” disks, and so on. The computer readable storage media do not include carrier waves and transitory electronic signals passing wirelessly or over wired connections.
In some embodiments, storage subsystem 1110 can store one or more software programs to be executed by processing unit(s) 1105, such as a user interface 1115. As mentioned, “software” can refer to sequences of instructions that, when executed by processing unit(s) 1105 cause computer system 1100 to perform various operations, thus defining one or more specific machine implementations that execute and perform the operations of the software programs. The instructions can be stored as firmware residing in read-only memory and/or applications stored in magnetic storage that can be read into memory for processing by a processor. Software can be implemented as a single program or a collection of separate programs or program modules that interact as desired. Programs and/or data can be stored in non-volatile storage and copied in whole or in part to volatile working memory during program execution. From storage subsystem 1110, processing unit(s) 1105 can retrieve program instructions to execute and data to process in order to execute various operations described herein.
A user interface can be provided by one or more user input devices 1120, display device 1125, and/or one or more other user output devices (not shown). Input devices 1120 can include any device via which a user can provide signals to computing system 1100; computing system 1100 can interpret the signals as indicative of particular user requests or information. In various embodiments, input devices 1120 can include any or all of a keyboard, touch pad, touch screen, mouse or other pointing device, scroll wheel, click wheel, dial, button, switch, keypad, microphone, and so on.
Output devices 1125 can display images generated by electronic device 1100. Output devices 1125 can include various image generation technologies, e.g., a cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode (LED) display including organic light-emitting diodes (OLED), projection system, or the like, together with supporting electronics (e.g., digital-to-analog or analog-to-digital converters, signal processors, or the like), indicator lights, speakers, tactile “display” devices, headphone jacks, printers, and so on. Some embodiments can include a device such as a touchscreen that functions as both an input and an output device.
In some embodiments, output device 1125 can provide a graphical user interface, in which visible image elements in certain areas of output device 1125 are defined as active elements or control elements that the user selects using user input devices 1120. For example, the user can manipulate a user input device to position an on-screen cursor or pointer over the control element, then click a button to indicate the selection. Alternatively, the user can touch the control element (e.g., with a finger or stylus) on a touchscreen device. In some embodiments, the user can speak one or more words associated with the control element (the word can be, e.g., a label on the element or a function associated with the element). In some embodiments, user gestures on a touch-sensitive device can be recognized and interpreted as input commands; these gestures can be, but need not be, associated with any particular area in output device 1125. Other user interfaces can also be implemented.
Network interface 1135 can provide voice and/or data communication capability for electronic device 1100. In some embodiments, network interface 1135 can include radio frequency (RF) transceiver components for accessing wireless voice and/or data networks (e.g., using cellular telephone technology, advanced data network technology such as 3G, 4G, or EDGE, Bluetooth, WiFi (IEEE 802.11 family standards), or other mobile communication technologies, or any combination thereof), GPS receiver components, and/or other components. In some embodiments, network interface 1135 can provide wired network connectivity (e.g., Ethernet) in addition to or instead of a wireless interface. Network interface 1135 can be implemented using a combination of hardware (e.g., antennas, modulators/demodulators, encoders/decoders, and other analog and/or digital signal processing circuits) and software components.
Bus 1140 can include various system, peripheral, and chipset buses that communicatively connect the numerous internal devices of electronic device 1100. For example, bus 1140 can communicatively couple processing unit(s) 1105 with storage subsystem 1110. Bus 1140 also connects to input devices 1120 and display 1125. Bus 1140 also couples electronic device 1100 to a network through network interface 1135. In this manner, electronic device 1100 can be a part of a network of multiple computer systems (e.g., a local area network (LAN), a wide area network (WAN), an Intranet, or a network of networks, such as the Internet). Any or all components of electronic device 1100 can be used in conjunction with the invention.
Some embodiments include electronic components, such as microprocessors, storage, and memory that store computer program instructions in a computer readable storage medium. Many of the features described in this specification can be implemented as processes that are specified as a set of program instructions encoded on a computer readable storage medium. When these program instructions are executed by one or more processing units, they cause the processing unit(s) to perform the various operations indicated in the program instructions. Examples of program instructions or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
It will be appreciated that computer system 1100 is illustrative and that variations and modifications are possible. Computer system 1100 can have other capabilities not specifically described here (e.g., mobile phone, global positioning system (GPS), power management, one or more cameras, various connection ports for connecting external devices or accessories, etc.). Further, while computer system 1100 is described with reference to particular blocks, it is to be understood that these blocks are defined for convenience of description and are not intended to imply a particular physical arrangement of component parts. Further, the blocks need not correspond to physically distinct components. Blocks can be configured to perform various operations, e.g., by programming a processor or providing appropriate control circuitry, and various blocks might or might not be reconfigurable depending on how the initial configuration is obtained. Embodiments of the present invention can be realized in a variety of apparatus including electronic devices implemented using any combination of circuitry and software.
System 1000 depicted in
Network 1306 may include one or more communication networks, which could be the Internet, a local area network (LAN), a wide area network (WAN), a wireless or wired network, an Intranet, a private network, a public network, a switched network, or any other suitable communication network. Network 1306 may include many interconnected systems and communication links including but not restricted to hardwire links, optical links, satellite or other wireless communications links, wave propagation links, or any other ways for communication of information. Various communication protocols may be used to facilitate communication of information via network 1306, including but not restricted to TCP/IP, HTTP protocols, extensible markup language (XML), wireless application protocol (WAP), protocols under development by industry standard organizations, vendor-specific protocols, customized protocols, and others.
In the configuration depicted in
In the configuration depicted in
It should be appreciated that various different distributed system configurations are possible, which may be different from distributed system 1300 depicted in
While the invention has been described with respect to specific embodiments, one skilled in the art will recognize that numerous modifications are possible. Thus, although the invention has been described with respect to specific embodiments, it will be appreciated that the invention is intended to cover all modifications and equivalents within the scope of the following claims.
The above disclosure provides examples and aspects relating to various embodiments within the scope of claims, appended hereto or later added in accordance with applicable law. However, these examples are not limiting as to how any disclosed aspect may be implemented.
All the features disclosed in this specification (including any accompanying claims, abstract, and drawings) can be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
Any element in a claim that does not explicitly state “means for” performing a specified function, or “step for” performing a specific function, is not to be interpreted as a “means” or “step” clause as specified in 35 U.S.C. §112, sixth paragraph. In particular, the use of “step of” in the claims herein is not intended to invoke the provisions of 35 U.S.C. §112, sixth paragraph.
The present non-provisional application claims benefit under 35 U.S.C. §119 of U.S. Provisional Patent Application No. 61/607,577, filed on Mar. 6, 2012, and entitled “SYSTEM AND METHOD FOR MUSIC COLLABORATION” which is herein incorporated by reference in its entirety for all purposes.