Virtual musical instruments, such as MIDI-based or software-based keyboards, guitars, basses, and the like, are ubiquitous in contemporary music across many different genres. Virtual instruments allow a user to play virtually any sound that a typical acoustic instrument could play and much more. Amateur musicians with little to no experience on a particular instrument or with music composition may find that virtual instruments are more intuitive and can provide simplified ways of creating music without needing the manual dexterity or knowledge of music theory that a conventional instrument may require.
Software-based music production tools can be used to create many different genres of music and provide resources that allow a user to quickly and easily create musical compositions without the need for any appreciable proficiency at a particular instrument. For example, musical passages can be created in real-time, in a methodical stepwise fashion, or a combination thereof. Notes, chords, melodies, and harmonies can be created, and in some cases, the software can provide shortcuts that make producing music even easier without the need for understanding its theoretical underpinnings. For example, music production software may help a user create a chord progression with diatonic harmony without requiring the user to understand the theory behind the chord sequence. As a result, software-based music production tools have become ubiquitous across many genres of music. Although this document refers to music production tools generally as digital audio workstations (DAWs) (e.g., Logic Pro™), it should be understood that the concepts and embodiments described herein can be implemented by any suitable production tool, including, but not limited to, software sequencers, synthesizers, drum machines, Musical Instrument Digital Interface (MIDI) keyboard workstations, software plug-ins, and the like.
One type of musical technique that conventionally requires some proficiency is the arpeggio. An arpeggio involves playing or sounding the notes of a chord in a sequence, rather than playing them simultaneously. For example, a C major chord comprises the notes C, E, and G. One example of a C major arpeggio may involve playing the notes C, E, G, E, and C in succession, one after the other. This technique can become physically challenging to perform with fast tempos, large octave ranges, complex chord structures, difficult chord changes, or the like. Thus, many systems incorporate features to automate the performance of arpeggios and help create musical sequences and progressions that could not otherwise be played by those lacking in musical proficiency. An arpeggiator is such a feature, typically available on synthesizers, digital audio workstations (DAWs), software sequencers, or other music creation programs or tools, that streamlines the process of creating an arpeggio by automatically stepping through a sequence of notes based on an input (e.g., a chord). The notes can often be transmitted to a MIDI sequencer for recording and editing. An arpeggiator typically can control the speed, range, and order in which the notes play, including patterns trending upwards, downwards, or randomly. More contemporary arpeggiators allow the user to step through a pre-programmed complex sequence of notes, or even play several arpeggios at once.
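For purposes of illustration only, the core behavior of an arpeggiator, stepping through the notes of an input chord one at a time rather than sounding them together, can be sketched in a few lines of Python. The function name, the use of MIDI note numbers (C4 = 60, E4 = 64, G4 = 67), and the fixed step count are assumptions made for this sketch and do not represent any particular product's implementation.

```python
# Minimal illustrative arpeggiator: steps through the notes of an input chord
# one at a time instead of sounding them simultaneously. MIDI note numbers
# are used for pitches (C4 = 60, E4 = 64, G4 = 67) -- an assumption for the example.

def arpeggiate(chord, steps):
    """Yield `steps` notes by cycling through the chord tones in order."""
    for i in range(steps):
        yield chord[i % len(chord)]

c_major = [60, 64, 67]               # C, E, G
print(list(arpeggiate(c_major, 8)))  # [60, 64, 67, 60, 64, 67, 60, 64]
```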
Although arpeggiators can be a highly useful and powerful creative tool, many users find that conventional arpeggiators are difficult or cumbersome to use, are limited in their application, or require extensive tinkering to generate a harmonically pleasing and useful sequence. These problems lead to frustration and make arpeggiators less useful for many practical applications. Therefore, a need exists for an arpeggiator that can be applied to a broad spectrum of applications in a seamless, intuitive, and musically inspiring way.
Embodiments of the invention generally relate to software configured for generating, recording, editing, and producing musical performances. More specifically, embodiments of the invention relate to real-time editing of an arpeggio in a musical performance.
Real-time editing of live-played arpeggios allows a user to physically play an arpeggio (or build one in a step-wise fashion) and capture aspects of the performance in a grid-type interface. Certain aspects of the arpeggio performance (i.e., performance data) include the velocity of the notes, the type of note (e.g., rest, note, tie), and the rhythmic order of the arpeggiated notes. The captured arpeggio performance data can then be applied to subsequent chords in real-time to automatically create new arpeggio sequences based on the captured performance data. The user can further edit the performance data (e.g., change velocity data) in real-time after the performance is captured, while simultaneously creating, playing, and altering arpeggiated performances during a live performance.
In some embodiments, a method includes receiving a first set of performance data corresponding to a first plurality of MIDI-based notes in a first rhythmic order. The first plurality of MIDI-based notes may form a first arpeggio, with each of the first plurality of notes having a corresponding first performance data. The method further includes receiving input data indicating a change to the first performance data corresponding to a note in the first plurality of notes, changing the first performance data for the corresponding note using the input data, receiving a second set of performance data corresponding to a second plurality of MIDI-based notes, and applying the changed first performance data to the second performance data. Applying the changed first performance data includes editing the second set of performance data in real-time by replacing the second performance data with the changed first performance data. In some embodiments, note data can be received and can include pitch, velocity, note characteristics (rest, tie, etc.), tone, or other aural characteristics, all of which can be edited using the grid-editing capabilities described herein. Although the various embodiments described herein tend to focus on editing note velocities and identifiers (e.g., note, tie, rest), it would be appreciated by one of ordinary skill in the art with the benefit of this disclosure that other parameters can be incorporated in grid-editing implementations.
Musical Performance Data
Musical performance data can include any number of performance characteristics that define how a musical element is played (or not played) in an arpeggio. For example, musical performance data can include velocity data, note data (e.g., note type, rest type, note ties, etc.), rhythmic order data, timing data, pitch data, or any type of data that can characterize aspects of the performance, as would be appreciated by one of ordinary skill in the art with the benefit of this disclosure.
Velocity is one type of musical performance data that can be described as the speed or force with which a key is struck. In a MIDI input device (e.g., a keyboard), the harder a key is played, the higher the velocity value that is registered. Similarly, the softer the key is played, the lower the velocity value. In MIDI, velocity is typically measured on a scale from 0 to 127, with 127 being the highest value that can be registered. It would be understood by one of ordinary skill in the art that any range of values, MIDI or otherwise, can be used to represent key velocity values.
Note data is another type of musical performance data that can be described or identified as one of a note, a rest, or a tie. Notes can include quarter notes, half notes, whole notes, or other note durations, and any such arrangement or grouping is possible. A rest is an interval of silence in a piece of music, marked by a symbol indicating the length of the pause and, like a note, may be of any desired duration. A tie is represented by a curved line that connects the heads of two notes of the same pitch and name, indicating that they are to be played as a single note with a duration equal to the sum of the individual notes' values. For example, a note may span two measures and be notated with a tie to show that the note sustains over that time. Notes, rests, and ties, and their respective uses and applications, would be understood by one of ordinary skill in the art.
Rhythmic order data is musical performance data that refers to the order in which notes are received. Timing data, which can be a part of rhythmic order data, is musical performance data that refers to the timing of the sequence of, e.g., notes or rests, in a musical passage.
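As a minimal sketch of how the musical performance data described above might be represented, assuming a simple per-step record whose field names (position, note_type, velocity, pitch, time_offset) are chosen for illustration and are not drawn from any particular DAW:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class NoteType(Enum):
    NOTE = "note"   # a sounded pitch
    REST = "rest"   # an interval of silence
    TIE = "tie"     # extends the duration of the preceding note

@dataclass
class StepData:
    """Performance data for one position in the rhythmic order."""
    position: int                  # rhythmic order (1 = first step received)
    note_type: NoteType            # note, rest, or tie
    velocity: int = 0              # MIDI velocity, 0-127 (0 for a rest)
    pitch: Optional[int] = None    # MIDI pitch number; None for rests and ties
    time_offset: float = 0.0       # timing data, in beats from the start

# Example: a three-step fragment -- a note, a rest, then a tie extending a note
fragment = [
    StepData(1, NoteType.NOTE, velocity=96, pitch=60, time_offset=0.0),
    StepData(2, NoteType.REST, time_offset=0.25),
    StepData(3, NoteType.TIE, time_offset=0.5),
]
```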
Grid-Edit Arpeggiator Interface
Start/stop button 402 starts and stops an arpeggio generated by the arpeggiator. Latch button 404 is configured to “latch” notes (or rests) to create arpeggios of any desired length, e.g., in a live setting. With latch button 404 selected, notes can be individually added by a user (or by automation) to create an arpeggio of any desired length without having to physically play every note of the arpeggio at the same time. For example, a user can play four notes with her left hand and four notes with her right hand for a total of eight notes being played and shown in grid-edit field 440. With latch button 404 selected, the eight notes remain “in play,” similar to a sustain pedal on a piano, allowing the user to release the keys and add more notes if desired. In another non-limiting example, a user can keep adding notes by depressing one key repeatedly with latch button 404 selected. For the sake of clarity, many examples throughout this document may refer to adding “notes” to an arpeggiator by a user. However, it should be understood that rests and note ties can be used instead of or in addition to notes, and that arpeggios may be created by a user, by automation, or by another method that would be appreciated by one of ordinary skill in the art with the benefit of this disclosure. Furthermore, some embodiments only “grid capture” performance data, which can include velocity data and/or note data identifying whether the “note” is a note (has a pitch), is a rest, or is a chord, as shown in
Mode selector 406 sets the mode of the arpeggiator. Certain modes may include reset mode, add mode, add temporarily, transpose, gated transpose, and through mode, among other possible implementations. In add mode, a user can add additional notes to an arpeggio in live mode when latch 404 is selected. For example, if a user plays three notes (C, E, G) and releases the input (e.g., keys), the three notes will play as an arpeggio based on the Arp GUI 400 settings (e.g., rate 408, note order 410, variation 420, etc.). Notes played subsequent to the first three notes will be “added” to the arpeggio, such that two additional notes will create a five-note arpeggio. In reset mode, after the first three notes are latched and released (e.g., the user lets go of the keys), the three-note arpeggio will play per the Arp GUI 400 settings. When the user plays additional notes, the arpeggio is “reset” and the first three notes are replaced with the additional notes. In through mode, after the first three notes are latched and released, the three-note arpeggio will play per the Arp GUI 400 settings. The user can then play additional notes, such as a melody, to play along with or accompany the repeating arpeggio without affecting the notes of the arpeggio. These modes (and others) and their use cases would be understood by one of ordinary skill in the art with the benefit of this disclosure.
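A minimal sketch of the add, reset, and through behaviors described above is shown below, assuming a latched arpeggio is held as a plain list of MIDI pitches; the function name, mode strings, and return convention are illustrative assumptions rather than the actual mode-selector logic.

```python
def handle_new_notes(mode, latched, new_notes):
    """Illustrative handling of newly played notes against a latched arpeggio.

    mode: "add", "reset", or "through"
    latched: list of MIDI pitches currently forming the arpeggio
    new_notes: list of MIDI pitches just played
    Returns (updated latched list, notes passed through for melody playing).
    """
    if mode == "add":
        return latched + new_notes, []     # new notes join the arpeggio
    if mode == "reset":
        return list(new_notes), []         # new notes replace the arpeggio
    if mode == "through":
        return latched, list(new_notes)    # arpeggio unchanged; melody passes through
    raise ValueError(f"unknown mode: {mode}")

# Example: three latched notes (C, E, G), then two more played in add mode
latched, _ = handle_new_notes("add", [60, 64, 67], [71, 72])
print(latched)  # [60, 64, 67, 71, 72] -> a five-note arpeggio
```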
Note order selector 410 can control the melodic trend of the arpeggio. Up button 412 causes the notes of an underlying chord to be played in a repeating arpeggiated pattern of increasing pitch. For example, an A7 chord (A, C#, E, G) may be played in ascending order such as A, C#, E, G, A, C#, E, G. Down button 413 causes notes of an underlying chord to be played in a repeating arpeggiated pattern of decreasing pitch. For example, the A7 chord may be played in a descending pattern such as A, G, E, C#, A, G, E. Up/down button 414 causes notes of an underlying chord to be played in alternating increasing and decreasing arpeggiated patterns. For example, the A7 chord may be played in a pattern such as A, C#, E, G, E, C#, A. In button 415 causes notes of an underlying chord to be played as an arpeggio moving from the outside notes inward. For example, the A7 chord may be played in a pattern such as A, G, C#, E, A, G, C#, E. Random button 416 causes the notes of the underlying chord to be played in a randomized pattern (e.g., combinations of upward, downward, inward, outward, or other patterns). Free play button 417 causes the arpeggiator to play an arpeggio that matches a pattern played by the user. Any permutation of controlling the note order in an arpeggio can be implemented in Arp GUI 400.
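The note-order settings lend themselves to a simple pattern-generation sketch, shown below under the assumption that chord tones arrive as MIDI pitch numbers (the A7 example uses the assumed values A3 = 57, C#4 = 61, E4 = 64, G4 = 67); a real arpeggiator would also combine this ordering with the rate, octave range, and variation settings.

```python
import random

def order_notes(pitches, direction):
    """Return one cycle of an arpeggio for a given note-order setting.

    pitches: chord tones, e.g. A7 = [57, 61, 64, 67]
    direction: "up", "down", "updown", "in", or "random"
    """
    asc = sorted(pitches)
    if direction == "up":
        return asc
    if direction == "down":
        return asc[::-1]
    if direction == "updown":
        return asc + asc[-2:0:-1]       # up, then back down without repeating ends
    if direction == "in":
        cycle, lo, hi = [], 0, len(asc) - 1
        while lo <= hi:                  # alternate outside notes moving inward
            cycle.append(asc[lo])
            if lo != hi:
                cycle.append(asc[hi])
            lo, hi = lo + 1, hi - 1
        return cycle
    if direction == "random":
        return random.sample(asc, len(asc))
    raise ValueError(direction)

a7 = [57, 61, 64, 67]                    # A, C#, E, G
print(order_notes(a7, "updown"))         # [57, 61, 64, 67, 64, 61]
print(order_notes(a7, "in"))             # [57, 67, 61, 64] -> A, G, C#, E
```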
Rate control 408 can control the rate or speed at which a generated arpeggio is played. For example, the notes of the generated arpeggio can be set as whole notes, half notes, quarter notes, eighth notes, sixteenth notes, eighth-note triplets, or the like. Arpeggio notes set to whole-note values may play the underlying arpeggio more slowly than an arpeggio composed of sixteenth notes. Any suitable method of controlling the rate of an arpeggio is possible (e.g., virtual knobs, faders, MIDI keyboard, etc.), as would be understood by one of ordinary skill in the art.
Variation control 420 controls the manner in which the arpeggio for the underlying chord is played. For example, a first variation pattern may start the arpeggio on the bass note of the underlying chord. A second variation pattern may start the arpeggio on the second note of the underlying chord (e.g., first inversion). A third variation may start the arpeggio on the third note of the underlying chord (e.g., second inversion), and so on. Any algorithm for arpeggiating the underlying chord can be associated with variation control 420, as would be appreciated by one of ordinary skill in the art with the benefit of this disclosure.
Octave Range selector 422 can control the harmonic range of the underlying chord. For example, with an octave range of one, a generated arpeggio may include notes limited to a single octave. With an octave range of two, a generated arpeggio may include notes spanning two octaves, and so on. Any number of octave ranges can be used or assigned to the arpeggio, as would be appreciated by one of ordinary skill in the art.
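A sketch of the octave range expansion is shown below, assuming the range simply duplicates the chord tones transposed up by whole octaves; this is an assumption made for illustration, and other mappings are possible.

```python
def expand_octaves(pitches, octave_range):
    """Duplicate chord tones across the selected number of octaves.

    With octave_range = 1 the arpeggio stays within the played octave;
    each additional octave adds the same tones transposed up 12 semitones.
    """
    expanded = []
    for octave in range(octave_range):
        expanded.extend(p + 12 * octave for p in pitches)
    return expanded

print(expand_octaves([60, 64, 67], 2))
# [60, 64, 67, 72, 76, 79] -> C major arpeggio spanning two octaves
```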
Control tabs 430 control the content displayed in edit field 440. Control tabs 430 include a tab designated for pattern control 432, options control 434, keyboard control 436, and controller options 438. Pattern control 432 allows a user to control the content of an arpeggio in edit field 440 in real-time, via grid-editing, or a combination thereof. In live mode 442, a user can input notes, rests, or note ties in real-time, which appear in note velocity region 450, as further described below. In grid mode 446, a user can input notes, rests, or note ties in note velocity region 450 in a step-wise fashion. By pressing the live-to-grid selector 444, a user can input notes, rests, and note ties in real-time (e.g., with latch 404 and mode selector 406 set to “add” mode) in live mode and capture or “freeze” the live-played arpeggio in note velocity region 450 for playback, editing, or real-time editing during playback. This transition is shown, e.g., in
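One way to picture the live-to-grid capture, i.e., freezing a live-played sequence into an editable grid, is sketched below; the tuple and dictionary layout is an assumption chosen for brevity and is not the internal data format of the edit field.

```python
def grid_capture(live_steps):
    """Freeze a live-played sequence of steps into an editable grid.

    live_steps: list of (note_type, velocity) tuples in the order received.
    Returns a list of dicts keyed by grid position, which can then be
    edited (e.g., velocities changed) and replayed.
    """
    return [
        {"position": i + 1, "note_type": note_type, "velocity": velocity}
        for i, (note_type, velocity) in enumerate(live_steps)
    ]

# A live-played fragment: note, note, rest, tied note
grid = grid_capture([("note", 96), ("note", 80), ("rest", 0), ("tie", 0)])
grid[1]["velocity"] = 110   # real-time grid edit of the second step
```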
Note position region 460 includes a plurality of positions (461, 462, 463, . . . ) for each note, rest, or note tie in a live-played or grid-captured arpeggio. Position 1 (461) is the first note or rest in the arpeggio, followed by position 2 (462), position 3 (463), and so on. Any number of notes, rests, or note ties can be included in a live-played or grid-captured arpeggio.
Note velocity region 450 depicts the velocities of notes played in the note position region 460 in either live mode 442 or grid mode 446. Notes can vary in velocity and may range from 0 (i.e., a rest) to 127, which is typically the maximum velocity in MIDI. Although the velocity resolution shown here has 127 levels, any resolution of velocity (i.e., number of velocity levels) can be used.
Arpeggio progress bar 480 shows the progress of a played arpeggio. For example, as an arpeggio is played (e.g., see
Selecting options control 434 populates the edit field with a number of controls (see, e.g.,
A velocity normalizer 1430 can normalize the velocity of each note of the arpeggio. For example, normalization may be set anywhere from zero percent (i.e., default to velocity values set in pattern control tab 432) to 100 percent (i.e., each note of the arpeggio is set to a uniform programmable value). In some cases, the velocity can also be randomized, crescendoed (1440), or decrescendoed at any predetermined value (e.g., crescendo rate, randomization amount, etc.). Any type of control can be applied to the arpeggio (e.g., swing (1450), cycle length (1460), etc.) as would be appreciated by one of ordinary skill in the art with the benefit of this disclosure.
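One plausible reading of the velocity normalizer and crescendo behaviors is sketched below; the linear blend, the default target value of 100, and the clamping to the MIDI range 0-127 are assumptions for illustration rather than the actual algorithms behind controls 1430 and 1440.

```python
def normalize_velocities(velocities, amount, target=100):
    """Blend per-step velocities toward a uniform target.

    amount: 0.0 keeps the velocities set in the pattern grid;
            1.0 sets every step to the uniform `target` value.
    """
    return [round(v + (target - v) * amount) for v in velocities]

def apply_crescendo(velocities, total_increase):
    """Ramp velocities linearly across the arpeggio, clamped to 0-127."""
    n = len(velocities)
    return [
        max(0, min(127, round(v + total_increase * i / max(n - 1, 1))))
        for i, v in enumerate(velocities)
    ]

pattern = [96, 64, 80, 72]
print(normalize_velocities(pattern, 0.5))   # halfway toward a uniform value
print(apply_crescendo(pattern, 30))         # velocities rise over the cycle
```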
Selecting the keyboard control tab 436 populates edit field 440 with a depiction of a programmable keyboard (see, e.g.,
The controller options tab 438 can be configured to edit or assign controls and/or functions to an external controller (e.g., MIDI controller). For example, each key of a MIDI controller can be assigned to any function associated with Arp GUI 400 (see, e.g.,
Velocity control 1430 controls the overall velocity of the arpeggio. In some embodiments, as the velocity is increased, the velocities deviate progressively less from the velocities set in the pattern mode (e.g., 432). As the velocity is decreased, the velocity values for all of the notes in the arpeggio become more uniform and fixed. Crescendo controller 1440 causes the velocities of the arpeggio to crescendo at a rate and range dictated by the setting. Increasing the crescendo (e.g., increasingly positive values) can cause the velocities of the notes to increase at a faster rate and/or over an increasing range. Reducing the crescendo (e.g., increasingly negative values) can cause the velocities of the notes to decrease at an increasing rate and/or over an increasing range. In some embodiments, a randomizer control can be applied to the velocity control 1430, as would be appreciated by one of ordinary skill in the art with the benefit of this disclosure.
Swing control 1450 can control an amount of swing to add to the playback of the arpeggio. Cycle length controller 1460 can control how the arpeggio is played and can range from “as played” to the cycle defined in the grid.
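As an assumption-laden illustration, swing can be modeled as delaying every other arpeggio step by a fraction of the step length; the specific swing range and the choice of sixteenth-note steps below are examples only, not the behavior of swing control 1450 as implemented.

```python
def apply_swing(step_times, swing, step_len=0.25):
    """Delay every other step to add swing to arpeggio playback.

    step_times: nominal start times in beats (straight sixteenths here)
    swing: 0.0 for straight timing; offbeat steps are pushed later
           by swing * step_len.
    """
    return [
        t + swing * step_len if i % 2 == 1 else t
        for i, t in enumerate(step_times)
    ]

straight = [0.0, 0.25, 0.5, 0.75]
print(apply_swing(straight, 0.2))   # [0.0, 0.3, 0.5, 0.8]
```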
At 1710, method 1700 begins with receiving first performance data corresponding to a first plurality of notes in a first order. The first plurality of notes can be MIDI-based notes and may form an arpeggio, according to an embodiment of the invention. The first performance data can originate from an external MIDI keyboard or a virtual keyboard, may be automated and/or previously generated, or can come from any other source, as would be appreciated by one of ordinary skill in the art. The performance data can correspond to any number of notes and does not necessarily have to correspond to more than one note. Furthermore, the corresponding first plurality of notes can be “live-played” in real time. For instance, a musician may play the notes in real time, or the notes may be received in real time from a database. The performance data can include velocity data and/or an identifier indicating whether a corresponding note is one of a musical note, a rest, or a note-tie. The performance data can further include timing data for the rhythmic order in which the performance data was received (e.g., the order in which notes were played).
At 1720, the first performance data is “grid-captured,” i.e., arranged in a graphical grid pattern in the order in which the corresponding notes were received. In some embodiments, grid-capturing can occur when the corresponding notes (e.g., an arpeggio) played in live mode 442 are captured in grid mode 446, which may occur when live-to-grid selector 444 is selected (e.g., directly or through an external MIDI controller). In one non-limiting example, grid-capturing performance data corresponding to an arpeggio is shown and described in the transition between
At 1730, method 1700 continues with receiving input data indicating a change to performance data that corresponds to one or more notes of the first plurality of notes. Input data can include data corresponding to changes in velocity data for a particular note of the first plurality of notes. Input data can also include data corresponding to changes in a note identifier, such as changing a musical note to a rest or tying one note to another note. At 1740, the performance data is changed as dictated by the input data. Some non-limiting examples of changing performance data corresponding to one or more notes of the first plurality of notes are shown and described in
At 1750, method 1700 continues with receiving second performance data corresponding to a second plurality of notes in a second order. The second plurality of notes can be MIDI-based notes and may form an arpeggio, according to an embodiment of the invention. The corresponding second plurality of notes can originate from an external MIDI keyboard or a virtual keyboard, may be automated and/or previously generated, or can come from any other input source, as would be appreciated by one of ordinary skill in the art. The second performance data can correspond to any number of notes and does not necessarily have to correspond to more than one note. Furthermore, the corresponding second plurality of notes can be “live-played” in real time. For instance, a musician may play the notes in real time, or the notes may be received in real time from a database. The second performance data can include velocity data and/or an identifier indicating whether a corresponding note is one of a musical note, a rest, or a note-tie. The second performance data can further include timing data for the rhythmic order in which the performance data was received (e.g., the order in which notes were played).
At 1760, method 1700 concludes with applying the changes made to the first performance data to the second performance data. For example, referring back to
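Putting steps 1710 through 1760 together, a minimal sketch of applying edited, grid-captured first performance data to a newly received second plurality of notes might look as follows; it reuses the simple dictionary step format assumed in the earlier sketches and is not the claimed implementation.

```python
def apply_captured_performance(captured, incoming_notes):
    """Apply grid-captured (and possibly edited) performance data to a new
    set of notes: the incoming notes take on the velocities, note types,
    and rhythmic order of the captured grid (roughly steps 1750-1760).
    """
    result, note_iter = [], iter(incoming_notes)
    for step in captured:
        if step["note_type"] == "note":
            pitch = next(note_iter, None)   # pattern persists even if notes run out
            result.append({"pitch": pitch, **step})
        else:                               # rests and ties carry no new pitch
            result.append({"pitch": None, **step})
    return result

# First performance captured and edited (velocity of step 2 raised to 110)
captured = [
    {"position": 1, "note_type": "note", "velocity": 96},
    {"position": 2, "note_type": "note", "velocity": 110},
    {"position": 3, "note_type": "rest", "velocity": 0},
]
# Second chord played live (F major); it inherits the captured pattern
print(apply_captured_performance(captured, [65, 69, 72]))
```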
It should be appreciated that the specific steps illustrated in
In some implementations, system 2100 is configured to receive MIDI inputs 2104 (e.g., from a keyboard or other MIDI instrument) or from a user interface 2102. At 2110, the MIDI note message is deconstructed. For example, a MIDI note includes pitch information, velocity data, note type data (e.g., rest, tie, note, chord, etc.), and the like. The pitch data is analyzed (step 2120), the note order is processed (2130), and a MIDI message is assembled (2140). MIDI message assembly can include the note number (e.g., pitch), whether the note is on or off, etc. MIDI message assembly can further include velocity, which is further discussed below. The MIDI note message is finally output at MIDI output 2150.
Referring back to the MIDI note message deconstruction (2110), various performance data from the MIDI input is allocated to arpeggiator steps and memory (2160). Performance data can include MIDI velocity data, note data (e.g., whether the note is an actual note, a rest, or a chord), or other performance data, as would be appreciated by one of ordinary skill in the art. At 2170, the performance data is either live-played 2172 (i.e., not captured) or captured in a step grid 2174. The step grid can be edited by a user 2176, automated, or a combination thereof. The performance data (grid-edited or live-played) is passed to a note and velocity processing block 2180. The note and velocity processing block 2180 can determine whether the corresponding note will be played based on a note on/off generation block 2190 and corresponding rate input 2192, as well as the user edits of block 2182, which may correspond to the user edits of block 2176. Block 2180 further processes velocity data (i.e., performance data) received from the grid-editing section. The output of block 2180 is fed to MIDI message assembly block 2140, as further discussed above.
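The deconstruction and reassembly of MIDI note messages described for system 2100 can be illustrated as follows; the dictionary form of the incoming message is an assumption made for this sketch, and a real MIDI implementation handles many additional message types, channels, and running status.

```python
def deconstruct_midi_note(msg):
    """Split an incoming note message into its component performance data
    (roughly corresponding to deconstruction block 2110)."""
    return {"pitch": msg["note"], "velocity": msg["velocity"], "on": msg["on"]}

def assemble_midi_message(pitch, velocity, note_on=True, channel=0):
    """Build a raw 3-byte MIDI note message (roughly block 2140):
    status byte (note-on/off plus channel), note number, velocity."""
    status = (0x90 if note_on else 0x80) | (channel & 0x0F)
    return bytes([status, pitch & 0x7F, velocity & 0x7F])

step = deconstruct_midi_note({"note": 64, "velocity": 100, "on": True})
print(assemble_midi_message(step["pitch"], step["velocity"]).hex())  # '904064'
```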
System Architecture
It should be appreciated that system 1800 as shown in
In some embodiments, display subsystem 1805 can provide an interface that allows a user to interact with system 1800. The display subsystem 1805 may be a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), a projection device, a touch screen, or the like. In general, use of the term “output device” is intended to include all possible types of devices and mechanisms for outputting information from system 1800. For example, a software keyboard may be displayed using a flat-panel screen. In some embodiments, the display subsystem 1805 can be a touch-sensitive interface (also sometimes referred to as a touch screen), where the display provides both an interface for outputting information to a user of the device and an interface for receiving inputs. In other embodiments, there may be separate input and output subsystems. Through the display subsystem 1805, the user can view and interact with a GUI (Graphical User Interface) 1820 of system 1800. Processing unit(s) 1810 can include one or more processors, each having one or more cores. In some embodiments, processing unit(s) 1810 can execute instructions stored in storage subsystem 1815. System 1800 can further include an audio system to play music (e.g., accompaniments, musical performances, etc.) through one or more audio speakers (not shown).
Communications system 1860 can include various hardware, firmware, and software components to enable electronic communication between multiple computing devices. Communications system 1860 or components thereof can communicate with other devices via Wi-Fi, Bluetooth, infra-red, or any other suitable communications protocol that can provide sufficiently fast and reliable data rates to support the real-time performance functionality described herein.
Storage subsystem 1815 can include various memory units such as a system memory 1830, a read-only memory (ROM) 1840, and a non-volatile storage device 1850. The system memory can be a read-and-write memory device or a volatile read-and-write memory, such as dynamic random access memory. System memory 1830 can store some or all of the instructions and data that the processor(s) or processing unit(s) need at runtime. ROM 1840 can store static data and instructions that are used by processing unit(s) 1810 and other modules of system 1800. Non-volatile storage device 1850 can be a read-and-write capable memory device. Embodiments of the invention can use a mass-storage device (such as a magnetic or optical disk or flash memory) as a permanent storage device. Other embodiments can use a removable storage device (e.g., a floppy disk, a flash drive) as a non-volatile (e.g., permanent) storage device.
Storage subsystem 1815 can store MIDI (Musical Instrument Digital Interface) data relating to notes played on a virtual instrument of system 1800 in MIDI database 1832. A performance data database 1834 can store performance data including velocity data, note identifier data (e.g., note, rest, note tie), rhythmic data, and the like. Further detail regarding the system architecture and its auxiliary components (e.g., input/output controllers, memory controllers, etc.) is not discussed here so as not to obscure the focus of the invention and would be understood by those of ordinary skill in the art.
Processing unit(s) 1905 can include a single processor, which can have one or more cores, or multiple processors. In some embodiments, processing unit(s) 1905 can include a general purpose primary processor as well as one or more special purpose co-processors such as graphics processors, digital signal processors, or the like. In some embodiments, some or all processing units 1905 can be implemented using customized circuits, such as application specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs). In some embodiments, such integrated circuits execute instructions that are stored on the circuit itself. In other embodiments, processing unit(s) 1905 can execute instructions stored in storage subsystem 1910.
Storage subsystem 1910 can include various memory units such as a system memory, a read-only memory (ROM), and a permanent storage device. The ROM can store static data and instructions that are needed by processing unit(s) 1905 and other modules of electronic device 1900. The permanent storage device can be a read-and-write memory device. This permanent storage device can be a non-volatile memory unit that stores instructions and data even when computer system 1900 is powered down. Some embodiments of the invention can use a mass-storage device (such as a magnetic or optical disk or flash memory) as a permanent storage device. Other embodiments can use a removable storage device (e.g., a floppy disk, a flash drive) as a permanent storage device. The system memory can be a read-and-write memory device or a volatile read-and-write memory, such as dynamic random access memory. The system memory can store some or all of the instructions and data that the processor needs at runtime.
Storage subsystem 1910 can include any combination of computer-readable storage media including semiconductor memory chips of various types (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory) and so on. Magnetic and/or optical disks can also be used. In some embodiments, storage subsystem 1910 can include removable storage media that can be readable and/or writeable; examples of such media include compact disc (CD), read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), read-only and recordable Blu-ray® disks, ultra-density optical disks, flash memory cards (e.g., SD cards, mini-SD cards, micro-SD cards, etc.), magnetic “floppy” disks, and so on. The computer-readable storage media do not include carrier waves and transitory electronic signals passing wirelessly or over wired connections.
In some embodiments, storage subsystem 1910 can store one or more software programs to be executed by processing unit(s) 1905, such as a user interface 1915. As mentioned, “software” can refer to sequences of instructions that, when executed by processing unit(s) 1905, cause computer system 1900 to perform various operations, thus defining one or more specific machine implementations that execute and perform the operations of the software programs. The instructions can be stored as firmware residing in read-only memory and/or as applications stored in magnetic storage that can be read into memory for processing by a processor. Software can be implemented as a single program or a collection of separate programs or program modules that interact as desired. Programs and/or data can be stored in non-volatile storage and copied in whole or in part to volatile working memory during program execution. From storage subsystem 1910, processing unit(s) 1905 can retrieve program instructions to execute and data to process in order to execute various operations described herein.
A user interface can be provided by one or more user input devices 1920, display device 1925, and/or one or more other user output devices (not shown). Input devices 1920 can include any device via which a user can provide signals to computing system 1900; computing system 1900 can interpret the signals as indicative of particular user requests or information. In various embodiments, input devices 1920 can include any or all of a keyboard, touch pad, touch screen, mouse or other pointing device, scroll wheel, click wheel, dial, button, switch, keypad, microphone, and so on.
Output devices 1925 can display images generated by electronic device 1900. Output devices 1925 can include various image generation technologies, e.g., a cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode (LED) display including organic light-emitting diodes (OLED), projection system, or the like, together with supporting electronics (e.g., digital-to-analog or analog-to-digital converters, signal processors, or the like), indicator lights, speakers, tactile “display” devices, headphone jacks, printers, and so on. Some embodiments can include a device such as a touchscreen that functions as both an input and an output device.
In some embodiments, output device 1925 can provide a graphical user interface, in which visible image elements in certain areas of output device 1925 are defined as active elements or control elements that the user selects using user input devices 1920. For example, the user can manipulate a user input device to position an on-screen cursor or pointer over the control element, then click a button to indicate the selection. Alternatively, the user can touch the control element (e.g., with a finger or stylus) on a touchscreen device. In some embodiments, the user can speak one or more words associated with the control element (the word can be, e.g., a label on the element or a function associated with the element). In some embodiments, user gestures on a touch-sensitive device can be recognized and interpreted as input commands; these gestures can be, but need not be, associated with any particular area in output device 1925. Other user interfaces can also be implemented.
Network interface 1935 can provide voice and/or data communication capability for electronic device 1900. In some embodiments, network interface 1935 can include radio frequency (RF) transceiver components for accessing wireless voice and/or data networks (e.g., using cellular telephone technology; advanced data network technology such as 3G, 4G, or EDGE; Wi-Fi (IEEE 802.11 family standards); other mobile communication technologies; or any combination thereof), GPS receiver components, and/or other components. In some embodiments, network interface 1935 can provide wired network connectivity (e.g., Ethernet) in addition to or instead of a wireless interface. Network interface 1935 can be implemented using a combination of hardware (e.g., antennas, modulators/demodulators, encoders/decoders, and other analog and/or digital signal processing circuits) and software components.
Bus 1940 can include various system, peripheral, and chipset buses that communicatively connect the numerous internal devices of electronic device 1900. For example, bus 1940 can communicatively couple processing unit(s) 1905 with storage subsystem 1910. Bus 1940 also connects to input devices 1920 and display 1925. Bus 1940 also couples electronic device 1900 to a network through network interface 1935. In this manner, electronic device 1900 can be a part of a network of multiple computer systems (e.g., a local area network (LAN), a wide area network (WAN), an Intranet, or a network of networks, such as the Internet). Any or all components of electronic device 1900 can be used in conjunction with the invention.
Some embodiments include electronic components, such as microprocessors, storage, and memory that store computer program instructions in a computer-readable storage medium. Many of the features described in this specification can be implemented as processes that are specified as a set of program instructions encoded on a computer-readable storage medium. When these program instructions are executed by one or more processing units, they cause the processing unit(s) to perform various operations indicated in the program instructions. Examples of program instructions or computer code include machine code, such as is produced by a compiler, and files including higher-level code that are executed by a computer, an electronic component, or a microprocessor using an interpreter.
It will be appreciated that computer system 1900 is illustrative and that variations and modifications are possible. Computer system 1900 can have other capabilities not specifically described here (e.g., mobile phone, global positioning system (GPS), power management, one or more cameras, various connection ports for connecting external devices or accessories, etc.). Further, while computer system 1900 is described with reference to particular blocks, it is to be understood that these blocks are defined for convenience of description and are not intended to imply a particular physical arrangement of component parts. Further, the blocks need not correspond to physically distinct components. Blocks can be configured to perform various operations, e.g., by programming a processor or providing appropriate control circuitry, and various blocks might or might not be reconfigurable depending on how the initial configuration is obtained. Embodiments of the present invention can be realized in a variety of apparatus including electronic devices implemented using any combination of circuitry and software.
While the invention has been described with respect to specific embodiments, one skilled in the art will recognize that numerous modifications are possible, including modifications to the displayed representation of the user interface 130 and the configuration of the various elements therein, such as their position, organization, and function, as well as to filtering rules and analysis, and the like. Thus, although the invention has been described with respect to specific embodiments, it will be appreciated that the invention is intended to cover all modifications and equivalents within the scope of the following claims.
Network 2006 may include one or more communication networks, which could be the Internet, a local area network (LAN), a wide area network (WAN), a wireless or wired network, an Intranet, a private network, a public network, a switched network, or any other suitable communication network. Network 2006 may include many interconnected systems and communication links including but not restricted to hardwire links, optical links, satellite or other wireless communications links, wave propagation links, or any other ways for communication of information. Various communication protocols may be used to facilitate communication of information via network 2006, including but not restricted to TCP/IP, HTTP protocols, extensible markup language (XML), wireless application protocol (WAP), protocols under development by industry standard organizations, vendor-specific protocols, customized protocols, and others. In the configuration depicted in
In the configuration depicted in
It should be appreciated that various different distributed system configurations are possible, which may be different from distributed system 2000 depicted in
While the invention has been described with respect to specific embodiments, one skilled in the art will recognize that numerous modifications are possible. Thus, although the invention has been described with respect to specific embodiments, it will be appreciated that the invention is intended to cover all modifications and equivalents within the scope of the following claims.
The above disclosure provides examples and aspects relating to various embodiments within the scope of claims, appended hereto or later added in accordance with applicable law. However, these examples are not limiting as to how any disclosed aspect may be implemented.
All the features disclosed in this specification (including any accompanying claims, abstract, and drawings) can be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
Any element in a claim that does not explicitly state “means for” performing a specified function, or “step for” performing a specific function, is not to be interpreted as a “means” or “step” clause as specified in 35 U.S.C. §112, sixth paragraph. In particular, the use of “step of” in the claims herein is not intended to invoke the provisions of 35 U.S.C. §112, sixth paragraph.