The present disclosure relates generally to electronic musical instruments, and, more particularly, to an electronic musical instrument with separated pitch and articulation control.
Existing electronic musical instruments (EMIs) tend to be modeled on a well-known, traditional acoustic instrument, such as the piano, guitar, or saxophone. Electronic disc-jockeys (DJs) are also limited to the form factors of laptops, switchboards, electronic turntables, etc.
Moreover, existing keyboard or percussion EMIs also combine pitch selection and sound triggering (articulation) within the same hand placement. For example, an electronic keyboard has a series of keys, where depressing a first key produces a first sound (first pitch), depressing a second key produces a second and different sound (second pitch), and so on. This makes bends or modulations (changing the pitch of a sound) awkward and unnatural and limits rhythmic control.
Additionally, existing guitar EMIs separate pitch from rhythm control, but fixed fret buttons do not allow bending pitch in a natural way. Also, existing wind and percussion EMIs are limited to their modeled playing style and lack the flexibility to be played in any other way.
Still further, conventional touchscreen EMIs, such as simple piano keys projected on a tablet screen, provide no sense of touch, no velocity, and no volume control. That is, such instruments do not determine how hard a key was hit, so there is no control over how soft or loud a sound is to be played.
According to one or more embodiments herein, an electronic musical instrument (EMI) (or “electronic multi-instrument”) is described that separates pitch choice from percussive sound control (“articulation”). A pitch sensor interface (by which notes are selected) may comprise a software-programmed touchscreen interface (that can be modeled on existing musical instruments or entirely new) configured to allow pitch choice, while sound control may be made on a separate articulation control sensor (by which notes are triggered and modified), such as an illustrative double-sided touch pad, that senses one or more of a velocity, pressure, movement, and location of a user's contact. The design facilitates a portable, ergonomic, and intuitive way to express music via standard digital protocols (e.g., MIDI) through a physical interface that encourages fluid, non-static, personally distinguishable musical expression. Notably, the instrument may illustratively be a controller that requires a compatible synthesizer sound source (e.g., on-board or separate).
This separation of pitch from articulation/percussion solves several problems faced by existing electronic musical instruments, providing greater detail for expression of each note, improved rhythmic feel, natural pitch movement, and more precise velocity control.
Notably, this summary is meant to be illustrative of certain example aspects and embodiments of the detailed description below, and is not meant to be limiting to the scope of the present invention herein.
The embodiments herein may be better understood by referring to the following description in conjunction with the accompanying drawings in which like reference numerals indicate identically or functionally similar elements, of which:
Electronic musical instruments (EMIs), such as Musical Instrument Digital Interface (MIDI) controller instruments and synthesizers, have many capabilities. However, the controller mechanisms, although numerous, are disjointed and difficult to manage. For example, sliders, wheels, foot controllers, and so on are conventional features used for enhanced electronic control, which may be located at random places on an instrument. Furthermore, certain instruments may not have such features at all, and some musicians might desire such features or even greater control. For example, some electronic keyboards have a built-in pitch-bender lever or wheel, where played notes may be bent in pitch (e.g., ½ tone up and/or down). However, not all electronic keyboards have such functionality, and those that do have pitch-benders are limited to merely bending pitch.
The novel EMI described herein, on the other hand, solves these problems by offering a single ergonomic multi-sided articulation surface that provides a way to fluidly and intuitively manipulate performance parameters and rhythm. This surface integrates with any pitch selection component to allow seamless transitions between staccato/legato articulations, timbre, amplitude, and pitch-bend. The techniques described below also allow for the use of a touchscreen interface that does not support force, so that natural velocity can be easily added to this interface.
As described below, the system need not directly emulate any particular instrument, yet any musician regardless of background can adapt to play it (e.g., whether configured as a guitar, keyboard, wind instrument, percussion instrument, and so on, or even as another non-standard interface).
According to one or more embodiments described herein, the illustrative EMI allows for various interfaces to be displayed and/or used for pitch selection. Rhythmic playing is enhanced by the addition of a separate tactile input controller to trigger selected notes and to modulate tone. In particular, as described in greater detail below, a combination of input methods, such as a touchscreen paired with a touchpad, allows for flexible playing styles, more precise rhythmic control, and a more fluid/natural way to control complex performance parameters.
Specifically, according to a first aspect of the present disclosure described in greater detail below, an adaptable touchscreen configuration may provide a graphic user interface that can be programmed via software into a unique note configuration or modeled on an existing acoustic instrument (e.g., keyboard, strings, valves, percussion, etc.). It can also dynamically adapt to left or right-handed playing, varied hand sizes, and other user requirements. According to a second aspect of the present disclosure described in greater detail below, the separation of note selection and note trigger solves two problems in touchscreen-based instruments: velocity and latency. That is, touchscreens do not easily detect strike velocity, and as such the volume of a note is no different between a softly struck note and a firmly struck note. Also, regarding latency, touchscreen scan rates are generally too low to pick up very fast pitch changes. Moving the note trigger to a separate articulation (or percussion) pad, which detects velocity with no limitation on scan rate, solves both problems.
Furthermore, for general reference during the description below,
According to the techniques herein, a primary component of the embodiments herein is pitch control for the EMI. As such, various types of pitch detection sensors (PS) may be used. For instance, a pitch detection (or control) sensor may be configured as either a hardware sensor array (e.g., physical piano keys or other buttons with sensor pickup technology) or a software-defined touch-sensitive display (e.g., a displayed image of piano keys on a touchscreen, such as a MIDI keyboard). Singular and/or plural note selection is supported, and in the illustrative (and preferred) embodiment herein, selected notes need not (and preferably do not) trigger until the articulation sensor (e.g., pad/exciter) portion is “struck”.
According to an illustrative embodiment, the pitch sensor may be configured as an open touchscreen interface that can be programmed via software into a unique configuration or modeled on an existing acoustic instrument (keyboard, strings, valves, percussion, etc.), as a user's choice. Touching the visible graphics (that is, selecting one or more independent notes, chords, sounds, etc.) will select the musical notes, and sliding between notes may allow for corresponding pitch changes. Pitch selection can be polyphonic or monophonic. Once the note is selected, sliding movements will create pitch bends or vibrato, based on lengths and directions determined by the software. This leads to flexible and intuitive pitch control similar to an acoustic instrument, limited only by the software and the client synthesizer.
Said differently, pitch selection may be illustratively embodied as a touchscreen capable of detecting X-axis and Y-axis position and movements, and that is capable of translating X/Y positions to musical notes (e.g., MIDI notes, such as fretted or keyboard quantized) and pitch-bend (high-resolution) data. The touchscreen may be a variable design (e.g., touchscreen with display capabilities), or may be fixed (e.g., touchpad with printed graphics). Also, in one embodiment, the actual pitch selection sensor component may be fixed to the EMI, or may be removable and/or interchangeable (e.g., different locations of a pitch selection component from the articulation component described below, or else for interchanging between different (static) pitch selection configurations, such as switching from a piano keyboard to a guitar fretboard).
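For purposes of illustration only, the following Python sketch shows one possible translation of a normalized touch position into a quantized (“fretted”) MIDI note plus high-resolution pitch-bend data; the screen geometry, base note, and bend range used here are assumptions for the example, not requirements of the embodiments herein:

```python
# Illustrative sketch only: translate a normalized touchscreen X position
# into a quantized ("fretted") MIDI note plus 14-bit pitch-bend data.
# The geometry (12 semitones across the screen, a +/-2 semitone synth
# bend range) and the base note are assumptions for this example.

SEMITONES_ACROSS = 12   # notes spanned by the full screen width
BASE_NOTE = 60          # MIDI note at the left edge (middle C)
BEND_RANGE = 2.0        # synth pitch-bend range, in semitones

def touch_to_note_and_bend(x: float):
    """Quantize x (0.0-1.0) to the nearest note and express the
    fractional remainder as a pitch-bend value centered at 8192."""
    position = x * SEMITONES_ACROSS
    nearest = round(position)
    note = BASE_NOTE + nearest
    offset = position - nearest          # -0.5..+0.5 semitone remainder
    bend = int(8192 + (offset / BEND_RANGE) * 8191)
    return note, max(0, min(16383, bend))

note, bend = touch_to_note_and_bend(0.53)
# MIDI pitch-bend message: status byte, then the 14-bit value split
# into low and high 7-bit data bytes.
bend_message = bytes([0xE0, bend & 0x7F, (bend >> 7) & 0x7F])
```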
(Note that as described below, pitch selection may be capable of seamless detection of pitch in between notes, i.e., independent note pitch-bend (e.g., multidimensional polyphonic expression, “MPE”). The pitch sensor herein, therefore, solves the issue of pitch-bend not being per note via the MIDI spec, as described below.)
Another primary component of the embodiments herein is articulation control (rhythm, percussion, etc.) for the EMI. An articulation/excitation sensor (AS) assembly is illustratively a multi-faceted ergonomic touch-sensitive sensor array for enhanced musical expression. In one preferred embodiment, a double-sided touch pad may be struck by a human hand and it (e.g., in conjunction with sensor-reading software) may measure the velocity, pressure, location, and movement of the hand strike. The touch pad provides tactile feedback and can be struck in many ways and in multiple areas, such as, for example, simple tapping or pressing, strumming up and down like a guitar, drummed like a tabla, or by sliding back and forth like a violin. The X/Y spatial location of the strike can determine tone, crossfade between different sounds, etc., depending upon implementation. Strikes on each side of a double-sided pad could be set to arpeggiate for an up/down strum-like effect. The range of effects is only limited by software and the client synthesizer.
In more general detail, an example touchpad may be a force-sensing resistor (or force-sensitive resistor) (FSR) pad, which illustratively comprises FSR 4-wire sensors for XYZ sensing, preferably with enough space and resolution for ease of sliding hand movement to facilitate natural musical articulations, such as, among others, timbre, harmonics, envelope, bowing, sustain, staccato, pizzicato, etc. Though a simple embodiment merely requires a touch “on/off” sensing ability, and even more sophistication comes with a force-sensing ability (i.e., velocity or “how hard” a user strikes the pad), the illustratively preferred XYZ sensor indicates response from three dimensions: X-axis, Y-axis, and Z-axis (force/velocity). That is, the main surfaces of an illustrative touchpad use 3D plane resistive touch sensor technology for X, Y, and Z axis position response.
Illustratively, and with reference to diagram 400 of
In one embodiment, the XYZ FSR sensor design and firmware may be capable of low-latency, e.g., <1 ms, velocity detection. In another embodiment, the XYZ sensor outputs data using the universal serial bus (USB) communication protocol.
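By way of a non-limiting example, such low-latency velocity detection might be sketched by sampling the Z-axis force signal and measuring its initial rate of rise; the sensor-read interface, threshold, and calibration constant below are assumptions for illustration only:

```python
import time

# Illustrative sketch only: estimate strike velocity from the initial rise
# of the Z-axis (force) signal of an FSR pad. The read_force() interface,
# the threshold, and the scaling constant are assumptions; a real design
# would sample the sensor's ADC at a few kHz or faster.

THRESHOLD = 0.02     # normalized force treated as first contact
WINDOW_S = 0.0005    # 0.5 ms measurement window after contact

def read_force() -> float:
    """Placeholder for the actual FSR/ADC read (normalized 0.0-1.0)."""
    raise NotImplementedError

def detect_strike_velocity() -> int:
    while read_force() < THRESHOLD:      # wait for contact
        pass
    start = time.perf_counter()
    first = read_force()
    while time.perf_counter() - start < WINDOW_S:
        pass
    rise = read_force() - first          # how quickly force built up
    # Map the rise rate to MIDI velocity 1-127; the 0.1 scaling factor
    # is an arbitrary calibration constant for this sketch.
    return max(1, min(127, int((rise / WINDOW_S) * 0.1)))
```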
In general, on the articulation control, pad strikes determine, for one or more notes, the velocity/amplitude/transient. Subsequent note movement while the pad is active may result in (no transient) legato articulation. Subsequent pad strikes may then result in re-triggering of the selected note transient. In certain embodiments, the location of the strike on a pad may result in various timbre and/or envelope modifications of the sound. Furthermore, velocity is determined by force and velocity of striking the pad. Subsequent force after the strike may control legato amplitude, unless using a velocity-capable keyboard for pitch selection. In that case, legato velocity may be determined by the MIDI keyboard input.
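A minimal sketch of this trigger/legato/re-trigger logic may help illustrate the behavior; the helper names and raw MIDI message layout below are assumptions for the example:

```python
# Illustrative sketch only of the trigger/legato/re-trigger behavior
# described above; send_midi() and the note bookkeeping are assumptions.

def send_midi(message: bytes):
    """Placeholder for the actual MIDI output (e.g., USB-MIDI write)."""
    raise NotImplementedError

class Articulator:
    def __init__(self):
        self.pad_active = False
        self.sounding = []                 # notes currently sounding

    def on_pad_strike(self, selected_notes, velocity):
        # A strike (re-)triggers the selected notes with a fresh transient.
        for note in self.sounding:
            send_midi(bytes([0x80, note, 0]))            # note-off
        for note in selected_notes:
            send_midi(bytes([0x90, note, velocity]))     # note-on
        self.sounding = list(selected_notes)
        self.pad_active = True

    def on_pitch_change(self, selected_notes, velocity):
        # While the pad remains active, newly selected notes sound legato;
        # whether the transient is suppressed depends on the synthesizer's
        # envelope/legato settings.
        if not self.pad_active:
            return
        for note in selected_notes:
            if note not in self.sounding:
                send_midi(bytes([0x90, note, velocity]))
                self.sounding.append(note)

    def on_pad_release(self):
        for note in self.sounding:
            send_midi(bytes([0x80, note, 0]))
        self.sounding = []
        self.pad_active = False
```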
The use of an articulation sensor thus solves the issue that touchscreens generally do not provide force sensitivity to allow for velocity information, as well as the issue that pitch-bend/modulation wheels are awkward to use simultaneously. Moreover, the use of an articulation sensor in this manner also expands continuous controller (CC) expression and versatility, as may be appreciated by those skilled in the art (e.g., as defined by the MIDI standards).
Multi-faceted ergonomic touch-sensitive articulation sensors, such as a dual-sided articulation sensor configuration 500 shown in
Said differently, each FSR pad may be located on opposite sides of a hand-sized parallelepiped (or rectangular cuboid) facilitating rapid percussive strikes and sliding movements over the X/Y axis. The Z axis can also be accessed following a strike by applying pressure. The axis movements may send data in a mirror configuration to facilitate natural up and down strikes of a hand (e.g., sliding the hand in the same direction). That is, the two XYZ sensors may be (though need not be) identical in size and shape, and mirror axis movements such that natural movements from both sides intuitively result in the same expressions. This also facilitates left or right hand play and a variety of play variations. In one embodiment, however, as an alternative to synchronized articulation pads, each pad may offer individual input and control, for more advanced control and instrument play.
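As a brief illustration of the mirroring concept (with an assumed normalized coordinate convention):

```python
# Illustrative sketch only: reflect the rear pad's X axis so that the same
# physical hand motion produces the same data from either face. The
# normalized 0.0-1.0 coordinate convention is an assumption.

def normalize_pad_reading(x: float, y: float, z: float, side: str):
    if side == "rear":
        x = 1.0 - x    # mirror so front/rear gestures match
    return x, y, z
```

Under such a convention, mirrored gestures on either face yield identical X/Y/Z data, which is all that the synchronized dual-pad mode requires; the alternative individual-control mode would simply omit the reflection.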
According to one or more embodiments herein, the EMI may be preferably configured to combine the pitch selection control and the articulation/excitation control. For instance, in one embodiment, one hand of a user/player may select pitch on the touchscreen (pitch sensor), while the other hand triggers the sound by striking the touch pad (articulation sensor). The harder (more velocity) the articulation sensor is struck, the louder the notes selected by the pitch sensor may be played. The longer the articulation sensor is held down, the longer the notes selected by the pitch sensor may be played. Similarly, as described above, sliding the user's fingers along the touchscreen (e.g., in the X- and/or Y-axis direction) allows for various control and/or expression (e.g., sliding to pitch-bend, circling to create natural vibrato, or “fretless” slide effect, and so on). This separation of control provides greater detail and flexibility for a wider range of musical expressions (which is particularly good for percussive playing styles, but can be played in a variety of ways depending on implementation).
Note that striking the touch pad without selecting a pitch may be configured to trigger a non-pitched percussive sound for rhythmic effect. That is, without any selected pitch, tapping the articulation sensor may produce a muted sound, such as muted/dampened strings, hitting the body of an acoustic guitar, or other percussive sounds or noises as dictated by the associated control software. Note that selecting notes on the pitch sensor without striking the articulation sensor may generally result in no sound (i.e., mute), or else alternatively, if so configured, may play in legacy mode, e.g., “tapping”.
In one embodiment, touching and holding an articulation sensor (or both articulation sensors simultaneously) may enable other play modes, such as legacy mode to allow piano-like playback from the pitch sensor, i.e., standard single hand keyboard play with note-on triggering control transferred back to the pitch selection component. In this mode, X/Y movement on the articulation sensor and its corresponding functionality may remain active. Moreover, additional Z-axis pressure/force (e.g., “squeezing” the pad) may control volume/velocity, though in alternative configurations in this mode, other axis movement (e.g., X-axis) may be used to control volume/velocity. This is particularly useful if the pitch selection device is a capacitive touchscreen that does not support force detection. Further, other arrangements may be made, such as holding down a first articulation sensor to allow piano play by the pitch sensor, and pressing on a second articulation sensor for features such as sustain.
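By way of example only, the dispatch among these behaviors might be sketched as follows (the percussion note number and all names are assumptions for illustration):

```python
# Illustrative sketch only of the mode dispatch described above; the
# percussion note number and function names are assumptions.

PERCUSSION_NOTE = 38   # assumed non-pitched sound (e.g., muted strings)

def handle_pad_strike(selected_notes, velocity, legacy_mode):
    """Return the (note, velocity) pairs to trigger for this strike."""
    if legacy_mode:
        return []      # note-on control is transferred to the pitch sensor
    if not selected_notes:
        return [(PERCUSSION_NOTE, velocity)]   # rhythmic, non-pitched hit
    return [(note, velocity) for note in selected_notes]
```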
According to one or more embodiments herein, a damper sensor may be used to facilitate quick, intuitive dampening of ringing notes during play. For instance, the EMI may comprise one or two damper sensor(s), e.g., ribbon soft-pot voltage detection sensors, which may be positioned in proximity to the XYZ sensor or between dual XYZ sensors (e.g., orthogonally to the other sensors). Illustratively, a damper sensor only requires on/off functionality, e.g., to send MIDI CC 64 data. Notably, this damper sensor (e.g., 315 above) may be generally an additional sensor, and may be used for any suitably configured control, such as to mute, sustain, damper, etc., as well as any other control program change (e.g., tone changes, program changes, instrument changes, etc., such as cycling through various configurations/programs, accordingly).
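For illustration, a minimal sketch of the damper behavior, mapping the ribbon's on/off state to MIDI CC 64 (sustain), might be:

```python
# Illustrative sketch only: map the on/off damper ribbon to MIDI CC 64
# (sustain). send_midi() is an assumed placeholder for the real output.

def send_midi(message: bytes):
    raise NotImplementedError   # placeholder for actual MIDI output

def on_damper_change(pressed: bool):
    send_midi(bytes([0xB0, 64, 127 if pressed else 0]))   # CC 64 on/off
```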
As mentioned above, control software according to the techniques herein may comprise a computer-based application (e.g., desktop, laptop, tablet, smartphone, etc.) that supports input from the EMI and peripheral control device (e.g., USB) and EMI input/output (I/O) generally (e.g., MIDI). The communication between the EMI, peripheral control device, and the control software may illustratively be USB direct, though other embodiments may utilize one or more of wireless, MIDI, Ethernet, and so on. Note that in one embodiment, the control software may be integrated into the EMI hardware for a more self-contained implementation, or else in another embodiment may be contained remotely (e.g., through a wired or wireless connection, or even over an Internet connection) on a standard operating system (OS) such as MICROSOFT WINDOWS, APPLE MACOSX or IOS, or ANDROID operating systems.
As also mentioned above, the pitch sensor, as well as the articulation sensor, may be capable of high scan rates for low-latency detection, and the control software is thus correspondingly configured to correlate the differentiated sensor input and translate the input from both sensors into a digital musical standard for output, e.g., MIDI. For example, the control software may correlate and store the pitch sensor information, and then may trigger the pitch data at rhythmic moments, velocities, and durations as dictated by strikes to the articulation sensor(s).
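A simplified sketch of this correlate-and-trigger behavior (with assumed names) follows:

```python
# Illustrative sketch only: the control software caches the latest
# pitch-sensor selection and emits note-ons only when the articulation
# sensor reports a strike; all names are assumptions.

def send_midi(message: bytes):
    raise NotImplementedError   # placeholder for actual MIDI output

class ControlSoftware:
    def __init__(self):
        self.selected = []              # latest notes from the pitch sensor

    def on_pitch_event(self, notes):
        self.selected = list(notes)     # remember selection; no sound yet

    def on_articulation_strike(self, velocity):
        # Rhythm, velocity, and duration come from the pad, not the screen.
        for note in self.selected:
            send_midi(bytes([0x90, note, velocity]))
```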
The control software may also be configured to manage the configuration of the EMI, such as the mode and configuration of the pitch sensor, as well as to select from various presets to manage user configurations and synthesizers. Other controls may include managing channel and pitch-bend data via MPE standards, as well as further capability for parsing MIDI input and generating output MIDI commands. Further, the control software may be capable of creating and storing user presets to manage setups, configurations, ranges, CC data mapping, and so on.
Note that because of the unique configuration of the separated pitch sensor and articulation sensor(s), various musical features are made available by the embodiments herein. For instance, the embodiments support polyphonic pitch-bend (Multidimensional Polyphonic Expression or “MPE”) with re-triggering support during a bend, as well as polyphonic pitch-bend (MPE) with full legato support, i.e., mono-synth-style envelope response with chords (e.g., a super lap steel guitar style of play). (Polyphonic legato is similar to a guitar's “hammer-on” technique.) Note that MPE allows for pitch-per-note control, i.e., independent control of each note, not simply all selected notes moving in the same direction (e.g., ½ tone up/down), but rather however so configured and/or controlled (e.g., some notes up, some down). (At the same time, of course, simultaneous pitch-bend, XYZ articulation, and rhythm triggering are configurable and controllable in any suitable manner as well.) Various abilities to slide between notes are available in different configurations, e.g., sliding along a cello between strings, a keyboard shifting from a triad to a 6/9 chord, etc. Further, subtle randomness in the X/Y locations of note triggers can create a less static, unique-to-player sound. Additional articulation and pitch capabilities are thus offered beyond those of conventional MIDI controllers.
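To make the per-note pitch-bend concrete, the following sketch shows the conventional MPE approach of allocating each sounding note its own MIDI channel, so that pitch-bend messages move one note independently of the others; the channel numbering and helper names are assumptions for illustration:

```python
# Illustrative sketch only of conventional MPE-style channel rotation:
# each sounding note is allocated its own MIDI channel (channels 2-16
# here, with channel 1 reserved as the master channel).

def send_midi(message: bytes):
    raise NotImplementedError   # placeholder for actual MIDI output

free_channels = list(range(1, 16))   # 0-indexed: MIDI channels 2-16
note_channel = {}                    # note -> allocated channel

def mpe_note_on(note: int, velocity: int):
    channel = free_channels.pop(0)
    note_channel[note] = channel
    send_midi(bytes([0x90 | channel, note, velocity]))

def mpe_bend(note: int, bend14: int):
    channel = note_channel[note]     # bend affects only this note
    send_midi(bytes([0xE0 | channel, bend14 & 0x7F, (bend14 >> 7) & 0x7F]))

def mpe_note_off(note: int):
    channel = note_channel.pop(note)
    send_midi(bytes([0x80 | channel, note, 0]))
    free_channels.append(channel)
```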
The physical layout of the EMI described herein may vary based on user design, preference, and style. Having a virtual interface provides the advantages of any interface for any type of player, and allows an adjustable interface for different styles, hand sizes, etc. In addition, a virtual interface provides something unique for the audience to see while performing. The display on a touchscreen, or any physically changeable pitch sensor modules, may consist of any of a keyboard, strings, valves, percussion, DJ controller boards, or other custom/alternative designs. In one embodiment, touchscreen technology that actually changes shape (e.g., “bubble-up” technology) may be used to add a tactile feel (e.g., key or valve locations) for user sensation. There could even be a non-musician mode for singer self-accompaniment without knowledge of any traditional instruments.
Various overall configurations may be created for the EMI described herein, such as a portable version, a desktop version, a tablet version, a complete/embedded version, and so on. For instance,
According to the example embodiment EMI 600 in
In still another embodiment, a table-top version of the articulation sensors may be designed, such as the example three-sided device 700 as shown in
In fact, according to one or more embodiments herein, a peripheral control device may also be configured for any EMI, comprising at least one touch-sensitive control sensor (by which notes are modified and/or triggered) that senses one or more of a velocity, pressure, movement, and location of a user's contact, as described above. That is, a peripheral device is generally defined as any auxiliary device that connects to and works with the EMI in some way. For instance, the peripheral control device may interface with the EMI, or with the EMI controller software (e.g., MAINSTAGE). As described below, the design facilitates a portable, ergonomic, and intuitive way to express music via standard digital protocols (e.g., MIDI) through a peripheral physical interface that encourages fluid, non-static, personally distinguishable musical expression.
For instance, according to one or more embodiments herein, an XYZ Pad Expression Controller (e.g., an “expression sensor”) as mentioned above may respond simultaneously to three dimensions of touch (e.g., XY-axis location and Z-axis pressure) that may be rendered to MIDI. There are a wide number of potential musical uses, as described above, such as X for timbre, Y for envelope, and Z for velocity. As an alternative example for a peripheral device (or any device), the following actions may be configured:
Other configurations may be made, such as using different quadrants of the device for different controls, or else defining regions where different controls do or do not function (e.g., for the Y axis, having only the upper two-thirds of the device being used for modulation, while the lower third is used for pitch-bend with no modulation). The configuration can be changed with standard MIDI program change messages. Illustratively, changes will persist after reboot, though default configurations may also be used.
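By way of a non-limiting sketch of such region-based mapping (the thresholds, controller assignments, and choice of which axis drives each value are assumptions):

```python
# Illustrative sketch only of the region mapping described above; the
# thresholds and CC assignments are assumptions.

def map_peripheral_touch(x: float, y: float) -> bytes:
    """x and y are normalized 0.0-1.0, with y = 0.0 at the bottom edge."""
    if y >= 1.0 / 3.0:
        # Upper two-thirds: send modulation (CC 1), here scaled from X.
        return bytes([0xB0, 1, int(x * 127)])
    # Lower third: pitch-bend only, no modulation.
    bend = int(x * 16383)
    return bytes([0xE0, bend & 0x7F, (bend >> 7) & 0x7F])
```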
The form factor of the peripheral control device may be any suitable design (shape and/or size), such as the table-top design 700 above, or else any other design (e.g., flat surfaces, add-ons, etc.), some of which are described below. Also, the communication configuration for a peripheral control device may either be “parallel” or “serial”. For example,
As mentioned, various configuration controls for the functionality of the peripheral control device may be based on manufacturer-configured (static) configurations, or else may be controlled by the user through a control app's interpretation of the input signals, or else on the device itself, such as via a wireless or wired connection to a computer (e.g., phone, tablet, laptop, etc.).
The techniques described herein, therefore, provide generally for an electronic musical instrument with separated pitch and articulation controls. Advantageously, the embodiments herein solve several problems faced by existing electronic musical instruments. In particular, by separating pitch from percussion/articulation, the embodiments herein provide greater detail for expression of each note, improved rhythmic feel, natural pitch movement, and more precise velocity control. In addition, the specific embodiments shown and described above provide for comfortable and intuitive ergonomics of sensors, particularly the two-sided articulation XYZ sensors, in a manner that illustratively provides many (e.g., seven) parameters of control, which are conventionally only available through sliders and knobs on a production board (where even a production board doesn't allow for simultaneous control of the parameters).
Specifically, the articulator described above provides an intuitive way to modify timbre, envelope, and sustain in real-time, and there is no need for extra hands to manipulate cumbersome pedals or sliders. Also, while playing a touchscreen instrument, the articulator provides a way to add velocity (volume/force) control. For keyboardists, the EMI techniques herein provide polyphonic legato, seamless slides between notes/chords, and easy re-triggering of single notes/chords in a percussion style in a way never before available. For guitarists, the EMI techniques herein provide low-latency MIDI, multiple notes “per-string”, and pitch-bending between strings. Even further, for microtonalists, the techniques herein can provide a matrix interface or any alternative scale. Still further, the EMI herein can provide a way for beginners to play chords easily.
Furthermore, the techniques described herein may also provide generally for a peripheral control device for electronic musical instruments. In particular, by adding a control device to a legacy EMI, or else to an EMI with limited capability, the embodiments herein can still provide greater detail for expression of each note for any EMI.
Note also that pitch selection may be capable of seamless detection of pitch in between notes, i.e., independent note pitch-bend (e.g., multidimensional polyphonic expression, “MPE”). The techniques herein, therefore, also solve the issue of pitch-bend not being per note via the MIDI specification, whether directly incorporating MPE capability or else by being compatible with MPE-processing software.
Note that the embodiments above also provide the benefit, in certain configurations, of being a self-contained virtual studio technology (VST) device, where there is no need to connect the device to a phone, tablet, or PC, simply allowing the device to be plugged directly into an amp or PA system.
Those skilled in the art will appreciate that although certain embodiments, form factors, aspects, and use-cases, and particularly their associated advantages, have been described above, other arrangements may be contemplated according to the details described above that may provide additional advantages beyond those mentioned herein.
Illustratively, certain techniques described herein may be performed by hardware, software, and/or firmware, such as in accordance with the various processes of user devices, computers, personal computing devices (e.g., smartphones, tablets, laptops, etc.), online servers, and so on, which may contain computer executable instructions executed by processors to perform functions relating to the techniques described herein. That is, various systems and computer architectures may be configured to implement the techniques herein, such as various specifically-configured electronics, embedded electronics, various existing devices with certain programs, applications (apps), various combinations there-between, and so on.
For example, various computer networks (e.g., local area networks, wide area networks, the Internet, etc.) may interconnect devices through a series of communication links, such as through personal computers, routers, switches, servers, and the like. The communication links interconnecting the various devices may be wired or wireless links. Those skilled in the art will understand that any number and arrangement of nodes, devices, links, etc. may be used in a computer network, and any connections and/or networks shown or described herein are merely for example.
Illustratively, the computing devices herein (e.g., the EMI, the peripheral control device, or any device configured to operate in conjunction with the EMI or peripheral control device) may be configured in any suitable manner. For example, the device may have one or more processors and a memory, as well as one or more interface(s), e.g., ports or links (such as USB ports, MIDI ports, etc.). The memory comprises a plurality of storage locations that are addressable by the processor(s) for storing software programs and data structures associated with the embodiments described herein. The processor(s) may comprise necessary elements or logic adapted to execute software programs (e.g., apps) and manipulate data structures associated with the techniques herein (e.g., sounds, images, input/output controls, etc.). An operating system may be used, though in certain simplified embodiments, a conventional sensor-based configuration may be used (e.g., MIDI controllers with appropriate sensor input functionality).
It will be apparent to those skilled in the art that other processor and memory types, including various computer-readable media, may be used to store and execute program instructions pertaining to the techniques described herein. Also, while the description illustrates various processes, it is expressly contemplated that various processes may be embodied as modules configured to operate in accordance with the techniques herein (e.g., according to the functionality of a similar process). Further, while the processes may have been shown separately, or on specific devices, those skilled in the art will appreciate that processes may be routines or modules within other processes, and that various processes may comprise functionality split amongst a plurality of different devices (e.g., controller/synthesizer relationships).
In addition, it is expressly contemplated that certain components and/or elements of the present disclosure may be embodied as non-transitory computer readable media on a computer readable medium containing executable program instructions executed by a processor, controller or the like. Examples of the computer readable mediums include, but are not limited to, ROM, RAM, compact disc (CD)-ROMs, magnetic tapes, floppy disks, flash drives, smart cards, optical data storage devices, and other types of internal or external memory mediums. The computer readable recording medium can also be distributed in network coupled computer systems so that the computer readable media is stored and executed in a distributed fashion.
While there have been shown and described illustrative embodiments that provide for an electronic musical instrument with separate pitch and articulation control, as well as a peripheral control device for an electronic musical instrument, it is to be understood that various other adaptations and modifications may be made within the spirit and scope of the embodiments herein, with the attainment of some or all of their advantages. For instance, though many of the examples above illustrate certain configurations and styles (e.g., “look and feel”), other arrangements or configurations may be made, and the techniques herein are not limited to merely those illustrated in the figures. That is, it should be understood that aspects of the figures depicted herein, such as the depicted functionality, design, orientation, terminology, and the like, are for demonstration purposes only. Thus, the figures merely provide an illustration of the disclosed embodiments and do not limit the present disclosure to the aspects depicted therein. Also, while certain protocols are shown and described, such as MIDI, the embodiments herein may be used with other suitable protocols, as may be appreciated by those skilled in the art.
Accordingly, this description is to be taken only by way of example and not to otherwise limit the scope of the embodiments herein. Therefore, it is the object of the appended claims to cover all such variations and modifications as come within the true spirit and scope of the embodiments herein.
This application claims priority to U.S. Provisional Application No. 62/448,124 filed Jan. 19, 2017, entitled “ELECTRONIC MUSICAL INSTRUMENT WITH SEPARATE PITCH AND ARTICULATION CONTROL,” by Eric Netherland, the contents of which are hereby incorporated by reference.