The technology disclosed herein (the “technology”) relates to electronic musical instruments. More specifically, the technology relates to gestures for a touchscreen of an electronic percussion instrument.
Dedicated electronic percussion instruments, e.g., electronic drums, having a drum pad are known. An electronic drum is a percussion instrument in which the sound is generated by an electronic waveform generator or sampler instead of by acoustic vibration. Typically, when an electronic drum pad is struck (i.e., triggered), a voltage change is caused in an embedded piezoelectric transducer (piezo) or force sensitive resistor (FSR). The resultant signals are translated into digital waveforms, which produce the desired percussion sound assigned to that particular trigger pad. Most newer drum modules have trigger inputs for 2 or more cymbals, a kick, 3-4 toms, a dual-zone snare (head and rim), and a hi-hat. The hi-hat has a foot controller which produces open and closed sounds with some models offering variations in-between.
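The trigger chain described above (a piezo voltage change translated into a percussion sound) can be sketched as a simple voltage-to-velocity mapping. The threshold and full-scale values below are illustrative assumptions, not taken from any particular drum module.

```python
def peak_to_velocity(peak_voltage, threshold=0.05, full_scale=5.0):
    """Map a piezo peak voltage to a MIDI-style velocity (1-127).

    Voltages below `threshold` are ignored (no trigger); voltages at or
    above `full_scale` produce maximum velocity. All values here are
    illustrative, not taken from any particular drum module.
    """
    if peak_voltage < threshold:
        return 0  # below the trigger threshold: no sound
    ratio = min((peak_voltage - threshold) / (full_scale - threshold), 1.0)
    return max(1, round(ratio * 127))
```

A drum module would pass the resulting velocity to its waveform generator or sampler to scale the loudness of the sample assigned to that trigger pad.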
Percussion instrument functionality can be implemented on devices that have a touchscreen but no dedicated drum pad. Examples of such devices include tablet computers and smart phones. However, the touchscreens of such devices are typically “on/off” and not velocity sensitive like most drum pads in dedicated electronic percussion instruments. This can limit the range of expression of such devices to less than that of acoustic drums or dedicated electronic percussion instruments. Also, touchscreen electronic percussion instruments typically do not take advantage of some of the characteristics of dedicated electronic percussion instruments. Further, the limited user interface space typically available on touchscreen devices limits the number of controls that can be offered concurrently. A need therefore exists for gesture interfaces for percussion instrument functionality implemented on devices having a touchscreen interface, but no drum pad, and a limited touch surface area.
Disclosed are methods, computer program products, and systems for receiving gestures and producing percussion instrument signals. An exemplary method includes receiving a gesture on an area of a touchscreen representing a percussion instrument, and generating a signal based on the gesture and the area. The gesture can be a single-point touch initiated on a first area, followed by a drag into one or more subsequent areas; the generated signal is then the signal associated with the first area, plus the signal associated with each subsequent area upon the touch entering that area. The gesture can also be a two-point touch initiated on a first area, followed by a change in the distance between the two points; the generated signal is then the signal associated with the first area, with a first parameter of the signal changing upon a change in the distance. The first parameter can be repeat rate. Such gestures can further include translation of the touch points as a group along an axis to change a second parameter such as volume, pitch, or reverb.
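The two gesture behaviors summarized above can be sketched as follows. The area layout, coordinate units, and rate scaling are illustrative assumptions, not taken from the disclosure.

```python
import math

# Hypothetical layout: instrument area name -> (x0, y0, x1, y1) in pixels.
AREAS = {
    "snare": (0, 0, 100, 100),
    "tom": (100, 0, 200, 100),
    "cymbal": (200, 0, 300, 100),
}

def area_at(x, y):
    """Return the name of the instrument area containing (x, y), or None."""
    for name, (x0, y0, x1, y1) in AREAS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def drag_signals(points):
    """Single-point drag: emit the signal of the starting area, then the
    signal of each new area as the touch crosses into it."""
    signals, current = [], None
    for x, y in points:
        area = area_at(x, y)
        if area is not None and area != current:
            signals.append(area)
            current = area
    return signals

def repeat_rate(d_initial, d_now, base_rate=4.0):
    """Two-point gesture: scale a base repeat rate (hits per second) by the
    change in distance between the two touch points."""
    return base_rate * (d_now / d_initial)
```

For example, a drag from the snare area through the tom area into the cymbal area would emit the snare, tom, and cymbal signals in turn, while spreading two fingers to double their separation would double the repeat rate.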
Reference will now be made in detail to implementations of the technology. Each example is provided by way of explanation of the technology only, not as a limitation of the technology. It will be apparent to those skilled in the art that various modifications and variations can be made in the present technology without departing from the scope or spirit of the technology. For instance, features described as part of one implementation can be used on another implementation to yield a still further implementation. Thus, it is intended that the present technology cover such modifications and variations that come within the scope of the technology.
Touch sensor panel 124 can include a capacitive sensing medium having a plurality of drive lines and a plurality of sense lines, although other sensing media also can be used. Each intersection of drive and sense lines can represent a capacitive sensing node and can be viewed as picture element (pixel) 126, which can be particularly useful when touch sensor panel 124 is viewed as capturing an “image” of touch. In other words, after panel subsystem 106 has determined whether a touch event has been detected at each touch sensor in the touch sensor panel, the pattern of touch sensors in the multi-touch panel at which a touch event occurred can be viewed as an “image” of touch (e.g., a pattern of fingers touching the panel). Each sense line of touch sensor panel 124 can drive sense channel 108 in panel subsystem 106. The touch sensor panel can enable multi-touch gesture detection so that shapes can be generated and modified according to implementations of the technology.
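As a rough sketch, the “image” of touch described above can be modeled as a thresholded grid of per-node readings, one per drive/sense intersection. The reading format and threshold are illustrative assumptions.

```python
def touch_image(readings, threshold):
    """Render per-node capacitance readings (one row per drive line, one
    column per sense line) into a binary "image" of touch: 1 where a touch
    event was detected at that sensing node, 0 elsewhere."""
    return [[1 if value >= threshold else 0 for value in row]
            for row in readings]
```

The pattern of 1s in the resulting grid corresponds to the pattern of fingers touching the panel, which a panel subsystem can then segment into individual contact points.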
Computing system 100 also can include host processor 128 for receiving outputs from panel processor 102 and performing actions based on the outputs that can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device coupled to the host device, answering a telephone call, placing a telephone call, terminating a telephone call, changing the volume or audio settings, storing information related to telephone communications such as addresses, frequently dialed numbers, received calls, missed calls, logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, prompting the generation of a signal corresponding to a sound of a percussion instrument, and/or the like. Host processor 128 also can perform additional functions that may not be related to panel processing, and can be coupled to program storage 132 and to display device 130, such as an LCD display, for providing a UI to a user of the device. Display device 130, when located partially or entirely under touch sensor panel 124, can form a touchscreen together with the panel.
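A minimal sketch of the host-processor dispatch described above, assuming a hypothetical mapping from decoded gesture events to actions. The event names and action strings are invented for illustration and do not come from the disclosure.

```python
# Hypothetical mapping from a decoded gesture event to a host action.
ACTIONS = {
    "tap_snare": lambda: "play snare sample",
    "pinch": lambda: "adjust repeat rate",
    "swipe_volume": lambda: "change volume",
}

def dispatch(event):
    """Host-processor dispatch: look up the action for a decoded gesture
    event and perform it; unrecognized events are ignored."""
    action = ACTIONS.get(event)
    return action() if action else None
```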
Note that one or more of the functions described above can be performed by instructions (e.g., programming, software, firmware) stored in memory (e.g., one of the peripherals 104).
The instructions also can be propagated within any transport medium for use by or in connection with an instruction execution system, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the memory and execute them. In the context of this document, a “transport medium” can be any medium that can communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The transport medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, or infrared wired or wireless propagation medium.
The technology can be implemented in devices having a touchscreen, such as mobile telephones, media players, and personal computers.
A touchscreen is an electronic visual display that can detect the presence and location of a touch (or multiple touches) within the touchscreen area. The term generally refers to touching the display of the device with a finger or hand. Touchscreens also can sense other passive objects, such as a stylus. Touchscreens are common in devices such as all-in-one computers, tablet computers, and smartphones. Touchscreens enable one to interact directly with what is displayed, rather than indirectly with a cursor controlled by a mouse or touchpad, and to do so without an intermediate device that would need to be held in the hand. Such displays can be attached to computers, or to networks as terminals. They also play a prominent role in the design of digital appliances such as personal digital assistants (PDAs), satellite navigation devices, mobile phones, and video games. Multi-touch touch-sensitive panels can detect multiple touches (touch events or contact points) that occur at about the same time (and at different times), and identify and track their locations.
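The multi-touch tracking mentioned above can be sketched as a greedy nearest-neighbour matcher across frames. The frame format, `max_jump` threshold, and ID scheme below are illustrative assumptions, not taken from any particular panel subsystem.

```python
import math

def track_touches(previous, current, max_jump=50.0):
    """Greedy nearest-neighbour touch tracker.

    Match each current touch point to the closest point from the previous
    frame (within `max_jump` pixels), carrying over its ID; unmatched
    points receive new IDs. `previous` maps touch ID -> (x, y); `current`
    is a list of (x, y). Returns a new ID -> (x, y) mapping.
    """
    assigned, free = {}, dict(previous)
    next_id = max(previous, default=-1) + 1
    for point in current:
        best, best_d = None, max_jump
        for tid, prev_point in free.items():
            d = math.dist(point, prev_point)
            if d <= best_d:
                best, best_d = tid, d
        if best is not None:
            assigned[best] = point  # same finger, moved slightly
            del free[best]
        else:
            assigned[next_id] = point  # a newly arrived touch
            next_id += 1
    return assigned
```

Calling this once per scan of the panel keeps a stable identity for each finger as it moves, which is what lets a gesture recognizer tell a two-finger pinch apart from two unrelated taps.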
In some implementations, the technology comprises one or more of computer-implemented methods, computer program products, and systems for receiving gestures and producing percussion instrument signals. The gestures can be one or more of single touch and multi-touch. The signals can be acoustic, such as where a device of the technology includes a speaker. The signals can be electric, such as where a device of the technology can be connected, directly or indirectly, to a speaker, and/or where a device of the technology records data related to the signal for playing at a later time.
Gesture-Enhanced Single-Touch, also known as “Dual Control”, “Gesture Touch”, and often “Dual-Touch”, describes the ability of a touchscreen to register certain two-finger gestures, even though the display hardware does not have full multi-touch capabilities. A very common application is the pinch-to-zoom gesture, which allows the user to zoom in or out by moving two fingers farther apart or closer together while touching the display.
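The pinch-to-zoom gesture can be expressed as a ratio of touch-point distances; a minimal sketch, with coordinates assumed to be in screen pixels:

```python
import math

def pinch_zoom(p1_start, p2_start, p1_now, p2_now):
    """Pinch-to-zoom: the zoom factor is the ratio of the current distance
    between the two touch points to their starting distance (greater than
    1 zooms in, less than 1 zooms out)."""
    return math.dist(p1_now, p2_now) / math.dist(p1_start, p2_start)
```

In the percussion context of this disclosure, the same distance change drives a signal parameter such as repeat rate rather than a zoom level.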
In some implementations, the technology accepts a group rotation (e.g., rotation of an axis formed by the two touch points) to vary a fourth parameter of the signal produced by the touch. In some implementations, once the two-point touch is initiated, the touch can be translated and rotated as a group outside the area of the initial touch and still maintain the signal produced by the initial touch.
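The group rotation described above (rotation of the axis formed by the two touch points) can be computed from the change in the angle of that axis; a minimal sketch, with coordinates assumed to be in screen pixels:

```python
import math

def group_rotation(p1_start, p2_start, p1_now, p2_now):
    """Rotation, in degrees, of the axis formed by two touch points,
    relative to its starting orientation. The result can be used to vary
    a parameter of the signal produced by the touch."""
    angle_start = math.atan2(p2_start[1] - p1_start[1],
                             p2_start[0] - p1_start[0])
    angle_now = math.atan2(p2_now[1] - p1_now[1],
                           p2_now[0] - p1_now[0])
    return math.degrees(angle_now - angle_start)
```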
In each implementation, gestures can be combined. For example, a gesture initiated on the side stick hit area 610 of the snare can be combined with the drag and two-touch gestures described above.
Number | Name | Date | Kind |
---|---|---|---|
4526078 | Chadabe | Jul 1985 | A |
4716804 | Chadabe | Jan 1988 | A |
6201173 | Black | Mar 2001 | B1 |
6245984 | Aoki et al. | Jun 2001 | B1 |
6390923 | Yoshitomi et al. | May 2002 | B1 |
6607436 | Ueshima et al. | Aug 2003 | B1 |
7060887 | Pangrle | Jun 2006 | B2 |
7088343 | Smith et al. | Aug 2006 | B2 |
7271328 | Pangrle | Sep 2007 | B2 |
7682237 | Ueshima et al. | Mar 2010 | B2 |
7842877 | Charles | Nov 2010 | B2 |
7960639 | Mizuhiki et al. | Jun 2011 | B2 |
8030567 | Ludwig | Oct 2011 | B2 |
8093486 | Behringer et al. | Jan 2012 | B2 |
8163992 | Charles | Apr 2012 | B2 |
8173884 | Gatzsche et al. | May 2012 | B2 |
8207435 | Charles | Jun 2012 | B2 |
8367922 | Jung et al. | Feb 2013 | B2 |
20040200338 | Pangrle | Oct 2004 | A1 |
20050096132 | Ueshima et al. | May 2005 | A1 |
20060034043 | Hisano et al. | Feb 2006 | A1 |
20060174756 | Pangrle | Aug 2006 | A1 |
20070221046 | Ozaki et al. | Sep 2007 | A1 |
20070229477 | Ludwig | Oct 2007 | A1 |
20070252327 | Ueshima et al. | Nov 2007 | A1 |
20080110323 | Bergfeld et al. | May 2008 | A1 |
20090091543 | Camp et al. | Apr 2009 | A1 |
20090174677 | Gehani et al. | Jul 2009 | A1 |
20090239517 | Ota | Sep 2009 | A1 |
20100287471 | Nam et al. | Nov 2010 | A1 |
20100288108 | Jung et al. | Nov 2010 | A1 |
20100319519 | Takehisa et al. | Dec 2010 | A1 |
20110100198 | Gatzsche et al. | May 2011 | A1 |
20110134061 | Lim | Jun 2011 | A1 |
20110316793 | Fushiki | Dec 2011 | A1 |
20120011989 | Takahashi | Jan 2012 | A1 |
20120139861 | Jung et al. | Jun 2012 | A1 |
20120223903 | Ludwig | Sep 2012 | A1 |
20120235940 | Ludwig | Sep 2012 | A1 |
20120280928 | Ludwig | Nov 2012 | A1 |
20130118337 | Behringer et al. | May 2013 | A1 |
20130118338 | Wallace et al. | May 2013 | A1 |
20130152768 | Rapp | Jun 2013 | A1 |
Number | Date | Country | |
---|---|---|
20120223891 A1 | Sep 2012 | US |