Electronic musical performance controller based on vector length and orientation

Information

  • Patent Grant
  • Patent Number
    10,152,958
  • Date Filed
    Thursday, April 5, 2018
  • Date Issued
    Tuesday, December 11, 2018
  • Inventors
    • Sheely; Martin J
  • Examiners
    • Warren; David
Abstract
An electronic musical performance controller comprising a microprocessor, proximity sensor, gyroscope, accelerometer, narrow beam guide light, and one or more finger monitoring sensors. The proximity sensor is mounted on the front of the controller and represents the origin of a Cartesian coordinate system. Preprogrammed events are mapped into the surrounding space at fixed distances and pitch and yaw angles from the proximity sensor. The guide light beam illuminates the proximity sensor's field of view. The controller is held in one hand and the guide light beam is aimed at the other hand. When the player's finger triggers a finger monitoring sensor, the length of the guide light beam and the pitch and yaw of the proximity sensor are measured. This information is used to determine which mapped event the player is selecting. The preprogrammed event is then output via a MIDI bus or built-in sound module and speaker.
Description
FIELD

The subject matter herein generally relates to electronic musical instrument technology, and particularly to an electronic musical performance device comprising sensor and microcontroller technology.


BACKGROUND

Musical instruments and media controllers utilizing sensor technology and microelectronics continue to evolve. One category of device uses this technology to emulate previously existing acoustic musical instruments, for example drums, flutes, and harps. Another category creates performance spaces in which sensors, embedded in the floor, suspended overhead, or mounted on surrounding stands, monitor the movement of the performer and translate this movement into sound. More recently, sensor technology has been integrated into clothing, where the gestures and motion of the wearer trigger sound events.


The devices that have moved beyond replicas of traditional acoustic instruments suffer from various drawbacks. Performance space systems are inherently large and difficult to set up, making their adoption problematic. Clothing-integrated technology, while portable, is cumbersome to wear and prone to wiring problems. In addition, the gesture, motion, and break-beam based systems that are available do not allow rapid and accurate note selection, limiting their playability. Accordingly, there is a need in the field for an improved electronic musical instrument that overcomes these limitations.


SUMMARY OF THE INVENTION

The invention described in this document is an electronic musical performance controller comprising: a proximity sensor responsive to change in distance between a selectively positionable member and the proximity sensor; at least one finger monitoring sensor responsive to movement of an operator's finger; at least one angle sensor responsive to change in angle of the proximity sensor around an axis; and a microcontroller configured to output a data packet when triggered by the finger monitoring sensor, wherein the output data packet varies in response to at least one of change in distance between the selectively positionable member and the proximity sensor and change in angle of the proximity sensor around an axis.


Having the triggering finger monitoring sensor separate from the proximity sensor achieves a technical advantage over systems that are triggered by approaching the proximity sensor or breaking a beam, in that selections can be made much more rapidly and accurately. The addition of a plurality of finger monitoring sensors and a plurality of angle sensors allows many sets of different data packets from the same proximity sensor, greatly expanding the number of selections available without increasing the size of the device.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a side view of an embodiment of the instrument body;



FIG. 2 shows a view of an embodiment of the base station receiver;



FIG. 3 is a block diagram showing the electronics inside the embodiment of the instrument body shown in FIG. 1;



FIG. 4 is a block diagram showing the electronics inside the embodiment of the base station receiver in FIG. 2;



FIG. 5 shows a view of the instrument body in relation to the Cartesian coordinate system;



FIG. 6 shows selection group one mapped in the (−x, ±z) plane;



FIG. 7 shows selection group two mapped in the (+y, ±z) plane;



FIG. 8 shows selection group three mapped in the (+x, ±z) plane;



FIG. 9 shows selection group four mapped in the (−y, ±z) plane;



FIG. 10 shows a top view of the four selection groups in 3d space;



FIG. 11 is a top view of the instrument being played;



FIG. 12 is a front view of the instrument being played;



FIG. 13 is a side view of an embodiment of the instrument body;



FIG. 14 is a block diagram showing the electronics inside the embodiment of the instrument body shown in FIG. 13;



FIG. 15 is a side view of an embodiment of the instrument body;



FIG. 16 is a block diagram showing the electronics inside the embodiment of the instrument body shown in FIG. 15;





DETAILED DESCRIPTION OF THE INVENTION

It is to be understood that the specific devices and processes illustrated in the attached drawings, and described in the following specification are exemplary embodiments of the inventive concepts defined in the appended claims. Hence, specific dimensions and other physical characteristics relating to the embodiments disclosed herein are not to be considered as limiting, unless the claims expressly state otherwise.


One embodiment of the device comprises a wireless hand held sensor unit, shown in FIG. 1, and a base station, shown in FIG. 2.


In FIG. 1 a hemispherical body 101, two infrared reflective optical finger monitoring sensors 102 and 103, an ultrasonic proximity sensor 104, and a narrow beam guide LED 105 are shown. The proximity sensor 104 is mounted on the flat side of the body 101, projecting perpendicularly from the flat side out into space. The guide LED 105 is positioned to illuminate the center of the proximity sensor's field of view. The two finger monitoring sensors 102, 103 (upper and lower, respectively) are mounted in holes positioned so that when the hemispherical body 101 is held in the hand, the holes are under the tips of the index and middle fingers. FIG. 2 shows the base station with a slot for a memory card 201 and a MIDI (musical instrument digital interface) out jack 202.



FIG. 3 shows a block diagram of the electronics enclosed in the hemispherical body 101 of FIG. 1. A microcontroller 301 is connected to an inertial measurement unit 302, containing a gyroscope 303 and an accelerometer 304, and to a wireless transceiver 305. The microcontroller 301 is also connected to the proximity sensor 104, the two finger monitoring sensors 102, 103, and the guide LED 105. The electronics are battery powered (battery not shown).



FIG. 4 shows a block diagram of the electronics enclosed in the base station of FIG. 2. A microcontroller 401 is connected to a wireless transceiver 402 and a memory card socket 403. The UART (universal asynchronous receiver/transmitter) of microcontroller 401 is connected to the MIDI out jack 202. The display, user interface, and power supply are not shown.


The proximity sensor 104 in FIG. 5 lies at the origin (x0, y0, z0) of a Cartesian coordinate system. A dashed line represents the center of the proximity sensor's field of view and is illuminated by the guide LED 105. The aircraft principal axes, yaw, pitch, and roll, are also shown; the field of view of the proximity sensor 104 corresponds to the aircraft nose, with its initial orientation along the −x axis.


As shown in FIG. 6, FIG. 7, FIG. 8, FIG. 9 and FIG. 10, groups of eight selections are mapped in the proximity sensor's field of view at incremental distances from the proximity sensor 104. Twelve such groups are mapped at the pitch and yaw angles shown relative to the proximity sensor 104. The resulting 96 selections are numbered as shown.


The proximity sensor 104 is pitched up 45°, held level, or pitched down 45° to select from each group of selections. The upper finger monitoring sensor 102 and the lower finger monitoring sensor 103 correspond to the odd numbered and even numbered selections respectively. The operator can also rotate the proximity sensor at 90°, 180°, and 270° yaw intervals to change selection groups.
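As a rough illustration of how the three measurements and the triggering sensor can resolve to one of the 96 selections, a minimal sketch in C follows. The group ordering, bin thresholds, and 15 cm distance increment are assumptions for illustration; the patent figures define the actual numbering.

```c
/* Minimal sketch of the selection calculation described above.
 * Bin edges, group ordering, and numbering are assumptions. */
#include <stdio.h>

/* Quantize yaw (degrees, assumed normalized to 0-360) into one of
 * four 90-degree selection group directions. */
static int yaw_group(float yaw_deg) {
    return ((int)((yaw_deg + 45.0f) / 90.0f)) % 4;
}

/* Quantize pitch into pitched up (+45), level, or pitched down (-45). */
static int pitch_level(float pitch_deg) {
    if (pitch_deg > 22.5f)  return 0;  /* pitched up   */
    if (pitch_deg < -22.5f) return 2;  /* pitched down */
    return 1;                          /* level        */
}

/* Quantize measured distance (cm) into four incremental bins. */
static int distance_bin(float cm) {
    int bin = (int)(cm / 15.0f);       /* assumed 15 cm per increment */
    return bin > 3 ? 3 : bin;
}

/* Combine the measurements and the triggering finger sensor
 * (1 = upper/odd selections, 0 = lower/even) into a number 1..96. */
int selection_number(float yaw_deg, float pitch_deg, float cm, int upper) {
    int group = yaw_group(yaw_deg) * 3 + pitch_level(pitch_deg); /* 0..11 */
    return group * 8 + distance_bin(cm) * 2 + (upper ? 1 : 2);
}

int main(void) {
    /* Level, facing forward, hand ~20 cm away, upper sensor triggered. */
    printf("selection %d\n", selection_number(0.0f, 0.0f, 20.0f, 1));
    return 0;
}
```

With this assumed layout, a level orientation at 0° yaw with the free hand about 20 cm away and the upper sensor triggered resolves to selection 11.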


Data packets are programmed using computer software (not shown) and saved to a file on a memory card. The data packets contained in this file are read via the memory card socket 403 in FIG. 4 into a memory of the microcontroller 401. Each data packet in the memory contains the MIDI messages for one of the 96 selections that are mapped in the space surrounding the proximity sensor.
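The patent does not specify the file format; one possible in-memory layout for the per-selection MIDI data is sketched below, where the structure names and the per-selection message budget are assumptions.

```c
/* A possible in-memory layout for the data packet file; the patent
 * does not define the format, so all of this is illustrative. */
#include <stdint.h>

#define NUM_SELECTIONS 96
#define MAX_MIDI_MSGS   4   /* assumed per-selection message budget */

typedef struct {
    uint8_t length;                  /* bytes used in data[]          */
    uint8_t data[3 * MAX_MIDI_MSGS]; /* raw MIDI messages, 3 bytes ea */
} midi_packet_t;

typedef struct {
    midi_packet_t pressed[NUM_SELECTIONS];  /* sent on trigger */
    midi_packet_t released[NUM_SELECTIONS]; /* sent on release */
} packet_table_t;
```

For example, a note-on message (0x90, note, velocity) in a pressed packet and the matching note-off in the corresponding released packet would reproduce the basic play/release behavior described below.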


The device is held in one hand and the guide LED 105 is aimed at the free hand 901 (the selectively positionable member) as shown in FIG. 11 and FIG. 12. When the operator's finger triggers either the upper 102 or lower 103 finger monitoring sensor, an interrupt service routine (ISR) is initiated in the microcontroller 301 (see FIG. 3). The microcontroller 301 then uses the proximity sensor 104 to measure the distance between the proximity sensor 104 and the free hand 901. The inertial measurement unit 302 is used to measure the pitch and yaw of the proximity sensor 104. Using the pitch, yaw, and distance data, the microcontroller 301 calculates which selection the operator is choosing and transmits a data packet including the selection number via the wireless transceiver 305 to the wireless transceiver 402 of the base station of FIG. 4. The base station microcontroller 401 then sends the corresponding data packet of MIDI messages from memory out its UART onto the MIDI bus via the MIDI out jack 202, which is connected to a standard MIDI sound synthesizer/sampler voice module.
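A minimal sketch of this trigger path on the sensor-unit side, assuming hypothetical hardware wrappers (read_distance_cm, imu_pitch_deg, imu_yaw_deg, radio_send) and the selection_number mapping sketched earlier; the two-byte packet format is likewise an assumption:

```c
/* Sketch of the trigger flow; none of the wrapper names below come
 * from the patent, they stand in for the actual device drivers. */
#include <stdint.h>

extern float read_distance_cm(void);  /* ultrasonic proximity sensor 104 */
extern float imu_pitch_deg(void);     /* inertial measurement unit 302   */
extern float imu_yaw_deg(void);
extern void  radio_send(const uint8_t *buf, int len); /* transceiver 305 */
extern int   selection_number(float yaw, float pitch, float cm, int upper);

/* ISR entered when a finger monitoring sensor fires;
 * 'upper' is 1 for sensor 102, 0 for sensor 103. */
void finger_trigger_isr(int upper) {
    float cm    = read_distance_cm();
    float pitch = imu_pitch_deg();
    float yaw   = imu_yaw_deg();

    uint8_t pkt[2] = {
        0x01,  /* assumed "selection pressed" opcode */
        (uint8_t)selection_number(yaw, pitch, cm, upper)
    };
    radio_send(pkt, sizeof pkt); /* base station looks up the MIDI data */
}
```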


When the operator's finger disengages either the upper 102 or lower 103 finger monitoring sensor, the microcontroller 301 outputs a selection released data packet, which is sent via the wireless transceiver 305 to the wireless transceiver 402 of the base station of FIG. 4. The base station microcontroller 401 then sends the corresponding data packet of MIDI messages from memory out its UART onto the MIDI bus via the MIDI out jack 202.


Rotating the proximity sensor 104 around the x axis changes the roll angle (see FIG. 5), in response to which the microcontroller 301 outputs data packets related to effects such as musical pitch bend.
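A MIDI pitch bend message is a three-byte message carrying a 14-bit value centered at 8192. A minimal sketch of the roll-to-bend mapping, assuming a ±45° full-scale roll range (the patent does not give one):

```c
/* Map a roll angle to a MIDI pitch bend message; the +/-45 degree
 * full-scale range is an assumption for illustration. */
#include <stdint.h>

void roll_to_pitch_bend(float roll_deg, uint8_t channel, uint8_t out[3]) {
    if (roll_deg >  45.0f) roll_deg =  45.0f;
    if (roll_deg < -45.0f) roll_deg = -45.0f;
    int bend = 8192 + (int)(roll_deg / 45.0f * 8192.0f);
    if (bend > 16383) bend = 16383;         /* clamp to 14-bit range */
    out[0] = 0xE0 | (channel & 0x0F);       /* pitch bend status byte */
    out[1] = (uint8_t)(bend & 0x7F);        /* 7-bit LSB */
    out[2] = (uint8_t)((bend >> 7) & 0x7F); /* 7-bit MSB */
}
```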


The device can be operated in 3d mode, as described above, or in 2d mode. In a 2d mode where only the pitch angle is used, the operator chooses from 24 selections (three pitch angles times eight selections) positioned in the (−x, ±z) plane (see FIG. 6). In a 2d mode where only the yaw angle is used, the operator chooses from 32 selections (four yaw angles times eight selections) positioned in the (±x, ±y) plane. Alternative embodiments can operate in 2d mode exclusively.


In another embodiment of the device, the MIDI out jack 202 and the memory card slot 201 and socket 403 are incorporated directly into the body 101 (see FIG. 13 and FIG. 14). Data packets are read via the memory card socket 403 into a memory of the microcontroller 301. Each data packet in the memory contains the MIDI messages for one of the 96 selections that are mapped in the space surrounding the proximity sensor, as described above. The electronics are battery powered (battery not shown).


The device is held in one hand and the guide LED 105 is aimed at the free hand 901 (the selectively positionable member) as shown in FIG. 11 and FIG. 12. When the operator's finger triggers either the upper 102 or lower 103 finger monitoring sensor, an interrupt service routine is initiated in the microcontroller 301 (see FIG. 14). The microcontroller 301 then uses the proximity sensor 104 to measure the distance between the proximity sensor 104 and the free hand 901. The inertial measurement unit 302 is used to measure the pitch and yaw of the proximity sensor 104. Using the pitch, yaw, and distance data, the microcontroller 301 calculates which selection the operator is choosing and then sends the corresponding data packet of MIDI messages from memory out its UART onto the MIDI bus via the MIDI out jack 202, which is connected to a standard MIDI sound synthesizer/sampler voice module.
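In this embodiment the looked-up MIDI bytes go straight out the UART onto the MIDI bus, which runs at 31250 baud. A minimal sketch, assuming a hypothetical uart_write_byte() wrapper:

```c
/* Sketch of the standalone embodiment's output path; uart_write_byte()
 * stands in for the actual UART driver wired to MIDI out jack 202. */
#include <stdint.h>

extern void uart_write_byte(uint8_t b);

void send_midi_packet(const uint8_t *bytes, int len) {
    for (int i = 0; i < len; i++)
        uart_write_byte(bytes[i]);
}

/* Example: note-on, channel 1, middle C, velocity 100. */
void demo(void) {
    const uint8_t note_on[3] = { 0x90, 60, 100 };
    send_midi_packet(note_on, 3);
}
```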


When the operator's finger disengages either the upper 102 or lower 103 finger monitoring sensor in FIG. 14, the microcontroller 301 sends the corresponding selection released data packet of MIDI messages from memory out its UART onto the MIDI bus via the MIDI out jack 202.


In an alternate embodiment, a speaker 902 and a sound synthesis module 903 are incorporated directly into the body 101 (see FIG. 15 and FIG. 16). The electronics are battery powered (battery not shown).


When the operator's finger triggers either the upper 102 or lower 103 finger monitoring sensor, an interrupt service routine is initiated in the microcontroller 301 (see FIG. 16). The microcontroller 301 then uses the proximity sensor 104 to measure the distance between the proximity sensor 104 and the free hand 901, as shown in FIG. 11 and FIG. 12. The inertial measurement unit 302 is used to measure the pitch and yaw of the proximity sensor 104. Using the pitch, yaw, and distance data, the microcontroller 301 calculates which selection the operator is choosing and then sends preprogrammed data to the sound synthesis module 903. The resulting sounds are output through the speaker 902.


When the operator's finger disengages either the upper 102 or lower 103 finger monitoring sensor, the microcontroller 301 outputs a selection released data packet to the sound synthesis module 903.


Alternative types of proximity sensors, angle sensors, and finger monitoring sensors can be substituted in the above embodiments. Additional selections can be mapped in the space surrounding the proximity sensor.

Claims
  • 1. An electronic musical performance controller, comprising: a guide light beam projecting onto a selectively positionable member; and a sensor responsive to change in length of the guide light beam; and an angle sensor responsive to change in angle of the guide light beam around an axis; and a finger monitoring sensor responsive to movement of an operator's finger; and a controller configured to output a data packet when triggered by the finger monitoring sensor, wherein the output data packet varies in response to at least one of change in length of the guide light beam and change in angle of the guide light beam around an axis.
  • 2. The electronic musical performance controller as specified in claim 1 further comprising: a plurality of finger monitoring sensors, wherein each additional finger monitoring sensor corresponds to a different set of data packets.
  • 3. The electronic musical performance controller as specified in claim 1 further comprising: a plurality of angle sensors responsive to angle changes around multiple axes.
  • 4. The electronic musical performance controller as specified in claim 1 further comprising: a hand held component mounting structure.
  • 5. A method of selecting a musical performance data packet, comprising: providing a guide light beam projecting onto a selectively positionable member; and providing a sensor responsive to change in length of the guide light beam; and providing an angle sensor responsive to change in angle of the guide light beam around an axis; and providing a finger monitoring sensor responsive to movement of an operator's finger; and providing a controller configured to output a data packet when triggered by the finger monitoring sensor, wherein the output data packet varies in response to at least one of change in length of the guide light beam and change in angle of the guide light beam around an axis.
  • 6. The method of selecting a musical performance data packet specified in claim 5 further comprising: providing a plurality of finger monitoring sensors, wherein each additional finger monitoring sensor corresponds to a different set of data packets.
  • 7. The method of selecting a musical performance data packet specified in claim 5 further comprising: providing a plurality of angle sensors responsive to angle changes around multiple axes.
  • 8. The method of selecting a musical performance data packet specified in claim 5 further comprising: providing a hand held component mounting structure.
US Referenced Citations (35)
Number Name Date Kind
3691675 Rodgers Sep 1972 A
4526078 Chadabe Jul 1985 A
4968877 McAvinney Nov 1990 A
5533949 Hwang Jul 1996 A
5541358 Wheaton et al. Jul 1996 A
5648627 Usa Jul 1997 A
6000991 Truchsess Dec 1999 A
7060885 Ishida Jun 2006 B2
7183477 Nishitani Feb 2007 B2
7474197 Choi et al. Jan 2009 B2
8217253 Beaty Jul 2012 B1
8242344 Moffatt Aug 2012 B2
8362350 Kockovic Jan 2013 B2
8609973 D'Amours Dec 2013 B2
8723012 Mizuta May 2014 B2
8872014 Sandler et al. Oct 2014 B2
9024168 Peterson May 2015 B2
9536507 Zhang Jan 2017 B2
9646588 Bencar et al. May 2017 B1
9812107 Butera Nov 2017 B2
20040046736 Pryor Mar 2004 A1
20060174756 Pangrle Aug 2006 A1
20070021208 Mao Jan 2007 A1
20070119293 Rouvelle May 2007 A1
20090308232 McMillen Dec 2009 A1
20110296975 de Jong Dec 2011 A1
20120056810 Skulina Mar 2012 A1
20120103168 Yamanouchi May 2012 A1
20130118340 D'Amours May 2013 A1
20130138233 Sandler May 2013 A1
20130207890 Young Aug 2013 A1
20140007755 Henriques Jan 2014 A1
20170047055 Monsarrat-Chanon Feb 2017 A1
20170092249 Skulina Mar 2017 A1
20180188850 Heath Jul 2018 A1
Non-Patent Literature Citations (1)
Entry
www.proximitar.com Inventor's website promoting a product based on this patent application. (U.S. Appl. No. 15/945,751).