This disclosure relates to the composition and performance of sound and video content in a cyber reality environment.
This disclosure relates to the composition and performance of sensory stimulating content, such as, but not limited to, sound, video, and cyber reality content. More specifically, the disclosure includes a system through which a composer can pre-package certain sensory stimulating content for use by a performer. Another aspect of the disclosure includes an apparatus through which the performer can virtually trigger and control the presentation of the pre-packaged sensory stimulating content. A common theme for both the composer and performer is that the pre-packaged sensory stimulating content is preferably chosen such that, even where the performer is a novice, the sensory stimulating content is presented in a pleasing and sympathetic manner.
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention.
The present disclosure enables a performer to create sympathetic music using a plurality of triggers in a cyber environment, including cyber reality, technology assisted reality, and augmented reality.
Members 200 can be easily attached to base 210 by inserting base 240 of members 200 into an appropriately sized groove in base 210. This allows base 210 to support members 200; places members 200 at a comfortable, consistent angle; and allows members 200 to be electronically connected to base 210 via cables (not illustrated) that plug into ports 230.
Base 210 also preferably includes switches 220 and 225, and a display 215. Switches 220 and 225 can be configured to allow a performer to switch from program to program, or from segment to segment within a program; adjust the intensity with which the content is presented; adjust the tempo or pitch at which content is presented; start or stop recording of a given performance; and other such functions. Display 215 can provide a variety of information, including the program name or number, the segment name or number, the current content presentation intensity, the current content presentation tempo, or the like.
When the embodiment illustrated in
In an alternative embodiment, base 210 and/or members 200 may also contain one or more speakers, video displays, or other content presentation devices, and one or more data storage devices, such that the combination of base 210 and members 200 provide a self-contained content presentation unit. In this embodiment, as the performer activates the triggers, base 210 can cause the content presentation devices to present the appropriate content to the audience. This embodiment can also preferably be configured to detect whether additional and/or alternative content presentation devices are attached thereto, and to trigger those in addition to, or in place of, the content presentation device(s) within the content presentation unit.
According to one embodiment of this disclosure, the light-based triggers are substituted with triggers operable in cyber reality. Cyber reality is defined as the collection of virtual reality, technology-assisted reality, and augmented reality, none of which require the performer to physically touch the trigger in order to activate it.
Virtual and Augmented Reality
Virtual Reality (VR), Mixed Reality, Technology Assisted Reality, and Augmented Reality are collectively referred to by the term Cyber Reality. For simplicity, the collective term will be applied here.
The cyber environment is what the user sees or experiences within the cyber reality display, which often requires the user to wear a special headset viewer.
A Virtual Trigger can be any interactive object or element within the cyber environment that indicates when a user interacts with it. These interactive cyber reality objects/elements are spatial with respect to the overall display and they send standard Gestures or notifications to the Application Engine when the user interacts with them in a prescribed way. For purposes of this disclosure, these Interactive Cyber Reality objects will be referred to as Virtual Triggers.
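The relationship between a Virtual Trigger and the Application Engine described above can be sketched as follows. This is a minimal illustration; the class and method names (`ApplicationEngine`, `VirtualTrigger`, `on_gesture`, `interact`) are assumptions for this sketch, not names from the disclosure.

```python
class ApplicationEngine:
    """Receives standard gesture notifications from virtual triggers."""

    def __init__(self):
        self.events = []

    def on_gesture(self, trigger_id, gesture):
        # Record the notification; a real engine would start or stop
        # content playback in response.
        self.events.append((trigger_id, gesture))


class VirtualTrigger:
    """An interactive object with a spatial position in the cyber environment."""

    def __init__(self, trigger_id, position, engine):
        self.trigger_id = trigger_id
        self.position = position  # (x, y, z) within the display
        self.engine = engine

    def interact(self, gesture):
        # Forward a standard gesture notification to the application engine
        # when the user interacts with the trigger in a prescribed way.
        self.engine.on_gesture(self.trigger_id, gesture)


engine = ApplicationEngine()
drum = VirtualTrigger("drum", (0.0, 1.2, -0.5), engine)
drum.interact("tap")
print(engine.events)  # [('drum', 'tap')]
```

The key design point is that the trigger itself knows nothing about sound or video; it only reports spatial interactions, leaving content decisions to the engine.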
Cyber Reality provides a plurality of possibilities for making the end user's musical experience better and more immersive by creating the perfect environment to fit the overall mood of a song. This not only allows users to listen to the music that they, and others, are creating, but it puts them inside the music they are making. Musically augmented reality environments can bring everyone, musicians and non-musicians alike, closer to this experience than ever before.
Infinite possibilities exist for presenting the virtual triggers in a Cyber Reality environment. Virtual trigger imagery is referred to as the Foreground; it appears on top of, or in front of, the Background, which is everything the user sees behind the Foreground.
Foreground and Background manipulations based on interactively triggered music have broad application: the music being triggered dynamically affects the Cyber Reality environment on a real-time basis.
There are many possibilities for Foreground and Background manipulations based on trigger activity and/or the sound being produced.
Being able to manipulate (create) the background environment is one of the biggest advantages of Cyber Reality. Immersive “Interactive Music Videos” can be created that put the user INSIDE the music, taking their music playing experience to a whole new level. In an example, a cyber reality can be created where the user perceives himself as being up on the stage as part of a performing band, playing along with them on the instrument of his choice, and/or playing with a virtual image of his favorite artist.
Foreground Controller
A virtual hologram version of a laser controller, or other types of controllers, can be positioned in front of the user, where passing a hand through the virtual laser beams (or other controls) triggers the instruments just as it does on the physical unit.
Foreground Interactive Icons
Virtual Triggers are virtual instrument images or other icons that float in front of the user. The virtual triggers are spatial within the cyber reality environment (display), and the user can position them wherever they want. Briefly passing the hand through a trigger plays a one-shot, while holding the hand inside an object pulses it. In some cases a hand-held controller can be used. Visual feedback can be provided for trigger activity: while a trigger is being broken it could glow, or it could emit a single music note icon for a one-shot and a stream of icons to illustrate pulsing.
Strategic Placement in the Cyber Environment
Placing the virtual triggers strategically in the Cyber Reality Environment brings immense benefits to therapy applications by allowing health care practitioners to set up programs designed to focus on specific therapy models.
Cueing
Virtual triggers can be manipulated to stand out so as to suggest when it would be a good time to play that instrument. For kids' songs, the instrument images could rock back and forth to the (real-time) music.
Background environments can be linked to the virtual (music program) triggers so the total environment is affected by what the user plays.
Background Color Organ
Color organs assign a color to a specific frequency range of an audio track, typically 3 or 4 color ranges. Each range illuminates its respective colored light, with brightness linked to the volume level. Depending on which frequency ranges are the loudest at the moment, different variations of colors are produced, and they all pulse with the music they represent. Color organs have been around for a long time, and their behavior can be mimicked by assigning a different color to each of the beam triggers, where brightness is controlled by the audio output level of the trigger being broken. This makes the background environment part of the interactive music experience as well.
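The color-organ mapping described above can be sketched as a small function: each trigger is assigned a fixed color, and that color's brightness is scaled by the audio output level of the trigger being broken. The trigger names and RGB assignments below are illustrative assumptions, not values from the disclosure.

```python
# Assumed color assignments for three beam triggers (illustrative only).
TRIGGER_COLORS = {
    "bass":   (255, 0, 0),   # red
    "chords": (0, 255, 0),   # green
    "lead":   (0, 0, 255),   # blue
}

def background_color(trigger_id, audio_level):
    """Scale the trigger's assigned color by its audio output level.

    audio_level is clamped to 0.0-1.0; at 0.0 the color is dark,
    at 1.0 it is at full brightness, so the background pulses with
    the music the trigger is producing.
    """
    level = max(0.0, min(1.0, audio_level))
    r, g, b = TRIGGER_COLORS[trigger_id]
    return (round(r * level), round(g * level), round(b * level))

print(background_color("bass", 0.5))  # (128, 0, 0)
print(background_color("lead", 0.0))  # (0, 0, 0)
```

With several triggers active at once, the per-trigger colors would be mixed additively to reproduce the classic color-organ effect of blended, pulsing hues.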
There can be any number of virtual triggers 300, and the virtual triggers 300 can be placed anywhere in the cyber environment 312, directly in front of the user, or off to the side, requiring the user to look to the side to see them.
The virtual trigger 300 can be any Cyber Reality object or element that indicates when a user interacts with it. These interactive Cyber Reality objects/elements send standard gestures or notifications to an Application Engine 502, as shown in
Interactive virtual triggers 300 are configured to manipulate the Foreground environment to provide visual feedback on a display when they are triggered, such as, but not limited to, highlighting or altering the trigger imagery in some way, or introducing additional graphic objects into the foreground. Such trigger-specific manipulations cause the cyber reality Foreground to be dynamically linked to the music programs, previously described, that are being interactively triggered.
Background Kaleidoscope
A kaleidoscope is rendered as background—optionally pulsing to the music.
Each virtual (music program) trigger 300 is assigned to alter a component in a formula that produces the kaleidoscopic image, altering the color or the kaleidoscope imagery when it is triggered.
The virtual triggers 300 are configured to manipulate the background cyber environment 302 when they are triggered, such as, but not limited to, modifying the color properties of specific elements in the display or changing the imagery entirely.
On an individual basis, each virtual trigger 300, or the sound produced by the virtual trigger 300, controls or adjusts a unique color component for the display. The brightness of the color could optionally be linked to the volume level of the sound being produced by the virtual trigger 300.
In addition, each virtual trigger 300 can increase or decrease the value of a property used to generate the kaleidoscopic design itself (number of petals, number of orbits, radial suction, and scale factor). The amount of adjustment can be linked to the volume level of the sound being interactively produced by the virtual trigger 300.
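The per-trigger parameter adjustment described above can be sketched as follows. The four property names mirror the ones listed in the text, but the trigger assignments and the volume-scaled adjustment formula are assumptions for illustration.

```python
# Current kaleidoscope generation parameters (starting values assumed).
KALEIDOSCOPE = {
    "petals": 6,
    "orbits": 3,
    "radial_suction": 0.5,
    "scale_factor": 1.0,
}

# Each virtual trigger is assigned one parameter and a direction of
# adjustment (+1 increases it, -1 decreases it). Assignments are illustrative.
TRIGGER_PARAMS = {
    "drum": ("petals", +1),
    "bass": ("scale_factor", -1),
}

def on_trigger(trigger_id, volume):
    """Adjust the trigger's assigned parameter by an amount linked to volume."""
    param, direction = TRIGGER_PARAMS[trigger_id]
    KALEIDOSCOPE[param] += direction * volume
    return KALEIDOSCOPE[param]

on_trigger("drum", 1.0)        # petals: 6 -> 7.0
print(KALEIDOSCOPE["petals"])
```

Because every triggered note nudges a rendering parameter, the kaleidoscopic background evolves continuously with the performance rather than looping independently of it.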
The same concept can be applied to simulated multi-colored laser displays that draw geometric patterns in the cyber reality background, where the color attributes or geometric rendering properties are manipulated interactively by the virtual triggers 300 and/or the sounds that are interactively produced by the virtual triggers 300.
Such trigger-specific manipulations cause the cyber reality background 302 to be dynamically linked to the music programs that are being interactively triggered.
The system 500 can be implemented in electronic device 501, and embodied as one of a computer, a smart phone, a tablet, a touchscreen computer, a cyber reality headset 601, and the like, having a display 505.
Application engine 502 is operable on an electronic processor 503 and receives one or more Gestures from the multiple virtual triggers 300 within the cyber reality environment 312, such as shown in
Application engine 502 controls playback of media files 506 that are combined to form a multilayered media file, based on one or more of the Gesture inputs 508 and the definition file 510, via sound engine 504. The media files 506 can be one or more MIDI files, samples such as .wav and .mp3 files, video files in a plurality of formats, and/or any other audio or video file format.
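The layering step described above can be sketched minimally: a definition file maps each virtual trigger to a media file, and the set of currently gestured triggers selects which files are mixed into the multilayered output. The file names and dictionary structure are assumptions for this sketch.

```python
# Assumed definition-file content: each virtual trigger maps to a media file.
DEFINITION = {
    "drum": "drum_loop.mid",
    "bass": "bass_loop.wav",
    "lead": "lead_sample.mp3",
}

def active_layers(gesture_inputs):
    """Return the media files to layer, per the definition file.

    gesture_inputs is the list of trigger ids currently being
    "touched" in the cyber environment; unknown ids are ignored.
    """
    return [DEFINITION[t] for t in gesture_inputs if t in DEFINITION]

print(active_layers(["drum", "bass"]))  # ['drum_loop.mid', 'bass_loop.wav']
```

A real sound engine would decode and mix these files sample-by-sample; the point here is only that the definition file, not the triggers themselves, decides which content each gesture contributes.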
Gesture inputs 508 include one or more standard gestures that indicate when and how an interactive virtual trigger 300 is being “touched” by the user within the cyber environment 312. Gesture inputs 508 used for triggering may include, but are not limited to, a Tap gesture, and a Tap-and-Hold gesture.
With a Tap gesture, the touch is held at substantially the same point within the virtual trigger 300 for a short period of time, such as a threshold of 0.5 seconds or less. The application engine 502 can use the Tap gesture to trigger a one-shot, play a single note in a streamed sequence, or start and stop a loop.
With a Tap-and-Hold gesture, the touch is held at substantially the same point within the virtual trigger 300 for a longer period of time, such as a threshold of 0.5 seconds or more. Additional thresholds may be used for the Tap-and-Hold gesture, with each threshold associated with a different action to be taken by the application engine 502.
The application engine 502 can use a Tap-and-Hold gesture to Pulse (stream) notes.
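The two gesture inputs above can be distinguished by hold duration alone, using the 0.5-second threshold stated in the text. The function name and return labels below are assumptions for this sketch.

```python
# Threshold separating a Tap from a Tap-and-Hold, per the description above.
TAP_THRESHOLD = 0.5  # seconds

def classify_gesture(hold_seconds):
    """Classify a touch on a virtual trigger by how long it was held.

    Under 0.5 s it is a Tap (one-shot, single streamed note, or loop
    start/stop); at 0.5 s or more it is a Tap-and-Hold (pulse notes).
    """
    if hold_seconds < TAP_THRESHOLD:
        return "tap"
    return "tap_and_hold"

print(classify_gesture(0.2))  # tap
print(classify_gesture(1.0))  # tap_and_hold
```

Additional thresholds, as the text notes, could be added as further `elif` branches mapping longer holds to other engine actions.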
Processor 503 is configured such that visual outputs from application engine 502 are displayed within the cyber reality environment 312 and output from sound engine 504 is played on speaker 512. The combination of application engine 502 and sound engine 504 forms an application on the processor 503. The processor 503 is configured to selectively associate a music program with each of the plurality of virtual triggers. The processor 503 is further configured such that when one of the virtual triggers 300 is in a first state for a prolonged period of time, successive audible musical sounds are generated such that, for instance, the music program associated with the virtual trigger continues to play uninterrupted, along with any other music programs that are playing in response to their associated virtual triggers 300 being triggered.
Display 505 displays the total cyber reality environment 312, which includes Foreground and Background visualizations.
When a virtual trigger 300 is virtually touched or triggered by a user Gesture, trigger-specific visual output from application engine 502 can be shown to simulate triggering a virtual trigger 300 within the cyber reality environment 312.
When a virtual trigger 300 is triggered by a user Gesture, trigger-specific visual output from application engine 502 can be shown to alter the display properties or attributes of any element within the cyber reality environment 312, such as the virtual triggers 300 in the Foreground or what the user sees in the Background behind the virtual triggers 300.
Although applicant has described applicant's preferred embodiments of the present disclosure, it will be understood that the broadest scope of this disclosure includes such modifications as diverse shapes, sizes, materials, and content types. Further, many other advantages of applicant's disclosure will be apparent to those skilled in the art from the above descriptions, including the drawings, specification, and other contents of this patent application and the related patent applications.
Number | Name | Date | Kind |
---|---|---|---|
5513129 | Bolas | Apr 1996 | A |
7915514 | Shrem | Mar 2011 | B1 |
8664508 | Tabata | Mar 2014 | B2 |
9018508 | Sakurai | Apr 2015 | B2 |
9171531 | David | Oct 2015 | B2 |
9208763 | Avitabile | Dec 2015 | B2 |
9542919 | Bencar | Jan 2017 | B1 |
20050223330 | Riopelle | Oct 2005 | A1 |
20050241466 | Riopelle | Nov 2005 | A1 |
20080122786 | Pryor | May 2008 | A1 |
20080223196 | Nakamura | Sep 2008 | A1 |
20080240454 | Henderson | Oct 2008 | A1 |
20090114079 | Egan | May 2009 | A1 |
20090221369 | Riopelle | Sep 2009 | A1 |
20100107855 | Riopelle | May 2010 | A1 |
20110143837 | Riopelle | Jun 2011 | A1 |
20110191674 | Rawley | Aug 2011 | A1 |
20120266741 | Bencar | Oct 2012 | A1 |
20130118339 | Lee | May 2013 | A1 |
20130138233 | Sandler | May 2013 | A1 |
20130291708 | Orshan | Nov 2013 | A1 |
20150046808 | Dejban | Feb 2015 | A1 |
20150243083 | Coggins | Aug 2015 | A1 |
20160104471 | Hyna | Apr 2016 | A1 |
20160179926 | Leppanen | Jun 2016 | A1 |
Number | Date | Country | |
---|---|---|---|
Parent | 15215427 | Jul 2016 | US |
Child | 15402012 | US |