The features of the system that are believed to be novel are set forth with particularity in the appended claims. The embodiments herein can be understood by reference to the following description, taken in conjunction with the accompanying drawings, in the several figures of which like reference numerals identify like elements, and in which:
While the specification concludes with claims defining the features of the embodiments of the invention that are regarded as novel, it is believed that the method, system, and other embodiments will be better understood from a consideration of the following description in conjunction with the drawing figures, in which like reference numerals are carried forward.
As required, detailed embodiments of the present method and system are disclosed herein. However, it is to be understood that the disclosed embodiments are merely exemplary and can be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the embodiments of the present invention in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting but rather to provide an understandable description of the embodiments herein.
The terms “a” or “an,” as used herein, are defined as one or more than one. The term “plurality,” as used herein, is defined as two or more than two. The term “another,” as used herein, is defined as at least a second or more. The terms “including” and/or “having,” as used herein, are defined as comprising (i.e., open language). The term “coupled,” as used herein, is defined as connected, although not necessarily directly, and not necessarily mechanically. The term “processing” or “processor” can be defined as any number of suitable processors, controllers, units, or the like that are capable of carrying out a pre-programmed or programmed set of instructions. The term “performance” can be defined as a musical performance, a theatrical performance, a concert event, a public event, an exposition, or any other suitable event wherein sound production, video production, and lighting production are part of the event. The term “evaluation” can be defined as a response provided by an audience member that reports, evaluates, or comments on an aspect of a performance, wherein an aspect can include an audio or a visual aspect. The term “sensor” can be defined as a transducer for converting a physical action to an electronic signal. The term “sensory action” can be defined as a physical feedback, a physical response, a physical stimulation, a physical action, or a physical manipulation applied to a device.
The terms “program,” “software application,” and the like as used herein, are defined as a sequence of instructions designed for execution on a computer system. A program, computer program, or software application may include a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a midlet, a servlet, a source code, an object code, a shared library/dynamic load library, and/or other sequence of instructions designed for execution on a computer system. The term “real-time” is defined as occurring during a performance. The term “aspect” is defined as an audio or visual configuration of a performance. The term “evaluation” is defined as a request to change a sensory aspect of a performance. The term “sensory action” is defined as a physical action an audience member applies to a device during a performance. The term “affecting” is defined as applying one or more changes to a sensory aspect of a performance, including aspects such as scents, motion of floors, walls, or objects, and motions and vibrations of objects.
Embodiments of the invention are directed to a system and method for affecting a sensory aspect of a performance. Users in an audience can collectively adjust an audio or visual aspect of a performance, providing audience-wide feedback to produce a desired effect. As an example, audience members can squeeze or depress sensors on a mobile device to adjust an equalization or lighting of a performance to produce a desired effect. The performance may be a live show and the audience may be people attending the show. Alternatively, the performance may be a televised event, and the audience members may comprise attendees of the event as well as people watching the event from a home television or people watching the event on-line.
The sensory actions are collected and evaluated, and an auditory or visual aspect of the performance can be adjusted in accordance with the collective audience feedback. The audience can collectively determine changes to the performance and a vote can be cast for adjusting the aspect. For instance, a plurality of users can squeeze the mobile device at a particular location for affecting the change. As one example, users can squeeze a bottom of the mobile device to adjust bass content, squeeze the middle to adjust mid-range, and squeeze the top to adjust treble. Notably, a sound aspect of the performance can be adjusted in accordance with audio evaluations provided by the users. Furthermore, the adjustment can be continued throughout the performance to adjust a sensory aspect of the performance for providing dynamic effect.
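The squeeze-location scheme above can be illustrated with a short sketch. The zone boundaries and band names here are illustrative assumptions made for explanation; the embodiments do not prescribe any particular mapping or thresholds.

```python
# Hypothetical sketch: map where an audience member squeezes the
# mobile device to the equalization band that squeeze adjusts.
# Zone boundaries (thirds of the device) are assumptions.

def squeeze_to_band(position: float) -> str:
    """Map a normalized squeeze position (0.0 = bottom of the
    device, 1.0 = top) to the audio band it adjusts."""
    if not 0.0 <= position <= 1.0:
        raise ValueError("position must be in [0.0, 1.0]")
    if position < 1 / 3:
        return "bass"       # bottom of the device adjusts bass
    if position < 2 / 3:
        return "mid-range"  # middle adjusts mid-range
    return "treble"         # top adjusts treble

# Example: a squeeze near the bottom of the device
band = squeeze_to_band(0.1)
```

In this sketch each squeeze yields one band name; repeated squeezes throughout the performance could then be collected as the evaluations described above.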
Referring to
The mobile device 160 can also connect to the Internet 120 over a WLAN. Wireless Local Area Networks (WLANs) provide wireless access to the mobile communication environment 100 within a local geographical area. WLANs can also complement loading on a cellular system, so as to increase capacity. WLANs are typically composed of a cluster of Access Points (APs) 140, also known as base stations. The mobile communication device 160 can communicate with other WLAN stations, such as the laptop 170, within the base station area 150. In typical WLAN implementations, the physical layer uses a variety of technologies, such as 802.11b or 802.11g WLAN technologies. The physical layer may use infrared, frequency hopping spread spectrum in the 2.4 GHz band, or direct sequence spread spectrum in the 2.4 GHz band. The mobile device 160 can send and receive data to the server 130 or other remote servers on the mobile communication environment 100.
In one example, the mobile device 160 can send and receive data to and from the laptop 170 or other devices or systems over the WLAN connection or the RF connection. The data can include an evaluation for conveying a user's response to a performance. Briefly, the mobile device 160 can be deployed within the communication environment 100 to affect a performance. In particular, the mobile device 160 can include at least one sensor 162 for receiving a response from a user. The user can press the sensor 162 to adjust a sensory aspect of a performance, such as an audio or visual aspect of the performance. For example, during the performance the user can squeeze the mobile device to change an audio equalization or lighting effect of the performance.
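The evaluation data a device like mobile device 160 might send to a server such as server 130 can be sketched as a simple serialized message. The field names and the JSON encoding are assumptions made for illustration; the embodiments do not define a message format.

```python
# Hypothetical evaluation message from a mobile device to a server.
# Field names ("device", "aspect", "value") are assumptions.
import json

def make_evaluation(device_id: str, aspect: str, value: float) -> bytes:
    """Serialize one audience evaluation as a JSON payload
    suitable for transmission over a WLAN or RF connection."""
    message = {"device": device_id, "aspect": aspect, "value": value}
    return json.dumps(message).encode("utf-8")

# Example: a user squeezes the device to request more bass
payload = make_evaluation("device-160", "bass", 0.8)
decoded = json.loads(payload)
```

A receiving server could decode each payload and pool it with evaluations from other audience members, as described below for the media console.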
The mobile device 160 can be a cell-phone, a personal digital assistant, a portable music player, or any other suitable communication device. The mobile device 160 and the laptop 170 can be equipped with a transmitter and receiver (not shown) for communicating with the AP 140 according to the appropriate wireless communication standard. In one embodiment of the present invention, the wireless station 160 is equipped with an IEEE 802.11 compliant wireless medium access control (MAC) chipset for communicating with the AP 140. IEEE 802.11 specifies a wireless local area network (WLAN) standard developed by an Institute of Electrical and Electronics Engineers (IEEE) committee. The standard does not generally specify technology or implementation but provides specifications for the physical (PHY) layer and Media Access Control (MAC) layer. The standard allows manufacturers of WLAN radio equipment to build interoperable network equipment.
Referring to
In another arrangement, the nodes 102-n within the mobile communication environment can communicate via Bluetooth within the ad-hoc network 100. Bluetooth is suitable for short-distance communication and is an industrial specification for wireless personal area networks (PANs), also known as IEEE 802.15.1. Bluetooth provides a way to connect and exchange information between devices like personal digital assistants (PDAs), mobile phones, laptops, PCs, printers, digital cameras and video game consoles. For example, nodes 102-n can communicate amongst one another using Bluetooth communications. In particular, the nodes 102-n can share information through messages concerning aspects of a performance. For example, nodes can share audio or visual information related to one or more aspects of a performance.
Referring to
The performance 210 can be a musical performance, a theatrical performance, a concert event, a public event, an exposition, or any other suitable event wherein sound production, video production, and lighting production are part of the event. The performance 210 may also be a televised event that is broadcast to numerous households, institutions, places of meeting, or facilities. The event can also be delivered via satellite, terrestrial wireless systems, AM/FM radio, and the like. The performance 210 may also be broadcast over the internet to home entertainment systems, personal computers, or portable media players. The audience 240 includes those individuals attending or watching the performance. The audience 240 may be physically present with the performance 210 or watching the performance 210 on-line, at home, or over a portable media player.
As an example, a musical performance may include performers such as musicians that play musical instruments and sing. The musicians may play one or more electronic or acoustic instruments that can be amplified to produce sound for the audience 240. The sound may also be broadcast or televised to a home audience or on-line audience. Furthermore, the musicians can sing into one or more microphones where their voice can be amplified and presented to the audience 240. In one arrangement, the performance can take place on a stage that may include a sound production system 220 for conveying sound to a general audience. For example, the sound production system 220 may include one or more amplifiers for amplifying the performers' instruments or voices, and one or more speakers, horns, tweeters, or other high power transducers for converting electrical signals to acoustic signals. The sound production system 220 can also include multiple processors for adjusting one or more audio aspects of the sound produced. For example, the processors may include effects processors for adding reverb, delay, phase, flange, or other effects known in the musical industry. Moreover, the processors may include graphic equalizers, filters, or other sound processing devices for enhancing the sound quality or adjusting the sound. For example, a bass, mid-range, and treble of the sound can be controlled by the sound production system 220.
The performance 210 may include visual effects such as lighting to enhance a visual experience of the event. For example, a lighting system 230 may include one or more lighting elements to adjust a lighting of the performance. The lighting elements may change an intensity, a color, or a pattern of the visual effects in conjunction with the performance 210. The lighting system 230 may include high power lights, lasers, holograms, or other suitable lighting components. The lighting effects may also include fog machines or pyrotechnics to add visual effects to the lighting. For example, a fog machine can introduce fog during critical moments during the performance for adding excitement or illusion. The pyrotechnics can include fireworks or other suitable components for enhancing the visual experience of the performance. For example, high voltage sparklers or electronic flames can be used to generate visual cues in conjunction with the performance.
The performance 210 may also be a theatrical performance wherein certain behaviors or physical actions of the performers are captured and presented in visual form to the audience. A video production system 235 may be employed that captures one or more images or videos of the performance and presents the performance on one or more video screens. The video production system 235 may include one or more video cameras (not shown) for capturing footage of the event. The footage can be played on the video screens during the performance for allowing audience members to see the show. For example, during a solo performance, a zoom in video of the performer can be presented to the video screen for viewing by the audience.
The sound production system 220, the lighting system 230, and the video production system 235 can be controlled by a central media console 250. The media console 250 can communicate with the various production systems to coordinate visual and audio effects with the performance 210. The media console 250 may be a computer, a server, or any other suitable electronic system capable of coordinating performance activities. Briefly referring back to
Referring to
Briefly, referring to
Furthermore, if so desired, a graphical user interface (GUI) resident on the mobile device 160 may also be referred to as a sensor and can be used for affecting the musical performance. The GUI may be controlled by one or more touchpads or keypads on the mobile device 160. The user can select a GUI corresponding to a sensory aspect of the performance. For example, an audio GUI can be deployed that presents a graphic equalizer for adjusting an equalization of the sound. A lighting GUI can be deployed that presents a visual aspect of the performance including colors, intensity, and hues. A video GUI can be deployed that presents visual perspectives or camera angles of the performance. An audience member can interact with the GUI to adjust the performance in a manner similar to the sensors.
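The state behind an audio GUI such as the graphic equalizer described above can be sketched minimally. The band names and the gain range are illustrative assumptions, not a prescribed design.

```python
# A minimal sketch of the state behind a graphic-equalizer GUI
# on the mobile device. Band names and the +/-12 dB range are
# assumptions chosen for illustration.

class EqualizerGUI:
    """Holds the per-band gain settings an audience member adjusts."""

    def __init__(self, bands=("bass", "mid-range", "treble")):
        self.gains = {band: 0.0 for band in bands}  # gain in dB

    def set_gain(self, band: str, gain_db: float) -> None:
        """Set a band's gain, clamped to an assumed +/-12 dB range."""
        if band not in self.gains:
            raise KeyError(f"unknown band: {band}")
        self.gains[band] = max(-12.0, min(12.0, gain_db))

# Example: a user drags the bass slider well past the maximum
gui = EqualizerGUI()
gui.set_gain("bass", 20.0)  # clamped to +12 dB
```

Each slider change could be packaged as an evaluation and transmitted, just as a squeeze of a physical sensor would be.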
Briefly, referring back to
Referring to
At step 301, the method 300 can begin. At step 310, a plurality of evaluations can be received from an audience. For example, referring back to
At step 320, the plurality of evaluations can be associated with the performance. For instance, the evaluations can propose adjustments to an audio or visual sensory aspect of the performance. Referring to
As previously discussed, the embodiments are also directed to adjusting a visual aspect of the performance such as lighting (selecting color and/or intensity), fog, pyrotechnics, and the like. Besides discrete areas, the mobile device 160 can also communicate a specific point along a range if so configured. For example, referring back to
At step 330, an aspect of the performance can be adjusted in accordance with the plurality of evaluations. Notably, the evaluations convey the audience's collective adjustment to an aspect of the performance. The evaluations may specify values associated with one or more parameters such as an audio equalization or lighting intensity. Referring back to
Notably, the media console 250 can assess the evaluations and adjust an aspect of the performance 210 in real-time during the performance. For example, the media console 250 can direct the sound production system 220 to adjust an equalization of the performance 210 in response to a collective response from multiple audience members. That is, the audience can provide collective feedback regarding their assessment of the performance and propose changes to aspects of the performance. The collective feedback can be evaluated in real-time to determine changes to audio or visual effects of the performance 210. Notably, the media console 250 assesses multiple evaluations and changes an aspect in accordance with an audience-wide response. In one regard, the audience members cast a vote through their evaluations. The media console 250 assesses the votes and adjusts an aspect of the performance in accordance with the majority vote.
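The majority-vote tallying described above can be sketched as follows. Representing each evaluation as an (aspect, value) pair and tallying with a counter is an assumption made for illustration; the embodiments do not prescribe a particular aggregation algorithm.

```python
# Hypothetical sketch of how a media console might tally audience
# evaluations as votes and select the majority adjustment.
from collections import Counter

def majority_adjustment(evaluations):
    """Return the (aspect, value) adjustment requested by the most
    audience members, i.e., the majority vote."""
    tally = Counter(evaluations)
    winner, _count = tally.most_common(1)[0]
    return winner

# Example: three audience members vote; bass wins 2-to-1
votes = [("bass", "+3 dB"), ("treble", "-2 dB"), ("bass", "+3 dB")]
chosen = majority_adjustment(votes)
```

Run continuously over a sliding window of recent evaluations, such a tally would let the console track the audience's collective preference throughout the performance.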
Where applicable, the present embodiments of the invention can be realized in hardware, software, or a combination of hardware and software. Any kind of computer system or other apparatus adapted for carrying out the methods described herein is suitable. A typical combination of hardware and software can be a mobile communications device with a computer program that, when being loaded and executed, can control the mobile communications device such that it carries out the methods described herein. Portions of the present method and system may also be embedded in a computer program product, which comprises all the features enabling the implementation of the methods described herein and which, when loaded in a computer system, is able to carry out these methods.
While the preferred embodiments of the invention have been illustrated and described, it will be clear that the embodiments of the invention are not so limited. Numerous modifications, changes, variations, substitutions, and equivalents will occur to those skilled in the art without departing from the spirit and scope of the present embodiments of the invention as defined by the appended claims.