Methods and systems for semiconductor illumination have been described, including those techniques disclosed by Color Kinetics Incorporated of Boston, Mass. in the patent applications incorporated by reference herein. Digital processors enable the creation of illumination effects, such as fades or other transitions between different colors. When more than one lighting system is provided, the lighting systems may be coordinated to achieve both spatial and temporal effects. Thus a color-chasing rainbow may be created using a number of suitably arranged lighting systems under processor control.
However, creating coordinated lighting effects presents many challenges, particularly in how to create complex effects that involve multiple lighting units in unusual geometries. A need exists for improved systems for creating and deploying lighting shows. A need also exists for improved systems for allowing users to create and modify lighting effects in real-time, such as during audio/visual performances that have a lighting component.
Provided herein are methods and systems for managing control instructions for a plurality of light systems. The methods and systems may include providing a light system manager for mapping locations of a plurality of light systems. The methods and systems may include providing a light system composer for composing a lighting show. The methods and systems may include providing a light system engine for playing a lighting show on a plurality of light systems. In embodiments the light system manager may include a mapping facility that maps an incoming entertainment signal, such as an audio signal, to a lighting control signal. In embodiments a control system may be used to modify lighting effects based on a user interface.
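By way of illustration only, the mapping of an incoming entertainment signal (here, an audio signal) to a lighting control signal might be sketched as follows. The function and parameter names are hypothetical; the disclosure does not prescribe any particular computation:

```python
import math

def audio_to_brightness(samples, max_level=1.0):
    """Map an audio frame's RMS amplitude to an 8-bit brightness value.

    `samples` is a sequence of floats in [-1.0, 1.0]; `max_level` is the
    amplitude that should drive the lights to full brightness. Both names
    are illustrative assumptions, not terms from the disclosure.
    """
    if not samples:
        return 0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    level = min(rms / max_level, 1.0)
    return round(level * 255)

# A loud frame drives the lights harder than a quiet one.
quiet = [0.1 * math.sin(i / 5.0) for i in range(256)]
loud = [0.9 * math.sin(i / 5.0) for i in range(256)]
```

A control system as described above could recompute such a value for each audio frame and forward it to the light system engine as part of the lighting control signal.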
In one aspect, a method of illuminating an environment in coordination with a media display includes: providing a lighting control system for controlling a lighting system; providing a user interface for controlling a media display that is distinct from the lighting system; and associating an input of the lighting control system with an output of the user interface.
The method may further include providing an intermediate object representation between a visualization and the lighting control system, so that a mapping between the visualization and the lighting control system can be dynamically modified by a user in real time. The method may further include taking an output from an audio/visual controller with a physical interface and using it to control one or more lights in an entertainment venue.
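The intermediate object representation between a visualization and the lighting control system can be sketched as a routing table that a user may rebind while a show is running. This is a minimal illustration with invented names, not an implementation from the disclosure:

```python
class LightMapping:
    """Intermediate layer between a visualization and the lighting control
    system: visualization pixels feed in, the mapping decides which lighting
    channel each pixel drives, and the mapping can be edited at run time."""

    def __init__(self, pixel_to_channel):
        self.pixel_to_channel = dict(pixel_to_channel)

    def route(self, pixel_values):
        """Translate {pixel: value} into {channel: value} using the
        current mapping; unmapped pixels are ignored."""
        return {self.pixel_to_channel[p]: v
                for p, v in pixel_values.items()
                if p in self.pixel_to_channel}

    def remap(self, pixel, channel):
        # A user gesture can rebind a pixel to a different channel live,
        # without touching the visualization or the lighting controller.
        self.pixel_to_channel[pixel] = channel

mapping = LightMapping({0: "ch1", 1: "ch2"})
frame = {0: 255, 1: 128}
before = mapping.route(frame)
mapping.remap(0, "ch3")          # live edit by the user
after = mapping.route(frame)
```

Because the visualization writes only pixel values and the lighting controller reads only channel values, the mapping object can be modified in real time without interrupting either side.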
The lighting system may include a string of lighting units. The string may be displayed on an area. The area may include a curtain. The area may include a panel-type display. The area may include a tile. The area may include a floor. The area may include a dance floor. The area may include a stage. The area may include a bar. The area may include a wall.
The method may include taking a visualization from a computer display associated with an audio player and displaying lights that correspond to the visualization in a physical entertainment venue. The method may include allowing a user to modify a skin of the audio player through a physical interface. The physical interface may include a touch screen. Touching the screen may change at least one of the brightness and the color of at least a part of the skin. The lighting system in such embodiments may include a string of lighting units displayed on an area, and the area may include one or more of a curtain, a panel-type display, a tile, a floor, a dance floor, a stage, a bar, and/or a wall.
The method may include taking video input for a music video and displaying one or more corresponding lights on a string of lights that use a serial addressing protocol. The method may include allowing a user to modify the video input by interacting with a physical interface. The physical interface may include a jog dial. The jog dial may allow a user to control a rate of the playback of the video input. The physical interface may include a touch screen. Touching the screen may modify a color of at least a part of a video frame. Touching the screen may modify a brightness of at least a part of a video frame. The lighting system in such embodiments may include a string of lighting units displayed on an area, and the area may include one or more of a curtain, a panel-type display, a tile, a floor, a dance floor, a stage, a bar, and/or a wall.
The method may include providing a handheld graphical user interface for modifying a lighting effect associated with an audio/visual display in real time. The graphical user interface may include an icon that represents a dynamic effect.
In another aspect, a method of location-based addressing of lighting units in a lighting network may include: providing an audio/visual control facility; providing a lighting control facility; providing a plurality of lighting units in a lighting network; associating a location-determination facility with a plurality of the lighting units; and mapping the physical locations of the lighting units to addresses for the lighting units in the lighting network. The location-determination facility may include a triangulation facility.
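One way to picture the mapping step is to assign network addresses from the physical locations reported by the location-determination facility, so that spatial effects can index units by place. The ordering rule and names below are illustrative assumptions only:

```python
def assign_addresses(positions):
    """Given {unit_id: (x, y)} physical locations (e.g., as produced by a
    triangulation step), assign network addresses in left-to-right,
    top-to-bottom order. The scan order is an arbitrary choice for this
    sketch; any deterministic spatial ordering would serve."""
    ordered = sorted(positions, key=lambda u: (positions[u][1], positions[u][0]))
    return {unit: addr for addr, unit in enumerate(ordered)}

# Three units at known physical coordinates.
locations = {"a": (2.0, 0.0), "b": (0.0, 0.0), "c": (1.0, 1.0)}
addresses = assign_addresses(locations)
```

With such a map in hand, a control facility can address "the unit nearest the stage-left corner" rather than a raw wire position.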
In another aspect, a system disclosed herein includes: a plurality of light emitting diodes associated with an entertainment event; a network facility interconnecting the plurality of light emitting diodes, wherein the plurality of light emitting diodes respond to control signals carried over the network facility; a control facility for generating control signals; and one or more input facilities connecting input data to the control facility.
In the system, the event may take place in an entertainment venue. The system may include a network of control facilities. The system may include a network of input facilities. The entertainment venue may include at least one of a stadium, an arena, a concert hall, an auditorium, a convention center, a display hall, a nightclub, a discotheque, a live-performance theater, a movie theater, an outdoor theater, a band shell, a recording studio, a film studio, a video studio, a home, a home theater center, a home audio/visual center, a vehicle, an article of clothing, an interior wall, an exterior wall, a sign, a billboard, a tent, and a racetrack. The entertainment event may include one or more of a concert, a play, a movie, a musical, a sporting event, a speech, a rally, and a convention.
The input data may include one or more of aural data, visual data, peripheral data, sensor data, or simulated input data. The aural data may include at least one of duration, periodicity, meter, beat, pitch, amplitude, timbre, harmonic profile, rhyme, spectral profile, mixing data, sequencing data, digital filter coefficients, and transformation data. The visual data may include at least one of color, tone, saturation, depth of field, focus, light, movement, hue, intensity, chromaticity, luminosity, color decomposition, pixel data, visual filter data, visual effect data, and transformation data. The peripheral data may include at least one of genre, popularity, source of origin, creation date, release date, author, and ownership. The simulated input data may be created to simulate output from an audio-visual device.
A source of the input data may include at least one of a live event and prerecorded media. The live event may include a music concert or a recital. The live event may include a dramatic performance. The live event may include a sporting event. The live event may include an ambient sound. The live event may include a natural occurrence. The natural occurrence may include one or more of weather, a natural phenomenon, an erupting volcano, a celestial state, and a celestial event.
The system may further include a live control facility that provides live control of input data. The live control may include a live creation of media. The media may include one or more of a reproduction of a live event, a representation of a live event, and a simulated event. The media may include one or more of a television program, a radio program, a motion picture, a sound recording, a video recording, an image, a video game, a text display, an audio source, a visual source, a mixed audio-visual source, and an algorithm. The media may include a display on one or more of a display device, a television, a computer monitor, a billboard, a sign, a touch screen, a projection monitor, a projection screen, an interactive screen, an interactive display, an interactive monitor, and a display associated with an electronic device. The electronic device may include one or more of a mobile telephone, a wireless electronic mail device, a personal digital assistant, an mp3 player, a CD player, and a radio. The live control may include capturing a sound associated with an event. The sound may be associated with a projected image. The sound may be associated with an audio source. The sound may be associated with a video source.
The system may further include a means for receiving the input data in the input facility. The input data may be received by a microphone connected to the input facility. The input data may be received directly through an audio-visual cable connected to the input facility. The input data may be received wirelessly by the input facility. The input data may be projected on a device connected to the input facility. The input data may be superimposed on a device connected to the input facility. The input data may be received on a device connected to the input facility.
The system may further include a conversion means for the input facility to convert the input data to output signals recognized by the control facility. A location of one or more of the light emitting diodes may be known. A map of the light emitting diodes may be used to represent input data. A map of the light emitting diodes may be used to represent output data.
The input facility may include an interactively controllable coupling between the input data and the control facility. The interactively controllable coupling may be at least one of an audio soundboard, an audio mixing board, an audio editing device, a video editing device, a computer, a personal digital assistant, a housing with a display and switches, a wireless network, an interactive audio/visual device allowing the control of audio and visual output from the device different from the audio and visual input to the device, a mapping device, or a computer-based program allowing interaction with and manipulation of input data. The interactively controllable coupling may be manually controlled. The interactively controllable coupling may be automatically controlled. The interactively controllable coupling may be controlled by a software algorithm. Control of the interactively controllable coupling may be shared by an operator and a software algorithm.
A user interface for control of the interactively controllable coupling may include one or more of a touch screen, a dial, a turntable, a button, a knob, a lever, a trackball, a switch, a haptic interface, a gestural input using proximity sensors, a gestural input using motion sensors, a gestural input using imaging devices, a physiological monitor, a temperature sensor, a decibel meter, a video camera, a microphone, a virtual reality headset, and virtual reality gloves.
The output data may be controlled by one or more of a characteristic, the characteristic corresponding to one or more of the input data and the output data, and a control signal. The characteristic may include beats per minute. The characteristic may include pitch. The characteristic may include spectral content. The characteristic may include descriptive metadata. The descriptive metadata may include one or more of genre, title, author, date, and media type. The characteristic may include a visual criterion selected from the group consisting of color, tone, saturation, depth of field, focus, light, movement, hue, intensity, chromaticity, luminosity, color decomposition, pixel data, visual filter data, visual effect data, and transformation data. The characteristic may include a location of input data on a map of the light emitting diodes.
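As one concrete illustration of a characteristic driving the output data, a beats-per-minute value extracted from the input can set the timing of light pulses. The function below is a sketch with invented names, not a method specified by the disclosure:

```python
def beat_schedule(bpm, duration_s):
    """Return the times (in seconds) at which lights should pulse so that
    flashes land on the beat of a track at `bpm` beats per minute, over a
    span of `duration_s` seconds."""
    if bpm <= 0:
        raise ValueError("bpm must be positive")
    period = 60.0 / bpm        # seconds between beats
    times = []
    t = 0.0
    while t < duration_s:
        times.append(round(t, 6))
        t += period
    return times

# At 120 BPM a pulse lands every half second.
schedule = beat_schedule(120, 2.0)
```

A control facility could consume such a schedule to trigger strobe or color-change events in time with the music.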
The output data may be controlled based upon at least one external criterion not connected to the input data. The at least one external criterion may include a weather criterion. The at least one external criterion may include a temperature. The at least one external criterion may include a mood criterion. The at least one external criterion may include data representative of a density of an audience. The at least one external criterion may include a season criterion. The at least one external criterion may include a time of day. The at least one external criterion may include data representative of previous entertainment. The at least one external criterion may include data representative of one or more characteristics of planned subsequent entertainment. The at least one external criterion may include data representative of an operator of the system. The at least one external criterion may include data representative of physical movements. The physical movements may include movements of one or more of a hand, an arm, a body, and a head. The at least one external criterion may include data representative of a body temperature. The at least one external criterion may include data representative of a heart rate. The at least one external criterion may include data representative of a blood pressure. The output data may be recorded. The output data may be reproduced.
Existing and desired characteristics of the output data can be presented in at least one of a graphical form, a textual form, an automated voice description, or an abstract visual signal. The input data and the output data may be manually coupled. The input data and the output data may be automatically coupled. Existing characteristics of the input data can be presented in at least one of a graphical form, a textual form, an automated voice description, or an abstract visual signal.
The network facility may use a serial addressing protocol.
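One common style of serial addressing, offered here purely as an illustration (the disclosure does not specify the protocol), is a daisy chain in which each unit keeps the first few bytes it sees and retransmits the remainder, so that a unit's position in the chain effectively is its address:

```python
BYTES_PER_UNIT = 3  # one byte each for red, green, blue (an assumption)

def daisy_chain(frame, num_units):
    """Split a serial data stream among chained units: each unit consumes
    the first BYTES_PER_UNIT bytes of the stream it receives and forwards
    the rest to the next unit down the string."""
    out = []
    stream = list(frame)
    for _ in range(num_units):
        out.append(tuple(stream[:BYTES_PER_UNIT]))
        stream = stream[BYTES_PER_UNIT:]
    return out

# Nine bytes addressing three RGB units: red, then green, then blue.
frame = [255, 0, 0, 0, 255, 0, 0, 0, 255]
colors = daisy_chain(frame, 3)
```

A scheme of this kind needs no per-unit address assignment at install time, which suits long strings of lighting units draped over curtains, floors, or walls.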
It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the inventive subject matter disclosed herein.
Definitions used herein are for purposes of illustration and are not intended to be limiting in any way.
As used herein for purposes of the present disclosure, the term “LED” should be understood to include any electroluminescent diode or other type of carrier injection/junction-based system that is capable of generating radiation in response to an electric signal. Thus, the term LED includes, but is not limited to, various semiconductor-based structures that emit light in response to current, light emitting polymers, electroluminescent strips, and the like.
In particular, the term LED refers to light emitting diodes of all types (including semiconductor and organic light emitting diodes) that may be configured to generate radiation in one or more of the infrared spectrum, ultraviolet spectrum, and various portions of the visible spectrum (generally including radiation wavelengths from approximately 400 nanometers to approximately 700 nanometers). Some examples of LEDs include, but are not limited to, various types of infrared LEDs, ultraviolet LEDs, red LEDs, blue LEDs, green LEDs, yellow LEDs, amber LEDs, orange LEDs, and white LEDs (discussed further below). It also should be appreciated that LEDs may be configured to generate radiation having various bandwidths for a given spectrum (e.g., narrow bandwidth, broad bandwidth).
For example, one implementation of an LED configured to generate essentially white light (e.g., a white LED) may include a number of dies which respectively emit different spectra of electroluminescence that, in combination, mix to form essentially white light. In another implementation, a white light LED may be associated with a phosphor material that converts electroluminescence having a first spectrum to a different second spectrum. In one example of this implementation, electroluminescence having a relatively short wavelength and narrow bandwidth spectrum "pumps" the phosphor material, which in turn radiates longer wavelength radiation having a somewhat broader spectrum.

A variety of interfaces, tools, protocols, and the like are disclosed for managing control instructions to light systems. Generally, any arrangement of lighting systems may be mapped into an environment for design and deployment of lighting effects. Various aspects of the environment may include a light system manager for mapping locations of a plurality of light systems, a light system composer for composing lighting shows or effects, and a light system engine for executing lighting shows on various light systems.
As used herein, "Color Kinetics" means Color Kinetics Incorporated, a Delaware corporation with headquarters in Boston, Mass.
As used herein, the term "LED" means any system that is capable of receiving an electrical signal and producing a color of light in response to the signal. The term "LED" should be understood to include light emitting diodes of all types, as well as other types of carrier injection/junction-based systems, or any other semiconductor or organic structures or systems that emit light in response to an electric signal. Thus, LEDs may include light emitting polymers, light emitting strips, semiconductor dies that produce light in response to current, organic LEDs, electro-luminescent strips, and other such systems. Additionally, an "LED" may refer to a single light emitting diode package having multiple semiconductor dies that are individually controlled. It should also be understood that the term "LED" does not restrict the package type of the LED. The term "LED" includes packaged LEDs, non-packaged LEDs, surface mount LEDs, chip-on-board LEDs, and LEDs of all other configurations. It should also be appreciated that LEDs may be configured to generate radiation in one or more of the infrared spectrum, ultraviolet spectrum, and various portions of the visible spectrum (generally including radiation wavelengths from approximately 400 nanometers to approximately 700 nanometers). Some examples of LEDs include, but are not limited to, various types of infrared LEDs, ultraviolet LEDs, red LEDs, blue LEDs, green LEDs, yellow LEDs, amber LEDs, orange LEDs, and white LEDs. It also should be appreciated that LEDs may be configured to generate radiation having various bandwidths or ranges of bandwidths, and may thus be single frequency (or nearly single frequency), narrow band, or broad band sources of light.
It should also be understood that the term LED does not limit the physical and/or electrical package type of an LED. For example, as discussed above, an LED may refer to a single light emitting device having multiple dies that are configured to respectively emit different spectrums of radiation (e.g., that may or may not be individually controllable). Also, an LED may be associated with a phosphor that is considered as an integral part of the LED (e.g., some types of white LEDs). In general, the term LED may refer to packaged LEDs, non-packaged LEDs, surface mount LEDs, chip-on-board LEDs, radial package LEDs, power package LEDs, LEDs including some type of encasement and/or optical element (e.g., a diffusing lens), etc.
An LED system is one type of illumination source. As used herein "illumination source" should be understood to include all illumination sources, including LED systems, as well as incandescent sources, including filament lamps, pyro-luminescent sources, such as flames, candle-luminescent sources, such as gas mantles and carbon arc radiation sources, as well as photo-luminescent sources, including gaseous discharges, fluorescent sources, phosphorescence sources, lasers, electro-luminescent sources, such as electro-luminescent lamps, light emitting diodes, and cathode luminescent sources using electronic excitation, as well as miscellaneous luminescent sources including galvano-luminescent sources, crystallo-luminescent sources, kine-luminescent sources, thermo-luminescent sources, triboluminescent sources, sonoluminescent sources, and radioluminescent sources. Illumination sources may also include luminescent polymers capable of producing primary colors.
The term “illuminate” should be understood to refer to the production of one or more frequencies of radiation by an illumination source. The term “color” should be understood to refer to any frequency or combination of frequencies of radiation within a spectrum, and may include frequencies not only of the visible spectrum, but also frequencies in the infrared and ultraviolet areas of the spectrum, and in other areas of the electromagnetic spectrum.
The term "light source" should be understood to refer to any one or more of a variety of radiation sources, including, but not limited to, LED-based sources as defined above, incandescent sources (e.g., filament lamps, halogen lamps), fluorescent sources, phosphorescent sources, high-intensity discharge sources (e.g., sodium vapor, mercury vapor, and metal halide lamps), lasers, other types of luminescent sources, electro-luminescent sources, pyro-luminescent sources (e.g., flames), candle-luminescent sources (e.g., gas mantles, carbon arc radiation sources), photo-luminescent sources (e.g., gaseous discharge sources), cathode luminescent sources using electronic excitation, galvano-luminescent sources, crystallo-luminescent sources, kine-luminescent sources, thermo-luminescent sources, triboluminescent sources, sonoluminescent sources, radioluminescent sources, and luminescent polymers.
A given light source may be configured to generate electromagnetic radiation within the visible spectrum, outside the visible spectrum, or a combination of both. Hence, the terms “light” and “radiation” are used interchangeably herein. Additionally, a light source may include as an integral component one or more filters (e.g., color filters), lenses, or other optical components. Also, it should be understood that light sources may be configured for a variety of applications, including, but not limited to, indication and/or illumination. An “illumination source” is a light source that is particularly configured to generate radiation having a sufficient intensity to effectively illuminate an interior or exterior space.
The term “spectrum” should be understood to refer to any one or more frequencies (or wavelengths) of radiation produced by one or more light sources. Accordingly, the term “spectrum” refers to frequencies (or wavelengths) not only in the visible range, but also frequencies (or wavelengths) in the infrared, ultraviolet, and other areas of the overall electromagnetic spectrum. Also, a given spectrum may have a relatively narrow bandwidth (essentially few frequency or wavelength components) or a relatively wide bandwidth (several frequency or wavelength components having various relative strengths). It should also be appreciated that a given spectrum may be the result of a mixing of two or more other spectrums (e.g., mixing radiation respectively emitted from multiple light sources).
For purposes of this disclosure, the term "color" is used interchangeably with the term "spectrum." However, the term "color" generally is used to refer primarily to a property of radiation that is perceivable by an observer (although this usage is not intended to limit the scope of this term). Accordingly, the term "different colors" implicitly refers to different spectrums having different wavelength components and/or bandwidths. It also should be appreciated that the term "color" may be used in connection with both white and non-white light.
The term "color temperature" generally is used herein in connection with white light, although this usage is not intended to limit the scope of this term. Color temperature essentially refers to a particular color content or shade (e.g., reddish, bluish) of white light. The color temperature of a given radiation sample conventionally is characterized according to the temperature in degrees Kelvin (K) of a black body radiator that radiates essentially the same spectrum as the radiation sample in question. The color temperature of white light generally falls within a range from approximately 700 degrees K (generally considered the first visible glow to the human eye) to over 10,000 degrees K.
Lower color temperatures generally indicate white light having a more significant red component or a “warmer feel,” while higher color temperatures generally indicate white light having a more significant blue component or a “cooler feel.” By way of example, a wood burning fire has a color temperature of approximately 1,800 degrees K, a conventional incandescent bulb has a color temperature of approximately 2848 degrees K, early morning daylight has a color temperature of approximately 3,000 degrees K, and overcast midday skies have a color temperature of approximately 10,000 degrees K. A color image viewed under white light having a color temperature of approximately 3,000 degree K has a relatively reddish tone, whereas the same color image viewed under white light having a color temperature of approximately 10,000 degrees K has a relatively bluish tone.
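The warm-to-cool relationship described above can be made concrete with a widely reproduced curve fit that approximates the RGB appearance of black-body white light at a given color temperature. The fit is a well-known published approximation, offered only as an illustration; it is not a formula from this disclosure, and the exact coefficients should not be relied upon for colorimetric work:

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate (r, g, b) in 0..255 for white light at the given color
    temperature in degrees K, using a common curve fit to black-body data."""
    t = kelvin / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda x: int(max(0, min(255, round(x))))
    return (clamp(r), clamp(g), clamp(b))

warm = kelvin_to_rgb(1800)    # fire-like: red dominant, little or no blue
cool = kelvin_to_rgb(10000)   # overcast-sky-like: blue dominant
```

A lighting controller could use such an approximation to render a "warmer" or "cooler" white on RGB LED fixtures.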
The terms "lighting unit" and "lighting fixture" are used interchangeably herein to refer to an apparatus including one or more light sources of same or different types. A given lighting unit may have any one of a variety of mounting arrangements for the light source(s), enclosure/housing arrangements and shapes, and/or electrical and mechanical connection configurations. Additionally, a given lighting unit optionally may be associated with (e.g., include, be coupled to and/or packaged together with) various other components (e.g., control circuitry) relating to the operation of the light source(s). An "LED-based lighting unit" refers to a lighting unit that includes one or more LED-based light sources as discussed above, alone or in combination with other non-LED-based light sources.
The terms “processor” or “controller” are used herein interchangeably to describe various apparatus relating to the operation of one or more light sources. A processor or controller can be implemented in numerous ways, such as with dedicated hardware, using one or more microprocessors, microcontrollers, programmable digital signal processors, programmable gate arrays, programmable logic devices or other devices that can be programmed to perform the various functions discussed herein, or as a combination of dedicated hardware to perform some functions and programmed microprocessors and associated circuitry to perform other functions.
In various implementations, a processor may be associated with one or more storage media generically referred to herein as “memory,” e.g., volatile and/or non-volatile computer memory such as RAM, PROM, EPROM, and EEPROM, floppy disks, compact disks, optical disks, magnetic tape, removable or integrated flash memory devices, and so on. In some implementations, the storage media may be encoded with one or more programs that, when executed on one or more processors, perform at least some of the functions discussed herein. Various storage media may be fixed within a processor or controller or may be transportable, such that the one or more programs stored thereon can be loaded into a processor or controller so as to implement various aspects of the systems discussed herein. The terms “program” or “computer program” are used herein in a generic sense to refer to any type of computer code (e.g., high or low level software, microcode, and so on) that can be employed to control operation of a processor.
The term “addressable” as used herein means accessible through an address, or generally configured to receive information (e.g., data) intended for one or more devices, and to selectively respond to particular information therein. Addressable devices may include light sources in general, lighting units or fixtures, processors associated with one or more light sources or lighting units, other non-lighting related devices and so on. The term “addressable” often is used in connection with a networked environment (or a “network,” discussed further below), in which multiple devices are coupled in a communicating relationship.
In one network implementation, one or more devices coupled to a network may serve as a controller for one or more other devices coupled to the network (e.g., in a master/slave relationship). In another implementation, a networked environment may include one or more dedicated controllers that are configured to control one or more of the devices coupled to the network. Generally, multiple devices coupled to the network each may have access to data that is present on the communications medium or media; however, a given device may be “addressable” in that it is configured to selectively exchange data with (i.e., receive data from and/or transmit data to) the network, based, for example, on one or more particular identifiers (e.g., “addresses”) assigned to it.
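The selective-response behavior of an addressable device can be sketched as a unit that watches all traffic on the shared medium but acts only on packets bearing its own identifier (or a broadcast identifier). The packet shape and the broadcast value below are assumptions for illustration:

```python
BROADCAST = 0xFF  # an assumed all-call address

class LightingUnit:
    """A networked lighting unit is 'addressable' in that it sees every
    packet on the shared medium but changes state only when a packet
    carries its assigned address or the broadcast address."""

    def __init__(self, address):
        self.address = address
        self.level = 0

    def receive(self, packet):
        addr, level = packet
        if addr in (self.address, BROADCAST):
            self.level = level

units = [LightingUnit(a) for a in (1, 2, 3)]
for u in units:
    u.receive((2, 200))            # only the unit addressed as 2 responds
addressed_levels = [u.level for u in units]
for u in units:
    u.receive((BROADCAST, 50))     # every unit responds to a broadcast
```

In a master/slave arrangement, the controller simply emits such packets onto the network; each unit's address filter does the rest.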
The term "network" as used herein refers to any interconnection of two or more devices (including controllers) that facilitates the transport of information (e.g., for device control, data storage, data exchange, diagnostics, etc.) between any two or more devices and/or among multiple devices coupled to the network. As should be readily appreciated, various implementations of networks suitable for interconnecting multiple devices may include any of a variety of network topologies and employ any of a variety of communication protocols. Additionally, in various networks according to the present invention, any one connection between two devices may represent a dedicated connection between the two systems, or alternatively a shared or other non-dedicated connection. In addition to carrying information intended for the two devices, such a non-dedicated connection may carry information not necessarily intended for either of the two devices. Furthermore, it should be readily appreciated that various networks of devices as discussed herein may employ one or more wireless, wired, cable, fiber optic and/or other links to facilitate information transport throughout the network.
The term “user interface” as used herein refers to an interface between a human user or operator and one or more devices that enables communication between the user and the device(s). Examples of user interfaces that may be employed in various implementations of the present invention include, but are not limited to, switches, potentiometers, buttons, dials, sliders, a mouse, keyboard, keypad, various types of game controllers (e.g., joysticks), track balls, display screens, various types of graphical user interfaces (GUIs), touch screens, microphones and other types of sensors that may receive some form of human-generated stimulus and generate a signal in response thereto.
The following patents and patent applications are hereby incorporated herein by reference:
U.S. Pat. No. 6,016,038, issued Jan. 18, 2000, entitled “Multicolored LED Lighting Method and Apparatus;”
U.S. Pat. No. 6,211,626, issued Apr. 3, 2001, entitled "Illumination Components;"
U.S. Pat. No. 6,608,453, issued Aug. 19, 2003, entitled “Methods and Apparatus for Controlling Devices in a Networked Lighting System;”
U.S. Pat. No. 6,548,967, issued Apr. 15, 2003, entitled “Universal Lighting Network Methods and Systems;”
U.S. Pat. No. 6,777,891, filed May 30, 2002, entitled “Methods and Apparatus for Controlling Devices in a Networked Lighting System;”
U.S. patent application Ser. No. 09/886,958, filed Jun. 21, 2001, entitled “Method and Apparatus for Controlling a Lighting System in Response to an Audio Input;”
U.S. patent application Ser. No. 10/078,221, filed Feb. 19, 2002, entitled “Systems and Methods for Programming Illumination Devices;”
U.S. patent application Ser. No. 09/344,699, filed Jun. 25, 1999, entitled “Method for Software Driven Generation of Multiple Simultaneous High Speed Pulse Width Modulated Signals;”
U.S. patent application Ser. No. 09/805,368, filed Mar. 13, 2001, entitled “Light-Emitting Diode Based Products;”
U.S. patent application Ser. No. 09/716,819, filed Nov. 20, 2000, entitled “Systems and Methods for Generating and Modulating Illumination Conditions;”
U.S. patent application Ser. No. 09/675,419, filed Sep. 29, 2000, entitled “Systems and Methods for Calibrating Light Output by Light-Emitting Diodes;”
U.S. patent application Ser. No. 09/870,418, filed May 30, 2001, entitled “A Method and Apparatus for Authoring and Playing Back Lighting Sequences;”
U.S. patent application Ser. No. 10/045,629, filed Oct. 25, 2001, entitled “Methods and Apparatus for Controlling Illumination;”
U.S. patent application Ser. No. 10/163,085, filed Jun. 5, 2002, entitled “Systems and Methods for Controlling Programmable Lighting Systems;”
U.S. patent application Ser. No. 10/325,635, filed Dec. 19, 2002, entitled “Controlled Lighting Methods and Apparatus;” and
U.S. patent application Ser. No. 10/360,594, filed Feb. 6, 2003, entitled “Controlled Lighting Methods and Apparatus.”
Methods and systems are provided herein for supplying control signals for lighting systems, including methods and systems for authoring effects and shows for lighting systems.
Various embodiments of the present invention are described below, including certain embodiments relating particularly to LED-based light sources. It should be appreciated, however, that the present invention is not limited to any particular manner of implementation, and that the various embodiments discussed explicitly herein are primarily for purposes of illustration. For example, while many of the examples herein describe LED-based implementations, the various concepts discussed herein may be usefully employed in a variety of environments involving LED-based light sources, other types of light sources not including LEDs, environments that involve both LEDs and other types of light sources in combination, and environments that involve non-lighting-related devices alone or in combination with various types of light sources. It will be understood that the terms “environment” and “lighting environment,” as used herein, are intended to refer to any environment or venue in which a lighting system and/or other related devices might be deployed to generate lighting effects, unless a different meaning is explicitly stated or otherwise clear from the context.
The lighting unit 100 shown in
Additionally, one or more lighting units 100 may be implemented in a variety of products including, but not limited to, various forms of light modules or bulbs having various shapes and electrical/mechanical coupling arrangements (including replacement or “retrofit” modules or bulbs adapted for use in conventional sockets or fixtures), as well as a variety of consumer and/or household products such as night lights, toys, games or game components, entertainment components or systems, utensils, appliances, kitchen aids, cleaning products, and the like.
The lighting unit 100 may include one or more light sources 104 (shown collectively as 104), wherein one or more of the light sources may be an LED-based light source that includes one or more light emitting diodes (LEDs). In one aspect of this embodiment, any two or more of the light sources 104 may be adapted to generate radiation of different colors (e.g. red, green, and blue, respectively). Although
As shown in
One or more of the light sources 104 may include a group of multiple LEDs or other types of light sources (e.g., various parallel and/or serial connections of LEDs or other types of light sources) that are controlled together by the processor 102. Additionally, it should be appreciated that one or more of the light sources 104 may include one or more LEDs that are adapted to generate radiation having any of a variety of spectra (i.e., wavelengths or wavelength bands), including, but not limited to, various visible colors (including essentially white light), various color temperatures of white light, ultraviolet, or infrared. LEDs having a variety of spectral bandwidths (e.g., narrow band, broader band) may be employed in various implementations of the lighting unit 100.
In another aspect, the lighting unit 100 may be constructed and arranged to produce a wide range of variable color radiation. For example, the lighting unit 100 may be particularly arranged such that the processor-controlled variable intensity light generated by two or more of the light sources 104 combines to produce a mixed colored light (including essentially white light having a variety of color temperatures). In particular, the color (or color temperature) of the mixed colored light may be varied by varying one or more of the respective intensities of the light sources (e.g., in response to one or more control signals output by the processor 102). Furthermore, the processor 102 may be particularly configured (e.g., programmed) to provide control signals to one or more of the light sources so as to generate a variety of static or time-varying (dynamic) multi-color (or multi-color temperature) lighting effects.
Thus, the lighting unit 100 may include a wide variety of colors of LEDs in various combinations, including two or more of red, green, and blue LEDs to produce a color mix, as well as one or more other LEDs to create varying colors and color temperatures of white light. For example, red, green and blue can be mixed with amber, white, UV, orange, IR or other colors of LEDs. Such combinations of differently colored LEDs in the lighting unit 100 can facilitate accurate reproduction of a host of desirable spectrums of lighting conditions, examples of which include, but are not limited to, a variety of outside daylight equivalents at different times of the day, various interior lighting conditions, lighting conditions to simulate a complex multicolored background, and the like. Other desirable lighting conditions can be created by removing particular pieces of spectrum that may be specifically absorbed, attenuated or reflected in certain environments. Water, for example, tends to absorb and attenuate most non-blue and non-green colors of light, so underwater applications may benefit from lighting conditions that are tailored to emphasize or attenuate some spectral elements relative to others.
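The additive color mixing described above amounts to driving each channel with a control signal and combining the resulting intensities. A minimal sketch follows, assuming the 0-255 control range discussed herein; the function name and simple linear normalization are illustrative assumptions, not the behavior of any particular lighting unit:

```python
def mix_channels(controls, max_control=255):
    """Normalize per-channel control values (0..max_control) to
    relative intensities in [0.0, 1.0].

    controls: dict mapping a channel name (e.g. 'red', 'green',
    'blue', 'amber', 'white') to its control-signal value. The
    linear mapping here is an assumption; a real unit may apply a
    calibration function per channel.
    """
    return {ch: value / max_control for ch, value in controls.items()}

# Example: a hypothetical "lavender" setting (red=125, blue=200)
lavender = mix_channels({"red": 125, "blue": 200})
```

Varying any one of the channel values shifts the mixed color, which is the mechanism the processor 102 uses to produce static or time-varying multi-color effects.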
The lighting unit 100 also may include a memory 114 to store various information. For example, the memory 114 may be employed to store one or more lighting programs for execution by the processor 102 (e.g., to generate one or more control signals for the light sources), as well as various types of data useful for generating variable color radiation (e.g., calibration information, discussed further below). The memory 114 also may store one or more particular identifiers (e.g., a serial number, an address, etc.) that may be used either locally or on a system level to identify the lighting unit 100. The memory 114 may include read-only memory, which may be programmable read-only memory, for storing information such as identifiers or boot information. In various embodiments, identifiers may be pre-programmed by a manufacturer, for example, and may be either alterable or non-alterable thereafter (e.g., via some type of user interface located on the lighting unit, via one or more data or control signals received by the lighting unit, etc.). Alternatively, such identifiers may be determined at the time of initial use of the lighting unit in the field, and again may be alterable or non-alterable thereafter.
One issue that may arise in connection with controlling multiple light sources 104 in the lighting unit 100, and controlling multiple lighting units 100 in a lighting system, relates to potentially perceptible differences in light output between substantially similar light sources. For example, given two virtually identical light sources being driven by respective identical control signals, the actual intensity of light output by each light source may be perceptibly different. Such a difference in light output may be attributed to various factors including, for example, slight manufacturing differences between the light sources, normal wear and tear over time that may differently alter the respective spectrums of the generated radiation, or other environmental factors or normal fabrication variations. For purposes of the present discussion, light sources for which a particular relationship between a control signal and resulting intensity is not known are referred to as “uncalibrated” light sources.
The use of one or more uncalibrated light sources as light sources 104 in the lighting unit 100 may result in generation of light having an unpredictable, or “uncalibrated,” color or color temperature. For example, consider a first lighting unit including a first uncalibrated red light source and a first uncalibrated blue light source, each controlled by a corresponding control signal having an adjustable parameter in a range from zero to 255 (0-255). For purposes of this example, if the red control signal is set to zero, blue light is generated, whereas if the blue control signal is set to zero, red light is generated. However, if both control signals are varied from non-zero values, a variety of perceptibly different colors may be produced (e.g., in this example, at the very least, many different shades of purple are possible). For example, suppose a particular desired color (e.g., lavender) is given by a red control signal having a value of 125 and a blue control signal having a value of 200.
Now consider a second lighting unit including a second uncalibrated red light source substantially similar to the first uncalibrated red light source of the first lighting unit, and a second uncalibrated blue light source substantially similar to the first uncalibrated blue light source of the first lighting unit. As discussed above, even if both of the uncalibrated red light sources are driven by respective identical control signals, the actual intensity of light output by each red light source may be perceptibly different. Similarly, even if both of the uncalibrated blue light sources are driven by respective identical control signals, the actual intensity of light output by each blue light source may be perceptibly different.
With the foregoing in mind, it should be appreciated that if multiple uncalibrated light sources are used in combination in lighting units to produce a mixed colored light as discussed above, the observed color (or color temperature) of light produced by different lighting units under identical control conditions may be perceptibly different. Specifically, consider again the “lavender” example above; the “first lavender” produced by the first lighting unit with a red control signal of 125 and a blue control signal of 200 indeed may be perceptibly different from a “second lavender” produced by the second lighting unit with a red control signal of 125 and a blue control signal of 200. More generally, the first and second lighting units generate uncalibrated colors by virtue of their uncalibrated light sources.
In view of the foregoing, the lighting unit 100 may include a calibration facility 104 to facilitate the generation of light having a calibrated (e.g., predictable, reproducible) color at any given time. In one aspect, the calibration facility 104 may be configured to adjust the light output of at least some light sources 104 of the lighting unit 100 so as to compensate for perceptible differences between similar light sources used in different lighting units.
For example, the processor 102 of the lighting unit 100 may be configured to control one or more of the light sources 104 so as to output radiation at a calibrated intensity that substantially corresponds in a predetermined manner to a control signal for the light source(s) 104. As a result of mixing radiation having different spectra and respective calibrated intensities, a calibrated color is produced. One or more calibration values for one or more of the light sources 104 may be stored in the memory 114, and the processor 102 may be programmed to apply the respective calibration values to the control signals for the corresponding light sources 104 so as to generate the calibrated intensities.
In one aspect, calibration values may be determined once (e.g., during a lighting unit manufacturing/testing phase) and stored in the memory 114 for use by the processor 102. In another aspect, the processor 102 may be configured to derive one or more calibration values dynamically (e.g. from time to time) with the aid of one or more photosensors (not shown) or other suitable devices. The photosensor(s) may be one or more external components coupled to the lighting unit 100, or alternatively may be integrated as part of the lighting unit 100 itself. A photosensor is one example of a signal source that may be integrated or otherwise associated with the lighting unit 100, and monitored by the processor 102 in connection with the operation of the lighting unit 100. Other examples of such signal sources are discussed further below.
The processor 102 may derive one or more calibration values by applying a reference control signal to a light source 104, and measuring (e.g., via one or more photosensors) an intensity of radiation generated in response. The processor 102 may then be programmed to make a comparison of the measured intensity and at least one reference value (e.g., representing an intensity that nominally would be expected in response to the reference control signal). Based on such a comparison, the processor 102 may determine one or more calibration values for the light source 104. In particular, the processor 102 may derive a calibration value such that, when applied to the reference control signal, the light source 104 outputs radiation having an intensity that corresponds to the reference value (i.e., the “expected” intensity).
In various aspects, one calibration value may be derived for an entire range of control signal/output intensities for a given light source 104. The calibration value may serve as a source value for a formula that calibrates light output, such as through a straight line approximation of calibration over the range of operation of the light source 104. Alternatively, multiple calibration values may be derived for a given light source (i.e., a number of calibration value “samples” may be obtained) that are respectively applied over different control signal/output intensity ranges, to approximate a nonlinear calibration function in a piecewise linear manner.
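The piecewise linear calibration described above can be sketched as follows. This is a hypothetical illustration under assumed sample data; the `make_calibrator` helper and its interpolation details are not drawn from the text beyond the idea of applying multiple calibration value samples over different control-signal ranges:

```python
import bisect

def make_calibrator(samples):
    """Build a calibration function from (control, measured) pairs.

    samples: sorted list of (control_value, measured_intensity)
    pairs, e.g. obtained by applying reference control signals and
    reading a photosensor. Returns a function that maps a control
    value to an expected intensity by piecewise linear interpolation
    between the sampled points.
    """
    controls = [c for c, _ in samples]
    intensities = [i for _, i in samples]

    def calibrate(control):
        # Clamp values outside the sampled range.
        if control <= controls[0]:
            return intensities[0]
        if control >= controls[-1]:
            return intensities[-1]
        # Locate the surrounding samples and interpolate linearly.
        hi = bisect.bisect_right(controls, control)
        lo = hi - 1
        frac = (control - controls[lo]) / (controls[hi] - controls[lo])
        return intensities[lo] + frac * (intensities[hi] - intensities[lo])

    return calibrate

# Example with three assumed calibration samples spanning 0-255
cal = make_calibrator([(0, 0.0), (128, 0.4), (255, 1.0)])
```

With a single sample pair, this reduces to the straight-line approximation mentioned above; with more samples, it approximates a nonlinear calibration function in a piecewise linear manner.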
The lighting unit 100 may include one or more interfaces 118 that are provided to facilitate any of a number of user-selectable settings or functions (e.g., generally controlling the light output of the lighting unit 100, changing and/or selecting various pre-programmed lighting effects to be generated by the lighting unit, changing and/or selecting various parameters of selected lighting effects, setting particular identifiers such as addresses or serial numbers for the lighting unit, etc.). Communication with the interface 118 of the lighting unit 100 may be accomplished through wire or cable, or wireless transmission. The interface 118 may present external controls that are, for example, physical controls such as switches, dials, buttons, or the like, programmatic controls, such as an application programming interface, or a user interface such as a graphical user interface on a computer. Similarly, the interface 118 may simply present a network interface that may be accessed through any corresponding network facility, and may be coupled in turn to a computer that provides a graphical user interface to a user for controlling the lighting unit 100. All such interfaces may be used, alone or in combination, to control operation of the lighting unit 100 described herein.
In one implementation, the processor 102 of the lighting unit 100 monitors the interface 118 and controls one or more of the light sources 104 based at least in part on signals, such as user signals, provided through the interface 118. For example, the processor 102 may be configured to respond to operation of the interface 118 by originating one or more control signals for controlling one or more of the light sources 104. Alternatively, the processor 102 may be configured to respond by selecting one or more pre-programmed control signals stored in memory 114, modifying control signals generated by executing a lighting program, selecting and executing a new lighting program from memory 114, or otherwise affecting the radiation generated by one or more of the light sources 104.
In a manually controlled embodiment, the interface 118 may include one or more switches (e.g., a standard wall switch) that interrupt power to the processor 102. The processor 102 may be configured to monitor the power as controlled by the switch of the interface 118, and in turn control one or more of the light sources 104 based at least in part on a duration of a power interruption caused by operation of the interface 118. As discussed above, the processor 102 may be particularly configured to respond to a predetermined duration of a power interruption by, for example, selecting one or more pre-programmed control signals stored in memory 114, modifying control signals generated by executing a lighting program, selecting and executing a new lighting program from memory 114, or otherwise affecting the radiation generated by one or more of the light sources 104.
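The duration-based switch behavior described above can be sketched as a simple threshold mapping; the particular thresholds and action names below are assumptions for illustration, since the text specifies only that predetermined durations select among stored behaviors:

```python
def select_action(interruption_seconds):
    """Map the duration of a power interruption to a lighting action.

    The thresholds here are hypothetical: a quick flick of the wall
    switch steps to the next pre-programmed control signal, a medium
    interruption loads a new lighting program from memory, and a long
    interruption resets to a default state.
    """
    if interruption_seconds < 1.0:
        return "next_preprogrammed_signal"
    if interruption_seconds < 5.0:
        return "new_lighting_program"
    return "reset_to_default"
```

In this way a standard wall switch, with no data connection, can still select among multiple stored behaviors of the lighting unit 100.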
Examples of the signal(s) 122 that may be received and processed by the processor 102 include, but are not limited to, one or more audio signals, video signals, power signals, various types of data signals, signals representing information obtained from a network (e.g., the Internet), signals representing one or more detectable/sensed conditions, signals from lighting units, signals consisting of modulated light, etc. In various implementations, the signal source(s) 124 may be located remotely from the lighting unit 100, or included as a component of the lighting unit 100. For example, in one embodiment, a signal from one lighting unit could be sent over a network to another lighting unit.
Some examples of a signal source 124 that may be employed in, or used in connection with, the lighting unit 100 include any of a variety of sensors or transducers that generate one or more signals 122 in response to some stimulus. Examples of such sensors include, but are not limited to, various types of environmental condition sensors, such as thermally sensitive (e.g., temperature, infrared) sensors, humidity sensors, motion sensors, photosensors/light sensors (e.g., sensors that are sensitive to one or more particular spectra of electromagnetic radiation), various types of cameras, sound or vibration sensors or other pressure/force transducers (e.g., microphones, piezoelectric devices), and the like.
Additional examples of a signal source 124 include various metering/detection devices that monitor electrical signals or characteristics (e.g., voltage, current, power, resistance, capacitance, inductance, etc.) or chemical/biological characteristics (e.g., acidity, a presence of one or more particular chemical or biological agents, bacteria, etc.) and provide one or more signals 122 based on measured values of the signals or characteristics. Yet other examples of a signal source 124 include various types of scanners, image recognition systems, voice or other sound recognition systems, artificial intelligence and robotics systems, and the like. A signal source 124 could also be another lighting unit 100, a processor 102, or any one of many available signal generating devices, such as media players, MP3 players, computers, DVD players, CD players, television signal sources, camera signal sources, microphones, speakers, telephones, cellular phones, instant messenger devices, SMS devices, wireless devices, personal organizer devices, and many others.
The lighting unit 100 also may include one or more optical elements 130 to optically process the radiation generated by the light sources 104. For example, one or more optical elements 130 may be configured to alter both a spatial distribution and a propagation direction of radiation from the light sources 104. In particular, one or more optical elements may be configured to change a diffusion angle of the generated radiation. In one aspect of this embodiment, one or more optical elements 130 may be particularly configured to variably change one or both of a spatial distribution and a propagation direction of the generated radiation (e.g., in response to some electrical and/or mechanical stimulus). Examples of optical elements that may be included in the lighting unit 100 include, but are not limited to, reflective materials, refractive materials, diffusing materials, translucent materials, filters, lenses, mirrors, and fiber optics. The optical element 130 also may include a phosphorescent material, luminescent material, or other material capable of responding to or interacting with radiation from the light sources 104.
The lighting unit 100 may include one or more communication ports 120 to facilitate coupling of the lighting unit 100 to any of a variety of other devices. For example, one or more communication ports 120 may facilitate coupling multiple lighting units together as a networked lighting system, in which at least some of the lighting units are addressable (e.g., have particular identifiers or addresses) and are responsive to particular data transported across the network. It will be appreciated that the interface 118 may also serve as a communication port, and that the communication port 120 may include an interface for any suitable wired or wireless communications, and that notwithstanding the separate description of these components, all such possible combinations are intended to be included within the lighting unit 100 as described herein.
In a networked lighting system environment, as discussed in greater detail further below, the processor 102 of the lighting unit 100 may be configured to respond to particular data (e.g., lighting control commands) received over the network (not shown) that pertain to it. Once the processor 102 identifies particular data intended for it, such as by examining addressing information therein, it may read the data and, for example, change the lighting conditions produced by its light sources 104 according to the received data (e.g., by generating appropriate control signals to the light sources 104). In one aspect, the memory 114 of the lighting unit 100 (and other lighting units in a network) may be loaded, for example, with a table of lighting control signals that correspond with data the processor 102 receives. Once the processor 102 receives data from the network, the processor 102 may consult the table to select the control signals that correspond to the received data, and control the light sources 104 of the lighting unit 100 accordingly.
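The address check and table lookup described above can be sketched as follows; the packet layout and table contents are hypothetical assumptions, illustrating only the examine-address-then-consult-table flow:

```python
def handle_packet(packet, my_address, signal_table):
    """Process a network packet addressed to a lighting unit.

    packet: dict with 'address' and 'data' fields (format assumed).
    signal_table: mapping from received data values to control
    signals, analogous to the table loaded into memory 114.
    Returns the control signal to apply, or None if the packet is
    not addressed to this unit or the data is unrecognized.
    """
    if packet.get("address") != my_address:
        return None  # data not intended for this unit
    return signal_table.get(packet.get("data"))

# Hypothetical table: received data value -> (R, G, B) control signal
table = {0x01: (255, 0, 0), 0x02: (0, 0, 255)}
```

Each addressable lighting unit on the network runs the same check against its own identifier, so a single broadcast stream can drive many units to different states.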
In one aspect of this embodiment, the processor 102 may be configured to interpret lighting instructions/data that are received in a DMX protocol (as discussed, for example, in U.S. Pat. Nos. 6,016,038 and 6,211,626), which is a lighting command protocol conventionally employed in the lighting industry for some programmable lighting applications. However, it should be appreciated that other communication protocols may be suitably employed with the systems described herein.
The lighting unit 100 may include and/or be coupled to one or more power sources 108. The power source(s) 108 may include, but are not limited to, AC power sources, DC power sources, batteries, solar-based power sources, thermoelectric or mechanical-based power sources and the like. Additionally, in one aspect, the power source(s) 108 may include or be associated with one or more power conversion devices that convert power received by an external power source to a form suitable for operation of the lighting unit 100.
While not shown explicitly in
The lighting unit 100 also may have any one of a variety of mounting arrangements, enclosure/housing arrangements and shapes to partially or fully enclose the light sources 104, and/or provide electrical and mechanical connection configurations to the lighting unit 100 or the light sources 104. In particular, a lighting unit 100 may be configured as a replacement or “retrofit” to engage electrically and mechanically in a conventional socket or fixture arrangement (e.g., an Edison-type screw socket, a halogen fixture arrangement, a fluorescent fixture arrangement, etc.). Additionally, the mounting arrangements may include electromechanical devices for controlling light output, such as robotics to point a lighting unit 100 in various directions, a focus control to change focus of a beam of light emitting from the lighting unit 100, or a selector to change filters for light emitted from the lighting unit 100. All such electromechanical systems may be included in the lighting unit 100, and may be employed to generate the various lighting effects described herein.
Additionally, one or more optical elements 130 as discussed above may be partially or fully integrated with an enclosure/housing arrangement for the lighting unit 100. Furthermore, the lighting unit 100 may optionally be associated with (e.g., include, be coupled to and/or packaged together with) various other components such as control circuitry including the processor 102 and/or memory 114, one or more sensors, transducers, or other signal sources 124, interfaces 118 (including user interfaces and controls), displays, power sources 108, power conversion devices, and other components relating to the operation of the light source(s) 104.
While not shown explicitly in
As shown in
In the system 200 of
For example, according to one embodiment of networked lighting system 200, the central controller 204 may communicate with the LUCs 208 using an Ethernet network, and in turn the LUCs 208 may use DMX-based communications with the lighting units 100. This topology is depicted generally in
In the networked lighting system 200 of
It should again be appreciated that the foregoing example of using multiple different communication implementations (e.g., Ethernet/DMX) in a lighting system according to one embodiment of the present invention is for purposes of illustration only, and that the invention is not limited to this particular example. Rather, the generic network architecture depicted in
The graphical source 302 may be any source for a graphical representation of an image, such as a drawing or photograph, or an image source file using any of a variety of formats for storing graphical files, such as bit-mapped files, JPEG files, PNG files, PDF files, and so on. The static image may include images captured from a computer screen, television screen, or other video output. The static image may also be a printed or hand-rendered image, or any other image in any tangible form or media. The graphical representation may also be an image generated by a computer application, such as any number of graphical computer tools, page layout programs, software design studios, computer-assisted design tools, or a rendering program or routine for more general aesthetic images such as screen savers, skins, and visualizations used in many audio-visual and media programs. Specific examples of software that may be used to render images include the Flash media family of programs offered by Macromedia, Incorporated, as well as Adobe Illustrator, Adobe Photoshop, and Adobe LiveMotion. There are many other programs that can be used to generate both static and dynamic images. For example, Microsoft Corporation provides a number of software products for image manipulation including Paint for working directly with bit-mapped files, a generic set of drawing tools available in the Microsoft Office suite, and DirectX software libraries for rendering three-dimensional objects. Other formats such as vector graphics formats, printing formats, media compression formats, audio-visual communication formats, and so on provide various techniques for creating and communicating images in computer form. All such programs and formats may be usefully employed as graphical sources 302 in the systems described herein.
It will be generally appreciated that images provided by the graphical source 302 may never be displayed, and may instead be provided to other components such as the conversion module 308 directly in digital form. Thus visual effects such as a flame may be synthesized without human intervention. Additionally, some visual effects may be applied to still or moving images, including morphs, fades, swipes, and other effects well known in the audio-visual arts. Algorithms or functions may be applied to still and/or moving images to generate these effects as graphical representations without display or human intervention. Generally, lighting units 100 in a networked lighting system 200 may generate effects without requiring viewing or review on a display device prior to rendering in the networked lighting system 200.
The graphical source 302 may be a program for creating lighting effects in a two- or three-dimensional lighting environment. For example, a user may specify an explosion lighting effect. The desired effect may be an initial bright white light in a corner of a room with light traveling away from the corner (possibly with changing color) at a specified speed and in a specified direction. In an embodiment, the program may execute a function or algorithm that produces an event such as an explosion, a lightning strike, headlights, a train passing through a room, a bullet shot through a room, a light moving through a room, a sunrise across a room, or any other event that might be realized with a number of lighting units 100 in a space. The function or algorithm may represent an image such as lights swirling in a room, balls of light bouncing in a room, sounds bouncing in a room, or other images. The function or algorithm may also represent randomly generated effects, repeating effects or other effects.
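An effect function of the kind described above can be sketched as computing, for each lighting unit, an intensity from the unit's distance to the effect origin and the elapsed time. The wave shape, speed, and width below are illustrative assumptions, not parameters taken from the text:

```python
import math

def explosion_intensity(unit_pos, origin, t, speed=2.0, width=0.5):
    """Intensity (0..1) of an expanding spherical wave at time t.

    unit_pos, origin: (x, y, z) coordinates of the lighting unit
    and of the effect's starting point (e.g. a corner of the room).
    The wavefront travels outward at `speed` units per second;
    `width` controls how sharply intensity falls off around it.
    """
    distance = math.dist(unit_pos, origin)
    wavefront = speed * t
    # Gaussian falloff centered on the expanding wavefront, so each
    # unit brightens and dims as the wave passes its position.
    return math.exp(-((distance - wavefront) / width) ** 2)
```

Evaluating such a function at each refresh for every mapped lighting unit yields a wave of light that propagates through the space, without any frame-by-frame authoring of individual units.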
Referring again to
The light system configuration facility 304 can represent individual lighting units 100 or a networked lighting system 200, or a number of networked lighting systems 200, or any combination of these, and may provide configuration data on the capabilities and control of individual lighting units 100, as well as information concerning physical locations of lighting units 100. In this context, lighting units 100 may include, for example, tiles including arrays of LEDs, organic LEDs (which may be fabricated as luminous sheets), cove lights, ceiling lights, spot lights, and so on. Similarly, the configuration facility 304 may determine, or be provided with, surfaces that can be lit by various lighting units 100. For example, where a lighting effect calls for a particular section of a room to change in hue, saturation or brightness, control signals may be provided to direct one or more lighting units 100 at walls, or regions of walls in the appropriate section of the room.
Referring still to
Referring to
The representation 602 may be used to design and generate lighting effects. For example, a set of stored effects can be represented by icons 610 on the screen 612. An explosion icon can be selected with a mouse-controlled cursor or other interface technique, which may prompt the user to click on a starting and ending point for the explosion in the coordinate system. By locating a vector in the representation, the user can cause an explosion to be initiated, for example, in an upper corner of the environment 602, and a wave of light and/or sound may propagate through the environment 602. With all of the light systems 102 in predetermined positions, as identified in the configuration file 500, the representation of the explosion can be played in the room by the light system and/or another system such as a sound system.
Once information is entered for the lighting system 200, the program may be used to deploy lighting effects on the lighting system using the techniques generally described above with reference to
Referring again to
In an embodiment, the image information may be communicated from a central controller. The information may be altered before a lighting system responds to it. For example, the image information may be directed to a position within a position map. All of the information directed at a position in the map may be collected prior to sending the information to a lighting system. This may be accomplished every time the image is refreshed, every time that section of the image is refreshed, or at other times. In an embodiment, an algorithm may be performed on the collected information. The algorithm may average the information; calculate and select the maximum, the minimum, the first quartile, the third quartile, the most used value, or the integral of the information; or perform another calculation on the information. This step may be completed to level the effect of the lighting system in response to the information received. For example, the information in one refresh cycle may change the information in the map several times, and the effect may be viewed best when the projected light takes on one value in a given refresh cycle.
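The aggregation step above can be sketched as follows; the function name and the choice of a flat list of collected samples are illustrative assumptions, not part of any actual product interface:

```python
from statistics import mean, quantiles
from collections import Counter

def aggregate_samples(samples, method="average"):
    """Collapse the values collected for one map position during a refresh
    cycle into a single value, per the leveling step described above.
    The method names mirror the options listed in the text."""
    if method == "average":
        return mean(samples)
    if method == "maximum":
        return max(samples)
    if method == "minimum":
        return min(samples)
    if method == "first_quartile":
        return quantiles(samples, n=4)[0]
    if method == "third_quartile":
        return quantiles(samples, n=4)[2]
    if method == "most_used":
        # most frequently occurring value in this refresh cycle
        return Counter(samples).most_common(1)[0][0]
    raise ValueError(f"unknown method: {method}")
```

A lighting system would then be sent one aggregated value per position per refresh cycle, rather than every intermediate update.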
In an embodiment, the information communicated to a lighting system may be altered before the lighting system responds to it. For example, the information format may change prior to communication. The information may be communicated from a computer through a USB port or other communication port, and the format of the information may be changed to a lighting protocol such as DMX when the information is communicated to the lighting system. In an embodiment, the information or control signals may be communicated to a lighting system or other system through a communications port of a computer, portable computer, notebook computer, personal digital assistant or other system. The information or control signals may also be stored in memory, electronic or otherwise, to be retrieved at a later time. Systems such as the iPlayer and SmartJack systems, manufactured and sold by Color Kinetics Incorporated, can be used to communicate and/or store lighting control signals.
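As a rough illustration of reformatting information into a lighting protocol, the sketch below packs per-channel levels into a DMX512-style data packet. The framing shown is simplified: real DMX output also involves the serial break and mark-after-break timing handled by interface hardware, and the function name is hypothetical.

```python
def build_dmx_frame(channel_values):
    """Pack channel levels (0-255) into a DMX512-style data packet:
    a start code byte followed by up to 512 channel bytes."""
    if len(channel_values) > 512:
        raise ValueError("a DMX universe carries at most 512 channels")
    frame = bytearray([0x00])  # start code 0 = standard dimmer data
    frame.extend(channel_values)
    return bytes(frame)
```

A bridge device such as a USB-to-DMX adapter would then transmit each frame onto the lighting network with the required line timing.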
In an embodiment, several systems may be associated with position maps; the several systems may share a position map, or the systems may reside in independent position areas. For example, the position of a lighted surface from a first lighting system may intersect with a lighted surface from a second lighting system. The two systems may still respond to information communicated to either of the lighting systems. In an embodiment, the interaction of two lighting systems may also be controlled. An algorithm, function or other technique may be used to change the lighting effects of one or more of the lighting systems in an interactive space. For example, if the interactive space is greater than half of the non-interactive space from a lighting system, the lighting system's hue, saturation or brightness may be modified to compensate for the interactive area. This may be used to adjust the overall appearance of the interactive area or an adjacent area, for example.
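The compensation rule described above might be sketched as follows. The threshold test follows the half-of-non-interactive-space rule in the text, while the linear brightness scaling is an illustrative assumption:

```python
def compensate_brightness(brightness, interactive_area, total_area):
    """Reduce a light system's brightness when too much of its lit
    surface overlaps another system's surface. The trigger condition
    is the rule from the text; the scaling factor is illustrative."""
    non_interactive = total_area - interactive_area
    if interactive_area > non_interactive / 2:
        # scale down in proportion to the overlapping fraction
        return brightness * (1 - interactive_area / total_area / 2)
    return brightness
```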
Control signals generated using methods and/or systems according to the principles of the present invention can be used to produce a vast variety of effects. Imagine a fire or explosion effect that one wishes to have move across a wall or room. It starts at one end of the room as a white flash that quickly moves out, followed by a high-brightness yellow wave whose intensity varies as it moves through the room. When generating a control signal according to the principles of the present invention, a lighting designer does not have to be concerned with the individual lights in the room or with the timing and generation of each light system's lighting effects. Rather, the designer only needs to be concerned with the relative position or actual position of those lights in the room. The designer can lay out the lighting in a room and then associate the lights in the room with graphical information, such as pixel information, as described above. The designer can program the fire or explosion effect on a computer, using Flash 5 for example, and the information can be communicated to the light systems 102 in an environment. The position of the lights in the environment may be considered, as well as the surfaces 107 or areas 702 that are going to be lit.
In an embodiment, the lighting effects could also be coupled to sound that adds to and reinforces the lighting effects. An example is a ‘red alert’ sequence where a ‘whoop whoop’ siren-like effect is coupled with the entire room pulsing red in concert with the sound. One stimulus reinforces the other. Simulating the sound and movement of an earthquake using low-frequency sound and flickering lights is another example of coordinating these effects. Movement of light and sound can be used to indicate direction.
In an embodiment the lights are represented in a two-dimensional or plan view. This allows representation of the lights in a plane where the lights can be associated with various pixels. Standard computer graphics techniques, such as animation tweening, and even standard tools can then be used to create lighting effects. Macromedia Flash, for example, works with relatively low-resolution graphics for creating animations on the web. Flash uses simple vector graphics to easily create animations, and the vector representation is efficient for streaming applications, such as sending animations over the World Wide Web. The same technology can be used to create animations from which lighting commands are derived by mapping the pixel information or vector information to vectors or pixels that correspond to positions of light systems 102 within a coordinate system for an environment 100.
For example, an animation window of a computer 600 can represent a room or other environment containing the lights. Pixels in that window can correspond to lights within the room, or a low-resolution averaged image can be created from the higher-resolution image. In this way, lights in the room can be activated when a corresponding pixel or neighborhood of pixels turns on. Because LED-based lighting technology can create any color on demand using digital control information (see U.S. Pat. Nos. 6,016,038, 6,150,774, and 6,166,496), the lights can faithfully recreate the colors in the original image.
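One way to derive a light's color from a neighborhood of pixels is to average a small region around the pixel that corresponds to the light, as in this sketch (the square region shape and integer averaging are illustrative assumptions):

```python
def average_region(image, cx, cy, radius=1):
    """Average the RGB pixels in a neighborhood around (cx, cy).
    'image' is a 2D list of (r, g, b) tuples, e.g. one animation frame;
    the result can serve as the color command for the light mapped to
    that region."""
    h, w = len(image), len(image[0])
    acc = [0, 0, 0]
    count = 0
    for y in range(max(0, cy - radius), min(h, cy + radius + 1)):
        for x in range(max(0, cx - radius), min(w, cx + radius + 1)):
            for c in range(3):
                acc[c] += image[y][x][c]
            count += 1
    return tuple(channel // count for channel in acc)
```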
Some examples of effects that could be generated using systems and methods according to the principles of the invention include, but are not limited to, explosions, colors, underwater effects, turbulence, color variation, fire, missiles, chases, rotation of a room, shape motion, tinkerbell-like shapes, lights moving in a room, and many others. Any of the effects can be specified with parameters, such as frequencies, wavelengths, wave widths, peak-to-peak measurements, velocities, inertia, friction, speed, width, spin, vectors, and the like. Any of these can be coupled with other effects, such as sound.
In computer graphics, anti-aliasing is a technique for removing staircase effects in imagery where edges are drawn and resolution is limited. This effect can be seen on television when a narrow striped pattern is shown. The edges appear to crawl like ants as the lines approach the horizontal. In a similar fashion, the lighting can be controlled in such a way as to provide a smoother transition during effect motion. The effect parameters such as wave width, amplitude, phase or frequency can be modified to provide better effects.
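A smoothing of this kind might, for example, drive each light's intensity from a continuous falloff around the wavefront rather than a hard on/off edge; the Gaussian profile below is an illustrative choice, not a prescribed method:

```python
import math

def wave_intensity(light_pos, wave_center, wave_width):
    """Intensity of a light at a given 1D position as a wave passes.
    Lights near the wavefront fade in and out smoothly, the lighting
    analogue of anti-aliasing described above."""
    distance = abs(light_pos - wave_center)
    return math.exp(-(distance / wave_width) ** 2)
```

Sampling this function each frame as `wave_center` advances yields smooth effect motion even with a coarse grid of lights.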
For example, referring to
The wave 802 shown in
Effects can have associated motion and direction, i.e., a velocity. Other physical parameters, such as friction, inertia, and momentum, can also be described. Moreover, an effect can have a specific trajectory. In an embodiment, each light may have a representation that gives attributes of the light. This can take the form of a 2D position, for example. A light system 102 can have any of various degrees of freedom assigned (e.g., xyz-rpy), or any combination of them.
The techniques listed here are not limited to lighting. Control signals can be propagated to other devices based on their positions, such as special-effects devices: pyrotechnics, smell-generating devices, fog machines, bubble machines, moving mechanisms, acoustic devices, acoustic effects that move in space, or other systems.
An embodiment of the present invention is a method of automatically capturing the positions of the light systems 102 within an environment. An imaging device may be used as a means of capturing the position of a light. A camera, connected to a computing device, can capture an image for analysis and calculation of the position of the light.
Where a 3D position is desired a second image may be captured to triangulate the position of the light in another coordinate dimension. This is the stereo problem. In the same way human eyes determine depth through the correspondence and disparity between the images provided by each eye, a second set of images may be taken to provide the correspondence. The camera is either duplicated at a known position relative to the first camera or the first camera is moved a fixed distance and direction. This movement or difference in position establishes the baseline for the two images and allows derivation of a third coordinate (e.g., (x,y,z)) for the light system 102.
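The depth recovery described here follows the standard stereo disparity relation, depth = focal length × baseline / disparity, sketched below; a real capture pipeline would also require camera calibration and correspondence matching between the two images:

```python
def triangulate_depth(x_left, x_right, baseline, focal_length):
    """Recover the depth of a light from its horizontal image positions
    in two views separated by a known baseline (same units as the
    returned depth). Positions and focal length are in pixels."""
    disparity = x_left - x_right
    if disparity == 0:
        raise ValueError("zero disparity: point at infinity or a mismatch")
    return focal_length * baseline / disparity
```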
Another embodiment of the invention is depicted in
Using the techniques described herein, including techniques for determining positions of light systems in environments, techniques for modeling effects in environments (including time- and geometry-based effects), and techniques for mapping light system environments to virtual environments, it is possible to model an unlimited range of effects in an unlimited range of environments. Effects need not be limited to those that can be created on a square or rectangular display. Instead, light systems can be disposed in a wide range of lines, strings, curves, polygons, cones, cylinders, cubes, spheres, hemispheres, non-linear configurations, clouds, and arbitrary shapes and configurations, then modeled in a virtual environment that captures their positions in selected coordinate dimensions. Thus, light systems can be disposed in or on the interior or exterior of any environment, such as a room, building, home, wall, object, product, retail store, vehicle, ship, airplane, pool, spa, hospital, operating room, or other location.
In embodiments, the light system may be associated with code for the computer application, so that the computer application code is modified or created to control the light system. For example, object-oriented programming techniques can be used to attach attributes to objects in the computer code, and the attributes can be used to govern behavior of the light system. Object oriented techniques are known in the field, and can be found in texts such as “Introduction to Object-Oriented Programming” by Timothy Budd, the entire disclosure of which is herein incorporated by reference. It should be understood that other programming techniques may also be used to direct lighting systems to illuminate in coordination with computer applications, object oriented programming being one of a variety of programming techniques that would be understood by one of ordinary skill in the art to facilitate the methods and systems described herein.
In an embodiment, a developer can attach the light system inputs to objects in the computer application. For example, the developer may have an abstraction of a light system 102 that is added to the code construction, or object, of an application object. An object may consist of various attributes, such as position, velocity, color, intensity, or other values. A developer can add light as an instance in an object in the code of a computer application. For example, the object could be a vector in an object-oriented computer animation program or solid modeling program, with attributes such as direction and velocity. A light system 102 can be added as an instance of the object of the computer application, and the light system can have attributes such as intensity, color, and various effects. Thus, when events occur in the computer application that call on the object of the vector, a thread running through the program can draw code to serve as an input to the processor of the light system. The light can accurately represent geometry, placement, or spatial location; represent a value of an attribute or trait; or provide an indication of other elements or objects.
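A minimal sketch of this pattern follows, with a hypothetical application object carrying a light abstraction as one of its attributes; the class and attribute names are invented for the example and are not the Directlight API:

```python
class Light:
    """Abstraction of a light system attached to an application object."""
    def __init__(self, color=(255, 255, 255), intensity=1.0):
        self.color = color
        self.intensity = intensity

class Missile:
    """A moving application object that carries a Light instance, so
    events that update the object can also drive the light system."""
    def __init__(self, position, velocity):
        self.position = position
        self.velocity = velocity
        self.light = Light(color=(255, 64, 0))  # fiery orange

    def step(self, dt):
        # advance the object; the returned pair could serve as input
        # to the light system's processor
        self.position = tuple(p + v * dt
                              for p, v in zip(self.position, self.velocity))
        return self.position, self.light.color
```

When the application's event loop moves the missile, the light mapped nearest its position can be driven with the object's color.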
Referring to
Using such object-oriented light input to the light system 102 from code for a computer application, various lighting effects can be associated in the real world environment with the virtual world objects of a computer application. For example, in animation of an effect such as explosion of a polygon, a light effect can be attached to the explosion of the polygon, along with sound, flashing, motion, vibration and other temporal effects. Further, the light system 102 could include other effects devices, including sound producing devices, motion producing devices, fog machines, rain machines or other devices that could also produce indications related to that object.
Referring to
At a step 1312, the host of the method may provide an interface for mapping. The mapping function may be done with a function, e.g., "project-all-lights," as described in the Directlight API below and in Appendix A, that maps real-world lights using a simple user interface, such as a drag-and-drop interface. The placement of the lights may not be as important as the surface the lights are directed towards. It may be this surface that reflects the illumination back into the environment, and as a result it may be this surface that is most important for the mapping program. The mapping program may map these surfaces rather than the light system locations, or it may map both the locations of the light systems and the light on the surfaces.
A system for providing the code for coordinated illumination may be any suitable computer capable of allowing programming, including a processor, an operating system, and memory, such as a database, for storing files for execution.
Each real light 102 may have attributes that are stored in a configuration file. An example of a structure for a configuration file is depicted in
To simplify the configuration file, various techniques can be used. In embodiments, hemispherical cameras, sequenced in turn, can be used as a baseline with scaling factors to triangulate the lights and automatically generate a configuration file without ever having to measure where the lights are. In embodiments, the configuration file can be typed in, or a graphical user interface can be used to drag and drop light sources onto a representation of an environment. The developer can create a configuration file that matches the fixtures with their true placement in a real environment. For example, once the lighting elements are dragged and dropped in the environment, the program can associate the virtual lights in the program with the real lights in the environment. An example of a light authoring program to aid in the configuration of lighting is included in U.S. patent application Ser. No. 09/616,214, “Systems and Methods for Authoring Lighting Sequences.” Color Kinetics Inc. also offers a suitable authoring and configuration program called “ColorPlay.”
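Purely as an illustration, a configuration file of this kind might store each fixture's address together with its position and the surface it illuminates; the field names and schema below are hypothetical and do not reflect the actual file structure:

```python
import json

# Hypothetical configuration entries: fixture address, 3D position of
# the fixture, and the surface it lights. Schema is illustrative only.
config = [
    {"address": 1, "position": [0.0, 2.5, 0.0], "surface": "north wall"},
    {"address": 2, "position": [4.0, 2.5, 0.0], "surface": "ceiling"},
]

def lights_for_surface(entries, surface):
    """Look up which fixture addresses illuminate a given surface,
    supporting surface-based mapping as described above."""
    return [e["address"] for e in entries if e["surface"] == surface]

def save_config(entries, path):
    """Persist the configuration as JSON (one possible on-disk format)."""
    with open(path, "w") as f:
        json.dump(entries, f, indent=2)
```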
Further details as to the implementation of the code can be found in the Directlight API document attached hereto as Appendix A, the disclosure of which is incorporated by reference herein. Directlight API is a programmer's interface that allows a programmer to incorporate lighting effects into a program. Object-oriented programming is just one example of a programming technique used to incorporate lighting effects; lighting effects could be incorporated using any programming language or method of programming. In object-oriented programming, the programmer is often simulating a 3D space.
In the above examples, lights were used to indicate the position of objects which produce the expected light or have light attached to them. There are many other ways in which light can be used. The lights in the light system can be used for a variety of purposes, such as to indicate events in a computer application (such as a game), or to indicate levels or attributes of objects.
Simulation types of computer applications are often 3D rendered and have objects with attributes as well as events. A programmer can code events into the application for a simulation, such as a simulation of a real world environment. A programmer can also code attributes or objects in the simulation. Thus, a program can track events and attributes, such as explosions, bullets, prices, product features, health, other people, patterns of light, and the like. The code can then map from the virtual world to the real world. In embodiments, at an optional step, the system can add to the virtual world with real world data, such as from sensors or input devices. Then the system can control real and virtual world objects in coordination with each other. Also, by using the light system as an indicator, it is possible to give information through the light system that aids a person in the real world environment.
Architectural visualization, mechanical engineering models, and other solid modeling environments are encompassed herein as embodiments. In these virtual environments, lighting is often relevant both in the virtual environment and in a solid-model real-world visualization environment. The user can thus position and control a light system 102 that illuminates a real-world solid model in correspondence to illumination conditions that are created in the virtual-world modeling environment. Scale physical models in a room of lights can be modeled for lighting during the course of a day or year, or during different seasons, for example, possibly to detect previously unknown interactions between the light and various building surfaces. Another example would be to construct a replica of a city or portion of a city in a room with a lighting system such as those discussed above. The model could then be analyzed for color changes over a period of time, shadowing, or other lighting effects. In an embodiment, this technique could be used for landscape design. In an embodiment, the lighting system is used to model the interior space of a room, building, or other piece of architecture. For example, an interior designer may want to project the colors of the room, or of fabric or objects in the room, with colors representing various times of the day, year, or season. In an embodiment, a lighting system is used in a store near a paint section to allow for simulation of lighting conditions on paint chips for visualization of paint colors under various conditions. These types of real-world modeling applications can enable detection of potential design flaws, such as reflective buildings reflecting sunlight into the eyes of drivers during certain times of the year. Further, the three-dimensional visualization may allow for more rapid recognition of the aesthetics of the design by human beings than would more complex computer modeling.
Solid modeling programs can have virtual lights. One can light a model in the virtual environment while simultaneously lighting a real world model the same way. For example, one can model environmental conditions of the model and recreate them in the real world modeling environment outside the virtual environment. For example, one can model a house or other building and show how it would appear in any daylight environment. A hobbyist could also model lighting for a model train set (for instance based on pictures of an actual train) and translate that lighting into the illumination for the room wherein the model train exists. Therefore the model train may not only be a physical representation of an actual train, but may even appear as that train appeared at a particular time. A civil engineering project could also be assembled as a model and then a lighting system according to the principles of the invention could be used to simulate the lighting conditions over the period of the day. This simulation could be used to generate lighting conditions, shadows, color effects or other effects. This technique could also be used in Film/Theatrical modeling or could be used to generate special effects in film making. Such a system could also be used by a homeowner, for instance by selecting what they want their dwelling to look like from the outside and having lights be selected to produce that look. This is a possibility for safety when the owner is away. Alternatively, the system could work in reverse where the owner turns on the lights in their house and a computer provides the appearance of the house from various different directions and distances.
Although the above examples discuss modeling for architecture, one of skill in the art would understand that the effect of light on any device, object, or structure can be treated similarly.
Medical or other job simulation could also be performed. A lighting system according to the principles of the present invention may be used to simulate the lighting conditions during a medical procedure. This may involve creating an operating room setting or other environment, such as an auto accident at night, with specific lighting conditions. For example, highway lighting generally consists of high-pressure sodium lamps, which produce nearly monochromatic yellow light; as a result, objects and fluids may appear abnormally colored. Parking lots generally use metal halide lighting systems that produce a broad-spectrum light with spectral gaps. Any of these environments could be simulated using a system according to the principles of the invention. These simulators could be used to train emergency personnel to react in situations lit in different ways. They could also be used to simulate conditions under which any job would need to be performed. For instance, the light that will be experienced by an astronaut repairing an orbiting satellite can be simulated on earth in a simulation chamber.
Lights can also be used to simulate travel in otherwise inaccessible areas, such as the light that would be received traveling through space or viewing astronomical phenomena, or lights could be used as a three-dimensional projection of an otherwise unviewable object. For instance, a lighting system attached to a computing device could provide a three-dimensional view from the inside of a molecular model. Temporal functions or other mathematical concepts could also be visualized.
Referring to
Referring still to
In certain preferred embodiments, the lighting units 1400 are networked lighting systems where the lighting control signals are packaged into packets of addressed information. The addressed information may then be communicated to the lighting systems in the lighting network. Each of the lighting systems may then respond to the control signals that are addressed to the particular lighting system. This is an extremely useful arrangement for generating and coordinating lighting effects across several lighting systems. Embodiments of U.S. patent application Ser. No. 09/616,214, “Systems and Methods for Authoring Lighting Sequences,” describe systems and methods for generating system control signals and are hereby incorporated by reference herein.
A lighting system, or other system according to the principles of the present invention, may be associated with an addressable controller. The addressable controller may be arranged to “listen” to network information until it “hears” its address. Once the system's address is identified, the system may read and respond to the information in a data packet that is assigned to the address. For example, a lighting system may include an addressable controller. The addressable controller may also include an alterable address, and a user may set the address of the system. The lighting system may be connected to a network where network information is communicated. The network may be used to communicate information to many controlled systems, such as a plurality of lighting systems. In such an arrangement, each of the plurality of lighting systems may receive information pertaining to more than one lighting system. The information may be in the form of a bit stream where information for a first addressed lighting system is followed by information directed at a second addressed lighting system. An example of such a lighting system can be found in U.S. Pat. No. 6,016,038, which is hereby incorporated by reference herein.
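The listen-for-your-address behavior can be sketched as follows, assuming a simplified fixed-size record layout of one address byte followed by two payload bytes (this layout is an illustrative assumption, not the DMX wire format):

```python
def extract_my_data(stream, my_address, record_size=3):
    """Scan a flat stream of fixed-size (address, payload...) records
    and return the payload addressed to this controller, mirroring the
    'listen until you hear your address' behavior described above."""
    for i in range(0, len(stream), record_size):
        record = stream[i:i + record_size]
        if record and record[0] == my_address:
            return record[1:]  # the data this controller should act on
    return None  # nothing addressed to us in this frame
```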
In an embodiment, the lighting unit 100 is placed in a real world environment 1400. The real world environment 1400 could be a room. The lighting system could be arranged, for example, to light the walls, ceiling, floor or other sections or objects in a room, or particular surfaces 1407 of the room. The lighting system may include several addressable lighting units 100 with individual addresses. The illumination can be projected so as to be visible to a viewer in the room either directly or indirectly. That is, the light of a lighting unit 100 could shine so that the light is projected to the viewer without reflection, or could be reflected, refracted, absorbed and reemitted, or in any other manner indirectly presented to the viewer.
Referring to
Referring to
The light system manager 1650, mapping facility 1658, light system composer 1652 and light system engine 1654 may be provided through a combination of computer hardware, telecommunications hardware and computer software components. The different components may be provided on a single computer system or distributed among separate computer systems.
Referring to
Referring still to
Referring to
Thus, methods and systems provided herein include providing a light system engine for relaying control signals to a plurality of light systems, wherein the light system engine plays back shows. The light system engine 1654 may include a processor, a data facility, an operating system and a communication facility. The light system engine 1654 may be configured to communicate with a DALI or DMX lighting control facility. In embodiments, the light system engine communicates with a lighting control facility that operates with a serial communication protocol. In embodiments the lighting control facility is a power/data supply for a lighting unit 100.
In embodiments, the light system engine 1654 executes lighting shows downloaded from the light system composer 1652. In embodiments the shows are delivered as XML files from the light show composer 1652 to the light system engine 1654. In embodiments the shows are delivered to the light system engine over a network. In embodiments the shows are delivered over an Ethernet facility. In embodiments the shows are delivered over a wireless facility. In embodiments the shows are delivered over a Firewire facility. In embodiments shows are delivered over the Internet.
In embodiments lighting shows composed by the lighting show composer 1652 can be combined with other files from another computer system, such as one that includes an XML parser that parses an XML document output by the light show composer 1652 along with XML elements relevant to the other computer. In embodiments lighting shows are combined by adding additional elements to an XML file that contains a lighting show. In embodiments the other computer system comprises a browser and the user of the browser can edit the XML file using the browser to edit the lighting show generated by the lighting show composer. In embodiments the light system engine 1654 includes a server, wherein the server is capable of receiving data over the Internet. In embodiments the light system engine 1654 is capable of handling multiple zones of light systems, wherein each zone of light systems has a distinct mapping. In embodiments the multiple zones are synchronized using the internal clock of the light system engine 1654.
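As a rough illustration of an XML show document and of inspecting it programmatically, consider the sketch below; the element and attribute names are invented for the example and are not the composer's actual document type definition:

```python
import xml.etree.ElementTree as ET

# Hypothetical show document: a sequence of timed effects.
show_xml = """
<show name="demo">
  <effect type="crossfade" start="0" duration="5000"/>
  <effect type="sparkle" start="5000" duration="3000"/>
</show>
"""

def effect_types(xml_text):
    """Parse a show file and list its effect types in playback order.
    An external system combining files could add further elements to
    the same document before handing it to the engine's parser."""
    root = ET.fromstring(xml_text)
    return [e.get("type") for e in root.findall("effect")]
```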
The methods and systems included herein include methods and systems for providing a mapping facility 1658 of the light system manager 1650 for mapping locations of a plurality of light systems. In embodiments, the mapping system discovers lighting systems in an environment, using techniques described above. In embodiments, the mapping facility then maps light systems in a two-dimensional space, such as using a graphical user interface.
In embodiments of the invention, the light system engine 1654 comprises a personal computer with a Linux operating system. In embodiments the light system engine is associated with a bridge to a DMX or DALI system.
Referring to
Referring to
Various other geometrical configurations of lighting units are so widely used as to benefit from the storing of pre-authored coordinate transformations, shows and effects. For example, referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
The lighting systems may be selected from the group consisting of an architectural lighting system, an entertainment lighting system, a restaurant lighting system, a stage lighting system, a theatrical lighting system, a concert lighting system, an arena lighting system, a signage system, a building exterior lighting system, a landscape lighting system, a pool lighting system, a spa lighting system, a transportation lighting system, a marine lighting system, a military lighting system, a stadium lighting system, a motion picture lighting system, a photography lighting system, a medical lighting system, a residential lighting system, a studio lighting system, and a television lighting system.
Using a mapping facility, light systems can optionally be mapped into separate zones, such as DMX zones. The zones can be separate DMX zones, including zones located in different rooms of a building. The zones can be located in the same location within an environment. In embodiments the environment can be a stage lighting environment.
Thus, in various embodiments, the mapping facility allows a user to provide a grouping facility for grouping light systems, wherein grouped light systems respond as a group to control signals. In embodiments the grouping facility comprises a directed graph. In embodiments, the grouping facility comprises a drag and drop user interface. In embodiments, the grouping facility comprises a dragging line interface. The grouping facility can permit grouping of any selected geometry, such as a two-dimensional representation of a three-dimensional space. In embodiments, the grouping facility can permit grouping as a two-dimensional representation that is mapped to light systems in a three-dimensional space. In embodiments, the grouping facility groups lights into groups of a predetermined conventional configuration, such as a rectangular two-dimensional array, a square, a curvilinear configuration, a line, an oval, an oval-shaped array, a circle, a circular array, a triangle, a triangular array, a serial configuration, a helix, or a double helix.
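Grouped light systems responding as one to a control signal can be sketched as a simple fan-out of a group-level command to every member address; the addressing scheme here is an illustrative assumption:

```python
class LightGroup:
    """A group of light system addresses that respond as one unit to
    control signals, as described above."""
    def __init__(self, addresses):
        self.addresses = list(addresses)

    def control_packets(self, color):
        # fan one group-level command out to every member address
        return [(addr, color) for addr in self.addresses]
```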
Referring to
Referring to
Referring to
Referring to the schematic diagram 4350 of
Referring to
Referring still to the interface 4050 of
Referring still to
The user interface 4050 of
Referring to
Referring to
Referring to
Referring to
Referring to
Referring still to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
As seen in connection with the various embodiments of the user interface 4050 and related figures, methods and systems are included herein for providing a light system composer for allowing a user to author a lighting show using a graphical user interface. The light system composer includes an effect authoring system for allowing a user to generate a graphical representation of a lighting effect. In embodiments the user can set parameters for a plurality of predefined types of lighting effects, create user-defined effects, link effects to other effects, set timing parameters for effects, generate meta effects, and generate shows comprised of more than one meta effect, including shows that link meta effects.
In embodiments, a user may assign an effect to a group of light systems. Many effects can be generated, such as a color chasing rainbow, a cross fade effect, a custom rainbow, a fixed color effect, an animation effect, a fractal effect, a random color effect, a sparkle effect, a streak effect, an X burst effect, an XY spiral effect, and a sweep effect.
In embodiments an effect can be an animation effect. In embodiments the animation effect corresponds to an animation generated by an animation facility. In embodiments the effect is loaded from an animation file. The animation facility can be a flash facility, a multimedia facility, a graphics generator, or a three-dimensional animation facility.
In embodiments the lighting show composer facilitates the creation of meta effects that comprise a plurality of linked effects. In embodiments the lighting show composer generates an XML file containing a lighting show according to a document type definition for an XML parser for a light engine. In embodiments the lighting show composer includes stored effects that are designed to run on a predetermined configuration of lighting systems. In embodiments the user can apply a stored effect to a configuration of lighting systems. In embodiments the light system composer includes a graphical simulation of a lighting effect on a lighting configuration. In embodiments the simulation reflects a parameter set by a user for an effect. In embodiments the light show composer allows synchronization of effects between different groups of lighting systems that are grouped using the grouping facility. In embodiments the lighting show composer includes a wizard for adding a predetermined configuration of light systems to a group and for generating effects that are suitable for the predetermined configuration. In embodiments the configuration is a rectangular array, a string, or another predetermined configuration.
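As a sketch of the kind of XML export described above, a show containing linked effects might be serialized as follows; the element and attribute names here are hypothetical illustrations, not the actual document type definition used by the light engine:

```python
# Sketch: serialize a lighting show as XML, in the spirit of the show
# composer's XML export. Element and attribute names are illustrative;
# a real show would follow the light engine's document type definition.
import xml.etree.ElementTree as ET

def build_show_xml(name, effects):
    """Build a <show> element containing one <effect> per entry."""
    show = ET.Element("show", name=name)
    for effect in effects:
        ET.SubElement(show, "effect",
                      type=effect["type"],
                      group=effect["group"],
                      duration=str(effect["duration"]))
    return ET.tostring(show, encoding="unicode")

xml_text = build_show_xml("demo", [
    {"type": "ColorChasingRainbow", "group": "zone1", "duration": 30},
    {"type": "CrossFade", "group": "zone2", "duration": 10},
])
```

Because the show is plain markup, it can be validated, versioned, and downloaded to a light system engine like any other markup file.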
Referring to
In embodiments, other user interfaces can trigger shows stored on a light system engine 1654, such as a knob, a dial, a button, a touch screen, a serial keypad, a slide mechanism, a switch, a sliding switch, a switch/slide combination, a sensor, a decibel meter, an inclinometer, a thermometer, an anemometer, a barometer, or any other input capable of providing a signal to the light system engine 1654. In embodiments the user interface is the serial keypad 6350, wherein initiating a button on the keypad 6350 initiates a show in at least one zone of a lighting system governed by a light system engine connected to the keypad.
Referring to
Referring to
Many different forms of playback control can be provided. Since the light shows composed by the light show composer 1652 can be exported as XML files, any form of playback or download mechanism suitable for other markup language files can be used, analogous to playback facilities used for MP3 files and the like.
Referring to
Referring to
Certain embodiments of the present invention are directed to methods and systems for controlling a lighting network in response to an audio input. This can be accomplished in any of numerous ways, as the present invention is not limited to any particular implementation technique. In accordance with one illustrative embodiment, an audio input is digitally processed to analyze the audio input, and at least one aspect of a lighting system is controlled in response to a characteristic of the audio input. In another embodiment of the present invention, timing information is also considered so that the control signals sent to the lighting network for a particular audio input can vary over time, to avoid repetitiveness.
The assignee of the present application has previously developed other systems on which users can author lighting programs including one or more lighting sequences, as well as devices for playing back a lighting program to control a lighting system. Many of the features of those systems can be incorporated in the present invention to enable the control of a lighting system in response to an audio input. Therefore, a description will initially be provided of authoring software and playback devices for lighting programs to control a lighting system, before turning to the specific aspects of the present invention relating to performing such control in response to an audio input.
In one embodiment of the invention, lighting effects can have priorities or cues attached to them, which allows a particular lighting unit to change effect on the receipt of a cue. This cue could be any type of cue, received externally or internally to the system, and includes, but is not limited to, a user-triggered cue such as a manual switch or bump button; a user-defined cue such as a certain keystroke combination or a timing key allowing a user to tap or pace for a certain effect; a cue generated by the system such as an internal clocking mechanism, an internal memory cue, or a software-based cue; a mechanical cue generated from an analog or digital device attached to the system such as a clock, external light or motion sensor, music synchronization device, sound level detection device, or a manual device such as a switch; a cue received over a transmission medium such as an electrical wire or cable, RF signal or IR signal; a cue that relates to a characteristic of an audio signal; or a cue received from a lighting unit attached to the system. The priority can allow the system to choose a default priority effect that is the effect used by the lighting unit unless a particular cue is received, at which point the system instructs the use of a different effect. This change of effect could be temporary, occurring only while the cue occurs or for a specified period; could be permanent, in that it does not allow for further receipt of other effects or cues; or could be priority based, waiting for a new cue to return to the original effect or select a new one. Alternatively, the system could select effects based on the state of a cue and the importance of a desired effect. For instance, if a sound sensor sensed sudden noise, it could trigger a high priority alarm lighting effect overriding all the effects otherwise present or awaiting execution.
The priority could also be state dependent, where a cue selects an alternative effect or is ignored depending on the current state of the system. Again, it should be appreciated that the embodiments of the present invention that employ priorities or cues for various lighting effects are not limited to the particular types of cues and priorities discussed above, as numerous other types are possible.
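One minimal way to realize such priority-based selection can be sketched as follows; the effect names and priority values are illustrative assumptions. The currently active effect is simply the highest-priority pending effect, so a cue such as a sound-sensor alarm overrides the default until it is cleared:

```python
# Sketch: priority-based effect selection. The active effect is the
# highest-priority pending effect; a cue pushes a new effect that may
# override the default. Effect names and priorities are illustrative.

class EffectSelector:
    def __init__(self, default_effect):
        # (priority, effect) pairs; higher priority wins.
        self.pending = [(0, default_effect)]

    def on_cue(self, effect, priority):
        """A cue (e.g. a sound-sensor alarm) proposes an effect."""
        self.pending.append((priority, effect))

    def clear_cue(self, effect):
        """Return to the prior effect when a temporary cue ends."""
        self.pending = [(p, e) for p, e in self.pending if e != effect]

    def active(self):
        return max(self.pending, key=lambda pe: pe[0])[1]

selector = EffectSelector("color_wash")
selector.on_cue("alarm_flash", priority=10)  # sudden noise detected
high = selector.active()                     # alarm overrides default
selector.clear_cue("alarm_flash")
restored = selector.active()                 # default effect resumes
```

State-dependent behavior, as described above, could be added by having `on_cue` consult the current state before accepting or ignoring the cue.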
In event-driven embodiments, such as those using external inputs and those using outputs of other effects as inputs, a menu may be provided to define inputs and the consequences thereof. For example, a palette of predetermined inputs may be provided to a user. Each input, such as a specified transducer or the output of another effect, may be selected and placed within an authored lighting sequence as a trigger for a new effect, or as a trigger to a variation in an existing effect. Known inputs may include, for example, thermistors, clocks, keyboards, numeric keypads, Musical Instrument Digital Interface (“MIDI”) inputs, DMX control signals, TTL or CMOS logical signals, signals from music players, such as the iPod from Apple Computer or MP3 players, other visual or audio signals, or any other protocol, standard, or other signaling or control technique, whether analog, digital, manual, or any other form. The palette may also include a custom input, represented as, for example, an icon in a palette, or an option in a dropdown menu. The custom input may allow a user to define the characteristics of an input signal (e.g., its voltage, current, duration, and/or form (i.e., sinusoid, pulse, step, modulation)) that will operate as a control or trigger in a sequence.
For instance, a theatrical lighting sequence may include programmed lighting sequences and special effects in the order in which they occur, but requiring input at specified points before the next sequence or portion thereof is executed. In this way, scene changes may take place not automatically as a function of timing alone, but at the cue of a director, producer, stage hand, or other participant. Similarly, effects which need to be timed with an action on the stage, such as brightening when an actor lights a candle or flips a switch, dramatic flashes of lightning, etc., can be indicated precisely by a director, producer, stage hand, or other participant—even an actor—thereby reducing the difficulty and risk of relying on preprogrammed timing alone.
As mentioned above, one embodiment of the present invention is directed to a method and apparatus for controlling a lighting system in response to an audio input.
The audio input can be provided in any of numerous ways. In the embodiment shown in
The audio data 6805 may be stored in any format suitable for the storage of digital data. One popular format is the MPEG Layer III data compression algorithm, which is often used for transmitting files over the Internet, and is widely known as MP3. The files stored in the MP3 format are typically processed by an MP3 decoder for playback. It should be appreciated that MP3 is merely one of numerous types of formats suitable for the storage of digital data, with other examples including MIDI, MOD, CDA, WMA, AS and WAV. It should be appreciated that these are merely examples of suitable formats, and that there are other standards and formats that can be used, including formats that do not adhere to any particular standard. In addition, while the MP3 format compresses the data, it should be appreciated that other formats may not. It should further be appreciated that the present invention is not limited to use with data stored in any particular format.
Rather than originating from a computer readable medium accessible to the computer system 6809, the audio input may alternatively be provided as an external audio signal 6803 from a source such as a microphone, stereo system, musical instrument or any other source capable of generating an audio signal. The audio signal 6803 may be a digital signal, input to the computer system 6809 via a digital interface such as a USB, serial or parallel port or any other suitable interface, or may be an analog signal, input to the computer system 6809 via an audio jack or any other suitable interface. In accordance with one embodiment of the present invention, when the audio signal 6803 is provided in analog form, it can be converted (via an analog-to-digital converter not shown) within the computer system 6809, so that the audio signal can be processed digitally, which provides a number of advantages as discussed below. However, it should be appreciated that not all aspects of the present invention are limited in this respect, such that other embodiments of the present invention can process the audio signal in analog form.
In the embodiment shown in
It should be appreciated that many audio signal formats comprise two or more independently encoded channels, and that many audio file formats maintain the independence of the channel data. Examples of such multi-channel audio signals include stereo signals, AC-1 (Audio Coding-1), AC-2 and AC-3 (Dolby Digital). In accordance with one embodiment of the present invention, each channel for a single audio signal is analyzed separately by the audio decoder 6811, such that separate information is generated by analyzing the characteristics of the different channels. For example, using the example described above, wherein the information concerning an audio signal includes frequency domain information and time domain information, in one embodiment of the present invention the audio decoder 6811 generates separate frequency domain information and time domain information for each separate channel for a single input audio signal (e.g., audio data 6805 or external audio signal 6803).
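The per-channel analysis can be sketched in simplified form: for each channel the decoder might report a time-domain level (here a root-mean-square amplitude) and a crude frequency indicator (here a zero-crossing rate). This is a simplified stand-in for the decoder's output, which would typically include full frequency-band information; the signal values below are synthetic:

```python
# Sketch: analyze each channel of a multi-channel audio signal
# separately, producing simple time-domain (RMS level) and frequency
# proxy (zero-crossing rate) information per channel. A real audio
# decoder would compute richer data, e.g. energy per frequency band.
import math

def analyze_channel(samples):
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    return {"rms": rms, "zero_crossing_rate": crossings / len(samples)}

def analyze(channels):
    """Analyze each channel independently, as the decoder does."""
    return [analyze_channel(ch) for ch in channels]

# A synthetic stereo signal: loud low-frequency left channel,
# quiet higher-frequency right channel.
left = [math.sin(2 * math.pi * 2 * t / 100) for t in range(100)]
right = [0.1 * math.sin(2 * math.pi * 20 * t / 100) for t in range(100)]
info = analyze([left, right])
```

The two channels yield distinct information, which the mapping facility can then combine or treat independently when generating control signals.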
The audio decoder 6811 can be implemented in any of numerous ways, as the present invention is not limited to any particular implementation technique. For example, the audio decoder 6811 can be implemented in dedicated hardware, or can be implemented in software executed on a processor (not shown) within the computer system 6809. When implemented in software, the audio decoder 6811 can be provided as an executable program written in any suitable computer programming language (e.g., Fortran, C, Java, C++, etc.). The software for implementing the audio decoder 6811 can be stored on any computer readable medium accessible to the computer system 6809, including the computer readable medium 6807 that stores the audio data 6805, or any other computer readable media. The software for implementing the audio decoder 6811 can, for example, be any one of a number of commercially available software programs that perform the above-described functions. Examples of such commercially available software programs include MP3 players such as Winamp™, available from Nullsoft, Inc. Such commercially available MP3 players include application programming interfaces (APIs) that enable third party add-on plug-in software components to interface with the MP3 player, and to take advantage of the functionality provided thereby, including the above-described information that the audio decoder 6811 provides concerning the characteristics of an audio input. Thus, as discussed further below, one embodiment of the present invention is directed to software, for execution on a computer system 6809, that acts as a plug-in to a commercially available MP3 player to provide the mapping functions described below to control a lighting network in response to an input audio signal (e.g., stored audio data 6805 or an external audio signal 6803).
The mapping facility 6815 performs a function that is similar in many respects to the playback function performed by other components described above, such as the processing facilities and data storage facilities described elsewhere herein. In this respect, the mapping facility 6815 can be provided with a lighting program (e.g., stored in a mapping table 6815t) that can include one or more variables to receive input values at execution time. As shown in
In accordance with one illustrative embodiment of the present invention, the mapping facility 6815 can execute lighting programs that each include only a single entry defining the manner in which control signals, to be passed to the lighting network, will be generated. Each such lighting program for the mapping facility 6815 may be programmed using a number of if/then statements or Boolean logic to interpret the numerous varied permutations of inputs from the audio decoder 6811 relating to characteristics of the audio input signal, and may generate control signals to the lighting network accordingly. Even with such static lighting programs, the control signals transmitted to the lighting network will result in a changing light show as the input audio signal is played, as the characteristics of the audio signal will change over time, resulting in changing inputs to the mapping facility 6815 and, consequently, changing control signals sent to the lighting network. Alternatively, the mapping table 6815t can include lighting programs that include a plurality of lighting sequences, in much the same manner as the embodiments described herein. In accordance with these embodiments of the present invention, the mapping facility 6815 will step through various lighting sequences as the input audio signal is played back, which can result in a more varied light show, as not only will the inputs from the audio decoder 6811 change as the input audio signal is played back, but the mapping function executed by the mapping facility 6815 can also be programmed to change over time.
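A single-entry mapping of the kind described can be sketched as a few conditional rules that turn decoder outputs into lighting control values; the thresholds and color choices below are illustrative assumptions, not values from the disclosure:

```python
# Sketch: a static if/then mapping from audio characteristics to a
# lighting control value, in the spirit of the single-entry mapping
# programs described above. Thresholds and colors are illustrative.

def map_audio_to_light(bass_energy, treble_energy, beat_detected):
    """Return an (r, g, b) control value for the lighting network."""
    if beat_detected:
        return (255, 255, 255)       # flash white on each beat
    if bass_energy > 0.6:
        return (255, 0, 0)           # strong bass: saturated red
    if treble_energy > 0.6:
        return (0, 0, 255)           # strong treble: saturated blue
    # Otherwise scale a dim amber with overall activity.
    level = int(255 * max(bass_energy, treble_energy))
    return (level, level // 2, 0)

on_beat = map_audio_to_light(0.9, 0.1, beat_detected=True)
bass_heavy = map_audio_to_light(0.8, 0.2, beat_detected=False)
quiet = map_audio_to_light(0.2, 0.1, beat_detected=False)
```

Even though the rules themselves are static, the output changes continuously as the audio characteristics change, producing a varying light show.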
It should be appreciated that the embodiment of the present invention shown in
In the embodiment shown in
In the embodiment of
In accordance with one illustrative embodiment of the present invention, the external interface 6845 is a graphical user interface (GUI) that can be displayed on a display of the computer system 6809 to facilitate a user in selecting a particular mapping function to be provided by the mapping table 6815t. This aspect of the present invention can be implemented in any of numerous ways, and is not limited to any particular implementation technique. As an example, a graphical user interface can be provided that lists various types of mapping functions that are considered to be particularly suitable for particular music types. Thus, prior to playing a particular song as the audio input signal, a user can select a mapping function (e.g., from the mapping table 6815t) that fits the style of music of the song to be played. In this manner, the user can customize the lighting show generated based upon the type of music to be played. Of course, it should be appreciated that this is simply one example of the manner in which a graphical user interface can be used, as numerous other implementations are possible.
In another embodiment of the present invention, the particular mapping function employed can be selected based upon information provided with the audio signal that provides an indication of the type of music included therein. Specifically, some pieces of music can include a tag or other information in the music, or associated therewith, that identifies the type of music. In accordance with one embodiment of the present invention, such information can be used to select a mapping function that fits the style of music in much the same manner as described above.
As should be appreciated from the foregoing, changes in the mapping performed by the mapping facility 6815 can be accomplished in numerous ways by including a variable in a single mapping function that can result in changes of the mapping output or by switching between different mapping functions in the mapping table 6815t. The changes in the mapping performed by the mapping facility 6815 can be accomplished in response to any of numerous stimuli, including input provided from an external input (e.g., from a user selecting a different mapping function), in response to timing information from the timer 6821, in response to some characteristic of an input audio signal (e.g., provided to the mapping facility 6815 by the audio decoder 6811), in response to a detection by the audio decoder that a particular audio signal (e.g., a song) has terminated and a new one is beginning, etc. Thus, there are numerous ways of continually updating the mapping performed by the mapping facility 6815. Of course, it should be appreciated that the present invention is not limited to using any or all of these techniques, as these are described herein merely for illustrative purposes.
In embodiments, a cue table or transient memory may be provided in connection with the computer system 6809. A cue table can be provided between the external interface 6845 and the mapping facility 6815, and/or between the audio decoder 6811 and the mapping facility 6815 to assist in analyzing the inputs provided by the external interface 6845 and/or the characteristics of the input audio signal provided by the audio decoder 6811. Of course, it should be appreciated that these features are optional, and need not be employed in all embodiments of the present invention.
As mentioned above, it should be appreciated that the manner in which the characteristics of the input audio signal are analyzed by the mapping facility 6815 to impact the control signals sent to the lighting network to control the lighting show can be performed in any of numerous ways, as the present invention is not limited to any particular type of analysis. For example, the mapping facility 6815 can look for particular activity levels within a particular frequency band, can detect a beat of the music based upon pulses within particular frequency bands or overall activity of the input signal, can look for an interaction between two or more different frequency bands, can analyze intensity levels characteristic of a volume at which the audio signal is being played, etc. One variable for consideration by the mapping facility 6815 is the sensitivity of the system at which differences in a characteristic of the audio signal will be recognized, resulting in a change in the control signals sent to the lighting network, and thereby a change in the lighting show. As indicated above, in one embodiment of the present invention, the external interface 6845 can also enable external inputs (e.g., inputs from a user) to change any of numerous variables within the mapping function to impact the lighting show produced.
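Beat detection from pulses in a frequency band, mentioned above, can be sketched as an energy threshold against a running average; the window length and sensitivity factor below are illustrative choices, and the sensitivity parameter corresponds to the adjustable sensitivity discussed in the preceding paragraph:

```python
# Sketch: detect beats as moments when a band's instantaneous energy
# exceeds the recent average energy by a sensitivity factor. The window
# length and the factor of 1.5 are illustrative choices.
from collections import deque

def detect_beats(band_energies, window=8, sensitivity=1.5):
    history = deque(maxlen=window)
    beats = []
    for i, energy in enumerate(band_energies):
        if len(history) == window:
            average = sum(history) / window
            if energy > sensitivity * average:
                beats.append(i)
        history.append(energy)
    return beats

# Synthetic band energies: steady background with a pulse every
# eighth frame, as might occur in a bass band on each beat.
energies = [1.0] * 40
for i in range(8, 40, 8):
    energies[i] = 5.0
beats = detect_beats(energies)
```

Raising the sensitivity factor makes the system respond only to stronger pulses, changing how readily the lighting show reacts to the music.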
It should be appreciated that the mapping facility 6815 can be implemented in any of numerous ways, including with dedicated hardware, or with software executed on a processor (not shown) within the computer system 6809. When implemented in software, the software can be stored on any computer readable medium accessible to the computer system 6809, including a computer readable medium 6807 that stores the audio data 6805. The software that implements the mapping facility 6815 can be implemented as an executable program written in any number of computer programming languages, such as those discussed above. The software can be implemented on a same processor that also executes software to implement the audio decoder 6811, or the computer system 6809 can be provided with separate processors to perform these functions.
As discussed above, one embodiment of the present invention is directed to the provision of a software plug-in that is compatible with commercially available MP3 players to enable the control of a lighting network in response to an audio signal being played by the MP3 player. Thus, one embodiment of the present invention is directed to a computer readable medium encoded with a program that, when executed by a processor on a computer system such as 6809, interacts with an audio decoder 6811 of an MP3 player executing on the computer system 6809, and implements the functions of the mapping facility 6815 to generate the control signals necessary to control a lighting network as described above. Of course, it should be understood that this is simply one illustrative embodiment of the present invention, as numerous other implementations are possible.
As with the other embodiments of the invention described above, the lighting units 100 of the lighting network may be any type of light source, including incandescent, LED, fluorescent, halogen, laser, etc. Each lighting unit may be associated with a predetermined assigned address as discussed above. The computer system 6809 may send control signals to the lighting network in any of numerous ways, as the present invention is not limited to any particular technique. In the embodiment shown in
It should be appreciated that the information stored in the mapping table 6815t and output from the mapping facility 6815 may not be in a format capable of directly controlling a lighting network, such that in one embodiment of the present invention, a format conversion is performed. As discussed above, examples of formats for controlling a plurality of lighting units include data streams and data formats such as DMX, RS-485, RS-232, etc. Any format conversion can be performed by the mapping facility 6815, or a separate converter can be employed. The converter can be implemented in any of numerous ways, including in dedicated hardware or in software executing on a processor within the computer system 6809.
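The format conversion can be sketched as packing per-unit color values into a flat byte frame in the style of a DMX universe; the channel assignment used here (each unit occupying three consecutive channels starting at its assigned address) is an illustrative assumption, not a full rendering of the DMX protocol:

```python
# Sketch: convert mapped (r, g, b) values per lighting unit into a flat
# 512-channel byte frame in the style of a DMX universe, assuming each
# unit occupies three consecutive channels starting at its assigned
# address. The addressing scheme is an illustrative simplification.

DMX_CHANNELS = 512

def to_dmx_frame(unit_colors):
    """unit_colors maps a unit's start channel (1-based) to (r, g, b)."""
    frame = bytearray(DMX_CHANNELS)
    for start, (r, g, b) in unit_colors.items():
        frame[start - 1:start + 2] = bytes((r, g, b))
    return bytes(frame)

# Two units: one addressed at channel 1, one at channel 10.
frame = to_dmx_frame({1: (255, 0, 0), 10: (0, 128, 255)})
```

A separate serial transmitter would then send such frames to the lighting network at the required refresh rate.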
In the embodiment of the invention shown in
It should be appreciated that the external audio signal 6803 can be provided in either digital form, or in analog form. When provided in analog form, the external audio signal may pass through an analog-digital converter (not shown) within the computer system 6809 prior to being passed to the audio decoder 6811. This conversion can be accomplished in any of numerous ways, as the present invention is not limited to any particular implementation. For example, the external audio signal can be provided to a sound card within the computer system 6809, which can perform the analog-to-digital conversion.
It should be appreciated that in the embodiment of the present invention wherein the same computer system 6809 that generates the control signals for the lighting network also drives speakers to generate an audible sound for the audio signal, some synchronization may be performed to ensure that the lighting show produced on the lighting network is synchronized with the audible playing of the audio signal. This can be accomplished within the computer system 6809 in any of numerous ways. For example, when the audio player 6822 and audio decoder 6811 are provided as part of a commercially available MP3 player, the MP3 player will automatically perform this synchronization.
As should be appreciated from the foregoing, in one embodiment of the present invention, the analyzing of an audio input signal is performed essentially simultaneously with a playing of the audio signal to generate an audible sound. However, the present invention is not limited in this respect, as in another embodiment of the present invention, the analysis of the audio input signal is performed prior to playing the audio signal to generate an audible sound. This can provide for some flexibility in performing the mapping of the audio input signal to control signals for the lighting network, as the mapping function can consider not only the characteristics of the audible signal that corresponds with the instant in time for the control signals being generated, but can also look ahead in the audio signal to anticipate changes that will occur, and thereby institute lighting effects in advance of a change in the audible playback of the audio signal. This can be performed in any of numerous ways. For example, the audio input signal can be analyzed prior to it being played to generate an audible output, and the results of that analysis (e.g., from the audio decoder 6811) can be stored in memory (e.g., in a transient memory) or in the mapping table 6815t, for future reference by the mapping facility 6815 when the audio signal is audibly played. Thus, the function performed by the mapping facility 6815 can look not only to characteristics of the music that correspond to the point in time with the audio signal being played, but can also look ahead (or alternatively behind) in the audio signal to anticipate changes therein. Alternatively, rather than storing the outputs that are characteristic of the audio signal, another option is to perform the mapping at the time when the audio input signal is first analyzed, and store the entire control signal sequence in memory (e.g., in the mapping table 6815t). 
Thereafter, when the audio signal is audibly played, the mapping facility 6815 need not do any analysis in real time, but rather, can simply read out the previously defined control signals, which for example can be stored at a particular sample rate to then be played back when the audio signal is played to generate an audible signal.
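The precomputation described above can be sketched as a two-pass process: an analysis pass maps every frame of the audio signal to a control value, and the playback pass simply reads the stored values out by time. The frame rate and the mapping rule below are placeholder assumptions:

```python
# Sketch: precompute the whole control signal sequence before audible
# playback (analysis pass), then read values out in real time with no
# further analysis (playback pass). Frame rate and mapping rule are
# illustrative assumptions.

def precompute_sequence(frame_energies, mapping):
    """Analysis pass: map every frame's energy to a control value."""
    return [mapping(e) for e in frame_energies]

def control_at(sequence, seconds, frames_per_second=10):
    """Playback pass: look up the stored control value for a time."""
    index = min(int(seconds * frames_per_second), len(sequence) - 1)
    return sequence[index]

brightness = lambda energy: min(255, int(energy * 255))
sequence = precompute_sequence([0.0, 0.5, 1.0, 0.25], brightness)
at_start = control_at(sequence, 0.0)
mid = control_at(sequence, 0.25)  # frame 2 at 10 frames per second
```

Because the analysis pass sees the entire signal, the mapping could equally look ahead of the playback point to anticipate upcoming changes, as described above.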
While the embodiment of the present invention directed to performing an analysis of the audio signal prior to playing it back provides the advantages described above, it should be appreciated that this is not a requirement of all embodiments of the present invention.
It should be appreciated that the lighting programs (e.g., entries in the mapping table 6815t) for the embodiment shown in
In accordance with an alternate embodiment of the invention, Applicants have appreciated that the device used to control a set of lighting units 100 need not have all of the functionality and capability of a computer system; for example, it need not include a video monitor, keyboard, or other robust user interface. Furthermore, Applicants have appreciated that in many instances, it is desirable to provide a relatively small and inexpensive device to perform the lighting control function in response to an audio input, so that the device can be portable.
In view of the foregoing, one embodiment of the present invention is directed to a lighting control device that includes all of the functionality described above in connection with
An even further simplified embodiment of the present invention is illustrated in
In the embodiment shown in
It should be appreciated that the lighting control device 7030 may receive the external audio signal using any suitable interface, such as the serial port, USB port, parallel port, IR receiver, a standard stereo audio jack, or any other suitable interface.
The components on the lighting control device 7030 can be powered in any of numerous ways, including through the provision of a power facility as described herein and in the documents incorporated herein by reference.
The lighting control device 7030 may begin processing of the external audio signal 6803 and/or initiate the sending of control signals to the lighting network to initiate a lighting show either in response to a signal received at the external input 7046, or immediately upon receipt of the external audio signal 6803. Alternatively, the lighting control device 7030 may initiate a lighting show at a specified time, or upon any suitable condition. The lighting control device 7030 may continue to send control information to the lighting network until it no longer receives any external audio signal 6803, until a signal is received at the external input 7046, until the occurrence of a specified condition, until a particular point in time, or any other suitable event. In one embodiment of the present invention, the lighting control device 7030 includes a storage device to store the mapping table 6815t. The storage device can be a memory unit, database, or other suitable module (e.g., a removable Flash memory) for storing one or more lighting programs in the mapping table 6815t. In accordance with one embodiment of the present invention, the storage device is formed as a non-volatile memory device, such that once information is stored thereon, the information is maintained, even when no power is provided to the lighting control device 7030.
It should be appreciated that any single component or collection of multiple components of the above-described embodiments that perform the functions described above can be generically considered as one or more controllers that control the above-discussed functions. The one or more controllers can be implemented in numerous ways, such as with dedicated hardware, or using a processor that is programmed to perform the functions recited above. In this respect, it should be appreciated that one implementation of the present invention comprises at least one computer readable medium (e.g., a computer memory, a floppy disk, a compact disk, a tape, etc.) encoded with a computer program that, when executed on a processor, performs the above-discussed functions of the present invention. The computer readable medium can be transportable such that the program stored thereon can be loaded onto any device having a processor to implement the aspects of the present invention discussed above. In addition, it should be appreciated that the reference to a computer program that, when executed, performs the above-discussed functions is not limited to an application program, but rather is used herein in the generic sense to reference any type of computer code (e.g., software or microcode) that can be employed to program a processor to implement the above-discussed aspects of the present invention.
In embodiments of the present disclosure, lighting technology can be coupled with media. Media forms include, without limitation, audio, music, movies, television, video, video games and all forms of text display, audio sources, and visual sources. Media is a means to communicate information, tell a story, or provide a pleasing effect or experience. Most media has migrated towards a digital form, enabling new forms of interaction and control.
Referring to
Newer technology, such as CD players with interactive controls, can be directed on the fly, much like a jazz performance, where improvisation plays as much a role as the underlying composition. Very often the DJ is editing and selecting in real-time as the music and audio unfold. There are a variety of products on the market that cater to the DJ for control of recorded media, including turntables for LPs and CDs.
Control capabilities now extend to real-time control of videos through similar control systems and interfaces. For example, there now exists a digital audio and video DVD turntable that enables the live control of audio and video from a DVD or other media to create effects and experiences that are similarly directed, but in the video realm. Projected and displayed images on monitors, displays, screens and other forms of image display can be controlled directly by manipulating the playback of imagery and video. An entertainment control system 7118 can include various control components, such as an audio/visual controller 7100, described in more detail below.
One aspect of the invention includes a coupling between one or more A/V devices and one or more lighting devices to be interactively controlled and directed by a person. Another aspect of the invention involves the generation and control of lighting effects.
In the entertainment control system 7118 of the embodiment in
As shown in
In an embodiment the functions of many of the elements of
In an embodiment, the lighting deck is a stand-alone device whose size, shape and interface are similar to that of the audio/visual controller 7100 to facilitate use, setup and transportation. In addition to a specialized device, a personal computer with software may provide much of the same functionality for the user. The interface can take the form of a tablet PC, a PDA, a personal computer, a laptop, or housing with a display and switches. In embodiments the lighting control interface 7112 may include a mapping facility, such as described in connection with
In an embodiment the memory 7102 can also be used for the recording of a performance. The interactive inputs can be recorded in real-time as well, so that a particular performance can be stored and replayed. These media shows can be self-contained shows that are connected to a light, sound and video system for playback. In another embodiment, media shows can be sold, traded or given away to consumers to be part of their music, audio and video collection and played back on consumers' audio/visual systems to create a concert or club venue in the home.
In an embodiment, the lighting control interface 7112 allows for the storage and downloading of new functions and new parameter settings. A modular architecture allows for user modification and implementation of such effects, or for downloading via interfaces such as USB or FireWire, over standard media (disk, flash memory, etc.), network connections, or other wireless capability (e.g., IR, 802.11, etc.).
Referring to
In an embodiment, an effect or a sequence of effects is selected, and various parameters of the effect can be further selected to provide control of color(s) and tempo within the effect 7402. For example, as shown in
Other audio analysis tools can be incorporated to provide many types of information as shown in
In one embodiment, the information can be at a higher level. This can take the form of meta information, such as genre, type, or general characteristics such as ‘rock music’, album name, playlist name, Gracenote information, or artist name; mid-level information such as beats, effects settings, etc.; or detailed spectral information. Additionally, in this embodiment a light track that is associated with a selection can be streamed directly out as well. In an embodiment, the light track can be at a fixture level or, preferably, expressed as a sequence of effects that is independent of installation geometry.
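The mid-level audio information mentioned above can be illustrated with two simple, self-contained features: RMS energy as a loudness proxy and zero-crossing rate as a rough brightness proxy. These stand in for the richer beat, spectral, and meta information described in the text; the function name and feature choice are assumptions for illustration only.

```python
import math

def audio_features(samples):
    """Extract simple mid-level features from a list of PCM samples:
    RMS energy (loudness proxy) and zero-crossing rate (brightness proxy)."""
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    # Count sign changes between adjacent samples.
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    zcr = crossings / (n - 1)
    return {"rms": rms, "zcr": zcr}

# A full-scale alternating waveform crosses zero at every step.
wave = [1.0, -1.0] * 8
print(audio_features(wave))
```

Either feature could then be fed to an effect parameter, e.g. driving brightness from RMS and hue from zero-crossing rate.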
The A/V controller 7100 can be one or more of the following: a specialized device for audio/video control, such as the Pioneer DVJ-X1; a computer; a controller; an interface device; and/or a player, such as a DVD or CD player with controls beyond simple playback. In an embodiment, the audio/visual controller 7100 could have recording capability to record operator input sequences to provide ‘live’ playback at a later time. Such sequences would be stored in memory 7102, or on writable media such as recordable DVDs, CD-RWs, CD-Rs, etc. This capability allows for live and interactive control of video playback to provide active and changing sequences during a performance. In an embodiment, the unit can also incorporate built-in audio and video effects such as stuttering playback, modulating the output, false coloring, warping of the image and more.
In certain embodiments, the lighting control interface 7112 may be a conventional lighting console, such as available from one of several manufacturers, such as High End Systems, ETC, Strand, and others. The input to such a lighting console would allow a high speed connection for the purpose of controlling lighting effects that are directed by the user of the audio/visual controller 7100. In general, the current generation of such lighting controllers is cue-based and would not easily provide for high bandwidth input for the purposes of controlling a lighting network, but such consoles could have improved input and control features to provide such capabilities.
In an embodiment, the controller 7302, as shown, can be a computer-based controller or a stand-alone embedded system. It can be a laptop, tablet PC, PDA, or other form of computer with a user interface.
In an embodiment, in addition to the music input, video parameters can also be used to effect lighting control in real-time. Colors, frame or clip rate (not just the standard frame or interlace rate, but the edited clip rate), spatial frequencies, movements, changes, etc. can then be used to modulate lighting. Features would include hot cues, hyperjog, time-stretching and so on. Video effects include looping ability, scratch rewind, pitch-shift, etc.
In other embodiments, the lighting can be controlled directly, and the audio and video portions of a performance can be driven in response to the lighting control.
The graphical representation shown in the figures does not limit the user interface means and possibilities for user control. Different graphical elements and user interactions can be used to create a wide variety of interfaces, including mechanical interfaces or graphical interfaces associated with a particular effect or set of effects.
Recordable media has often taken the form of rotating physical media, from wax and metal cylinders to LP records and CDs. The interface for controlling such rotating media makes it attractive for DJing purposes, and it is likely that even when moving media becomes obsolete in favor of solid-state solutions (e.g., digital memory), a rotating or other mechanical or motion interface will remain preferable. Such interfaces to media can take the form of trackballs, mice, levers, switches, joysticks, gestural interfaces, haptic interfaces and other forms of motion-to-control interfaces.
The user interface to the lighting control interface 7112 can be a touch screen, a series of dials, turntables, buttons, knobs, levers, trackballs, switches, haptic interfaces, or gestural inputs using proximity or motion sensors or imaging devices. Physiological inputs may include heartbeats, body movement and other motor-skill-based movements.
In another embodiment, inputs from sensors can be used to effect changes in the lighting, audio and video output. In such an embodiment, sensors provide information on movement through pressure, strain, and proximity sensors or imaging sensors as people move and dance.
In embodiments, a wide variety of effects and effect parameters can be used. Effects can be predetermined (played back verbatim), algorithmic in nature, or have a variety of different interacting parameters. Effect examples include the effects described elsewhere herein and in the documents incorporated herein by reference, including, but not limited to, the following:
In an embodiment, additional control functions can incorporate special effects machines such as fog, smoke, bubbles, confetti cannons, pyrotechnics, motion activated systems and aroma which are used in clubs and theatrical venues for dramatic impact. Traditional theatrical venue control of these types of devices can now be incorporated into a live performance whereby music, video, lighting and special effects devices are all tied into performance control. In this way the DJ/VJ can control and influence all of this in a live setting.
Adjustable delays can also be introduced as a function of distance, since control signals and light travel much faster than sound. In large venues, such as large interior spaces or exterior spaces, this can adjust for the speed of sound so perfect synchronization is available locally or delayed echo effects are purposefully generated for a performance with the appropriate delays for the whole system or as a function of distance.
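The distance-dependent delay described above follows directly from the speed of sound: the control signal and the light are effectively instantaneous, so each fixture's change should lag the audio event by its distance to the listener divided by roughly 343 m/s. A minimal sketch, with the function name and constant as illustrative assumptions:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C; adjust for venue conditions

def lighting_delay_seconds(distance_m):
    """Delay to apply to a light's control signal so its change arrives
    in step with the sound for a listener distance_m away."""
    return distance_m / SPEED_OF_SOUND

# A light 68.6 m away should fire about 0.2 s after the audio event.
print(round(lighting_delay_seconds(68.6), 3))  # -> 0.2
```

For purposeful echo effects, the same function can be applied with an added offset per fixture rather than a pure compensation.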
Installation geometries can be incorporated into the control configuration for the lighting system so that a Surround Lighting effect and apparent motion of light within an environment are possible. Effects are then both temporal and spatial and allow for control and influence of lighting effects that move through a room or venue. Venue setups can be standardized from venue to venue; a famous club installation can be replicated or scaled for other venues. Most lighting setups vary from venue to venue: some may have more or fewer lights than others, and some lights may have more features than others.
Standard configurations of lights can be used, or a new configuration can be built up from a club layout. A testing mode can test ‘sweeps’ across the room. Adjustments can be made by manipulating icons or graphical elements on the display.
In embodiments, various techniques can be used to determine network addresses and physical addresses of lighting units 100, such as query-based assessments described above, approaches based on vision sensors, manual mapping of addresses, and the like. In an embodiment, a configuration can be determined by disposing a location facility in connection with a lighting unit 100, such as a GPS location facility or a local-area transmitter/receiver system that operates on triangulation principles similar to those used in GPS. Thus, lighting units 100 equipped with such location-determination facilities can determine their locations relative to the earth (in the case of GPS), relative to a controller, such as the audio/visual controller 7100, which may be equipped with a transmitter/receiver to communicate with the lighting units 100, or relative to each other. Thus, a location-based determination can be made as to the physical locations of lighting units 100 and the correspondence of those locations to the network addresses of the lighting units 100 in a lighting network, such as one that is controlled by an audio/visual controller 7100 or lighting control interface 7112. Thus, in embodiments location-based arrays of lights may be disposed around a venue, such as a dance floor or stadium. Every lighting unit 100 can have a transmitter associated with it, and a user can bring in an antenna array for setup purposes to get the locations of all of the lighting units 100. Thus, users can move lights around and re-locate them, including on the fly during performances, or between performances. Similarly, the presence of a new light can be recognized by the antenna and added to a known mapping of a network of lights. An antenna system allows a user to get both pieces of information needed for mapping, namely, where the lighting units 100 actually are in the venue and where they are in the network.
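The two pieces of information an antenna survey yields can be joined in a simple data structure: one mapping from unit to network address, one from unit to measured position, combined into the address-to-position map a controller needs for spatial effects. The unit identifiers, addresses, and coordinates below are invented for illustration.

```python
def build_light_map(network_addresses, measured_positions):
    """network_addresses: {unit_id: address}; measured_positions: {unit_id: (x, y)}.
    Returns address -> position, the mapping needed for spatial effects
    such as sweeps across a room. Units missing an address are skipped."""
    return {network_addresses[uid]: pos
            for uid, pos in measured_positions.items()
            if uid in network_addresses}

addresses = {"unit-A": 1, "unit-B": 2, "unit-C": 3}
positions = {"unit-A": (0.0, 0.0), "unit-B": (5.0, 0.0), "unit-C": (8.0, 4.0)}
light_map = build_light_map(addresses, positions)

# Order addresses left-to-right for a sweep across the room:
sweep_order = sorted(light_map, key=lambda addr: light_map[addr][0])
print(sweep_order)  # [1, 2, 3]
```

Re-running the survey after lights are moved, or when a new light appears, simply regenerates the map, which matches the on-the-fly relocation described above.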
Configurations can be created at time of installation so that a club or other venue always has the configuration file for the venue accessible and stored on media or on the web for downloading purposes. Clubs can make this available to users even before they arrive on location so they can load and be ready to go when setup is complete. For example: Avalon.map could be a complete map file for the Avalon club in Boston. Avalon.sho might be a showfile consisting of a series of effects that can be adjusted live. Similarly, configurations could be determined on the fly, or as additional units are added.
In embodiments the audio/visual controller 7100 is also capable of storing and playing back sequences. In this way, the system can be used as an authoring system by show designers. A storage sequence can be selected and triggered, and all inputs and outputs can be recorded for a time period, or until certain input conditions are met, then stored to memory and retrieved at a later time for playback. Even in this case, the playback need not mirror exactly what the original outputs were; the outputs can be a function of the settings, triggers and conditions that were set. So, for example, the playback of an effect sequence may be identical, but the output may be different due to sensor settings, time of day, musical inputs, etc.
In embodiments the A/V controller 7100 may be an industry-standard digital audio and video controller, such as the Pioneer DVJ-X1 from Pioneer Electronics (USA) Inc. The A/V controller 7100 can allow users to manipulate and play back synchronized digital audio and video. DVJs can use an A/V controller 7100 to manipulate DVD visuals in a way similar to how they would manipulate music. Real-time digital video scratches, loops and instant cues are all possible with the audio/visual controller 7100, while the video and audio streams always stay in perfect sync, even when they are being reversed and pitched. The audio/visual controller 7100 can bring together existing A/V technologies into a single unit that interfaces with currently available software and hardware to introduce a completely new form of entertainment. Two A/V controllers 7100 may be linked together via a fully integrated audio and visual mixer. This set-up allows the digital audio and video from the two separate sources to be mixed and scratched on the fly, in the same way that DJs create audio mixes in their live sets today. The Pioneer DVJ-X1, for example, offers on-board memory capacity as well as an SD card slot similar to the CDJ-1000MK2 for even greater flexibility in performance. This allows A/V loops and cue points to be stored, either on-board or on a removable memory card. For example, a memory card that is bundled with the DVJ-X1 can store up to 500 loop or cue points. During playback, the saved cue and loop points can be searched, selected and previewed using an external preview monitor.
In embodiments, the audio/visual controller 7100 can have a memory feature for storing wave data, cue points and loop points. The data can be stored on a removable memory card (SD) or in the player's internal memory. In embodiments the A/V controller 7100 may include a jog dial, which allows a user to cue a music track with a touch-sensitive dial that is similar in control characteristics to a vinyl turntable. In embodiments the A/V controller 7100 may include a pitch bend control to allow the user to speed up or slow down the tempo of video or audio playback, depending on the direction the jog dial is rotated. In embodiments, the A/V controller 7100 may have different modes, such as a vinyl mode to simulate traditional turntable effects, or a standard CD mode, without touch sensitivity. In embodiments the A/V controller 7100 may include functions that simulate a vinyl turntable, such as allowing a user to cue or scratch a track by rotating the jog dial in a particular direction. Similarly, the parameters of the jog dial may be adjusted, such as to modify the speed at which a track slows down or stops, as well as the speed with which a track returns to normal speed.
In embodiments, the A/V controller 7100 may include master tempo control, so that when a user changes speed, the system maintains the pitch of vocal and instrumental audio, without noticeable differences. The A/V controller 7100 can include tempo control, such as a slider that allows tempo adjustment.
In embodiments, the A/V controller 7100 may provide a range of cue functions, each of which can be used to serve as a cue for a lighting control signal. Such functions may include an auto cue function, which automatically cues the beginning of a track. The A/V controller 7100 may also include a manual cue, set at a position on a track. Adjustment can be made by using either the jog dial or manual search buttons. The cue point can be automatically stored in the internal memory (if an SD card is inserted in the A/V controller 7100, the cue point is automatically stored on the card) until it is overwritten with a new cue point. In embodiments a cue point can be set on the fly and stored in the internal memory simply by hitting the In/Real-time cue button. In embodiments a user can start the music by sliding a cross fader. Sliding the fader back returns the A/V controller 7100 to the previously selected cue point.
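The overwrite semantics of cue-point storage described above can be sketched as follows: setting a cue replaces the previous one, and storage goes to a removable card when present, otherwise to internal memory. The class and storage model are illustrative assumptions, not the controller's actual firmware behavior.

```python
class CueMemory:
    """Sketch of cue-point storage with overwrite semantics (illustrative)."""

    def __init__(self):
        self.internal = None
        self.sd_card = None
        self.sd_inserted = False

    def set_cue(self, position):
        """Store a cue point (in seconds), overwriting any earlier one."""
        if self.sd_inserted:
            self.sd_card = position
        else:
            self.internal = position

    def current_cue(self):
        return self.sd_card if self.sd_inserted else self.internal

deck = CueMemory()
deck.set_cue(12.5)         # real-time cue at 12.5 s
deck.set_cue(30.0)         # new cue replaces the old one
print(deck.current_cue())  # 30.0
```

A lighting controller listening to the same cue events could fire a lighting effect whenever playback returns to the stored position.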
Referring to
Referring to
In embodiments the entertainment control system 7118 may facilitate a mapping of addresses from locations of lighting units 100 in a room and associate them with features of media. For example, low frequencies (bass) or high frequencies (treble) could be associated with particular parts of a dance floor, curtain, or the like in the venue 7500, so that aspects of the media are displayed in lights as the media is played by the media player. Particular colors can be associated with particular tones, such as greens and blues with treble tones and warm colors like reds and oranges with bass tones, for example. In embodiments the mapping may be changed, such as by moving the bass tones around the room, for example by sweeping the jog dial 7608 to rotate or flip the mapping of particular colors to particular media features. Thus, in an object-oriented coding scheme that maps from a visualization to a lighting control scheme, an intermediate object may be included that allows dynamic shifting of the mapping between the visualization and the lighting control signal.
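The intermediate mapping object described above can be sketched as a small class: media features (bass, treble, etc.) map to venue zones, and the assignment can be rotated or flipped on the fly without touching either the visualization or the lighting control code. The feature names, zone names, and class are invented for illustration.

```python
class DynamicMapping:
    """Intermediate object between a visualization's features and venue zones,
    supporting live rotation and flipping of the assignment (illustrative)."""

    def __init__(self, features, zones):
        self.features = list(features)
        self.zones = list(zones)
        self.offset = 0

    def rotate(self, steps=1):
        """Shift which zone each feature drives, e.g. move bass around the room."""
        self.offset = (self.offset + steps) % len(self.zones)

    def flip(self):
        """Reverse the feature-to-zone assignment."""
        self.zones.reverse()

    def zone_for(self, feature):
        i = self.features.index(feature)
        return self.zones[(i + self.offset) % len(self.zones)]

m = DynamicMapping(["bass", "mid", "treble"], ["dance-floor", "curtain", "bar"])
print(m.zone_for("bass"))   # dance-floor
m.rotate()                  # e.g. triggered by a sweep of a jog dial
print(m.zone_for("bass"))   # curtain
```

Because the visualization only ever asks `zone_for(feature)`, remapping during a performance needs no change to the code on either side of the object.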
In embodiments the entertainment control system 7118 can be disposed on a table of a restaurant, so that a customer can control media displays, including lighting systems, in the venue, directly from the tabletop.
Referring to
Thus, in embodiments there is provided a method of illuminating an environment in coordination with a media display, including steps of providing a lighting control system for controlling a lighting system; providing a user interface for controlling a media display that is distinct from the lighting system; and associating an input of the lighting control system with the output of the user interface, wherein the method includes taking input from an audio/visual controller with a physical interface and using it to control lights in an entertainment venue.
In embodiments the lighting system may be a string lighting system displayed on an area, such as part of an entertainment venue. The area might be a bar, a wall, a dance floor, a curtain, a stage, tile, floor, panel-display, or the like.
In embodiments methods may include taking a visualization from a computer display associated with an audio player and displaying lights that correspond to the visualization in a physical entertainment venue. In embodiments the visualization is a skin for an MP3 player, and the method includes allowing a user to modify the skin through a physical interface. In embodiments the physical interface is a touch screen. In embodiments touching the screen changes at least one of the brightness and the color of at least part of the skin.
In embodiments methods and systems may include taking video input from a music video and displaying corresponding light on a string of lights that use a serial addressing protocol. The string of lights may be disposed in an entertainment venue. The user may modify output through a physical interface, such as the interface for an audio/visual controller 7100, such as a job dial, touch screen, or other interface. The user may dynamically alter the mapping between inputs and the lighting control outputs.
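Driving a serially addressed string of lights from video input, as described above, can be sketched by sampling evenly spaced pixels from a frame and emitting one (address, R, G, B) packet per light in string order. The frame layout and packet framing are illustrative assumptions, not a specific serial addressing protocol.

```python
def frame_to_string_packets(frame, num_lights):
    """frame: 2-D list of (R, G, B) rows; lights sample the top row,
    evenly spaced across its width. Returns (address, R, G, B) tuples
    in string order, ready to be serialized onto the light string."""
    row = frame[0]
    width = len(row)
    packets = []
    for address in range(num_lights):
        # Map each light to an evenly spaced pixel column.
        x = 0 if num_lights == 1 else address * (width - 1) // (num_lights - 1)
        r, g, b = row[x]
        packets.append((address, r, g, b))
    return packets

# A 4-pixel-wide frame fading from black to red, driven onto 2 lights:
frame = [[(0, 0, 0), (85, 0, 0), (170, 0, 0), (255, 0, 0)]]
print(frame_to_string_packets(frame, 2))  # [(0, 0, 0, 0), (1, 255, 0, 0)]
```

A jog-dial or touch-screen modification, as described above, would simply transform the frame (or the address-to-pixel mapping) before the packets are generated.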
While the invention has been described in connection with certain preferred embodiments, other embodiments would be recognized by one of ordinary skill in the art and all such embodiments are encompassed by this disclosure.
The present application claims the benefit, under 35 U.S.C. §119(e), of the following U.S. Provisional Applications: Ser. No. 60/549,526, filed Mar. 2, 2004, entitled “Methods and Apparatus for Integrating A/V Systems with Lighting;” Ser. No. 60/553,111, filed Mar. 15, 2004, entitled “Lighting Methods and Systems;” and Ser. No. 60/558,400, filed Mar. 31, 2004, entitled “Methods and Systems for Providing Lighting Components.” The present application also claims the benefit, under 35 U.S.C. §120, as a continuation-in-part (CIP) of the following U.S. Non-provisional Applications: Ser. No. 09/886,958, filed Jun. 21, 2001, entitled “Methods and Apparatus for Controlling a Lighting System in Response to an Audio Input,” which in turn claims the benefit of Ser. No. 60/213,042, filed Jun. 21, 2000, entitled “Lighting Control MP3 Plug-in;” Ser. No. 10/995,038, filed Nov. 22, 2004, entitled “Light System Manager,” which in turn claims the benefit of Ser. No. 60/523,903, filed Nov. 20, 2003, entitled “Light System Manager,” and Ser. No. 60/608,624, filed Sep. 10, 2004, entitled “Light System Manager;” Ser. No. 10/163,164, filed Jun. 5, 2002, entitled “Systems and Methods of Generating Control Signals,” which in turn claims the benefit of Ser. No. 60/296,344, filed Jun. 6, 2001, entitled “Systems and Methods of Generating Control Signals;” Ser. No. 10/325,635, filed Dec. 19, 2002, entitled “Controlled Lighting Methods and Apparatus,” which in turn claims the benefit of Ser. No. 60/341,898, filed Dec. 19, 2001, entitled “Systems and Methods for LED Lighting” and Ser. No. 60/407,185, filed Aug. 28, 2002, entitled “Methods and Systems for Illuminating Environments;” Ser. No. 10/360,594, filed Feb. 6, 2003, entitled “Controlled Lighting Methods and Apparatus,” which in turn claims the benefit of Ser. No. 60/401,965, filed Aug. 8, 2002, entitled “Methods and Apparatus for Controlling Addressable Systems,” and Ser. No. 10/045,604, filed Oct. 
23, 2001, entitled “Systems and Methods for Digital Entertainment,” which in turn claims the benefit of the following applications: Ser. No. 60/243,250, filed Oct. 25, 2000, entitled “Illumination of Liquids;” Ser. No. 60/242,484, filed Oct. 23, 2000, entitled “Systems and Methods for Digital Entertainment;” Ser. No. 60/277,911, filed Mar. 22, 2001, entitled “Systems and Methods for Digital Entertainment;” Ser. No. 60/262,022, filed Jan. 16, 2001, entitled “Color Changing LCD Screens;” Ser. No. 60/262,153, filed Jan. 17, 2001, entitled “Information Systems;” Ser. No. 60/268,259, filed Feb. 13, 2001, entitled “LED Based Lighting Systems and Methods for Vehicles.” The present application also claims the benefit, under 35 U.S.C. §120, as a continuation-in-part (CIP) of U.S. Non-provisional application Ser. No. 10/842,257, filed May 10, 2004, entitled “Methods and Apparatus for Controlling Devices in a Networked Lighting System,” which is a divisional of Ser. No. 10/158,579, filed May 30, 2002, now U.S. Pat. No. 6,777,891. Ser. No. 10/158,579 in turn claims the benefit of the following applications: Ser. No. 60/301,692, filed Jun. 28, 2001, entitled “Systems and Methods for Networking LED Lighting Systems;” Ser. No. 60/328,867, filed Oct. 12, 2001, entitled “Systems and Methods for Networking LED Lighting Systems;” and Ser. No. 60/341,476, filed Oct. 30, 2001, entitled “Systems and Methods for LED Lighting.” Each of the foregoing applications is hereby incorporated herein by reference.
Number | Date | Country
---|---|---
60549526 | Mar 2004 | US
60553111 | Mar 2004 | US
60558400 | Mar 2004 | US
60213042 | Jun 2000 | US
60523903 | Nov 2003 | US
60608624 | Sep 2004 | US
60296344 | Jun 2001 | US
60341898 | Dec 2001 | US
60407185 | Aug 2002 | US
60401965 | Aug 2002 | US
60243250 | Oct 2000 | US
60242484 | Oct 2000 | US
60277911 | Mar 2001 | US
60262022 | Jan 2001 | US
60262153 | Jan 2001 | US
60268259 | Feb 2001 | US
60301692 | Jun 2001 | US
60328867 | Oct 2001 | US
60341476 | Oct 2001 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 10158579 | May 2002 | US
Child | 10842257 | May 2004 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 09886958 | Jun 2001 | US
Child | 11070870 | Mar 2005 | US
Parent | 10995038 | Nov 2004 | US
Child | 11070870 | Mar 2005 | US
Parent | 10163164 | Jun 2002 | US
Child | 11070870 | Mar 2005 | US
Parent | 10235635 | Sep 2002 | US
Child | 11070870 | Mar 2005 | US
Parent | 10360594 | Feb 2003 | US
Child | 11070870 | Mar 2005 | US
Parent | 10045604 | Oct 2001 | US
Child | 11070870 | Mar 2005 | US
Parent | 10842257 | May 2004 | US
Child | 11070870 | Mar 2005 | US