The invention relates to interactive lighting control, particularly to controlling and creating light effects, such as tuning light scenes based on a location indication received from an input device, and more particularly to an interactive lighting control system and method for light effect control and creation with a location indication device.
Future homes and current professional environments will contain a large number of light sources of different natures and types: incandescent, halogen, discharge or LED (Light Emitting Diode) based lamps for ambient, atmosphere, accent or task lighting. Every light source has different control possibilities, such as dimming level, cold/warm light tone, RGB color or other methods that change the effect of the light source on the environment.
Almost all control paradigms in lighting are lamp driven: the user selects a lamp and operates directly on the controls of that lamp, for example by modifying the dimming value or by operating on its RGB (Red Green Blue) channels. It would, however, be much more natural to adjust the lighting effect at a location directly, without being bothered by looking for the lamps that are responsible for the effect at that location.
When the number of light sources is greater than 20, it can be difficult to trace an effect at a location back to the light sources. Moreover, the effect might be the result of a combination of different light effects from light sources of different natures (e.g. ambient TL (task lighting) and wall-washing LED lamps). In that case, the user has to play with the lighting controls of the different lamps and evaluate the effect of changing them. In some cases this effect is rather global (e.g. for ambient lighting); in other cases it is very local (e.g. a spotlight). So the user has to find out which control is related to which effect, and how large that effect is, in order to approach the desired light setting.
It is an object of the invention to improve the controlling of a lighting infrastructure.
The object is achieved by the subject matter of the independent claims. Further embodiments are shown by the dependent claims.
A basic idea of the invention is to provide interactive lighting control by combining a location indication device with a light effect driven approach to lighting control, in order to improve the creation of light effects, such as the tuning of light scenes, especially with large and diverse lighting infrastructures. The effect driven approach to lighting control can be implemented by a computer model comprising a virtual representation, or virtual view, of a real environment with a lighting infrastructure. The virtual view may be used to map a real location to a virtual location in the virtual environment. Lighting effects available at the real location can be detected and modelled in the virtual view. Both the virtual location and the available light effects may then be used to indicate light effects to a user for selection, and to calculate control settings for the lighting infrastructure. This automated and light effect driven approach may improve the controlling of a particularly complex lighting infrastructure and offers a more natural interaction, since users only have to point to the location in the real environment where they would like to change the light effect created by the lighting infrastructure.
An embodiment of the invention provides an interactive lighting control system comprising
- an interface for receiving data indicating a real location in a real environment, to which an input device points,
- a mapping unit for mapping the real location to a virtual location in a virtual view of the real environment, and
- a light effect controller for determining light effects available at the virtual location.
The system may further comprise a light effect creator for calculating control settings for a lighting infrastructure for creating the desired light effect at the real location based on the light effects available at the virtual location. The light effect creator may for example be implemented as a software module, which transfers light effects selected in the virtual view into light effects in the real environment. For example, when a user selects a certain location in the real environment for changing a light effect and changes the light effect by means of the virtual view, the light effect creator may automatically process the changed light effect in the virtual view by calculating suitable control settings for creating the light effect in the real environment. The light effect creator can also take any restrictions of the lighting infrastructure in the real environment into account when creating a light effect.
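By way of illustration only, the interplay of the mapping unit, the light effect controller and the light effect creator might be sketched as follows in Python; the class and method names (`LightEffectController`, `solve_settings`, etc.) are hypothetical and merely mirror the roles described above, not a prescribed implementation.

```python
class LightEffectController:
    """Maps a real location into the virtual view and lists available effects."""

    def __init__(self, mapping_unit, effect_model):
        self.mapping_unit = mapping_unit  # maps real to virtual coordinates
        self.effect_model = effect_model  # relates controls to effects and locations

    def effects_at(self, real_location):
        virtual = self.mapping_unit.to_virtual(*real_location)
        return virtual, self.effect_model.available_effects(*virtual)


class LightEffectCreator:
    """Translates a desired light effect in the virtual view into control settings."""

    def __init__(self, effect_model, infrastructure):
        self.effect_model = effect_model
        self.infrastructure = infrastructure

    def create(self, virtual_location, desired_effect):
        # Calculate settings that approximate the desired effect, taking any
        # restrictions of the lighting infrastructure into account.
        settings = self.effect_model.solve_settings(virtual_location, desired_effect)
        self.infrastructure.apply(settings)  # e.g. send DMX channel values
        return settings
```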
The location input device may comprise one or more of the following devices:
- a pointing device such as the uWand™ control device,
- a WiiMote™ input device,
- a laser pointer or a light torch combined with a camera for detecting the pointing position.
Typically, a suitable input device in the context of the invention is a pointing device, i.e. a device for detecting a location to which a user points with the device.
The system may further comprise a camera and a video processing unit adapted for processing video data received from the camera, detecting the location in the real environment to which the input device points, and outputting the detected real location to the mapping unit for further processing.
The interface may be adapted for receiving the data related to a light effect desired at the real location from a light effects input device.
The light effect controller may be adapted for indicating light effects available at the real location based on the virtual location in the virtual view and for transmitting available light effects to the input device, a display device, and/or an audio device for indication to a user.
The display device may be controlled such that a static or dynamic content with light effects is displayed for selection with a light effects input device.
The data related to a light effect desired at the real location can comprise one or more of the following:
- a selection of one of the light effects indicated as being available at the real location,
- values of hue, saturation and/or intensity of the desired light effect.
The light effect creator may be adapted to trace back, based on the virtual location, to the lamps of the lighting infrastructure which influence the light at the real location, and to calculate the control settings for the traced-back lamps.
A further embodiment of the invention relates to an input device for a system according to the invention and as described above, wherein the input device comprises
- a detector for detecting a location in a real environment, to which the input device points, and
- a transmitter for wirelessly transmitting data indicating the detected location to the interface of the system.
The input device can further comprise
- a display for indicating light effects available at a location, and/or
- controls for selecting and changing a light effect.
A yet further embodiment of the invention relates to an interactive lighting control method comprising the acts of
- receiving data indicating a real location in a real environment, to which an input device points,
- mapping the real location to a virtual location in a virtual view of the real environment,
- determining light effects available at the virtual location,
- receiving data related to a light effect desired at the real location, and
- calculating control settings for a lighting infrastructure for creating the desired light effect at the real location.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter.
The invention will be described in more detail hereinafter with reference to exemplary embodiments. However, the invention is not limited to these exemplary embodiments.
In the following, functionally similar or identical elements may have the same reference numerals. The terms “lamp”, “light” and “luminaire” are used synonymously herein.
Interactive control of the lighting created with the lighting infrastructure 34 may be performed by means of the input device 18, which may be held by a user 38. The user 38, who desires to create a certain lighting effect at a real location 16 on the wall 30, simply points with the input device 18 at the location 16. The input device 18 is adapted to detect the location 16 to which the user 38 points.
The input device 18 may for example be the uWand™ intuitive pointer and 3D control device from the Applicant. The uWand™ control device comprises an IR (Infrared) receiver, which detects signals from coded IR beacons that may be located at the wall 30 beside a TV set. From the received signals and the positions of the beacons, the uWand™ control device may derive its pointing position and transmit it via a wireless 2.4 GHz communication link to the interface 12. The uWand™ control device enables both 2D and 3D position detection; for example, rotation of the input device may also be detected.
Also the WiiMote™ input device from Nintendo Co., Ltd., may be used for the purposes of the present invention. The WiiMote™ input device allows 2D pointing position detection by capturing IR radiation from IR LEDs with a built-in camera and deriving the pointing position from the detected positions of the IR LEDs. Data related to the detected pointing position are transmitted via a Bluetooth™ communication link, for example to the interface 12.
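A minimal sketch of this kind of pointing derivation, assuming two IR LEDs appear as blobs in the image of the built-in camera (the actual WiiMote™ processing and calibration are more involved):

```python
def pointing_position(blob1, blob2, cam_w=1024, cam_h=768):
    """Derive a normalized 2D pointing position from two detected IR blobs.

    blob1, blob2: (x, y) pixel coordinates of the IR LEDs in the camera image.
    Returns (px, py) in [0, 1] relative to the surface carrying the LEDs.
    """
    # Midpoint of the two blobs in camera coordinates.
    mx = (blob1[0] + blob2[0]) / 2.0
    my = (blob1[1] + blob2[1]) / 2.0
    # The camera looks where the device points: if the blobs appear left of
    # the image center, the device points to the right, hence the mirroring.
    return 1.0 - mx / cam_w, 1.0 - my / cam_h
```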
Furthermore, a laser pointer or a light torch may be applied as input device when combined with a camera for detecting the pointing position in the real environment, for example on the wall 30. Data related to the detected pointing position are generated by video processing of the pictures captured with the camera. The camera may be integrated in the input device, similar to the WiiMote™ input device. Alternatively, the camera may be an external device combined with a video processing unit for detecting the pointing position. The external device comprising the camera may be either connected to or integrated in the interactive lighting control system 10, such as the camera 24 and the video processing unit 26 of the system 10.
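Such video processing could, in its simplest form, locate the spot of the laser pointer or torch as the brightest region of a grayscale camera frame; a sketch with NumPy, where the fixed brightness threshold and the absence of camera-to-wall calibration are simplifying assumptions:

```python
import numpy as np

def detect_pointing_spot(frame, min_brightness=240):
    """Locate a laser/torch spot as the centroid of the brightest pixels.

    frame: 2D NumPy array of grayscale pixel values (0-255).
    Returns (x, y) pixel coordinates of the spot, or None if none is found.
    """
    ys, xs = np.nonzero(frame >= min_brightness)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```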
The input device 18 wirelessly transmits data 14 indicating the location 16, to which it points in the real environment 30, to the interface 12 of the interactive lighting control system 10.
A light effect controller 20 of the interactive lighting control system 10 processes the received data 14 as follows: the real position of the location 16 is mapped to a virtual location in a virtual view of the real environment. The virtual view may be a 2D representation of the real environment, such as the wall 30.
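In the simplest case, this mapping is a linear transformation from coordinates on the wall 30 to pixels of the 2D virtual view; a sketch, in which the wall dimensions and the view resolution are assumed to be known (a tilted camera or surface would call for a homography instead):

```python
def to_virtual(real_x, real_y, wall_w=5.0, wall_h=2.5, view_w=800, view_h=400):
    """Map a point on the wall (in meters) to a pixel of the 2D virtual view."""
    vx = int(real_x / wall_w * (view_w - 1))
    vy = int(real_y / wall_h * (view_h - 1))
    return vx, vy
```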
The light effect controller 20 determines light effects available at the virtual location. This may be performed for example by means of a model of the lighting infrastructure 34 installed in the real environment, wherein the model relates the controls of the lighting infrastructure 34 to light effects and locations in the virtual view of the real environment.
The model may be created by a so-called Dark Room Calibration (DRC) method, where the effect and location of every lighting control, for example a DMX channel, is measured. The light effects detected with a DRC can then be assigned to virtual locations in the virtual view to form the model. For example, a target illumination distribution can be expressed as a set of targets in discrete points (for example 500 lux at some points of a work surface), as a color distribution in a 2D view (for example the distribution measured on a wall, or the distribution as received by a camera or colorimetric device), or, more abstractly, as a function that relates the light effect to a location.
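Such a DRC model might, for example, store the measured contribution of every lighting control at a grid of virtual locations, so that the effects available at a location are simply the controls with a noticeable contribution there; the representation and the threshold below are illustrative assumptions:

```python
import numpy as np

class DrcModel:
    """Per-channel light contributions measured at grid points of the virtual view."""

    def __init__(self):
        # contribution[channel] is a grid of lux values measured with that
        # channel driven at full level and all other channels off (dark room).
        self.contribution = {}

    def record(self, channel, lux_grid):
        self.contribution[channel] = np.asarray(lux_grid, dtype=float)

    def available_effects(self, vx, vy, threshold_lux=5.0):
        """Channels whose measured effect at (vx, vy) exceeds the threshold."""
        return [ch for ch, grid in self.contribution.items()
                if grid[vy, vx] > threshold_lux]
```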
The light effects, which are determined by the light effect controller 20 as being available at the location 16, may be displayed on the display device 28 or transmitted via the interface 12 to the input device 18 or to a separate light effects input device 40, which may be implemented for example by a PDA (Personal Digital Assistant), a smart phone, a keyboard, a PC (Personal Computer), or a remote control of, for example, a TV set.
A user selection of a desired light effect is transmitted from the input device 18 or the light effects input device 40 to the system 10, and via the interface 12 to the light effect controller 20, which transmits the selected light effect and the location 16 to the light effect creator 22. The creator 22 traces back to the lamps 36 of the lighting infrastructure 34 which influence the light at the location 16, calculates the control settings for the traced-back lamps 36, and transmits the calculated control settings to the lighting infrastructure 34, so that the light effect 32 desired by the user is created by the lamps 36 at the location 16.
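On the basis of such a calibration model, tracing back and calculating the control settings can be posed as a small inverse problem: find dim levels for the contributing channels such that their combined contributions approximate the desired illuminance. A sketch using a clipped least-squares solution, under the simplifying assumptions that contributions add linearly and that a single lux target describes the desired effect 32:

```python
import numpy as np

def control_settings(model, vx, vy, target_lux):
    """Dim levels (0..1) per channel approximating target_lux at (vx, vy)."""
    channels = model.available_effects(vx, vy)  # trace back to the lamps
    a = np.array([[model.contribution[ch][vy, vx] for ch in channels]])
    levels, *_ = np.linalg.lstsq(a, np.array([target_lux]), rcond=None)
    levels = np.clip(levels, 0.0, 1.0)  # respect the lamps' dimming range
    return dict(zip(channels, levels))
```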
In the following, the selection of light effects by the user 38 will be explained by means of several use cases. In the use cases shown, the cross marks the pointing position of the input device 18, and the dashed arrows represent movements performed with the input device 18, i.e. the movement of the pointing location of the input device 18 from one location to another in the virtual view, which is a 2D representation of the real environment, for example the wall 30.
When pointing at a location, a display device can give some feedback on the possibilities at that location. For example, a triangle of the colors that can be rendered at the location can be shown on the input device or on a separate display device.
When multiple effects are present, the interactive lighting control system 10 can select the most influential effect at the location the user points to. It is also possible to influence a set of effects.
Finally, as in the known interaction with a mouse and pointer, the user 38 can also indicate an area in the virtual view. This selects a set of effects that are mainly present in the area. Tuning operations are then performed on this set of effects.
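With the calibration model above, the effects “mainly present” in an indicated area could be found by ranking the channels by their summed contribution over the grid points of the area; the inclusion criterion (half of the strongest contribution) is an arbitrary illustrative choice:

```python
def effects_in_area(model, x0, y0, x1, y1):
    """Channels contributing most within the rectangle (x0, y0)-(x1, y1)."""
    totals = {ch: float(grid[y0:y1, x0:x1].sum())
              for ch, grid in model.contribution.items()}
    strongest = max(totals.values(), default=0.0)
    return [ch for ch, total in totals.items()
            if total > 0 and total >= 0.5 * strongest]
```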
Tuning operations possible on the selected area may be for example
- changing the intensity of the light effects in the area,
- changing the hue and/or saturation of the light effects in the area,
- weakening or strengthening the distribution of extreme values in order to smoothen or sharpen the effect.
To indicate the size of the selected area, the lamps that contribute to the area can start flashing or can be set by the interactive lighting control system 10 to a contrasting light effect. This provides the user 38 with feedback on the selected area.
On the input device 18, several interaction methods can be used for changing the light effect:
When an area is selected, the shown values of hue, saturation and intensity can be average values, but also minima or maxima. In the latter case, the interaction makes it possible to change the extreme values. It is also possible to weaken or strengthen the distribution of extreme values in order to smoothen or sharpen the effect.
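Showing extreme values and weakening or strengthening their distribution could, for example, amount to scaling the deviations of the intensity values in the area from their mean; a sketch on plain intensity values, where a factor below 1 smoothens the effect and a factor above 1 sharpens it:

```python
import numpy as np

def area_statistics(intensity):
    """Average, minimum and maximum intensity over the selected area."""
    v = np.asarray(intensity, dtype=float)
    return v.mean(), v.min(), v.max()

def reshape_distribution(intensity, factor):
    """Scale deviations from the mean: factor < 1 smoothens, > 1 sharpens."""
    v = np.asarray(intensity, dtype=float)
    return v.mean() + factor * (v - v.mean())
```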
The invention can be used in environments where a large number of luminaires, for example more than 20, is present; in future homes with a complex and diverse lighting infrastructure; in shops, public spaces and lobbies where light scenes are created; and for chains of shops (one can think of a single reference shop, where light scenes are created for all shops; when the light scenes are deployed, some fine-tuning might be needed). The interaction is also useful for tuning the location of a redirectable spot. Such spots are mainly used in shops (mannequins), in art galleries, in theatres and on the stages of concerts.
Typical applications of the invention are for example the creation of light scenes from scratch (areas are located and effects are increased from zero to a desired value), and the immersive fine-tuning of light scenes which are created by other generation methods.
At least some of the functionality of the invention may be performed by hardware or software. In case of an implementation in software, a single microprocessor or microcontroller, or multiple standard microprocessors or microcontrollers, may be used to process a single algorithm or multiple algorithms implementing the invention.
It should be noted that the word “comprise” does not exclude other elements or steps, and that the word “a” or “an” does not exclude a plurality. Furthermore, any reference signs in the claims shall not be construed as limiting the scope of the invention.
Number | Date | Country | Kind
---|---|---|---
10152035.1 | Jan 2010 | EP | regional
Filing Document | Filing Date | Country | Kind | 371c Date
---|---|---|---|---
PCT/IB2011/050226 | 1/19/2011 | WO | 00 | 7/18/2012