The invention relates to a system for controlling a first pixelated lighting device and at least a second pixelated lighting device based on a dynamic light scene, each of the pixelated lighting devices comprising a plurality of individually controllable light segments, wherein an initial mapping has been determined from the dynamic light scene to the plurality of individually controllable light segments of the first pixelated lighting device.
The invention further relates to a method of controlling a first pixelated lighting device and at least a second pixelated lighting device based on a dynamic light scene, each of the pixelated lighting devices comprising a plurality of individually controllable light segments, wherein an initial mapping has been determined from the dynamic light scene to the plurality of individually controllable light segments of the first pixelated lighting device.
The invention also relates to a computer program product enabling a computer system to perform such a method.
The introduction of LED lighting and the introduction of connected lighting have made it possible to create more sophisticated light experiences. Sometimes, it is even possible for users to create complex light scenes themselves, e.g., dynamic light scenes and/or light scenes for pixelated lighting devices.
However, when lighting devices are arranged in a particular way in a room, some light scenes that look good when they are defined may start to look worse when the positions and/or orientations of the lighting devices are changed. Although it is possible to have a system automatically select a light scene based on the positions of the lighting devices, as disclosed in US 2018/0352635 A1, a user may still want to be able to use a certain light scene after he has changed the positions and/or orientations of the lighting devices. He would then need to modify this light scene to work well in the new arrangement of the lighting devices, which is especially cumbersome if the light scenes are dynamic light scenes for pixelated lighting devices.
US 2019/335560 A1 discloses a lighting device comprising an array of controllable light emitting pixels, each pixel having an adjustable light output colour. A controller is configured to receive a limited set of light output colours and to locally process these light output colours to form a colour gradient pattern to be displayed across the pixels of the array.
It is a first object of the invention to provide a system which is able to control pixelated lighting devices to render a dynamic light scene in a manner optimized for a current arrangement of the pixelated lighting devices.
It is a second object of the invention to provide a method which can be used to control pixelated lighting devices to render a dynamic light scene in a manner optimized for a current arrangement of the pixelated lighting devices.
In a first aspect of the invention, a system for controlling a first pixelated lighting device and a second pixelated lighting device based on a dynamic light scene, each of the first and second pixelated lighting devices comprising a plurality of individually controllable light segments, wherein an initial mapping has been determined from the dynamic light scene to the plurality of individually controllable light segments of the first pixelated lighting device, comprises at least one input interface, at least one transmitter, and at least one processor configured to control, via the at least one transmitter, the first pixelated lighting device and the second pixelated lighting device based on the dynamic light scene, the dynamic light scene comprising a plurality of light settings that move across the plurality of individually controllable light segments over time.
The at least one processor is further configured to obtain, via the at least one input interface, a position of the first pixelated lighting device relative to the second pixelated lighting device, determine a spatial offset for the initial mapping based on the position of the first pixelated lighting device relative to the second pixelated lighting device, determine a spatial direction of the dynamic light scene relative to the first pixelated lighting device based on the position of the first pixelated lighting device relative to the second pixelated lighting device, and control, via the at least one transmitter, the first pixelated lighting device to render the dynamic light scene according to an adjusted initial mapping, the initial mapping being adjusted by offsetting the initial mapping according to the spatial offset and the spatial direction.
By determining the spatial offset and spatial direction for the first pixelated lighting device based on the position of the first pixelated lighting device relative to the second pixelated lighting device, a dynamic light scene recalled by a user may be rendered in a manner optimized for the current position of the first pixelated lighting device relative to the second pixelated lighting device without the user having to modify the parameters of his recall action to reflect any change in this position. The at least one processor may be configured to determine the initial mapping or to control the first pixelated lighting device to determine this initial mapping, for example.
The dynamic light scene may be a gradient light effect that moves across the pixelated lighting devices over time. The spatial offset may be a shift of the mapping of the light settings onto the plurality of individually controllable light segments across the pixelated lighting device. The spatial direction may be a direction in which the dynamic light scene moves across the pixelated lighting device.
In a similar manner as described above, the position of the first pixelated lighting device relative to one or more further pixelated lighting devices may also be taken into account. In a similar manner as described above, an initial mapping for the second pixelated lighting device and/or one or more further pixelated lighting devices may also be adjusted.
The at least one processor may be configured to determine a transition speed and control the first pixelated lighting device to render the dynamic light scene according to a plurality of successive mappings, a usage duration of each of the successive mappings depending on the transition speed and the plurality of successive mappings including the adjusted initial mapping. The transition speed may be defined, for example, as the distance (e.g., the number of light segments) the light settings travel per time unit.
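By way of a non-limiting illustration only, the relation between the transition speed and the usage duration of each successive mapping could be sketched as follows; the function name, the unit of segments per second, and the one-segment step are assumptions made for this sketch and are not prescribed by the disclosure.

```python
# Non-limiting sketch: usage duration of each successive mapping, assuming the
# transition speed is expressed in light segments per second and each successive
# mapping shifts the light settings by one segment.

def usage_duration_seconds(transition_speed_segments_per_s: float,
                           segments_per_step: int = 1) -> float:
    """Return how long each mapping is held before the next one is applied."""
    if transition_speed_segments_per_s <= 0:
        raise ValueError("transition speed must be positive")
    return segments_per_step / transition_speed_segments_per_s

# Example: at 2 segments per second, each mapping is held for 0.5 s.
print(usage_duration_seconds(2.0))  # 0.5
```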
The at least one processor may be configured to determine an angle between the first and second pixelated lighting devices based on the position of the first pixelated lighting device relative to the second pixelated lighting device and determine the transition speed based on the angle. Alternatively or additionally, the at least one processor may be configured to determine a length of the first pixelated lighting device and determine the transition speed based on the length.
The at least one processor may be configured to determine a color and/or light intensity range within the dynamic light scene, the initial mapping being further adjusted to conform to the color and/or light intensity range. This may be used to ensure that light segments (of different pixelated lighting devices) which have a similar horizontal position render a similar light setting or that light segments (of different pixelated lighting devices) which have a similar vertical position render a similar light setting.
The at least one processor may be configured to determine an angle between the first and second pixelated lighting devices based on the position of the first pixelated lighting device relative to the second pixelated lighting device and determine the color and/or light intensity range based on the angle. Additionally or alternatively, the at least one processor may be configured to determine a length of the first pixelated lighting device and determine the range based on the length. For example, a shorter pixelated lighting device may render fewer colors of a color palette at a time than a longer pixelated lighting device.
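Purely as an illustrative sketch of the length-dependent range mentioned above, the number of palette colors shown simultaneously could be scaled with the device length; the scaling rule, the reference length, and the names below are assumptions made for this example.

```python
# Non-limiting sketch: scale the number of simultaneously rendered palette colors
# with the length of the device; the reference length and lower bound are assumptions.

def visible_palette_count(total_palette_size: int,
                          device_length_m: float,
                          reference_length_m: float = 2.0) -> int:
    """A shorter device shows a smaller portion of the palette at a time."""
    fraction = min(1.0, device_length_m / reference_length_m)
    return max(2, round(total_palette_size * fraction))

print(visible_palette_count(7, 1.0))  # a 1 m strip shows 4 of 7 colors here
print(visible_palette_count(7, 2.0))  # a 2 m strip shows all 7
```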
The at least one processor may be further configured to determine the spatial direction of the dynamic light scene relative to the first pixelated lighting device further based on a spatial direction of the dynamic light scene relative to the second pixelated lighting device as used by the second pixelated lighting device. This may be beneficial if the spatial direction is not the same for each pixelated lighting device, e.g., user configurable.
The position of the first pixelated lighting device relative to the second pixelated lighting device may be indicative of a relative distance between the first and second pixelated lighting devices, and the at least one processor may be configured to determine whether the relative distance between the first and second pixelated lighting devices exceeds a threshold, and determine the spatial offset and the spatial direction based on the position of the first pixelated lighting device relative to the second pixelated lighting device if it is determined that the relative distance between the first and second pixelated lighting devices does not exceed the threshold. For example, it may be desirable to use default behavior when two pixelated lighting devices are not close to each other. The at least one processor may be configured to allow a user to adjust the threshold.
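A minimal sketch of this distance check is given below; the threshold value and function names are assumptions, and the actual threshold may be user adjustable as stated above.

```python
# Non-limiting sketch: only adapt the spatial offset/direction when the two
# devices are close enough; the threshold value and names are assumptions.

DEFAULT_PROXIMITY_THRESHOLD_M = 1.5  # could be made user adjustable

def use_relative_position(relative_distance_m: float,
                          threshold_m: float = DEFAULT_PROXIMITY_THRESHOLD_M) -> bool:
    """Return True when the relative position should influence the mapping."""
    return relative_distance_m <= threshold_m

print(use_relative_position(0.8))  # True: adapt offset and direction
print(use_relative_position(4.0))  # False: fall back to default behavior
```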
The at least one processor may be configured to select a light segment from the plurality of individually controllable light segments of the first pixelated lighting device, the light segment being closest to the second pixelated lighting device, and determine the spatial offset based on the selected light segment.
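As a hedged illustration of this selection, the closest segment could be found by comparing segment positions with the position of the second device; the coordinate representation and the names below are assumptions made for this sketch.

```python
# Non-limiting sketch: pick the segment of the first device closest to the second
# device and use its index as the spatial offset; coordinates are assumptions.
from math import dist

def spatial_offset_from_closest_segment(segment_positions_xy, second_device_xy):
    """Return the index of the closest segment, usable as an offset into the scene."""
    distances = [dist(p, second_device_xy) for p in segment_positions_xy]
    return distances.index(min(distances))

# Example: a horizontal strip of five segments, second device near its right end.
segments = [(0.0, 0.0), (0.25, 0.0), (0.5, 0.0), (0.75, 0.0), (1.0, 0.0)]
print(spatial_offset_from_closest_segment(segments, (1.1, 0.2)))  # 4
```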
Successive mappings from the dynamic light scene to the pluralities of individually controllable light segments may be determined based on the initial mapping, the spatial offset, and the spatial direction. The at least one processor may be configured to determine these successive mappings or to control the pixelated lighting device to determine these successive mappings, for example.
In a second aspect of the invention, a method of controlling a first pixelated lighting device and a second pixelated lighting device based on a dynamic light scene, each of the first and second pixelated lighting devices comprising a plurality of individually controllable light segments, wherein an initial mapping has been determined from the dynamic light scene to the plurality of individually controllable light segments of the first pixelated lighting device, comprises obtaining a position of the first pixelated lighting device relative to the second pixelated lighting device and determining a spatial offset for the initial mapping based on the position of the first pixelated lighting device relative to the second pixelated lighting device.
The method further comprises determining a spatial direction of the dynamic light scene relative to the first pixelated lighting device based on the position of the first pixelated lighting device relative to the second pixelated lighting device and controlling the first pixelated lighting device and the second pixelated lighting device based on the dynamic light scene, the dynamic light scene comprising a plurality of light settings that move across the plurality of individually controllable light segments over time and the first pixelated lighting device being controlled to render the dynamic light scene according to an adjusted initial mapping, the initial mapping being adjusted by offsetting the initial mapping according to the spatial offset and the spatial direction. The method may be performed by software running on a programmable device. This software may be provided as a computer program product.
Moreover, a computer program for carrying out the methods described herein, as well as a non-transitory computer readable storage medium storing the computer program are provided. A computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
A non-transitory computer-readable storage medium stores at least one software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations for controlling a first pixelated lighting device and a second pixelated lighting device based on a dynamic light scene, each of the first and second pixelated lighting devices comprising a plurality of individually controllable light segments, wherein an initial mapping has been determined from the dynamic light scene to the plurality of individually controllable light segments of the first pixelated lighting device.
The executable operations comprise obtaining a position of the first pixelated lighting device relative to the second pixelated lighting device, determining a spatial offset for the initial mapping based on the position of the first pixelated lighting device relative to the second pixelated lighting device, determining a spatial direction of the dynamic light scene relative to the first pixelated lighting device based on the position of the first pixelated lighting device relative to the second pixelated lighting device and controlling the first pixelated lighting device and the second pixelated lighting device based on the dynamic light scene, the dynamic light scene comprising a plurality of light settings that move across the plurality of individually controllable light segments over time and the first pixelated lighting device being controlled to render the dynamic light scene according to an adjusted initial mapping, the initial mapping being adjusted by offsetting the initial mapping according to the spatial offset and the spatial direction.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a device, a method or a computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit”, “module” or “system.” Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer. Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a computer readable storage medium may include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of the present invention, a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java™, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the present invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
These and other aspects of the invention are apparent from and will be further elucidated, by way of example, with reference to the drawings, in which:
Corresponding elements in the drawings are denoted by the same reference numeral.
In this first embodiment, the system is a bridge 1.
The bridge 1 and the light strips 10 and 20 can communicate wirelessly, e.g., via Zigbee. The bridge 1 is connected to a wireless LAN access point 31, e.g., via Ethernet or Wi-Fi. A mobile phone 33 is also able to connect to the wireless LAN access point 31, e.g., via Wi-Fi. The mobile phone 33 can be used to control the light strips 10 and 20 via the wireless LAN access point 31 and the bridge 1, e.g., to turn the light segments of the light strips on or off or to change their light settings.
The bridge 1 comprises a receiver 3, a transmitter 4, a processor 5, and a memory 7. The processor 5 is configured to control, via the transmitter 4, the first light strip 10 and the second light strip 20 based on the dynamic light scene. The dynamic light scene comprises a plurality of light settings that move across the plurality of individually controllable light segments over time. A user of the mobile device 33 may have recalled the dynamic light scene, for example. The dynamic light scene may be obtained from the mobile device 33 or from memory 7, for example.
The processor 5 is further configured to obtain, via the receiver 3, a position of the first light strip 10 relative to the second light strip 20, determine a spatial offset for the initial mapping based on the position of the first light strip 10 relative to the second light strip 20, determine a spatial direction of the dynamic light scene relative to the first light strip 10 based on the position of the first light strip 10 relative to the second light strip 20, and control, via the transmitter 4, the first light strip 10 to render the dynamic light scene according to an adjusted initial mapping.
The initial mapping is adjusted by offsetting the initial mapping according to the spatial offset and the spatial direction. The processor 5 may be configured to control the second light strip 20 to render the dynamic light scene according to a default mapping of the dynamic light scene to the plurality of individually controllable light segments of the second pixelated lighting device, for example.
The initial mapping and later mappings for the light strip 10 may be determined and adjusted by the bridge 1 or by the light strip 10 itself, for example. In the latter implementation, the processor 5 may be configured to transmit a command to a light strip that includes a color palette (e.g., a list of colors), a window size (e.g., a number larger than one), a spatial direction (e.g., left to right or right to left), a mode (e.g., normal or symmetrical), a transition speed, and a spatial offset, for example. If the mode is symmetrical, the spatial direction indicates whether the light effect moves toward or away from the center of the light strip.
All colors of the color palette may be specified in the command or only a subset of the colors in the color palette may be specified in the command. As an example of the latter, three to five colors of a color gradient may be specified and other colors in the color palette may be interpolated from these three to five colors. The window size may indicate how many colors should be shown simultaneously. The window size is smaller than the number of colors in the color palette. Color settings are used in the above example, but other light settings, e.g., brightness, may additionally or alternatively be used.
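Purely for illustration, a possible shape of such a command and a simple interpolation of intermediate palette colors is sketched below; the field names, units, and the linear RGB interpolation are assumptions made for this example and do not reflect an actual device protocol.

```python
# Non-limiting sketch: a possible shape for the command described above and a
# simple linear interpolation of intermediate palette colors. Field names and the
# interpolation scheme are assumptions, not an actual device protocol.

def interpolate_palette(key_colors, total_colors):
    """Linearly interpolate `total_colors` RGB colors from a few key colors."""
    palette = []
    spans = len(key_colors) - 1
    for i in range(total_colors):
        t = i * spans / (total_colors - 1)
        a, b = int(t), min(int(t) + 1, spans)
        f = t - int(t)
        c0, c1 = key_colors[a], key_colors[b]
        palette.append(tuple(round(c0[k] + f * (c1[k] - c0[k])) for k in range(3)))
    return palette

command = {
    "palette": interpolate_palette([(255, 0, 0), (255, 180, 0), (0, 0, 255)], 9),
    "window_size": 4,            # how many palette colors are shown at once
    "direction": "left_to_right",
    "mode": "normal",            # or "symmetrical"
    "transition_speed": 2.0,     # segments per second (assumed unit)
    "spatial_offset": 3,         # shift of the mapping, in segments
}
print(command["palette"])
```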
As described above, the processor 5 is configured to determine the spatial direction and spatial offset for the light strip 10 based on the position of the first light strip 10 relative to the second light strip 20.
Typically, the bridge 1 holds the location and orientation of the lighting devices relative to each other in the room and when an accessory or application recalls a light scene, the bridge 1 knows which lighting devices participate in this light scene. When a scene is recalled, instead of using the defaults, the bridge 1 may decide to modify the spatial offset and/or spatial direction of the light scene based on the relative position(s). For example, if two pixelated lighting devices are close to each other and the bridge 1 knows which corners are closest based on the locations and orientations, the bridge 1 may modify the spatial offset and/or spatial direction for at least one of the pixelated lighting devices to make the renderings of the dynamic light scenes match.
The processor 5 may be configured to obtain the position of the first light strip 10 relative to the second light strip 20 from a signal indicative of user input (e.g., entered on mobile device 33), from one or more camera images (e.g., captured with mobile device 33) or from received information which has been determined from one or more camera images, and/or from one or more signals received from the light strips 10 and 20, for example. As an example of the latter, the processor 5 may be configured to receive signals indicative of the orientations of the light strips 10 and 20 from the light strips 10 and 20 and determine further position information from the received signal strength of the received signals by using triangulation. Light strips 10 and 20 may report received signal strengths of signals received from each other as well. Additionally or alternatively, other RF beacons may be used, for example.
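As a hedged sketch of the signal-strength-based part of this position determination, a distance could be estimated from a received signal strength with the standard log-distance path-loss model; the constants below are assumptions, and the original text only mentions signal strengths and triangulation in general terms.

```python
# Non-limiting sketch: estimate an inter-device distance from a received signal
# strength using the log-distance path-loss model. The reference power and
# path-loss exponent are assumed values for illustration only.

def rssi_to_distance_m(rssi_dbm: float,
                       tx_power_at_1m_dbm: float = -50.0,
                       path_loss_exponent: float = 2.5) -> float:
    """Rough distance estimate in metres from an RSSI measurement."""
    return 10 ** ((tx_power_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))

print(round(rssi_to_distance_m(-65.0), 2))  # ~3.98 m with the assumed constants
```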
In the embodiment of the bridge 1 shown in
The receiver 3 and the transmitter 4 may use one or more wired or wireless communication technologies, e.g., Ethernet for communicating with the wireless LAN access point 31 and Zigbee for communicating with the light strips 10 and 20, for example. In an alternative embodiment, multiple receivers and/or multiple transmitters are used instead of a single receiver and a single transmitter. In the embodiment shown in
In this second embodiment, the system is a mobile device 51.
The light strips 10 and 20 depicted in
The mobile device 51 comprises a receiver 53, a transmitter 54, a processor 55, memory 57, and a touchscreen display 59. The processor 55 is configured to control, via the transmitter 54, the first light strip 10 and the second light strip 20 based on the dynamic light scene. The dynamic light scene comprises a plurality of light settings that move across the plurality of individually controllable light segments over time.
The processor 55 is further configured to obtain, via the receiver 53, a position of the first light strip 10 relative to the second light strip 20, determine a spatial offset for the initial mapping based on the position of the first light strip 10 relative to the second light strip 20, determine a spatial direction of the dynamic light scene relative to the first light strip 10 based on the position of the first light strip 10 relative to the second light strip 20, and control, via the transmitter 54, the first light strip 10 to render the dynamic light scene according to an adjusted initial mapping. The initial mapping is adjusted by offsetting the initial mapping according to the spatial offset and the spatial direction.
In the embodiment of the mobile device 51 shown in
The receiver 53 and the transmitter 54 may use one or more wireless communication technologies, e.g., Bluetooth, for communicating with the light strips 10 and 20. In an alternative embodiment, multiple receivers and/or multiple transmitters are used instead of a single receiver and a single transmitter. In the embodiment shown in
In the embodiments of
A first embodiment of the method of controlling a first pixelated lighting device and a second pixelated lighting device based on a dynamic light scene is shown in
A step 101 comprises obtaining a dynamic light scene. The dynamic light scene comprises a plurality of light settings that move across the plurality of individually controllable light segments over time. Next, a step 103 comprises obtaining a position of the first pixelated lighting device relative to the second pixelated lighting device. In the embodiment of
Steps 105 and 107 are performed after step 103. Step 105 comprises determining a spatial offset for the initial mapping based on the position of the first pixelated lighting device relative to the second pixelated lighting device, as obtained in step 103. Step 107 comprises determining a spatial direction of the dynamic light scene relative to the first pixelated lighting device based on the position of the first pixelated lighting device relative to the second pixelated lighting device, as obtained in step 103.
Next, a step 109 comprises controlling the first pixelated lighting device to render the dynamic light scene obtained in step 101 according to an adjusted initial mapping. The initial mapping is adjusted by offsetting the initial mapping according to the spatial offset and the spatial direction, as determined in steps 105 and 107. A step 111 is also performed after step 101. Step 111 comprises controlling the second pixelated lighting device based on the dynamic light scene obtained in step 101.
A second embodiment of the method of controlling a first pixelated lighting device and a second pixelated lighting device based on a dynamic light scene is shown in
In the embodiment of
Further (successive) mappings from the dynamic light scene to the pluralities of individually controllable light segments are determined in a similar manner in step 131. The further (successive) mappings are determined based on the initial mapping, the spatial offset, and the spatial direction. In step 135, mappings may be determined for the second pixelated lighting device in a conventional manner, for example.
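By way of a non-limiting sketch, such successive mappings could be derived by repeatedly shifting the offset-adjusted initial mapping along the determined spatial direction; the cyclic shifting and the names used below are assumptions made for this example.

```python
# Non-limiting sketch: derive successive mappings by shifting an initial mapping of
# palette colors onto segments along the spatial direction. Names and the cyclic
# shift behaviour are assumptions for illustration.

def shifted_mapping(initial_mapping, step, direction):
    """Cyclically shift the segment-to-color mapping by `step` segments."""
    n = len(initial_mapping)
    sign = 1 if direction == "left_to_right" else -1
    return [initial_mapping[(i - sign * step) % n] for i in range(n)]

def successive_mappings(initial_mapping, spatial_offset, direction, steps):
    """Start from the offset-adjusted initial mapping and keep shifting it."""
    current = shifted_mapping(initial_mapping, spatial_offset, direction)
    result = [current]
    for _ in range(steps - 1):
        current = shifted_mapping(current, 1, direction)
        result.append(current)
    return result

base = ["red", "orange", "yellow", "blue"]  # one color per segment
for m in successive_mappings(base, spatial_offset=1, direction="left_to_right", steps=3):
    print(m)
```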
In step 133, the first pixelated lighting device is controlled to render the dynamic light scene according to adjusted mappings determined in step 131. In step 137, the second pixelated lighting device is controlled to render the dynamic light scene according to mappings determined in step 135. In the embodiment of
In the alternative embodiment mentioned above, information specifying the spatial offset and the spatial direction is transmitted to the first pixelated lighting device and the first pixelated lighting device is controlled to determine adjusted mappings by offsetting an initial mapping according to the spatial offset and the spatial direction before rendering the dynamic light scene according to the adjusted mappings. The second pixelated lighting device may be controlled with the same format of control commands but may not need to determine adjusted mappings.
A third embodiment of the method of controlling a first pixelated lighting device and a second pixelated lighting device based on a dynamic light scene is shown in
Optionally, the method comprises an additional step of determining an angle between the first and second pixelated lighting devices based on the position of the first pixelated lighting device relative to the second pixelated lighting device and the transition speed is determined based on this angle in step 151. Optionally, the method comprises an additional step of determining a length of the first pixelated lighting device and the transition speed is determined based on this length in step 151. The transition speed may be determined based on both this angle and this length, for example.
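A hedged sketch of combining the angle and the length into a transition speed is given below; the specific formula and constants are assumptions made for this example only and are not prescribed by the method.

```python
# Non-limiting sketch: one way to derive a transition speed from the angle between
# the two devices and the length of the first device. Formula and constants are
# assumptions for illustration.
import math

def transition_speed_segments_per_s(angle_deg: float,
                                    length_m: float,
                                    base_speed: float = 2.0) -> float:
    """Slow the effect down as the devices become more perpendicular, and speed it
    up on longer devices so a full traversal takes a similar amount of time."""
    angle_factor = max(0.25, abs(math.cos(math.radians(angle_deg))))
    length_factor = max(0.5, length_m)
    return base_speed * angle_factor * length_factor

print(round(transition_speed_segments_per_s(angle_deg=45.0, length_m=2.0), 2))
```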
Step 153 comprises controlling the first pixelated lighting device to render the dynamic light scene according to a plurality of successive mappings. A usage duration of each of the successive mappings depends on the transition speed determined in step 151. When the embodiment of
A fourth embodiment of the method of controlling a first pixelated lighting device and a second pixelated lighting device based on a dynamic light scene is shown in
Optionally, the method comprises an additional step of determining an angle between the first and second pixelated lighting devices based on the position of the first pixelated lighting device relative to the second pixelated lighting device and the color and/or light intensity range is determined based on this angle in step 171. Optionally, the method comprises an additional step of determining a length of the first pixelated lighting device and the color and/or light intensity range is determined based on this length in step 171. The color and/or light intensity range may be determined based on both this angle and this length, for example.
Step 173 comprises controlling the first pixelated lighting device to render the dynamic light scene according to an adjusted initial mapping. The initial mapping is adjusted by offsetting the initial mapping according to the spatial offset and the spatial direction and further adjusted to conform to the color and/or light intensity range determined in step 171.
A fifth embodiment of the method of controlling a first pixelated lighting device and a second pixelated lighting device based on a dynamic light scene is shown in
A step 195 is performed after step 101. Step 195 comprises determining a spatial direction of the dynamic light scene relative to the second pixelated lighting device. This spatial direction may be a user-configurable setting, for example. Next, a step 196 comprises controlling the second pixelated lighting device based on the dynamic light scene. If necessary, the initial mapping for the second pixelated lighting device is adjusted by offsetting the initial mapping according to the spatial direction determined in step 195.
Step 103 is also performed after step 101. Step 103 comprises obtaining a position of the first pixelated lighting device relative to the second pixelated lighting device. In the embodiment of
A step 193 comprises determining whether the relative distance between the first and second pixelated lighting devices, as determined in step 191, exceeds a threshold. This threshold may be user configurable. Steps 105 and 107 are performed if it is determined in step 193 that the relative distance between the first and second pixelated lighting devices does not exceed the threshold. Otherwise, a step 198 is performed.
Step 105 comprises determining a spatial offset for the initial mapping based on the position of the first pixelated lighting device relative to the second pixelated lighting device, as obtained in step 103. Step 107 is implemented by a step 197. Step 197 comprises determining a spatial direction of the dynamic light scene relative to the first pixelated lighting device based on the position of the first pixelated lighting device relative to the second pixelated lighting device, as obtained in step 103, and further based on the spatial direction of the dynamic light scene relative to the second pixelated lighting device, as determined in step 195.
Step 109 comprises controlling the first pixelated lighting device to render the dynamic light scene according to an adjusted initial mapping. The initial mapping is adjusted by offsetting the initial mapping according to the spatial offset determined in step 105 and the spatial direction determined in step 197.
Step 198 comprises determining a spatial direction of the dynamic light scene relative to the first pixelated lighting device in a different manner, e.g., in a conventional manner. This spatial direction may be a user-configurable setting, for example. Next, a step 199 comprises controlling the first pixelated lighting device based on the dynamic light scene. If necessary, the initial mapping for the first pixelated lighting device is adjusted by offsetting the initial mapping according to the spatial direction determined in step 198.
A sixth embodiment of the method of controlling a first pixelated lighting device and a second pixelated lighting device based on a dynamic light scene is shown in
Step 195 is performed after step 101. Step 195 comprises determining a spatial direction of the dynamic light scene relative to the second pixelated lighting device. This spatial direction may be a user-configurable setting, for example. Next, step 196 comprises controlling the second pixelated lighting device based on the dynamic light scene. If necessary, the initial mapping for the second pixelated lighting device is adjusted by offsetting the initial mapping according to the spatial direction determined in step 195.
Step 103 is also performed after step 101. Step 103 comprises obtaining a position of the first pixelated lighting device relative to the second pixelated lighting device. Steps 105 and 107 are performed after step 103.
Step 105 is implemented by steps 211 and 213. Step 211 comprises selecting a light segment from the plurality of individually controllable light segments of the first pixelated lighting device based on the position obtained in step 103. Specifically, the light segment closest to the second pixelated lighting device is selected in step 211. A step 213 is performed after step 211. Step 213 comprises determining the spatial offset based on the light segment selected in step 211.
Step 107 is implemented by step 197. Step 197 comprises determining a spatial direction of the dynamic light scene relative to the first pixelated lighting device based on the position of the first pixelated lighting device relative to the second pixelated lighting device, as obtained in step 103, and further based on the spatial direction of the dynamic light scene relative to the second pixelated lighting device, as determined in step 195.
Step 109 comprises controlling the first pixelated lighting device to render the dynamic light scene according to an adjusted initial mapping. The initial mapping is adjusted by offsetting the initial mapping according to the spatial offset determined in step 105 and the spatial direction determined in step 197.
In the example of
In the examples of
The color and/or light intensity ranges may be determined based on the angle between light strips 10 and 20 and/or based on the lengths of the light strips, for example. In the example of
As described in relation to
An example of a distance d between light strips 10 and 20 is shown in
The examples of
For example, it may be possible to let the pixelated lighting devices render the dynamic light scene in a symmetrical mode. This is illustrated with the help of
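As an additional non-limiting sketch of such a symmetrical mode (not replacing the illustration referred to above), one half of the strip could be mapped and then mirrored around the strip center, with the step direction deciding whether the effect moves toward or away from the center; the names below are assumptions made for this example.

```python
# Non-limiting sketch: in a symmetrical mode, one half of the strip is mapped and
# mirrored around the center; increasing `step` moves the effect toward the center
# (or away from it when toward_center is False). Names are assumptions.

def symmetrical_mapping(half_mapping, step, toward_center=True):
    """Shift one half by `step` segments, then mirror it around the strip center."""
    n = len(half_mapping)
    sign = 1 if toward_center else -1
    shifted = [half_mapping[(i - sign * step) % n] for i in range(n)]
    return shifted + list(reversed(shifted))

print(symmetrical_mapping(["red", "orange", "yellow"], step=0))
# ['red', 'orange', 'yellow', 'yellow', 'orange', 'red']
print(symmetrical_mapping(["red", "orange", "yellow"], step=1))
# ['yellow', 'red', 'orange', 'orange', 'red', 'yellow']
```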
In order to match spatial offsets and spatial directions, a graph may be constructed. This graph may be constructed by the bridge 1 of
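One plausible, non-limiting reading of such a graph is sketched below: each pixelated lighting device is a node, nearby devices are connected by edges, and a traversal assigns consistent segment offsets along the chain. This structure and traversal are assumptions, as the text does not detail the graph.

```python
# Non-limiting sketch: devices as graph nodes, proximity as edges; a breadth-first
# walk accumulates segment offsets so adjoining devices continue the same scene.
# The graph structure and traversal are assumptions for illustration only.
from collections import deque

def assign_offsets(adjacency, segment_counts, start):
    """Walk over connected devices, accumulating segment offsets along the chain."""
    offsets = {start: 0}
    queue = deque([start])
    while queue:
        device = queue.popleft()
        for neighbour in adjacency.get(device, []):
            if neighbour not in offsets:
                offsets[neighbour] = offsets[device] + segment_counts[device]
                queue.append(neighbour)
    return offsets

adjacency = {"strip10": ["strip20"], "strip20": ["strip10"]}
print(assign_offsets(adjacency, {"strip10": 8, "strip20": 8}, "strip10"))
# {'strip10': 0, 'strip20': 8}
```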
The embodiments of
As shown in
The memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310. The local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the quantity of times program code must be retrieved from the bulk storage device 310 during execution. The processing system 300 may also be able to use memory elements of another processing system, e.g. if the processing system 300 is part of a cloud-computing platform.
Input/output (I/O) devices depicted as an input device 312 and an output device 314 optionally can be coupled to the data processing system. Examples of input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, a microphone (e.g. for voice and/or speech recognition), or the like. Examples of output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
In an embodiment, the input and the output devices may be implemented as a combined input/output device (illustrated in
A network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.
As pictured in
Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein). In one embodiment, the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal. In another embodiment, the program(s) can be contained on a variety of transitory computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored. The computer program may be run on the processor 302 described herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of embodiments of the present invention has been presented for purposes of illustration, but is not intended to be exhaustive or limited to the implementations in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope of the present invention. The embodiments were chosen and described in order to best explain the principles and some practical applications of the present invention, and to enable others of ordinary skill in the art to understand the present invention for various embodiments with various modifications as are suited to the particular use contemplated.
Priority application: EP 21199335.7, filed September 2021.
International application: PCT/EP2022/075886 (WO), filed September 19, 2022.