This application is the U.S. National Phase application under 35 U.S.C. § 371 of International Application No. PCT/EP2019/054207, filed on Feb. 20, 2019, which claims the benefit of European Patent Application No. 18158854.2, filed on Feb. 27, 2018. These applications are hereby incorporated by reference herein.
The invention relates to an electronic device for rendering a dynamic light scene.
The invention further relates to a method of rendering a dynamic light scene.
The invention also relates to a computer program product enabling a computer system to perform such a method.
An application called Hue Sync, offered by Philips Lighting, enables a PC to render a dynamic light scene based on the images displayed on the PC's display, using lights that are part of the Philips Hue system. These dynamic light scenes are rendered in real time, but not all dynamic light scenes need to be rendered in real time. For example, dynamic light scenes may be rendered based on pre-defined light scripts, e.g. a light script labelled “sunrise”.
Solutions exist where a user can simply input his preferences with regard to dynamic light rendering, such as a “dynamics slider” in an app that enables the user to tune dynamics from mild to vivid or select a “mode” such as ‘party’ mode or ‘chillout’ mode. A drawback of this approach is that the user first needs to find those configuration options in the app, which is cumbersome. A user would prefer to quickly start the dynamic light scene and focus on the experience without having to dive into the configuration. Finding configuration options is even less desirable if the dynamic light scene starts automatically together with entertainment content.
It is a first object of the invention to provide a method, which renders a dynamic light scene according to a user's preferences without requiring the user to configure preferences for dynamic light scene rendering.
It is a second object of the invention to provide an electronic device, which is able to render a dynamic light scene according to a user's preferences without requiring the user to configure preferences for dynamic light scene rendering.
In a first aspect of the invention, the electronic device comprises at least one processor configured to identify a dynamic light scene to be rendered, determine one or more current, previous and/or planned light settings for one or more lights, determine a target dynamic light scene based on said identified dynamic light scene and said one or more light settings, and render said target dynamic light scene on at least one light. Thus, the target dynamic light scene resembles the one or more light settings more closely than the identified dynamic light scene does. Identifying the light scene may comprise receiving the light scene itself or receiving an identifier that allows the light scene to be retrieved, for example. A light is typically a light source, light node or lighting device which can be addressed and controlled individually. A scene is typically a set of light settings for a plurality of individually controllable lights.
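Purely by way of illustration, this flow may be sketched in Python as follows; all names (identify_scene, determine_settings, determine_target, render) and the stubbed logic are hypothetical assumptions, not part of the claimed device:

```python
from typing import Dict, List, Tuple

Color = Tuple[float, float, float]  # RGB components in [0, 1]

def identify_scene(command: Dict) -> List[List[Color]]:
    """Resolve a command into scene frames: one list of colors per time step."""
    return command["frames"]

def determine_settings(lights: List[int]) -> Dict:
    """Collect current/previous/planned settings for the lights (stubbed)."""
    return {"level": 0.4, "color": (1.0, 0.7, 0.3)}  # a dim, warm setting

def determine_target(frames: List[List[Color]], settings: Dict) -> List[List[Color]]:
    """Scale every frame toward the level of the determined setting."""
    gain = 0.5 + 0.5 * settings["level"]  # never fully dark; an assumption
    return [[(r * gain, g * gain, b * gain) for (r, g, b) in frame]
            for frame in frames]

def render(frames: List[List[Color]], lights: List[int]) -> None:
    """Transmit one command per light per frame (printed here)."""
    for frame in frames:
        for light, color in zip(lights, frame):
            print(f"light {light} -> {color}")

command = {"frames": [[(1.0, 0.0, 0.0), (0.0, 0.0, 1.0)]]}
lights = [22, 23]
render(determine_target(identify_scene(command), determine_settings(lights)), lights)
```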
The inventors have recognized that current, previous and planned light settings provide an indication of a user's preferences for rendering dynamic light scenes and that by taking into account these current, previous and planned light settings when rendering a dynamic light scene, it is in many cases not necessary for the user to configure his preferences for dynamic light scene rendering.
Said one or more light settings may comprise at least one of: light level (i.e. intensity), color, light distribution, beam width, number of active lights, and number of individual light beams and/or may identify at least one of: light scene which set or will set said light level and/or said color, routine which activated or will activate said light scene, and source from which said light level and/or said color have been derived, for example. For instance, the light settings may be intensity or color and the target dynamic light scene may have an (average) intensity or color palette which is closer to the light settings than the identified dynamic light scene has.
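As an illustration of what such a setting may carry, a hypothetical record type is sketched below; the field names and types are assumptions for readability only:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class LightSetting:
    """One current, previous or planned setting for a light."""
    level: float                              # intensity in [0, 1]
    color: Tuple[float, float, float]         # RGB color
    beam_width_deg: Optional[float] = None    # light distribution / beam width
    active_lights: Optional[int] = None       # number of active lights
    scene_id: Optional[str] = None            # scene that set / will set the state
    routine_id: Optional[str] = None          # routine that activated / will activate it
    source: Optional[str] = None              # e.g. image or song the color came from

evening = LightSetting(level=0.3, color=(1.0, 0.7, 0.3), scene_id="relax")
print(evening)
```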
Said one or more lights may comprise said at least one light and/or comprise at least one further light located in proximity of said at least one light. This is beneficial, because light settings are often location dependent, e.g. depend on the ambient light level and/or the colors of nearby walls, carpets and/or furniture.
Said at least one processor may be configured to obtain said identified dynamic light scene and determine said target dynamic light scene by adjusting said obtained dynamic light scene based on said one or more light settings. By having the at least one processor adjust the obtained dynamic light scene, an author of a scripted dynamic light scene does not need to spend effort on authoring a group/plurality of dynamic light scenes. Adjusting the obtained dynamic light scene also works well for dynamic light scenes determined in real-time, e.g. based on entertainment content.
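A minimal sketch of such an adjustment, assuming the obtained scene is a list of RGB colors and the pull strength is a free parameter:

```python
def blend(c1, c2, w):
    """Linearly pull RGB color c1 toward c2 by weight w in [0, 1]."""
    return tuple((1 - w) * a + w * b for a, b in zip(c1, c2))

def adjust_scene(scene_colors, setting_color, strength=0.3):
    """Pull every color of the obtained dynamic scene toward the setting."""
    return [blend(c, setting_color, strength) for c in scene_colors]

# A red/blue scene adjusted while a warm white setting is active:
print(adjust_scene([(1.0, 0.0, 0.0), (0.0, 0.0, 1.0)], (1.0, 0.8, 0.6)))
```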
Said at least one processor may be configured to determine said target dynamic light scene by selecting a dynamic light scene from a group of dynamic light scenes based on said identified dynamic light scene and said one or more light settings. This allows an author of a scripted dynamic light scene to keep control of how his scripted dynamic light scene is rendered (at the cost of having to spend more effort). For example, he may author a group of three dynamic light scenes: one in which red is the dominant color, one in which green is the dominant color and one in which blue is the dominant color. In this case, obtaining the identified light scene is not required.
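One way such a selection could work is to match the hue of each authored variant's dominant color against the current setting; the hue metric and the data structure below are assumptions:

```python
import colorsys

def hue_distance(c1, c2):
    """Distance between two RGB colors on the circular hue axis."""
    h1 = colorsys.rgb_to_hsv(*c1)[0]
    h2 = colorsys.rgb_to_hsv(*c2)[0]
    d = abs(h1 - h2)
    return min(d, 1.0 - d)

def select_scene(group, setting_color):
    """Pick the authored scene whose dominant color is closest in hue."""
    return min(group, key=lambda s: hue_distance(s["dominant_color"], setting_color))

group = [{"name": "red variant",   "dominant_color": (1.0, 0.0, 0.0)},
         {"name": "green variant", "dominant_color": (0.0, 1.0, 0.0)},
         {"name": "blue variant",  "dominant_color": (0.0, 0.0, 1.0)}]
print(select_scene(group, (0.2, 0.3, 0.9))["name"])  # -> blue variant
```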
Said at least one processor may be configured to determine said target dynamic light scene based on how recent said one or more lights were set to said current or previous light setting. The more recent the one or more lights were set to the current or previous light setting, the more likely the current or previous light setting reflects the user's current preferences. For example, the strength of an adjustment to the obtained dynamic light scene may be based on how recent the one or more lights were set to the current or previous light setting.
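For instance, the strength of the pull toward a setting may decay with the age of that setting; this is a sketch, and the half-life value is an arbitrary assumption:

```python
def adjustment_strength(minutes_since_set, max_strength=0.5, half_life_min=60.0):
    """Halve the pull toward a setting for every half-life that has passed
    since the lights were set to it."""
    return max_strength * 0.5 ** (minutes_since_set / half_life_min)

print(round(adjustment_strength(5), 3))    # set 5 minutes ago -> strong pull
print(round(adjustment_strength(720), 3))  # set 12 hours ago -> negligible
```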
Said at least one processor may be configured to determine a light level for said target dynamic light scene based on one or more current, previous and/or planned light levels for said one or more lights. A light level setting is expected to be a good indicator of a preferred light level for a dynamic light scene.
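One possible way to combine the available levels is a weighted average; the weights below are chosen purely as an assumption:

```python
def target_level(current=None, previous=None, planned=None):
    """Weighted average of the available light levels; the current setting
    counts most, a planned one next, a previous one least."""
    candidates = [(current, 0.6), (planned, 0.3), (previous, 0.1)]
    pairs = [(v, w) for v, w in candidates if v is not None]
    total = sum(w for _, w in pairs)
    return sum(v * w for v, w in pairs) / total

print(target_level(current=0.3, planned=0.1))  # dim now, dimmer planned -> ~0.23
```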
Said at least one processor may be configured to determine which colors will be dominant in said target dynamic light scene based on one or more current, previous and/or planned dominant colors and/or one or more current, previous and/or planned light levels for said one or more lights. Dominant colors and light levels are expected to be good indicators of preferred dominant colors for a dynamic light scene.
Said at least one processor may be configured to increase the intensity at which said one or more current, previous and/or planned dominant colors will be rendered as part of said target dynamic light scene compared to said identified dynamic light scene and/or increase the time period in which said one or more current, previous and/or planned dominant colors will be rendered as part of said target dynamic light scene compared to said identified dynamic light scene. By increasing the intensity and/or time period at/in which certain colors (the colors that are dominant in one or more light settings) are to be rendered, these colors become more dominant in the target dynamic scene.
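A sketch of both mechanisms on a palette of (color, intensity, duration) entries; the hue threshold and gain factor are assumptions:

```python
import colorsys

def hue_distance(c1, c2):
    h1, h2 = (colorsys.rgb_to_hsv(*c)[0] for c in (c1, c2))
    d = abs(h1 - h2)
    return min(d, 1.0 - d)

def boost_dominant(palette, dominant_color, gain=1.5):
    """Render palette entries close in hue to the setting's dominant color
    brighter (intensity) and longer (duration) than the identified scene does."""
    boosted = []
    for color, intensity, duration_ms in palette:
        if hue_distance(color, dominant_color) < 0.1:  # within ~36 degrees
            intensity = min(1.0, intensity * gain)
            duration_ms = int(duration_ms * gain)
        boosted.append((color, intensity, duration_ms))
    return boosted

palette = [((1.0, 0.1, 0.1), 0.5, 2000), ((0.1, 0.1, 1.0), 0.5, 2000)]
print(boost_dominant(palette, (1.0, 0.2, 0.2)))  # the reddish entry is boosted
```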
Said at least one processor may be configured to determine a color palette to be used in said target dynamic light scene based on one or more current, previous and/or planned colors and/or one or more current, previous and/or planned light levels for said one or more lights. Color and light level settings are expected to be good indicators of a preferred color palette for a dynamic light scene.
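By way of example, a palette can be built from analogous hues around the color of the setting; the palette size and hue spread are arbitrary assumptions:

```python
import colorsys

def palette_from_setting(setting_color, size=5, spread=0.08):
    """Analogous hues around the setting's color, keeping its saturation
    and value, so the target scene stays close to what the user chose."""
    h, s, v = colorsys.rgb_to_hsv(*setting_color)
    offsets = [spread * (i - size // 2) for i in range(size)]
    return [colorsys.hsv_to_rgb((h + o) % 1.0, s, v) for o in offsets]

for rgb in palette_from_setting((1.0, 0.6, 0.2)):  # a warm orange setting
    print(tuple(round(c, 2) for c in rgb))
```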
Said at least one processor may be configured to determine a dynamic vividness for said target dynamic light scene based on a static vividness derived from said one or more light settings. A derived static vividness is expected to be a good indicator of a preferred dynamic vividness for a dynamic light scene.
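A sketch under the assumption that static vividness is read off as the mean saturation of the settings, and that vividness maps linearly to transition speed and brightness range:

```python
import colorsys

def static_vividness(setting_colors):
    """Mean saturation of the determined settings as a vividness proxy."""
    sats = [colorsys.rgb_to_hsv(*c)[1] for c in setting_colors]
    return sum(sats) / len(sats)

def dynamics_from_vividness(vividness):
    """Saturated settings -> fast transitions and a wide brightness range;
    muted settings -> slow transitions and a narrow range."""
    return {"transition_ms": int(2000 - 1800 * vividness),
            "brightness_range": 0.2 + 0.7 * vividness}

print(dynamics_from_vividness(static_vividness([(1.0, 0.2, 0.2), (0.9, 0.4, 0.1)])))
```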
Said at least one processor may be configured to determine a mood from said one or more light settings and/or from source data from which said one or more light settings have been derived and to determine said target dynamic light scene based on said determined mood. For example, if a light setting has been created based on an image (i.e. derived from the image data), this image may be analyzed and a mood may be selected from a plurality of predefined moods based on this analysis. Each of these predefined moods may be associated with an adjustment to an obtained identified dynamic light scene. Mood (e.g. happy or sad) is expected to be a good indicator of preferred colors or transitions for a dynamic light scene.
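A sketch of such an analysis on the source image, with pixels as RGB tuples in [0, 1]; the thresholds and mood labels are assumptions:

```python
def mood_from_image(pixels):
    """Estimate a mood from average brightness and colorfulness of the
    image a light setting was derived from."""
    n = len(pixels)
    brightness = sum(max(p) for p in pixels) / n
    colorfulness = sum(max(p) - min(p) for p in pixels) / n
    if brightness > 0.6 and colorfulness > 0.3:
        return "happy"   # e.g. livelier dynamics, saturated palette
    if brightness < 0.3:
        return "somber"  # e.g. slow transitions, dim warm palette
    return "calm"

print(mood_from_image([(0.9, 0.7, 0.2), (0.8, 0.3, 0.6)]))  # -> happy
```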
Said at least one light may comprise a plurality of lights and said at least one processor may be configured to map roles defined in said target dynamic light scene to said plurality of lights based on said determined light settings. If the multiple lights are to have different roles, multiple mappings are often possible. As an example of multiple lights having different roles, certain lights may be given the role of reacting to prominent sounds/beats in entertainment content, whereas other lights may be given the role of rendering functional white light. By performing the mapping automatically based on the determined light settings, a user does not need to map roles to lights manually.
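A sketch of one possible automatic mapping, under the assumption that lights with bright, saturated settings are better candidates for the attention-grabbing role:

```python
def map_roles(lights, n_effect):
    """Give the n_effect lights with the brightest, most saturated settings
    the 'effect' role; the remaining lights render functional white."""
    def score(light):
        r, g, b = light["color"]
        return light["level"] * (max(r, g, b) - min(r, g, b))
    ranked = sorted(lights, key=score, reverse=True)
    return {l["id"]: ("effect" if i < n_effect else "functional_white")
            for i, l in enumerate(ranked)}

lights = [{"id": 21, "level": 0.9, "color": (1.0, 0.2, 0.2)},
          {"id": 22, "level": 0.5, "color": (1.0, 1.0, 0.9)},
          {"id": 23, "level": 0.8, "color": (0.2, 0.2, 1.0)}]
print(map_roles(lights, n_effect=2))  # lights 21 and 23 react, 22 stays white
```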
In a second aspect of the invention, the method of rendering a dynamic light scene comprises identifying a dynamic light scene to be rendered, determining one or more current, previous and/or planned light settings for one or more lights, determining a target dynamic light scene based on said identified dynamic light scene and said one or more light settings, and rendering said target dynamic light scene on at least one light. The method may be implemented in hardware and/or software.
Moreover, a computer program for carrying out the methods described herein, as well as a non-transitory computer readable storage medium storing the computer program, are provided. A computer program may, for example, be downloaded by or uploaded to an existing device, or be stored upon manufacture of such devices.
A non-transitory computer-readable storage medium stores at least one software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations comprising: identifying a dynamic light scene to be rendered, determining one or more current, previous and/or planned light settings for one or more lights, determining a target dynamic light scene based on said identified dynamic light scene and said one or more light settings, and rendering said target dynamic light scene on at least one light.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a device, a method or a computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit”, “module” or “system.” Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer. Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a computer readable storage medium may include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of the present invention, a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java™, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the present invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
These and other aspects of the invention are apparent from and will be further elucidated, by way of example, with reference to the drawings.
Corresponding elements in the drawings are denoted by the same reference numeral.
When the bridge 1 receives a command to activate a pre-defined dynamic light scene, it first identifies the dynamic light scene based on (information in) the command. The command may comprise an identifier of the dynamic light scene or a light script, for example. The command may be transmitted by the mobile device 19, for example. The user 18 may be able to start a dynamic light scene by interacting with an app on mobile device 19 using a touch screen. Alternatively, the user 18 may be able to start a dynamic light scene using voice commands, e.g. on mobile device 19, on a smart speaker like Amazon Echo or Google Home, or on bridge 1 directly.
Alternatively, the bridge 1 may receive one or more light commands that form a dynamic light scene. In this case, identifying the light scene may simply consist of receiving the one or more light commands. For example, multiple light commands may be transmitted to bridge 1 after starting playback of content (e.g. a movie or music track) that has a dynamic light scene associated with it, e.g. on mobile device 19 or on television 17.
Typically, a user will have predefined ‘entertainment setups’, which are basically user-selected groups of lights on which a dynamic light scene will be rendered (e.g. a group with lights 22 and 23). Typically, this will be a superset or subset of the room or zone groups that a user has configured for his static light scenes and routines. The bridge 1 can relate those to each other and thereby determine the (current, previous and/or planned) light settings for those lights. This includes the state of the lights (on/off, brightness, color temperature, color) as well as ‘metadata’, e.g. whether the setting is connected to an activity (a ‘dinner’ scene vs. a ‘wake-up’ routine), what picture, video or color palette it is derived from, or how it is triggered. Typically, there is always a current light setting to determine, and sometimes there is also an upcoming light setting, which is relevant if it is planned in the near future.
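A sketch of relating an entertainment setup to the configured room/zone groups and collecting the settings of all lights involved; the data shapes are assumptions:

```python
def settings_for_setup(entertainment_group, room_groups, settings_by_light):
    """Find the room/zone groups overlapping the entertainment setup and
    gather the settings of every light in the setup or a related group."""
    related = [g for g in room_groups if set(g["lights"]) & set(entertainment_group)]
    light_ids = set(entertainment_group).union(*(g["lights"] for g in related))
    return {lid: settings_by_light[lid]
            for lid in light_ids if lid in settings_by_light}

room_groups = [{"name": "living room", "lights": [21, 22, 23]}]
settings_by_light = {21: {"level": 0.4}, 22: {"level": 0.6}, 23: {"level": 0.5}}
print(settings_for_setup([22, 23], room_groups, settings_by_light))
```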
An identified dynamic light scene behaves in a certain way based on a multitude of parameters, such as color palette, brightness (average and dynamic range), saturation (average and dynamic range), dynamicity, transitions (from slow to instant), effect type and frequency of effect type change, different light roles, and so forth. In the target dynamic light scene determined by the bridge 1, this behavior will normally be different from that in the identified dynamic light scene. For example, the target dynamic light scene may be obtained by adjusting the parameters of the identified dynamic light scene based on directly or indirectly related parameters of the determined one or more light settings.
Some parameters can be adjusted based on the one or more light settings directly, such as the color palette or average brightness. Others are adjusted indirectly, by matching the known or intended effect that the light settings and the dynamic scene parameters have on human physiology and perception. For example, a warm color temperature light scene or an upcoming go-to-bed routine has the known or intended effect of winding people down. This may be translated to the dynamic effect of slow transitions and a low dynamic brightness range in the dynamic scene. Another example is a very bright scene or a specific workout activity scene, which has the known or intended effect of energizing people. This may be translated to the dynamic effect of high dynamism and snappy transitions.
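A sketch of such an indirect mapping; the rules, thresholds and parameter names below are illustrative assumptions only:

```python
def dynamics_from_context(color_temp_k=None, upcoming_routine=None, level=None):
    """Translate the known or intended physiological effect of the settings
    into dynamic-scene parameters."""
    winding_down = ((color_temp_k is not None and color_temp_k < 3000)
                    or upcoming_routine == "go_to_bed")
    energizing = ((level is not None and level > 0.8)
                  or upcoming_routine == "workout")
    if winding_down:   # warm light or bedtime ahead -> calm dynamics
        return {"transition_ms": 3000, "brightness_range": 0.15}
    if energizing:     # very bright light or workout -> snappy dynamics
        return {"transition_ms": 200, "brightness_range": 0.8}
    return {"transition_ms": 800, "brightness_range": 0.5}  # neutral default

print(dynamics_from_context(color_temp_k=2700))          # winding down
print(dynamics_from_context(upcoming_routine="workout")) # energizing
```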
The bridge 1 may render the target dynamic light scene on the at least one light by calculating, at a certain frame rate, the light output from the identified dynamic light scene, creating that light color (e.g. by mixing different color LEDs with the correct pulse width modulation values) and transmitting one or more light commands to the at least one light. If the at least one light comprises multiple lights, this calculation may be performed for each light separately.
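A sketch of such a render loop; the transport is stubbed out with print, and the frame timing uses a plain sleep rather than a real scheduler:

```python
import time

def render(frames, lights, fps=25, send=print):
    """Emit one command per light per frame at a fixed frame rate.
    'send' stands in for the actual transport, e.g. Zigbee commands."""
    for frame in frames:                     # one color per light per frame
        for light, color in zip(lights, frame):
            send({"light": light, "rgb": color})
        time.sleep(1.0 / fps)

frames = [[(1.0, 0.0, 0.0), (0.0, 0.0, 1.0)],
          [(0.9, 0.1, 0.0), (0.0, 0.1, 0.9)]]
render(frames, lights=[22, 23], fps=2)
```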
In an alternative embodiment, multiple transceivers are used instead of a single transceiver, e.g. one for ZigBee and one for Wi-Fi.
Furthermore, only color light settings are shown in the examples. The light settings may further comprise light level, light distribution, beam width, number of active lights, and number of individual light beams, and/or identify at least one of: the light scene which set or will set the light level and/or the color, the routine which activated or will activate the light scene, and/or the source from which the light level and/or the color have been derived. A light level in the target dynamic light scene may be determined based on one or more current, previous and/or planned light levels for the lights 21, 22 and/or 23, for example.
A routine may be associated with an activity type. As a first example, a “dinner” or “study” scene may result in more subtle dynamics and a “workout” or “party” scene in more lively dynamics. As a second example, when a “go to bed” routine is coming up, a warmer/dimmer dynamic light scene may be used, and when a “fresh wakeup” routine is coming up, a colder/brighter dynamic light scene may be used. The source from which the light level and/or the color have been derived may be an image or song, for example.
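These associations can be captured in a simple lookup; the profiles and values here are illustrative assumptions:

```python
ACTIVITY_DYNAMICS = {
    "dinner":       {"dynamics": "subtle", "color_temp_k": 2700},
    "study":        {"dynamics": "subtle", "color_temp_k": 4000},
    "workout":      {"dynamics": "lively", "color_temp_k": 5000},
    "party":        {"dynamics": "lively", "color_temp_k": 3500},
    "go_to_bed":    {"dynamics": "subtle", "color_temp_k": 2200},  # warmer/dimmer
    "fresh_wakeup": {"dynamics": "lively", "color_temp_k": 6500},  # colder/brighter
}

def dynamics_for(routine_or_scene):
    """Profile for the routine/scene, with a neutral fallback."""
    return ACTIVITY_DYNAMICS.get(routine_or_scene,
                                 {"dynamics": "subtle", "color_temp_k": 3000})

print(dynamics_for("workout"))
```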
Alternatively, color settings could be adjusted in a different manner. As a first example, which colors will be dominant in the target dynamic light scene may be determined based on one or more current, previous and/or planned dominant colors for the lights 21, 22 and/or 23. As a second example, which colors will be dominant in the target dynamic light scene may be determined based on one or more current, previous and/or planned light levels for the lights 21, 22 and/or 23. For instance, “warmer” colors (e.g. yellow, orange) may be made dominant for low light levels and “colder” colors (e.g. green, blue) may be made dominant for high light levels.
These colors may be made dominant in the target dynamic light scene by increasing the intensity at which the one or more current, previous and/or planned dominant colors will be rendered as part of the target dynamic light scene compared to the identified dynamic light scene and/or by increasing the time period in which the one or more current, previous and/or planned dominant colors will be rendered as part of the target dynamic light scene compared to the identified dynamic light scene, for example.
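A sketch of the second example above, choosing dominant colors from the light level; the 0.5 threshold and the example colors are assumptions:

```python
def dominant_colors_for_level(level):
    """Warm hues for dim settings, colder hues for bright ones."""
    warm = [(1.0, 0.8, 0.2), (1.0, 0.5, 0.1)]   # yellow, orange
    cold = [(0.2, 0.9, 0.4), (0.2, 0.4, 1.0)]   # green, blue
    return warm if level < 0.5 else cold

print(dominant_colors_for_level(0.2))  # a dim evening setting -> warm colors
```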
The bridge 1 and the mobile device 41 may be enhanced by configuring their processors (processors 5 and 45, respectively) as described above.
An embodiment of the method of the invention is illustrated in the drawings.
As shown, a data processing system 300 may include at least one processor 302 coupled to memory elements 304.
The memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310. The local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from the bulk storage device 310 during execution.
Input/output (I/O) devices depicted as an input device 312 and an output device 314 optionally can be coupled to the data processing system. Examples of input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, or the like. Examples of output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers.
In an embodiment, the input and the output devices may be implemented as a combined input/output device.
A network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.
Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein). In one embodiment, the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal. In another embodiment, the program(s) can be contained on a variety of transitory computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored. The computer program may be run on the processor 302 described herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of embodiments of the present invention has been presented for purposes of illustration, but is not intended to be exhaustive or limited to the implementations in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the present invention. The embodiments were chosen and described in order to best explain the principles and some practical applications of the present invention, and to enable others of ordinary skill in the art to understand the present invention for various embodiments with various modifications as are suited to the particular use contemplated.
Number | Date | Country | Kind |
---|---|---|---|
18158854 | Feb 2018 | EP | regional |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2019/054207 | 2/20/2019 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2019/166297 | 9/6/2019 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
20010030686 | Young, Jr. | Oct 2001 | A1 |
20130293113 | Morrow | Nov 2013 | A1 |
20150161137 | Lashina et al. | Jun 2015 | A1 |
20160088707 | Van De Sluis | Mar 2016 | A1 |
20160174342 | Liu et al. | Jun 2016 | A1 |
20190178711 | Rajagopalan | Jun 2019 | A1 |
20190394855 | Meerbeek et al. | Dec 2019 | A1 |
Number | Date | Country |
---|---|---|
3226660 | Oct 2018 | EP |
2007113740 | Oct 2007 | WO |
2013102854 | Jul 2013 | WO |
2016019005 | Feb 2016 | WO |
2016083136 | Jun 2016 | WO |
2017021088 | Feb 2017 | WO |
2018028973 | Feb 2018 | WO |
Number | Date | Country
---|---|---|
20210243870 A1 | Aug 2021 | US |