The invention relates to a system for determining one or more light effects based on an analysis of video content and controlling one or more lighting devices to render said one or more light effects.
The invention further relates to a method of determining one or more light effects based on an analysis of video content and controlling one or more lighting devices to render said one or more light effects.
The invention also relates to a computer program product enabling a computer system to perform such a method.
Philips' Hue Entertainment and Hue Sync have become very popular among owners of Philips Hue lights. Philips Hue Sync enables the rendering of light effects based on the content that is played on a computer, e.g. video games. Furthermore, Ambilight has become one of the most distinguishing features of Philips TVs. Ambilight renders light effects based on the content that is played on the TV. A dynamic lighting system can dramatically influence the experience and impression of audio-visual material, especially when the colors sent to the lights match what would be seen in the composed environment around the screen.
This new use of light can bring the atmosphere of a video game or movie right into the room with the user. For example, gamers can immerse themselves in the ambience of the gaming environment and enjoy the flashes of weapons fire or magic spells and sit in the glow of the force fields as if they were real. Hue Sync works by observing areas of the video content and computing light output parameters that are rendered on Hue lights around the screen. When the entertainment mode is active, the selected lighting devices in a defined entertainment area will play light effects in accordance with the content depending on their positions relative to the screen.
Initially, Hue Sync was only available as an application for PCs. An HDMI device called the Hue Play HDMI Sync Box was later added to the Hue entertainment portfolio. This device addresses one of the main limitations of Hue Sync and aims at streaming and gaming devices connected to the TV. It makes use of the same principle of an entertainment area and the same mechanisms to transport information.
The Philips Hue Play HDMI Sync Box is a box to which different HDMI inputs can be connected (e.g. gaming console, set-top box, PC, etc.). The device is in principle an HDMI splitter which is placed between any HDMI device and a TV. The signals received on the HDMI inputs are transmitted unmodified to the display device to show the content as intended, but, in the process, they are analyzed and key content is extracted to determine light effects.
A drawback of the current Hue Play HDMI Sync Box is that although the light generating content may come from the different HDMI sources, additional content might be generated on the screen by the TV itself. For example, if the user interacts with menus of the TV (e.g. settings, volume, etc.), these will not get analyzed, and as a result, the screen content and the light effects might differ. On the other hand, menus triggered from the original content source (e.g. gaming console menus) do get carried via the (in this case HDMI) input and do generate appropriate light effects. For menus received on the input, it might even be possible to disregard the OSD layer when generating the light effects, as disclosed in US 2015/092115 A1.
Ultimately, the fact that menus of the TV do not get analyzed leads to an inconsistency in how menus, or overlays in general, are dealt with. Some users might prefer prioritizing content over menus, such that no menu ever triggers light effects (or the light effects are paused/cancelled), while others will prefer to always have a consistent mapping between menus and light effects, such that all menus are acknowledged by the entertainment system.
It is a first object of the invention to provide a system, which can be used to reduce the number of inconsistencies between screen content and light effects.
It is a second object of the invention to provide a method, which can be used to reduce the number of inconsistencies between screen content and light effects.
In a first aspect of the invention, a system for determining one or more light effects based on an analysis of video content and controlling one or more lighting devices to render said one or more light effects, comprises at least one input interface, at least one output interface, and at least one processor configured to receive a video signal via said at least one input interface, said video signal comprising said video content, output, via said at least one output interface, said video signal to a display device for displaying said video content, receive a further signal via said at least one input interface, said further signal being indicative of one or more commands transmitted by a further device to said display device, determine, based on said further signal, whether said display device will likely display an overlay on top of said video content, determine said one or more light effects based on said analysis of said video content in dependence on whether said display device will likely display an overlay on top of said video content, and control, via said at least one output interface, said one or more lighting devices to render said one or more light effects.
As it may not be possible to obtain information from the display device as to whether it is displaying an overlay on top of the video content provided to it, the system determines whether the display device will likely display an overlay on top of the video content by ‘sniffing’ commands transmitted to the display device, e.g. by a remote control or mobile device. This makes it possible to determine light effects for screen content, and to deal with overlays, in a consistent manner, independent of which device generates the overlays.
Said at least one processor may be configured to, upon determining that said display device will likely display an overlay on top of said video content, determine one or more further light effects based on said analysis of said video content and based on content of said overlay and control, via said at least one output interface, said one or more lighting devices to render said one or more further light effects. Thus, these one or more further light effects are not determined based only on the video content, but also based on the content of the (likely displayed) overlay.
The one or more further light effects may be determined by first determining the (normal) one or more light effects based on the analysis of the video content and then adapting these one or more light effects. The adaptation of the one or more light effects may take into account what the overlay could do to the underlying screen content. For example, if the overlay is just changing the transparency of the content, then the intensity and saturation of the one or more light effects may be adapted. If a bright yellow overlay is displayed on an original dark background, then only the hue and the brightness of the light effects may be adapted.
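Such an adaptation may, purely by way of a non-limiting sketch, distinguish between a translucent overlay (which mainly dims and washes out the underlying content) and a solid colored overlay (which mainly changes perceived hue and brightness). In the following Python sketch, the function name, the two overlay categories and the blending choices are illustrative assumptions, not part of the invention:

```python
import colorsys

def adapt_effect(rgb, overlay_kind, overlay_alpha=1.0, overlay_rgb=None):
    """Adapt a light effect color (RGB channels in [0, 1]) for a likely overlay.

    overlay_kind: "transparent" -> scale intensity and saturation by the
                                   overlay's transparency, as when the overlay
                                   only changes the transparency of the content
                  "colored"     -> pull hue and brightness toward the overlay
                                   color, keeping saturation, as for a bright
                                   colored overlay on a dark background
    """
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    if overlay_kind == "transparent":
        # A translucent overlay dims and desaturates the underlying content.
        s *= overlay_alpha
        v *= overlay_alpha
    elif overlay_kind == "colored" and overlay_rgb is not None:
        oh, _, ov = colorsys.rgb_to_hsv(*overlay_rgb)
        # Adapt only hue and brightness; saturation is left unchanged.
        h, v = oh, (v + ov) / 2
    return colorsys.hsv_to_rgb(h, s, v)
```

A 50%-transparent overlay over a red effect would, under these assumptions, halve its brightness and saturation.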
Said at least one processor may be configured to, upon determining that said display device will likely display an overlay on top of said video content, control said one or more lighting devices to stop rendering light.
Said at least one processor may be configured to, upon determining that said display device will likely display an overlay on top of said video content, determine said one or more light effects such that a noticeability of said one or more light effects is gradually reduced. After said reduction, said one or more lighting devices may be controlled to stop rendering light or light may be rendered at a lower intensity, for example.
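Such a gradual reduction may, for example, be realized as a simple fade schedule. In the following Python sketch, the function name, the step count and the linear interpolation are illustrative assumptions; the schedule ends at a configurable floor (0.0 corresponding to lights off, a positive value to rendering at a lower intensity):

```python
def fade_steps(brightness, steps=10, floor=0.0):
    """Yield successively lower brightness levels, linearly interpolating
    from the current brightness down to `floor`, so that the noticeability
    of the light effects is gradually reduced rather than cut abruptly."""
    for i in range(1, steps + 1):
        yield brightness + (floor - brightness) * i / steps
```

Each yielded level would be applied to the determined light effect before it is sent to the lighting devices.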
Said at least one processor may be configured to, upon determining that said display device will likely display an overlay on top of said video content, determine one or more default and/or user-defined light effects, and control, via said at least one output interface, said one or more lighting devices to render said one or more default and/or user-defined light effects. Said one or more default and/or user-defined light effects may comprise 50% warm light on all lighting devices, for example.
Said at least one processor may be configured to, upon determining that said display device will likely display an overlay on top of said video content, determine one or more further light effects associated with said overlay, and control, via said at least one output interface, said one or more lighting devices to render said one or more further light effects. For example, if it is determined that an EPG is likely displayed on the display device and the EPG of this display device has blue as dominant color, a blue or blueish light effect may be rendered.
Said at least one processor may be configured to, upon determining that said one or more commands instruct said display device to exit a menu, determine said one or more light effects based on said analysis of said video content and control said one or more lighting devices to render said one or more light effects. An exit menu command may indicate that the menu has disappeared from the screen and therefore the received video content is what should be matched with the light effects.
Said at least one processor may be configured to determine said one or more light effects based on said analysis of said video content further in dependence on a size and/or shape of said overlay. For example, if an overlay, e.g. rendered in response to a volume control command, takes up little space, it may not be necessary to adapt the light effects. Furthermore, the shape of the overlay may impact what the overlay does to the underlying screen content. For example, a square white box on the mid-left of the screen will be quite noticeable to the human eye, but the same amount of “surface” of an overlay that is spread out along the outer edges of the screen might not be enough for a user to notice that the screen has changed colors. In the latter case, the user will not perceive inconsistencies between screen content and light effects, and it is therefore not necessary to adapt the one or more light effects.
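The influence of size and shape on noticeability may, for instance, be approximated by weighting overlay pixels near the center of the screen more heavily than pixels along the outer edges. In the Python sketch below, the center-third region and the weight of 2.0 are illustrative assumptions; the function returns a higher score for a centered overlay than for the same coverage spread along the edge:

```python
def overlay_noticeability(mask, center_weight=2.0):
    """Estimate how noticeable an overlay is from a 2D 0/1 pixel mask
    (1 where the overlay covers the frame).  Pixels in the central third
    of the screen count more than pixels along the outer edges, reflecting
    that the same overlay 'surface' spread out along the screen edges is
    less likely to be perceived by the user."""
    rows, cols = len(mask), len(mask[0])
    score = total = 0.0
    for r, row in enumerate(mask):
        for c, covered in enumerate(row):
            central = rows / 3 <= r < 2 * rows / 3 and cols / 3 <= c < 2 * cols / 3
            w = center_weight if central else 1.0
            total += w
            score += w * covered
    return score / total
```

With this weighting, one covered pixel in the middle of a 3x3 frame scores twice as high as one covered pixel in a corner, matching the intuition that a box on the mid-left of the screen is more noticeable than the same surface along the edges.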
Said at least one processor may be configured to determine whether said display device will likely display an overlay on top of said video content further based on data exchanged between said display device and one or more other devices. For example, if one or more Bluetooth (e.g. BLE) commands are transmitted to a display device by a remote control and then a burst of data is transmitted to the display device by a Wi-Fi access point, this may indicate that a certain type of menu is being displayed on the display device, e.g. an EPG.
Said at least one processor may be configured to determine whether said display device will likely display an overlay on top of said video content based on content of said one or more commands. If it is possible to decode the content of the further signal, and thus the content of the one or more commands, this is usually the easiest way of determining whether the display device will likely display an overlay on top of the video content.
Said at least one processor may be configured to determine whether said display device will likely display an overlay on top of said video content based on a quantity and/or type and/or duration of said one or more commands. If it is not possible to decode the content of the further signal, it may be possible in one or more of these other ways to determine whether the display device will likely display an overlay on top of the video content.
In a second aspect of the invention, a method of determining one or more light effects based on an analysis of video content and controlling one or more lighting devices to render said one or more light effects comprises receiving a video signal, said video signal comprising said video content, outputting said video signal to a display device for displaying said video content, receiving a further signal, said further signal being indicative of one or more commands transmitted by a further device to said display device, determining, based on said further signal, whether said display device will likely display an overlay on top of said video content, determining said one or more light effects based on said analysis of said video content in dependence on whether said display device will likely display an overlay on top of said video content, and controlling said one or more lighting devices to render said one or more light effects. Said method may be performed by software running on a programmable device. This software may be provided as a computer program product.
Moreover, a computer program for carrying out the methods described herein, as well as a non-transitory computer-readable storage medium storing the computer program, are provided. A computer program may, for example, be downloaded by or uploaded to an existing device or be stored upon manufacturing of these systems.
A non-transitory computer-readable storage medium stores at least one software code portion, the software code portion, when executed or processed by a computer, being configured to perform executable operations for determining one or more light effects based on an analysis of video content and controlling one or more lighting devices to render said one or more light effects.
The executable operations comprise receiving a video signal, said video signal comprising said video content, outputting said video signal to a display device for displaying said video content, receiving a further signal, said further signal being indicative of one or more commands transmitted by a further device to said display device, determining, based on said further signal, whether said display device will likely display an overlay on top of said video content, determining said one or more light effects based on said analysis of said video content in dependence on whether said display device will likely display an overlay on top of said video content, and controlling said one or more lighting devices to render said one or more light effects.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a device, a method or a computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit”, “module” or “system.” Functions described in this disclosure may be implemented as an algorithm executed by a processor/microprocessor of a computer. Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied, e.g., stored, thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a computer readable storage medium may include, but are not limited to, the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of the present invention, a computer readable storage medium may be any tangible medium that can contain, or store, a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber, cable, RF, etc., or any suitable combination of the foregoing. Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java™, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on a local computer, partly on the local computer, as a stand-alone software package, partly on the local computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the local computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the present invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor, in particular a microprocessor or a central processing unit (CPU), of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer, other programmable data processing apparatus, or other devices create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of devices, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
These and other aspects of the invention are apparent from and will be further elucidated, by way of example, with reference to the drawings, in which:
Corresponding elements in the drawings are denoted by the same reference numeral.
In the example of
The HDMI module 11 is connected to a wireless LAN access point 41, e.g. using Wi-Fi. The bridge 21 is also connected to the wireless LAN access point 41, e.g. using Wi-Fi or Ethernet. In the example of
The HDMI module 11 is connected to the display device 46 and local media receivers 43 and 44 via HDMI. The local media receivers 43 and 44 may comprise one or more streaming or content generation devices, e.g. an Apple TV, Microsoft Xbox One (or Series X or S) and/or Sony PlayStation 4 (or 5), and/or one or more cable or satellite TV receivers. Each of the local media receivers 43 and 44 may be able to receive content from a media server 49 and/or from a media server in the home network. The local media receivers 43 and 44 provide this content as a video signal to the HDMI module 11 via HDMI.
The wireless LAN access point 41 and media server 49 are connected to the Internet 48. Media server 49 may be a server of a video-on-demand service such as Netflix, Amazon Prime Video, Hulu, Disney+ or Apple TV+, for example. In the example of
The HDMI module 11 comprises a receiver 13, a transmitter 14, a processor 15, and memory 17. The processor 15 is configured to receive a video signal via the receiver 13, e.g. from local media receiver 43 or local media receiver 44. The video signal comprises video content. The processor 15 is further configured to output the video signal to display device 46, which displays the video content, and to receive a further signal via receiver 13. The further signal is indicative of one or more commands transmitted by a further device, e.g. remote control 45 or mobile device 29, to the display device 46.
The processor 15 is further configured to determine, based on the further signal, whether the display device 46 will likely display an overlay on top of the video content, determine one or more light effects based on the analysis of the video content in dependence on whether the display device 46 will likely display an overlay on top of the video content, and control, via the transmitter 14, one or more of the lighting devices 31-34 to render the one or more light effects.
For example, the HDMI module 11 may sniff via IR, Wi-Fi, or BLE whether one or more commands are being issued to the display device 46 and determine whether those one or more commands are expected to result in an overlay being displayed on top of the video content that would have a relatively large impact on the light experience. The shape of the overlay, the size of the overlay, and/or other characteristics of the overlay (e.g. color and/or contrast) may be taken into account when determining the impact of the overlay on the light experience. For example, a small bright yellow menu overlaid on a dark movie will be more impactful than a large dark screen menu over footage of a football match which has light green grass over most of the screen.
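One possible, purely illustrative way to score the impact of an overlay on the light experience is to combine its screen area with the luminance change it causes, emphasizing brightness increases (a bright menu popping up over dark content) over darkening overlays. The coefficients and the sub-linear area term in this Python sketch are hypothetical assumptions:

```python
def overlay_impact(area_fraction, overlay_rgb, frame_rgb):
    """Rough impact score of an overlay on the light experience.

    Bright overlays on dark content draw more attention than dark overlays
    on bright content, so a positive luminance change dominates; the screen
    area covered by the overlay contributes sub-linearly."""
    luma = lambda rgb: 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
    delta = luma(overlay_rgb) - luma(frame_rgb)
    # Emphasize brightness increases; attenuate darkening overlays.
    salience = delta if delta > 0 else -0.25 * delta
    return (area_fraction ** 0.5) * salience
```

Under these assumptions, a small bright yellow menu over a dark movie indeed scores higher than a large dark menu over mostly light green football footage.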
If standardized commands are used and can be recognized, a simple look-up table may be used to classify the command. For example, a volume control command generally results in an overlay that takes up little space and therefore does not require light effects to be adapted, whereas a menu/settings/guide command generally results in an overlay that takes up more screen space and would lead to light effects no longer matching the video content if no action were taken.
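Such a look-up table may be as simple as the following Python sketch. The command names are hypothetical placeholders, not actual codes of any particular protocol:

```python
# Hypothetical command names; real entries would use the recognized
# standardized codes (e.g. remote control key codes) of the protocol in use.
COMMAND_OVERLAY = {
    "VOLUME_UP":   {"overlay": True,  "large": False},  # small volume bar
    "VOLUME_DOWN": {"overlay": True,  "large": False},
    "MENU":        {"overlay": True,  "large": True},   # full settings menu
    "GUIDE":       {"overlay": True,  "large": True},   # EPG, covers the screen
    "PLAY":        {"overlay": False, "large": False},  # no overlay expected
}

def needs_adaptation(command):
    """True if the command is expected to produce an overlay large enough
    that the light effects would no longer match the video content."""
    info = COMMAND_OVERLAY.get(command)
    return bool(info and info["overlay"] and info["large"])
```

Unknown commands fall through to "no adaptation", which matches treating unrecognized input conservatively.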
If the commands are expected to result in a relatively large overlay being displayed on top of the video content, then the HDMI module 11 may choose to perform one or more of the following actions:
In the embodiment of
In the embodiment of the HDMI module 11 shown in
The receiver 13 and the transmitter 14 may use one or more wired or wireless communication technologies such as Wi-Fi to communicate with the wireless LAN access point 41 and HDMI to communicate with the display device 46 and with local media receivers 43 and 44, for example. In an alternative embodiment, multiple receivers and/or multiple transmitters are used instead of a single receiver and a single transmitter. In the embodiment shown in
The HDMI module 11 may comprise other components typical for a consumer electronic device such as a power connector. The invention may be implemented using a computer program running on one or more processors. In the embodiment of
A first embodiment of the method of determining one or more light effects based on an analysis of video content and controlling one or more lighting devices to render the one or more light effects is shown in
A step 108 comprises checking whether it was determined in step 107 that the display device will likely display an overlay on top of the video content. If so, a step 110 is performed. If not, a step 109 is performed. Step 109 comprises determining one or more light effects based on an analysis of the video content received in step 101. The light effects determined in step 109 are typically determined with a color extraction algorithm, which determines one or more dominant colors from one or more spatial regions of the video content.
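A minimal stand-in for such a color extraction algorithm quantizes pixel colors into buckets and returns the average color of the most populated bucket. In the Python sketch below, the bucket count is an arbitrary assumption, and real implementations would typically apply better clustering separately per spatial region of the frame:

```python
from collections import defaultdict

def dominant_color(pixels, buckets=4):
    """Return the average color of the most populated color bucket.
    `pixels` is a list of (r, g, b) tuples with channels in [0, 1],
    e.g. the pixels of one spatial region of the video content."""
    groups = defaultdict(list)
    for p in pixels:
        # Quantize each channel into `buckets` levels to group similar colors.
        key = tuple(min(int(c * buckets), buckets - 1) for c in p)
        groups[key].append(p)
    largest = max(groups.values(), key=len)
    return tuple(sum(p[i] for p in largest) / len(largest) for i in range(3))
```

Applying this per screen region (e.g. left edge, right edge) would yield one dominant color per lighting device position.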
Step 110 comprises determining one or more further light effects which are different from the light effects which would have been determined in step 109 if step 109 had been performed. A step 111 comprises controlling the one or more lighting devices to render the one or more light effects determined in step 109 or the one or more further light effects determined in step 110. After step 111, step 101, and possibly step 105, is/are repeated and the method proceeds as shown in
The light effects that are determined and according to which the lighting devices are then controlled, are thus different when it is determined that the display device will likely display an overlay on top of said video content compared to when it has not been determined that the display device will likely display an overlay on top of said video content. In this first embodiment, the light effects are determined based on the analysis of video content, and some of the light effects are amended, replaced or dropped based on the determination that the display device will likely display an overlay on top of said video content, and further additional light effects may be determined based on the determination that the display device will likely display an overlay on top of said video content.
A second embodiment of the method of determining one or more light effects based on an analysis of video content and controlling one or more lighting devices to render the one or more light effects is shown in
Step 115 comprises determining whether the one or more lighting devices are rendering light. In this embodiment, this is typically only the case in step 115 when the overlay has only recently been ‘detected’. If it is determined in step 115 that the one or more lighting devices are rendering light, step 117 is performed. If not, step 101, and possibly step 105, is/are repeated and the method proceeds as shown in
A third embodiment of the method of determining one or more light effects based on an analysis of video content and controlling one or more lighting devices to render the one or more light effects is shown in
Step 121 comprises determining one or more light effects based on an analysis of video content like in step 109, but such that a noticeability of the one or more light effects is gradually reduced. After the gradual reduction, the rendering of light may be stopped or continued at a reduced level. Steps 109 and 121 may be performed by the same procedure, which is given either overlay=true or overlay=false as input parameter, for example.
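A single procedure serving both steps, taking an overlay flag as input, may be sketched as follows. The closure-based state and the decay factor in this Python sketch are illustrative assumptions:

```python
def make_effect_procedure(decay=0.8):
    """Create one procedure covering both step 109 (overlay=False: full
    intensity) and step 121 (overlay=True: each call scales brightness down
    further, gradually reducing noticeability)."""
    gain = 1.0

    def determine(frame_color, overlay):
        nonlocal gain
        # With an overlay, decay the gain a bit more on each call;
        # without one, restore full intensity immediately.
        gain = gain * decay if overlay else 1.0
        return tuple(c * gain for c in frame_color)

    return determine
```

Repeated calls with overlay=True thus fade the rendered light toward zero, while a later call with overlay=False resumes normal rendering.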
Step 123 comprises determining one or more default and/or user-defined light effects. The one or more default and/or user-defined light effects may comprise 50% warm light on all lighting devices, for example. Step 125 comprises determining one or more further light effects associated with the overlay. For example, if it is determined that an EPG is likely displayed on the display device and the EPG of this display device has blue as dominant color, a blue or blueish light effect may be rendered.
A fourth embodiment of the method of determining one or more light effects based on an analysis of video content and controlling one or more lighting devices to render the one or more light effects is shown in
Step 143 comprises determining whether the display device will likely display an overlay on top of the video content based on content of the one or more commands and/or a quantity and/or type and/or duration of the one or more commands. As described above, the further signal received in step 105 is indicative of these one or more commands.
Ideally, the system is able to decode the content of sniffed commands, but this may not always be possible. In that case, it may be possible to determine whether the display device will likely display an overlay on top of the video content based on a quantity and/or type and/or duration of the sniffed commands.
Machine learning may be used to learn over time which quantity and/or type and/or duration of commands corresponds to which type of menu and/or which size and/or shape of overlay.
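As a very simple stand-in for such learning, command feature tuples (e.g. transport type, quantity, duration bucket) could be counted against later-confirmed overlay types; a deployed system would likely use a properly trained classifier instead. The feature encoding and class names in this Python sketch are hypothetical:

```python
from collections import Counter, defaultdict

class CommandOverlayLearner:
    """Learn over time which command patterns tend to precede which type
    of menu/overlay, from later confirmation or user feedback.  This
    frequency table is a stand-in for a real trained classifier."""

    def __init__(self):
        self.counts = defaultdict(Counter)

    def observe(self, features, overlay_type):
        # features: e.g. (transport, command quantity, duration bucket)
        self.counts[features][overlay_type] += 1

    def predict(self, features):
        seen = self.counts.get(features)
        return seen.most_common(1)[0][0] if seen else None
```

After a few observations, previously seen patterns map to their most frequent overlay type, and unseen patterns yield no prediction.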
Optionally, the determination made in step 143 may further be based on data exchanged between the display device and one or more other devices. If the menus are contextual (e.g. a home-like menu that in addition to settings also loads top 5 news in the user's region), the system performing the method may infer the type of menu that was triggered by monitoring traffic among different sources.
As a first example, if BLE commands are sent to a TV after which a burst of Wi-Fi data is transmitted to the TV, even if neither communication stream can be decoded, their time commonality may be a useful hint. As a second example, in the case of online games, a semi-constant stream of data might be expected to be transmitted to the display device running the game. However, most game menus do not require specific data to be loaded, so by monitoring drops in traffic towards the display device (possibly coupled with sniffed commands), it may be possible to determine whether a menu is being displayed.
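The time commonality of the first example may, for instance, be checked with a simple window test. In this Python sketch, the two-second window is an arbitrary assumption:

```python
def menu_likely(command_times, wifi_burst_times, window=2.0):
    """Return True if a Wi-Fi data burst follows a sniffed command to the
    display device within `window` seconds.  Even when neither stream can
    be decoded, this time commonality hints at a data-backed menu such as
    an EPG being loaded and displayed."""
    return any(0 <= burst - cmd <= window
               for cmd in command_times
               for burst in wifi_burst_times)
```

The timestamps would come from the sniffed BLE commands and from observed Wi-Fi traffic toward the display device.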
A fifth embodiment of the method of determining one or more light effects based on an analysis of video content and controlling one or more lighting devices to render the one or more light effects is shown in
Step 121 comprises determining, based on the further signal received in step 105, whether the display device will likely display an overlay on top of the video content. Amongst others, step 121 comprises determining whether the one or more commands instruct the display device to exit a menu. If it is determined in step 121 that the one or more commands instruct the display device to exit a menu, it is assumed that the display device will not display an overlay on top of the video content.
Exit commands typically indicate that a menu/overlay has disappeared from the screen and that the light effects should match the received video content. Determining whether an exit command is being transmitted may be done by looking for dedicated explicit commands (e.g. an “exit menu” button), by tracking certain types of commands (e.g. how many times a “back” button was pressed), or even by determining how long no commands have been sent (since e.g. some TV menus quit automatically after a certain period of inactivity).
Furthermore, step 121 may comprise determining based on the further signal whether a user has confirmed an option in the menu and whether this confirmation has resulted in the menu being exited. For example, if a user navigates to the settings, changes something, and presses save, this may automatically result in the menu being exited.
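The three exit-detection strategies described above (explicit exit or save-and-exit commands, counting “back” presses against the menu depth, and an inactivity timeout) can be sketched as follows. The class name, command strings, and timeout value are hypothetical illustrations, not terms from the specification.

```python
import time


class ExitDetector:
    """Tracks sniffed remote-control commands and decides whether the
    menu has likely been exited: via an explicit exit (or save-and-exit)
    command, enough 'back' presses to unwind the menu depth, or
    prolonged inactivity (some TV menus quit automatically)."""

    def __init__(self, menu_depth: int = 1,
                 inactivity_timeout_s: float = 30.0):
        self.menu_depth = menu_depth
        self.inactivity_timeout_s = inactivity_timeout_s
        self.back_presses = 0
        self.last_command_time = time.monotonic()

    def on_command(self, command: str) -> bool:
        """Returns True when the menu is considered exited."""
        self.last_command_time = time.monotonic()
        if command in ("exit_menu", "save_and_exit"):
            # Explicit exit, or a confirmed option that closes the menu.
            return True
        if command == "back":
            # Enough back presses to leave the deepest menu level.
            self.back_presses += 1
            return self.back_presses >= self.menu_depth
        return False

    def inactive_exit(self, now=None) -> bool:
        """True if no command was seen for longer than the timeout."""
        now = time.monotonic() if now is None else now
        return now - self.last_command_time > self.inactivity_timeout_s
```

In a system implementing step 121, a True result from either check could be taken as the signal that the overlay has disappeared and that light effects should again follow the video content.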
In the embodiment of
The embodiments of
The methods of
As shown in
The memory elements 304 may include one or more physical memory devices such as, for example, local memory 308 and one or more bulk storage devices 310. The local memory may refer to random access memory or other non-persistent memory device(s) generally used during actual execution of the program code. A bulk storage device may be implemented as a hard drive or other persistent data storage device. The processing system 300 may also include one or more cache memories (not shown) that provide temporary storage of at least some program code in order to reduce the number of times program code must be retrieved from the bulk storage device 310 during execution. The processing system 300 may also be able to use memory elements of another processing system, e.g. if the processing system 300 is part of a cloud-computing platform.
Input/output (I/O) devices depicted as an input device 312 and an output device 314 optionally can be coupled to the data processing system. Examples of input devices may include, but are not limited to, a keyboard, a pointing device such as a mouse, a microphone (e.g. for voice and/or speech recognition), or the like. Examples of output devices may include, but are not limited to, a monitor or a display, speakers, or the like. Input and/or output devices may be coupled to the data processing system either directly or through intervening I/O controllers. Input devices and/or output devices may be connected via one or more HDMI ports, for example.
In an embodiment, the input and the output devices may be implemented as a combined input/output device (illustrated in
A network adapter 316 may also be coupled to the data processing system to enable it to become coupled to other systems, computer systems, remote network devices, and/or remote storage devices through intervening private or public networks. The network adapter may comprise a data receiver for receiving data that is transmitted by said systems, devices and/or networks to the data processing system 300, and a data transmitter for transmitting data from the data processing system 300 to said systems, devices and/or networks. Modems, cable modems, and Ethernet cards are examples of different types of network adapter that may be used with the data processing system 300.
The video signal, and metadata if applicable, may be obtained from an input device 312, via the network adapter 316, or from the memory elements 304, for example.
As pictured in
Various embodiments of the invention may be implemented as a program product for use with a computer system, where the program(s) of the program product define functions of the embodiments (including the methods described herein). In one embodiment, the program(s) can be contained on a variety of non-transitory computer-readable storage media, where, as used herein, the expression “non-transitory computer readable storage media” comprises all computer-readable media, with the sole exception being a transitory, propagating signal. In another embodiment, the program(s) can be contained on a variety of transitory computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as CD-ROM disks readable by a CD-ROM drive, ROM chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., flash memory, floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored. The computer program may be run on the processor 302 described herein.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of embodiments of the present invention has been presented for purposes of illustration, but is not intended to be exhaustive or limited to the implementations in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the present invention. The embodiments were chosen and described in order to best explain the principles and some practical applications of the present invention, and to enable others of ordinary skill in the art to understand the present invention for various embodiments with various modifications as are suited to the particular use contemplated.
Number | Date | Country | Kind
--- | --- | --- | ---
20212519.1 | Dec 2020 | EP | regional
Filing Document | Filing Date | Country | Kind
--- | --- | --- | ---
PCT/EP2021/083296 | 11/29/2021 | WO |