The present subject matter relates to techniques and equipment to enable adaptive control of light emissions for illumination of a surface, for example, for wall grazing or wall wash applications.
A wall wash is a type of light fixture mounted to a ceiling and intended to direct light to the face of a wall near the fixture in a uniform manner, to substantially eliminate shadows or other variations in intensity or shading across the illuminated surface of the wall. Ideally, light would be distributed evenly on the wall with the light directed close to the ceiling and a smooth transition down the face of the wall toward the floor. Such lighting, for example, may emphasize smoothness of the illuminated surface. Variations in surface texture, however, may disrupt the apparent uniformity by creating light and dark regions on the illuminated surface of the wall due to shadow effects of projecting features and the angle of light from the wall wash fixture.
A grazing application similarly uses a light fixture mounted to a ceiling intended to direct light to the face of a wall near the fixture. The grazing light fixture typically is mounted nearer the illuminated face of the wall and aimed to output light around an axis at a smaller angle relative to the wall face. Rather than emphasizing surface uniformity, such a grazing light fixture emphasizes texture by creating light and dark regions due to shadow effects of intentional design features of a textured surface on the face of the wall. Grazing, however, may also create undesired light and dark regions when portions of the surface that should be uniform are not uniform due to surface imperfections. Such a grazing light may even be used to detect imperfections for correction, for example, to allow a builder to detect and sand out or otherwise remove imperfections in a wall during construction.
Light fixtures for wall washing and grazing applications traditionally have relatively static light distributions specifically designed for the particular application. In either case, the fixture is mounted at a particular location and any adjustable components of the light fixture (e.g. relating to angle of emission toward the surface of the wall) are set during installation so as to provide the intended direction and range of angular distribution for the washing or grazing of the particular architectural panel. Once mounted and configured, the distribution remains unchanged unless a technician manually adjusts the fixture. Other than some minor manual adjustment, there is no practical technique to adjust the output distribution of light from the fixture, for example, to compensate for imperfections in the illuminated wall surface.
Hence, a need exists for adaptive control of light emissions for illumination of a surface, for example, for wall grazing or wall wash applications. In a wall wash or grazing example, adaptive control may enable compensation for undesirable variations in light and dark regions due to imperfections in the illuminated architectural surface. For grazing or similar surface illumination applications, it may sometimes be desirable to increase variations in light and dark regions due to deliberately provided textural features of the illuminated architectural surface or to deliberately detect surface irregularities or imperfections. Hence, the concepts disclosed herein improve illumination of a face of an architectural panel by utilizing a pixel controllable array of solid state light emitters and circuitry for adaptive control of the emitters of the array. For example, this technique may adaptively illuminate features of a face of a wall or other architectural panel by selectively activating ones of the light emitters of the array.
An example of a lighting system for illuminating a face of an architectural panel includes an optic and a pixel controllable array of solid state light emitters. The optic is configured to be aimed to have an optical axis at an acute angle relative to a face of the architectural panel. The pixel controllable array of solid state light emitters is coupled to selectively emit light from emitters at pixels of the array through the optic toward different regions of the face of the architectural panel. The example also includes a driver coupled to selectively drive individual light emitters at pixels of the array. A controller is coupled to control the individual light emitters, via the driver. The controller is configured to control the emitters of the array so as to selectively control output intensity of a plurality of the light emitters emitting light through the optic to one or more selected regions of the face of the architectural panel, so as to adaptively illuminate surface topology features of the face of the architectural panel.
An example method for illuminating a face of an architectural panel involves directing light output from a pixel controllable array of solid state light emitters to illuminate an area of the face of the architectural panel, at an acute angle relative to the face of the architectural panel; and capturing an image of the illuminated area of the face of the architectural panel. Regions within the illuminated area of the face having relatively brighter and darker illumination, for example caused by angled illumination of topological features of the surface of the face of the panel, are identified from processing of data of the captured image. The method also entails selectively changing output intensity of one or more of the light emitters of the array to change intensity of illumination of one or more of the identified regions and adjust relative intensity of illumination among the regions of the illuminated area of the face of the architectural panel.
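The identify-and-adjust steps of the example method above can be sketched in a few lines of code. The sketch below is purely illustrative and not an implementation of any claimed embodiment; the grid-based region partition, the brightness target, and the feedback gain are all assumptions chosen for the example.

```python
# Illustrative sketch: identify brighter/darker regions of a captured
# image and nudge the corresponding emitter drive levels toward a target.
# Region grid, target level, and gain are assumed example parameters.

def identify_regions(image, rows, cols):
    """Split a grayscale image (list of lists, values 0.0-1.0) into a
    rows x cols grid and return the mean brightness of each region."""
    h, w = len(image), len(image[0])
    means = []
    for r in range(rows):
        for c in range(cols):
            ys = range(r * h // rows, (r + 1) * h // rows)
            xs = range(c * w // cols, (c + 1) * w // cols)
            pixels = [image[y][x] for y in ys for x in xs]
            means.append(sum(pixels) / len(pixels))
    return means

def adjust_intensities(intensities, region_means, target, gain=0.5):
    """Move each emitter's drive level toward the brightness target for
    its associated region, clamped to the 0.0-1.0 drive range."""
    return [min(1.0, max(0.0, i + gain * (target - m)))
            for i, m in zip(intensities, region_means)]
```

In a running system this pair of steps would repeat: capture, identify, adjust, until the measured region brightnesses converge toward the desired relative distribution (uniform for washing, or deliberately non-uniform for grazing).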
Additional objects, advantages and novel features of the examples will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The objects and advantages of the present subject matter may be realized and attained by means of the methodologies, instrumentalities and combinations particularly pointed out in the appended claims.
The drawing figures depict one or more implementations, by way of example only, not by way of limitation. In the figures, like reference numerals refer to the same or similar elements.
In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant teachings. However, it should be apparent to those skilled in the art that the present teachings may be practiced without such details. In other instances, well known methods, procedures, components, and/or circuitry have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present teachings.
The various examples disclosed herein relate to a lighting system or method to adaptively illuminate a surface. In the examples, the surface is that of a face of an architectural panel.
The face of the panel, for example, would be the area of the panel facing into a room or other space illuminated by the system. The face of the architectural panel has one or more associated planes. If the face were a perfectly flat surface, for example, then a plane would be coincident with the face of the architectural panel. For a vertical wall, for example, the plane would be vertical; for a horizontal panel, such as a ceiling, floor or countertop, the plane would be horizontal. In many examples of adaptive surface illumination, whether the face is relatively flat or is curved or otherwise non-flat, the face of the architectural panel is not a perfectly smooth surface. Instead, the face of the panel may include any of various types of surface topology features that produce differences in depth (topographical deviations) from the ideal macro-contour of the surface. The surface topology features of the face of the relatively flat example architectural panel (i.e. the actual surface) vary in depth from the plane (i.e. the ideal planar contour in the vertical or horizontal panel type examples). For example, such surface topology features may be smaller scale contours deliberately formed on the face of the panel to create an apparent texture, or the surface topology features may be unintended surface irregularities or imperfections.
Many of the examples relate to panels with faces having relatively flat contours, such as walls, ceilings, floors or countertops. The systems and methods disclosed herein, however, may be adapted to illumination of panels with contours of other non-flat types. If the face is not a flat contoured surface, for example, if the face has a curved or faceted large scale (macro) surface contour, then the face would have associated tangential planes at each point of the intended or ideal contour of the face of the panel. Examples of architectural panels with curved or faceted contours include domes or other types of vaulted ceilings as well as columns of various shapes.
An example of an illumination system for such an application includes an optic and a pixel controllable array of solid state light emitters. When installed, the optic is aimed to have its optical axis at an acute angle relative to a plane of the face of the architectural panel. The pixel controllable array of solid state light emitters is coupled to selectively emit light from emitters at pixels of the array through the optic toward different regions of the face of the architectural panel. The example system also includes a driver coupled to selectively drive individual light emitters at pixels of the array. A controller is coupled to control the individual light emitters, via the driver. The controller is configured to control the emitters of the array so as to selectively control output intensity of the light emitters emitting light through the optic toward selected regions of the face of the architectural panel, so as to adaptively illuminate surface topology features of the face of the architectural panel.
The term “luminaire,” as used herein, is intended to encompass essentially any type of device that processes energy to generate or supply artificial light, for example, for general illumination of a space intended for use, occupancy or observation, typically by a living organism that can take advantage of or be affected in some desired manner by the light emitted from the device. Other application examples include providing light for highlighting a wall or the like that bears information (e.g. signage or a billboard), which may be read or otherwise observed by a person. However, a luminaire may provide light for use by automated equipment, such as sensors/monitors, robots, etc. that may occupy or observe the illuminated space, instead of or in addition to light provided for an organism. It is also possible that one or more luminaires in or on a particular premises have other lighting purposes, such as signage for an entrance or to indicate an exit. In most examples, the luminaire(s) illuminate a space or area of a premises to a level useful for a human in or passing through the space, e.g. general illumination of a room or corridor in a building or of an outdoor space such as a street, sidewalk, parking lot or performance venue. The actual source of illumination light in or supplying the light for a luminaire may be any type of artificial light emitting device, several examples of which are included in the discussions below. In a typical installation for the adaptive illumination system, at least the pixel controllable array and the optic would be elements of a light fixture or other type of luminaire for mounting in proximity to the illuminated face of the architectural panel with an appropriate output angle for the illumination light from the array and optic.
Terms such as “artificial lighting,” as used herein, are intended to encompass essentially any type of lighting in which a device produces light by processing of electrical power to generate the light. An artificial lighting device, for example, may take the form of a lamp, light fixture, or other luminaire that incorporates a light source, where the light source by itself contains no intelligence or communication capability, such as one or more LEDs or the like, or a lamp (e.g. “regular light bulbs”) of any suitable type. The general illumination light output of an artificial illumination type luminaire, for example, may have an intensity and/or other characteristic(s) that satisfy an industry acceptable performance standard for a general lighting application.
The term “coupled” as used herein refers to any logical, optical, physical or electrical connection, link or the like by which signals or light produced or supplied by one system element are imparted to another coupled element. Unless described otherwise, coupled elements or devices are not necessarily directly connected to one another and may be separated by intermediate components, elements or communication media that may modify, manipulate or carry the light or signals.
Light output from the light fixture or other type of luminaire may carry information, such as a code (e.g. to identify the luminaire or its location) or downstream transmission of communication signaling and/or user data. The light based information transmission may involve modulation or otherwise adjusting parameters (e.g. intensity, color characteristic or distribution) of the illumination light output from the device.
Reference now is made in detail to the examples illustrated in the accompanying drawings and discussed below.
The high-resolution example utilizes light emitting diodes (LEDs) as the emitters of the array, although other solid state emitter devices may be used. Although each pixel 13 of the array 11 could include multiple LEDs, for example of different color characteristics (e.g. for color tuning), in one example, each of the pixels 13 of the array 11 includes a single LED for emitting white light (e.g. based on ultraviolet or blue pumped phosphor conversion). The LED at each pixel is controllable with respect to full ON and OFF states; and in the example, is controllable to some degree with respect to intermediate levels of output intensity.
Although other array devices may be used, an example lighting system may utilize a high-resolution, pixel controllable array of white emitters such as that originally developed for adaptive vehicle headlights by Osram Opto Semiconductors and described in Osram's literature as “Eviyos.” In such an example, the array may take the form of a chip of approximately 4 mm×4 mm that carries LEDs at 1024 pixels. A high-resolution emitter array therefore may have solid state emitters for 1024 or more pixels. The examples utilize a single chip array in a single package or housing, although additional chips in the same or additional packages/housings may be utilized to increase the number of solid state emitters so as to provide increased light output and/or an increased range of variable light output distribution. Alternatively, arrays with more or fewer pixels may be used.
Although smaller and potentially having lower resolution than a typical display, the controllable array 11 operates much like a direct output LED display in that the output of light at each pixel is controllable. For lighting purposes, control of which LEDs (at the corresponding pixels) are operating at a given time allows control of which regions are illuminated by the output light from the array 11. Control of the driver current supplied to each of the LEDs, to control output intensity of selected LEDs, allows control of the intensity of light directed through the optic to various regions of the illuminated area on the face of the architectural panel.
As noted in the discussion of
Each of these example systems 21, 31 also includes a driver 33 coupled to selectively drive individual light emitters at pixels of the array 29. The driver 33, for example, may be a controllable multi-channel power supply circuitry having at least one channel to supply power to each of the emitters of the array 29. A controller 35 is coupled to control the individual light emitters at pixels of the array 29, via the driver 33. The controller 35 is configured to control the emitters of the array 29 so as to adaptively illuminate surface topology features (not visible in the illustrations in
A light fixture including the optic 27a or 27b and the array 29 is located to emit light about its output axis A-A at a relatively small angle θ, between the axis and the illuminated face 23 of the architectural panel 25. In the examples of
A variety of different designs may be used to implement the optic.
The pixel controllable array 29 of solid state light emitters is coupled to output light from the emitters through an optic. An unprocessed distribution of light from the array 29 would tend to illuminate the face 23 of the panel 25 with lower intensity illumination on regions of the face 23 of the panel 25 further away from the light fixture or the like containing the array 29 (by operation of an inverse square law). For grazing or washing a wall from a ceiling mounted fixture, for example, the fixture provides more light (higher intensity) to regions on the face 23 of the wall panel 25 nearer the fixture than to regions of the face 23 of the wall panel 25 nearer the floor. The design of the optic can provide some compensation, for example, based on an optic design that tends to reduce light output toward regions near the ceiling and the array and optic as well as increase light output toward more distant regions of the face 23 of the wall panel 25 near the floor. In the system 21, 31 with a pixel controllable array of emitters, a lamp correction profile for controlling the emitter outputs may also provide some degree of adjustment of intensity output of emitters of various pixels of the array 29 to compensate for the distance to the various regions of the face 23 of the panel 25 (e.g. in addition to compensation for shadow effects caused by the surface topology features on the face of the panel).
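The distance-dependent portion of such a lamp correction profile can be sketched numerically. The sketch below is a simplified model under assumed geometry (a fixture at a fixed height illuminating regions spaced down the wall); an actual profile would be derived from the optic design or from photometric measurement, not from this idealized inverse-square calculation alone.

```python
import math

def inverse_square_profile(mount_distance, wall_offsets):
    """Relative drive scaling that compensates inverse-square fall-off for
    regions at increasing distance down the face of the wall.

    mount_distance: fixture distance (any consistent units) from the
        nearest illuminated region on the wall.
    wall_offsets: distance of each region along the wall, measured from
        the nearest region (0.0 for the nearest region itself).
    Returns scaling factors normalized so the nearest region gets 1.0.
    """
    distances = [math.hypot(mount_distance, d) for d in wall_offsets]
    nearest = min(distances)
    # Inverse-square law: required relative output grows as distance^2.
    return [(dist / nearest) ** 2 for dist in distances]
```

Multiplying each pixel emitter's nominal drive level by its region's factor (subject to the emitters' maximum output) yields roughly uniform irradiance down the wall before any shadow-effect compensation is layered on top.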
The examples described here and shown in the various drawings provide adaptive general illumination of a face of an architectural panel to change the appearance of surface topological features on the face, for illumination applications like wall washing or wall grazing. The examples need not project an image onto the face 23 of the architectural panel 25. Hence, the lens or lenses used to implement an optic in a system like 21 or 31 need not be an imaging optic. The example optics 27a, 27b, particularly the implementations like that shown at 27b in
The system 21 of
The optic 27b may provide an efficient delivery of the light where desired in the various emission states of the emitter array 29. As noted, the examples are not projecting an image onto the face 23, but instead are providing adaptive general illumination of a face 23 of an architectural panel 25 to change the appearance of surface topological features on the face. The example optic 27b with the compound input and output surfaces is not actually an imaging optic, although it also may not actually be a non-imaging optic (in the strict optical-science sense of the ‘non-imaging’ term). By contrast, a projector would utilize an imaging optic.
In the example system 31, the optic 27b is a circular compound-surface lens (shown in cross-section without hatching), i.e. circular when viewed from a perspective along the optical axis A-A. The circular compound-surface lens is made of suitably shaped solid transparent material having aspheric or spheric surfaces. The circular lens is suitable, for example, for an array 29 that has a substantially square pixel matrix, such as that shown in
The compound-surface lens implementation 27b is positioned over or across the path of light outputs from the emitters of the pixel controllable array 29. The aspheric or spheric surfaces of the compound-surface lens 27b include, for example, a compound input surface facing in a direction to receive light from the array 29 and a compound output surface. In a circular implementation of the compound-surface lens 27b, the compound input and output surfaces are centered along the optical axis A-A.
The input surface of the compound-surface lens 27b, facing the emitters of the array 29, includes an input peripheral portion and an input central portion, both of which are somewhat convex in the illustrated example. The input peripheral portion extends from relative proximity to the array 29 toward an interface or edge formed at a junction with the input central portion; and the input peripheral portion has an angled convex curvature. The input central portion curves towards the array 29, e.g. with a convex curvature across the optical axis A-A and facing directly toward the array 29 in the illustrated example orientation. The convex central portion of the compound input surface is spheric in the example, e.g. corresponds in shape to a portion of a sphere.
The compound output surface (opposite the input surface and the array 29) includes an output lateral portion, an output shoulder portion, and an output body portion. The output lateral portion forms the outer peripheral surface of the lens of optic 27b. The output lateral portion is considered part of the compound output surface in that some light may emerge via at least part of that peripheral surface, although that surface may provide total internal reflection (TIR) for other light, depending on the angle of diffracted light rays from different emitters at different pixels of the array 29. The output lateral portion extends away from relative proximity to the array 29, where it forms an interface or edge at the junction with the peripheral portion of the compound input surface. The output lateral portion curves away from the interface or edge formed at the junction with the input peripheral portion of the lens input surface, and intersects the output shoulder portion at a distal edge or interface away from the array 29. The output shoulder portion of the output surface extends inward from the output lateral portion of the compound output surface to where the shoulder portion abuts the output body portion of the compound output surface. The output body portion curves outwards (convex) away from the array 29, e.g. with a convex curvature across the optical axis A-A and away from the edge formed at the abutment with the output shoulder portion. The convex output body of the compound output surface is spheric in the example, e.g. corresponds in shape to a portion of a sphere.
Incoming light rays for surface illumination, emitted by at least one of the illumination light emitters of the array 29, can first pass through the compound input surface where the incoming light rays undergo refraction to shape or steer the illumination lighting. After passing through the compound input surface, the refracted incoming light rays can then pass through the portions of the compound output surface where the refracted incoming light rays undergo further refraction to shape or steer the illumination lighting.
Alternatively or additionally, after passing through the compound input surface, the refracted incoming light rays can then strike the output lateral portion of the compound output surface (i.e. the peripheral wall/surface of the lens 27b) where the incoming light rays undergo total internal reflection (TIR) to further shape or steer the illumination lighting. After striking the output lateral portion, the refracted and TIR incoming light rays can pass through the output shoulder portion with further refraction.
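Whether a ray striking the output lateral portion refracts out through that surface or instead undergoes TIR is governed by Snell's law: rays hitting the boundary beyond the material's critical angle are totally internally reflected. The short sketch below illustrates the criterion; the refractive index value used in the test is a generic assumption for a transparent lens material, not a property of the particular optic 27b.

```python
import math

def critical_angle_deg(n_lens, n_outside=1.0):
    """Critical angle (degrees) for total internal reflection at the
    boundary between a lens of index n_lens and a medium of n_outside."""
    return math.degrees(math.asin(n_outside / n_lens))

def ray_is_tir(incidence_deg, n_lens):
    """True if a ray inside the lens strikes the surface at an angle
    (from the surface normal) beyond the critical angle, so it is
    totally internally reflected rather than refracted out."""
    return incidence_deg > critical_angle_deg(n_lens)
```

This is why light from emitters at different pixel positions, which reach the lateral surface at different incidence angles, may either escape there with refraction or be redirected by TIR toward the shoulder portion.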
With a compound-surface lens such as optic 27b, activation of different emitters at different pixels of the array 29 results in different refraction and thus different directions of light output. Additional information about lenses like the example of
The compound-surface lens type optic 27b may provide a more precise variable light output throw as a function of position of each activated pixel emitter of the array 29 relative to the various surface portions of the lens 27b. The compound-surface lens type optic 27b also may be more efficient in delivering the light to the appropriate regions of the face 23. The optic 27a, however, is a simpler, more common type of optic that may be cheaper to manufacture and deploy.
The example optics 27a and 27b are lenses. It should be appreciated that adaptive illumination systems of the type discussed herein may use other types of optical elements, such as reflective optics (e.g. one or more appropriately contoured mirrors). Mirrors or lenses or other optics may be specifically designed for the particular type of pixel controllable emitter array 29 and/or for the particular shape of face 23 (e.g. having a flat contour or having a particular non-flat contour).
At a high level, the general lighting system 100 includes an optic 110 and a high-resolution, pixel controllable array 111 of solid state light emitters. The optic 110 is generally similar to one of the optics discussed above relative to
The example system 100 also includes a driver 113 coupled to selectively drive individual light emitters at pixels of the array. A controller 114 is coupled to control the individual light emitters at pixels of the array 111, via the driver 113. The controller 114 is configured to control the emitters of the array 111 so as to adaptively illuminate surface topological features of the face by selectively activating ones of the light emitters to selectively direct light through the optic to selected ones of the regions of the face of the architectural panel.
The driver 113 includes circuitry coupled to control light outputs generated by the light emitters at the pixels of the array 111. Although the driver 113 may be implemented as an element of the controller 114, in the example, the driver 113 is located separately from the controller 114. The driver 113 may be a separate device on one or more integrated circuits, or driver 113 may be integrated on the same semiconductor chip as the emitters forming the array 111.
In the example using the high-resolution pixel controllable array 111 of white LED type emitters, the LED at each pixel is controllable with respect to ON/OFF states and supports variable in-between light output intensity settings. Other types of arrays may include two or more emitters of different color characteristics (e.g. white plus a specific color, RGB or other tri-color emitter sets, RGBW, etc.) at the individual pixels of the array to allow controlled adjustment of color characteristic of the pixel outputs.
The driver circuit 113 may be a matrix type driver circuit, such as an active matrix driver or a passive matrix driver. Although active-matrix driver circuitry may be used in the driver 113, to drive the individual emitters at the pixels of the array 111, passive matrix driver circuitry may be sufficient for many general illumination applications. For example, a passive matrix driver circuit may be a more cost effective solution to drive the emitters of the array 111 for general illumination applications such as described herein, particularly for any pixel emitter array configuration or application that need not be dynamically controlled at a fast refresh rate. An issue with passive matrix driving, however, is that achievable brightness scales inversely with the number of rows in the array of controllable pixel emitters, because each row is energized for only a fraction of each scan cycle. Both active matrix and passive matrix driving can independently control pixel outputs. For a driver circuit for an array that is not necessarily high-resolution, active matrix or passive matrix driving methods may not be required. In any event, the lighting system 100 provides general illumination light output from the array 111 through the optic 110 in response to lighting control signals received from the matrix driver 113.
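The brightness trade-off of passive-matrix driving follows from time-multiplexing: each row is energized for roughly 1/N of a frame, so the peak per-row drive must scale with the row count N to hold a given average output. The sketch below illustrates this relationship under an assumed linear LED response; the drive limits and levels are arbitrary example values, not parameters of any particular driver 113.

```python
# Illustrative sketch of passive-matrix scanning (assumed linear LED
# response; drive limits are arbitrary example values).

def passive_matrix_peak_drive(target_avg, num_rows, max_peak=1.0):
    """Peak drive level needed so a row energized at 1/num_rows duty
    cycle still averages target_avg over a frame; returns None when the
    required peak exceeds the driver's limit."""
    peak = target_avg * num_rows
    return peak if peak <= max_peak else None

def frame_schedule(pixel_levels):
    """One passive-matrix frame: rows are activated one at a time, with
    each column driven at that row's requested pixel level."""
    return [(row, levels) for row, levels in enumerate(pixel_levels)]
```

For a slowly changing wall-wash or grazing distribution, the refresh rate only needs to exceed the flicker-perception threshold, which is part of why passive-matrix circuitry can suffice where a display would require active-matrix drive.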
Equipment implementing functions like those of lighting system 100 may take various forms. The pixel controllable light generation array 111 and the optic 110 typically will be elements of a light fixture or other type of luminaire configured for angled surface illumination, such as wall grazing or wall washing. In some examples, the controller 114, matrix driver 113, high-resolution array 111 and optic 110 may be elements of a single hardware platform, e.g. a single luminaire. In other examples, some components attributed to the lighting system 100 may be separated from the pixel controllable light generation array 111 and the optic 110. Stated another way, a light fixture or other suitable type of luminaire may have all of the above hardware components of the system 100 on a single hardware device or in different somewhat separate units. In a particular hardware-separated example, one set of the hardware components may be separated from the pixel controllable light generation array 111, such that the controller 114 and the matrix driver 113 may control an array 111 from a remote location. In an alternative example, with each luminaire including a matrix driver together with the array 111, one controller 114 may control such luminaires to graze or wash the face(s) of one or more architectural panels.
As shown by way of example in
The host processing system 115 is coupled to the communication interface(s) 117. In the example, the communication interface(s) 117 offer a user interface function or communication with hardware elements providing a user interface for the general illumination system 100. The communication interface(s) 117 may communicate with other lighting systems at a particular premises. The communication interface(s) 117 may communicate with other control elements, for example, a host computer of a building and control automation system (BCAS). The communication interface(s) 117 also may support device communication with a variety of other systems of other parties, e.g. the device manufacturer for maintenance or an on-line server, such as server for downloading of software and/or configuration data.
The system 100 may also include one or more image sensor(s) 121. Such an image sensor 121 may be any type of digital camera (e.g. a surveillance camera, a mobile or wearable device with a camera, etc.) of suitable resolution for providing image data for use in the adaptive surface illumination operations described herein. If provided, an image sensor 121 may be coupled via a communication interface to provide data of one or more captured images of the illuminated area of the architectural surface for processing by the host processing system 115. The image sensor 121 may be coupled/operated to provide such image data during an initial set up for illumination of a particular surface area, for example, if the rest of the system will be statically configured to provide particular illumination of the surface area for some long period of time. Alternatively, the image sensor may be continuously available and utilized from time to time for dynamic changes in the array operations, to adapt illumination output to conditions that may change the shadow effects of various regions within the illuminated surface area of the face of the architectural panel.
The illustration, by way of example, shows a single processor in the form of the microprocessor 123. It should be understood that the controller 114 may include one or more additional processors, such as multiple processor cores, parallel processors, or specialized processors (e.g. a math co-processor). Particularly for dynamic image responsive control, it may be advantageous to include an image processor in addition to the microprocessor 123.
Although specially configured circuitry may be used in place of microprocessor 123 and/or the entire host processor system 115, the drawing depicts a processor-based example of the controller 114 in which functions relating to the controlled operation of the system 100, including operation of the high-resolution, pixel controllable array 111 of solid state emitters, may be implemented by the programming 127 and/or configuration data stored in a memory device 125 for execution by the microprocessor 123 (or other type of processor). The programming 127 and/or data configure the processor 123 to control system operations so as to implement functions of the system 100 described herein.
Aspects of the system 100 for adaptive surface illumination therefore include “products” or “articles of manufacture” typically in the form of firmware or other software that include executable code of programming 127 and/or associated configuration data (not separately shown) that is/are carried on or embodied in a type of machine readable medium. “Storage” type media include any or all of storage devices that may be used to implement the memory 125, any tangible memory of computers or the like that may communicate with the system 100 or associated modules of such other equipment. Examples of storage media include but are not limited to various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the programming 127 and/or the configuration data. All or portions of the programming and/or data may at times be communicated through the Internet or various other telecommunication networks. Such communications, for example, may enable loading of the programming and/or data from a computer or the like into the host processing system 115 of the controller 114, for example, from a management server or host computer of the lighting system service provider into a lighting system 100. Thus, another type of media that may bear the programming 127 and/or the data includes optical, electrical and electromagnetic waves, such as used across physical interfaces between local devices, through wired and optical landline networks and over various air-links. The physical elements that carry such waves, such as wired or wireless links, optical links or the like, also may be considered as media bearing the software. As used herein, unless restricted to non-transitory, tangible or “storage” media, terms such as computer or machine “readable medium” refer to any medium that participates in providing instructions to a processor for execution.
The light fixture 201 provides angled illumination of a face 203 of an architectural panel 205, for example, around the optical output axis A-A (see
For convenience, only small portions of the panel 205 and the face 203 are shown enlarged in these drawings. The surface 204 in the example is not uniformly flat and instead includes surface topological features that are raised or lowered from the average height of the face 203 of the panel 205, in the illustrated example orientation. Such features may be imperfections or may be deliberately provided as a texture or the like for the surface 204. Features may have a variety of shapes and sizes. For ease of illustration, these drawings show an example of the face 203 of the architectural panel 205 having a fairly regular recurring wave pattern as surface 204. In the cross-sectional views, the features provided by the wave include a peak 207, a valley 209 and another peak 211.
For convenience,
Using one or more processors of multiple lighting systems 100 (
In many applications, the camera(s) 213 may be on-site only during system set-up. Once the lamp correction profile is created and stored, the camera(s) 213 may no longer be necessary. In such an arrangement, a technician with one or more cameras might come back and run the set-up again, e.g. to obtain a new lamp correction profile for one or more light fixtures 201 in view of some change on or in the vicinity of the face of the wall or other panel.
In other installations/applications, it may be desirable to dynamically change operations of the emitters of one or more arrays (in one or more fixtures 201), for example, to compensate for dynamic changes affecting lighting of the face 203 of the panel 205. For example, it may be desirable to compensate for changes in natural light from a window or skylight that also may illuminate the face 203 of the panel 205. By way of another example, it may be desirable to compensate for shadows cast on the face of a wall or the like by movable objects, e.g. people, when in the vicinity of the panel 205. For the more dynamic, substantially real time control, the camera(s) 213 would be permanently installed in the vicinity of the panel 205 and aimed at the illuminated surface area of the panel face 203. After calibration, image data from the camera(s) 213 would provide feedback that the system 100 could use to adjust the output intensities of light from the emitters of the array(s) of the fixture(s) 201 illuminating the face 203 of the architectural panel 205 at a particular installation.
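The dynamic feedback described above can be sketched as a simple proportional correction step. This is an illustrative sketch only; the function name, the per-region measurement model and the gain value are assumptions for discussion, not part of the disclosure.

```python
def feedback_step(settings, measured, target, gain=0.5):
    """One proportional correction step: nudge each emitter's drive
    setting (0.0 to 1.0) toward a target measured intensity for the
    region of the panel face that the emitter illuminates.

    `measured` holds camera-derived intensities, one per emitter's
    region; both lists are hypothetical stand-ins for system data.
    """
    new_settings = []
    for s, m in zip(settings, measured):
        s = s + gain * (target - m)                 # proportional correction
        new_settings.append(min(1.0, max(0.0, s)))  # clamp to valid drive range
    return new_settings
```

Repeated over successive camera frames, such a step would tend to converge the measured region intensities toward the target, provided the gain is modest.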
A technician setting up a general lighting system for illuminating the architectural surface 203 might observe the bright and dark areas and provide manual inputs via a suitable user terminal device (e.g. mobile device or computer terminal), to adjust operations of the emitters of the pixel controllable array. Alternatively, locations and intensities of illuminations of the brighter and darker regions in the illuminated area of the architectural surface 203 may be detected by processing image data from a camera 213, serving as the image sensor(s) 121 in the system 100 of
The camera 213/121 or other sensor could be in the lighting system or even in the light fixture 201 or other luminaire portion of the lighting system 100. Most often, the camera would be remote from the light fixture 201 or other type of luminaire in order to provide a better imaging view of the face 203 of the panel 205. Hence, in the example of
The systems as shown in
The illustrations in
With specific reference to the example of
As shown in
As shown in
The combination of decreases of light directed through light throws 213, 215 and increases of light directed through light throws 217, 219 is selected to provide more uniform illumination (closer to a desired average value) in the different regions of the illuminated area of the surface 203. In this simple example, the emitters at the appropriate pixels of the array of the light fixture 201 are controlled to reduce intensity in some regions and increase intensity in other regions of the face 203, so as to implement an intended uniform illumination effect on the face 203, for example, to reduce perceptible striations as might help to hide imperfections in the face 203. Also, the increased uniformity may enable a wall wash type application in which the light fixture 201 is mounted closer to the architectural panel and aimed to output light at a smaller angle between the axis A-A of the fixture output (see
The example of
At a high level, these methods involve illuminating an area of the face of the architectural panel, at an acute angle relative to a plane of, or otherwise associated with, the face of the architectural panel (e.g. as in
The determination of output intensity settings, for example, may compile a lamp correction profile. The method also entails selectively changing output intensity of one or more of the light emitters of the array to change intensity of illumination of one or more of the identified regions and adjust relative intensity of illumination among the regions of the illuminated area of the face of the architectural panel, for example, by selectively changing emitter output intensities in response to the determined output intensity settings in the profile to achieve the intended uniformity or other intended illumination effect on the face of the architectural panel.
The method examples specifically relate to set-up techniques to define lamp correction profiles for use in establishing an intended illumination effect on the illuminated area of the face of the architectural panel. Similar procedures may be developed to use the data of images captured by the camera as feedback in control steps repeated over time, to repetitively adjust the pixel emitter output settings so as to maintain the intended illumination effect on the illuminated area of the face of the architectural panel, e.g. in response to changing conditions of other illumination of the face of the architectural panel by other sources of light or shadow.
A goal of any such procedure directed to achieving apparent uniformity may be to minimize perceptible striations on the illuminated surface area of the face of the architectural panel. The procedures may be configured to reduce differences in intensity across the illuminated area of the panel face, which may provide uniformity or may reduce gradients (rate of change) in intensity. These approaches tend to smooth out the appearance to an observer, e.g. eliminate visible striations.
With more specific reference to a first example in
Step S13 involves further processing the image data obtained as part of step S12 to calculate intensity settings for the emitters at the pixels of the controllable array to achieve a desired illumination effect on the area of the face of the panel. Calculated settings may be correlated to locations of detected regions within the illuminated area of the face of the architectural panel of relatively brighter and darker illumination. The collection of image settings for the entire array of emitters forms a profile, referred to for convenience as a lamp correction profile. The settings in the profile may be calculated to achieve other effects, but for purposes of an example, the illustrated flow relates to establishing a suitable degree of uniformity (e.g. minimal perceptible striations) across the illuminated area on the face of the architectural panel.
For example, profile formation in S13 may also involve determining output intensity settings for the emitters of the pixel controllable array to achieve the intended illumination effect, for example a desired degree of uniformity, on the illuminated area of the face of the architectural panel. In such an example, the processing in step S13 may calculate new settings to reduce differences in illumination intensity between the previously detected regions of relatively brighter and darker illumination. The resulting lamp correction profile, for example, may specify emitter output intensities, e.g. in steps from 0% (full OFF) to 100% (maximum or full ON), for each of the pixels of the array in fixture 201 to achieve the intended illumination effect on the face 203 of the architectural panel 205.
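As one possible concrete representation (an assumption for illustration, not the disclosed implementation), a lamp correction profile might simply map each pixel of the array to a drive level in percent, with brighter regions dimmed and darker regions boosted:

```python
def make_profile(rows, cols, default=100):
    """Initial lamp correction profile: every pixel at the same level (percent)."""
    return {(r, c): default for r in range(rows) for c in range(cols)}

def correct(profile, bright_pixels, dark_pixels, step=10):
    """Dim the pixels that light over-bright regions and boost the pixels
    that light under-lit regions, clamped to the 0-100% drive range."""
    out = dict(profile)
    for p in bright_pixels:
        out[p] = max(0, out[p] - step)
    for p in dark_pixels:
        out[p] = min(100, out[p] + step)
    return out
```

The pixel-to-region association used to pick `bright_pixels` and `dark_pixels` would come from the calibration procedures described later.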
A variety of algorithms may be used in step S13 to calculate the intensity setting values for inclusion in the lamp correction profile. Iterative procedures and procedures using machine learning are described later with regard to other flow charts. For example, the processor may process the image data to determine locations and degrees of variation in contrast, and adjust the intensity of light output directed to different regions of the illuminated area of the panel face to reduce changes in contrast across the face. Contrast corresponds to the slope or gradient of intensity. Therefore, rather than necessarily producing actually uniform illumination intensity, achieving contrast uniformity may only require that changes in the slope or gradient of intensity are sufficiently small. The illumination is sufficiently uniform when the changes in contrast are low enough to be imperceptible to a human observer.
The lower graph represents a relatively uniform intensity versus location of light on the illuminated area of the face, as represented by a relatively horizontal line (zero slope). Where the slope is fairly constant, a person would see the illumination as uniform. For discussion purposes, the illustration includes a zig-zag in the line. The sudden deviation from the slope creates a perceptible striation. Where the deviation is a sudden increase above the horizontal line, the observer would perceive brighter illumination. Where the deviation is a sudden decrease below the horizontal line, the observer would perceive a darker area (a darker illumination striation).
The upper graph represents observable illumination intensity that changes linearly with location of light on the illuminated area of the face, as represented by way of example by a decreasing line of relatively constant slope (e.g. lower actual intensity further away from the light fixture). Where the slope is fairly constant, a person still may perceive the illumination as uniform. For discussion purposes, the illustration includes a zig-zag in the upper line. The sudden deviation from the slope creates perceptible striation. Where the deviation is a sudden increase above the downwardly sloped line, the observer may perceive a brighter illumination. Where the deviation is a sudden decrease below the downwardly sloped line, the observer may perceive a darker area (darker illumination striation).
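The zig-zag deviations in both graphs can be detected numerically. The following is a hypothetical sketch using a second difference as a rough proxy for the change in slope that produces a perceptible striation:

```python
def striation_locations(intensity, threshold):
    """Flag positions where the change in slope (second difference) of a
    sampled intensity profile exceeds a perceptibility threshold."""
    flagged = []
    for i in range(1, len(intensity) - 1):
        # second difference approximates the local change in slope
        second_diff = intensity[i + 1] - 2 * intensity[i] + intensity[i - 1]
        if abs(second_diff) > threshold:
            flagged.append(i)
    return flagged
```

A constant-slope ramp (the straight line in either graph) produces no flags, while the zig-zag is flagged even though the average slope is unchanged.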
A simple algorithm example might calculate an average slope of illumination intensity across the illuminated area, based on data of the captured image for that area. Then, the algorithm might select emitters corresponding to light regions and dark regions and adjust the respective intensity settings of those emitters down or up relative to the average slope setting, by an amount selected so that each emitter provides approximately the average slope of illumination intensity at the angled region of the face where light from the respective emitter illuminates the surface.
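That simple algorithm can be sketched as a least-squares fit of the average slope followed by per-position corrections. The function names are hypothetical, and integer positions stand in for regions along the illuminated area:

```python
def fit_average_slope(intensity):
    """Least-squares slope and intercept of measured intensity versus
    position (0, 1, 2, ...) along the illuminated area."""
    n = len(intensity)
    mean_x = (n - 1) / 2
    mean_y = sum(intensity) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(intensity))
    den = sum((x - mean_x) ** 2 for x in range(n))
    slope = num / den
    return slope, mean_y - slope * mean_x

def corrections_toward_slope(intensity):
    """Per-position corrections that pull each measured sample onto the
    fitted average-slope line (positive = raise that emitter's output,
    negative = dim it)."""
    slope, intercept = fit_average_slope(intensity)
    return [(slope * x + intercept) - y for x, y in enumerate(intensity)]
```

A measurement that already follows a constant slope needs no correction; a bright bump yields a negative correction (dim that emitter) and a dark dip a positive one.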
Subsequently, step S14 implements the selective control of output intensity of one or more of the light emitters of the array, for example, to change intensity of illumination of one or more of the identified regions (in the example, as a change from the intensities in the uniform appearance state of S11) so as to adjust relative intensity of illumination among the regions of the illuminated area of the face of the architectural panel. In the example flow using the lamp correction profile, the system (e.g. from any of
The method example of
In the example of
Other techniques may be used to structure light emissions for use in the alignment or calibration step S23. For example, the structured light emission might involve cycling through and driving the pixel emitters at the same (uniform slope) set intensity (e.g. each full ON, or each 75% of full ON, or the like) one at a time or by rows and columns, and capturing successive images in the different structured emission states of the output light from the fixture 201 on the face 203 of architectural panel 205. In this approach, the image processing would identify emitters for correlation with illuminated locations on the face of the architectural panel by correlating emission times of the emitters with detected illumination timing via the camera images. In another approach to structured light emission, the pixel emitter outputs may be individually modulated, e.g. pulse width modulated with codes or the like, for use in matching an illuminated region on the face of the panel exhibiting the code/modulation with the particular emitter that produced the respective modulated light output. The code/modulation approach provides a form of multiplexing that enables detection of illumination on the surface from some number of the emitters during each cycle of capturing one or more images sufficient in number to detect the code/modulation.
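The one-at-a-time structured emission approach can be sketched as follows. Here `shine_fn` is a hypothetical stand-in for driving a single emitter and capturing a camera image reduced to per-region intensities; it is an assumption for illustration, not a disclosed interface:

```python
def map_emitters_to_regions(shine_fn, n_emitters, n_regions):
    """Cycle each emitter ON by itself and record which region of the
    panel face brightens most, building the emitter-to-region mapping
    used for calibration."""
    mapping = {}
    for e in range(n_emitters):
        intensities = shine_fn(e)   # one structured-light capture
        mapping[e] = max(range(n_regions), key=lambda r: intensities[r])
    return mapping
```

The code/modulation variant would replace the one-at-a-time loop with simultaneous, individually coded emissions and a decoding step over a short sequence of captured images.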
With these or other structured light emission strategies, the data of one or more images of the illumination of the face 203 with the structured light is processed to determine the relationship between light outputs from emitters at pixels in the array and locations of regions of the illuminated area of the face 203 of the architectural panel when illuminated by the emissions from the respective pixel emitters through the optic.
The order of the steps is shown by way of example only. For example, the uniform illumination and image detection in steps S21, S22 may follow the structured illumination, image data processing and calibration in step S23.
Once calibrated, step S24 involves processing the image data obtained in step S22 and the correlation of the pixel emitters of the array to locations of illumination from the emitters on the illuminated area of the face 203 in step S23, to calculate intensity setting values for a lamp correction profile corresponding to a desired surface illumination effect. In the example, the intended effect is a relatively high degree of uniformity, although other effects may be achieved, such as a particular degree of intended shadow effects to show desired textures. The lamp correction profile would specify output intensities, e.g. in steps from 0% (full OFF) to 100% (maximum or full ON), for each of the pixels of the array in fixture 201 to achieve the desired illumination effect on the face 203 of the architectural panel 205. The setting adjustments for the various pixel emitters of the array may be calculated at S24 in a manner similar to examples discussed above relative to step S13 of
Step S25 then involves applying the lamp correction profile to drive the emitters of the pixel controllable array of the light fixture, according to the setting values, so as to illuminate the face 203 of the architectural panel 205, including the surface topological features of the face 203. The settings enable the array output light, as distributed through the optic, to vary the illumination across the area of the face 203 so as to achieve (within a suitable tolerance) the intended illumination effect.
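Applying the profile amounts to translating its percent settings into driver commands. As a small illustrative sketch (the 8-bit PWM driver width is an assumption, not a disclosed parameter):

```python
def duty_counts(profile, resolution=255):
    """Convert percent drive levels from a lamp correction profile into
    PWM duty-cycle counts for an assumed 8-bit emitter driver."""
    return {pixel: round(pct * resolution / 100) for pixel, pct in profile.items()}
```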
Once calibrated and if configured with profiles or other forms of setting data defining some number of target or predefined variations in illumination effects, dynamic control, for example, may allow a user to select or ‘dial’ between a uniform illumination setting that gives a visual appearance of a flat non-textured surface (analogous to a wall washing application) and an illumination pattern that emphasizes the light and dark regions of the panel face created by the surface topological features (analogous to a wall grazing of a textured wall or the like). If the camera is available to provide feedback during operations of the fixture, the system may process image data from the camera to further adjust the correction profile and thus the output light distribution to maintain a setting for a desired illumination effect, e.g. using the camera and associated data processing as a feedback loop to maintain uniformity or desired lighting of textural features.
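The user-selectable ‘dial’ between a wash-like and a graze-like effect could, for example, interpolate between two stored profiles. The names and the linear blend are illustrative assumptions:

```python
def blend_profiles(wash, graze, t):
    """Linearly interpolate per-pixel drive levels between a stored
    uniform 'wash' profile (t=0) and a texture-emphasizing 'graze'
    profile (t=1)."""
    return {pixel: (1 - t) * wash[pixel] + t * graze[pixel] for pixel in wash}
```

A physical dial or slider in a user interface would then simply map its position onto `t`.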
As noted, a method like that of
The example of
The example of
Hence, in each iteration of the loop, step S33 analyzes illumination intensity data obtained from the image captured by the camera in step S32 to determine if the distributed light outputs from the array via the optic provide suitably uniform illumination of the face of the panel. If not, then processing branches from S33 to step S34. The processing in step S34 calculates new intensity setting values for the emitters at the pixels of the controllable array to form a lamp correction profile. If a non-imaging optic is used, it may be helpful to utilize an optimization algorithm to correlate changes in the output intensities of the emitters of the array to contrast in the illumination of the surface so as to minimize perceptible striations. An example optimization algorithm is a genetic algorithm.
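A minimal genetic-style optimization of the profile might look like the following sketch. Here `measure_fn` is a hypothetical stand-in for driving the array and reading back per-region intensities from the camera, and the population size, mutation scale and cost function are illustrative assumptions rather than disclosed values:

```python
import random

def striation_cost(settings, measure_fn):
    """Sum of squared differences between adjacent measured region
    intensities -- a rough stand-in for perceptible contrast."""
    m = measure_fn(settings)
    return sum((a - b) ** 2 for a, b in zip(m, m[1:]))

def optimize_profile(measure_fn, n_emitters, generations=50, pop=12, seed=1):
    """Elitist genetic-style search over emitter settings in [0, 1]:
    keep the best half of each generation and mutate it to refill."""
    rng = random.Random(seed)
    population = [[rng.random() for _ in range(n_emitters)] for _ in range(pop)]
    for _ in range(generations):
        population.sort(key=lambda s: striation_cost(s, measure_fn))
        parents = population[: pop // 2]           # elitism: keep best half
        children = [[min(1.0, max(0.0, v + rng.gauss(0, 0.05))) for v in p]
                    for p in parents]              # mutated copies
        population = parents + children
    return min(population, key=lambda s: striation_cost(s, measure_fn))
```

Because the parents survive each generation, the best cost never worsens; each camera-scored generation can only hold or reduce the striation metric.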
In the iterative process example of
After each execution of the profile calculation in step S34, the process flow returns to step S31. The step S31 shines the output light based on the latest iteration of the lamp correction profile in step S34. Then, the steps S32 and S33 are repeated to capture an image, process the data of the image to detect any remaining non-uniformity of illumination on the surface area of the face, and determine from analysis of the image data whether the illumination is now sufficiently uniform.
The iterative process would repeat with adjustments to the lamp correction profile until a profile is determined that achieves a desired degree of perceptible uniformity of illumination across the illuminated area of the face 203, as determined at step S33. For example, the system might repeat the iterative processing until all regions of the illumination area appear to have substantially the same illumination intensity and/or acceptable contrast (slope or gradient of intensity variation) in the image captured by the camera, within a defined level of tolerance not readily apparent to a human observer.
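The stop condition at S33 can be sketched as a simple fractional-tolerance test over the measured region intensities. The 5% default is an illustrative assumption, not a disclosed value:

```python
def is_uniform(region_intensities, tolerance=0.05):
    """True when every region's measured intensity is within `tolerance`
    (as a fraction of the mean) of the mean -- i.e. the remaining
    variation should not be readily apparent to an observer."""
    mean = sum(region_intensities) / len(region_intensities)
    return all(abs(v - mean) <= tolerance * mean for v in region_intensities)
```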
The machine learning based processing loop would repeat with adjustments to the lamp correction profile until a profile is determined that achieves a desired degree of perceptible uniformity of illumination across the illuminated area of the face 203, as determined at step S43. For example, the system might repeat the machine learning processing until all regions of the illumination area appear to have substantially the same illumination intensity and/or minimal variations in contrast in the image captured by the camera, within a defined level of tolerance not readily apparent to a human observer, similar to the tolerance(s) discussed above in the example of
The image analysis and processing to calculate intensity settings values for a profile in the examples of
The image data relating to non-uniform surface illumination provides a logical ‘map’ of light and dark areas and relative differences in detected intensity as the face of the panel is illuminated. With a uniform intensity light output distribution from the fixture, the intensity distribution ‘map’ in the image data relates closely to the angled illumination from the fixture impacting the surface topological features of the particular illuminated face/panel. Although not necessarily used for illumination control, the image data may be processed to form a contour map of the features of the illuminated area of the face of the panel.
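The logical ‘map’ described above can be sketched as a simple classification of camera samples against the mean image intensity. The threshold value is an illustrative assumption:

```python
def light_dark_map(image, threshold=0.15):
    """Classify each cell of a grayscale image as bright (+1), dark (-1)
    or neutral (0) relative to the mean intensity, yielding the logical
    map of light and dark regions on the illuminated face."""
    flat = [v for row in image for v in row]
    mean = sum(flat) / len(flat)
    def label(v):
        if v > mean * (1 + threshold):
            return 1
        if v < mean * (1 - threshold):
            return -1
        return 0
    return [[label(v) for v in row] for row in image]
```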
For convenience, the illustrated examples and description thereof have generally assumed a relatively static surface (e.g. the contour of the face and/or configuration of the surface topology features of the face of the architectural panel do not change over time, although the ambient lighting thereof may change). The adaptive illumination light fixtures and the methods of illumination, however, may be readily adapted to a panel and/or face that changes over time. A face of an outdoor panel, for example, may change over time due to a buildup of some kind of material on the face, e.g. due to falling rain, snow or the like, or due to pollutants from repeated precipitation events or from air-borne contaminants. The feedback using a camera, in such an example, may enable adjustment of the outputs of the emitters at the pixels of the array to maintain the desired illumination effect while adjusting for the buildup at different locations on the illuminated area of the face. Changes in outputs of the array may also compensate for changes in reflectivity of the face of the panel, e.g. due to dirt or the like. By way of another dynamic control example for a potentially changeable panel, an architectural panel may include a door. If the door opens, the processing of the image data from the camera may be capable of detecting that changed condition and adjusting the output of light from the pixel controllable array of solid state light emitters through the optic, e.g. to reduce glare by dimming the light delivered to the area of the open door.
The illustrations for the examples discussed in detail above related to panels with faces having relatively flat macro-contours, such as walls, ceilings, floors or countertops. Such flat contoured panels have a flat plane PL associated with the face 203 of the architectural wall/panel 205 (see
It may be helpful to consider a few curved contour examples, with respect to
The curved face of the panel has macro-contour MC corresponding to an intended or ideal shape for the curved panel face. The curve of the MC may correspond to minima, maxima or an average height of the features. In the example, the macro-contour MC corresponds to the points at the maximum distances from the opposite surface of the panel. The curvature may be three-dimensional, although the drawing only shows a two-dimensional curvature of MC, in the side or cross-sectional view.
A curved contour such as MC will have tangential planes associated with various points on the curve. In the illustration, the line at TPL-TPL represents the tangential plane relative to MC that touches the curve of MC at the point where the light fixture output axis A-A intersects contour MC. The line/plane N-N is normal to MC and TPL where the axis A-A intersects MC.
The light fixture 201 is aimed so that its optical output axis A-A is oriented at an acute angle θ relative to the tangential plane TPL-TPL of the face of the curved architectural panel. The surface(s) of the face in this example forms a number (one or more) of surface topological features, generally similar to the wave shown in the earlier example of
For convenience, the surface topological features are shown in somewhat exaggerated enlarged form. The surface topological features may be raised or lowered from the average of the MC curve of the face of the panel. Such features may be imperfections or may be deliberately provided as a texture or the like for the surface. Features may have a variety of shapes and sizes.
In a situation where the wall is not flat, an active grazing light output or an active wall wash light can still be used. The same results would still be desirable and may be achieved using control as in the earlier processing examples.
It will be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second and the like may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “includes,” “including,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises or includes a list of elements or steps does not include only those elements or steps but may include other elements or steps not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “a” or “an” does not, without further constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
Unless otherwise stated, any and all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. Such amounts are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain. For example, unless expressly stated otherwise, a parameter value or the like may vary by as much as ±10% from the stated amount.
In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various examples for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed examples require more features than are expressly recited in each claim. Rather, as the following claims reflect, the subject matter to be protected lies in less than all features of any single disclosed example. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
While the foregoing has described what are considered to be the best mode and/or other examples, it is understood that various modifications may be made therein and that the subject matter disclosed herein may be implemented in various forms and examples, and that they may be applied in numerous applications, only some of which have been described herein. It is intended by the following claims to claim any and all modifications and variations that fall within the true scope of the present concepts.
Number | Name | Date | Kind |
---|---|---|---|
9198262 | Bosua | Nov 2015 | B1 |
9210779 | Bosua | Dec 2015 | B2 |
9347642 | Catalano | May 2016 | B2 |
9464767 | Whitfield | Oct 2016 | B2 |
9470406 | Catalano | Oct 2016 | B2 |
9635737 | Bosua | Apr 2017 | B2 |
9658438 | Forrester | May 2017 | B2 |
9883563 | Bosua | Jan 2018 | B2 |
9936566 | Alexander | Apr 2018 | B2 |
10136490 | Mao | Nov 2018 | B1 |
10139077 | Kang | Nov 2018 | B2 |
10190746 | Mao | Jan 2019 | B1 |
10215391 | Newton | Feb 2019 | B2 |
10267486 | Mao et al. | Apr 2019 | B1 |
20130058103 | Jiang et al. | Mar 2013 | A1 |
20140036511 | Whitfield | Feb 2014 | A1 |
20140084809 | Catalano | Mar 2014 | A1 |
20160258588 | Grötsch et al. | Sep 2016 | A1 |
20170018215 | Black | Jan 2017 | A1 |
20170153004 | De Zwart | Jun 2017 | A1 |
20170160528 | Liu | Jun 2017 | A1 |
20170175987 | Newton | Jun 2017 | A1 |
20180073686 | Quilici et al. | Mar 2018 | A1 |
20190235257 | Mao et al. | Aug 2019 | A1 |
Number | Date | Country |
---|---|---|
102012201494 | Aug 2012 | DE |
Entry |
---|
Linda Trego, “Osram's hybrid LED combines light-emitting chip and individual pixel control,” Autonomous Vehicle Technology, Sep. 26, 2017 (2 pages). |
LBC Lighting, “Wall Wash Lighting & Wall Graze Lighting Techniques,” Blog Search Article downloaded from https://www.lbclighting.com/blog/2016/05/wall-wash-lighting/, May 27, 2016 (7 pages). |
Chris Stobing, “Short Throw vs. Long Throw Projectors: What's the Difference?”, Gadget Review, updated Apr. 10, 2018 (12 pages). |
Elizabeth Donoff, “Wallwashing and Wall Grazing,” Architectural Lighting, posted on Sep. 3, 2015 (5 pages). |
JEDI—Targetti, “JEDI Series,” downloaded from http://targettiusa.net/outdoor/jedi/, Jul. 5, 2018 (5 pages). |
U.S. Appl. No. 15/868,624, entitled “Optical Lens for Beam Shaping and Steering and Devices Using the Optical Lens,” filed Jan. 11, 2018 (72 pages). |
U.S. Appl. No. 15/914,619, entitled “Lighting Device With Optical Lens and Beam Pattern Adjustment Using the Optical Lens With Driving Techniques,” filed Mar. 7, 2018 (94 pages). |
U.S. Appl. No. 15/924,868, entitled “Lighting Device With Optical Lens for Beam Shaping and Illumination Light Source Matrix,” filed Mar. 19, 2018 (118 pages). |