The present disclosure relates generally to systems, apparatus, and methods for monitoring, analyzing, and automating aspects of the built environment. More specifically, the present disclosure relates to systems, apparatus, and methods for using existing lighting infrastructure and networks to integrate the physical world with computer-based systems and networks to more efficiently and accurately perform monitoring, analysis, and automation in the built environment.
Over the past several years, intelligent lighting systems combining light sources (e.g., solid state), sensors, networks, and autonomous control have grown in popularity as the prices of solid-state lighting devices have fallen and their efficacy has increased. However, the core functionality of these systems has generally been focused on two areas—energy management and lighting control.
As the costs of sensing, networking, and autonomous control also drop, intelligent lighting systems and/or networks may become the Trojan horse that brings the Internet of Things (IoT) to the built environment, thereby integrating the physical world with computer-based systems and networks for improved efficiency, accuracy, and economic benefit.
The IoT may include one or more networks of uniquely identifiable physical things (e.g., wearable objects, vehicles, buildings, etc.), each thing embedded with one or more of hardware, software, a sensor, an actuator, and network connectivity that enables the thing to collect data and interoperate (e.g., exchange data and instructions) within the existing network infrastructure (e.g., the Internet). This interconnection of devices, systems, and services may lead to automation covering a variety of protocols, domains, and applications (e.g., a smart grid, a smart home, smart transportation, and even a smart city) as well as generation of large amounts of diverse types of data.
The built environment may include, for example, manmade surroundings and supporting infrastructure that provide the settings for human activity. In the built environment, intelligent lighting systems and/or networks have a few key features making them particularly well-suited for automation and data generation, namely ubiquity, power, sensing, and networking. First, artificial lighting is already ubiquitous in the built environment, including indoor and outdoor fixtures for general, accent, and task lighting (e.g., streetlamps, decorative holiday lights, and under-cabinet lighting). Not only are lighting fixtures all around us, but lighting fixtures also are directly connected to existing energy infrastructure. For at least solid-state lighting fixtures, auxiliary power may be diverted to other sensing, networking, and control devices without the added expense of making space and/or installing new connections (e.g., wiring). Furthermore, real-time connections (wired or wireless) to local and wide area networks allow intelligent lighting systems to provide a platform for integration with other devices, systems, and services. The range of sensor types and the granularity of sensor deployment may enable generation of a stream of data about the built environment that can be used for other purposes.
In one embodiment, a system to monitor an environment includes a plurality of sensors configured to utilize existing infrastructure supporting a plurality of lighting fixtures in the environment. The existing infrastructure includes a plurality of physical locations provided in the environment for the plurality of lighting fixtures, an electric power supply including a plurality of electrical connections for powering the plurality of lighting fixtures, and/or a first network including a plurality of network connections for exchanging data with the plurality of lighting fixtures. Each sensor of the plurality of sensors is operable to detect at least one change in at least one portion of the environment. The system also includes at least one memory configured to store at least one record of each change detected by each sensor of the plurality of sensors. The at least one record includes a time stamp indicating a time measure associated with the change and a location stamp indicating a physical location of the corresponding sensor. The at least one memory is further configured to store at least one rule governing at least one response by the system to at least one change in at least one portion of the environment based on historical information and machine learning. The system also includes at least one processor operably coupled to the plurality of sensors and the at least one memory, the at least one processor configured to monitor the plurality of sensors for at least one change in the at least one portion of the environment and generate the at least one response based on the at least one record of the at least one change and the at least one rule. The at least one response includes a report of the historical data associated with the at least one record, an alert associated with the at least one record, a digital control signal to control at least one lighting fixture of the plurality of lighting fixtures, and/or a modification of the at least one rule.
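The record-and-rule arrangement described above can be sketched in code. The following is a minimal illustration only; the class, field, and response names are assumptions invented for this sketch and do not appear in the disclosure.

```python
from dataclasses import dataclass
import time

@dataclass
class SensorRecord:
    timestamp: float   # time stamp indicating a time measure associated with the change
    location: tuple    # location stamp indicating the physical location of the sensor
    change: str        # description of the detected change

def occupancy_rule(record: SensorRecord) -> str:
    """Illustrative rule: respond to an occupancy change with a
    digital control signal for the co-located lighting fixture."""
    if record.change == "occupied":
        return "activate_fixture"
    if record.change == "vacant":
        return "deactivate_fixture"
    return "no_action"

record = SensorRecord(timestamp=time.time(), location=(2, 4, 7), change="occupied")
response = occupancy_rule(record)  # -> "activate_fixture"
```

In practice the rule set would be stored in the at least one memory and refined over time (e.g., by machine learning), rather than hard-coded as here.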
In an embodiment, the digital control signal controls the at least one lighting fixture of the plurality of lighting fixtures to activate the at least one lighting fixture, deactivate the at least one lighting fixture, flash the at least one lighting fixture, select a light level of visible light delivered by the at least one lighting fixture, select a color temperature of visible light delivered by the at least one lighting fixture, deliver nonvisible light with the at least one lighting fixture, and/or provide directional cues using the at least one lighting fixture. The selected color temperature of the visible light may be red. The nonvisible light may include infrared light.
In an embodiment, the plurality of sensors includes at least one sensor mechanically connected with at least one lighting fixture of the plurality of lighting fixtures such that the at least one sensor and the at least one lighting fixture share the same physical location of the plurality of physical locations provided in the environment for the plurality of lighting fixtures.
In an embodiment, the plurality of sensors includes at least one sensor electrically connected with at least one lighting fixture of the plurality of lighting fixtures such that the at least one sensor and the at least one lighting fixture share the same electrical connection of the plurality of electrical connections for powering the plurality of lighting fixtures.
In an embodiment, the plurality of sensors includes at least one sensor communicatively connected with at least one lighting fixture of the plurality of lighting fixtures such that the at least one sensor and the at least one lighting fixture share the same network connection of the plurality of network connections to exchange data over the first network.
In an embodiment, the report of the historical data associated with the at least one record and/or the alert associated with the at least one record includes a graphical representation. The graphical representation includes an occupancy map, a heat map, a temporal grid, a graph, and/or a scatter plot.
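One way such an occupancy map or heat map might be derived from stored records is by aggregating location-stamped events into a grid. The record format and grid dimensions below are assumptions for illustration, not part of the disclosure.

```python
from collections import Counter

# Each record carries a location stamp expressed here as a (row, col) grid cell.
records = [
    {"time": 1.0, "loc": (0, 0)},
    {"time": 2.0, "loc": (0, 0)},
    {"time": 3.0, "loc": (1, 2)},
]

def occupancy_map(records, rows, cols):
    """Count location-stamped change events per grid cell,
    yielding the data behind an occupancy or heat map."""
    counts = Counter(r["loc"] for r in records)
    return [[counts.get((r, c), 0) for c in range(cols)] for r in range(rows)]

grid = occupancy_map(records, rows=2, cols=3)
# grid -> [[2, 0, 0], [0, 0, 1]]
```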
In an embodiment, the plurality of sensors includes, but is not limited to, an active infrared sensor, a passive infrared sensor, a microwave sensor, a radar device, a lidar device, a photometer, a radiometer, an ambient light sensor, a digital camera, a thermographic camera, an electric beacon receiver, a radio-frequency identification (RFID) chip reader, a microphone, a sonar device, a seismograph, an ultrasound sensor, a temperature sensor, a humidity sensor, a chemical sensor, a smoke sensor, and/or a particulate matter sensor.
In an embodiment, the plurality of lighting fixtures in the environment have a first spatial resolution of lighting fixtures per unit area, the plurality of sensors have a second spatial resolution of sensors per unit area, and the first spatial resolution and the second spatial resolution are the same. In an embodiment, the plurality of physical locations may include at least one overhead physical location and/or at least one task height physical location.
In an embodiment, the plurality of electrical connections includes a wire, a cable, a raceway, a bus bar, a bus duct, a junction box, and/or a power meter. In an embodiment, the plurality of network connections may include at least one of a wired connection and a wireless connection. In an embodiment, the first network is at least one of a point-to-point network, a bus network, a star network, a ring network, a mesh network, a tree network, a hybrid network, and a daisy chain network.
In another embodiment, a method for monitoring an environment includes providing a plurality of sensors configured to utilize existing infrastructure supporting a plurality of lighting fixtures in the environment. The existing infrastructure includes a plurality of physical locations provided in the environment for the plurality of lighting fixtures, an electric power supply including a plurality of electrical connections for powering the plurality of lighting fixtures, and/or a first network including a plurality of network connections for exchanging data with the plurality of lighting fixtures. Each sensor of the plurality of sensors is operable to detect at least one change in at least one portion of the environment. The method also includes monitoring the plurality of sensors for at least one change in the at least one portion of the environment and storing at least one record of the at least one change detected by at least one sensor of the plurality of sensors. The at least one record includes at least one time stamp indicating at least one time measure associated with the at least one change and at least one location stamp indicating at least one physical location of the at least one sensor. The method further includes providing at least one rule governing at least one response to the at least one change based on historical information and machine learning and generating the at least one response based on the at least one record and at least one rule. The at least one response includes a graphical representation of the historical data associated with the at least one record, an alert associated with the at least one record, a digital control signal to control at least one lighting fixture of the plurality of lighting fixtures, and a modification of the at least one rule.
In another embodiment, a system for tracking at least one of objects and individuals in an environment includes a plurality of sensors configured to utilize existing infrastructure supporting a plurality of lighting fixtures in the environment. The existing infrastructure includes a plurality of physical locations provided in the environment for the plurality of lighting fixtures, an electric power supply including a plurality of electrical connections for powering the plurality of lighting fixtures, and a first network including a plurality of network connections for exchanging data with the plurality of lighting fixtures. Each sensor of the plurality of sensors is operable to detect a presence or an absence of at least one of an object and an individual in at least one portion of the environment. The system also includes at least one memory configured to store records of changes in the presence or the absence of the at least one of the object and the individual as detected by the plurality of sensors. Each of the records includes a time stamp indicating a time measure associated with a change and a location stamp indicating a physical location of a sensor detecting the change. The at least one memory also is configured to store rules governing a response by the system to at least one change in the presence or the absence of the at least one of the object and the individual in at least one portion of the environment based on historical information and machine learning. The system further includes at least one processor operably coupled to the plurality of sensors and the at least one memory. The at least one processor is configured to generate the response based on at least one record of at least one change and at least one rule. 
The response includes a report of the historical data associated with the at least one record, an alert associated with the at least one record of the at least one change, a digital control signal to control at least one lighting fixture of the plurality of lighting fixtures, and a modification of the at least one rule.
In an embodiment, the system is configured for tracking an identification, a location, a movement, a density, a distribution, and/or a pattern of the at least one of the object and the individual in the environment. In an embodiment, the system is further configured for security monitoring, search and rescue, inventory management, marketing research, and/or space utilization.
It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the inventive subject matter disclosed herein. It should also be appreciated that terminology explicitly employed herein that also may appear in any disclosure incorporated by reference should be accorded a meaning most consistent with the particular concepts disclosed herein.
Other systems, processes, and features will become apparent to those skilled in the art upon examination of the following drawings and detailed description. It is intended that all such additional systems, processes, and features be included within this description, be within the scope of the present invention, and be protected by the accompanying claims.
The skilled artisan will understand that the drawings primarily are for illustrative purposes and are not intended to limit the scope of the inventive subject matter described herein. The drawings are not necessarily to scale; in some instances, various aspects of the inventive subject matter disclosed herein may be shown exaggerated or enlarged in the drawings to facilitate an understanding of different features. In the drawings, like reference characters generally refer to like features (e.g., functionally similar and/or structurally similar elements).
The present disclosure describes systems, apparatus, and methods for monitoring, analyzing, and automating aspects of the built environment, including the appropriation of existing lighting infrastructure and networks to integrate the physical world with computer-based systems and networks for improved efficiency, accuracy, and economic benefit.
Most modern building automation systems have a similar architecture regardless of vendor. In most cases, this consists of a common network backbone (also known as a “primary bus”), with modules or subsystems corresponding to specific building systems. As shown in
Intelligent lighting systems leverage the ubiquity of lights in the built environment to deploy a pervasive network of sensors and controls throughout a wide variety of buildings and public spaces. The data that these systems may generate—occupancy, temperature, energy, asset tracking, ambient light, among other types—may be used to improve the operation of non-lighting building automation subsystems. For example, real-time data about where people are in a facility can be used to make HVAC systems more comfortable and more efficient, for instance, by increasing cooling power in a crowded conference room or by automatically modifying temperature set points in an empty office.
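The HVAC example above can, in principle, be reduced to a simple setpoint policy driven by real-time occupancy counts. The thresholds, setpoints, and function name below are invented for illustration and are not taken from the disclosure.

```python
def hvac_setpoint_c(occupancy_count: int) -> float:
    """Pick a cooling setpoint (degrees Celsius) from real-time occupancy:
    cool a crowded room harder, relax the setpoint when the space is empty."""
    if occupancy_count == 0:
        return 26.0   # empty office: relax setpoint to save energy
    if occupancy_count > 10:
        return 21.0   # crowded conference room: increase cooling power
    return 23.0       # normal occupancy: comfort setpoint

hvac_setpoint_c(0)   # -> 26.0
hvac_setpoint_c(15)  # -> 21.0
```

A deployed system would feed such a policy from the lighting system's occupancy sensors rather than a manually supplied count.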
In some embodiments, intelligent lighting networks may be used as a backbone for connecting various pieces of building automation equipment, which might otherwise not be cost-effective to deploy because of the expense of connecting them to the Internet. For example, in the event of an emergency, intelligent lighting systems may provide real-time data to fire and life safety systems about where people are located in a building in order to streamline evacuation. Real-time feeds of occupancy data from an intelligent lighting system may make security systems more reliable and effective, by dramatically expanding sensor coverage while automatically ignoring false alarms.
Applications or apps 306 may be communicatively connected with platform 300 over at least one application program interface (API) 308 to provide a user with information and to receive user input. An API 308 includes a set of requirements governing how a particular app 306 can communicate and interact with platform 300, including routine definitions, protocols, and tools. An API may be an open API, that is, publicly available and providing developers with programmatic access to certain functions of platform 300.
Apps 306 may be stored locally or remotely on a server or downloaded to one or more personal computing devices, such as a desktop computer, laptop, tablet, smartphone, etc. Similarly, platform 300 may be communicatively connected with one or more third party systems 310 over the API 308 to provide output and receive input regarding various aspects of an environment related to a third party, such as HVAC performance in the environment managed by a repair and maintenance service, energy efficiency in the environment related to an energy provider, and suspicious changes in the environment subject to a security or law enforcement agency.
In addition to data storage 312 and security features 314, the architecture of an intelligent lighting system-based platform 300 may include a connectivity module 316 that functions to bolster reliable and real-time communication over a variety of physical layers. The physical layers may include hardware, such as sensors 302, control devices 304, and interfaces for communicating with apps 306. The connectivity module 316 may manage and/or provide connectivity between these physical layers. A device management module 318 may manage and/or provide for commissioning devices, over-the air firmware upgrades, and/or system health metrics. A rule engine module 320 may manage and/or provide real-time behavioral control of the platform 300, sensors 302, control devices 304, and/or communications to apps 306 or third party systems 310. A visualization module 322 may manage and/or provide graphical representation of data collected from the sensors, examples of which are further described herein.
An analytics module 324 may manage and/or provide machine learning algorithms to generate actionable insights based on changes (or the absence of expected changes) sensed in the environment. Actionable insights may include generating a security alert based on the detection of suspicious activity or the presence (or absence) of changes in the environment, generating an inventory report based on tracking particular objects in a warehouse, scheduling use of particular spaces for activities or objects based on analyses of space utilization in an office or factory. Examples of techniques employing machine learning algorithms are further described herein. An external interfaces module 326 may manage and/or provide integration with other systems, for example, using APIs.
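As a minimal sketch of how an analytics module might flag the absence of expected changes, a z-score test of the latest reading against historical activity is one possibility. The function name, threshold, and sample data are assumptions for illustration; the disclosure does not specify a particular algorithm.

```python
import statistics

def is_anomalous(history, latest, z_threshold=3.0):
    """Flag a reading that deviates strongly from historical values,
    e.g., no motion events during hours that are normally busy."""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > z_threshold

# Hourly motion-event counts for a normally busy corridor:
history = [40, 38, 42, 41, 39, 40]
is_anomalous(history, 0)   # -> True: suspiciously quiet
is_anomalous(history, 41)  # -> False: within normal variation
```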
According to some embodiments, data storage 312 may include one or more memory devices in communication with the connectivity module 316, the device management module 318, the rule engine module 320, the visualization module 322, the analytics module 324, and/or the external interfaces module 326. For example, stored data may include rules used by the rule engine module 320 for real-time behavioral control, sensor data used by the visualization module 322 to generate graphical reports, and/or actionable insights generated and tested by the analytics module 324. The modules described above may be implemented using hardware, software, and/or a combination thereof.
Platform management resources may be deployed locally or remotely (e.g., via cloud computing) to manage at least one platform servicing at least one environment. Platform management resources may be configured for, but are not limited to, data storage, connectivity, device management, rule management, visualization, analytics, and/or external interfaces.
Intelligent lighting systems may require network infrastructure to function, but this infrastructure may take different forms based on the requirements of specific applications and end users. In some embodiments, the network infrastructure may include a standalone network. In this network an intelligent lighting system may have its own separate data infrastructure. This may enhance security, reduce costs, and improve convenience. In some embodiments, the network infrastructure may include a facility local area network (LAN). In this network an intelligent lighting system may piggyback on the existing facility LAN directly or via a virtual LAN (VLAN). This may reduce costs and improve manageability. In some embodiments, the network infrastructure may include a cellular network. In this network an intelligent lighting system may provide its own cellular or wireless wide area network (WAN) backhaul connection to obviate the need to connect to the facility network altogether.
In some embodiments, intelligent lighting systems may require software deployment to function. The deployment may be on-premise deployment, off-site deployment, or a combination thereof based on the requirements of specific applications and end users. In some embodiments, software deployment may include on-premise deployment. The software may be installed on an appliance and/or an end-user server. In some embodiments, the software deployment may include off-site deployment. The software may be hosted off-site and/or virtualized on a private or public cloud.
In some embodiments, web and mobile clients 404 interact with multi-site cloud management 402 and/or connected devices 406. Web and mobile clients may include facility-level computing infrastructure provided for an environment and/or personal computing devices, such as a desktop computer, laptop, tablet, smartphone, etc. In some embodiments, web and mobile clients may be used to interface with platform appliance 400 by, for example, displaying the results of sensor data processed by management 402.
In some embodiments, platform appliance 400 communicates with connected devices 406, which may include one or more sensors, control devices, power sources, and/or other IoT devices 412. One or more connected devices may be connected to platform appliance 400 with a Power over Ethernet (PoE) switch 414, which allows network cables to carry electrical power to the connected devices 406, and/or a wireless gateway 416, which routes information from a wireless network to another wired or wireless network. Platform appliance 400 also may communicate over an API 418 with an energy monitoring system (EMS) 420, which monitors and reports energy data to support decision making, and/or a building management system (BMS) 422, which controls and monitors a building's mechanical and electrical equipment (e.g., ventilation, lighting, power, and fire and security equipment).
The multi-tier architecture comprising multi-site cloud management 402, web and mobile clients 404 and connected devices 406 may enable distribution of the processing load, thereby providing flexibility to work with a range of customer sizes and respective IT requirements (e.g., an individual facility or facilities of a multi-national corporation).
The overlay of nodes and connections may represent a physical topology including device location and cable installation (i.e., the actual physical connections between devices unless wirelessly connected) and/or a logical topology describing the routes data takes between nodes. In
One or more of intelligent lighting fixtures in an environment may be connected with one or more control devices and/or sensors distributed throughout the environment in accordance with some embodiments. The connection between a lighting fixture and a control device or sensor may be mechanical, electrical, and/or network-based. For example, a microphone may be mechanically connected to a lighting fixture via a shared housing or a physical attachment to the fixture housing. The microphone may be electrically connected to the fixture by sharing a power source, and/or communicatively connected to the fixture by sharing a network and/or communication interface.
In some embodiments, an intelligent lighting system-based platform provides a network and connectivity between a plurality of sensors. Accordingly, the overlay in
In some embodiments, sensor 602 may perform occupancy sensing such that platform 606 controls lighting fixture 600 to provide light as needed based on the occupancy of a space. In some embodiments, sensor 602 may perform ambient light sensing such that platform 606 controls lighting fixture 600 to adjust light level accordingly (i.e., daylight harvesting). Lighting fixture 600 may be configured for full-range dimming for optimal energy saving and/or visual comfort. Sensor 602 may include an on-board power meter to measure and/or validate energy used by lighting fixture 600. Platform 606 may process data (e.g., occupancy, ambient light, energy usage, etc.) from sensor 602 to generate reports and/or optimize operations within the space. Device 608, which displays an energy usage report, may be interfaced with external management system 610 to implement optimizations generated by platform 606.
In some embodiments, a sensor network is installed via installation of lighting fixtures with integrated sensors, installation of a control device to connect sensors with existing lighting fixtures, or some combination thereof. In some embodiments, the installed network is connected to the platform, resulting in one or more of integration of the sensors with an intelligent lighting system platform, occupancy sensing to provide light when and where needed, daylight harvesting to adjust light levels based on ambient light, full-range dimming for visual comfort and optimal savings, and on-board power metering for measuring and validating energy use. In some embodiments, the platform provides reports and/or automatically adjusts operations for optimization.
Sensor data may be sent to an odometry module 712 stored, for example, in onboard memory or at least one remote memory device. Odometry module 712 may use the collected data to compute a plurality of parameters, such as energy usage, power up time, active time, and temperature. Sensor data may be sent to at least one event log 714 stored, for example, in onboard memory or at least one remote memory device. Events may be time-stamped (e.g., based on clock 706) and/or location-stamped (e.g., based on the position of a source sensor and/or the odometry module 712). Event records may include activity or inactivity statuses for at least one sensor as well as deviations from, for example, a range or threshold (e.g., a temperature above an expected threshold).
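A hedged sketch of the event log described above, with time-stamped and location-stamped entries and a deviation flag when a reading leaves an expected range, might look as follows. The record fields, log structure, and temperature range are assumptions invented for this illustration.

```python
import time

EXPECTED_TEMP_RANGE = (15.0, 30.0)  # assumed operating range, degrees Celsius

event_log = []

def log_event(sensor_location, reading_c, clock=time.time):
    """Append a time-stamped, location-stamped record, marking
    deviations from the expected temperature range."""
    low, high = EXPECTED_TEMP_RANGE
    event_log.append({
        "timestamp": clock(),                       # e.g., based on clock 706
        "location": sensor_location,                # position of the source sensor
        "reading_c": reading_c,
        "deviation": not (low <= reading_c <= high),
    })

log_event((3, 1), 22.5)  # in range: deviation False
log_event((3, 1), 41.0)  # above threshold: deviation True
```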
According to some embodiments, sensor data may be sent from at least one sensor data handler 710 to at least one network handler 716, which may provide feedback to odometry module 712, event log 714, and/or at least one configuration register 718, which controls aspects of the system such as an active level of ambient light, an inactive level of ambient light, a dimming ramp up period, a dimming ramp down period, an active power usage target, an inactive power usage target, a sensor delay, etc.
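One way a control loop might consume such configuration values is sketched below, using ambient light as the controlled quantity. The configuration keys, thresholds, and command names are illustrative assumptions mirroring, but not quoting, the registers described above.

```python
# Assumed configuration register contents (values are illustrative).
config = {
    "active_light_target": 500,    # target lux when the space is occupied
    "inactive_light_target": 100,  # target lux when the space is unoccupied
}

def control_step(occupied: bool, measured_lux: float) -> str:
    """One pass of the control loop: pick the target for the current
    occupancy state and emit a driver command if the reading misses it."""
    key = "active_light_target" if occupied else "inactive_light_target"
    target = config[key]
    if measured_lux < target:
        return "brighten"
    if measured_lux > target:
        return "dim"  # e.g., correct excess power usage by dimming fixtures
    return "hold"

control_step(True, 300)   # -> "brighten"
control_step(False, 300)  # -> "dim"
```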
In particular, control loop 720 controls connected devices (e.g., lighting, HVAC, etc.) based on configurations stored in configuration register 718 and sensor data from sensor data handler 710. For example, control loop 720 may receive sensor data indicating active occupancy, a measured ambient temperature, and a measured ambient light level. Based on this input, control loop 720 may compare the measured ambient temperature and measured ambient light level to configurations defining an active target temperature and an active target light level. If the sensor data indicates inactivity, control loop 720 may compare the measured ambient temperature and measured ambient light level to different configurations defining an inactive target temperature and an inactive target light level. If sensor data does not meet one or more configurations, control loop 720 may activate at least one driver 722 to adjust connected devices. For example, too much power usage may be corrected with a control signal to dim one or more connected lighting fixtures (e.g., as shown in
In some embodiments, the spatial density of, for example, occupancy sensing built into intelligent lighting systems (e.g., a sensor built into every single light fixture) makes lighting a particularly useful platform on top of which to build other systems. For example, the platform may be integrated with building management systems, including legacy protocols and special-purpose protocols.
According to some embodiments, one or more APIs may be provided for communication and/or interaction between the platform and users. For example, an API may be provided for mobile apps and/or web apps. An API may be a closed API for in-house development or an open API, that is, publicly available and providing developers with programmatic access to certain functions of the platform. The API may limit access rights and/or require user authorization. The API may provide information in real time and/or in buffered form. In some embodiments, an API provides alerts and/or reports. The API may provide historical information, current information, and/or predictive information.
Intelligent lighting systems can consist of a range of different device types, each with its own capabilities. Some devices may be fully integrated, with light source, networking, sensing, and control software combined in a single physical object, but it also may be useful to deploy devices containing only a subset of these four key features. In some embodiments, one or more devices may include, but are not limited to, an electromagnetic radiation sensor, a chemical sensor, a climate sensor, a sound sensor, etc. Sensors may include an analog and/or digital sensor.
Electromagnetic radiation sensors may include, but are not limited to, an active infrared sensor (e.g., a forward looking infrared-like imaging array), a passive infrared (PIR) sensor, a microwave sensor, a radar device, a lidar device, a photometer, a radiometer, an ambient light sensor, a digital camera, a thermographic camera, an electric beacon receiver, a charge-coupled device (CCD) image sensor, a complementary metal-oxide-semiconductor (CMOS) image sensor, a radio-frequency beacon sensor, and a radio-frequency identification (RFID) chip reader. Active infrared sensors may be used to detect and/or track people, vehicles, and/or other objects in, for example, low light or visually noisy environments (e.g., monitoring process equipment in refineries or manufacturing facilities). PIR sensors have relatively low resolution (e.g., a single bit indicating presence or absence) and typically require line of sight but are relatively inexpensive for deployment throughout an environment. Microwave sensors may be used to detect occupancy through barriers (e.g., walls, ceilings, and other obstacles). Ambient light sensors may be useful for daylight harvesting and, in some circumstances, motion detection. CCD or CMOS image sensors may be used for detection (e.g., facial recognition) and tracking (e.g., path reconstruction). An RF beacon sensor may be deployed in mobile devices carried by people, vehicles, and/or other objects (e.g., as an app including push notifications, mapping, and routing). An RFID reader may be used to detect and track people, vehicles, and/or other objects with RFID chips (e.g., identification cards, windshield stickers, etc.).
Chemical and/or biological sensors may include, but are not limited to, an oxygen sensor, a carbon dioxide sensor, a carbon monoxide sensor, an ammonia sensor, a radioactivity sensor, a DNA analysis device, a pathogen sensor, and a particulate matter sensor. For example, a carbon dioxide sensor may be used as a backup to an occupancy sensor. Some sensors may indicate hazardous or suspicious conditions in an environment, such as a fire (e.g., a smoke detector), an unexpected or undesirable presence of something (e.g., carbon monoxide sensor, radioactivity sensor, ammonia sensor to detect cooling system leakage, a pathogen sensor, etc.), an unexpected or undesirable absence of a substance (e.g., oxygen sensor), or explosive conditions (e.g., a particulate matter sensor in, for example, a grain processing plant).
Climate sensors may include, but are not limited to, a temperature sensor, a humidity sensor, a seismometer, and a pressure sensor. Such sensors also may indicate conditions in an environment (e.g., fire, flood, earthquake, and occupancy), data from which can be used to activate safety and security systems or to optimize an HVAC system.
Sound sensors may include, but are not limited to, a microphone, a vibration sensor, a sonar device, and an ultrasound sensor. Such sensors may be used to measure ambient noise, detect occupancy (including intruders), and flag changes in sound signatures to indicate disruptions among people and maintenance needs among machines. Although relatively expensive, ultrasonic sensors may be used, particularly in high security risk environments, to convert reflected ultrasonic waves into electrical signals, enabling impressive detection capabilities (e.g., being able to “see around corners”).
According to some embodiments, an intelligent lighting system platform supports a very robust profile scheduling feature—specific parts of a facility can be configured to behave in specific ways at specific times of day. Facilities do not always operate on a rigid, fixed schedule, so manual configuration of when each profile should run is challenging. In some embodiments, information captured about an entire facility is used to learn the facility's operating rhythms (e.g., when the site is empty, sparsely occupied, or busy) and automatically apply an appropriate profile.
In one embodiment, the week is divided into 672 fifteen-minute segments (7×24×4). For each segment, historical data is used to calculate average occupancy. The observed occupancy range is divided into n classes, where n is specified by the facility manager (typically 2≤n≤4). Classification thresholds are calculated to partition the observed occupancy range into the n classes, and a profile is assigned to each fifteen-minute segment based on its classification.
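As a non-limiting illustration of the segmentation-and-classification approach described above, the following Python sketch assumes a hypothetical history format (a mapping from segment index to observed occupancy counts) and caller-supplied profile names; these names and structures are illustrative assumptions, not part of any particular embodiment:

```python
from statistics import mean

def assign_profiles(history, n_classes, profiles):
    """history: dict mapping segment index (0..671, i.e., 7x24x4) to a
    list of observed occupancy counts; profiles: list of n_classes
    profile names ordered from least to most occupied."""
    averages = {seg: mean(obs) for seg, obs in history.items()}
    lo, hi = min(averages.values()), max(averages.values())
    # Divide the observed occupancy range into n equal-width classes.
    width = (hi - lo) / n_classes or 1  # guard against a zero-width range
    schedule = {}
    for seg, avg in averages.items():
        cls = min(int((avg - lo) / width), n_classes - 1)
        schedule[seg] = profiles[cls]
    return schedule
```

For example, with two classes, a segment whose historical average occupancy falls in the lower half of the observed range would be assigned the first (e.g., “sparse”) profile, and one in the upper half the second (e.g., “busy”) profile.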
According to some embodiments, instead of manual configurations, the parameters of individual profiles (e.g., active and inactive levels, sensor delays, progressive dimming ramp times, and daylight harvesting (DH) targets) are automatically adjusted in order to optimize energy savings while maintaining employee safety and productivity.
In one embodiment, a gradient descent/simulated annealing technique is used to optimize profile parameters, with a fitness function that combines energy usage and information about frequency of manual overrides. This technique may be more useful as an offline process (e.g., updating parameters once each day) rather than in real time.
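A minimal simulated-annealing sketch of this offline optimization is shown below in Python. The fitness function is assumed to be supplied by the caller and to combine energy usage with a penalty for manual overrides; the parameter names, step size, and cooling schedule are illustrative assumptions:

```python
import math
import random

def anneal(params, fitness, step=0.1, t0=1.0, cooling=0.95,
           iters=200, seed=0):
    """Minimize `fitness` (e.g., weighted energy use plus a penalty per
    manual override) over a dict of numeric profile parameters."""
    rng = random.Random(seed)
    best = current = dict(params)
    best_f = cur_f = fitness(current)
    t = t0
    for _ in range(iters):
        # Perturb every parameter by a small random amount.
        cand = {k: v + rng.uniform(-step, step) for k, v in current.items()}
        f = fitness(cand)
        # Accept improvements always; accept worse moves with a
        # probability that shrinks as the temperature cools.
        if f < cur_f or rng.random() < math.exp((cur_f - f) / t):
            current, cur_f = cand, f
            if f < best_f:
                best, best_f = cand, f
        t *= cooling
    return best
```

Run, for example, once per day as a batch process, this would nudge profile parameters (dimming levels, sensor delays, and the like) toward settings that lower the combined energy-and-override cost.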
In some embodiments, techniques are applied to avoid wasted energy and reduce the number of false positives generated, for example, by security applications running on top of a sensor network. For example, PIR occupancy sensors may be used to detect motion of people and/or vehicles. PIR sensors rely on changes in temperature and may be falsely triggered by, for example, warm airflow in a cool environment.
In one embodiment, a conditional probability function is generated for each sensor. Similar to the technique used in automatic profile scheduling described above, the function outputs the likelihood that a sensor event is spurious based on the fifteen-minute time segment in which the event occurs and the real-time state of neighboring sensors. These probability functions may be updated in a batch/offline manner to incorporate new data and capture potential facility changes.
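One non-limiting way to estimate such a per-sensor conditional probability function in Python is a smoothed frequency table keyed on the time segment and the neighbor state, built offline from labeled historical events. The event tuple format and the use of Laplace smoothing are illustrative assumptions:

```python
from collections import defaultdict

def build_spurious_model(events):
    """events: iterable of (segment, neighbors_active, was_spurious)
    tuples, where segment is the fifteen-minute slot (0-671) and
    neighbors_active is True if any neighboring sensor fired nearby
    in time."""
    counts = defaultdict(lambda: [0, 0])  # key -> [spurious, total]
    for seg, neighbors, spurious in events:
        c = counts[(seg, neighbors)]
        c[1] += 1
        if spurious:
            c[0] += 1

    def p_spurious(segment, neighbors_active):
        # Laplace smoothing keeps unseen (segment, state) pairs at 0.5.
        s, n = counts[(segment, neighbors_active)]
        return (s + 1) / (n + 2)

    return p_spurious
```

Rebuilding the table in a batch/offline pass incorporates new data and captures facility changes, as described above.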
In some embodiments, techniques are applied to tune the sensitivity of a sensor network. For example, PIR sensors are inherently analog but output a digital signal. Sensors that require analog-to-digital conversion have a sensitivity parameter. If the sensitivity is set too high, a sensor will produce false positives (e.g., implying motion when none is present). If the sensitivity is set too low, a sensor will produce false negatives (e.g., failing to sense motion that is occurring).
In one embodiment, data from an intelligent lighting system platform is fed back into each individual sensor to automatically adjust sensitivity. For example, when one sensor is not being triggered even though all of the neighboring sensors indicate activity, the sensitivity of that particular sensor is increased. Conversely, when one sensor is being triggered even though all of the neighboring sensors are not indicating activity, the sensitivity of that particular sensor is decreased. These adjustments may be carried out either in real-time (as each new sensor event arrives) or as a batch/offline process (once per day, for example).
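The feedback rule described above can be sketched as follows in Python; the sensor record format, step size, and the requirement that all neighbors agree are illustrative assumptions:

```python
def adjust_sensitivity(sensor, neighbors, step=0.05, lo=0.0, hi=1.0):
    """sensor and neighbors are dicts with 'triggered' (bool) and
    'sensitivity' (float, 0-1); returns the sensor's new sensitivity."""
    active = sum(n["triggered"] for n in neighbors)
    if not sensor["triggered"] and active == len(neighbors):
        # Likely false negative: every neighbor sees activity.
        sensor["sensitivity"] = min(hi, sensor["sensitivity"] + step)
    elif sensor["triggered"] and active == 0:
        # Likely false positive: no neighbor sees activity.
        sensor["sensitivity"] = max(lo, sensor["sensitivity"] - step)
    return sensor["sensitivity"]
```

The same rule can run in real time (on each new sensor event) or in a batch pass over the previous day's events.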
In some embodiments, an intelligent lighting system platform may be used as a system for tracking people, vehicles, and/or other moving objects along a spectrum of granularity depending on the requirements of specific applications, ranging from aggregate numbers and anonymized patterns to real-time facial recognition and path reconstruction. In some embodiments, a sensor network may be visible in order to, for example, deter undesirable behaviors or activities (e.g., theft). In other embodiments, sensors are disguised via integration into a lighting infrastructure for surveillance purposes or for user comfort in the environment. For example, beacons or other sensors embedded in overhead lights may allow transportation service providers to track traffic and parking patterns (e.g., intersections, parking lots, etc.), employers to track employee productivity (e.g., on factory lines, in break rooms, etc.), service providers to track physical queues and adjust coverage (e.g., in banks, checkout lines, etc.), retailers to track physical shoppers (e.g., to design store layouts, prevent shoplifting, etc.), security and law enforcement providers to identify and track suspects or suspicious activities (e.g., in high risk locations such as transportation hubs, lockdown situations, etc.), and rescue providers to locate and track individuals (e.g., in fires, missing person cases, etc.).
Monitoring and securing physical space is an important goal for many users of intelligent lighting. Because these lighting systems often provide very high sensor resolution (in terms of sensors per square foot), real-time monitoring, and a network connection to the outside world, they can be repurposed to augment or replace traditional security systems.
In some embodiments, a physical security system processes occupancy information (either anonymized “activity” data or tracking information about specific people) in real time and generates “alerts” according to rules programmed into the system. For instance, one rule may require sending an alert to a facility manager if any occupancy sensor indicates activity after 5:00 P.M. Another rule may require sending an executive an alert if the occupancy sensor in her office indicates activity while she is on vacation. In some embodiments, a system may be configured to detect not just the presence of a person, vehicle, or object in an environment, but also whether that person, vehicle, or object is or is not authorized to be in that environment. In some embodiments, a knowledgeable user defines these rules (e.g., typical operating hours of the facility, who is allowed to access each part of a facility, etc.).
In some embodiments, techniques derived from the field of machine learning can reduce the need to update rules (as, e.g., usage of space shifts over time) or “false positives” (e.g., occupancy events caused by environmental factors such as airflow or temperature changes). Parameters for the rules discussed above may be learned through analysis of historical sensor data for the particular environment and/or other similar environments. Learning algorithms may be run at any or all geographic scales—from a full facility down to individual rooms.
In some embodiments, each occupancy event is assigned a coefficient (a “probability of suspicion”) based on these learned parameters, the current state of other nearby sensors, and/or the overall activity level in the environment to reduce the number of false positive alerts.
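By way of illustration only, the following Python sketch combines the three signals named above into a “probability of suspicion.” The specific weighting (halving isolated events, discounting busy sites) is an illustrative assumption, not a prescribed formula:

```python
def probability_of_suspicion(event, baseline, neighbors_active,
                             site_activity):
    """event: dict with a 'segment' key; baseline: dict mapping each
    fifteen-minute segment to the learned probability that the space
    is occupied then. Higher scores mean more likely a real intrusion."""
    p_expected = baseline.get(event["segment"], 0.5)
    score = 1.0 - p_expected           # rare-hour events are more suspicious
    if not neighbors_active:
        score *= 0.5                   # isolated events are often sensor noise
    # A generally busy site dilutes the suspicion of any single event.
    score *= 1.0 - min(site_activity, 1.0) * 0.5
    return score

def should_alert(event, baseline, neighbors_active, site_activity,
                 threshold=0.6):
    return probability_of_suspicion(
        event, baseline, neighbors_active, site_activity) >= threshold
```

Under these assumed weights, an after-hours event in a normally empty space scores high and triggers an alert, while a business-hours event on a busy site with no neighbor corroboration does not.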
The real-time data from this lighting-based security system may be processed and presented to a user via an application similar to
The data generated by an intelligent lighting system may provide insight into how spaces are actually used. For example, the data may be used to more intelligently schedule meeting rooms in commercial office settings or to allocate space in a facility that follows “hoteling” practices.
In some embodiments, occupancy data generated by intelligent lighting systems may be used to optimize space utilization. For example, even with shared calendars and scheduling software, shared spaces (e.g., conference rooms) often end up “overbooked” (the scheduled time exceeds the time for which it is actually needed), “underbooked” (the scheduled time is less than the time for which it is actually needed), or “squatted” (used without scheduling in the shared calendar).
In some embodiments, real-time occupancy data is automatically fed back into a calendar system to give a more accurate representation of space utilization. For example, when occupancy sensing detects that a meeting has ended earlier than scheduled, the end time of its entry in the calendar system automatically may be shortened. When occupancy sensing detects that a meeting is running longer than scheduled, the end time of its entry in the calendar system automatically may be lengthened. When occupancy sensing detects a group “squatting” in an unscheduled conference room, an “ad-hoc” entry in the calendar system automatically may be created.
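The three reconciliation cases above (shorten, lengthen, ad-hoc) can be sketched as a single Python function. The booking record format, the grace period, and the default ad-hoc duration are illustrative assumptions:

```python
from datetime import datetime, timedelta

def reconcile_booking(booking, occupied_now, now,
                      grace=timedelta(minutes=10)):
    """booking: dict with 'start' and 'end' datetimes, or None when
    the room has no reservation; returns the updated booking."""
    if booking is None:
        if occupied_now:
            # Squatters: create an ad-hoc calendar entry.
            return {"start": now, "end": now + timedelta(minutes=30),
                    "ad_hoc": True}
        return None
    if not occupied_now and now < booking["end"] - grace:
        booking["end"] = now           # meeting ended early; release the room
    elif occupied_now and now >= booking["end"]:
        booking["end"] = now + grace   # meeting running long; extend
    return booking
```

Run on each occupancy update, this keeps the calendar system in step with how the room is actually being used.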
In addition to occupancy data, an intelligent lighting system platform may generate large amounts of other high-resolution data, including temperature and/or humidity data. By providing significantly more sampling points (e.g., one per light fixture versus one per thermostat), this data feed may allow HVAC systems to better understand the range of temperature and humidity in a facility, as well as pointing out any particular hot or cold spots. In some embodiments, this data may be combined with software to allow users to specify a desired temperature in their “personal space” (be it, e.g., an office, a cubicle, or any other region of the facility). Real-time high-resolution data may enable a level of personalized climate control that is not possible with traditional HVAC systems.
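As a non-limiting sketch of the hot/cold-spot detection mentioned above, per-fixture temperature readings can be screened for statistical outliers in Python; the fixture-id mapping and the z-score threshold are illustrative assumptions:

```python
from statistics import mean, stdev

def find_hot_cold_spots(readings, z=2.0):
    """readings: dict mapping fixture id to its temperature reading;
    returns (hot, cold) lists of fixtures more than z standard
    deviations from the facility-wide mean."""
    temps = list(readings.values())
    mu = mean(temps)
    sigma = stdev(temps) if len(temps) > 1 else 0.0
    hot = [f for f, t in readings.items() if sigma and t > mu + z * sigma]
    cold = [f for f, t in readings.items() if sigma and t < mu - z * sigma]
    return hot, cold
```

With one sampling point per light fixture rather than one per thermostat, such a screen can flag localized hot or cold spots that a single-thermostat HVAC zone would average away.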
In some embodiments, an intelligent lighting system-based platform supports an energy management and/or equipment optimization system. Real-time monitoring and management may support energy and/or resource optimization as well as identification of problems or hazards. For example, data gathered from the sensors included in the intelligent lighting system may be processed to monitor and manage the energy usage for an area. In some embodiments, an energy management system provides measurement of lighting energy usage with integrated power metering and other electrical loads, validation of energy savings and projections for utility rebates, and/or monitoring trends and progress toward returns on investments.
In some embodiments, the intelligent lighting systems could be used as an equipment optimization system. The data gathered from the sensors included in the intelligent lighting system could be processed to monitor and manage the power usage for at least one piece of equipment over a period of time, thereby tracking when machinery is being used and how much energy it is consuming. Real-time monitoring and managing of power usage for at least one piece of equipment may be used to optimize the performance and effectiveness of the at least one piece of equipment. Power metering data also may be used to identify when machinery needs maintenance attention.
In some embodiments, an intelligent lighting system platform may be used as a space optimization system. Data gathered from the sensors included in an intelligent lighting system may be processed to monitor and manage how spaces are actually used. Real-time monitoring and managing spaces could have a plurality of applications (in addition to scheduling meeting rooms, allocating desk space in a facility, or allocating rooms in a hotel), including gathering data about how a space is being used, how people travel through a space, and how to maximize limited space and productivity. A space optimization system may be used to optimize a layout of a store, warehouse, or factory floor, for example, where to position a new display, item, or piece of equipment.
In some embodiments, data gathered from the sensors included in the intelligent lighting system may be processed to monitor and manage the system health of some embodiments of the intelligent lighting system.
In some embodiments, an intelligent lighting system platform is used as an inventory management system. Manufacturers, distributors, retailers, and many other entities share a need to track specific objects (and their conditions), whether parts moving along an assembly line, pallets on a truck, consumer goods at a drug store, or medications in a hospital. Intelligent lighting systems with built-in object tracking capabilities (e.g., RFID, visual object recognition, or some other technology) may provide this functionality to augment or replace traditional inventory management systems.
Object tracking capabilities may augment the functionality of some embodiments of an intelligent lighting system as an inventory management system. In some embodiments, an intelligent lighting system has built-in object tracking capabilities. In other embodiments, an intelligent lighting system has devices with object tracking capabilities mounted thereon or otherwise connected to the lighting system platform. The sensors may be RFID chip readers to recognize chips attached to the objects. Alternatively or in addition, sensors with visual object recognition capabilities may be used to detect and track objects. In some embodiments, a heat map may be used to visualize inventory or objects within a facility.
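One non-limiting way to build such an inventory heat map in Python is to bin tag detections into a coarse facility grid, counting distinct tags per cell; the coordinate convention and grid dimensions are illustrative assumptions:

```python
from collections import Counter

def inventory_heat_map(reads, grid_w, grid_h):
    """reads: iterable of (tag_id, x, y) RFID or vision detections,
    with x, y in [0, 1) facility coordinates; returns a grid_h x
    grid_w list of lists giving the number of distinct tags seen in
    each cell."""
    cells = Counter()
    seen = set()
    for tag, x, y in reads:
        cell = (int(y * grid_h), int(x * grid_w))
        # Count each tag at most once per cell, even across repeated reads.
        if (tag, cell) not in seen:
            seen.add((tag, cell))
            cells[cell] += 1
    return [[cells[(r, c)] for c in range(grid_w)]
            for r in range(grid_h)]
```

The resulting grid can be rendered as a color-coded overlay on a facility floor plan to visualize where inventory is concentrated.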
While various inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.
The above-described embodiments can be implemented in any of numerous ways. For example, embodiments may be implemented using hardware, software or a combination thereof. When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible format.
Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN), or the Internet. Such networks may be based on any suitable technology, may operate according to any suitable protocol, and may include wireless networks, wired networks, or fiber optic networks.
The various methods or processes outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.
Also, various inventive concepts may be embodied as one or more methods, of which an example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
All publications, patent applications, patents, and other references mentioned herein are incorporated by reference in their entirety.
All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e., “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.” “Consisting essentially of,” when used in the claims, shall have its ordinary meaning as used in the field of patent law.
As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.
This application claims a priority benefit of U.S. Provisional Patent Application No. 62/196,225, filed on Jul. 23, 2015, and entitled “Intelligent Lighting Systems for Monitoring the Built Environment,” U.S. Provisional Patent Application No. 62/318,318, filed on Apr. 5, 2016, and entitled “Intelligent Lighting and Building Automation,” and U.S. Provisional Patent Application No. 62/350,948, filed on Jun. 16, 2016, and entitled “Systems, Apparatus, and Methods for Automation and Machine Learning,” each of which is incorporated herein by reference in its entirety.