The present invention relates generally to thermostats and more particularly to the improved control of a building or space's heating, ventilating, and air conditioning (HVAC) system through the use of a multi-function thermostat.
A thermostat is, in general, a component of an HVAC control system. Traditional thermostats sense the temperature of a system and control components of the HVAC system in order to maintain a setpoint. A thermostat may be designed to control a heating system, a cooling system, or an air conditioner. Thermostats are manufactured in many ways, and use a variety of sensors to measure temperature and other desired parameters of a system.
Conventional thermostats are configured for one-way communication to connected components, and to control HVAC systems by turning on or off certain components or by regulating flow. Each thermostat may include a temperature sensor and a user interface. The user interface typically includes a display for presenting information to a user and one or more user interface elements for receiving input from a user. To control the temperature of a building or space, a user adjusts the setpoint via the thermostat's user interface.
One implementation of the present disclosure is a thermostat for a building. The thermostat includes a halo light emitting diode (LED) system including one or more LEDs configured to emit light and a halo diffuser structured around at least a portion of an outer edge of the thermostat. The halo diffuser is configured to diffuse the emitted light of the one or more LEDs around at least the portion of the outer edge of the thermostat. The thermostat includes a processing circuit configured to receive one or more data streams, determine whether the one or more data streams indicate a building emergency condition, and operate the one or more LEDs of the halo LED system to indicate the building emergency condition to a user.
In some embodiments, the processing circuit is configured to determine a thermostat condition that requires user input and operate the one or more LEDs of the halo LED system to indicate the thermostat condition to the user.
In some embodiments, the halo LED system further includes one or more waveguides, wherein each of the one or more waveguides is associated with one of the one or more LEDs of the halo LED system. In some embodiments, each of the one or more waveguides is configured to transmit the light emitted from one of the one or more LEDs to the halo diffuser. In some embodiments, each of the one or more waveguides is coupled to the halo diffuser at a first end of the one or more waveguides and is proximate one of the one or more LEDs at a second end of the one or more waveguides.
In some embodiments, the thermostat includes an enclosure including a front portion and a back portion. In some embodiments, the halo diffuser is coupled to the front portion and the back portion and is located between the front portion and the back portion.
In some embodiments, the processing circuit is configured to operate the one or more LEDs of the halo LED system to indicate the emergency condition to the user by operating the one or more LEDs in a pattern to indicate one or more emergency response directions to the user, prompting the user to perform a user response to the emergency condition.
In some embodiments, operating the one or more LEDs in the pattern to indicate the one or more emergency response directions comprises activating the one or more LEDs sequentially to indicate an emergency navigation direction.
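The sequential activation described above can be illustrated with a short sketch. The `chase_sequence` function below is a hypothetical illustration (not part of the disclosure) that computes the firing order for a ring of halo LEDs so that the lit point appears to travel toward an exit direction; a real thermostat would drive LED hardware rather than return a list of indices.

```python
def chase_sequence(num_leds, direction_index, repeats=2):
    """Return the order in which halo LEDs fire so the lit point
    appears to travel around the ring toward the exit direction.

    direction_index is the LED nearest the emergency navigation
    direction. Illustrative sketch only.
    """
    order = []
    for _ in range(repeats):
        # Sweep from the LED opposite the exit around to the exit LED,
        # so the apparent motion converges on the direction the
        # occupant should move.
        start = (direction_index + num_leds // 2) % num_leds
        for step in range(num_leds // 2 + 1):
            order.append((start + step) % num_leds)
    return order
```

For an eight-LED halo with the exit at LED 0, each sweep lights LEDs 4, 5, 6, 7, and finally 0, ending each cycle on the LED nearest the exit.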
In some embodiments, the thermostat includes a display screen. In some embodiments, the processing circuit is configured to operate the display screen to display one or more emergency response directions in response to a determination that the one or more data streams indicate the emergency condition.
In some embodiments, the one or more data streams include a building data stream generated by a building management system and a weather data stream generated by a weather server. In some embodiments, the thermostat includes a communication interface configured to receive the building data stream from the building management system via a network and the weather data stream from the weather server via the network. In some embodiments, the processing circuit is configured to cause the display screen to display non-emergency information based on the building data stream, determine whether the weather data stream indicates an emergency weather condition, and override the display of the non-emergency information by causing the display screen to indicate the one or more emergency response directions in response to a determination that the weather data stream indicates the emergency weather condition.
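The override behavior described above can be sketched as a simple selection function. The data-stream dictionaries and the severity flag below are illustrative assumptions, not structures defined by the disclosure:

```python
def select_display(building_stream, weather_stream):
    """Decide what the display screen shows.

    building_stream and weather_stream are hypothetical dicts standing
    in for the data streams received over the network; the emergency
    check (a severity flag) is an illustrative assumption.
    """
    if weather_stream.get("severity") in ("warning", "emergency"):
        # An emergency weather condition overrides the display of
        # non-emergency building information.
        return {"mode": "emergency",
                "directions": weather_stream.get("directions", [])}
    return {"mode": "normal", "info": building_stream.get("info", "")}
```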
In some embodiments, the one or more emergency response directions include a building map and one or more evacuation directions, wherein the one or more evacuation directions include at least one of one or more directions to a building exit or one or more directions to an emergency shelter in the building. In some embodiments, causing the display screen to display the one or more emergency response directions includes causing the display screen to display the building map and the one or more evacuation directions.
In some embodiments, the one or more emergency response directions include an arrow indicating a route through the building for the user to follow. In some embodiments, causing the display screen to display the one or more emergency response directions includes causing the display screen to display the arrow.
In some embodiments, the arrow includes a first portion and an arrow border surrounding the first portion. In some embodiments, the first portion is a first color and the arrow border is a second color different than the first color.
Another implementation of the present disclosure is a display device for a building. The display device includes a halo light emitting diode (LED) system including one or more LEDs configured to emit light, a halo diffuser structured around at least a portion of an outer edge of the display device, wherein the halo diffuser is configured to diffuse the emitted light of the one or more LEDs around at least the portion of the outer edge of the display device, and one or more waveguides, wherein each of the one or more waveguides is configured to transmit light from one of the one or more LEDs to the halo diffuser. The display device includes a processing circuit configured to operate the one or more LEDs of the halo LED system to indicate a building emergency condition to a user.
In some embodiments, the processing circuit is configured to receive one or more data streams, determine whether the one or more data streams indicate the building emergency condition, and operate the one or more LEDs of the halo LED system to indicate the building emergency condition to the user.
In some embodiments, the processing circuit is configured to determine a display device condition that requires user input and operate the one or more LEDs of the halo LED system to indicate the display device condition to the user.
In some embodiments, each of the one or more waveguides is coupled to the halo diffuser at a first end of the one or more waveguides and is proximate to one of the one or more LEDs at a second end of the one or more waveguides.
In some embodiments, the display device includes an enclosure including a front portion and a back portion. In some embodiments, the halo diffuser is coupled to the front portion and the back portion and is located between the front portion and the back portion.
In some embodiments, the processing circuit is configured to operate the one or more LEDs of the halo LED system to indicate the emergency condition to the user by operating the one or more LEDs in a pattern to indicate one or more emergency response directions to the user, prompting the user to perform a user response to the emergency condition.
In some embodiments, operating the one or more LEDs in the pattern to indicate the one or more emergency response directions comprises activating the one or more LEDs sequentially to indicate an emergency navigation direction.
Another implementation of the present disclosure is a controller for a building. The controller includes a halo light system including one or more lighting components configured to emit light and a halo diffuser structured around at least a portion of an outer edge of the controller, wherein the halo diffuser is configured to diffuse the emitted light of the one or more lighting components around at least the portion of the outer edge of the controller. The controller includes a display device configured to display information to a user. The controller includes a processing circuit configured to receive one or more data streams, determine whether at least one of the one or more data streams indicates a building emergency condition, operate the one or more lighting components of the halo light system to indicate the building emergency condition to the user, and operate the display device to display the building emergency condition to the user.
In some embodiments, a halo LED system further comprises one or more waveguides, wherein each of the one or more waveguides is associated with one of one or more LEDs of the halo LED system, wherein each of the one or more waveguides is configured to transmit light from one of the one or more LEDs to the halo diffuser, wherein each of the one or more waveguides is coupled to the halo diffuser at a first end of the one or more waveguides and is proximate to one of the one or more LEDs at a second end of the one or more waveguides.
Another implementation of the present disclosure is a thermostat for a building with an area light system and an occupancy sensor. The thermostat includes one or more LEDs configured to emit light in a direction toward a floor area beneath the thermostat. The thermostat includes a processing circuit configured to cause the one or more LEDs to emit the light toward the floor in response to an indication, based on data from the occupancy sensor, that a user has approached the thermostat.
In some embodiments, the processing circuit of the thermostat for a building with an area light system and occupancy sensor comprises one or more of a processor module, a memory module, an LED module, an occupancy sensor module, an occupancy sensor, an input interface, and an output interface.
In some embodiments, the thermostat for a building with an area light system and occupancy sensor further comprises an area light system including a halo light system including one or more lighting components configured to emit light in an area in proximity to the thermostat and in a direction toward a floor area beneath the thermostat and a halo diffuser structured around at least a portion of an outer edge of the thermostat, wherein the halo diffuser is configured to diffuse the emitted light of the one or more lighting components around at least the portion of the outer edge of the thermostat.
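The occupancy-triggered area lighting described above can be sketched as a simple decision function. The normalized occupancy-confidence samples and the threshold below are illustrative assumptions, not a sensor interface from the disclosure:

```python
def area_light_command(occupancy_samples, threshold=0.5):
    """Decide whether to switch on the floor-facing area LEDs.

    occupancy_samples: recent occupancy-sensor readings normalized to
    [0, 1] (a hypothetical confidence scale). The lights turn on when
    the mean confidence that a user has approached the thermostat
    exceeds the threshold.
    """
    if not occupancy_samples:
        # No data: leave the area light off.
        return False
    return sum(occupancy_samples) / len(occupancy_samples) > threshold
```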
Overview
Referring generally to the FIGURES, a user control device is shown, according to various exemplary embodiments. The thermostat described herein may be used in any HVAC system, room, environment, or system within which it is desired to control and/or observe environmental conditions (e.g., temperature, humidity, etc.). In traditional HVAC systems, a thermostat may be adjusted by a user to control the temperature of a system.
The user control device is intended to function as a connected smart hub for the user. The thermostat provides a desirable user interface for other environmental controls because of its known fixed location within a space. The user control device is intended to be more personal, more efficient, and more aware than traditional thermostats.
The user control device collects data about a space and the occupants of the space with various sensors (e.g., temperature sensors, humidity sensors, acoustic sensors, optical sensors, gas and other chemical sensors, biometric sensors, motion sensors, etc.) and user inputs. The user control device may utilize data collected from a single room, multiple rooms, an entire building, or multiple buildings. The data may be analyzed locally by the user control device or may be uploaded to a remote computing system and/or the cloud for further analysis and processing.
Building Management System and HVAC System
Referring now to
The BMS that serves building 10 includes an HVAC system 100. HVAC system 100 may include a plurality of HVAC devices (e.g., heaters, chillers, air handling units, pumps, fans, thermal energy storage, etc.) configured to provide heating, cooling, ventilation, or other services for building 10. For example, HVAC system 100 is shown to include a waterside system 120 and an airside system 130. Waterside system 120 may provide a heated or chilled fluid to an air handling unit of airside system 130. Airside system 130 may use the heated or chilled fluid to heat or cool an airflow provided to building 10. An exemplary waterside system and airside system which may be used in HVAC system 100 are described in greater detail with reference to
HVAC system 100 is shown to include a chiller 102, a boiler 104, and a rooftop air handling unit (AHU) 106. Waterside system 120 may use boiler 104 and chiller 102 to heat or cool a working fluid (e.g., water, glycol, etc.) and may circulate the working fluid to AHU 106. In various embodiments, the HVAC devices of waterside system 120 may be located in or around building 10 (as shown in
AHU 106 may place the working fluid in a heat exchange relationship with an airflow passing through AHU 106 (e.g., via one or more stages of cooling coils and/or heating coils). The airflow may be, for example, outside air, return air from within building 10, or a combination of both. AHU 106 may transfer heat between the airflow and the working fluid to provide heating or cooling for the airflow. For example, AHU 106 may include one or more fans or blowers configured to pass the airflow over or through a heat exchanger containing the working fluid. The working fluid may then return to chiller 102 or boiler 104 via piping 110.
Airside system 130 may deliver the airflow supplied by AHU 106 (i.e., the supply airflow) to building 10 via air supply ducts 112 and may provide return air from building 10 to AHU 106 via air return ducts 114. In some embodiments, airside system 130 includes multiple variable air volume (VAV) units 116. For example, airside system 130 is shown to include a separate VAV unit 116 on each floor or zone of building 10. VAV units 116 may include dampers or other flow control elements that can be operated to control an amount of the supply airflow provided to individual zones of building 10. In other embodiments, airside system 130 delivers the supply airflow into one or more zones of building 10 (e.g., via supply ducts 112) without using intermediate VAV units 116 or other flow control elements. AHU 106 may include various sensors (e.g., temperature sensors, pressure sensors, etc.) configured to measure attributes of the supply airflow. AHU 106 may receive input from sensors located within AHU 106 and/or within the building zone and may adjust the flow rate, temperature, or other attributes of the supply airflow through AHU 106 to achieve setpoint conditions for the building zone.
Referring now to
In some embodiments, control device 214 can monitor the health of an occupant 216 of building 10. In some embodiments, control device 214 monitors heat signatures, heart rates, and any other information that can be collected from cameras, medical devices, and/or any other health-related sensor. In some embodiments, building 10 has wireless transmitters 218 in each or some of zones 202-212. The wireless transmitters 218 may be routers, coordinators, and/or any other device broadcasting radio waves. In some embodiments, wireless transmitters 218 form a Wi-Fi network, a Zigbee network, a Bluetooth network, and/or any other kind of network.
In some embodiments, occupant 216 has a mobile device that can communicate with wireless transmitters 218. Control device 214 may use the signal strengths between the mobile device of occupant 216 and wireless transmitters 218 to determine which zone occupant 216 is in. In some embodiments, control device 214 causes temperature setpoints, music, and/or other control actions to follow occupant 216 as occupant 216 moves from one zone to another (e.g., from one floor to another floor).
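The zone determination from signal strengths can be sketched as follows. Picking the zone whose transmitter reports the strongest signal is a deliberate simplification of the localization the disclosure describes; the zone names and RSSI units below are illustrative assumptions:

```python
def locate_occupant(rssi_by_zone):
    """Estimate which zone an occupant is in from signal strengths.

    rssi_by_zone maps a zone name to the RSSI (in dBm; higher, i.e.
    less negative, is stronger) measured between the occupant's mobile
    device and the wireless transmitter in that zone. The occupant is
    assumed to be in the zone with the strongest signal.
    """
    if not rssi_by_zone:
        return None
    return max(rssi_by_zone, key=rssi_by_zone.get)
```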
In some embodiments, control devices 214 are connected to a building management system, a weather server, and/or building emergency sensor(s). In some embodiments, control devices 214 may receive emergency notifications from the building management system, the weather server, and/or the building emergency sensor(s). Based on the nature of the emergency, control devices 214 may give directions to an occupant of the building. In some embodiments, the directions may be instructions for responding to an emergency (e.g., call the police, hide and turn the lights off, etc.). In various embodiments, the directions given to the occupant (e.g., occupant 216) may be navigation directions. For example, zone 212 may be a safe zone with no windows in which an individual (e.g., occupant 216) can shelter. If control devices 214 determine that there are high winds around building 10, control devices 214 may direct occupants of zones 202-210 to zone 212 because zone 212 has no windows.
Referring now to
In
Hot water loop 314 and cold water loop 316 may deliver the heated and/or chilled water to air handlers located on the rooftop of building 10 (e.g., AHU 106) or to individual floors or zones of building 10 (e.g., VAV units 116). The air handlers push air past heat exchangers (e.g., heating coils or cooling coils) through which the water flows to provide heating or cooling for the air. The heated or cooled air may be delivered to individual zones of building 10 to serve the thermal energy loads of building 10. The water then returns to subplants 302-312 to receive further heating or cooling.
Although subplants 302-312 are shown and described as heating and cooling water for circulation to a building, it is understood that any other type of working fluid (e.g., glycol, CO2, etc.) may be used in place of or in addition to water to serve the thermal energy loads. In other embodiments, subplants 302-312 may provide heating and/or cooling directly to the building or campus without requiring an intermediate heat transfer fluid. These and other variations to waterside system 300 are within the teachings of the present disclosure.
Each of subplants 302-312 may include a variety of equipment configured to facilitate the functions of the subplant. For example, heater subplant 302 is shown to include a plurality of heating elements 320 (e.g., boilers, electric heaters, etc.) configured to add heat to the hot water in hot water loop 314. Heater subplant 302 is also shown to include several pumps 322 and 324 configured to circulate the hot water in hot water loop 314 and to control the flow rate of the hot water through individual heating elements 320. Chiller subplant 306 is shown to include a plurality of chillers 332 configured to remove heat from the cold water in cold water loop 316. Chiller subplant 306 is also shown to include several pumps 334 and 336 configured to circulate the cold water in cold water loop 316 and to control the flow rate of the cold water through individual chillers 332.
Heat recovery chiller subplant 304 is shown to include a plurality of heat recovery heat exchangers 326 (e.g., refrigeration circuits) configured to transfer heat from cold water loop 316 to hot water loop 314. Heat recovery chiller subplant 304 is also shown to include several pumps 328 and 330 configured to circulate the hot water and/or cold water through heat recovery heat exchangers 326 and to control the flow rate of the water through individual heat recovery heat exchangers 326. Cooling tower subplant 308 is shown to include a plurality of cooling towers 338 configured to remove heat from the condenser water in condenser water loop 318. Cooling tower subplant 308 is also shown to include several pumps 340 configured to circulate the condenser water in condenser water loop 318 and to control the flow rate of the condenser water through individual cooling towers 338.
Hot TES subplant 310 is shown to include a hot TES tank 342 configured to store the hot water for later use. Hot TES subplant 310 may also include one or more pumps or valves configured to control the flow rate of the hot water into or out of hot TES tank 342. Cold TES subplant 312 is shown to include cold TES tanks 344 configured to store the cold water for later use. Cold TES subplant 312 may also include one or more pumps or valves configured to control the flow rate of the cold water into or out of cold TES tanks 344.
In some embodiments, one or more of the pumps in waterside system 300 (e.g., pumps 322, 324, 328, 330, 334, 336, and/or 340) or pipelines in waterside system 300 include an isolation valve associated therewith. Isolation valves may be integrated with the pumps or positioned upstream or downstream of the pumps to control the fluid flows in waterside system 300. In various embodiments, waterside system 300 may include more, fewer, or different types of devices and/or subplants based on the particular configuration of waterside system 300 and the types of loads served by waterside system 300.
Referring now to
Each of dampers 416-420 may be operated by an actuator. For example, exhaust air damper 416 may be operated by actuator 424, mixing damper 418 may be operated by actuator 426, and outside air damper 420 may be operated by actuator 428. Actuators 424-428 may communicate with an AHU controller 430 via a communications link 432. Actuators 424-428 may receive control signals from AHU controller 430 and may provide feedback signals to AHU controller 430. Feedback signals may include, for example, an indication of a current actuator or damper position, an amount of torque or force exerted by the actuator, diagnostic information (e.g., results of diagnostic tests performed by actuators 424-428), status information, commissioning information, configuration settings, calibration data, and/or other types of information or data that may be collected, stored, or used by actuators 424-428. AHU controller 430 may be an economizer controller configured to use one or more control algorithms (e.g., state-based algorithms, extremum seeking control (ESC) algorithms, proportional-integral (PI) control algorithms, proportional-integral-derivative (PID) control algorithms, model predictive control (MPC) algorithms, feedback control algorithms, etc.) to control actuators 424-428.
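Among the control algorithms listed above, a proportional-integral (PI) controller is one of the simplest. The sketch below shows one discrete PI step of the kind AHU controller 430 might run; the gains, time step, and 0-100 % damper position limits are illustrative assumptions, not values from the disclosure:

```python
def pi_step(setpoint, measurement, integral, kp=0.5, ki=0.1, dt=1.0):
    """One discrete step of a proportional-integral (PI) controller.

    Returns (command, new_integral); the command is clamped to a
    0-100 % actuator position. Gains kp/ki and time step dt are
    illustrative assumptions.
    """
    error = setpoint - measurement
    # Accumulate the integral of the error over time.
    integral = integral + error * dt
    command = kp * error + ki * integral
    # Respect the physical travel limits of the damper actuator.
    command = max(0.0, min(100.0, command))
    return command, integral
```

In practice the controller would be called once per sampling interval, feeding each returned integral back into the next step.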
Still referring to
Cooling coil 434 may receive a chilled fluid from waterside system 300 (e.g., from cold water loop 316) via piping 442 and may return the chilled fluid to waterside system 300 via piping 444. Valve 446 may be positioned along piping 442 or piping 444 to control a flow rate of the chilled fluid through cooling coil 434. In some embodiments, cooling coil 434 includes multiple stages of cooling coils that can be independently activated and deactivated (e.g., by AHU controller 430, by BMS controller 466, etc.) to modulate an amount of cooling applied to supply air 410.
Heating coil 436 may receive a heated fluid from waterside system 300 (e.g., from hot water loop 314) via piping 448 and may return the heated fluid to waterside system 300 via piping 450. Valve 452 may be positioned along piping 448 or piping 450 to control a flow rate of the heated fluid through heating coil 436. In some embodiments, heating coil 436 includes multiple stages of heating coils that can be independently activated and deactivated (e.g., by AHU controller 430, by BMS controller 466, etc.) to modulate an amount of heating applied to supply air 410.
Each of valves 446 and 452 may be controlled by an actuator. For example, valve 446 may be controlled by actuator 454 and valve 452 may be controlled by actuator 456. Actuators 454-456 may communicate with AHU controller 430 via communications links 458-460. Actuators 454-456 may receive control signals from AHU controller 430 and may provide feedback signals to AHU controller 430. In some embodiments, AHU controller 430 receives a measurement of the supply air temperature from a temperature sensor 462 positioned in the supply air duct (e.g., downstream of cooling coil 434 and/or heating coil 436). AHU controller 430 may also receive a measurement of the temperature of building zone 406 from a temperature sensor 464 located in building zone 406.
In some embodiments, AHU controller 430 operates valves 446 and 452 via actuators 454-456 to modulate an amount of heating or cooling provided to supply air 410 (e.g., to achieve a setpoint temperature for supply air 410 or to maintain the temperature of supply air 410 within a setpoint temperature range). The positions of valves 446 and 452 affect the amount of heating or cooling provided to supply air 410 by cooling coil 434 or heating coil 436 and may correlate with the amount of energy consumed to achieve a desired supply air temperature. AHU controller 430 may control the temperature of supply air 410 and/or building zone 406 by activating or deactivating coils 434-436, adjusting a speed of fan 438, or a combination of both.
Still referring to
In some embodiments, AHU controller 430 receives information from BMS controller 466 (e.g., commands, set points, operating boundaries, etc.) and provides information to BMS controller 466 (e.g., temperature measurements, valve or actuator positions, operating statuses, diagnostics, etc.). For example, AHU controller 430 may provide BMS controller 466 with temperature measurements from temperature sensors 462-464, equipment on/off states, equipment operating capacities, and/or any other information that can be used by BMS controller 466 to monitor or control a variable state or condition within building zone 406.
Control device 214 may include one or more of the user control devices described herein. Control device 214 may include one or more human-machine interfaces or client interfaces (e.g., graphical user interfaces, reporting interfaces, text-based computer interfaces, client-facing web services, web servers that provide pages to web clients, etc.) for controlling, viewing, or otherwise interacting with HVAC system 100, its subsystems, and/or devices. Control device 214 may be a computer workstation, a client terminal, a remote or local interface, or any other type of user interface device. Control device 214 may be a stationary terminal or a mobile device. For example, control device 214 may be a desktop computer, a computer server with a user interface, a laptop computer, a tablet, a smartphone, a PDA, or any other type of mobile or non-mobile device. Control device 214 may communicate with BMS controller 466 and/or AHU controller 430 via communications link 472.
Referring now to
In some embodiments, speakers 504 are located locally as a component of control device 214. Speakers 504 may be low power speakers used for playing audio to an occupant in the immediate vicinity of control device 214 and/or occupants of the zone in which control device 214 is located. In some embodiments, speakers 504 may be remote speakers connected to control device 214 via a network. In some embodiments, speakers 504 are part of a building audio system, an emergency alert system, and/or an alarm system configured to broadcast building-wide and/or zone messages or alarms.
Control device 214 may communicate with a remote camera 506, a shade control system 512, a leak detection system 508, an HVAC system, or any of a variety of other external systems or devices which may be used in a home automation system or a building automation system. Control device 214 may provide a variety of monitoring and control interfaces to allow a user to control all of the systems and devices connected to control device 214. Exemplary user interfaces and features of control device 214 are described in greater detail below.
Referring now to
In some embodiments, network 602 communicatively couples the devices, systems, and servers of system 600. In some embodiments, network 602 is at least one of and/or a combination of a Wi-Fi network, a wired Ethernet network, a Zigbee network, and a Bluetooth network. Network 602 may be a local area network or a wide area network (e.g., the Internet, a building WAN, etc.) and may use a variety of communications protocols (e.g., BACnet, IP, LON, etc.). Network 602 may include routers, modems, and/or network switches.
In some embodiments, control device 214 is configured to receive emergency information, navigation directions, occupant information, concierge information, and any other information via network 602. In some embodiments, the information is received from building management system 610 via network 602. In various embodiments, the information is received from the Internet via network 602. In some embodiments, control device 214 is at least one of or a combination of a thermostat, a humidistat, a light controller, and any other wall mounted and/or hand held device. In some embodiments, control device 214 is connected to building emergency sensor(s) 606. In some embodiments, building emergency sensor(s) 606 are sensors which detect building emergencies. Building emergency sensor(s) 606 may be smoke detectors, carbon monoxide detectors, carbon dioxide detectors (e.g., carbon dioxide sensors 522), an emergency button (e.g., emergency pull handles, panic buttons, a manual fire alarm button and/or handle, etc.) and/or any other emergency sensor. In some embodiments, the emergency sensor(s) include actuators. The actuators may be building emergency sirens and/or building audio speaker systems (e.g., speakers 504), automatic door and/or window control (e.g., shade control system 512), and any other actuator used in a building.
In some embodiments, control device 214 may be communicatively coupled to weather server(s) 608 via network 602. In some embodiments, control device 214 may be configured to receive weather information (e.g., high and low daily temperatures, a five-day forecast, a thirty-day forecast, etc.) from weather server(s) 608. Control device 214 may be configured to receive emergency weather alerts (e.g., flood warnings, fire warnings, thunderstorm warnings, winter storm warnings, etc.). In some embodiments, control device 214 may be configured to display emergency warnings via a user interface of control device 214 when control device 214 receives an emergency weather alert from weather server(s) 608. Control device 214 may be configured to display emergency warnings based on the data received from building emergency sensor(s) 606. In some embodiments, control device 214 may cause a siren (e.g., speakers 504 and/or building emergency sensor(s) 606) to alert occupants of the building of an emergency, cause all doors to become locked and/or unlocked, cause an advisory message to be broadcast through the building, and control any other actuator or system necessary for responding to a building emergency.
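The mapping from an emergency alert to actuator commands described above can be sketched as a lookup. The alert names and the particular action set below are illustrative assumptions; a real control device 214 would issue these commands over the building network rather than return a dictionary:

```python
def emergency_actions(alert_type):
    """Map an emergency alert to a set of building actuator commands.

    alert_type and the resulting actions are illustrative assumptions,
    not definitions from the disclosure.
    """
    actions = {"sound_siren": False, "lock_doors": False,
               "unlock_doors": False, "broadcast": None}
    if alert_type == "fire":
        actions.update(sound_siren=True, unlock_doors=True,
                       broadcast="Evacuate via the nearest exit.")
    elif alert_type == "tornado":
        actions.update(sound_siren=True,
                       broadcast="Shelter in an interior room.")
    elif alert_type == "lockdown":
        actions.update(lock_doors=True,
                       broadcast="Remain in place; doors are locked.")
    return actions
```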
In some embodiments, control device 214 is configured to communicate with building management system 610 via network 602. Control device 214 may be configured to transmit environmental setpoints (e.g., temperature setpoint, humidity setpoint, etc.) to building management system 610. In some embodiments, building management system 610 may be configured to cause zones of a building (e.g., building 10) to be controlled to the setpoint received from control device 214. In some embodiments, building management system 610 may be configured to control the lighting of a building. In some embodiments, building management system 610 may be configured to transmit emergency information to control device 214. In some embodiments, the emergency information is a notification of a shooter lockdown, a tornado warning, a flood warning, a thunderstorm warning, and/or any other warning. In some embodiments, building management system 610 is connected to various weather servers or other web servers from which building management system 610 receives emergency warning information. In various embodiments, building management system 610 is a computing system of a hotel. Building management system 610 may keep track of hotel occupancy, may relay requests to hotel staff, and/or perform any other functions of a hotel computing system.
Control device 214 is configured to communicate with user device 612 via network 602. In some embodiments, user device 612 is a smartphone, a tablet, a laptop computer, and/or any other mobile and/or stationary computing device. In some embodiments, user device 612 communicates calendar information to control device 214. In some embodiments, the calendar information is stored and/or entered by a user into a calendar application. In some embodiments, the calendar application is at least one of Outlook, Google Calendar, Fantastical, Shifts, CloudCal, DigiCal, and/or any other calendar application. In some embodiments, control device 214 receives calendar information from the calendar application such as times and locations of appointments, times and locations of meetings, and/or any other information. Control device 214 may be configured to display building map directions to a user associated with user device 612 and/or any other information.
In some embodiments, a user may press a button on a user interface of control device 214 indicating a building emergency. The user may be able to indicate the type of emergency (e.g., fire, flood, active shooter, etc.). Control device 214 may communicate an alert to building management system 610, user device 612, and any other device, system, and/or server.
In some embodiments, control device 214 is communicably coupled to healthcare sensor(s) 604 via network 602. In some embodiments, control device 214 is configured to monitor healthcare sensor(s) 604 collecting data for occupants of a building (e.g., building 10) and determine health metrics for the occupants based on the data received from the healthcare sensor(s) 604. In some embodiments, healthcare sensor(s) 604 are one or more smart wrist bands, pacemakers, insulin pumps, and/or any other medical devices. The health metrics may be determined based on heart rates, insulin levels, and/or any other biological and/or medical data.
Referring now to
Sensors 714 may be configured to measure a variable state or condition of the environment in which control device 214 is installed. For example, sensors 714 are shown to include a temperature sensor 716, a humidity sensor 718, an air quality sensor 720, a proximity sensor 722, a camera 724, a microphone 726, a light sensor 728, and a vibration sensor 730. Air quality sensor 720 may be configured to measure any of a variety of air quality variables such as oxygen level, carbon dioxide level, carbon monoxide level, allergens, pollutants, smoke, etc. Proximity sensor 722 may include one or more sensors configured to detect the presence of people or devices proximate to control device 214. For example, proximity sensor 722 may include a near-field communications (NFC) sensor, a radio frequency identification (RFID) sensor, a Bluetooth sensor, a capacitive proximity sensor, a biometric sensor, or any other sensor configured to detect the presence of a person or device. Camera 724 may include a visible light camera, a motion detector camera, an infrared camera, an ultraviolet camera, an optical sensor, or any other type of camera. Light sensor 728 may be configured to measure ambient light levels. Vibration sensor 730 may be configured to measure vibrations from earthquakes or other seismic activity at the location of control device 214.
Still referring to
Communications interface 732 may include a network interface configured to facilitate electronic data communications between control device 214 and various external systems or devices (e.g., network 602, building management system 610, HVAC equipment 738, user device 612, etc.). For example, control device 214 may receive information from building management system 610 or HVAC equipment 738 indicating one or more measured states of the controlled building (e.g., temperature, humidity, electric loads, etc.) and one or more states of the HVAC equipment 738 (e.g., equipment status, power consumption, equipment availability, etc.). In some embodiments, HVAC equipment 738 may be lighting systems, building systems, actuators, chillers, heaters, and/or any other building equipment and/or system. Communications interface 732 may receive inputs from building management system 610 or HVAC equipment 738 and may provide operating parameters (e.g., on/off decisions, set points, etc.) to building management system 610 or HVAC equipment 738. The operating parameters may cause building management system 610 to activate, deactivate, or adjust a set point for various types of home equipment or building equipment in communication with control device 214.
Processing circuit 734 is shown to include a processor 740 and memory 742. Processor 740 may be a general purpose or specific purpose processor, an application specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), a group of processing components, or other suitable processing components. Processor 740 may be configured to execute computer code or instructions stored in memory 742 or received from other computer readable media (e.g., CDROM, network storage, a remote server, etc.).
Memory 742 may include one or more devices (e.g., memory units, memory devices, storage devices, etc.) for storing data and/or computer code for completing and/or facilitating the various processes described in the present disclosure. Memory 742 may include random access memory (RAM), read-only memory (ROM), hard drive storage, temporary storage, non-volatile memory, flash memory, optical memory, or any other suitable memory for storing software objects and/or computer instructions. Memory 742 may include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. Memory 742 may be communicably connected to processor 740 via processing circuit 734 and may include computer code for executing (e.g., by processor 740) one or more processes described herein. For example, memory 742 is shown to include a voice command module 744, a building module 746, a voice control module 748, a payment module 758, a hotel module 750, a healthcare module 752, an occupancy module 754, and an emergency module 756. The functions of some of these modules are described in greater detail below.
In some embodiments, voice command module 744 is configured to receive audio data from microphone 726. Voice command module 744 may be configured to translate audio data into spoken words. In some embodiments, voice command module 744 may be configured to perform Internet searches based on the spoken words via network 602. In various embodiments, voice command module 744 may send requests to building management system 610 based on the spoken words.
Occupancy Tracking Features
Referring now to
Building management system 610 may include an application server. The application server may be a remote server and may be hosted at a remote location. The application server may be configured to provide a web-based presence for users and/or building administrators to access information regarding occupancy of the building. In some embodiments, the application server allows users and/or building administrators to view data pertaining to the number of users in the building space and their respective locations. The application server may communicate with user device 612 through routers 804-808 or may communicate with user device 612 via mobile data (e.g., 1G, 2G, 3G, LTE, etc.).
In some embodiments, the application server integrates a building facility web application with the determined number and location of occupants. In some embodiments, the building facility web application may control room, zone, building, and campus lighting, booking, public service announcements, and other features of a building facility. In some embodiments, the building facility web application may identify a user when a device associated with the user (e.g., user device 612) is detected in a room, zone, building, and/or campus based on wireless signal strengths. The building facility web application may automatically log in the identified user. A user that has been logged in may be able to change lighting, environmental setpoints, and any other adjustable building facility web application feature via user device 612. In some embodiments, the building facility web application may automatically adjust lighting and environmental setpoints to the preferred settings of the identified and logged-in user.
Routers 804-808 may be installed for the specific purpose of determining user occupancy or may be existing routers in a wireless building network. In some embodiments, each router may have a unique ID. In
Routers 804-808 can be configured to emit, receive, sense, relay, or otherwise engage in unidirectional or bidirectional wireless communications. Routers 804-808 can use any type of wireless technology or communications protocol. For example, in various embodiments, the wireless emitters/receivers can be Bluetooth low energy (BLE) emitters, near field communications (NFC) devices, Wi-Fi transceivers, RFID devices, ultrawide band (UWB) devices, infrared emitters/sensors, visible light communications (VLC) devices, ultrasound devices, cellular transceivers, iBeacons, or any other type of hardware configured to facilitate wireless data communications. In some embodiments, routers 804-808 are integrated with various devices within the building (e.g., thermostats, lighting sensors, zone controllers).
Routers 804-808 can broadcast a wireless signal. The wireless signal broadcast by routers 804-808 can include the identifier associated with routers 804-808. For example, routers 804-808 can broadcast a SSID, MAC address, or other identifier which can be used to identify a particular router. In some embodiments, the wireless signal broadcast by routers 804-808 includes multiple emitter identifiers (e.g., a UUID value, a major value, a minor value, etc.). User device 612 can detect the wireless signals emitted by the routers 804-808. User device 612 can be configured to identify the router associated with the wireless signal. In some embodiments, user device 612 detects the signal strength of the wireless signals for each of routers 804-808.
In
User device 612 may store the location of each router 804-808 in a memory device and may determine (e.g., triangulate, estimate, etc.) the location of user device 612 based on the stored locations of routers 804-808 and the determined RSSI value for each router. In some embodiments, user device 612 is only connected to a single router or only receives a wireless signal from a single router. User device 612 may determine an approximate circular field around the single router in which user device 612 may be located based on the determined RSSI. In some embodiments, the circular field is defined by an approximate radius, i.e., a distance that user device 612 may be located away from the router. For example, a strong RSSI may indicate that user device 612 is close to a particular router, whereas a weaker RSSI may indicate that user device 612 is further from the router. User device 612 can use a mapping table or function to translate RSSI into distance. In some embodiments, the translation between RSSI and distance is a function of the router's broadcast power or other router settings, which user device 612 can receive from each router within broadcast range. In some embodiments, the field is bounded by a range of radii. Each radius may be different, and user device 612 may be located between the two radii in a disc-shaped field. In various embodiments, user device 612 triangulates the location of user device 612 based on one or more signal strengths between known locations of routers.
In various embodiments, routers 804-808 send signal strengths between routers 804-808 and user device 612 to control device 214. Control device 214 may store the location of each router 804-808 in a memory device and may determine (e.g., triangulate, estimate, etc.) the location of user device 612 based on the stored locations of routers 804-808 and the determined RSSI value for each router. In some embodiments, user device 612 is only connected to a single router or only receives a wireless signal from a single router. Control device 214 may determine an approximate circular field around the single router in which user device 612 may be located based on the determined RSSI. In some embodiments, the circular field is defined by an approximate radius, i.e., a distance that user device 612 may be located away from the router. For example, a strong RSSI may indicate that user device 612 is close to a particular router, whereas a weaker RSSI may indicate that user device 612 is further from the router. Control device 214 can use a mapping table or function to translate RSSI into distance. In some embodiments, the translation between RSSI and distance is a function of the router's broadcast power or other router settings, which control device 214 can receive from each router within broadcast range. In some embodiments, the field is bounded by a range of radii. Each radius may be different, and user device 612 may be located between the two radii in a disc-shaped field. In various embodiments, control device 214 triangulates the location of user device 612 based on one or more signal strengths between known locations of routers.
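The RSSI-to-distance translation and triangulation described in the paragraphs above can be illustrated with a short sketch. This is a minimal example under assumed conditions: the log-distance path-loss model, the reference transmit power of -40 dBm at 1 m, and the path-loss exponent of 2.0 are illustrative assumptions (not from the specification), and trilaterate solves the idealized intersection of three range circles in a plane.

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Estimate distance (meters) from RSSI using a log-distance
    path-loss model; parameter defaults are illustrative assumptions."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Solve for (x, y) from three known router positions and ranges
    by linearizing the three circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a, b = 2 * (x2 - x1), 2 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d, e = 2 * (x3 - x2), 2 * (y3 - y2)
    f = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a * e - b * d           # zero when the routers are collinear
    x = (c * e - b * f) / det
    y = (a * f - c * d) / det
    return x, y
```

In practice, noisy RSSI readings make the three circles inconsistent, so a deployed system would use a least-squares fit over many readings rather than this exact solution.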
Still referring to
Referring now to
Still referring to
In
Building management system 610 uses the emitter identifier and/or the device identifier to select a user interface for presentation on user device 612. Building management system 610 may select the user interface for a building zone associated with the emitter identifier reported by user device 612. For example, building management system 610 may select a user interface which includes information and/or control options relating to the building zone associated with the reported emitter identifier. In some embodiments, building management system 610 selects a user interface based on the identity of a user associated with user device 612 (e.g., based on a user identifier or device identifier reported by user device 612). In some embodiments, building management system 610 uses emitter identifier reported by user device 612 to determine the position of user device 612 within the building. Building management system 610 may send the position of user device 612 to control device 214. Building management system 610 may select a user interface for monitoring and/or controlling the building zone in which user device 612 is currently located or a building zone in which user device 612 has been located previously.
Still referring to
Referring now to
Sensor units 1002 (e.g., proximity sensor 520, remote camera 506, occupancy sensor 516, routers 804-808, emitter 902, etc.) may be installed in various rooms or zones in the home. For example,
In some embodiments, a building management system and/or control device 214 determines the location of the user device. The sensor units 1002 may be configured to measure environmental conditions within each room or zone and to receive user input (e.g., voice commands via a microphone). For example, each sensor unit 1002 may include a plurality of sensors (e.g., a temperature sensor, a humidity sensor, a smoke detector, a light sensor, a camera, a motion sensor, etc.) configured to measure variables such as temperature, humidity, light, etc. in the room or zone in which the sensor unit is installed. The sensor units 1002 may communicate (e.g., wirelessly or via a wired communications link) with the control device 214 and/or with each other. In some embodiments, sensors, such as low power door sensors, can communicate with repeaters disposed in the gang boxes or other locations using a low power overhead protocol. The repeaters can provide wired or wireless communication to the main control unit.
Referring now to
In some embodiments, occupancy module 754 may be configured to determine the identity of an occupant based on occupancy data 1102 received from sensors 714. In some embodiments, the occupancy module 754 receives sensor input from sensors 714 where the sensors may include camera 724. Occupancy module 754 can perform digital image processing to identify the one or more users based on the digital images received from camera 724. In some embodiments, digital image processing is used to identify the faces of the one or more users, the height of the one or more users, or any other physical characteristic of the one or more users. In some embodiments, the digital image processing is performed by image analysis tools such as edge detectors and neural networks. In some embodiments, the digital image processing compares the physical characteristics of the one or more users with physical characteristics of previously identified users.
In some embodiments, the occupancy module 754 receives sensor input from microphone 726. Microphone 726 can be any of a plurality of microphone types. The microphone types include, for example, a dynamic microphone, a ribbon microphone, a carbon microphone, a piezoelectric microphone, a fiber optic microphone, a laser microphone, a liquid microphone, and an audio speaker used as a microphone. In some embodiments, occupancy module 754 analyzes the audio data received from the microphone. In some embodiments, occupancy module 754 identifies one or more users based on voice biometrics of the audio received from microphone 726. Voice biometrics are the unique characteristics of a speaker's voice. Voice biometrics include voice pitch or speaking style that result from the anatomy of the speaker's throat and/or mouth. In some embodiments, the occupancy module 754 uses a text dependent voice recognition technique. In some embodiments, the occupancy module 754 uses a text independent voice recognition technique to identify the one or more users. Occupancy module 754 may be configured to store voice biometrics linked to individuals. Occupancy module 754 may be configured to match the stored voice biometrics to voice biometrics determined for occupants.
In some embodiments, the occupancy module 754 uses the text dependent voice recognition technique to identify the one or more users based on a password or particular phrase spoken by one of the users. For example, the user may speak a phrase such as “This is Felix, I am home.” The occupancy module 754 can perform speech recognition to determine the spoken phrase “This is Felix, I am home” from the audio data received from the microphone. In some embodiments, occupancy module 754 uses one or a combination of hidden Markov models, dynamic time warping, and neural networks to determine the spoken phrase. Occupancy module 754 compares the determined spoken phrase to phrases linked to users. If the phrase “This is Felix, I am home” matches a phrase linked to a user Felix, occupancy module 754 identifies the user as Felix.
In some embodiments, occupancy module 754 uses the text independent voice recognition technique to identify one or more users based on particular voice biometrics of the user. The text independent voice recognition technique performs a pattern recognition technique to identify the particular voice biometrics of the speaker from the audio data received from the microphone. The voice biometrics include voice pitch and speaking style. In some embodiments, a plurality of techniques are used to identify the voice biometrics of the user. The techniques include frequency estimation, hidden Markov models, Gaussian mixture models, pattern matching algorithms, neural networks, matrix representation, vector quantization, and decision trees.
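The biometric matching step above can be sketched in simplified form: compare a feature vector extracted from new audio against stored templates for enrolled users. Real systems would use Gaussian mixture models or neural embeddings as listed above; the cosine-similarity comparison, the 0.9 threshold, and all names here are illustrative assumptions.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def identify_speaker(sample, templates, threshold=0.9):
    """Return the enrolled user whose stored voice-biometric template
    best matches the sample, or None if no match clears the threshold."""
    best_user, best_score = None, threshold
    for user, template in templates.items():
        score = cosine_similarity(sample, template)
        if score > best_score:
            best_user, best_score = user, score
    return best_user
```

Returning None below the threshold models the "unknown occupant" case, which a deployed system would need to handle explicitly rather than guessing the nearest enrolled user.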
In some embodiments, the occupancy module 754 is configured to capture audio data from one or more users and perform pre-processing. In some embodiments, pre-processing may include compressing the audio data, converting the audio data into an appropriate format, and/or any other necessary pre-processing action. The occupancy module 754 may be configured to transmit the captured spoken audio data to a voice recognition server via communications interface 732 and network 602 as described with reference to
Still referring to
Building management system 610 may send the identity of the occupant and the location of the occupant in a building (e.g., building 10) to control device 214. In some embodiments, control device 214 is configured to cause zones and/or buildings to be controlled to environmental conditions (e.g., temperature setpoint, humidity setpoint, etc.) based on the environmental condition preferences and location of the occupant. The control device 214 may be configured to generate control signals for HVAC equipment 738 to achieve the preferred environmental conditions. In various embodiments, the control device 214 may be configured to play music in different zones and/or cause a music platform (e.g., Pandora, Spotify, etc.) to play the music preferences of the identified user in the zone and/or building in which the user is located.
Referring now to
Control device 214 may determine that the user has moved to a second zone 1204 of the home/building (step 1308) and may operate the home/building equipment to achieve the user-specific climate control settings in the second zone 1204 (step 1310). In some embodiments, control device 214 is configured to operate the lighting of zones 1202 and 1204 based upon the location of the user (step 1312). For example, control device 214 may turn lights off in zone 1202 and on in zone 1204 when the user moves from zone 1202 to zone 1204 (step 1316). Control device 214 may be configured to operate music played in zones 1202 and 1204 when the user moves from zone 1202 to zone 1204 (step 1316). For example, when the user moves to zone 1204, the music may stop playing in zone 1202 and begin playing in zone 1204 (step 1318).
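The "settings follow the user" behavior above can be sketched as a small state update. The function and field names are hypothetical, and for simplicity this sketch resets the vacated zone to a default state; a real control device 214 would instead restore that zone's schedule or the preferences of occupants remaining in it.

```python
# Illustrative follow-me sketch: when a tracked user changes zones, their
# lighting, music, and climate preferences are applied in the new zone and
# the old zone is returned to a default state. Names are assumptions.

def follow_user(zones, user_prefs, old_zone, new_zone):
    """Mutate per-zone state so user preferences follow the user."""
    zones[old_zone] = {"lights": "off", "music": None}
    zones[new_zone] = {
        "lights": "on",
        "music": user_prefs.get("music"),
        "setpoint": user_prefs.get("setpoint"),
    }
    return zones
```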
Referring now to
In some embodiments, a unique device identifier (e.g., a serial number, a hardware ID, a MAC address, etc.) may link user device 612 to a particular user profile. When user device 612 is determined to be in the building (e.g., building 10), the user may receive a command to authenticate (i.e., log in) with building management system 610 via user device 612 (step 1404). In some embodiments, user device 612 automatically authenticates with building management system 610 based on a unique device identifier. In some embodiments, the authentication is performed directly between the user device and building management system 610. In various embodiments, control device 214 receives the unique device identifier from the user device and facilitates the authentication with building management system 610. In various embodiments, the user may be prompted to enter a user name and password via user device 612 and/or user interface 702 of control device 214 to authenticate with building management system 610.
In some embodiments, the building management system 610 may be configured to generate a three-dimensional building map with the location and identity of multiple building occupants located on the map (step 1406). The building map may contain multiple floors, zones, buildings, and/or campuses. In some embodiments, the three-dimensional building map may be accessible via a user device (e.g., user device 612) if the user device has the proper permissions to view the building map. In some embodiments, the user device must be associated with a technician and/or other building employee for the user to have access to the three-dimensional building map.
In some embodiments, building management system 610 keeps a record of various occupants of the building and the permissions associated with each occupant. In some embodiments, the permissions include a music permission (i.e., whether the user can change music, radio stations, volume, etc. of the music played in various zones of the building). In some embodiments, the permissions allow a user to change music, radio stations, music volume, environmental setpoints, lighting, and/or any other adjustable setting of control device 214 via user interface 702, microphone 726, and/or user device 612 associated with the user. In some embodiments, the permissions allow a user to change and/or adjust environmental conditions (e.g., temperature setpoint, humidity setpoint, etc.) (step 1408). Based on the permissions and user preferences, the building management system 610 may be configured to send commands to the devices (e.g., control device 214) to adjust environmental zone conditions, lighting, and music of zones (step 1410).
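The permission check that gates steps 1408 and 1410 can be sketched as follows. The data structures and names are illustrative assumptions; a real building management system 610 would keep this record in a database and notify the user on denial rather than returning a boolean.

```python
# Hypothetical permission record: each occupant maps to the set of
# settings that occupant is allowed to change. Names are assumptions.
PERMISSIONS = {
    "occupant_a": {"setpoint"},
    "occupant_b": {"setpoint", "music"},
    "occupant_c": set(),   # no adjustable permissions
}

def apply_request(user, setting, value, zone_state):
    """Apply a requested change only if the user holds the matching
    permission; return whether the change was applied."""
    if setting not in PERMISSIONS.get(user, set()):
        return False       # e.g., display a permission-denied notification
    zone_state[setting] = value
    return True
```

Defaulting an unknown user to the empty permission set makes the check fail closed, which matches the deny-by-default behavior described for occupants without permissions.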
Referring now to
Occupant A 1414 has a preferred setpoint of 78 degrees F., occupant B 1416 has a preferred setpoint of 75 degrees F., and occupant C 1418 has no permission to change the setpoint. In some embodiments, when an occupant with a preferred setpoint moves from a first zone to a second zone, the preferred setpoint may follow the occupant and the second zone may be heated and/or cooled to the preferred setpoint. An occupant with no permission to change a setpoint (e.g., occupant C 1418) may not be able to make any changes to the setpoint.
In some embodiments, control device 214 may disable changes to the setpoint whenever occupant C 1418 is determined to be a set distance from control device 214. In some embodiments, control device 214 may disable changes to the setpoint whenever occupant C 1418 is identified in the zone in which control device 214 is located. In some embodiments, when occupant C 1418 is authenticated and/or logged in with building management system 610 and/or control device 214 as described with reference to
Occupant A 1414, occupant B 1416, and occupant C 1418 may have permissions and preferences for music 1422 such as the music played in zones of a building (e.g., building 10). In table 1412, occupant A 1414 has a preference for no music, occupant B 1416 has a preferred radio station, and occupant C 1418 does not have permission to play music. In some embodiments, whenever occupant B 1416 is in a zone, the building equipment in that zone may automatically play radio station AM 1130. In some embodiments, when occupant A 1414 enters a zone, the building equipment in that zone will automatically turn off any music that is playing. In some embodiments, any attempt by occupant C 1418 to play music and/or audio will be met with a notification that occupant C 1418 does not have the appropriate permissions to change the music and/or audio.
In some embodiments, control device 214 may disable changes to music preferences whenever occupant C 1418 is determined to be a set distance from control device 214. In some embodiments, control device 214 may disable changes to the music whenever occupant C 1418 is identified in the zone in which control device 214 is located. In some embodiments, when occupant C 1418 is authenticated and/or logged in with building management system 610 and/or control device 214 via a user device (e.g., user device 612) as described with reference to
Occupant A 1414, occupant B 1416, and occupant C 1418 may have permissions and preferences for lighting 1424. In some embodiments, the lighting in zones and/or a building (e.g., building 10) may be adjusted based on permissions and preferences of occupant A 1414, occupant B 1416, and occupant C 1418. Occupant A 1414 may have no permission to change lighting. Occupant B 1416 may have a preference for lighting in the zone which occupant B occupies to be dim. Occupant C 1418 may have the preference that the lighting associated with the zone which occupant C 1418 occupies be at full brightness.
In some embodiments, control device 214 may disable changes to the lighting whenever occupant A 1414 is determined to be a set distance from control device 214. In some embodiments, control device 214 may disable changes to the lighting whenever occupant A 1414 is identified in the zone in which control device 214 is located. In some embodiments, when occupant A 1414 is authenticated and/or logged in with building management system 610 and/or control device 214 via a user device (e.g., user device 612) as described with reference to
Occupant A 1414, occupant B 1416, and occupant C 1418 may have permissions and preferences for shades/blinds 1426. In some embodiments, occupant A 1414 has the preference that natural light be used to illuminate the zone which occupant A 1414 occupies whenever possible. Using natural light may include opening shades, opening blinds, and/or opening shutters. Occupant B 1416 and occupant C 1418 may have no permission to open and/or close shades, blinds, and/or shutters. Any attempt by occupant B 1416 and/or occupant C 1418 to open and/or close shades, blinds, and/or shutters controlled by control device 214 may be met with a notification that occupant B 1416 and/or occupant C 1418 do not have the proper permission to open and/or close the shades, blinds, and/or shutters.
In some embodiments, control device 214 may disable changes to the shades and/or blinds whenever occupant B 1416 and/or occupant C 1418 are determined to be a set distance from control device 214. In some embodiments, control device 214 may disable changes to the shades and/or blinds whenever occupant B 1416 and/or occupant C 1418 are identified in the zone in which control device 214 is located. In some embodiments, when occupant B 1416 and/or occupant C 1418 are authenticated with building management system 610 and/or control device 214 via a user device (e.g., user device 612) as described with reference to
Display and Emergency Features
Referring now to
Referring now to
In some embodiments, if a connection is lost between control device 214 and building management system 610, control device 214 may display messages stored and/or generated locally on control device 214 (step 1616) on user interface 702. In some embodiments, the display messages stored and/or generated locally on control device 214 include zone temperatures, zone humidity, building events, etc. In the event that an emergency is detected by emergency sensors (e.g., building emergency sensor(s) 606) connected to control device 214, the general messages received from building management system 610 may be overridden and emergency messages may be displayed on user interface 702 based on data received from the emergency sensors (step 1618). In some embodiments, when the data received from the emergency sensors is above a predefined threshold and/or below another predefined threshold, an emergency may be identified. In the event that an emergency is detected by emergency sensors (e.g., building emergency sensor(s) 606) connected to control device 214, the general messages stored locally and/or determined by control device 214 may be overridden and emergency messages may be displayed on user interface 702 based on data received from the emergency sensors.
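The threshold-based override logic above can be sketched as a message-priority function: emergency messages derived from sensor thresholds or weather alerts take precedence over general messages. The threshold values, sensor names, and message strings are illustrative assumptions, not from the specification.

```python
# Hypothetical thresholds; a real control device 214 would use
# configured, sensor-specific limits.
SMOKE_THRESHOLD = 0.15
TEMP_HIGH_F = 120.0

def select_message(general_msg, sensor_readings, weather_alert=None):
    """Return the message to display on user interface 702, with
    emergency messages overriding general messages."""
    if sensor_readings.get("smoke", 0.0) > SMOKE_THRESHOLD:
        return "EMERGENCY: smoke detected"
    if sensor_readings.get("temperature", 0.0) > TEMP_HIGH_F:
        return "EMERGENCY: high temperature"
    if weather_alert is not None:
        return "WEATHER ALERT: " + weather_alert
    return general_msg
```

Ordering the checks from most to least severe implements the override: a general message is shown only when no higher-priority condition is active.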
In some embodiments, control device 214 may receive a message from a weather server (e.g., weather server 608). Control device 214 may be configured to override general messages received from building management system 610 when a notification for a weather-related emergency and/or any other type of emergency is received from weather server 608 (step 1620). Control device 214 may be configured to display weather-related emergency notifications and directions via user interface 702 over the general messages received from building management system 610.
Referring now to
Emergency screen 1700 is shown to have an alert title 1702 describing the contents of the page. In this exemplary embodiment, the title is “TORNADO WARNING.” In some embodiments, alert title 1702 is customizable to provide more information. In other embodiments, alert title 1702 is customizable to provide less information. Alert title 1702 may be a button which takes the user to a page related to the title. For example, clicking alert title 1702 may take a user to a menu of pages related to “TORNADO WARNING.” In some embodiments, clicking and/or pressing alert title 1702 navigates to a website and/or other resource. The website may be a weather server and may provide more information about the nature of the emergency.
Emergency screen 1700 is also shown to have an alert icon 1704. In this exemplary embodiment, alert icon 1704 is an image of a tornado. Alert icon 1704 may be any symbol, text, etc., and indicates the nature of the alert. For example, alert icon 1704 may be an image of a snowflake, text reading “FLOOD,” text reading “FIRE,” text reading “ACTIVE SHOOTER,” etc. Alert icon 1704 provides information to a user about the alert, and may be any indicator relating to any type of emergency.
Emergency screen 1700 is shown to have instructions 1706. Instructions 1706 can provide information to a user about how to proceed in the current situation. In some embodiments, instructions 1706 may inform a user of how to exit a building. For example, instructions 1706 may inform a user of which room to head to. In other embodiments, instructions 1706 inform a user of which authorities to inform, etc. For example, instructions 1706 may instruct a user to call an ambulance, then the police, then building and/or campus security. Instructions 1706 may be downloaded from a network (e.g., network 602). In some embodiments, instructions are requested from network 602. In various embodiments, instructions are pushed to control device 214. Instructions 1706 may be stored for access by control device 214 in specific situations. In some embodiments, instructions 1706 may be stored locally on control device 214. In other embodiments, instructions 1706 may be stored remotely from control device 214. Instructions 1706 may be stored anywhere and retrieved by control device 214.
Emergency screen 1700 is also shown to have directions 1708. In some embodiments, directions 1708 may be an embodiment of instructions 1706. In other embodiments, directions 1708 provide different information from instructions 1706. Directions 1708 may provide a user information regarding where to go. For example, directions 1708 may be an arrow pointing in the correct direction to go. In some embodiments, control device 214 is portable, and may detect movement to alter directions 1708. For example, directions 1708 may change depending on the direction a user is facing. Directions 1708 may be any indicator providing directional information, and is not limited to those specifically enumerated.
Emergency screen 1700 is also shown to have a menu option 1710. In this exemplary embodiment, option 1710 is an “Ok” button. For example, option 1710 may accept the prompt. In some embodiments, option 1710 may simply dismiss the prompt. In other embodiments, option 1710 may proceed to the next action. In some embodiments, option 1710 is a forward button, a menu, etc. Option 1710 may perform any function, and is not limited to those specifically enumerated.
Referring now to
Screen 1800 is shown to include position indicator 1802. Position indicator 1802 may provide information on the whereabouts of a user, or another person, item, component, etc. For example, in this exemplary embodiment, position indicator 1802 is shown as an image of a person, and indicates the position of the person. In some embodiments, position indicator 1802 may indicate the position of multiple users, items, etc. Position indicator 1802 may further include a differentiating label, which may indicate which user, item, etc. is shown by each of the multiple indicators. In other embodiments, position indicator 1802 may indicate the position of a single user, item, etc. Position indicator 1802 may be any symbol, text, etc., and is not limited to those specifically enumerated.
Screen 1800 is shown to include floorplan 1804. Floorplan 1804 may be a diagram of a floorplan of an area serviced by control device 214. In some embodiments, the area is the area in which control device 214 is installed. In other embodiments, the area is another area, and may be selected by a user. In some embodiments, floorplan 1804 may show multiple locations. For example, floorplan 1804 may show both floors of a two-story building. A user may be able to select multiple locations to display (e.g., the top floor and the fourth floor of a 35 story building). In other embodiments, floorplan 1804 may show a single location. Floorplan 1804 may display any number of any locations, and is not limited to those specifically enumerated.
Screen 1800 is also shown to include directions 1806. Directions 1806 may provide information to a user regarding how to navigate to a certain location (i.e., evacuate). In some embodiments, directions 1806 provide the fastest route out of a building. For example, directions 1806 may direct a user to the exit of a building in case of an emergency. In other embodiments, directions 1806 provide a user with a route to a specified location. For example, directions 1806 may direct a user to a shelter (e.g., a basement fallout shelter, a safe location with no windows, etc.). In yet other embodiments, directions 1806 may allow a user to select options for the route. For example, a user may be able to indicate that she wishes to stay on the same floor, avoid stairs, etc. In yet other embodiments, directions 1806 may enable a user to select multiple destinations. For example, a user may indicate that he wishes to stop by a supply room before continuing to a conference room. The user may be able to make edits to any selections made. Directions 1806 are not limited to those forms and features specifically enumerated.
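One way the fastest evacuation route could be computed is a breadth-first search over the floorplan modeled as a room-adjacency graph. The following is a minimal sketch under that assumption; the room names, graph layout, and function name are hypothetical, not from the disclosure.

```python
from collections import deque

def fastest_route(floorplan, start, exits):
    """Breadth-first search for the shortest path from start to any exit.
    Returns the path as a list of rooms, or None if no exit is reachable."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] in exits:
            return path
        for neighbor in floorplan.get(path[-1], []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None

# Hypothetical floorplan: rooms mapped to adjacent rooms.
floorplan = {
    "office": ["hallway"],
    "hallway": ["office", "lobby", "stairs"],
    "stairs": ["hallway", "exit_b"],
    "lobby": ["hallway", "exit_a"],
}
print(fastest_route(floorplan, "office", {"exit_a", "exit_b"}))
```

Route preferences such as "avoid stairs" could be honored by removing the corresponding rooms from the graph before searching.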
Referring now to
Health Care and Hospital Features
Referring now to
Healthcare module 752 facilitates healthcare functionality of control device 214. Functions performed by healthcare module 752 may include monitoring the health of occupants of the area in which control device 214 is installed. In some embodiments, healthcare module 752 may monitor an occupant's health through data collected by healthcare sensors 604 and/or may determine a health metric for the occupant based on the data collected by healthcare sensors 604. For example, healthcare module 752 may monitor an individual's health by tracking his temperature through healthcare sensor 604. In some embodiments, healthcare sensor 604 is one or more or a combination of a smartwatch, a smart wrist band, a heart rate monitor, a pacemaker, a portable insulin device, and/or any other wearable medical device. In some embodiments, healthcare sensor 604 is a camera, an infrared camera, and/or any other occupancy detection device. Healthcare module 752 may use healthcare sensors 604 to monitor a user's waking/rest times, heart rate, insulin levels, body temperature, etc. Healthcare module 752 is not limited to monitoring the health attributes specifically enumerated, and may monitor any aspect of a user's bio-status. In some embodiments, control device 214 is configured to forward any data collected by healthcare sensors 604 and/or healthcare equipment 2104 to medical server 2102. In some embodiments, medical server 2102 is a hospital server, a nurse's station computing system, and/or an emergency response operator server.
Healthcare module 752 may communicate with user interface 702 or user device 612 belonging to a user to sense and collect health data. For example, healthcare module 752 may communicate with an individual's smartwatch which contains a heart rate monitor to track the individual's heart rate. In some embodiments, control device 214 does not communicate with healthcare sensors 604 which monitor a user's health, and instead collects data solely from healthcare equipment 2104. In other embodiments, control device 214 contains sensors and collects data from other devices, combining the data collected to produce a general metric of a user's health.
Healthcare module 752 may detect a change of a predetermined amount or a sensor value over or under a predetermined threshold value (e.g., abnormally high and/or low heart rate (i.e., tachycardia and bradycardia), abnormally high and/or low insulin level, abnormally high and/or low temperature, etc.). In some embodiments, healthcare module 752 may monitor the heart rate of an occupant and determine if the heart rate is abnormal (i.e., arrhythmia). In some embodiments, healthcare module 752 may alert a user, the monitored occupant, a nurse's station computing system, a hospital server, a hospital computing system, etc., or call 911 (i.e., send a message to an emergency response server and/or an emergency response computing system). For example, healthcare module 752 may communicate with user device 612 of a user to display an alert describing the situation triggering the healthcare alert. Healthcare module 752 may communicate with network 602 to update a healthcare system (e.g., medical server 2102) with new data collected, set a flag on a user's condition, etc. For example, healthcare module 752 may send data to a patient database and update a value for a body temperature, blood pressure, etc.
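The threshold comparisons described above can be sketched as follows. This is an illustrative Python example only; the numeric limits are placeholder assumptions, not clinical values from the disclosure.

```python
# Assumed illustrative limits; real systems would use clinically set values.
VITAL_LIMITS = {
    "heart_rate": (60, 100),    # beats per minute (low, high)
    "body_temp": (97.0, 99.5),  # degrees Fahrenheit (low, high)
}

def check_vitals(readings):
    """Return an alert string for each reading outside its (low, high) limits."""
    alerts = []
    for name, value in readings.items():
        low, high = VITAL_LIMITS[name]
        if value < low:
            alerts.append(f"{name} abnormally low: {value}")
        elif value > high:
            alerts.append(f"{name} abnormally high: {value}")
    return alerts

print(check_vitals({"heart_rate": 72, "body_temp": 98.6}))   # no alerts
print(check_vitals({"heart_rate": 45, "body_temp": 101.2}))  # two alerts
```

Each returned alert could then be routed to user device 612, a nurse's station computing system, or an emergency response server as described above.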
In some embodiments, a heart rate and/or body temperature is measured by a smart wrist band and/or smart watch (e.g., healthcare sensors 604). The heart rate and/or body temperature (e.g., health data 2103) may be sent to control device 214. In some embodiments, healthcare sensors 604 are cameras. The cameras may be heat sensitive. The heat images (e.g., health data 2103) may be sent to control device 214. Control device 214 may determine the body temperature of various occupants of a building (e.g., building 10) based on the heat images (e.g., health data 2103) received from healthcare sensors 604.
Healthcare module 752 may send push alerts to user device 612 from network 602. For example, network 602 may receive a notification that it is time for a middle school individual to take her medication. Control device 214 may communicate with user device 612 of the individual, a teacher, a nurse, etc. to alert the user of user device 612 that it is time for the individual to take her medication. In some embodiments, control device 214 may communicate with a user through user interface 702 to convey healthcare information. For example, network 602 may receive a notification that it is time for an individual's appointment with the nurse. Network 602 may communicate with control device 214 to convey the information to the nurse, the individual, the individual's current teacher, etc. For example, control device 214 may have access to a user's schedule and/or calendar, and adjust actions accordingly. In some embodiments, control device 214 may determine that an individual is currently in math class, and may send an alert to user device 612 of the individual. In other embodiments, control device 214 may determine that an individual is currently in a free period with a specific teacher in a specific room, and may send an alert to a control device 214 installed in the room, or to a user device 612 of the teacher. Control device 214 may convey healthcare information through any media, and is not limited to those specifically discussed.
Healthcare module 752 may contain some or all of the features of occupancy module 754. The occupancy detectors (e.g., healthcare sensors 604, sensors 714, etc.) may be installed in a patient room in a health care facility and may be used to monitor the presence of the patient in the room. Healthcare module 752 may communicate with the network 602, medical server 2102, and/or building management system 610 to alert medical personnel if a patient leaves their room without permission. Healthcare module 752 may communicate with a user interface to determine the identities of persons in a patient's room. For example, the occupancy detector may use a camera and facial recognition software to determine the identities of medical personnel that are present. Healthcare module 752 may use a camera and facial recognition to determine the presence of visitors and other unauthorized personnel in a patient's room.
In some embodiments, the healthcare module 752 communicates with users or relevant persons (e.g., via building management system 610, medical server 2102, user device 612, etc.) when an emergency situation arises. Healthcare module 752 may receive the patient's health information from the network, healthcare sensors 604, and/or healthcare equipment 2104, and display it to medical personnel if a medical alert is detected (e.g., abnormal blood pressure, abnormal oxygen saturation, abnormal heart rate, abnormal heart rhythm, etc.). In another embodiment, healthcare module 752 may communicate to the patient or to medical personnel when a regular medical procedure is scheduled. For example, healthcare module 752 may communicate to the patient or to medical personnel when a pill is to be taken, when an IV is to be replaced, when a wound dressing is to be changed, etc. In another embodiment, healthcare module 752 may communicate with an alert module to communicate with user device 612 of a patient. For example, a patient undergoing treatment requiring regular pill taking may receive alerts from an alert module on a mobile device (e.g., a smartphone, smart watch, wearable, laptop, etc.).
Healthcare module 752 may communicate with any systems, devices, etc. connected to control device 214. For example, healthcare module 752 may issue an alert to medical personnel which is pushed to control device 214 (e.g., a nurse's station) and mobile devices (e.g., user device 612 of medical personnel assigned to the patient, etc.) Healthcare module 752 may issue an alert which is pushed to user devices 612 through network 602. Healthcare module 752 may be in communication with all modules of control device 214.
In some embodiments, healthcare module 752 may require the credentials of healthcare personnel to make changes related to treatment of the patient. The healthcare module 752 may record the unique identity of any user making changes to a patient's treatment.
Referring now to
In various embodiments, other control devices 468 are located remotely, such as in other buildings, states, countries, etc. For example, referring to
In an exemplary scenario, a patient may be discharged from a medical care facility, such as a hospital, to their home or to an assisted living facility. The patient may, for example, have received a routine checkup or may have been treated for a chronic or acute medical situation. The patient may be automatically monitored by healthcare equipment 2104 as described with reference to
Control device 214 may continue to monitor the health of the patient after receiving medical care. If control device 214 detects a medical alert, it may take an action, depending on the severity of the medical alert. For example, control device 214 may prompt the patient to return to the hospital, alert a local medical person (e.g., an in-home nurse or caretaker), or may have an ambulance sent to the patient's location.
In some embodiments, control device 214 can transmit patient data to a central computer system (over a local network or via the internet) in compliance with HIPAA standards and regulations.
In some embodiments, control device 214 may not collect personal health data without consent of the person whose data is being collected. In other embodiments, control device 214 may offer an opt-out system, where control device 214 is prevented from collecting personal health data when a user specifically opts out. In yet other embodiments, control device 214 may collect data from all users, and anonymize all data before storing, analyzing, etc. For example, control device 214 may collect data from all patients undergoing a particular procedure and anonymize all data before sending to a research facility, hospital, etc.
Control device 214 may collect data from each person, and each person is given a window of time to opt out retroactively or delete data. In some embodiments, control device 214 may communicate with the users through the user interface, a mobile device, and/or the network to inform users that their data has been collected. For example, control device 214 may push a notification out to all applicable users over the network that his or her information has been collected, and will be stored or sold to a hospital within 24 hours. In some embodiments users may be given the full 24 hours to opt out or delete data. In other embodiments, users may be given any predetermined period of time in which to respond or take action.
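The retroactive opt-out window described above can be sketched in a few lines. This is a hypothetical illustration; the 24-hour window value and the record layout are assumptions for the example.

```python
OPT_OUT_WINDOW_HOURS = 24  # assumed predetermined period; configurable in practice

def releasable(records, now_hours):
    """Return records whose opt-out window has expired and that were
    not opted out, i.e., records eligible for storage or sharing."""
    return [
        r for r in records
        if not r["opted_out"]
        and now_hours - r["collected_at_hours"] >= OPT_OUT_WINDOW_HOURS
    ]

# Hypothetical collected records with collection times in hours.
records = [
    {"user": "a", "collected_at_hours": 0, "opted_out": False},
    {"user": "b", "collected_at_hours": 0, "opted_out": True},   # opted out in time
    {"user": "c", "collected_at_hours": 10, "opted_out": False}, # window still open
]
print([r["user"] for r in releasable(records, now_hours=24)])  # ['a']
```

Only data that survives both checks would be forwarded, and anonymization could still be applied before any sharing, as discussed above.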
Control device 214 may communicate with users to ask for permission to share his or her information. For example, control device 214 may display a prompt on a mobile device of each person whose data was collected. In some embodiments, control device 214 may share a user's data when permission has been granted. In other embodiments, control device 214 may share non-sensitive user data that has been anonymized.
Referring now to
The individual 2408 may communicate directly with control device 214 through a user interface, voice commands, etc. For example, individual 2408 may tell control device 214 that he does not feel well. In some embodiments, control device 214 may trigger an alert or take some other action depending on the information received. In other embodiments, control device 214 may wait for specific instructions to take action before executing any commands.
In part 2404, a screen of control device 214 during normal health monitoring operation is shown. Control device 214 has confirmed individual 2408's body temperature is normal; it displays the temperature, the individual's name, and an indication that all is well, and takes no further action. In some embodiments, control device 214 stores the information. In other embodiments, control device 214 sends the information to healthcare institutions, facilities, or professionals (e.g., medical server 2102, building management system 610, etc.). Control device 214 may handle all information in accordance with HIPAA rules and regulations.
Control device 214 may monitor and collect any health data, such as blood pressure, heart rate, etc. For example, control device 214 may communicate with a heart rate monitor, and raise an alarm if an individual's heart rate becomes irregular, over a threshold rate, etc. For example, control device 214 may detect that an individual is experiencing a high amount of stress using a combination of body temperature and heart rate. Control device 214 is not limited to the health statistics specifically enumerated.
In part 2406, control device 214 has automatically detected that a health condition has arisen. In this exemplary depiction, the health condition is a fever, detected by the high body temperature. In other embodiments, the health condition may be high stress, arrhythmia, low blood sugar, etc. Control device 214 may produce a sound, vibrate, flash the screen, etc. to present an alert to a user. In some embodiments, control device 214 may send a signal to a user device (e.g., user device 612, network 602, building management system 610, medical server 2102, etc.) or some other system or device to display the alert, as described above.
Referring now to
Screen 2500 further includes an alert message 2504 and a cause 2506. Alert message 2504 may display any message, such as “STUDENT COLLAPSE,” “STUDENT EMERGENCY,” etc. In some embodiments, alert message 2504 may be customized to provide more information, such as the individual's name, emergency contact information, etc. In other embodiments, alert message 2504 may be customized to display anything that may be more helpful or appropriate for the environment in which control device 214 is installed. Alert message 2504 is not limited to those messages specifically enumerated.
Cause 2506 may be any reason, such as “Cardiac distress,” “Low blood sugar,” etc. In some embodiments, cause 2506 may be customized to provide more information, such as the individual's name, emergency contact information, etc. In other embodiments, cause 2506 may be customized to display anything that may be more helpful or appropriate for the environment in which control device 214 is installed. Cause 2506 is not limited to those messages specifically enumerated.
Screen 2500 is further shown to include an icon 2508. Icon 2508 may give a user a quick impression of what the alert is related to. Control device 214 is capable of providing alerts for many different categories, such as inclement weather, security, health, etc. Control device 214 is not limited to those categories specifically enumerated. Icon 2508 may be a symbol, a word, etc., and may be any indication of what the alert is related to.
Screen 2500 is further shown to include a location 2510. Location 2510 may give a user the location of the particular individual to which the alert is related. In some embodiments, location 2510 is provided as text. In other embodiments, location 2510 is provided as a map. For example, location 2510 may be displayed as live feed 2502. Location 2510 may be displayed or presented to the user in any form, and is not limited to those specifically enumerated.
Screen 2500 is finally shown to include options 2512, 2514, and 2516. Options 2512, 2514, and 2516 may provide a user with options of actions to take. In some embodiments, screen 2500 may include more options. In other embodiments, screen 2500 may include fewer options. The options presented may be customized to be more appropriate for each situation. For example, if an individual's insulin pump needs to be restarted, control device 214 may present the option of restarting the pump. In some embodiments, option 2516 to ignore the alert may not be available. For example, if an individual is in critical condition, such as cardiac arrest, control device 214 may automatically execute options 2512 and 2514 by calling security and 911.
Concierge and Hotel Features
Referring now to
In some embodiments, hotel module 750 is configured to process orders for food from local restaurants. In some embodiments, control device 214 (i.e., hotel module 750) may send a request to a restaurant computing system 2602 for a menu. Control device 214 may display the menu to the user via user interface 702 and may allow the user to order food directly through user interface 702 (i.e., enter orders through user interface 702). In some embodiments, the user may be able to send a reservation request to restaurant computing system 2602 via hotel module 750 and user interface 702. A user may place an order via user interface 702 causing hotel module 750 to communicate with restaurant computing system 2602 via network 602. Hotel module 750 may cause payment module 758 to process any payment transactions for food orders with financial institution system 3504. Payment transactions are described in further detail at
In some embodiments, hotel module 750 is configured to process requests for taxis, buses, subways, trains, and/or planes. In some embodiments, control device 214 communicates with transportation server 2604. Transportation server 2604 may be Uber, Lyft, and/or any other taxi service. In some embodiments, transportation server 2604 is an airline server, a bus server, a train server, etc. Hotel module 750 may allow a user to request a ride from transportation server 2604 and may cause payment module 758 to process payment transactions via network 602 and financial institution system 3504. In some embodiments, input device 712 may be configured to scan credit and/or debit cards for payment for transactions with restaurant computing system 2602 and/or transportation server 2604. In some embodiments, payment module 758 facilitates the transaction with financial institution system 3504. Input device 712 is described in further detail in
Referring now to
According to this exemplary embodiment, a calendar interface may be provided to a user via the user interface and/or the mobile device. In some embodiments, the calendar interface may show the user's appointments and events. For example, a user's work and personal calendar events may be displayed on the calendar interface. In other embodiments, multiple users' schedules may be displayed on the calendar interface.
The calendar interface may show information such as availabilities for a hotel. In some embodiments, the control device 214 is located inside the hotel for which it displays availability. In some embodiments, the calendar interface may provide all availabilities. In other embodiments, the calendar interface may be sorted according to room size, amenities, etc. The calendar interface may not be specific to a single hotel. In some embodiments, the calendar interface may display availabilities for multiple hotels. The hotels shown may be selected by a user. In other embodiments, control device 214 may automatically select multiple hotels according to criteria such as price range, length of stay, amenities, distance to destinations, hotel ratings, etc.
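Selecting hotels by criteria such as price range and amenities could be implemented as a simple filter-and-sort pass. The sketch below is illustrative only; the hotel records, field names, and function name are assumptions, not from the disclosure.

```python
def filter_hotels(hotels, max_price, required_amenities):
    """Keep hotels within budget that offer every required amenity,
    ordered from least to most expensive."""
    matches = [
        h for h in hotels
        if h["price"] <= max_price
        and required_amenities <= set(h["amenities"])
    ]
    return sorted(matches, key=lambda h: h["price"])

# Hypothetical availability data.
hotels = [
    {"name": "Downtown Inn", "price": 120, "amenities": ["wifi", "pool"]},
    {"name": "Airport Suites", "price": 90, "amenities": ["wifi"]},
    {"name": "Lakeside Hotel", "price": 150, "amenities": ["wifi", "pool", "gym"]},
]
print([h["name"] for h in filter_hotels(hotels, 140, {"wifi", "pool"})])  # ['Downtown Inn']
```

Additional criteria such as hotel ratings or distance to a destination could be added as further filter conditions or as extra keys in the sort.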
The information may be displayed in any format. For example, control device 214 may display the information as drop-down boxes, check boxes, etc. In some embodiments, control device 214 may display content directly from a hotel's website, a travel website, etc. In other embodiments, control device 214 may display content parsed from a website, in a format native to control device 214.
Process 2606 continues with step 2610, in which a user selects a range of days for her stay at the hotel. In some embodiments, a user selects a range of consecutive days. In other embodiments, a user may select a set of non-consecutive days. The user may enter other information, such as billing information, number of guests, destination, etc. In some embodiments, the calendar interface may display the range of days selected as darkened days, checked boxes, etc. The information input by the user is transmitted from control device 214 to a building management system for a hotel (e.g., building management system 610) and/or any other server for the hotel.
Process 2606 continues with step 2612, in which the information transmitted from control device 214 is received by a database. In some embodiments, control device 214 may book a stay at the hotel directly using entered billing information. In other embodiments, control device 214 connects the user to a travel agent, to the hotel's booking website with the fields pre-populated, etc. The information transmitted from control device 214 may be received by any system, and is not limited to databases. In some embodiments, the database is connected to a hotel's main system, and hotel staff are notified. In some embodiments, the hotel's main system is building management system 610.
The database may be connected to additional services, such as destinations, airlines, etc. For example, control device 214 may automatically suggest flights from a billing address entered by the user to the destination entered by the user. In some embodiments, control device 214 may automatically select flights and present the user with a confirmation dialog. In other embodiments, control device 214 presents a set of available flights for the scheduled hotel stay. Control device 214 may also suggest, book, etc. activities, such as local attractions, tours, ground transportation, etc.
Control device 214 may learn from information entered by the user with his permission. For example, control device 214 may store information such as a user's preferences for flight times, direct vs. non-direct flights, seat preferences, hotel chain preferences, pillow firmness preferences, attractions, tours, ground transportation, etc. A user may be presented with a dialog confirming that she is allowing control device 214 to store or analyze such data. In some embodiments, the data is stored remotely. In other embodiments, the data is stored locally on control device 214.
Process 2606 continues with step 2614 in which control device 214 provides the user with information. In some embodiments, control device 214 provides a confirmation of all bookings made. In other embodiments, control device 214 provides a list of prospective bookings, contact information for each option, etc. Control device 214 may provide the user with any information. In some embodiments, control device 214 may not provide the user with further information.
In this exemplary embodiment, control device 214 is shown to provide the user with information through a user interface (e.g., user interface 702). In other embodiments, control device 214 may provide the user with information through any medium, format, etc. For example, control device 214 may provide the user with information through speakers (e.g., speakers 710), a mobile device (e.g., user device 612), etc.
Referring now to
Process 2700 continues with step 2704, in which control device 214 may present the user with a list of available modes of transportation. For example, control device 214 may present the user with a list of links to different sites of different modes of transportation. In some embodiments, each option is a link which takes the user to a set of available options. Availability may be determined by criteria such as the current time, the desired time, the location, the distance, the mode of travel, extra considerations for the passenger (oversize luggage, animals, etc.), etc. In some embodiments, the user may enter the criteria via user interface 702. In various embodiments, the user may enter the criteria via microphone 726 and voice command module 744. Control device 214 may suggest the closest form of transportation if the selected mode is unavailable. In some embodiments, control device 214 may make suggestions and/or arrange the list of modes of transportation (i.e., most relevant mode of transportation to least relevant mode of transportation) based on the most commonly used, least expensive, fastest, a target destination, etc. For example, if no taxis are available at the desired time, control device 214 may suggest taking the subway.
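Arranging the list from most to least relevant mode of transportation, as described in step 2704, could be done by filtering out unavailable modes and sorting on criteria such as cost and travel time. The following sketch is a hypothetical illustration; the mode records and sort keys are assumptions.

```python
def rank_modes(modes):
    """Order available transportation modes from most to least relevant,
    here taken to mean least expensive first, then fastest."""
    available = [m for m in modes if m["available"]]
    return sorted(available, key=lambda m: (m["cost"], m["minutes"]))

# Hypothetical options for a requested trip; the taxi is unavailable,
# so an alternative (e.g., the subway) is suggested instead.
modes = [
    {"mode": "taxi", "available": False, "cost": 25, "minutes": 15},
    {"mode": "subway", "available": True, "cost": 3, "minutes": 30},
    {"mode": "rideshare", "available": True, "cost": 18, "minutes": 15},
]
print([m["mode"] for m in rank_modes(modes)])  # ['subway', 'rideshare']
```

Other criteria from the description, such as oversize luggage or animals, could be handled by excluding modes that cannot accommodate them before sorting.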
Process 2700 continues with step 2706, in which control device 214 may make arrangements for the final selection. For example, once the user has selected the taxi company, times, options, etc., control device 214 may place a call to the company to make arrangements. In some embodiments, control device 214 may enter the information in the company's website. In other embodiments, control device 214 may present the information to the user, who will make the final arrangements himself.
Process 2700 continues with step 2708, in which the user is connected with her transportation. In some embodiments, the transportation travels to pick up the user. In other embodiments, the user travels to board the transportation. The travel arrangements may be made for travelling to a destination, travelling from a destination, etc. Travel arrangements may be made for any purpose.
Referring now to
Other ways of making arrangements may be available via control device 214. In some embodiments, a user may be able to set preferences through voice command, gesture input, etc. In other embodiments, a user may set preferences through specific applications, the hotel's website, etc. In some embodiments, the control device 214 can send payment and/or credit card information for the transportation. In some embodiments, hotel module 750 may process payment with input device 712 and payment module 758.
Referring now to
Process 2900 continues with step 2904, in which control device 214 receives reservation information for the room at a first time. Control device 214 may display a confirmation message. In some embodiments, control device 214 may send a confirmation message to the front desk, main system, etc. In other embodiments, control device 214 may send a confirmation message to the user. In this exemplary embodiment, the reservation information is received at 1 p.m. local time, and the reservation is for 6 p.m. local time.
Process 2900 continues with step 2906, in which the reservation information and/or preferences are analyzed. The received information may include room number, temperature, humidity, lighting level, pillow firmness, etc. Other information and preferences may be set. The format in which the information is presented to the system, control device 214, etc. may be any format. For example, the system may receive the information as raw data while control device 214 receives data parsed into packets for each category of preference.
Process 2900 continues with step 2908, in which control device 214 may determine the amount of time needed to reach the guest's preferred settings, and when to begin preparing. Control device 214 may determine the approximate time of arrival of a guest and the approximate amount of time needed to reach the environmental setpoints of the guest.
Process 2900 continues with step 2910, in which control device 214 begins preparing the room at the determined time. For example, the preparation for a guest Jimmy arriving at 6 p.m. is shown to begin at 4 p.m. Control device 214 may begin to change the temperature, humidity, etc. of the room. For example, control device 214 may begin to heat the room from 69° F. to Jimmy's preferred 70° F.
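The timing computation of steps 2908 and 2910 can be sketched as follows, assuming a fixed conditioning rate; the rate value and function names are illustrative assumptions, not part of the disclosure:

```python
from datetime import datetime, timedelta

# Assumed conditioning rate (hypothetical; the disclosure does not specify one).
DEGREES_PER_HOUR = 0.5

def preparation_start(arrival, current_temp, preferred_temp,
                      degrees_per_hour=DEGREES_PER_HOUR):
    """Return the time at which to begin conditioning the room so the
    guest's preferred temperature is reached by the arrival time."""
    hours_needed = abs(preferred_temp - current_temp) / degrees_per_hour
    return arrival - timedelta(hours=hours_needed)

# Jimmy arrives at 6 p.m.; the room is 69 degrees and he prefers 70 degrees.
start = preparation_start(datetime(2024, 5, 1, 18, 0), 69.0, 70.0)
```

With the assumed rate of 0.5° F. per hour, the one-degree change in the example yields a 4 p.m. start for a 6 p.m. arrival, matching the scenario above.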
Process 2900 continues with step 2912, in which control device 214 informs hospitality services of the guest's preferences. In this exemplary embodiment, Jimmy prefers firm pillows. Control device 214 is shown to inform the front desk of Jimmy's preference. In some embodiments, control device 214 communicates directly with the front desk (e.g., a computer at the front desk). In other embodiments, control device 214 goes through an intermediary (e.g., network 602) to communicate with the front desk. Control device 214 may communicate with the front desk through any means, and may transmit any information. Control device 214 may be compliant with all privacy rules and regulations.
Process 2900 continues with step 2914, in which control device 214 communicates with hotel equipment (e.g., HVAC equipment 738) to achieve the guest's preferences. In this exemplary embodiment, Jimmy prefers low lighting. Control device 214 may communicate with the lights of the room (e.g., HVAC equipment 738) to dim them. In some embodiments, control device 214 may communicate directly with lights 2920. In other embodiments, control device 214 may communicate through an intermediary, such as a hotel automation system (e.g., building management system 610), network 602, etc. Control device 214 may communicate with hotel equipment (e.g., HVAC equipment 738) through any communications protocol, and is not limited to those specifically enumerated.
Process 2900 continues with step 2916, in which the guest arrives at the room at a time indicated by his reservation information transmitted to control device 214. In this exemplary embodiment, Jimmy arrives at Room 78 at 6 p.m. local time. Control device 214 is shown to display one or more room settings. For example, control device 214 is shown to be mounted to a wall of the room, and displays the current room temperature—Jimmy's preferred 70° F. Lighting 2920 may be at Jimmy's preferred low setting. In some embodiments, accommodations such as bed inclination level/mattress firmness (e.g., hotel module 750) may be adjusted. In other embodiments, fewer settings may be adjusted.
Process 2900 continues with step 2918, in which the guest is greeted by control device 214. In some embodiments, control device 214 greets the guest purely visually. For example, control device 214 may display text saying “Welcome to Room 12, Aaron.” In other embodiments, control device 214 may greet the guest using sound. For example, control device 214 may say “Welcome to Room 78, Jimmy.” Control device 214 may greet the user through any means. Control device 214 may be customizable to use a greeting a user has selected, or a greeting specific to the hotel, the room, etc. the user is staying in. Control device 214 may provide options to the user, such as a call for room service, access to the front desk, concierge, etc. In some embodiments, control device 214 performs many of the functions of the concierge desk. In other embodiments, control device 214 connects a user to the concierge desk.
Referring now to
Process 3000 continues with step 3004, in which the user chooses an option and inputs the selection to control device 214. In some embodiments, the user may provide the input as a voice command. In other embodiments, the user may provide the selection as a button press, a tactile input, a gesture, etc. via a user interface (e.g., user interface 702). Any input method may be used.
Process 3000 continues with step 3006, in which the selection is transmitted from control device 214 to the appropriate system. In some embodiments, the appropriate system is building management system 610. For example, if the selection made is a request for new towels, housekeeping would be notified. In some embodiments, housekeeping may be notified via building management system 610. In some embodiments, a selection indicates that other departments, such as the front desk, billing, etc., should be contacted. In some embodiments, the front desk and billing are connected to building management system 610.
In other embodiments, the request made can be executed automatically by control device 214. For example, if the user requests that the light be turned off when there are multiple lights in the room, control device 214 may use voice command detection (e.g., voice control module 748). Control device 214 may detect which occupancy sensor (e.g., sensors 714) detected the user's voice, or which sensor detected the voice the “loudest.” Control device 214 may decide the location of the user using an algorithm and turn off the light nearest that location.
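The "loudest sensor" heuristic described above can be sketched as follows. The sensor and light names, loudness values, and the sensor-to-light mapping are all hypothetical assumptions for illustration:

```python
# Sketch of locating a speaker by the sensor that heard the voice command
# the "loudest," then switching off the light nearest that sensor.
def locate_speaker(loudness_by_sensor):
    """Pick the sensor that registered the voice command the loudest."""
    return max(loudness_by_sensor, key=loudness_by_sensor.get)

def nearest_light(sensor, light_for_sensor):
    """Map the winning sensor to the light closest to it."""
    return light_for_sensor[sensor]

readings = {"sensor_bed": 0.82, "sensor_desk": 0.35, "sensor_door": 0.12}
lights = {"sensor_bed": "bed_lamp", "sensor_desk": "desk_lamp",
          "sensor_door": "entry_light"}
target = nearest_light(locate_speaker(readings), lights)  # light to switch off
```

A production algorithm would likely weigh multiple sensors and room geometry rather than a single maximum, but the maximum-loudness rule captures the decision described in the paragraph above.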
Referring now to
The user may request information in any way. In some embodiments, the user may request information through voice commands. In other embodiments, the user may request information through tactile input (e.g., via user interface 702), via a mobile device (e.g., user device 612), etc.
Process 3100 continues with step 3104, in which control device 214 has obtained the requested information, and transmits the information to the user. In some embodiments, control device 214 provides the information to the user through speakers. For example, control device 214 may say “The gym closes at 12 a.m.” In other embodiments, control device 214 may transmit the information through text, images, etc. Control device 214 may present the information to the user via a user interface (e.g., user interface 702), a mobile device (e.g., user device 612), etc.
In some embodiments, control device 214 provides information to the user in the same way the user requested the information. For example, if the user asked a question using a voice command, control device 214 would answer the question via speakers. In other embodiments, control device 214 may provide information to the user according to her preferences. In yet other embodiments, control device 214 would answer the question via a default method, which may be customizable.
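The modality-mirroring behavior described above can be sketched as a simple dispatch; the modality names and the fallback order are illustrative assumptions:

```python
# Sketch of answering in the same modality as the request, unless the
# user has set a preference; otherwise fall back to a default.
def choose_output_modality(request_modality, user_preference=None,
                           default="speech"):
    """Mirror the request modality unless the user has set a preference."""
    if user_preference:
        return user_preference
    if request_modality in ("voice", "speech"):
        return "speech"
    if request_modality in ("touch", "text"):
        return "text"
    return default
```

For example, a voice question would be answered over the speakers, while a user whose stored preference is text would receive text regardless of how the question was asked.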
Referring now to
The user may request information in any way. In some embodiments, the user may request information through voice commands. In other embodiments, the user may request information through tactile input (e.g., via user interface 702), via a mobile device (e.g., user device 612), etc.
Process 3200 continues with step 3204, in which control device 214 has obtained the requested information, and transmits the information to the user. In some embodiments, control device 214 provides the information to the user through speakers. In other embodiments, control device 214 may transmit the information through text, images, etc. if the answer is too long or too complicated to answer over speakers. For example, if the information requested is an explanation for why the sky is blue, control device 214 may, as a default, present the information to the user through text. Control device 214 may present the information to the user via user interface 702, user device 612, etc.
Referring now to
The user may request information in any way. In some embodiments, the user may request information through voice commands. In other embodiments, the user may request information through tactile input (e.g., user interface 702), via a mobile device (e.g., user device 612), etc.
Process 3300 continues with step 3304, in which control device 214 has obtained the requested information, and transmits the information to the user. In some embodiments, control device 214 provides the information to the user through speakers. In other embodiments, control device 214 may transmit the information through text, images, etc. In this exemplary embodiment, the information is presented through an interface of a companion application for control device 214. The exemplary embodiment includes a room status indicator 3306. The exemplary embodiment also includes a menu option 3308. The exemplary embodiment includes a message 3310 that greets the user and provides relevant information. For example, if the user is leaving the hotel on that day, message 3310 may include the time of checkout.
The exemplary embodiment includes an information section 3312 that provides relevant information regarding attractions and accommodations. In some embodiments, the attractions and accommodations are local to the hotel. In other embodiments, a user may specify the location, distance, price, etc. Control device 214 may store the information. In some embodiments, control device 214 may access the information from an outside site, such as Yelp, Google Reviews, etc.
The exemplary embodiment includes a navigation section 3314 that provides navigation tools. In some embodiments, the tools are buttons represented by icons. In other embodiments, the tools may be text links, check boxes, etc. Navigation section 3314 may be customized to provide relevant options. The exemplary embodiment further includes a system indicator 3316. The exemplary embodiment further includes a page title 3318.
Process 3300 continues with step 3318, in which a screen shows accommodations available at the hotel. A user may input a selection through control device 214 by any means previously described.
Process 3300 continues with step 3320, in which a screen showing a floorplan is displayed on control device 214. In some embodiments, the floorplan may display a user selection, such as a pool. In this exemplary embodiment, the user selected the pool from the screen of control device 214. The location of the pool on the floorplan is shown on the screen. In other embodiments, other information may be shown on control device 214, as described earlier.
Referring now to
Process 3400 continues with step 3404, in which control device 214 thanks the user for staying with the hotel with a parting message. In some embodiments, the parting message may be customized to the user's liking. In other embodiments, the parting message is customized for the hotel. The parting message may be delivered in any way. In some embodiments, the parting message is delivered via speakers. In other embodiments, the parting message is delivered as text, images, etc. The parting message may be accompanied by a receipt for the total of the stay. In some embodiments, the receipt may be printed by control device 214. In other embodiments, the receipt may be printed at the front desk and delivered to or picked up by the user. Process 3400 may be executed by control device 214 and/or hotel module 750.
In some embodiments, control device 214 prompts the user to enter payment information and/or swipe a credit and/or debit card via input device 712. This may allow the user to pay for their stay and/or any additional charges without stopping at the front desk. In some embodiments, the control device facilitates transfer of funds from a financial account associated with a user to a financial account associated with the hotel. The financial account may be held with financial institution system 3504 and control device 214 may facilitate the transfer of funds with hotel module 750 and payment module 758. In some embodiments, the user is required to swipe their card with input device 712 at the beginning of their stay and simply confirm the amount and/or leave a tip when their stay expires.
Payment Features
Referring to
Referring specifically to
Referring now to
Referring to
In some embodiments, input device 712 (e.g., card reader, wireless reader, etc.) may be integrated into the user control device. For example, input device 712 may be integrally formed with the display or the base. In other embodiments, input device 712 may be coupled to the display or the base (e.g., as an aftermarket device, etc.). In other embodiments, input device 712 may be separate from control device 214 and may be connected to control device 214 through a wired connection or a wireless connection.
Referring now to
Referring now to
The process continues with step 3904 in which payment data is received by user control device 214. Payment data may be received, for example, by swiping a card through a card reader (e.g., input device 712, card reading device 3602, etc.), inserting a card into a card reader, passing a card under a sensor (e.g., an infrared sensor), or holding a card or mobile device close to control device 214. The payment data may include various information such as authentication data, encryption data, decryption data, etc.
The process continues with step 3906 in which user control device 214 communicates with financial institution system 3504 to authorize the payment. Financial institution system 3504 may, for example, be a credit card company or a banking network. The control device 214 communicates a variety of information to financial institution system 3504 including payment data and transaction data to authorize the payment.
Thermostat with Direction Display
Referring now to
In some embodiments, network 4004 communicatively couples the devices, systems, and servers of system 4000. In some embodiments, network 4004 is at least one of and/or a combination of a Wi-Fi network, a wired Ethernet network, a Zigbee network, a Bluetooth network, and/or any other wireless network. Network 4004 may be a local area network or a wide area network (e.g., the Internet, a building WAN, etc.) and may use a variety of communications protocols (e.g., BACnet, IP, LON, etc.). Network 4004 may include routers, modems, and/or network switches. Network 4004 may be a combination of wired and wireless networks.
In some embodiments, display device 4002 is configured to receive emergency information and navigation directions via network 4004. In some embodiments, display device 4002 is a wall mounted device with a display screen. For example, display device 4002 can be a thermostat, a humidistat, a light controller, or any other wall mounted device with a display screen. In some embodiments, display device 4002 is connected to building emergency sensor(s) 4006 and receives emergency data from the building emergency sensor(s) 4006. In some embodiments, building emergency sensor(s) 4006 are sensors which detect building emergencies. Building emergency sensor(s) 4006 can include, for example, smoke detectors, carbon monoxide detectors, fire pull handles, panic buttons, gunshot detection sensors, and any other emergency sensor. In some embodiments, the emergency sensor(s) include actuators. The actuators may be building emergency sirens, a sprinkler and/or sprinkler system, an automatic door controller and/or automatic door control system, and any other actuator used in a building. In some embodiments, building emergency sensor(s) 4006 may communicate with building management system 4010. Building management system 4010 may receive sensor data from the building emergency sensor(s) 4006. In various embodiments, building management system 4010 may send the sensor data and/or emergency information associated with the sensor data to display device 4002.
In some embodiments, display device 4002 is communicatively coupled to weather server(s) 4008 via network 4004. In some embodiments, display device 4002 is configured to receive weather alerts (e.g., high and low daily temperature, five-day forecast, thirty-day forecast, etc.) from the weather server(s) 4008. Display device 4002 may be configured to receive emergency weather alerts (e.g., flood warnings, fire warnings, thunderstorm warnings, winter storm warnings, etc.) from the weather server(s) 4008. In some embodiments, display device 4002 is configured to display emergency warnings via a user interface of display device 4002 when display device 4002 receives an emergency weather alert from weather server(s) 4008. Display device 4002 may be configured to display emergency warnings based on the data received from building emergency sensor(s) 4006. In some embodiments, display device 4002 causes a siren to alert occupants of the building of an emergency, causes all doors to become locked and/or unlocked, causes an advisory message to be broadcast through the building, and/or controls any other actuator or system necessary for responding to a building emergency. In some embodiments, the building management system 4010 communicates with weather server 4008. Building management system 4010 may communicate (e.g., send) information from weather server 4008 to display device 4002.
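The emergency responses enumerated above (display a warning, sound a siren, lock or unlock doors, broadcast an advisory) can be sketched as a dispatch on the alert type. The alert names and action ordering are hypothetical assumptions for illustration:

```python
# Minimal, hypothetical dispatch from an alert type to the ordered list of
# actions a display device would take; not a definitive implementation.
def respond_to_alert(alert_type):
    """Return the ordered list of actions for a given emergency alert."""
    actions = ["display_warning"]
    if alert_type in ("fire", "carbon monoxide"):
        actions += ["sound_siren", "unlock_doors", "broadcast_advisory"]
    elif alert_type == "active shooter":
        actions += ["lock_doors", "broadcast_advisory"]
    elif alert_type in ("tornado", "winter storm", "flood"):
        actions += ["broadcast_advisory"]
    return actions

fire_actions = respond_to_alert("fire")
```

Note that the door action differs by emergency type: evacuation emergencies unlock doors while a lockdown locks them, consistent with the lock/unlock behaviors described above.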
In some embodiments, display device 4002 is configured to communicate with building management system 4010 via network 4004. Display device 4002 may be configured to transmit environmental setpoints (e.g., temperature setpoint, humidity setpoint, etc.) to building management system 4010. In some embodiments, building management system 4010 is configured to cause zones of a building (e.g., building 10) to be controlled to the setpoint received from display device 4002. For example, building management system 4010 may be configured to control the temperature, humidity, lighting, or other environmental conditions of a building based on the setpoints or control signals received from display device 4002. In some embodiments, building management system 4010 is configured to transmit emergency information to display device 4002. The emergency information can include, for example, a notification of a shooter lockdown, a tornado warning, a flood warning, a thunderstorm warning, and/or any other warning. In some embodiments, building management system 4010 is connected to various weather servers and/or other web servers from which building management system 4010 receives emergency warning information.
In some embodiments, the display device 4002 is configured to communicate with one or more social media server(s) 4011 via network 4004. Social media server(s) 4011 may include, but are not limited to, servers supporting Facebook, Instagram, Twitter, Snapchat, WhatsApp, and/or other social media platforms. In some embodiments, the display device 4002 may have a profile or other presence on a social media platform, such that a user may send a direct message, post, tweet, etc. to the display device 4002. For example, a user may tweet at (i.e., via Twitter) or send a direct message to (e.g., via Facebook Messenger, WhatsApp, etc.) the display device 4002 and/or the building management system 4010 to indicate that an emergency is ongoing in a building (e.g., “@displaydevice4002 a fire just started in Room X”). The display device 4002 may receive such a message, tweet, post, etc., extract relevant information therefrom using a natural language processing approach, and generate emergency directions based on the extracted information. In some embodiments, the display device 4002 is configured to send a message or comment to the user in response, for example using an automated chat bot approach.
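A toy sketch of extracting an emergency type and location from a short message follows. A production system would use the natural language processing approach mentioned above; the keyword table and message format here are assumptions:

```python
import re

# Hypothetical keyword table mapping message words to emergency types.
EMERGENCY_KEYWORDS = {"fire": "fire", "smoke": "fire",
                      "shooter": "active shooter", "flood": "flood"}

def parse_emergency_message(text):
    """Return (emergency_type, location) parsed from a message, or None."""
    lowered = text.lower()
    for keyword, emergency in EMERGENCY_KEYWORDS.items():
        if keyword in lowered:
            match = re.search(r"room\s+(\w+)", lowered)
            location = match.group(1) if match else None
            return emergency, location
    return None

result = parse_emergency_message("@displaydevice4002 a fire just started in Room X")
```

On the example message from the paragraph above, this sketch would recover the emergency type "fire" and the room identifier, from which directions could then be generated.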
In various embodiments, the display device 4002 accesses the social media server(s) to passively monitor social media activity of one or more occupants of a building to identify events in a building and/or emergencies in a building. For example, the display device 4002 may access a message sent from a first user of a social media server 4011 to a second user of the social media server 4011 which mentions an ongoing emergency in the building. As another example, the display device 4002 may analyze pictures and/or videos posted publicly by a social media user (e.g., via Snapchat, Instagram, etc.) to identify building occupancy, events in the building, emergencies in the building, etc. and respond accordingly. For example, a user may post a video that shows an active shooter in a building, and the display device 4002 may receive said video, analyze said video to determine a location of the shooter in the building, and generate one or more directions to provide to one or more building occupants to help the occupants find safety. Various such interactions between the social media server(s) 4011 and the display device 4002 are contemplated by the present disclosure.
Display device 4002 can be configured to communicate with user device 4012 via network 4004. In some embodiments, user device 4012 communicates calendar information to display device 4002. User device 4012 can include any user-operable computing device such as smartphones, tablets, laptop computers, desktop computers, wearable devices (e.g., smart watches, smart wrist bands, smart glasses, etc.), and/or any other computing device. User device 4012 can be a mobile device or a non-mobile device. In some embodiments, the calendar information is stored and/or entered by a user into calendar application 4014. Calendar application 4014 may be one or a combination of Outlook, Google Calendar, Fantastical, Shifts, CloudCal, DigiCal, and/or any other calendar application. Display device 4002 may receive calendar information from the calendar application such as times and locations of appointments, times and locations of meetings, information about the expected location of the user, and/or any other calendar information. Information about the expected location of the user may be information that the user will depart for an airport or another location at a specific time or in a range of times. Display device 4002 may be configured to display direction to a user associated with user device 4012 based on the calendar information stored in calendar application 4014.
In various embodiments, the user device 4012 provides various data and information regarding use of the user device 4012 to the display device 4002 and/or the building management system 4010. For example, the display device 4002 may collect a live feed of the usage of the user device 4012 to facilitate identification and characterization of building emergencies and/or to facilitate the provision of directions to a user in case of an emergency. For example, the display device 4002 may receive data relating to an emergency call made by the user device 4012, the location of the user device 4012 (e.g., based on GPS data collected by the user device 4012), social media activity of a user of the user device 4012, etc. In some embodiments, the display device 4002 activates a microphone and/or camera of the user device 4012 in an emergency situation to monitor the safety of a user in an emergency situation.
In some embodiments, a user may press a button on a user interface of display device 4002 indicating a building emergency. The user may be able to indicate the type of emergency (e.g., fire, flood, medical, active shooter, etc.). Display device 4002 may communicate an alert to building management system 4010, user device 4012, social media server 4011 and/or any other device, system, or server. For example, display device 4002 may be configured to cause the social media server 4011 to generate a social media notification relating to a building emergency for a user.
Referring now to
In some embodiments, system 4100 includes display devices 4016 and 4018. Display devices 4016 and 4018 may be identical and/or similar to display device 4002. In some embodiments, display devices 4016 and 4018 have the ability to communicate with display device 4002 but are different from display device 4002. For example, display device 4016 and display device 4018 can be smart actuators, building controllers, etc., while display device 4002 can be a smart thermostat. Display device 4002, display device 4016, and display device 4018 may be located in different locations of a building (e.g., building 10). In some embodiments, display device 4002, display device 4016, display device 4018 and user device 4012 may communicate with each other ad hoc. In some embodiments, display device 4002, display device 4016, and display device 4018 may communicate with each other via network 4004. In some embodiments, ad hoc communication may be at least one of ad hoc Wi-Fi, ad hoc Zigbee, ad hoc Bluetooth, NFC, etc. In some embodiments, the devices form a MANET, a VANET, a SPAN, an IMANET, and/or any other ad hoc network. In some embodiments, the devices are connected and communicate via RS-485, Ethernet, and/or any other wired, wireless, or combination of wired and wireless communication method.
In some embodiments, display device 4002, display device 4016, and display device 4018 send navigation directions to one another via ad hoc communication. In some embodiments, one of the display devices determines a route for a building occupant. The route may be the fastest or shortest path to a destination (e.g., a conference room, an office, etc.). Display device 4002 may hand off the navigation directions to other display devices (e.g., display device 4016, display device 4018, etc.) along the path of the occupant. In some embodiments, the route may meet a need of the occupant, such as a route that will accommodate wheelchairs if the occupant is in a wheelchair or traveling with someone in a wheelchair.
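The shortest-path routing with an accessibility constraint described above can be sketched as a breadth-first search over a floor graph whose edges are marked as wheelchair accessible or not. The node names and graph are illustrative assumptions, not part of the disclosure:

```python
from collections import deque

# Sketch of shortest-path routing over a hypothetical floor graph; each edge
# carries an accessibility flag so stairways can be skipped when needed.
def shortest_route(graph, start, goal, wheelchair=False):
    """Return the shortest path from start to goal, optionally restricted
    to wheelchair-accessible edges (e.g., skipping stairways)."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor, accessible in graph.get(node, []):
            if wheelchair and not accessible:
                continue
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no route satisfies the constraint

floor = {
    "lobby": [("stairs", False), ("elevator", True)],
    "stairs": [("conference", True)],
    "elevator": [("hall", True)],
    "hall": [("conference", True)],
}
fastest = shortest_route(floor, "lobby", "conference")             # via stairs
accessible = shortest_route(floor, "lobby", "conference", True)    # via elevator
```

Each display device along the resulting path could then show the next leg of the route to the occupant, matching the handoff behavior described above.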
In some embodiments, user device 4012 is configured to communicate with display device 4002, display device 4016, and display device 4018 via ad hoc communication. In some embodiments, user device 4012 may communicate with the display devices (e.g., display device 4002, display device 4016, display device 4018, etc.) and request navigation directions. In some embodiments, a user may check in with a display device and the display device may display navigation information for the individual associated with the user device 4012. Checking in with the display device may involve holding user device 4012 within a certain distance of the display device so that user device 4012 can communicate with the display device via NFC. In various embodiments, checking in with the display device includes connecting to the display device via Wi-Fi, Bluetooth, or Zigbee and entering a password and/or username.
Referring now to
Communications interface 4202 may be configured to communicate with network 4004 as described with reference to
In some embodiments, communications interface 4202 communicates with display device 4016, display device 4018, building emergency sensor(s) 4006, weather server(s) 4008, building management system 4010, and/or user device 4012 as described with reference to
Occupancy sensor 4204 may be used to detect occupancy and determine the identity of the occupant. Occupancy sensor 4204 may be one or a combination of motion sensors, cameras, microphones, capacitive sensors, or any number of other sensors. For example, occupancy sensor 4204 can include one or more cameras which detect heat signatures. Occupancy sensor 4204 may detect separate objects and distinguish between humans and other objects. Occupancy sensor 4204 can include one or more transducers that detect some characteristic of their respective environment and surroundings. Occupancy sensors, such as a camera, may be used to determine if an occupant is using a wheelchair, cane, crutches, and/or any other assistance device.
Speaker 4206 may be configured to project audio. The audio may be warning messages, direction messages, alternate route suggestion messages and any other message. Speaker 4206 may be any kind of electroacoustic transducer and/or combination of transducers that are configured to generate sound waves based on electrical signals. Speaker 4206 may be a loudspeaker (e.g., various combinations of subwoofers, woofers, mid-range drivers, tweeters, etc.) and may broadcast messages to an entire zone and/or an entire building (e.g., building 10). In some embodiments, speaker 4206 includes filters. In some embodiments, the filters are various combinations of high pass filters, low pass filters, band pass filters, etc.
User interface 4208 may be a touch screen display configured to receive input from a user and display images and text to a user. In some embodiments, user interface 4208 is at least one or a combination of a resistive touch screen and a capacitive touch screen (e.g., projective capacitive touch screen). In some embodiments, user interface 4208 is a swept-volume display, a varifocal mirror display, an emissive volume display, a laser display, a holographic display, a light field display, and/or any other display or combination of displays. User interface 4208 may be configured to display images and text to a user but may not be configured to receive input from the user. In some embodiments, user interface 4208 is one or a combination of a CRT display, an LCD display, an LED display, a plasma display, and/or an OLED display.
Processing circuit 4210 is shown to include a processor 4212 and memory 4214. Processor 4212 can be a general purpose or specific purpose processor, an application specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), a group of processing components, or other suitable processing components. Processor 4212 may be configured to execute computer code and/or instructions stored in memory 4214 or received from other computer readable media (e.g., CDROM, network storage, a remote server, etc.).
Memory 4214 can include one or more devices (e.g., memory units, memory devices, storage devices, etc.) for storing data and/or computer code for completing and/or facilitating the various processes described in the present disclosure. Memory 4214 can include random access memory (RAM), read-only memory (ROM), hard drive storage, temporary storage, non-volatile memory, flash memory, optical memory, or any other suitable memory for storing software objects and/or computer instructions. Memory 4214 can include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. Memory 4214 can be communicably connected to processor 4212 via processing circuit 4210 and can include computer code for executing (e.g., by processor 4212) one or more processes described herein.
Memory 4214 is shown to include a network controller 4216, an emergency identifier 4218, an HVAC controller 4226, a directions controller 4228, a direction selector 4244, an occupancy controller 4238, an audio controller 4240, and a user interface controller 4242. Each of these components is described in greater detail below.
Network controller 4216 may contain instructions to communicate with a network (e.g., network 4004) and ad hoc to other devices (e.g., display device 4016, display device 4018, user device 4012, etc.). In some embodiments, network controller 4216 contains instructions to communicate over wireless and wired communication methods. In some embodiments, wireless communication methods are communicating in a Wi-Fi network, a Zigbee network, and/or a Bluetooth network via communications interface 4202. In some embodiments, the communication methods are wired such as via RS-485, Ethernet (e.g., CAT5, CAT5e, etc.), and/or any other wired communication method. Network controller 4216 may be configured to facilitate communication over a local area network or a wide area network (e.g., the Internet, a building WAN, etc.) and may be configured to use a variety of communications protocols (e.g., BACnet, IP, LON, etc.). In some embodiments, network controller 4216 facilitates ad hoc communication. The ad hoc communication may be at least one of ad hoc Wi-Fi, ad hoc Zigbee, ad hoc Bluetooth, NFC, etc. In some embodiments, network controller 4216 facilitates communication over an ad hoc network (e.g., a MANET, a VANET, a SPAN, an IMANET, and/or any other ad hoc network).
Emergency identifier 4218 can be configured to determine whether an emergency is occurring. The emergency can be an emergency inside the building (e.g., a fire, a dangerous person, a critical fault or operating condition in the BMS, etc.) or an emergency outside the building (e.g., a tornado, dangerous weather conditions, etc.). In some embodiments, emergency identifier 4218 is configured to determine emergency alerts based on information received from network controller 4216. Emergency identifier 4218 may include emergency sensor controller 4220, weather server controller 4222, and BMS emergency controller 4224. Emergency sensor controller 4220 may be configured to communicate with building emergency sensor(s) 4006 described with reference to
Emergency sensor controller 4220 may receive sensor data from building emergency sensor(s) 4006 via network controller 4216 and communications interface 4202. Emergency sensor controller 4220 may be configured to analyze the sensor data and determine if an emergency is present. Emergency sensor controller 4220 may determine the nature and/or location of the emergency based on the analysis of the sensor data. The nature of the emergency may be an earthquake, a fire, a gas leak, etc. Emergency sensor controller 4220 may be configured to determine and/or retrieve applicable directions for the determined emergency. In some embodiments, emergency sensor controller 4220 determines that an emergency is occurring when the sensor data is above and/or below a predefined threshold. For example, if emergency sensor controller 4220 determines that the sensor data indicates that carbon monoxide levels cross a predefined threshold, the air is dangerous to breathe and the building should be evacuated.
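By way of a non-limiting illustration, the threshold-based detection described above might be sketched as follows in Python. The sensor field names and the carbon monoxide threshold are illustrative assumptions only and do not form part of the disclosure:

```python
# Illustrative sketch of threshold-based emergency detection.
# The reading keys and threshold value below are hypothetical.
CO_PPM_THRESHOLD = 70  # assumed carbon monoxide alarm level, in ppm

def check_sensor_emergency(readings: dict) -> list:
    """Return a list of emergency directions implied by sensor readings."""
    emergencies = []
    if readings.get("co_ppm", 0) > CO_PPM_THRESHOLD:
        emergencies.append("evacuate: unsafe carbon monoxide level")
    if readings.get("smoke_detected", False):
        emergencies.append("evacuate: possible fire")
    return emergencies
```

In such a sketch, readings above the threshold produce an evacuation direction while nominal readings produce none.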
In some embodiments, building emergency sensor(s) 4006 are configured to determine the nature of the emergency. Emergency sensor controller 4220 may be configured to receive the nature of the emergency from building emergency sensor(s) 4006 via network controller 4216 and communications interface 4202. Emergency sensor controller 4220 can be configured to generate emergency directions based on the emergency. In some embodiments, the emergency directions are to evacuate a building, hide under tables and/or desks, close windows, or any other direction relevant to an emergency situation. Emergency sensor controller 4220 may send the determined emergency directions to direction selector 4244.
In some embodiments, the building emergency sensor(s) 4006 are configured to identify a location of an emergency in the building (e.g., a location of a fire, a location of an active shooter) and the emergency sensor controller 4220 is configured to receive the location of the emergency from the building emergency sensor(s) 4006 via network controller 4216 and communications interface 4202. In such embodiments, the emergency sensor controller 4220 can be configured to generate emergency directions based on the location of the emergency, for example to direct a user away from the emergency (e.g., away from a fire, away from an active shooter, along an evacuation route that avoids a dangerous area). The emergency directions may update dynamically as the emergency moves through a building, e.g., as the emergency sensor(s) 4006 detect the emergency (e.g., a fire, a gunshot) in changing locations in the building.
In some embodiments, the existence, nature, and/or location of an emergency may be determined based at least in part on live data received from the user device 4012 and/or other web-based live data streams (e.g., social media). For example, the emergency identifier 4218 may receive an indication of a call or message transmitted from the user device 4012 to an emergency response system. As another example, the emergency identifier 4218 may receive social media posts that indicate that an emergency event is occurring. The emergency identifier 4218 may use this live data to identify an ongoing emergency and/or determine the nature and/or location of the emergency.
Weather server controller 4222 may be configured to communicate with weather server(s) 4008 as described with reference to
BMS emergency controller 4224 may be configured to communicate with building management system 4010 as described with reference to
In some embodiments, building management system 4010 may include one or more databases which store building maps, room and meeting schedules, and/or any other information regarding a building (e.g., building 10). In some embodiments, BMS emergency controller 4224 is configured to request the building information from building management system 4010 and send the building related information to directions controller 4228.
Still referring to
HVAC controller 4226 may use any of a variety of control algorithms (e.g., state-based algorithms, extremum-seeking control algorithms, PID control algorithms, model predictive control algorithms, feedback control algorithms, etc.) to determine appropriate control actions for any HVAC equipment connected to building management system 4010 as a function of temperature and/or humidity. For example, if the temperature is above a temperature set point received from user interface 4208, HVAC controller 4226 may determine that a cooling coil and/or a fan should be activated to decrease the temperature of a supply air delivered to a building zone. Similarly, if the temperature is below the temperature set point, HVAC controller 4226 may determine that a heating coil and/or a fan should be activated to increase the temperature of the supply air delivered to the building zone. HVAC controller 4226 may determine that a humidification or dehumidification component of building management system 4010 should be activated or deactivated to control the ambient relative humidity to a humidity set point for a building zone.
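The state-based setpoint comparison described above can be sketched, purely as a non-limiting illustration, as follows. The deadband values are illustrative assumptions and the returned action names are hypothetical labels, not equipment commands from the disclosure:

```python
# Illustrative state-based control sketch: compare measured temperature
# and humidity against setpoints and decide which equipment to activate.
def hvac_action(temp, humidity, temp_sp, hum_sp,
                deadband=0.5, hum_band=5.0):
    """Return a list of actions for the current conditions.

    deadband / hum_band are assumed tolerance values that prevent
    rapid cycling around the setpoints.
    """
    actions = []
    if temp > temp_sp + deadband:
        actions.append("cool")        # e.g., activate cooling coil and fan
    elif temp < temp_sp - deadband:
        actions.append("heat")        # e.g., activate heating coil and fan
    if humidity > hum_sp + hum_band:
        actions.append("dehumidify")
    elif humidity < hum_sp - hum_band:
        actions.append("humidify")
    return actions
```

A practical controller would typically use PID or model predictive control rather than a bare deadband comparison, as the paragraph above notes.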
Directions controller 4228 may be configured to determine directions for an occupant or a group of occupants of a building (e.g., building 10). In some embodiments, directions controller 4228 includes an opportunistic controller 4230, a user based direction controller 4232, a special needs controller 4234, and a direction request controller 4236. Opportunistic controller 4230 may be configured to generate and/or determine building event directions and/or messages based on information received from the building management system 4010. In some embodiments, opportunistic controller 4230 is configured to receive building event information from building management system 4010 and/or calendar application 4014 of user device 4012 as described with reference to
In some embodiments, opportunistic controller 4230 analyzes calendar information from one or more mobile devices (e.g., user device 4012) received via network controller 4216 and communications interface 4202. Based on the calendar information, display device 4002 may learn what events are occurring in the building. Opportunistic controller 4230 may be configured to generate an event image (e.g., various combinations of logos, admission fees, locations, start and end times, etc.) relating to the event and may determine proper audio notifications to be served along with the generated event image.
User based direction controller 4232 may be configured to generate navigation directions for an occupant. In some embodiments, user based direction controller 4232 may be configured to receive the identity of an occupant from occupancy controller 4238. The identity may be the identity of an occupant within a predetermined distance from display device 4002. In some embodiments, the user based direction controller 4232 may be configured to query the building management system 4010 via network controller 4216 and communications interface 4202 for information associated with the identified occupant. In some embodiments, building management system 4010 may reply with the name of the occupant, the schedule of the occupant, any meetings and/or events in which the occupant is a participant (e.g., optional participant, required participant, etc.), and may also reply with any special needs of the occupant, such as wheelchair accessible directions. User based direction controller 4232 may be configured to generate directions to any locations at which the identified occupant may be scheduled to be. In some embodiments, user based direction controller 4232 may be configured to communicate with a calendar application (e.g., calendar application 4014) via ad hoc and/or network communications with a user device (e.g., user device 4012) to determine the schedule of a building occupant. In some embodiments, user based direction controller 4232 may be configured to generate arrows, building maps, audio directions, and/or any other form of directions. User based direction controller 4232 may be configured to send the directions to direction selector 4244.
Special needs controller 4234 may determine if the occupant identified by user based direction controller 4232 has any special needs. For example, special needs controller 4234 may be configured to communicate with building management system 4010 and receive any information relating to any physical and/or mental disabilities associated with the identified user. The disabilities may be that the identified occupant is deaf, mute, blind, in a wheelchair, on crutches, etc. In some embodiments, special needs controller 4234 may determine building directions based on the disability of the occupant. For example, if the identified occupant is in a wheelchair, the special needs controller 4234 may generate directions along a route that circumnavigates any stairs. If the identified occupant is determined to be deaf, the special needs controller 4234 may be configured to generate visual directions only and not audio directions. In some embodiments, the audio directions are a series of turns (e.g., "go forward to end of hall, turn right; go forward to end of hall, turn left," etc.).
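As a non-limiting sketch, accessibility-aware routing of this kind can be modeled as filtering route segments and selecting direction modalities. The segment types and need labels below are hypothetical; note that a deaf occupant would receive visual rather than audio directions, and a blind occupant the reverse:

```python
# Illustrative sketch: adapt routes and direction modalities to an
# occupant's needs. Segment/need labels are hypothetical.
def filter_route_segments(segments, needs):
    """Drop route segments incompatible with an occupant's mobility needs,
    e.g., stairs for a wheelchair user."""
    blocked = {"stairs"} if "wheelchair" in needs else set()
    return [s for s in segments if s["type"] not in blocked]

def direction_modalities(needs):
    """Choose which modalities to use for delivering directions."""
    if "deaf" in needs:
        return {"visual"}   # the occupant cannot hear audio directions
    if "blind" in needs:
        return {"audio"}    # the occupant cannot see visual directions
    return {"visual", "audio"}
```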
Direction request controller 4236 may be configured to receive direction requests from user interface 4208. Direction request controller 4236 may communicate with user interface controller 4242 and may receive the direction request from user interface controller 4242. In some embodiments, direction request controller 4236 is configured to display directions to a requested location in response to a building occupant requesting directions via user interface 4208. The requested location can include, for example, a conference room, a meeting room, an office, etc. In some embodiments, direction request controller 4236 may be configured to display a map showing where the user is, where the destination is, the shortest route to the destination, etc. In some embodiments, direction request controller 4236 is configured to generate text directions indicating which turns to make in order to navigate to the destination. Further, direction request controller 4236 may be configured to generate audio messages to be played along with the visual directions.
In some embodiments, occupancy controller 4238 may be configured to determine the identity of an occupant based on information received from occupancy sensor 4204. The identity of the occupant may be provided to user based direction controller 4232. In some embodiments, the occupancy controller 4238 receives sensor input from occupancy sensor 4204 where the sensor may be a camera. Occupancy controller 4238 can perform digital image processing to identify the one or more users based on the digital images received from the camera. In some embodiments, digital image processing is used to identify the faces of the one or more users, the height of the one or more users, or any other physical characteristic of the one or more users. In some embodiments, the digital image processing is performed by image analysis tools such as edge detectors and neural networks. In some embodiments, the digital image processing compares the physical characteristics of the one or more users with physical characteristics of previously identified users.
In some embodiments, occupancy controller 4238 receives sensor input from a microphone. The microphone can be any of a plurality of microphone types. The microphone types include, for example, a dynamic microphone, a ribbon microphone, a carbon microphone, a piezoelectric microphone, a fiber optic microphone, a laser microphone, a liquid microphone, and an audio speaker used as a microphone. In some embodiments, occupancy controller 4238 analyzes the audio data received from the microphone. In some embodiments, occupancy controller 4238 identifies one or more users based on voice biometrics of the audio received from the microphone. Voice biometrics are the unique characteristics of a speaker's voice. Voice biometrics include voice pitch or speaking style that result from the anatomy of the speaker's throat and/or mouth. In some embodiments, the voice biometrics of linked users are stored on display device 4002 in occupancy controller 4238. In some embodiments, the voice biometrics are stored on building management system 4010 and must be retrieved by occupancy controller 4238. In some embodiments, occupancy controller 4238 uses a text dependent voice recognition technique. In some embodiments, occupancy controller 4238 uses a text independent voice recognition technique to identify the one or more users.
In some embodiments, occupancy controller 4238 uses the text dependent voice recognition technique to identify the one or more users based on a password or particular phrase spoken by one of the users. For example, the user may speak a phrase such as "This is Felix, I am home." Occupancy controller 4238 can perform speech recognition to determine the spoken phrase "This is Felix, I am home" from the audio data received from the microphone. In some embodiments, occupancy controller 4238 uses one or a combination of hidden Markov models, dynamic time warping, and/or neural networks to determine the spoken phrase. Occupancy controller 4238 compares the determined spoken phrase to phrases linked to users. If the phrase "This is Felix, I am home" matches a phrase linked to a user Felix, occupancy controller 4238 can identify the user as Felix. In some embodiments, the linked phrases are stored on occupancy controller 4238. In various embodiments, the linked phrases are stored on building management system 4010.
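The final matching step of such a text dependent technique (after speech recognition has already produced a transcript) might be sketched as follows. The enrollment table and its contents are hypothetical illustrations:

```python
# Illustrative sketch of matching a recognized transcript against
# enrolled passphrases. The enrollment data below is hypothetical.
LINKED_PHRASES = {
    "this is felix, i am home": "Felix",
}

def identify_by_phrase(transcript: str):
    """Return the linked user for a spoken passphrase, or None.

    Matching is case-insensitive here; a real system would compare
    voice biometrics as well as the recognized text.
    """
    return LINKED_PHRASES.get(transcript.strip().lower())
```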
In some embodiments, occupancy controller 4238 is configured to capture audio data from one or more users and perform pre-processing. In some embodiments, pre-processing may include compressing the audio data, converting the audio data into an appropriate format, and/or any other necessary pre-processing action. Occupancy controller 4238 may be configured to transmit the captured spoken audio data to a voice recognition server via communications interface 4202 and network 4004 as described with reference to
Audio controller 4240 may be configured to receive audio directions from direction selector 4244. Audio controller 4240 may generate an analog signal for speaker 4206 based on a digital audio signal from direction selector 4244. In some embodiments, audio controller 4240 may be configured to convert a digital audio signal into an analog audio signal (i.e., digital-to-analog conversion (DAC)). In some embodiments, audio controller 4240 may contain a text to speech application program interface (API) that is configured to generate spoken words based on the received navigation direction. In some embodiments, the text to speech API is one or a combination of Watson Text to Speech, Cortana text to speech, an open source text to speech API, a proprietary text to speech API, and/or any other text to speech API.
User interface controller 4242 may be configured to display images on user interface 4208. The images can include, for example, maps, text, arrows, and/or any other image used to display directions to an occupant of a building. In some embodiments, user interface controller 4242 is configured to receive input from user interface 4208. The input may be rotating a map, zooming in on a map, typing in a conference room navigation request, or any other input that can be received from user interface 4208. In some embodiments, user interface controller 4242 receives images to display from direction selector 4244. In some embodiments, user interface controller 4242 sends direction requests to direction request controller 4236.
Direction selector 4244 may be configured to receive directions from directions controller 4228. Direction selector 4244 may be configured to receive emergency directions from emergency identifier 4218. In some embodiments, direction prioritization selector 4246 is configured to receive the directions from directions controller 4228. Direction selector 4244 may be configured to prioritize the directions received from directions controller 4228 and the emergency directions received from emergency identifier 4218. Direction prioritization selector 4246 may be configured to rank each direction request in order of priority. In some embodiments, directions requested via user interface 4208 may have priority over opportunistic directions and/or directions determined based on information from occupancy sensor 4204. The ranking system may contain a queue into which directions may be placed. The length of time which a direction is in the queue may factor into determining the priority for that direction. For example, a conference advertisement may be received from opportunistic controller 4230 and may be placed into a display queue. The longer the advertisement sits in the queue, the higher the priority level for the advertisement may grow. When the priority level crosses a predefined level, the advertisement may be displayed and the priority level reset. In some embodiments, the priority of a direction may determine the period of time that the direction is displayed on user interface 4208.
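The aging queue described above, in which an entry's effective priority grows with its waiting time, might be sketched as follows. The aging rate and priority values are illustrative assumptions:

```python
import time

# Illustrative sketch of a direction queue with priority aging:
# each entry's effective priority grows the longer it waits.
class AgingDirectionQueue:
    def __init__(self, aging_rate=1.0):
        self.aging_rate = aging_rate      # priority gained per second queued
        self.items = []                   # (base_priority, enqueue_time, direction)

    def push(self, direction, base_priority, now=None):
        t = now if now is not None else time.time()
        self.items.append((base_priority, t, direction))

    def pop_highest(self, now=None):
        """Remove and return the direction with highest effective priority."""
        t = now if now is not None else time.time()
        def effective(item):
            base, t0, _ = item
            return base + self.aging_rate * (t - t0)
        best = max(self.items, key=effective)
        self.items.remove(best)
        return best[2]
```

In this sketch, a low-priority advertisement queued long enough eventually outranks a fresher, nominally higher-priority entry.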
In some embodiments, direction prioritization selector 4246 may provide the highest priority direction to emergency prioritization selector 4248. Emergency prioritization selector 4248 may provide the directions received from direction prioritization selector 4246 to user interface controller 4242 if no emergency is present. If an emergency is present, emergency prioritization selector 4248 may provide the emergency directions to user interface controller 4242 instead of the directions from direction prioritization selector 4246. In some embodiments, emergency directions for multiple emergencies (e.g., floods, tornados, storms, earthquakes, etc.) may be ranked based on order of priority. For example, if emergency prioritization selector 4248 receives a notification from emergency identifier 4218 that there is an active shooter in the building (e.g., building 10) and a notification that there is flooding, emergency prioritization selector 4248 may rank the active shooter directions as higher priority, and may show these directions exclusively and/or for longer periods of time. In some embodiments, the highest priority emergency direction is the direction associated with the emergency that is most likely to cause harm to occupants of the building.
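A non-limiting sketch of this emergency override and severity ranking follows. The severity table is an illustrative assumption; the disclosure does not specify numeric severities:

```python
# Illustrative severity ranking; higher values mean greater danger.
# These numbers are hypothetical, not part of the disclosure.
EMERGENCY_SEVERITY = {
    "active_shooter": 100,
    "fire": 90,
    "flood": 50,
    "winter_storm_watch": 10,
}

def select_directions(normal_directions, active_emergencies):
    """Return emergency directions ranked by severity when any emergency
    is active; otherwise pass through the normal directions."""
    if not active_emergencies:
        return normal_directions
    ranked = sorted(active_emergencies,
                    key=lambda e: EMERGENCY_SEVERITY.get(e["type"], 0),
                    reverse=True)
    return [e["directions"] for e in ranked]
```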
In various embodiments, emergency prioritization selector 4248 may combine emergency directions when occupants of the building must respond to multiple emergencies simultaneously. For example, if there is a fire and a tornado, the emergency prioritization selector 4248 may combine fire response directions with tornado response directions. Emergency prioritization selector 4248 may create emergency messages which tell occupants of the building to go to a certain exit. The route to the exit may bypass rooms and/or hallways with large windows. Emergency prioritization selector 4248 may be able to combine any amount or type of emergency directions.
Referring now to
In addition to determining navigation directions, emergency directions, and prioritizing directions, display device 4300 may be configured to communicate with other display devices (e.g., display device 4016, display device 4018, etc.) and pass directions to other display devices. In some embodiments, display device 4300 passes directions to other display devices that are on the route of a navigation path. In some embodiments, the direction handoff is performed via network 4004 as described with reference to
Building map controller 4304 may be configured to maintain and/or store a building map. The building map may include multiple floors, multiple campuses, etc. Building map controller 4304 may receive updates from building management system 4010 via network 4004. In some embodiments, building map controller 4304 may be configured to receive a map when first installed in the building. In some embodiments, building map controller 4304 contains the locations of all other display devices in the building. In some embodiments, building map controller 4304 is configured to receive map updates from building management system 4010. In various embodiments, building map controller 4304 may receive notices from building management system 4010 that a hallway and/or exit may be closed and/or blocked. In some embodiments, a hallway and/or exit may be blocked based on an emergency (e.g., a certain hallway is on fire and is not passable by an occupant). In various embodiments, a hallway and/or exit may be blocked when there are building renovations and/or repairs being done in the building.
User based handoff controller 4306 may have all of the functionality of user based direction controller 4232 and special needs controller 4234. In addition to this functionality, user based handoff controller 4306 may be configured to generate a message to send to other devices along the determined path and/or route. The other devices may be targeted based on their location along the route. Further, the time at which the user based handoff controller 4306 causes the message to be sent may be based on an anticipated and/or determined walking speed of a user. For example, the message to display the directions for a user may be displayed when it is anticipated that the user will be passing the next display device based on an anticipated and/or determined walking speed. User based handoff controller 4306 may cause network controller 4216 and communications interface 4202 to send the message to other targeted display devices.
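The walking-speed-based handoff timing described above might be sketched, as a non-limiting illustration, as follows. The device identifiers and the default walking speed are hypothetical assumptions:

```python
# Illustrative sketch of scheduling direction handoffs to downstream
# display devices based on an assumed walking speed.
def schedule_handoffs(route_devices, walking_speed_mps=1.4):
    """route_devices: list of (device_id, distance_from_user_m) pairs
    for displays along the occupant's route, nearest first.

    Returns (device_id, delay_seconds) pairs indicating when each
    device should begin displaying the occupant's directions.
    """
    return [(dev, round(dist / walking_speed_mps, 1))
            for dev, dist in route_devices]
```

A fuller implementation might refine the speed estimate from observed transit times between displays rather than assuming a fixed 1.4 m/s.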
Display device location controller 4308 may be configured to maintain the location of the display device 4300. In some embodiments, display device location controller 4308 may perform an initial configuration routine in which the display device may prompt an installer with a building map and request that the installer identify the location of the display device 4300. In some embodiments, a password may be entered via user interface 4208 allowing an authorized individual to change the location of the display device 4300. In various embodiments, display device location controller 4308 may be configured to periodically prompt users to confirm the location of the display device 4300. In various embodiments, display device location controller 4308 may prompt the user by asking the user if the directions it is displaying are correct or incorrect. If the user indicates via user interface 4208 that the directions displayed by display device location controller 4308 are incorrect, display device location controller 4308 may be configured to cause a message to be sent to building management system 4010. Building management system 4010 may notify a building technician that the location of display device 4300 needs to be corrected and/or updated.
Direction request handoff controller 4310 may contain some or all of the functionality of direction request controller 4236. In addition to this functionality, direction request handoff controller 4310 may be configured to generate a message to send to other devices along the determined path and/or route. The other devices may be targeted based on their location along the route. Further, the time at which direction request handoff controller 4310 causes the message to be sent may be based on an anticipated and/or determined walking speed of a user. For example, the message to display the directions for a user may be displayed when it is anticipated that the user will be passing the next display device based on an anticipated and/or determined walking speed. Direction request handoff controller 4310 may cause network controller 4216 and communications interface 4202 to send the message to other targeted display devices.
Referring now to
Battery controller circuit 4402 is configured to charge and/or discharge battery 4404. Battery controller circuit 4402 may receive AC power and/or DC power. Battery controller circuit 4402 may include a rectifier circuit configured to convert the AC power into DC power. In some embodiments, the rectifier is a full wave rectifier, a half wave rectifier, a full bridge rectifier, or any other type of rectifier. In some embodiments, the rectified wave is filtered to smooth out any voltage ripple present after the wave is rectified. Battery controller circuit 4402 may be configured to perform maximum power point tracking (MPPT) when charging the battery if the power source is a solar cell and/or solar panel. In some embodiments, battery controller circuit 4402 includes circuits configured to perform slow charging (i.e., trickle charging) and/or fast charging. In some embodiments, the temperature of the battery 4404 is monitored while fast charging is performed so that the battery 4404 does not become damaged.
In some embodiments, the battery 4404 stores charge which can be released to power display device 4400. In some embodiments, battery controller circuit 4402 begins discharging battery 4404 when battery controller circuit 4402 detects that a wired power source of the display device 4400 is removed (i.e., display device 4400 is removed from the wall). Battery 4404 may be any type or combination of batteries. In some embodiments, the battery is a nickel cadmium (Ni—Cd) battery and/or a nickel-metal hydride (Ni-MH) battery. In various embodiments, the battery is a lithium ion battery and/or a lithium polymer battery.
GPS 4406 may be configured to determine the location of the display device 4400. In some embodiments, GPS 4406 determines the coordinates of display device 4400. GPS 4406 may send the coordinates of display device 4400 to GPS controller 4410. In some embodiments, GPS controller 4410 logs and tracks the location of display device 4400. In some embodiments, GPS controller 4410 is configured to determine what direction display device 4400 is moving by analyzing a plurality of GPS coordinate readings. Building map controller 4412 may contain some or all of the functionality of building map controller 4304 as described with reference to
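Determining the direction of movement from successive GPS fixes, as described above for GPS controller 4410, can be sketched with the standard forward-bearing formula between two latitude/longitude readings. This is a generic geodesy illustration, not a formula from the disclosure:

```python
import math

# Illustrative sketch: approximate compass bearing (degrees clockwise
# from north) between two consecutive (latitude, longitude) GPS fixes.
def heading_degrees(prev, curr):
    lat1, lon1 = map(math.radians, prev)
    lat2, lon2 = map(math.radians, curr)
    dlon = lon2 - lon1
    y = math.sin(dlon) * math.cos(lat2)
    x = (math.cos(lat1) * math.sin(lat2)
         - math.sin(lat1) * math.cos(lat2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360) % 360
```

Averaging the heading over several fixes would smooth out GPS jitter before using it to orient directions.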
In some embodiments, mobile directions controller 4416 generates audio directions and visual directions for display device 4400. Mobile directions controller 4416 may be configured to provide audio directions to audio controller 4240 as described with reference to
In some embodiments, mobile directions controller 4416 may be configured to determine directions based on the nature of the emergency determined by emergency identifier 4218. For example, if there is a fire in the building, the mobile directions controller 4416 may navigate the user holding the display device 4400 to the nearest accessible exit. If the emergency is an active shooter in the building, the display device may direct the user holding display device 4400 to an exit and/or may navigate the user holding display device 4400 to a room that can be locked and/or easily barricaded.
In some embodiments, audio controller 4240 is configured to use sound navigation when appropriate. For example, if there is an active shooter in the building, audio controller 4240 may be configured to be silent so that the shooter is not alerted to the location of the user holding display device 4400. In some embodiments, if there is a fire, smoke may be thick enough to impair the vision of the user holding display device 4400. Audio controller 4240 may be configured to play audio directing the user holding display device 4400 to an exit without needing the user to be able to see user interface 4208.
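The emergency-dependent audio policy above might be sketched as a simple mapping. The emergency type strings and policy labels are hypothetical:

```python
# Illustrative sketch of choosing an audio policy from the nature of
# the emergency. Type strings and policy names are hypothetical.
def audio_mode(emergency_type):
    if emergency_type == "active_shooter":
        return "silent"           # avoid revealing the occupant's location
    if emergency_type == "fire":
        return "voice_guidance"   # smoke may obscure the screen
    return "normal"
```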
Referring now to
In step 4504, display device 4002 receives weather related emergency notifications from weather servers (e.g., weather server(s) 4008). The alert may be a winter storm watch, a flooding warning, a tornado warning, a tornado watch, etc. In step 4506, display device 4002 may receive and/or query emergency sensors (e.g., building emergency sensor(s) 4006) for data indicating a building emergency. In some embodiments, the emergency sensors are configured to determine the nature of the emergency and provide an emergency notification directly to the display device 4002. In some embodiments, the emergency notification is one or a combination of a fire, a gas leak, unsafe carbon monoxide levels, etc. At step 4506, the display device 4002 may also access social media server(s) 4011 to receive and/or monitor data indicating or relating to a building emergency.
The display device 4002 may thereby receive one or more data streams that include multiple messages indicating one or more emergencies relating to the building. The data streams may include a weather data stream indicating weather conditions associated with the building (i.e., as received from weather server(s) 4008), a social media data stream indicating social media postings, comments, messages and/or other activity (i.e., as received from the social media server(s) 4011), a news data stream indicating one or more events associated with the building (e.g., as received from the social media server(s) 4011, the calendar application 4014, the user device 4012, the building management system 4010, etc.), and/or other relevant data streams.
In step 4508, a decision is made by display device 4002 based on the presence or absence of any emergencies. That is, based on the one or more data streams received in steps 4502-4506, the display device 4002 may determine an existence of an emergency and/or a nature or type of the emergency. If display device 4002 does not determine that there is a building and/or weather related emergency in step 4502, step 4504, and step 4506, the display device 4002 may perform step 4516 and display non-emergency related directions. If display device 4002 determines that there is a building and/or weather related emergency in step 4502, step 4504, and/or step 4506, display device 4002 may prioritize the emergency directions and display emergency related directions.
In step 4510, display device 4002 may prioritize all the emergencies determined in step 4502, step 4504, and/or step 4506. Display device 4002 may determine the priority of emergencies based on emergency severity and/or immediate impact to occupants of a building. For example, a winter storm warning may have a lower priority than an active shooter.
In step 4512, display device 4002 may display the emergency directions. In some embodiments, the emergency directions are actions (e.g., emergency response directions) to take in response to the building and/or weather related emergency. For example, if there is a tornado, the directions may be to hide under desks and/or tables. If there is a fire, the display device 4002 may display evacuation directions and/or a route to the nearest exit. If there are multiple emergencies present, the display device 4002 may cycle emergencies and/or choose the most important emergency to display. In some embodiments, display device 4002 generates custom directions to accommodate the proper actions to take when there are multiple emergencies. For example, if there is a fire and an active shooter present in a building, display device 4002 may turn off all sound on display device 4002 and display a message to the individual to keep silent. The display device 4002 may then proceed to direct building occupants to the nearest exits.
In step 4514, the display device 4002 may generate audible alarms. In some embodiments, the audible alarm may be a loudspeaker message disclosing what the emergency is and/or the proper actions to take in response to the emergency. In some embodiments, the audible directions are directions to the nearest exit. The directions may be “Turn left at the end of the hallway and proceed to the exit” and/or any other message indicating the proper directions that a user should take to evacuate the building.
If display device 4002 determines that no emergencies are present in step 4508, the display device may perform step 4516. In step 4516, display device 4002 receives a user direction request via a user interface. In some embodiments, a user may input a specific conference room, meeting room, and/or office.
In step 4518, display device 4002 may identify an occupant based on digital video processing from a camera, digital audio processing from a microphone, and/or any other processing of occupancy sensors that can be used to identify a user. In some embodiments, display device 4002 stores features of users that can be matched by using digital video processing and/or digital audio processing. In some embodiments, display device 4002 sends a query with identified physical features of a user to a building management system (e.g., building management system 4010). The building management system may return the identity of the user. In some embodiments, the building management system may return a schedule indicating locations and times of meetings which the user may be required to attend, or which may be of interest to the user. In some embodiments, display device 4002 generates navigation directions based on the identity of the user and/or based on the schedule received from the building management system.
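The identification flow of step 4518 can be sketched as below. The feature format, the lookup and schedule callables, and the data shapes are all hypothetical assumptions for illustration; the disclosure does not specify these interfaces.

```python
# Hypothetical sketch of step 4518: identify an occupant from sensed
# features and derive directions from a schedule returned by a building
# management system (BMS). The callable interfaces and data shapes are
# assumptions for illustration only.

def identify_and_direct(features, bms_lookup, bms_schedule):
    """Match sensed features against the BMS and derive directions."""
    user_id = bms_lookup(features)    # BMS returns the user's identity, or None
    if user_id is None:
        return None
    schedule = bms_schedule(user_id)  # meetings: locations and times
    next_meeting = min(schedule, key=lambda m: m["time"], default=None)
    if next_meeting is None:
        return None
    return f"route_to_{next_meeting['location']}"
```

A caller would supply `bms_lookup` and `bms_schedule` backed by the actual building management system query described above.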
In step 4520, display device 4002 may generate directions opportunistically. In some embodiments, directions may be based on events occurring in the building. In some embodiments, display device 4002 communicates with a building management system (e.g., building management system 4010) and/or a building scheduler system. In some embodiments, display device 4002 generates opportunistic directions based on the location of display device 4002 in the building and/or the events occurring in the building. In some embodiments, display device 4002 communicates with the scheduling applications of mobile devices of users in the building and/or passing by display device 4002. In some embodiments, display device 4002 determines what events are occurring in the building and their nature (e.g., public, private, etc.). In some embodiments, display device 4002 generates directions opportunistically based on the schedules of mobile devices in the building.
In some embodiments, display device 4002 prioritizes the directions determined in steps 4516-4520 (step 4522). The directions can be ranked in order of priority. In some embodiments, requested directions (step 4516) have priority over opportunistic directions (step 4520) and/or directions determined based on information from an occupancy sensor (step 4518). The ranking system may contain a queue in which directions may be placed. The length of time a direction has been in the queue may factor into determining its priority. For example, if a conference advertisement is received from a building management system, the priority for displaying this advertisement may be low. In some embodiments, the priority of a direction may determine how long the direction is displayed on a user interface of display device 4002. The highest priority direction may be displayed on a user interface of display device 4002.
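One way to realize such a ranked queue is sketched below. The source priority values and the aging rule (older entries win ties at equal priority, so time in the queue factors into ordering) are illustrative assumptions, not the disclosed design.

```python
import heapq
import itertools
import time

# Hypothetical direction queue for step 4522. Requested directions rank
# above occupancy-based directions, which rank above opportunistic ones;
# these values are assumptions mirroring the text above.
SOURCE_PRIORITY = {"requested": 3, "occupancy": 2, "opportunistic": 1}

class DirectionQueue:
    def __init__(self):
        self._counter = itertools.count()  # tie-breaker: insertion order
        self._heap = []

    def push(self, direction, source, enqueued_at=None):
        enqueued_at = enqueued_at if enqueued_at is not None else time.time()
        # heapq is a min-heap, so negate the priority; at equal priority,
        # the older entry (smaller timestamp) is popped first, modeling
        # time-in-queue as a factor in the ranking.
        key = (-SOURCE_PRIORITY.get(source, 0), enqueued_at, next(self._counter))
        heapq.heappush(self._heap, (key, direction))

    def pop_highest(self):
        """Return the highest-priority direction to display, or None."""
        return heapq.heappop(self._heap)[1] if self._heap else None
```

A low-priority conference advertisement enqueued before a user's requested directions would still be displayed after them.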
Referring now to
In step 4602, display device 4300 sends emergency directions to other display devices located in the building. In some embodiments, display device 4300 determines where other display devices are located in the building with a display device location controller (e.g., display device location controller 4308). In some embodiments, display device 4300 sends the emergency directions to other devices located in the building via ad hoc communication (e.g., ad hoc Wi-Fi, ad hoc Zigbee, ad hoc Bluetooth, NFC etc.). In some embodiments, display device 4300 is configured to communicate ad hoc to the other display devices. In various embodiments, display device 4300 may be configured to transmit the emergency directions to the other display devices via network 4004 as described with reference to
If no emergency is determined in step 4508, display device 4300 may receive direction requests via a user interface (step 4604). In some embodiments, display device 4300 may be configured to allow users to enter destinations via a touch screen user interface. In some embodiments, the destination is a conference room, a meeting room, and/or an office. Display device 4300 may be configured to display an arrow, a map, turn by turn directions, and/or generate audio directions. Display device 4300 may determine other display devices along the route to the destination (step 4608) and may send display directions to these devices ad hoc and/or over network 4004 (step 4610).
In step 4606, display device 4300 may determine directions for an occupant based on the identity of the occupant. In some embodiments, display device 4300 uses at least one of a camera and/or a microphone to determine the identity of an occupant. An occupancy controller (e.g., occupancy controller 4238) may be configured to identify occupants based on data received from occupancy sensors (e.g., cameras, microphones, etc.). Display device 4300 may be connected to a network (e.g., network 4004) and may be able to retrieve meeting information associated with the identified user. Display device 4300 may be configured to display directions (arrows, turn by turn directions, maps, etc.) based on any destinations that are indicated by the identified user's meeting schedule. In some embodiments, display device 4300 is configured to determine other display devices along the route to the destination (step 4608) and may send display directions to these devices ad hoc and/or over network 4004 (step 4610).
Referring now to
In some embodiments, if an emergency is determined in at least one of steps 4502, 4504, and 4506 as described with reference to
In step 4710, display device 4400 may prompt a user to remove display device 4400 from the wall. In some embodiments, user interface 4208 periodically displays a message “Remove From Wall For Evacuation” for a predefined duration of time. In some embodiments, the user may press a button on user interface 4208 which confirms that the user has removed the device from the wall. In some embodiments, display device 4400 may use GPS 4406 and GPS controller 4410 to determine that display device 4400 is changing location and has been removed from its original location. In some embodiments, display device 4400 has a sensor such as a switch which detects that the device has been removed from the wall.
In step 4712, display device 4400 may determine its current location with GPS 4406. In some embodiments, GPS controller 4410 may communicate with GPS 4406 to determine coordinates of display device 4400. In some embodiments, the coordinates are a latitude, a longitude, and an altitude. Display device 4400 may be configured to use the coordinates to determine the location of the display device 4400 and the user who has removed display device 4400 from the wall in the building. In some embodiments, display device 4400 uses GPS controller 4410 to poll GPS 4406 for coordinates periodically. In some embodiments, GPS controller 4410 receives a new coordinate when one of the coordinates (i.e., altitude, longitude, and latitude) has changed more than a predefined amount.
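The change-threshold polling described above can be sketched as a simple predicate. The coordinate tuple format (latitude, longitude, altitude) and the threshold semantics are assumptions for illustration.

```python
# Hypothetical sketch of the GPS polling rule in step 4712: a new
# coordinate is reported only when at least one component (latitude,
# longitude, or altitude) has changed more than a predefined amount.

def coordinate_changed(old, new, threshold):
    """Return True if any coordinate component moved more than threshold."""
    return any(abs(n - o) > threshold for o, n in zip(old, new))
```

GPS controller 4410 would call such a predicate on each poll and only propagate coordinates that pass it.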
In step 4714, the display device may use building map controller 4412 and mobile directions controller 4416 to determine a route to an evacuation point and/or a safe zone with the GPS coordinates of GPS controller 4410. In some embodiments, user interface controller 4242 may display the location of the user on user interface 4208 and a map with a route indicating the necessary directions to take to reach the evacuation point and/or safe zone.
Referring now to
Referring now to
Referring now to
In some embodiments, audio 5004 may be broadcast by display device 4002 and/or display device 4300 to accompany the direction message. In some embodiments, audio 5004 is broadcast via speaker 4206. Audio 5004 may give audible directions to occupants of the building to report to certain rooms, floors, buildings, and/or any other location. In some embodiments, audio 5004 is music and/or any other audio based message or sound. Audio 5004 may identify an occupant by name and/or handle before playing directions for the occupant.
Referring now to
In some embodiments, audio 5104 may be broadcast by display device 4002 and/or display device 4300 to accompany the alternate route message 5102. In some embodiments audio 5104 is broadcast via speaker 4206. The audio 5104 may give audible directions to occupants of alternate routes. In some embodiments, the audio 5104 may direct an occupant to a wheelchair accessible ramp. In some embodiments, audio 5104 is music and/or any other audio based message or sound. Audio 5104 may identify an occupant by name and/or handle before playing directions for the alternate route.
Referring now to
More particularly, in the example of
In some cases, the shooter may move within the building. The shot detection system may detect a second location of a gunshot and provide the second location to the display device 4002. The display device may then update the escape route and the associated navigation directions on the display device 4002 and on the one or more additional display devices (e.g., display device 4016, display device 4018) to direct the user along an updated route that avoids the second location. The user may thereby be guided to safety along a route that avoids the active shooter in the building. For example, with reference to
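Recomputing an escape route around a reported gunshot location can be sketched as a graph search. The building graph, node names, and the use of breadth-first search are illustrative assumptions; the disclosure does not specify a routing algorithm.

```python
from collections import deque

# Minimal sketch of rerouting around a detected gunshot location: find
# the shortest path (in hops) to any exit while avoiding blocked nodes.
# The graph structure and names are hypothetical.

def escape_route(graph, start, exits, blocked):
    """Breadth-first search to the nearest exit, avoiding blocked nodes."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node in exits:
            return path
        for nxt in graph.get(node, []):
            if nxt not in seen and nxt not in blocked:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no safe route found
```

When the shot detection system reports a second location, the display devices would re-run the search with that location added to `blocked` and display the updated route.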
Halo Light Emitting Diode (LED) System
A display device includes a housing having a front portion, a rear portion, and a halo having a rim which is disposed between the front portion and the rear portion, according to some embodiments. In some embodiments, the halo receives light emitted by one or more LEDs and diffuses the light along sides of the display device. The halo includes multiple light guiding portions which each have a receiving post and a sweep portion, according to some embodiments. In some embodiments, each of the light guiding portions protrudes from a rim of the halo which is positioned between the front portion and rear portion of the display device. In some embodiments, the halo is made of or includes a translucent material and/or a transparent material. In some embodiments, the LEDs are disposed along a path of an LED board and are each configured to emit light received by a corresponding light guiding portion. In some embodiments, the light guiding portions are cantilever portions, having an overall S-shape, protruding at one end from the rim. In some embodiments, the light guiding portions include exterior surfaces coated (e.g., cladded) with an opaque material which does not allow light to pass through along substantially an entire length of the light guiding portions. In some embodiments, a surface of the receiving posts and an exterior surface of the rim do not include the opaque material, such that light may enter and exit the light guiding portions only at the receiving post and exterior surface of the rim, respectively.
In some embodiments, the halo facilitates notification of a user regarding any of information, a message, an event, etc., at a wider viewing angle. For example, if the user is not positioned directly in front of the display device, the user may be unable to view a front display panel of the display device, according to some embodiments. In some embodiments, the halo directs light outwards from sides of the display device, so that the light emitted by the LEDs can be viewed by a user at a generally side angle.
In some embodiments, the display device is a thermostat, e.g., the control device 214 as described with reference to
Advantageously, the display device facilitates visual notification regarding a variable, an event, a change in a variable, etc., to a user at a wider viewing angle, according to some embodiments.
Referring now to
In some embodiments, user interface 5306 transitions between a set of predetermined messages/alerts/information. For example, user interface 5306 may iteratively display an indoor air temperature, an indoor air quality, an outdoor air temperature, a time of day, an alert, etc. In some embodiments, user interface 5306 transitions from displaying one message/information/alert to the next at the end of a predetermined time period. For example, user interface 5306 may display a different message/information/alert every 1 second, every 5 seconds, etc., upon a request received from the user through user interface 5306, or upon an event (e.g., an alert), according to some embodiments.
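The time-based cycling above can be sketched with a simple index calculation. The message list and the 5-second period are hypothetical values chosen to match the examples in the text.

```python
# Illustrative sketch of cycling through a predetermined set of
# messages/alerts on a fixed period, as user interface 5306 may do.
# The message list and period are assumptions.

def current_message(messages, elapsed_seconds, period=5):
    """Select which message to show after a given elapsed time,
    advancing to the next message every `period` seconds and
    wrapping back to the first after the last."""
    return messages[int(elapsed_seconds // period) % len(messages)]
```

An event such as an alert could override this rotation by displaying the alert immediately instead of the scheduled message.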
Display device 5300 includes a front portion 5302 and a rear portion 5304, according to some embodiments. In some embodiments, front portion 5302 and rear portion 5304 are coupled (e.g., removably coupled, fixedly coupled, selectively coupled, fastened, integrally formed, etc.) to each other. In some embodiments, front portion 5302 and rear portion 5304 are removably coupled (e.g., by fasteners). In some embodiments, front portion 5302 and rear portion 5304 are configured to interface with each other (e.g., a slip fit, a frictional fit, a snap fit, etc.). In some embodiments, front portion 5302 and rear portion 5304 use a combination of fasteners and an interfaced fit (e.g., a slip fit, a frictional fit, a snap fit, etc.).
In some embodiments, front portion 5302 includes user interface 5306. In some embodiments, front portion 5302 includes an aperture (e.g., an opening, a hole, a recess, etc.) configured to receive user interface 5306 therein. In some embodiments, front portion 5302 includes a covering 5310 configured to interface with front portion 5302. In some embodiments, covering 5310 is a protective covering configured to protect user interface 5306 from damage. In some embodiments, covering 5310 is disposed in front of user interface 5306. Covering 5310 may be any of a glass material, a plastic material, etc. In some embodiments, covering 5310 is translucent. In some embodiments, covering 5310 is transparent. In some embodiments, covering 5310 is configured to allow light emitted by user interface 5306 to pass through.
Covering 5310 is disposed outside of front portion 5302, according to some embodiments. In some embodiments, covering 5310 is disposed adjacent an inner surface of front portion 5302. In some embodiments, covering 5310 covers at least part of or an entire area of the aperture of front portion 5302 which receives user interface 5306. In some embodiments, covering 5310 is received in an aperture (e.g., an opening, a hole, a recess, etc.) of front portion 5302. In some embodiments, covering 5310 is received within the aperture within which user interface 5306 is received.
In some embodiments, sides 5308 (e.g., walls, borders, faces, surfaces, panels, etc.) are disposed between front portion 5302 and rear portion 5304. In some embodiments, sides 5308 extend between rear portion 5304 and front portion 5302. In some embodiments, any of sides 5308 are integrally formed with at least one of front portion 5302 and rear portion 5304. For example, in some embodiments, sides 5308 are integrally formed with front portion 5302. In some embodiments, sides 5308 are integrally formed with rear portion 5304. In some embodiments, one or more of sides 5308 are integrally formed with one of front portion 5302 or rear portion 5304, while one or more other sides 5308 are integrally formed with another of front portion 5302 or rear portion 5304. For example, left side 5308a and right side 5308b are integrally formed with front portion 5302 and upper side 5308c and bottom side 5308d are integrally formed with rear portion 5304 (or vice versa), according to some embodiments.
In some embodiments, sides 5308 are coupled (e.g., removably coupled, attached, fastened, fixed, slip fit, frictionally fit, snap fit, etc.) to at least one of front portion 5302 and rear portion 5304. In some embodiments, sides 5308, front portion 5302, and rear portion 5304 define an enclosure having an inner volume therein. In some embodiments, any of user interface 5306, a controller, a power circuit, etc., or any other components, subcomponents or devices (e.g., LEDs) are disposed within the inner volume defined by front portion 5302, rear portion 5304 and sides 5308.
In some embodiments, sides 5308 are generally planar. For example, as shown in
Opposite sides 5308 are substantially parallel to each other, according to some embodiments. For example, left side 5308a is shown generally parallel to right side 5308b and upper side 5308c is generally parallel to bottom side 5308d, according to some embodiments. In some embodiments, opposite sides 5308 are not parallel to each other. For example, in some embodiments, left side 5308a is non-parallel with right side 5308b. In some embodiments, adjacent sides 5308 are substantially perpendicular to each other. For example, as shown in FIG. 53, left side 5308a is substantially perpendicular to upper side 5308c (which is adjacent left side 5308a), according to some embodiments. In some embodiments, left side 5308a is substantially perpendicular to bottom side 5308d. In some embodiments, left side 5308a, right side 5308b, upper side 5308c, and bottom side 5308d are integrally formed with each other.
In some embodiments, a halo 5322 is positioned between front portion 5302 and rear portion 5304. In some embodiments, halo 5322 is positioned between sides 5308 and one of front portion 5302 and rear portion 5304. For example, as shown in
Referring now to
Rear portion 5304 is shown to include a first modular portion 5414 and a second modular portion 5416, according to some embodiments. In some embodiments, first modular portion 5414 and second modular portion 5416 are integrally formed. In some embodiments, first modular portion 5414 and second modular portion 5416 define rear portion 5304. First modular portion 5414 is shown to have an overall height substantially equal to height 5402, according to some embodiments. In some embodiments, first modular portion 5414 includes and/or is sides 5308. In some embodiments, first modular portion 5414 is configured to interface with one or more of sides 5308 and front portion 5302. For example, first modular portion 5414 is configured to interface with sides 5308 and/or front portion 5302 with at least one of a slip fit, a frictional fit, a snap fit, fasteners, etc., according to some embodiments.
In some embodiments, second modular portion 5416 has a height 5410 and depth 5408. Height 5410 is shown less than overall height 5402 of display device 5300, according to some embodiments. In some embodiments, height 5410 is substantially equal to or greater than overall height 5402. In some embodiments, second modular portion 5416 protrudes (e.g., extends, juts, extrudes, etc.) from surface 5506 of first modular portion 5414. In some embodiments, second modular portion 5416 protrudes a distance from surface 5506 substantially equal to depth 5408. Advantageously, if display device 5300 is a wall-mounted display device, second modular portion 5416 is configured to extend within and be received by an aperture of the wall, according to some embodiments. In some embodiments, second modular portion 5416 extends at least partially within an aperture of a wall. In some embodiments, first modular portion 5414 extends at least partially within an aperture of a wall. For example, in some embodiments, the aperture (e.g., of the wall) is a recess (e.g., cavity, indent) which is stepped to both receive first modular portion 5414 and at least partially receive second modular portion 5416. In some embodiments, second modular portion 5416 extends from surface 5506 of first modular portion 5414 which is disposed sub-flush a rim 5518 of first modular portion 5414. In some embodiments, rim 5518 is cooperatively formed by sides 5308. In some embodiments, rim 5518 extends along an entire perimeter of first modular portion 5414. In some embodiments, rim 5518, surface 5506, and sides 5512 of second modular portion 5416 define a recess 5600 having a width 5520 which runs along an entire perimeter of first modular portion 5414. In some embodiments, recess 5600 is configured to interface with a protrusion of a mounting plate (e.g., a wall mounting plate, a wall, etc.).
In some embodiments, first modular portion 5414 includes one or more fastener elements (e.g., posts, apertures, threaded bores, clips, latches, etc. configured to fasten display device 5300 to a wall), shown as fastener elements 5508. Fastener elements 5508 are shown as bores configured to receive a fastener to removably couple display device 5300 to a surface. In some embodiments, fastener elements 5508 are threaded bores. In some embodiments, fastener elements 5508 are bores configured to receive self-tapping screws. In some embodiments, fastener elements 5508 are disposed along a patterned path. In some embodiments, fastener elements 5508 are disposed proximate corners of display device 5300. In some embodiments, fastener elements 5508 are evenly spaced a distance apart.
In some embodiments, second modular portion 5416 is generally rectangular having sides (e.g., walls, panels, sidewalls, etc.), shown as sides 5512. In some embodiments, second modular portion 5416 is a generally rectangular shape having a length 5504 and a height 5410. In some embodiments, adjacent sides 5512 form a rounded intersection point. For example, side 5512c and side 5512a are adjacent each other, and intersect to form a fillet. In some embodiments, second modular portion 5416 is a generally rectangular shape having filleted (e.g., rounded) corners. In some embodiments, second modular portion 5416 is a generally rectangular shape having chamfered corners. In some embodiments, first modular portion 5414 is generally rectangular shaped having height 5402 and length 5502. In some embodiments, first modular portion 5414 is generally rectangular shaped having filleted corners (e.g., corners 5510). In some embodiments, first modular portion 5414 is generally rectangular shaped having chamfered corners. In some embodiments, a center of a cross section of first modular portion 5414 is substantially coincident with a center of a cross section of second modular portion 5416.
In some embodiments, second modular portion 5416 includes a surface (e.g., a back surface, a back plate, a back panel, a back wall, etc.), shown as rear surface 5514. In some embodiments, rear surface 5514 includes any of fastener elements 5508. In some embodiments, rear surface 5514 includes one or more apertures (e.g., bores, openings, through-holes, rectangular openings, etc.), configured to facilitate wired connections to a controller (e.g., a processing circuit, a power board, etc.) disposed within display device 5300. In some embodiments, rear surface 5514 is removably connected to sides 5512, facilitating easy access to internal components of display device 5300. In some embodiments, rear surface 5514 is removably connected to sides 5512 with any one of or a combination of fasteners, a slip fit, a frictional fit, a snap fit, etc. In some embodiments, rear surface 5514 is configured to be received by an aperture cooperatively formed by sides 5512.
In some embodiments, surface 5506 of first modular portion 5414 includes a rectangular aperture (e.g., opening, recess, hole, etc.), shown as rectangular opening 5516. In some embodiments, rectangular opening 5516 is configured to receive a protrusion of another member (e.g., a mounting plate, a wall, etc.) to connect display device 5300 to the other member. In some embodiments, rectangular opening 5516 is configured to allow a wired connection (e.g., a USB connection, a power connection, etc.) to a controller disposed within display device 5300. In some embodiments, one or more rectangular openings 5516 are included on rear surface 5514.
Referring now to
Referring now to
Referring now to
Referring now to
Referring still to
Referring now to
In some embodiments, recess 6212 is generally rectangular. Recess 6212 is shown to include an aperture (e.g., opening, hole, etc.), shown as vertical aperture 6206, according to some embodiments. In some embodiments, vertical aperture 6206 is a notch and extends partially along a height of second surface 6208. In some embodiments, front portion 5302 includes one or more apertures, shown as apertures 6204. In some embodiments, apertures 6204 are rectangular and extend at least partially into first surface 6210.
Referring now to
Referring still to
LED Board
Referring now to
As shown in
Halo
Referring now to
In some embodiments, halo 5322 is or includes translucent and/or transparent material. In some embodiments, halo 5322 is configured to allow light to pass through. In some embodiments, one or more exterior surfaces of halo 5322 are coated with a material which does not allow light to pass through. For example, in some embodiments, all exterior surfaces of halo 5322 are coated with a material (e.g., a coating, a paint, etc.) which does not allow light to pass through.
Referring to
Referring still to
Referring now to
In some embodiments, wave guides 6704 are a substantially translucent and/or transparent material. In some embodiments, wave guides 6704 are cladded with an opaque material. In some embodiments, exterior surfaces of wave guides 6704 which do not facilitate either an entry or an egress of light into/out of wave guides 6704 are cladded with the opaque material. In some embodiments, the opaque material is painted onto exterior surfaces of wave guides 6704.
In some embodiments, wave guides 6704 include a sweep portion 6808 and a receiving post 6806. In some embodiments, sweep portions 6808 of wave guides 6704 protrude from a rim 6702 (e.g., bezel, surrounding edge, etc.) of halo 5322. In some embodiments, wave guides 6704 protrude from rim 6702 along a curved path. In some embodiments, a width of sweep portion 6808 of wave guides 6704 varies (e.g., decreases) along the curved path. In some embodiments, wave guides 6704 include a receiving post (e.g., a square receiving post, a rectangular receiving post, etc.) which protrudes from an end point of sweep portion 6808. In some embodiments, any or all of rim 6702, sweep portions 6808 and receiving posts 6806 are integrally formed.
In some embodiments, receiving posts 6806 are configured to facilitate entry of light into wave guides 6704. In some embodiments, receiving posts 6806 include a surface which is not covered with an opaque material (e.g., not cladded) configured to facilitate entry of light emitted by one or more of LEDs 6400 into wave guide 6704. In some embodiments, receiving posts 6806 protrude such that an end of receiving posts 6806 is substantially adjacent to the corresponding LED 6400. In some embodiments, the end of receiving posts 6806 contacts an exterior surface of a corresponding LED 6400.
Referring now to
In some embodiments, rim 6702 is coated with the opaque material. In some embodiments, first surface 6906, second surface 6908 and interior surface 6928 are coated with the opaque material. In some embodiments, if sweep portion 6808 protrudes from interior surface 6928, at least part of an area of interior surface 6928 which sweep portion 6808 protrudes from is configured to allow light to pass through. In some embodiments, exterior surface 6910 is configured to facilitate egress of light from wave guide 6704. In some embodiments, exterior surface 6910 is configured to diffuse light which passes through wave guide 6704 along at least part of exterior surface 6910.
In some embodiments, sweep portion 6808 includes one or more exterior surfaces which are coated (e.g., cladded) with an opaque material configured to restrict the exit of light from wave guide 6704. In some embodiments, sweep portion 6808 includes first surface 6902 and second surface 6904. In some embodiments, first surface 6902 and second surface 6904 are opposite each other and are each offset an equal distance from path 6922 in opposite directions. In some embodiments, first surface 6902 and second surface 6904 substantially follow path 6922 at an offset distance. In some embodiments, first surface 6902 and second surface 6904 are coated (e.g., cladded) with the opaque material.
In some embodiments, an axis 6810 extends tangent to a starting point 6930 of path 6922. In some embodiments, an axis 6812 extends tangent to an end point of path 6922. In some embodiments, axis 6812 is a central axis of receiving post 6806. In some embodiments, axis 6812 extends tangent to the end point of path 6922 and is the central axis of receiving post 6806. In some embodiments, axis 6810 and axis 6812 are substantially parallel to each other. In some embodiments, axis 6810 and axis 6812 are substantially parallel to each other and are offset a distance 7002 from each other. In some embodiments, distance 7002 is a distance which is perpendicular to both axis 6810 and axis 6812. In some embodiments, distance 7002 is parallel to the Z-axis of coordinate system 6900. In some embodiments, axis 6810 extends tangentially outwards from starting point 6930 of path 6922 and starting point 6930 of path 6922 is disposed at a center point of initial width 1632 of sweep portion 6808. In some embodiments, axis 6810 and axis 6812 are offset relative to each other along the X-axis of coordinate system 6900 (e.g., laterally).
In some embodiments, sweep portion 6808 has a width 6914 and/or an opening 6918 for fastening to another component and/or enclosure piece. Width 6914 varies (e.g., decreases) along path 6922, according to some embodiments. In some embodiments, width 6914 decreases along path 6922 until it is substantially equal to thickness 7006 of receiving post 6806. In some embodiments, width 6914 decreases non-linearly. In some embodiments, sweep portion 6808 has initial width 1632 proximate the interface (e.g., connection) between rim 6702 and sweep portion 6808. In some embodiments, width 6914 decreases linearly. In some embodiments, width 6914 decreases (e.g., either linearly or non-linearly) along part of path 6922 and increases (e.g., either linearly or non-linearly) along another part of path 6922.
In some embodiments, receiving post 6806 protrudes from an end of sweep portion 6808. In some embodiments, receiving post 6806 protrudes tangentially outwards from an endpoint of path 6922. In some embodiments, receiving post 6806 extends in a direction substantially parallel to the Y-axis. In some embodiments, receiving post 6806 includes a receiving surface 1720, configured to facilitate entry of light emitted by one of LEDs 6400. In some embodiments, all other surfaces of receiving post 6806 are coated (e.g., cladded) with the opaque material to prevent light from exiting through the other surfaces.
In some embodiments, sweep portion 6808 has a constant thickness 6912 along an entire length of path 6922. In some embodiments, sweep portion 6808 has a variable thickness 6912 with respect to path 6922. For example, in some embodiments thickness 6912 increases, decreases, or a combination of both, along path 6922. In some embodiments, thickness 6912 is substantially equal to thickness 7006 of receiving post 6806. In some embodiments, thickness 6912 changes (e.g., increases, decreases, or a combination of both) along path 6922 and is substantially equal to thickness 7006 of receiving post 6806 at an end of path 6922 which receiving post 6806 protrudes from.
In some embodiments, receiving post 6806 has a height 6916. In some embodiments, receiving post 6806 protrudes from the end of sweep portion 6808 such that surface 6920 of receiving post 6806 is adjacent to LED 6400. In some embodiments, receiving post 6806 protrudes from the end of sweep portion 6808 such that surface 6920 is distance 7004 from LED 6400. In some embodiments, distance 7004 is negligible.
Referring now to
Halo 5322 facilitates a wider off-axis viewing angle of light emitted by LED 6400, according to some embodiments. In some embodiments, halo 5322 facilitates notifying a user regarding information received by or determined by display device 5300. In some embodiments, halo 5322 enables the notification to be visible by an observer generally facing any of sides 5308. In some embodiments, halo 5322 enables notifications to an observer when the observer cannot view user interface 5306.
LED Controller
Referring now to
Still referring to
In some embodiments, interfaces 7022 and 7024 can be joined as one interface rather than two separate interfaces. For example, output interface 7022 and input interface 7024 can be combined as one Ethernet interface configured to receive network communications from controller 7034 or a network. In some embodiments, controller 7034 provides both a setpoint and feedback via an Ethernet network. In some embodiments, output interface 7022 can be another standardized communications interface for communicating data or control signals (e.g., analog or digital). Interfaces 7022 and 7024 can include communications electronics (e.g., receivers, transmitters, transceivers, modulators, demodulators, filters, communications processors, communication logic modules, buffers, decoders, encoders, encryptors, amplifiers, etc.) configured to provide or facilitate the communication of the signals described herein.
Still referring to
Memory 7206 can include one or more devices (e.g., memory units, memory devices, storage devices, etc.) for storing data and/or computer code for completing and/or facilitating the various processes described in the present disclosure. Memory 7206 can include random access memory (RAM), read-only memory (ROM), hard drive storage, temporary storage, non-volatile memory, flash memory, optical memory, or any other suitable memory for storing software objects and/or computer instructions. Memory 7206 can include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. Memory 7206 can be communicably connected to processor 540 via processing circuit 538 and can include computer code for executing (e.g., by processor 540) one or more processes described herein.
Referring still to
Controller 7000 is shown receiving information from sensor/equipment 7036 through input interface 7024. In some embodiments, sensor/equipment module 7010 receives the information from sensor/equipment 7036. In some embodiments, sensor/equipment module 7010 receives the information from sensor/equipment 7036 and determines an event based on the received information. For example, in some embodiments, sensor/equipment module 7010 periodically receives temperature information from a temperature sensor and determines if the received temperature exceeds a predetermined threshold value. In another example, sensor/equipment module 7010 receives information from an indoor air quality sensor (e.g., a carbon monoxide detector) and determines if the received indoor air quality information is less than a predetermined threshold value. In some embodiments, controller 7000 receives information from any of one or more controllers, one or more equipment devices, one or more sensors, a network, etc., and determines an operation of user interface 7032 and/or LEDs 7026 based on the received information. Controller 7000 may receive information through a wired connection at input interface 7024, a wireless connection at input interface 7024, or a combination of both.
In some embodiments, sensor/equipment module 7010 determines an event based on the received information and provides the event to LED module 7008. For example, if sensor/equipment module 7010 determines that the indoor air quality has dropped below a predetermined value, sensor/equipment module 7010 provides the determined event to LED module 7008. In some embodiments, sensor/equipment module 7010 provides the information received from sensor/equipment 7036 to user interface module 7020. For example, in some embodiments, if sensor/equipment module 7010 receives temperature information from sensor/equipment 7036, sensor/equipment module 7010 provides the temperature information to user interface module 7020. In some embodiments, user interface module 7020 is configured to determine control signals for user interface 7032 to display the information received from sensor/equipment module 7010 to a user. In some embodiments, sensor/equipment module 7010 is configured to provide LED module 7008 with at least one of information received through input interface (from at least one of controller 7034 and sensor/equipment 7036) and the determined or received event.
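The threshold checks described above for sensor/equipment module 7010 can be sketched as follows. This is an illustrative sketch only; the threshold values, event names, and the `check_reading` helper are assumptions and are not part of the disclosed embodiments.

```python
# Hypothetical sketch of threshold-based event determination; the numeric
# thresholds and event names below are illustrative assumptions.
TEMP_THRESHOLD_F = 85.0      # assumed high-temperature threshold
IAQ_THRESHOLD = 50.0         # assumed minimum acceptable air-quality index

def check_reading(kind, value):
    """Return an event name if the reading crosses its threshold, else None."""
    if kind == "temperature" and value > TEMP_THRESHOLD_F:
        return "increased_temperature_event"
    if kind == "air_quality" and value < IAQ_THRESHOLD:
        return "low_indoor_air_quality_event"
    return None
```

A determined event (the returned name) would then be provided to LED module 7008, while the raw reading would be forwarded to user interface module 7020 for display.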
In some embodiments, user interface module 7020 is configured to determine control signals for user interface 7032. In some embodiments, user interface 7032 is user interface 5306. In some embodiments, user interface module 7020 is configured to determine control signals for user interface 7032 to display messages, information, graphical representations of information, data, etc. In some embodiments, user interface module 7020 also receives information from user interface 7032 through input interface 7024. In some embodiments, user interface module 7020 receives commands, directives, requests, etc., from user interface 7032 and adjusts an operation (e.g., a displayed message) of user interface 7032 based on the command, request, etc., received from user interface 7032. In some embodiments, user interface module 7020 receives a request from user interface 7032 to display certain data, and user interface module 7020 adjusts an operation of user interface 7032 to display the requested data.
In some embodiments, controller 7000 receives any of information and an event from controller 7034. For example, in some embodiments, controller 7034 is communicably connected with sensor/equipment 7036 and is configured to analyze, process, group, etc., information from sensor/equipment 7036 and determine if an event has occurred. In some embodiments, controller 7034 provides the information and/or event data to at least one of user interface module 7020 and LED module 7008.
Referring still to
LED module 7008 is shown to include an LED database 7012, a color module 7014, a pattern module 7016, and an intensity module 7018. In some embodiments, LED database 7012 stores information regarding a patterned operation of one or more LEDs based on a received event and/or received information. For example, if LED module 7008 receives an event from sensor/equipment module 7010, controller 7034, sensor/equipment 7036, etc., indicating that the indoor air quality has dropped below a predetermined value, LED module 7008 may retrieve a set of instructions from LED database 7012 regarding an operation of LEDs based on the event. In some embodiments, LED database 7012 includes information regarding an operation of LEDs for a variety of events, including but not limited to, an increased temperature event, a decreased temperature event, a low indoor air quality event, an emergency event, a fire detection event, an equipment failure event, a calendar date event, a time of day, etc. In some embodiments, LED database 7012 includes a set of predetermined instructions regarding an operation of LEDs for each of these events.
In some embodiments, LED database 7012 includes a set of predetermined instructions for each of a set of predefined events. In some embodiments, LED database 7012 can be updated and/or customized. For example, in some embodiments, LED database 7012 can receive directives from user interface 7032 to change an operation of one or more of the LEDs (e.g., color, on-off pattern, intensity, timing, etc.) to modify the predetermined instructions for one or more of the predefined events. In some embodiments, additional events can be added to LED database 7012 along with corresponding LED operation instructions for the additional events. In some embodiments, for example, controller 7000 includes a wireless radio (e.g., a Bluetooth wireless radio) configured to interface with a user device (e.g., a smartphone). The LED database 7012 is configured to be updated or modified based on directives received from the user device. For example, if a user wants to be notified/reminded of an event on a certain date at a specific time, the user may add an event to LED database 7012 to adjust an operation of one or more LEDs according to a predetermined pattern, set of rules, etc., on the certain date at the specific time.
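The updatable event-to-instruction store described above can be sketched as a simple mapping. This is a hedged illustration only; the event keys and instruction fields below are assumptions, not the disclosed data layout of LED database 7012.

```python
# Minimal sketch of an updatable event-to-instruction store in the spirit of
# LED database 7012; keys and fields are illustrative assumptions.
led_database = {
    "low_indoor_air_quality_event": {"color": "red", "pattern": "blink", "intensity": 100},
    "night_time_event": {"color": "blue", "pattern": "steady", "intensity": 50},
}

def add_or_update_event(db, event, instructions):
    """Add a new event or customize an existing one (e.g., from directives
    received via user interface 7032 or a paired user device)."""
    db[event] = dict(instructions)

# Example: a user adds a reminder event with its own LED operation.
add_or_update_event(led_database, "wakeup_reminder",
                    {"color": "green", "pattern": "pulse", "intensity": 80})
```

The same `add_or_update_event` call would also serve to modify the predetermined instructions for an existing predefined event.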
In some embodiments, upon receiving an event and/or information, LED database 7012 provides the instructions to color module 7014, pattern module 7016, and intensity module 7018. For example, if LED database 7012 receives a night-time event (e.g., from a clock or from a light detector), LED database 7012 may retrieve a specific set of instructions (e.g., dim all LEDs by 50%, turn off several LEDs, adjust a color of one or more LEDs to blue, etc.) for the LEDs (e.g., LEDs 7026, LEDs 6400, etc.) corresponding to the night-time event.
In some embodiments, LED database 7012 includes instructions for various events to adjust a color of one or more of the LEDs (e.g., red, blue, green, etc.), adjust an intensity of one or more of the LEDs, turn one or more of the LEDs on or off, patterningly adjust a color of one or more of the LEDs, patterningly adjust an intensity of one or more of the LEDs, patterningly turn one or more of the LEDs on or off, etc. In some embodiments, any of the color, intensity, on/off state, etc., of the one or more LEDs is patterned over time (e.g., all LEDs are turned on for 5 seconds, then turned off for 5 seconds, and this is repeated), or patterned based on a predetermined position of the one or more LEDs (e.g., turn a first LED on, then turn a second LED on, then turn a third LED on and turn the first LED off, then turn a fourth LED on and turn the second LED off, then turn a fifth LED on and turn the third LED off, etc.), or patterned based on both time and position of the one or more LEDs.
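The position-based pattern described above (light the next LED while turning off the trailing one) can be sketched as a sliding window over the LED positions. This is an illustrative sketch; the `chase_states` helper and the window width of two lit LEDs are assumptions.

```python
def chase_states(num_leds, step, window=2):
    """On/off state of each LED at a given step of the position-based pattern:
    each step lights the next LED and, once `window` LEDs are lit, turns off
    the trailing one. The window width is an illustrative assumption."""
    lit = set(range(max(0, step - window + 1), min(step + 1, num_leds)))
    return [i in lit for i in range(num_leds)]
```

Stepping through `chase_states(5, 0)`, `chase_states(5, 1)`, `chase_states(5, 2)`, ... reproduces the sequence in the text: first LED on, then second on, then third on with the first off, and so on.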
In some embodiments, one or more of the sets of instructions stored in LED database 7012 extend for a time duration and are repeated. For example, some of the sets of instructions may last for five seconds (e.g., a patterned operation of the LEDs for five seconds) and be repeated a predetermined number of times, while other sets of instructions may last only two seconds (e.g., increase intensity from 0% to 100% for all LEDs over a 1 second time duration, then decrease intensity from 100% to 0% for all LEDs over a 1 second time duration) and be repeated.
In some embodiments, sets of instructions are combined. For example, in some embodiments, all events which indicate an increase in temperature include a same patterned intensity operation of LEDs (e.g., linearly increase intensity of all LEDs from 0% to 100% over a five second time window). However, within the set of all events which indicate an increase in temperature, other operations of the LEDs (e.g., color) may vary based on other factors (e.g., which temperature from a set of temperatures is increasing, how fast the temperature increases, etc.).
Any of the color, pattern, intensity, etc., of the one or more LEDs may be adjusted over a time window linearly (e.g., increase intensity from 0% to 100% linearly over a 5 second time window) or may be adjusted over a time window non-linearly (e.g., increase intensity from 0% to 100% according to an exponential function, a polynomial, etc.).
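The linear and non-linear intensity ramps described above can be sketched as a single function of elapsed time. This is a hedged illustration; the normalized exponential ease is only one example of a non-linear adjustment, and the default 5 second window is an assumption.

```python
import math

def ramp_intensity(t, duration=5.0, mode="linear"):
    """Intensity (0-100%) at time t within a ramp window. 'linear' increases
    proportionally with time; 'exp' uses a normalized exponential ease as one
    example non-linear function (f(0)=0, f(duration)=100)."""
    x = max(0.0, min(t / duration, 1.0))  # clamp progress to [0, 1]
    if mode == "linear":
        return 100.0 * x
    return 100.0 * (math.exp(x) - 1.0) / (math.e - 1.0)
```

A decreasing ramp follows by evaluating `ramp_intensity(duration - t, duration)`.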
In some embodiments, the instructions stored in LED database 7012 depend on the particular types of LEDs used. For example, some LEDs may not be multi-color LEDs and may only actuate between an on state and an off state, according to some embodiments. In some embodiments, LED database 7012 stores a map of positions of the LEDs and abilities of each of the LEDs (e.g., dimming abilities, maximum light intensity, etc.).
In some embodiments, controller 7000 does not include LED database 7012, and receives instructions from any of controller 7034 and/or a network to adjust an operation of any of a color, a pattern, an intensity (e.g., dimming), etc., of any of the LEDs.
Referring still to
In some embodiments, LED module 7008 is connected to one or more LEDs (e.g., LEDs 7026, LEDs 6400, etc.). In some embodiments, LED module 7008 adjusts an operation of the one or more LEDs to produce the desired effect (e.g., dimming, changing color, patterned dimming, patterned change in color, etc.). In some embodiments, the one or more LEDs each correspond to one or more waveguides 6704 configured to diffuse, direct, scatter, focus, etc., light emitted by the one or more LEDs along sides 5308 of display device 5300.
Thermostat with Halo Light System and Emergency Features
Referring now generally to
The halo LED interface as described in
Referring now to
The control device 214 includes a user interface 7302 in some embodiments. The user interface 7302 may be a transparent touch screen interface configured to display information to a user and receive input from the user. The user interface 7302 may be the same as, similar to, and/or a combination of touch-sensitive panel 704, the electronic display 706, and/or the ambient lighting 708 as described with reference to
The user interface 7302 may be transparent such that a user can view information on the display and view the surface located behind the display. Thermostats with transparent and cantilevered displays are described in further detail in U.S. patent application Ser. No. 15/146,649 filed May 4, 2016, the entirety of which is incorporated by reference herein.
The user interface 7302 can be a touchscreen or other type of electronic display configured to present information to a user in a visual format (e.g., as text, graphics, etc.) and receive input from a user (e.g., via a touch-sensitive panel). For example, the user interface 7302 may include a touch-sensitive panel layered on top of an electronic visual display. A user can provide inputs through simple or multi-touch gestures by touching the user interface 7302 with one or more fingers and/or with a stylus or pen. The user interface 7302 can use any of a variety of touch-sensing technologies to receive user inputs, such as capacitive sensing (e.g., surface capacitance, projected capacitance, mutual capacitance, self-capacitance, etc.), resistive sensing, surface acoustic wave, infrared grid, infrared acrylic projection, optical imaging, dispersive signal technology, acoustic pulse recognition, or other touch-sensitive technologies known in the art. Many of these technologies allow for multi-touch responsiveness of user interface 7302, allowing touch to be registered at two or more locations at once. The display may use any of a variety of display technologies such as light emitting diode (LED), organic light-emitting diode (OLED), liquid-crystal display (LCD), organic light-emitting transistor (OLET), surface-conduction electron-emitter display (SED), field emission display (FED), digital light processing (DLP), liquid crystal on silicon (LCoS), or any other display technologies known in the art. In some embodiments, the user interface 7302 is configured to present visual media (e.g., text, graphics, etc.) without requiring a backlight.
The user interface 7302 is configured to display an arrow 7304 in some embodiments. The arrow 7304 can aid a user in navigating a building. For example, the arrow 7304 can be a direction (e.g., emergency direction, navigation direction, etc.) as described with reference to
For example, an emergency evacuation arrow may be colored red. However, if the control device 214 is installed on a red colored wall, the arrow 7304 may be difficult for a user to see. In this regard, the control device 214 can be configured to cause the user interface 7302 to display the arrow 7304 with the border 7306. The border 7306 may be black, red, yellow, orange, green, blue, etc., and/or any other color which helps the arrow 7304 stand out and be visible to a user. In some embodiments, a user programs, via the user interface 7302, a wall color and/or a color for the border 7306 in order for the control device 214 to appropriately generate the arrow 7304. In some embodiments, the control device 214 includes color sensors configured to determine a color of a wall that the control device 214 is installed on and automatically select the color for the border 7306 and generate the arrow 7304 with the selected border color. For example, if the wall which the control device 214 is located by is red, the arrow 7304 may be generated as a red arrow with a blue colored border 7306 to help the arrow 7304 stand out to a user.
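The automatic border-color selection described above can be sketched as picking the palette color with the greatest distance from the sensed wall color. This is an illustrative sketch only; the candidate palette, RGB values, and squared-distance metric are assumptions.

```python
# Hypothetical sketch of automatic border-color selection from a sensed wall
# color; the palette and the RGB distance metric are illustrative assumptions.
PALETTE = {"black": (0, 0, 0), "red": (255, 0, 0), "yellow": (255, 255, 0),
           "green": (0, 128, 0), "blue": (0, 0, 255), "orange": (255, 165, 0)}

def pick_border_color(wall_rgb):
    """Return the palette color farthest (in squared RGB distance) from the
    sensed wall color, so the border stands out against the wall."""
    def dist(color):
        return sum((a - b) ** 2 for a, b in zip(color, wall_rgb))
    return max(PALETTE, key=lambda name: dist(PALETTE[name]))
```

With this palette, a sensed red wall `(255, 0, 0)` yields a blue border, consistent with the example in the text.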
In addition to, or in place of, the navigation direction (e.g., the arrow 7304) displayed by the control device 214 on the user interface 7302, the user device can operate the halo 7300 to cause the control device 214 to communicate navigation directions and/or indications to a user. In some embodiments, in the event of an emergency, the halo, in part and/or in its entirety (e.g., one, multiple, or all of the LEDS lighting the halo 7300) can turn on causing the control device 214 to have an ambient halo light (e.g., a red light for an emergency). In some embodiments, the halo 7300 is operated by the control device 214 to communicate a navigation and/or emergency response direction to a user.
For example, if a user needs to make a right turn, the halo 7300 may operate such that the right side of the halo 7300 (e.g., as shown in
Referring now to
The control device 214 is configured to cause the user interface 7302 to display a map 7402, in some embodiments. The map 7402 may present multiple emergency response directions, e.g., directions helping a user navigate through and/or out of a building in the event of an emergency (e.g., an active shooter situation, a fire, etc.). The map 7402 can indicate the current location of a user, an indication of the control device 214 on the map, and a navigation path including one or multiple turns to evacuate a building. In some embodiments, the control device 214 causes the halo 7400 to operate to display a complementary indication to a user. For example, if the next turn on the map 7402 is a right turn, the halo 7400 can be operated to communicate a right turn to the user.
For example, the control device 214 is configured to cause LEDs illuminating the halo 7400 to operate in a pattern, e.g., a sweeping pattern from left to right. In some embodiments, rather than a sweep from left to right, the LEDs can be activated and held on one at a time at predefined intervals from left to right. In some embodiments, a particular set of LEDs can be operated as a blinker. For example, LEDs on a left side of the halo 7400 can be operated in a blinking mode to indicate a left turn while LEDs on the right side of the halo 7400 can be operated in a blinking mode to indicate a right turn.
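The blinker mode described above can be sketched as a simple function of elapsed time and the indicated turn direction. This is a hedged illustration; the blink period, 50% duty cycle, and function name are assumptions.

```python
def blinker_on(t, side, turn, period=1.0):
    """Whether the LEDs on a given side ('left' or 'right') are lit at time t
    when indicating a turn: the matching side blinks at the given period with
    an assumed 50% duty cycle, while the other side stays off."""
    if side != turn:
        return False
    return (t % period) < (period / 2.0)
```

The sweeping pattern could be built similarly by mapping elapsed time to an LED index instead of a single on/off state.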
Although the control device 214 described with reference to
In some embodiments, the control device 214 is located in a hotel room. When a user first enters the hotel room, the control device 214 may detect the presence of the user and activate the halo LEDs, illuminating the halos 7300 and/or 7400 to indicate that the user should approach the control device 214 to provide input to the control device 214. The control device 214 can present information, e.g., check-in and check-out information, facilitate a booking payment, request a wakeup time (and sound an alarm once the wakeup time occurs), prompt a user for preferred environmental settings (e.g., temperature setpoint), etc.
The control device 214 can activate halo LEDs to illuminate the halos 7300 and/or 7400 to provide alarm functionality. For example, a user may set an alarm time and/or date on the control device 214 via the user interface 7302. The control device 214 is configured to sound an alarm when the alarm time and/or date occurs. The alarm may be an audio-based alarm sounded via the speakers 710. Furthermore, the control device 214 can activate the halo LEDs to illuminate the halos 7300 and/or 7400 to awaken the user. The LEDs can be pulsed on and off at particular frequencies and/or ramp a light intensity of the LEDs up and/or down.
Furthermore, the control device 214 can be configured to integrate, via the network 602, with a television. The television may be a smart television configured to receive control input via the network 602. For example, the television may be connected to the Internet. An Internet server may store settings for the television and push settings to the television causing the television to implement the settings. Examples of settings may be volume, television channel, powering on or off (e.g., going from a low power state to a fully operational power state), etc.
In some embodiments, the control device 214 receives, via the microphone 726, audio commands (e.g., to turn volume up or down, change a channel up or down, pause a video being played on the television, play the video, fast forward the video, rewind the video, etc.). The control device 214 can process the audio data recorded, determine the command, and push the command to the Internet television server which can in turn cause the television to implement the command. In some embodiments, whenever the control device 214 is processing audio data and/or causing the television to implement a command based on the processed audio data, the control device 214 can operate LEDs of the halos 7300 and/or 7400. For example, when the control device 214 is listening to a user, the LEDs may be operated in a first pattern or in a first color. When the control device 214 is processing the audio data, the control device 214 can operate the LEDs in a second pattern and/or at a second color.
Referring now to
The halo LED system 7508 can be the same as and/or similar to the components of
The control device 214 is shown to receive both emergency and non-emergency data from one or multiple data streams via the network 602. The emergency and non-emergency data can be received from the building management system 610, the building emergency sensors 606, and/or the weather server 608 as described with reference to
The memory 742 is shown to include an emergency identifier 7500. The emergency identifier 7500 is configured, in some embodiments, to analyze data streams received from the network 602 to determine whether the data of one or more of the data streams indicates an emergency. For example, in some embodiments, the data of a particular data stream may be indicative of an emergency occurring, a type of emergency occurring, etc. In some embodiments, the data received via the network 602 is labeled as an emergency and the emergency identifier 7500 can identify that data as representing an emergency by identifying whether the label is present. In some embodiments, the emergency identifier 7500 itself analyzes values within the data to determine whether an emergency is present.
For example, if the data received from the network 602 is indicative of a particular ambient temperature, the emergency identifier 7500 can identify whether the particular temperature is indicative of a dangerously cold temperature (e.g., by comparing the temperature to a threshold value, e.g., by determining whether the temperature is less than the threshold value). Similarly, the emergency identifier 7500 is configured, in some embodiments, to determine whether wind speed data received from the weather server 608 indicates hurricane level winds (e.g., wind speed above a predefined amount).
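The value-based checks described above for emergency identifier 7500 can be sketched as a pair of threshold comparisons. This is an illustration only; the numeric thresholds, emergency type names, and function signature are assumptions.

```python
# Hedged sketch of the threshold comparisons described for emergency
# identifier 7500; the numeric thresholds below are illustrative assumptions.
DANGEROUS_COLD_F = 0.0       # assumed dangerously cold outdoor temperature
HURRICANE_WIND_MPH = 74.0    # assumed hurricane-level sustained wind speed

def identify_emergencies(temperature_f=None, wind_speed_mph=None):
    """Return a list of emergency types indicated by the supplied readings;
    a reading of None means no data arrived on that stream."""
    emergencies = []
    if temperature_f is not None and temperature_f < DANGEROUS_COLD_F:
        emergencies.append("dangerous_cold")
    if wind_speed_mph is not None and wind_speed_mph >= HURRICANE_WIND_MPH:
        emergencies.append("hurricane_winds")
    return emergencies
```

An empty returned list would indicate non-emergency data that may still be displayed as ordinary information.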
The memory 742 includes a display information controller 7502. The display information controller 7502 is configured to generate information for the halo controller 7504 and/or the user interface controller 7506 to display. For example, in some embodiments, the information may be indicative of the emergency and/or non-emergency data received from the network 602. For example, in some embodiments, if an outdoor ambient temperature is received from the network 602, the display information controller 7502 can communicate a value of the outdoor ambient temperature to the halo controller 7504 and/or the user interface controller 7506. The user interface controller 7506 can cause the user interface 7302 to display a numeric value (or other interface element) representing the ambient temperature. The halo controller 7504 is configured to cause the halo LED system 7508 to display an indication of the current temperature (e.g., illuminate in a particular color and/or with a particular intensity that is based on (e.g., is a function of) the temperature value). For example, the colors displayed by the halo LED system 7508 may be blue and red. The color displayed by the halo LED system 7508 may scale from blue to red as the temperature increases.
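The blue-to-red temperature scaling described above can be sketched as a linear blend between two RGB endpoints. This is an illustrative sketch; the endpoint temperatures and the linear blend are assumptions, since the disclosure only states that the color scales from blue to red as temperature increases.

```python
def temp_to_rgb(temp, cold=0.0, hot=100.0):
    """Linearly blend from blue (at the assumed 'cold' endpoint) to red (at
    the assumed 'hot' endpoint); temperatures outside the range are clamped."""
    x = max(0.0, min((temp - cold) / (hot - cold), 1.0))
    return (int(round(255 * x)), 0, int(round(255 * (1.0 - x))))
```

A non-linear blend (or a perceptual color space) could be substituted without changing the interface.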
The display information controller 7502 is configured, in some embodiments, to generate emergency response directions and cause the halo controller 7504 and/or the user interface controller 7506 to communicate the emergency response directions to a user. In some embodiments, the display information controller 7502 includes some and/or all of the operations of the display device 4300 for generating and displaying directions as described with reference to
In some embodiments, the emergency response directions are one or multiple instructions to navigate a building (e.g., evacuate a building), respond to an active shooter (e.g., fortify a room, turn lights off, hide under a desk, etc.), respond to a hurricane or tornado (e.g., close windows, close shutters, move away from windows, hide under desks or tables, etc.). The display information controller 7502 is configured to communicate the emergency response directions to the halo controller 7504 and/or the user interface controller 7506.
Furthermore, the display information controller 7502 is configured, in some embodiments, to override the current operation (e.g., display) of the halo LED system 7508 and/or the user interface 7302. For example, if the halo LED system 7508 and the user interface 7302 are currently displaying non-emergency information (e.g., information pertaining to normal weather, non-emergency building events, etc.) the display information controller 7502 can cause the halo controller 7504 and/or the user interface controller 7506 to override the display of information by the halo LED system 7508 and/or the user interface 7302 with the emergency response directions.
In some embodiments, in response to receiving emergency response directions, the halo controller 7504 can override a current operation of the halo LED system 7508. For example, the halo LED system 7508 may slowly blink (or linearly, exponentially, etc. vary intensity) at a particular color (e.g., green, blue, etc.) and/or turn on constantly at the particular color to indicate that a user has a message, notification, or otherwise that the control device 214 requires their input. However, if the display information controller 7502 provides emergency response directions to the halo controller 7504, the halo controller 7504 can override the operation of the halo LED system 7508 with the emergency response directions and/or an indication of an emergency. For example, the halo controller 7504 can cause the color of the halo LED system 7508 to change to another color indicative of an emergency (e.g., red, orange, etc.) and/or change from being constantly on (or off) to blinking at a particular frequency (e.g., every half second) to gain the attention of a user.
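The override behavior described above can be sketched as a simple priority rule: an emergency directive replaces whatever the halo is currently showing. This is an illustration only; the state fields and the fixed red 2 Hz emergency operation are assumptions.

```python
# Sketch of the halo controller's override rule; the state dictionary fields
# and the fixed emergency operation (red, 2 Hz blink) are assumptions.
def next_halo_state(current, emergency_directive=None):
    """Return the halo operation to apply: an emergency directive takes
    priority over the current (non-emergency) operation; otherwise the
    current operation is kept unchanged."""
    if emergency_directive is not None:
        return {"color": "red", "pattern": "blink_2hz", "source": emergency_directive}
    return current
```

A message-waiting state (e.g., a slow green blink) would thus persist only until an emergency directive arrives.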
The user interface controller 7506 can be configured to cause the user interface 7302 to display the emergency response directions and/or can be configured to override any other information displayed on the user interface 7302 in response to receiving an indication of an emergency from the display information controller 7502. For example, the user interface 7302 could display navigation instructions for a user to navigate to a particular conference room. The navigation instructions and/or a request for the instructions can be received via a data stream over the network 602. However, in response to determining that there is a weather emergency (e.g., tornado, flooding, earthquake, etc.), the user interface controller 7506 can override the display of the normal non-emergency building navigation directions and cause the user interface 7302 to display emergency response directions (e.g., a navigation arrow for evacuation, shooter response directions, etc.).
Referring now to
In step 7602, the control device 214 receives building information from one or more data sources. The data sources can be weather related data sources indicating weather conditions of cities, towns, states, countries, etc. and can be received from the weather server 608 via the network 602. In some embodiments, the data is social media data, e.g., trending posts, videos, etc., received from the social media servers 4011 via the network 602. Furthermore, the data can be indications of indoor temperatures, indoor air quality values (e.g., carbon monoxide), etc., received from the building emergency sensors 606.
In step 7604, the control device 214 can determine whether the building information received in the step 7602 is indicative of an emergency. For example, in some embodiments, the data received in the step 7602 is tagged as an emergency and/or a particular type of emergency. For example, weather data received via a weather data stream from the weather server 608 can indicate that a hurricane is present. Furthermore, an emergency pull handle (e.g., a building emergency sensor 606) can be triggered causing an indication of a fire or active shooter within a building to the control device 214.
In step 7606, in response to determining that a building emergency is occurring as determined in the step 7604, the control device 214 is configured to generate one or more emergency response directions. For example, the control device 214 can generate one or more directions for responding to an emergency, e.g., directions for navigating a building, directions for responding to an active shooter, a fire, etc. In step 7608, the control device 214 can display the emergency directions on the user interface 7302. In some embodiments, the directions are text based instructions “Close Windows,” “Hide Under Desk”, or are visual indications, e.g., arrows, maps, etc.
In step 7610, the control device 214 causes the halo LED system 7508 to operate to provide an indication of the emergency determined in the step 7604 to a user and/or provide an indication of the emergency response directions to a user. For example, the control device 214 could cause the halo LED system 7508 to illuminate (e.g., turn on constantly, blink at a particular frequency, etc.) in a particular color (e.g., red) to indicate that there is an emergency. In various embodiments, the halo LED system 7508 operates LEDs of the halo LED system 7508 to provide emergency navigation directions. For example, the halo LED system 7508 could be operated such that LEDs on a left side of the control device 214 blink to indicate to make a left turn down a hallway. Furthermore, the lights could turn on in a pattern from left to right to indicate the left turn.
Referring now to
In step 7702, the control device 214 can receive non-emergency data from a first data stream from at least one of the network or a sensor. The control device 214 can receive the non-emergency data from the network 602, e.g., from the building management system 610, from the building emergency sensors 606, from the weather server 608, and/or from the social media servers 4011. Furthermore, the non-emergency data can be received from a sensor of the control device 214 (e.g., a temperature sensor, a pressure sensor, a humidity sensor, etc.). In step 7704, based on the non-emergency data, the control device 214 can cause the user interface 7302 to display non-emergency information.
For example, the user interface 7302 could display temperatures, humidities, weather reports, social media events, scheduled building events, building notifications, news stories, etc. In step 7706, the control device 214 operates the halo LED system 7508 to display an indication of the non-emergency data. For example, if the data is new, the halo LED system 7508 may illuminate to notify a user that new information has been received. If the non-emergency data indicates ambient outdoor temperature, the halo LED system 7508 may illuminate in a color that is a function of the temperature (e.g., between blue and red to indicate cold or hot).
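The temperature-dependent halo color described above can be sketched as a linear blend from blue (cold) to red (hot). The function name and the range endpoints are illustrative assumptions:

```python
def temp_to_halo_color(temp_f, cold=0.0, hot=100.0):
    """Map a temperature to an (R, G, B) tuple blending blue -> red.

    Temperatures at or below `cold` yield pure blue; at or above `hot`,
    pure red; values in between are linearly interpolated.
    """
    # Normalize to [0, 1], clamping out-of-range temperatures.
    t = max(0.0, min(1.0, (temp_f - cold) / (hot - cold)))
    red = int(round(255 * t))
    blue = int(round(255 * (1.0 - t)))
    return (red, 0, blue)
```

For example, an outdoor reading of 0°F would drive the halo fully blue, while 100°F would drive it fully red.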
In step 7708, the control device 214 receives emergency data from a second data stream from at least one of the network and the sensor. The emergency data can be received from a second data stream and/or from the first data stream and can be used to override the display of the non-emergency information. For example, the non-emergency information could be received from the building management system 610 via a data stream of the building management system 610; however, based on receiving emergency data from a data stream of the weather server 608, the control device 214 can override the display of the information on the user interface 7302 and/or the halo LED system 7508.
In step 7710, the control device 214 can determine whether the emergency data received in step 7708 is indicative of an emergency. For example, the data received from the second data stream may be labeled as emergency or non-emergency data, and the control device 214 can identify whether the data of the second data stream is emergency data based on the label. In some embodiments, the control device 214 itself identifies whether the data of the second data stream is emergency data, e.g., by determining whether a wind speed is greater than a predefined amount, whether an outdoor temperature is lower than a predefined amount, whether a snowfall amount is greater than a predefined amount, etc.
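The two classification paths described above (trusting an explicit label when present, otherwise evaluating raw values against predefined thresholds) can be sketched as follows; the field names and threshold values are illustrative assumptions:

```python
# Assumed thresholds for the self-classification path; the source only
# says "a predefined amount" for each quantity.
WIND_SPEED_LIMIT_MPH = 60
LOW_TEMP_LIMIT_F = -10
SNOWFALL_LIMIT_IN = 12

def is_emergency(sample):
    """Classify one data-stream record as emergency or non-emergency."""
    # Path 1: the data stream labels its own records.
    if sample.get("label") in ("emergency", "non-emergency"):
        return sample["label"] == "emergency"
    # Path 2: the control device evaluates the raw values itself.
    return (sample.get("wind_speed_mph", 0) > WIND_SPEED_LIMIT_MPH
            or sample.get("outdoor_temp_f", 99) < LOW_TEMP_LIMIT_F
            or sample.get("snowfall_in", 0) > SNOWFALL_LIMIT_IN)
```

In this sketch an explicit label takes precedence over the threshold checks, reflecting the order in which the two paths are described above.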
In step 7712, the control device 214 determines an emergency response based on the emergency data. For example, if the emergency data indicates that there is a tornado, the control device 214 can generate route directions for navigating to a tornado shelter or safe area of a building. Furthermore, if there is an active shooter in the building, the control device 214 can generate emergency response directions which provide navigation to an area where the shooter is not present.
In step 7714, the control device 214 can override the display of the non-emergency information on the user interface 7302. For example, the control device 214 can cause the user interface 7302 to stop displaying the non-emergency information and begin displaying the emergency response directions. Similarly, the control device 214, in step 7716, can override the operation of the halo LED system 7508 to display the emergency response directions. The step of overriding and displaying the emergency response directions on the user interface 7302 and/or the halo LED system 7508 can be the same as and/or similar to steps 7608 and/or 7610 as described with reference to
Referring generally to
Referring to
In further embodiments, the floor illumination module of
Processing circuit 7800 may be configured to correlate occupancy sensor data from the occupancy sensor 7808, based on detection of the presence of one or more of motion, heat, sound, or light conditions in proximity to the thermostat 8000, to detect the approach of a user 7809; the occupancy sensor 7808 provides occupancy detection data as inputs to the processing circuit 7800. Processing circuit 7800 may be further configured to determine the existence of an occupancy condition based on change detection data inputs from occupancy sensor 7808. Processing circuit 7800 may be configured to determine the current time, times of one or more LED 7901a-7901j activations and deactivations, elapsed time of one or more LED 7901a-7901j activation periods, and elapsed time between one or more LED deactivations and subsequent activations.
Occupancy sensor 7808 may be configured to detect the approach of a user 7809 to the thermostat 8000 and/or a presence of the user within an area of the thermostat 8000 and provide occupancy data to the processing circuit 7800. In some embodiments, the processing circuit 7800 determines the approach of the user 7809 to the thermostat 8000 and/or a presence of the user within an area of the thermostat 8000 based upon the occupancy data provided by the occupancy sensor 7808. In some embodiments, the processing circuit 7800 determines the approach of the user 7809 to the thermostat 8000 and/or a presence of the user within an area of the thermostat 8000 based upon the data provided by another device (e.g., a smart phone), either without using occupancy data provided by the occupancy sensor 7808 or in combination with occupancy data provided by the occupancy sensor 7808.
Occupancy sensor 7808 may comprise one or more detectors of changes in one or more of motion, heat, sound, or light conditions in proximity to the thermostat 8000. Occupancy sensor 7808 and/or the processing circuit 7800 may be further configured to detect changes in one or more of motion, heat, sound, or light conditions in proximity to the thermostat 8000 that result from the approach of user 7809. Occupancy sensor 7808 may be further configured to transmit data to the processing circuit 7800 via an input interface 7807. The processing circuit 7800 can use historical data associated with levels or changes in one or more of motion, heat, sound, or light conditions in proximity to the thermostat 8000 to determine if the user is approaching or leaving the area associated with the thermostat 8000. The processing circuit 7800 can utilize other data to confirm occupancy. For example, the lighting can be disabled if geofencing data or a vacation mode for the thermostat 8000 indicates that a user is not on the premises.
The thermostat 8000 includes an ambient light sensor 7809 configured to detect ambient light levels in proximity to the thermostat 8000 and provide an output of ambient light level data to the processing circuit 7800. In some embodiments, the processing circuit 7800 only provides the light to the floor when the ambient light level is below a threshold and the user 7809 is in proximity of the thermostat 8000. In some embodiments, the processing circuit 7800 only provides the light to the floor when clock data indicates non-daylight hours. One or more of LEDs 7901a-7901j may be configured to emit light in one or more of a direction toward a floor of a building 10 beneath a thermostat 8000 or an area in proximity to the thermostat (e.g., the wall of a building 10). The light is emitted in response to a signal from the processing circuit 7800 via the output interface 7805. In some embodiments, an LED 7901a is disposed at a bottom edge of the housing 7900 to emit the light toward the floor in response to the occupant being in proximity of the thermostat 8000 or approaching the thermostat 8000.
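The gating conditions described above can be combined in a single decision: light the floor only when a user is nearby, the space is dark enough (or the clock indicates non-daylight hours), and geofencing or vacation-mode data does not contradict occupancy. All names, the lux threshold, and the daylight window below are illustrative assumptions:

```python
AMBIENT_LUX_THRESHOLD = 50  # assumed "dark enough" ambient light level

def floor_light_on(user_near, ambient_lux, hour,
                   on_premises=True, vacation_mode=False):
    """Decide whether the floor-illumination LEDs should be active."""
    # Occupancy contradicted by geofencing or vacation mode: stay off.
    if vacation_mode or not on_premises:
        return False
    # No approach or presence detected: stay off.
    if not user_near:
        return False
    dark = ambient_lux < AMBIENT_LUX_THRESHOLD
    # Assumed daylight window of 06:00-18:00 for the clock-based check.
    non_daylight = hour < 6 or hour >= 18
    return dark or non_daylight
```

For example, an approach detected at noon in a bright room keeps the light off, while the same approach at night turns it on.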
In some embodiments, LEDs 7901a-7901j may be configured to emit light in an area in proximity to the thermostat using the LED halo of display device 5300. In addition, a display associated with the thermostat 8000 may also illuminate and a message may be provided to the user 7809 in response to the user being in the proximity of or approaching the thermostat 8000.
In further embodiments, LEDs 7901a-7901j may be configured to emit light in the ultraviolet light spectrum at wavelengths known to kill or inactivate microorganisms on surface areas, wherein the processing circuit 7800 may determine conditions and periods for which LEDs 7901a-7901j are activated to kill or inactivate microorganisms on surface areas. Processing circuit 7800 may be further configured to activate LEDs 7901a-7901j to kill or inactivate microorganisms on surface areas during periods when occupancy conditions are not sensed by the occupancy sensor 7808. Processing circuit 7800 may be further configured to activate LEDs 7901a-7901j to kill or inactivate microorganisms on surface areas during periods of time determined by the processing circuit 7800. In further embodiments, LEDs 7901a-7901j may be configured to emit light in one or more light spectra comprising visible, infrared, or ultraviolet.
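The unoccupied-period gating for the germicidal mode described above can be sketched as a simple vacancy check; the minimum vacancy duration and function name are illustrative assumptions:

```python
# Assumed minimum vacancy (in seconds) before UV emission is permitted;
# the source only states "periods when occupancy conditions are not
# sensed".
UV_MIN_VACANT_SECONDS = 600

def uv_allowed(last_occupancy_time, now):
    """Permit UV emission only after the area has been vacant long enough.

    `last_occupancy_time` is the timestamp (seconds) of the most recent
    sensed occupancy condition, or None if none has been sensed.
    """
    if last_occupancy_time is None:
        return True  # no occupancy sensed since startup
    return (now - last_occupancy_time) >= UV_MIN_VACANT_SECONDS
```

A supervising loop would call this check before each UV activation and immediately deactivate the UV LEDs whenever a new occupancy condition is sensed.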
Processing circuit 7800 may be further configured to cause the one or more LEDs 7901a-7901e to emit the light towards the floor of a building 10 in response to the existence of an occupancy condition. Processing circuit 7800 may be configured to transmit activation signals to LEDs 7901a-7901j via output interface 7805. Processing circuit 7800 may be further configured to receive an ambient light level input from the ambient light sensor 7809 and inhibit activation of the one or more LEDs 7901a-7901j in response to the existence of an occupancy condition if the ambient light level exceeds an activation threshold. The processing circuit 7800 may be further configured to activate or inhibit activation of the one or more LEDs 7901a-7901j based on a determination that conditions in proximity to the thermostat 8000 satisfy or fail to satisfy parameters based on historic LED activation and inhibition data stored in memory module 7802.
Processing circuit 7800 may be further configured to deactivate one or more LEDs 7901a-7901j in response to the absence of one or more occupancy conditions. Processing circuit 7800 may be further configured to activate and deactivate one or more LEDs 7901a-7901j based on a determination of the existence or absence of one or more activation or deactivation conditions. Processing circuit 7800 may be further configured to activate and deactivate one or more LEDs 7901a-7901j based on a determination of the existence or absence of one or more activation or deactivation conditions based on current or elapsed time.
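The elapsed-time activation/deactivation behavior described above can be sketched as a small state machine: the LED turns on when occupancy is sensed and turns off after a hold period elapses without occupancy. The class name and hold duration are illustrative assumptions:

```python
HOLD_SECONDS = 120  # assumed hold time after the last sensed occupancy

class LedHoldTimer:
    """Track one LED's on/off state from occupancy events and elapsed time."""

    def __init__(self):
        self.on = False
        self.last_occupied = None  # timestamp of the last occupancy event

    def update(self, occupied, now):
        """Advance the state machine; returns the LED's new on/off state."""
        if occupied:
            # Activation condition: occupancy exists; restart the hold timer.
            self.on = True
            self.last_occupied = now
        elif (self.on and self.last_occupied is not None
                and now - self.last_occupied >= HOLD_SECONDS):
            # Deactivation condition: hold period elapsed with no occupancy.
            self.on = False
        return self.on
```

Timestamps of activations and deactivations could additionally be recorded to a memory module for the historical-data comparisons described above.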
Memory module 7802 may be configured to receive data from and provide data to processing module 7801. Memory module 7802 may be further configured to record and store one or more LED 7901a-7901j activations and deactivations as historic data. Memory module 7802 may be further configured to store occupancy condition data.
In some embodiments, the thermostat 8000 may be configured as a thermostat with an area light system and an occupancy sensor. The thermostat 8000 includes one or more LEDs 7901a-7901j configured to emit light in a direction toward a floor area beneath the thermostat. The thermostat 8000 is configured with a processing circuit 7800 configured to cause the one or more LEDs 7901a-7901j to emit the light towards one or more of the floor or areas proximate to the thermostat 8000 in response to an indication, using data from an occupancy sensor 7808, that a user has approached the thermostat 8000. In some embodiments, the thermostat 8000 is another type of building sensor, such as a room pressure sensor with a differential pressure environment sensor, a humidity sensor, or another environmental sensor with or without a display. The input interface 7807 can also include a network interface for receiving data from other equipment or data sources. Similar to the control device 214, the thermostat 8000 can receive emergency or alarm data and provide light in response to such data (e.g., follow processes 7600 and 7700 or other procedures discussed above). In some embodiments, the thermostat 8000 provides white light when an occupant is in the area of the thermostat and provides red light in response to an alarm condition.
Configuration of Exemplary Embodiments
The construction and arrangement of the systems and methods as shown in the various exemplary embodiments are illustrative only. Although only a few embodiments have been described in detail in this disclosure, many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.). For example, the position of elements may be reversed or otherwise varied and the nature or number of discrete elements or positions may be altered or varied. Accordingly, all such modifications are intended to be included within the scope of the present disclosure. The order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions and arrangement of the exemplary embodiments without departing from the scope of the present disclosure.
The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data that cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures show a specific order of method steps, the order of the steps may differ from what is depicted. Also, two or more steps may be performed concurrently or with partial concurrence. Such variation will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps and decision steps.
This application is a continuation-in-part of U.S. patent application Ser. No. 16/030,422, filed Jul. 9, 2018, which is a continuation-in-part of U.S. patent application Ser. No. 15/338,215, filed Oct. 28, 2016, now U.S. Pat. No. 10,020,956, granted Jul. 10, 2018. U.S. patent application Ser. No. 16/030,422 is also a continuation-in-part of U.S. patent application Ser. No. 15/338,221 filed Oct. 28, 2016, now U.S. Pat. No. 10,187,471, granted Jan. 22, 2019. U.S. patent application Ser. No. 16/030,422 is also a continuation-in-part of U.S. patent application Ser. No. 15/336,789, filed Oct. 28, 2016, now U.S. Pat. No. 10,345,781, granted Jul. 9, 2019, which claims the benefit of and priority to U.S. Provisional Patent Application No. 62/247,672, filed Oct. 28, 2015, U.S. Provisional Application No. 62/274,750, filed Jan. 4, 2016, U.S. Provisional Application No. 62/275,199, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,202, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,204, filed Jan. 5, 2016, and U.S. Provisional Application No. 62/275,711, filed Jan. 6, 2016. U.S. patent application Ser. No. 16/030,422 is also a continuation-in-part of U.S. patent application Ser. No. 15/397,722, filed Jan. 3, 2017, which is a continuation-in-part of U.S. patent application Ser. No. 15/336,791, filed Oct. 28, 2016, now U.S. Pat. No. 10,162,327, granted Dec. 25, 2018, which claims the benefit of and priority to U.S. Provisional Patent Application No. 62/247,672, filed Oct. 28, 2015, U.S. Provisional Application No. 62/274,750, filed Jan. 4, 2016, U.S. Provisional Application No. 62/275,199, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,202, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,204, filed Jan. 5, 2016, and U.S. Provisional Application No. 62/275,711, filed Jan. 6, 2016. U.S. patent application Ser. No. 16/030,422 is also a continuation-in-part of U.S. patent application Ser. No. 15/336,792, filed Oct. 28, 2016, now U.S. 
Pat. No. 10,180,673, granted Jan. 15, 2019, which claims the benefit of and priority to U.S. Provisional Patent Application No. 62/247,672, filed Oct. 28, 2015, U.S. Provisional Application No. 62/274,750, filed Jan. 4, 2016, U.S. Provisional Application No. 62/275,199, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,202, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,204, filed Jan. 5, 2016, and U.S. Provisional Application No. 62/275,711, filed Jan. 6, 2016. This application is also a continuation-in-part of U.S. patent application Ser. No. 16/246,366, filed Jan. 11, 2019, which claims the benefit of and priority to U.S. Provisional Application No. 62/783,580, filed Dec. 21, 2018. U.S. patent application Ser. No. 16/246,366 is also a continuation-in-part of U.S. patent application Ser. No. 15/338,221, filed Oct. 28, 2016, now U.S. Pat. No. 10,187,471, granted Jan. 22, 2019. U.S. patent application Ser. No. 16/246,366 is also a continuation-in-part of U.S. patent application Ser. No. 15/397,722, filed Jan. 3, 2017, which claims the benefit of and priority to U.S. Provisional Application No. 62/274,750, filed Jan. 4, 2016, U.S. Provisional Application No. 62/275,199, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,202, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,204, filed Jan. 5, 2016, and U.S. Provisional Application No. 62/275,711, filed Jan. 6, 2016. U.S. patent application Ser. No. 15/397,722 is also a continuation-in-part of U.S. patent application Ser. No. 15/336,791, filed Oct. 28, 2016, now U.S. Pat. No. 10,162,327, granted Dec. 25, 2018, which claims the benefit of and priority to U.S. Provisional Patent Application No. 62/247,672, filed Oct. 28, 2015, U.S. Provisional Application No. 62/274,750, filed Jan. 4, 2016, U.S. Provisional Application No. 62/275,199, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,202, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,204, filed Jan. 
5, 2016, and U.S. Provisional Application No. 62/275,711, filed Jan. 6, 2016. U.S. patent application Ser. No. 16/246,366 is also a continuation-in-part of U.S. patent application Ser. No. 15/336,789, filed Oct. 28, 2016, now U.S. Pat. No. 10,345,781, granted Jul. 9, 2019, which claims the benefit of and priority to U.S. Provisional Patent Application No. 62/247,672, filed Oct. 28, 2015, U.S. Provisional Application No. 62/274,750, filed Jan. 4, 2016, U.S. Provisional Application No. 62/275,199, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,202, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,204, filed Jan. 5, 2016, and U.S. Provisional Application No. 62/275,711, filed Jan. 6, 2016. U.S. patent application Ser. No. 16/246,366 is also a continuation-in-part of U.S. patent application Ser. No. 15/336,792, filed Oct. 28, 2016, now U.S. Pat. No. 10,180,673, granted Jan. 15, 2019, which claims the benefit of and priority to U.S. Provisional Patent Application No. 62/247,672, filed Oct. 28, 2015, U.S. Provisional Application No. 62/274,750, filed Jan. 4, 2016, U.S. Provisional Application No. 62/275,199, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,202, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,204, filed Jan. 5, 2016, and U.S. Provisional Application No. 62/275,711, filed Jan. 6, 2016. U.S. patent application Ser. No. 16/246,366 is also a continuation-in-part of U.S. patent application Ser. No. 15/336,793, filed Oct. 28, 2016, now U.S. Pat. No. 10,310,477, granted Jun. 4, 2019, which claims the benefit of and priority to U.S. Provisional Patent Application No. 62/247,672, filed Oct. 28, 2015, U.S. Provisional Application No. 62/274,750, filed Jan. 4, 2016, U.S. Provisional Application No. 62/275,199, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,202, filed Jan. 5, 2016, U.S. Provisional Application No. 62/275,204, filed Jan. 5, 2016, and U.S. Provisional Application No. 62/275,711, filed Jan. 
6, 2016. U.S. patent application Ser. No. 16/246,366 is also a continuation-in-part of U.S. patent application Ser. No. 16/030,422, filed Jul. 9, 2018, which is a continuation-in-part of U.S. patent application Ser. No. 15/336,789, filed Oct. 28, 2016, now U.S. Pat. No. 10,345,781, granted Jul. 9, 2019, U.S. patent application Ser. No. 15/336,792, filed Oct. 28, 2016, now U.S. Pat. No. 10,180,673, granted Jan. 15, 2019, U.S. patent application Ser. No. 15/338,215, filed Oct. 28, 2016, now U.S. Pat. No. 10,020,956, granted Jul. 10, 2018, U.S. patent application Ser. No. 15/338,221, filed Oct. 28, 2016, now U.S. Pat. No. 10,187,471, granted Jan. 22, 2019, and U.S. patent application Ser. No. 15/397,722, filed Jan. 3, 2017. The disclosures of each of these applications are hereby incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
4084438 | Lee et al. | Apr 1978 | A |
4107464 | Lynch et al. | Aug 1978 | A |
4942613 | Lynch | Jul 1990 | A |
5052186 | Dudley et al. | Oct 1991 | A |
5062276 | Dudley | Nov 1991 | A |
5797729 | Rafuse et al. | Aug 1998 | A |
6121885 | Masone et al. | Sep 2000 | A |
6164374 | Rhodes et al. | Dec 2000 | A |
6169937 | Peterson | Jan 2001 | B1 |
6193395 | Logan | Feb 2001 | B1 |
6227961 | Moore et al. | May 2001 | B1 |
6260765 | Natale et al. | Jul 2001 | B1 |
6314750 | Ishikawa et al. | Nov 2001 | B1 |
6351693 | Monie et al. | Feb 2002 | B1 |
6435418 | Toth et al. | Aug 2002 | B1 |
6487869 | Sulc et al. | Dec 2002 | B1 |
6557771 | Shah | May 2003 | B2 |
6641054 | Morey | Nov 2003 | B2 |
6693514 | Perea et al. | Feb 2004 | B2 |
6726112 | Ho | Apr 2004 | B1 |
6726113 | Guo | Apr 2004 | B2 |
6771172 | Robinson et al. | Aug 2004 | B1 |
6789429 | Pinto et al. | Sep 2004 | B2 |
6810307 | Addy | Oct 2004 | B1 |
6824069 | Rosen | Nov 2004 | B2 |
6827465 | Shemitz et al. | Dec 2004 | B2 |
6851621 | Wacker et al. | Feb 2005 | B1 |
6874691 | Hildebrand et al. | Apr 2005 | B1 |
6888441 | Carey | May 2005 | B2 |
6995518 | Havlik et al. | Feb 2006 | B2 |
7028912 | Rosen | Apr 2006 | B1 |
7083109 | Pouchak | Aug 2006 | B2 |
7099748 | Rayburn | Aug 2006 | B2 |
7140551 | De Pauw et al. | Nov 2006 | B2 |
7146253 | Hoog et al. | Dec 2006 | B2 |
7152806 | Rosen | Dec 2006 | B1 |
7156317 | Moore | Jan 2007 | B1 |
7156318 | Rosen | Jan 2007 | B1 |
7159789 | Schwendinger et al. | Jan 2007 | B2 |
7159790 | Schwendinger et al. | Jan 2007 | B2 |
7167079 | Smyth et al. | Jan 2007 | B2 |
7188002 | Chapman et al. | Mar 2007 | B2 |
7212887 | Shah et al. | May 2007 | B2 |
7225054 | Amundson et al. | May 2007 | B2 |
7232075 | Rosen | Jun 2007 | B1 |
7261243 | Butler et al. | Aug 2007 | B2 |
7274972 | Amundson et al. | Sep 2007 | B2 |
7287709 | Proffitt et al. | Oct 2007 | B2 |
7296426 | Butler et al. | Nov 2007 | B2 |
7299996 | Garrett et al. | Nov 2007 | B2 |
7306165 | Shah | Dec 2007 | B2 |
7308384 | Shah et al. | Dec 2007 | B2 |
7317970 | Pienta et al. | Jan 2008 | B2 |
7331187 | Kates | Feb 2008 | B2 |
7343751 | Kates | Mar 2008 | B2 |
7348925 | Noro et al. | Mar 2008 | B2 |
7383158 | Krocker et al. | Jun 2008 | B2 |
7402780 | Mueller et al. | Jul 2008 | B2 |
7434744 | Garozzo et al. | Oct 2008 | B2 |
7442012 | Moens | Oct 2008 | B2 |
7451917 | McCall et al. | Nov 2008 | B2 |
7469550 | Chapman et al. | Dec 2008 | B2 |
7475558 | Perry | Jan 2009 | B2 |
7475828 | Bartlett et al. | Jan 2009 | B2 |
7556207 | Mueller et al. | Jul 2009 | B2 |
7565813 | Pouchak | Jul 2009 | B2 |
7575179 | Morrow et al. | Aug 2009 | B2 |
7584897 | Schultz et al. | Sep 2009 | B2 |
7614567 | Chapman et al. | Nov 2009 | B2 |
7624931 | Chapman et al. | Dec 2009 | B2 |
7633743 | Barton et al. | Dec 2009 | B2 |
7636604 | Bergman et al. | Dec 2009 | B2 |
7638739 | Rhodes et al. | Dec 2009 | B2 |
7641126 | Schultz et al. | Jan 2010 | B2 |
7645158 | Mulhouse et al. | Jan 2010 | B2 |
7667163 | Ashworth et al. | Feb 2010 | B2 |
7726581 | Naujok et al. | Jun 2010 | B2 |
7731096 | Lorenz et al. | Jun 2010 | B2 |
7731098 | Butler et al. | Jun 2010 | B2 |
7740184 | Schnell et al. | Jun 2010 | B2 |
7748225 | Butler et al. | Jul 2010 | B2 |
7748639 | Perry | Jul 2010 | B2 |
7748640 | Roher et al. | Jul 2010 | B2 |
7755220 | Sorg et al. | Jul 2010 | B2 |
7765826 | Nichols | Aug 2010 | B2 |
7774102 | Butler et al. | Aug 2010 | B2 |
7775452 | Shah et al. | Aug 2010 | B2 |
7784291 | Butler et al. | Aug 2010 | B2 |
7784704 | Harter | Aug 2010 | B2 |
7802618 | Simon et al. | Sep 2010 | B2 |
7832221 | Wijaya et al. | Nov 2010 | B2 |
7832652 | Barton et al. | Nov 2010 | B2 |
7845576 | Siddaramanna et al. | Dec 2010 | B2 |
7861941 | Schultz et al. | Jan 2011 | B2 |
7867646 | Rhodes | Jan 2011 | B2 |
7908116 | Steinberg et al. | Mar 2011 | B2 |
7908117 | Steinberg et al. | Mar 2011 | B2 |
7918406 | Rosen | Apr 2011 | B2 |
7938336 | Rhodes et al. | May 2011 | B2 |
7941294 | Shahi et al. | May 2011 | B2 |
7954726 | Siddaramanna et al. | Jun 2011 | B2 |
7963454 | Sullivan et al. | Jun 2011 | B2 |
7979164 | Garozzo et al. | Jul 2011 | B2 |
7992794 | Leen et al. | Aug 2011 | B2 |
8010237 | Cheung et al. | Aug 2011 | B2 |
8032254 | Amundson et al. | Oct 2011 | B2 |
8078326 | Harrod et al. | Dec 2011 | B2 |
8082065 | Imes et al. | Dec 2011 | B2 |
8083154 | Schultz et al. | Dec 2011 | B2 |
8089032 | Beland et al. | Jan 2012 | B2 |
8091794 | Siddaramanna et al. | Jan 2012 | B2 |
8099195 | Imes et al. | Jan 2012 | B2 |
8108076 | Imes et al. | Jan 2012 | B2 |
8131506 | Steinberg et al. | Mar 2012 | B2 |
8141791 | Rosen | Mar 2012 | B2 |
8167216 | Schultz et al. | May 2012 | B2 |
8180492 | Steinberg | May 2012 | B2 |
8182106 | Shin et al. | May 2012 | B2 |
8190296 | Alhilo | May 2012 | B2 |
8195313 | Fadell et al. | Jun 2012 | B1 |
8196185 | Geadelmann et al. | Jun 2012 | B2 |
8209059 | Stockton | Jun 2012 | B2 |
8239066 | Jennings et al. | Aug 2012 | B2 |
8275918 | Bourbeau et al. | Sep 2012 | B2 |
8276829 | Stoner et al. | Oct 2012 | B2 |
8280536 | Fadell et al. | Oct 2012 | B1 |
8289182 | Vogel et al. | Oct 2012 | B2 |
8289226 | Takach et al. | Oct 2012 | B2 |
8299919 | Dayton et al. | Oct 2012 | B2 |
8321058 | Zhou et al. | Nov 2012 | B2 |
8346396 | Amundson et al. | Jan 2013 | B2 |
8387891 | Simon et al. | Mar 2013 | B1 |
8393550 | Simon et al. | Mar 2013 | B2 |
8412488 | Steinberg et al. | Apr 2013 | B2 |
8416084 | Beltmann et al. | Apr 2013 | B2 |
8419236 | Fisher et al. | Apr 2013 | B2 |
8429566 | Koushik et al. | Apr 2013 | B2 |
8456293 | Trundle et al. | Jun 2013 | B1 |
8473109 | Imes et al. | Jun 2013 | B1 |
8476964 | Atri | Jul 2013 | B1 |
8489243 | Fadell et al. | Jul 2013 | B2 |
8504180 | Imes et al. | Aug 2013 | B2 |
8510255 | Fadell et al. | Aug 2013 | B2 |
8511576 | Warren et al. | Aug 2013 | B2 |
8511577 | Warren et al. | Aug 2013 | B2 |
8517088 | Moore et al. | Aug 2013 | B2 |
8523083 | Warren et al. | Sep 2013 | B2 |
8523084 | Siddaramanna et al. | Sep 2013 | B2 |
8527096 | Pavlak et al. | Sep 2013 | B2 |
8532827 | Stefanski et al. | Sep 2013 | B2 |
8544285 | Stefanski et al. | Oct 2013 | B2 |
8549658 | Kolavennu et al. | Oct 2013 | B2 |
8550368 | Butler et al. | Oct 2013 | B2 |
8554374 | Lunacek et al. | Oct 2013 | B2 |
8555662 | Peterson et al. | Oct 2013 | B2 |
8558179 | Filson et al. | Oct 2013 | B2 |
8560127 | Leen et al. | Oct 2013 | B2 |
8560128 | Ruff et al. | Oct 2013 | B2 |
8571518 | Imes et al. | Oct 2013 | B2 |
8596550 | Steinberg et al. | Dec 2013 | B2 |
8600564 | Imes et al. | Dec 2013 | B2 |
8606409 | Amundson et al. | Dec 2013 | B2 |
8613792 | Ragland et al. | Dec 2013 | B2 |
8620841 | Filson et al. | Dec 2013 | B1 |
8622314 | Fisher et al. | Jan 2014 | B2 |
8626344 | Imes et al. | Jan 2014 | B2 |
8630741 | Matsuoka et al. | Jan 2014 | B1 |
8630742 | Stefanski et al. | Jan 2014 | B1 |
8644009 | Rylski et al. | Feb 2014 | B2 |
8659302 | Warren et al. | Feb 2014 | B1 |
8671702 | Shotey et al. | Mar 2014 | B1 |
8674816 | Trundle et al. | Mar 2014 | B2 |
8689572 | Evans et al. | Apr 2014 | B2 |
8695887 | Helt et al. | Apr 2014 | B2 |
8706270 | Fadell et al. | Apr 2014 | B2 |
8708242 | Conner et al. | Apr 2014 | B2 |
8712590 | Steinberg | Apr 2014 | B2 |
8718826 | Ramachandran et al. | May 2014 | B2 |
8726680 | Schenk et al. | May 2014 | B2 |
8727611 | Huppi et al. | May 2014 | B2 |
8738327 | Steinberg et al. | May 2014 | B2 |
8746583 | Simon et al. | Jun 2014 | B2 |
8752771 | Warren et al. | Jun 2014 | B2 |
8754780 | Petite et al. | Jun 2014 | B2 |
8766194 | Filson et al. | Jul 2014 | B2 |
8770490 | Drew | Jul 2014 | B2 |
8770491 | Warren et al. | Jul 2014 | B2 |
8788103 | Warren et al. | Jul 2014 | B2 |
8802981 | Wallaert et al. | Aug 2014 | B2 |
8830267 | Brackney | Sep 2014 | B2 |
8838282 | Ratliff et al. | Sep 2014 | B1 |
8843239 | Mighdoll et al. | Sep 2014 | B2 |
8850348 | Fadell et al. | Sep 2014 | B2 |
8855830 | Imes et al. | Oct 2014 | B2 |
8868219 | Fadell et al. | Oct 2014 | B2 |
8870086 | Tessier et al. | Oct 2014 | B2 |
8870087 | Pienta et al. | Oct 2014 | B2 |
8880047 | Konicek et al. | Nov 2014 | B2 |
8893032 | Bruck et al. | Nov 2014 | B2 |
8893555 | Bourbeau et al. | Nov 2014 | B2 |
8903552 | Amundson et al. | Dec 2014 | B2 |
8918219 | Sloo et al. | Dec 2014 | B2 |
8942853 | Stefanski et al. | Jan 2015 | B2 |
8944338 | Warren et al. | Feb 2015 | B2 |
8950686 | Matsuoka et al. | Feb 2015 | B2 |
8950687 | Bergman et al. | Feb 2015 | B2 |
8961005 | Huppi et al. | Feb 2015 | B2 |
8978994 | Moore et al. | Mar 2015 | B2 |
8998102 | Fadell et al. | Apr 2015 | B2 |
9014686 | Ramachandran et al. | Apr 2015 | B2 |
9014860 | Moore et al. | Apr 2015 | B2 |
9020647 | Johnson et al. | Apr 2015 | B2 |
9026232 | Fadell et al. | May 2015 | B2 |
9033255 | Tessier et al. | May 2015 | B2 |
RE45574 | Harter | Jun 2015 | E |
9074784 | Sullivan et al. | Jul 2015 | B2 |
9075419 | Sloo et al. | Jul 2015 | B2 |
9077055 | Yau | Jul 2015 | B2 |
9080782 | Sheikh | Jul 2015 | B1 |
9081393 | Lunacek et al. | Jul 2015 | B2 |
9086703 | Warren et al. | Jul 2015 | B2 |
9088306 | Ramachandran et al. | Jul 2015 | B1 |
9092039 | Fadell et al. | Jul 2015 | B2 |
9098279 | Mucignat et al. | Aug 2015 | B2 |
9113156 | Sugiyama et al. | Aug 2015 | B2 |
9116529 | Warren et al. | Aug 2015 | B2 |
9121623 | Filson et al. | Sep 2015 | B2 |
9122283 | Rylski et al. | Sep 2015 | B2 |
9125049 | Huang et al. | Sep 2015 | B2 |
9127853 | Filson et al. | Sep 2015 | B2 |
9131904 | Qualey et al. | Sep 2015 | B2 |
9134710 | Cheung et al. | Sep 2015 | B2 |
9134715 | Geadelmann et al. | Sep 2015 | B2 |
9146041 | Novotny et al. | Sep 2015 | B2 |
9151510 | Leen | Oct 2015 | B2 |
9154001 | Dharwada et al. | Oct 2015 | B2 |
9157764 | Shetty et al. | Oct 2015 | B2 |
9164524 | Imes et al. | Oct 2015 | B2 |
9175868 | Fadell et al. | Nov 2015 | B2 |
9175871 | Gourlay et al. | Nov 2015 | B2 |
9182141 | Sullivan et al. | Nov 2015 | B2 |
9189751 | Matsuoka et al. | Nov 2015 | B2 |
9191277 | Rezvani et al. | Nov 2015 | B2 |
9191909 | Rezvani et al. | Nov 2015 | B2 |
9194597 | Steinberg et al. | Nov 2015 | B2 |
9194598 | Fadell et al. | Nov 2015 | B2 |
9194600 | Kates | Nov 2015 | B2 |
9207817 | Tu | Dec 2015 | B2 |
9213342 | Drake et al. | Dec 2015 | B2 |
9215281 | Iggulden et al. | Dec 2015 | B2 |
9222693 | Gourlay et al. | Dec 2015 | B2 |
9223323 | Matas et al. | Dec 2015 | B2 |
9234669 | Filson et al. | Jan 2016 | B2 |
9244445 | Finch et al. | Jan 2016 | B2 |
9244470 | Steinberg | Jan 2016 | B2 |
9261287 | Warren et al. | Feb 2016 | B2 |
9268344 | Warren et al. | Feb 2016 | B2 |
9279595 | Mighdoll et al. | Mar 2016 | B2 |
9282590 | Donlan | Mar 2016 | B2 |
9285134 | Bray et al. | Mar 2016 | B2 |
9285802 | Arensmeier | Mar 2016 | B2 |
9286781 | Filson et al. | Mar 2016 | B2 |
9291359 | Fadell et al. | Mar 2016 | B2 |
9292022 | Ramachandran et al. | Mar 2016 | B2 |
9298196 | Matsuoka et al. | Mar 2016 | B2 |
9298197 | Matsuoka et al. | Mar 2016 | B2 |
9304366 | Kubo | Apr 2016 | B2 |
D763707 | Sinha et al. | Aug 2016 | S |
9453340 | Van Herpen et al. | Sep 2016 | B2 |
9618185 | Ricci et al. | Apr 2017 | B2 |
D790369 | Sinha et al. | Jun 2017 | S |
9951968 | Novotny et al. | Apr 2018 | B2 |
10001790 | Oh et al. | Jun 2018 | B2 |
10119712 | Grosshart et al. | Nov 2018 | B2 |
20010015281 | Schiedegger et al. | Aug 2001 | A1 |
20030034897 | Shamoon et al. | Feb 2003 | A1 |
20030034898 | Shamoon et al. | Feb 2003 | A1 |
20030079387 | Derose | May 2003 | A1 |
20030136853 | Morey | Jul 2003 | A1 |
20030177012 | Drennan | Sep 2003 | A1 |
20040074978 | Rosen | Apr 2004 | A1 |
20040125940 | Turcan et al. | Jul 2004 | A1 |
20040249479 | Shorrock | Dec 2004 | A1 |
20040262410 | Hull | Dec 2004 | A1 |
20050012633 | Yoon | Jan 2005 | A1 |
20050040943 | Winick | Feb 2005 | A1 |
20050083168 | Breitenbach | Apr 2005 | A1 |
20050119794 | Amundson et al. | Jun 2005 | A1 |
20050156049 | Van Ostrand et al. | Jul 2005 | A1 |
20050194456 | Tessier et al. | Sep 2005 | A1 |
20050195757 | Kidder et al. | Sep 2005 | A1 |
20050219860 | Schexnaider | Oct 2005 | A1 |
20050270151 | Winick | Dec 2005 | A1 |
20050270735 | Chen | Dec 2005 | A1 |
20060038025 | Lee | Feb 2006 | A1 |
20060113398 | Ashworth | Jun 2006 | A1 |
20060192022 | Barton et al. | Aug 2006 | A1 |
20060226970 | Saga et al. | Oct 2006 | A1 |
20060260334 | Carey et al. | Nov 2006 | A1 |
20070013532 | Ehlers | Jan 2007 | A1 |
20070045431 | Chapman et al. | Mar 2007 | A1 |
20070050732 | Chapman et al. | Mar 2007 | A1 |
20070057079 | Stark et al. | Mar 2007 | A1 |
20070114295 | Jenkins | May 2007 | A1 |
20070121334 | Bourdin et al. | May 2007 | A1 |
20070138496 | Zhao et al. | Jun 2007 | A1 |
20070198099 | Shah | Aug 2007 | A9 |
20070228182 | Wagner et al. | Oct 2007 | A1 |
20070228183 | Kennedy et al. | Oct 2007 | A1 |
20070241203 | Wagner et al. | Oct 2007 | A1 |
20080048046 | Wagner et al. | Feb 2008 | A1 |
20080054084 | Olson | Mar 2008 | A1 |
20080099568 | Nicodem et al. | May 2008 | A1 |
20080120446 | Butler et al. | May 2008 | A1 |
20080161978 | Shah | Jul 2008 | A1 |
20080216495 | Kates | Sep 2008 | A1 |
20080223051 | Kates | Sep 2008 | A1 |
20080227430 | Polk | Sep 2008 | A1 |
20080280637 | Shaffer et al. | Nov 2008 | A1 |
20080289347 | Kadle et al. | Nov 2008 | A1 |
20080290183 | Laberge et al. | Nov 2008 | A1 |
20080294274 | Laberge et al. | Nov 2008 | A1 |
20080295030 | Laberge et al. | Nov 2008 | A1 |
20090140065 | Juntunen et al. | Jun 2009 | A1 |
20090143880 | Amundson et al. | Jun 2009 | A1 |
20090143918 | Amundson et al. | Jun 2009 | A1 |
20090144015 | Bedard | Jun 2009 | A1 |
20090251422 | Wu et al. | Oct 2009 | A1 |
20090276096 | Proffitt et al. | Nov 2009 | A1 |
20100070092 | Winter et al. | Mar 2010 | A1 |
20100084482 | Kennedy et al. | Apr 2010 | A1 |
20100131884 | Shah | May 2010 | A1 |
20100163633 | Barrett et al. | Jul 2010 | A1 |
20100163635 | Ye | Jul 2010 | A1 |
20100171889 | Pantel et al. | Jul 2010 | A1 |
20100182743 | Roher | Jul 2010 | A1 |
20100190479 | Scott et al. | Jul 2010 | A1 |
20100204834 | Comerford et al. | Aug 2010 | A1 |
20100212198 | Matsunaga et al. | Aug 2010 | A1 |
20100212879 | Schnell et al. | Aug 2010 | A1 |
20100250707 | Dalley et al. | Sep 2010 | A1 |
20100327766 | Recker et al. | Dec 2010 | A1 |
20110006887 | Shaull et al. | Jan 2011 | A1 |
20110067851 | Terlson et al. | Mar 2011 | A1 |
20110088416 | Koethler | Apr 2011 | A1 |
20110132991 | Moody et al. | Jun 2011 | A1 |
20110133655 | Recker et al. | Jun 2011 | A1 |
20110181412 | Alexander et al. | Jul 2011 | A1 |
20110225859 | Safavi | Sep 2011 | A1 |
20110254450 | Bergholz et al. | Oct 2011 | A1 |
20110264279 | Poth | Oct 2011 | A1 |
20120001837 | Yamayoshi | Jan 2012 | A1 |
20120001873 | Wu et al. | Jan 2012 | A1 |
20120007555 | Bukow | Jan 2012 | A1 |
20120007804 | Morrison et al. | Jan 2012 | A1 |
20120048955 | Lin et al. | Mar 2012 | A1 |
20120061480 | Deligiannis et al. | Mar 2012 | A1 |
20120093141 | Imes et al. | Apr 2012 | A1 |
20120095601 | Abraham et al. | Apr 2012 | A1 |
20120101637 | Imes et al. | Apr 2012 | A1 |
20120126020 | Filson et al. | May 2012 | A1 |
20120126021 | Warren et al. | May 2012 | A1 |
20120131504 | Fadell et al. | May 2012 | A1 |
20120165993 | Whitehouse | Jun 2012 | A1 |
20120179727 | Esser | Jul 2012 | A1 |
20120181010 | Schultz et al. | Jul 2012 | A1 |
20120191257 | Corcoran et al. | Jul 2012 | A1 |
20120193437 | Henry et al. | Aug 2012 | A1 |
20120229521 | Hales et al. | Sep 2012 | A1 |
20120230661 | Alhilo | Sep 2012 | A1 |
20120239207 | Fadell et al. | Sep 2012 | A1 |
20120252430 | Imes et al. | Oct 2012 | A1 |
20120259470 | Nijhawan et al. | Oct 2012 | A1 |
20120298763 | Young | Nov 2012 | A1 |
20120303165 | Qu et al. | Nov 2012 | A1 |
20120303828 | Young et al. | Nov 2012 | A1 |
20120310418 | Harrod et al. | Dec 2012 | A1 |
20120315848 | Smith et al. | Dec 2012 | A1 |
20130002447 | Vogel et al. | Jan 2013 | A1 |
20130054758 | Imes et al. | Feb 2013 | A1 |
20130057381 | Kandhasamy | Mar 2013 | A1 |
20130087628 | Nelson et al. | Apr 2013 | A1 |
20130090767 | Bruck et al. | Apr 2013 | A1 |
20130099008 | Aljabari et al. | Apr 2013 | A1 |
20130099009 | Filson et al. | Apr 2013 | A1 |
20130123991 | Richmond | May 2013 | A1 |
20130138250 | Mowery et al. | May 2013 | A1 |
20130144443 | Casson et al. | Jun 2013 | A1 |
20130151016 | Bias et al. | Jun 2013 | A1 |
20130151018 | Bias et al. | Jun 2013 | A1 |
20130158721 | Somasundaram et al. | Jun 2013 | A1 |
20130163300 | Zhao et al. | Jun 2013 | A1 |
20130180700 | Aycock | Jul 2013 | A1 |
20130190932 | Schuman | Jul 2013 | A1 |
20130190940 | Sloop et al. | Jul 2013 | A1 |
20130204408 | Thiruvengada et al. | Aug 2013 | A1 |
20130204441 | Sloo et al. | Aug 2013 | A1 |
20130204442 | Modi et al. | Aug 2013 | A1 |
20130211600 | Dean-Hendricks et al. | Aug 2013 | A1 |
20130215058 | Brazell et al. | Aug 2013 | A1 |
20130221117 | Warren et al. | Aug 2013 | A1 |
20130228633 | Toth et al. | Sep 2013 | A1 |
20130234840 | Trundle et al. | Sep 2013 | A1 |
20130238142 | Nichols et al. | Sep 2013 | A1 |
20130245838 | Zywicki et al. | Sep 2013 | A1 |
20130261803 | Kolavennu | Oct 2013 | A1 |
20130261807 | Zywicki et al. | Oct 2013 | A1 |
20130268129 | Fadell et al. | Oct 2013 | A1 |
20130271670 | Sakata et al. | Oct 2013 | A1 |
20130292481 | Filson et al. | Nov 2013 | A1 |
20130297078 | Kolavennu | Nov 2013 | A1 |
20130318217 | Imes et al. | Nov 2013 | A1 |
20130318444 | Imes et al. | Nov 2013 | A1 |
20130325190 | Imes et al. | Dec 2013 | A1 |
20130338837 | Hublou et al. | Dec 2013 | A1 |
20130338839 | Rogers et al. | Dec 2013 | A1 |
20130340993 | Siddaramanna et al. | Dec 2013 | A1 |
20130345882 | Dushane et al. | Dec 2013 | A1 |
20140000861 | Barrett et al. | Jan 2014 | A1 |
20140002461 | Wang | Jan 2014 | A1 |
20140031989 | Bergman et al. | Jan 2014 | A1 |
20140034284 | Butler et al. | Feb 2014 | A1 |
20140039692 | Leen et al. | Feb 2014 | A1 |
20140041846 | Leen et al. | Feb 2014 | A1 |
20140048608 | Frank | Feb 2014 | A1 |
20140052300 | Matsuoka et al. | Feb 2014 | A1 |
20140058806 | Guenette et al. | Feb 2014 | A1 |
20140070919 | Jackson et al. | Mar 2014 | A1 |
20140081466 | Huapeng et al. | Mar 2014 | A1 |
20140112331 | Rosen | Apr 2014 | A1 |
20140114706 | Blakely | Apr 2014 | A1 |
20140117103 | Rossi et al. | May 2014 | A1 |
20140118285 | Poplawski | May 2014 | A1 |
20140129034 | Stefanski et al. | May 2014 | A1 |
20140149270 | Lombard et al. | May 2014 | A1 |
20140151456 | McCurnin et al. | Jun 2014 | A1 |
20140152631 | Moore et al. | Jun 2014 | A1 |
20140156087 | Amundson | Jun 2014 | A1 |
20140158338 | Kates | Jun 2014 | A1 |
20140165612 | Qu et al. | Jun 2014 | A1 |
20140175181 | Warren et al. | Jun 2014 | A1 |
20140188288 | Fisher et al. | Jul 2014 | A1 |
20140191848 | Imes et al. | Jul 2014 | A1 |
20140207291 | Golden et al. | Jul 2014 | A1 |
20140207292 | Ramagem et al. | Jul 2014 | A1 |
20140214212 | Leen et al. | Jul 2014 | A1 |
20140216078 | Ladd | Aug 2014 | A1 |
20140217185 | Bicknell | Aug 2014 | A1 |
20140217186 | Kramer et al. | Aug 2014 | A1 |
20140228983 | Groskreutz et al. | Aug 2014 | A1 |
20140231530 | Warren et al. | Aug 2014 | A1 |
20140244047 | Oh et al. | Aug 2014 | A1 |
20140250399 | Gaherwar | Sep 2014 | A1 |
20140262196 | Frank et al. | Sep 2014 | A1 |
20140262484 | Khoury et al. | Sep 2014 | A1 |
20140263679 | Conner et al. | Sep 2014 | A1 |
20140267008 | Jain et al. | Sep 2014 | A1 |
20140277762 | Drew | Sep 2014 | A1 |
20140277769 | Matsuoka et al. | Sep 2014 | A1 |
20140277770 | Aljabari et al. | Sep 2014 | A1 |
20140299670 | Ramachandran et al. | Oct 2014 | A1 |
20140309792 | Drew | Oct 2014 | A1 |
20140312129 | Zikes et al. | Oct 2014 | A1 |
20140312131 | Tousignant et al. | Oct 2014 | A1 |
20140312694 | Tu et al. | Oct 2014 | A1 |
20140316585 | Boesveld et al. | Oct 2014 | A1 |
20140316586 | Boesveld et al. | Oct 2014 | A1 |
20140316587 | Imes et al. | Oct 2014 | A1 |
20140317029 | Matsuoka et al. | Oct 2014 | A1 |
20140319231 | Matsuoka et al. | Oct 2014 | A1 |
20140319236 | Novotny et al. | Oct 2014 | A1 |
20140320282 | Zhang | Oct 2014 | A1 |
20140321011 | Bisson et al. | Oct 2014 | A1 |
20140324232 | Modi et al. | Oct 2014 | A1 |
20140330435 | Stoner et al. | Nov 2014 | A1 |
20140346239 | Fadell et al. | Nov 2014 | A1 |
20140358295 | Warren et al. | Dec 2014 | A1 |
20140367475 | Fadell et al. | Dec 2014 | A1 |
20140376530 | Erickson et al. | Dec 2014 | A1 |
20150001361 | Gagne et al. | Jan 2015 | A1 |
20150002165 | Juntunen et al. | Jan 2015 | A1 |
20150016443 | Erickson et al. | Jan 2015 | A1 |
20150025693 | Wu et al. | Jan 2015 | A1 |
20150039137 | Perry et al. | Feb 2015 | A1 |
20150041551 | Tessier et al. | Feb 2015 | A1 |
20150043615 | Steinberg et al. | Feb 2015 | A1 |
20150053779 | Adamek et al. | Feb 2015 | A1 |
20150053780 | Nelson et al. | Feb 2015 | A1 |
20150053781 | Nelson et al. | Feb 2015 | A1 |
20150058779 | Bruck et al. | Feb 2015 | A1 |
20150061859 | Matsuoka et al. | Mar 2015 | A1 |
20150066215 | Buduri | Mar 2015 | A1 |
20150066216 | Ramachandran | Mar 2015 | A1 |
20150066220 | Sloo et al. | Mar 2015 | A1 |
20150081106 | Buduri | Mar 2015 | A1 |
20150081109 | Fadell et al. | Mar 2015 | A1 |
20150081568 | Land, III | Mar 2015 | A1 |
20150088272 | Drew | Mar 2015 | A1 |
20150088318 | Amundson et al. | Mar 2015 | A1 |
20150096876 | Mittleman | Apr 2015 | A1 |
20150100166 | Baynes et al. | Apr 2015 | A1 |
20150100167 | Sloo et al. | Apr 2015 | A1 |
20150115045 | Tu et al. | Apr 2015 | A1 |
20150115046 | Warren et al. | Apr 2015 | A1 |
20150124853 | Huppi et al. | May 2015 | A1 |
20150127176 | Bergman et al. | May 2015 | A1 |
20150140994 | Partheesh et al. | May 2015 | A1 |
20150142180 | Matsuoka et al. | May 2015 | A1 |
20150144706 | Robideau et al. | May 2015 | A1 |
20150145653 | Katingari et al. | May 2015 | A1 |
20150148963 | Klein et al. | May 2015 | A1 |
20150153057 | Matsuoka et al. | Jun 2015 | A1 |
20150153060 | Stefanski et al. | Jun 2015 | A1 |
20150156631 | Ramachandran | Jun 2015 | A1 |
20150159893 | Daubman et al. | Jun 2015 | A1 |
20150159899 | Bergman et al. | Jun 2015 | A1 |
20150159902 | Quam et al. | Jun 2015 | A1 |
20150159903 | Marak et al. | Jun 2015 | A1 |
20150159904 | Barton | Jun 2015 | A1 |
20150160691 | Kadah et al. | Jun 2015 | A1 |
20150163945 | Barton et al. | Jun 2015 | A1 |
20150167995 | Fadell et al. | Jun 2015 | A1 |
20150168002 | Plitkins et al. | Jun 2015 | A1 |
20150168003 | Stefanski et al. | Jun 2015 | A1 |
20150168933 | Klein et al. | Jun 2015 | A1 |
20150176854 | Butler et al. | Jun 2015 | A1 |
20150176855 | Geadelmann et al. | Jun 2015 | A1 |
20150198346 | Vedpathak | Jul 2015 | A1 |
20150198347 | Tessier et al. | Jul 2015 | A1 |
20150204558 | Sartain et al. | Jul 2015 | A1 |
20150204561 | Sadwick et al. | Jul 2015 | A1 |
20150204563 | Imes et al. | Jul 2015 | A1 |
20150204564 | Shah | Jul 2015 | A1 |
20150204565 | Amundson et al. | Jul 2015 | A1 |
20150204569 | Lorenz et al. | Jul 2015 | A1 |
20150204570 | Adamik et al. | Jul 2015 | A1 |
20150205310 | Amundson et al. | Jul 2015 | A1 |
20150219357 | Stefanski et al. | Aug 2015 | A1 |
20150233594 | Abe et al. | Aug 2015 | A1 |
20150233595 | Fadell et al. | Aug 2015 | A1 |
20150233596 | Warren et al. | Aug 2015 | A1 |
20150234369 | Wen et al. | Aug 2015 | A1 |
20150241078 | Matsuoka et al. | Aug 2015 | A1 |
20150245189 | Nalluri et al. | Aug 2015 | A1 |
20150248118 | Li et al. | Sep 2015 | A1 |
20150249605 | Erickson et al. | Sep 2015 | A1 |
20150260424 | Fadell et al. | Sep 2015 | A1 |
20150267935 | Devenish et al. | Sep 2015 | A1 |
20150268652 | Lunacek et al. | Sep 2015 | A1 |
20150276237 | Daniels et al. | Oct 2015 | A1 |
20150276238 | Matsuoka et al. | Oct 2015 | A1 |
20150276239 | Fadell et al. | Oct 2015 | A1 |
20150276254 | Nemcek et al. | Oct 2015 | A1 |
20150276266 | Warren et al. | Oct 2015 | A1 |
20150277463 | Hazzard et al. | Oct 2015 | A1 |
20150277492 | Chau et al. | Oct 2015 | A1 |
20150280935 | Poplawski et al. | Oct 2015 | A1 |
20150287310 | Deiiuliis et al. | Oct 2015 | A1 |
20150292764 | Land et al. | Oct 2015 | A1 |
20150292765 | Matsuoka et al. | Oct 2015 | A1 |
20150293541 | Fadell et al. | Oct 2015 | A1 |
20150300672 | Fadell et al. | Oct 2015 | A1 |
20150312696 | Ribbich et al. | Oct 2015 | A1 |
20150316285 | Clifton et al. | Nov 2015 | A1 |
20150316286 | Roher | Nov 2015 | A1 |
20150316902 | Wenzel et al. | Nov 2015 | A1 |
20150323212 | Warren et al. | Nov 2015 | A1 |
20150327010 | Gottschalk et al. | Nov 2015 | A1 |
20150327084 | Ramachandran et al. | Nov 2015 | A1 |
20150327375 | Bick et al. | Nov 2015 | A1 |
20150330654 | Matsuoka | Nov 2015 | A1 |
20150330658 | Filson et al. | Nov 2015 | A1 |
20150330660 | Filson et al. | Nov 2015 | A1 |
20150332150 | Thompson | Nov 2015 | A1 |
20150338117 | Henneberger et al. | Nov 2015 | A1 |
20150345818 | Oh et al. | Dec 2015 | A1 |
20150348554 | Orr et al. | Dec 2015 | A1 |
20150354844 | Kates | Dec 2015 | A1 |
20150354846 | Hales et al. | Dec 2015 | A1 |
20150355371 | Ableitner et al. | Dec 2015 | A1 |
20150362208 | Novotny et al. | Dec 2015 | A1 |
20150362926 | Yarde et al. | Dec 2015 | A1 |
20150362927 | Giorgi | Dec 2015 | A1 |
20150364135 | Kolavennu et al. | Dec 2015 | A1 |
20150370270 | Pan et al. | Dec 2015 | A1 |
20150370272 | Reddy et al. | Dec 2015 | A1 |
20150370615 | Pi-Sunyer | Dec 2015 | A1 |
20150370621 | Karp et al. | Dec 2015 | A1 |
20150372832 | Kortz et al. | Dec 2015 | A1 |
20150372834 | Karp et al. | Dec 2015 | A1 |
20150372999 | Pi-Sunyer | Dec 2015 | A1 |
20160006274 | Tu et al. | Jan 2016 | A1 |
20160006577 | Logan | Jan 2016 | A1 |
20160010880 | Bravard et al. | Jan 2016 | A1 |
20160018122 | Frank et al. | Jan 2016 | A1 |
20160018127 | Gourlay et al. | Jan 2016 | A1 |
20160020590 | Roosli et al. | Jan 2016 | A1 |
20160026194 | Mucignat et al. | Jan 2016 | A1 |
20160036227 | Schultz et al. | Feb 2016 | A1 |
20160040903 | Emmons et al. | Feb 2016 | A1 |
20160047569 | Fadell et al. | Feb 2016 | A1 |
20160054022 | Matas et al. | Feb 2016 | A1 |
20160054792 | Poupyrev | Feb 2016 | A1 |
20160054988 | Desire | Feb 2016 | A1 |
20160061471 | Eicher et al. | Mar 2016 | A1 |
20160061474 | Cheung et al. | Mar 2016 | A1 |
20160069582 | Buduri | Mar 2016 | A1 |
20160069583 | Fadell et al. | Mar 2016 | A1 |
20160077532 | Lagerstedt et al. | Mar 2016 | A1 |
20160088041 | Nichols | Mar 2016 | A1 |
20160107820 | Macvittie et al. | Apr 2016 | A1 |
20160171289 | Lee et al. | Jun 2016 | A1 |
20160327298 | Sinha et al. | Nov 2016 | A1 |
20160327299 | Ribbich et al. | Nov 2016 | A1 |
20160327300 | Ribbich et al. | Nov 2016 | A1 |
20160327301 | Ribbich et al. | Nov 2016 | A1 |
20160327302 | Ribbich et al. | Nov 2016 | A1 |
20160327921 | Ribbich et al. | Nov 2016 | A1 |
20160365885 | Honjo | Dec 2016 | A1 |
20160377306 | Drees et al. | Dec 2016 | A1 |
20170074536 | Bentz et al. | Mar 2017 | A1 |
20170074537 | Bentz et al. | Mar 2017 | A1 |
20170074539 | Bentz et al. | Mar 2017 | A1 |
20170074541 | Bentz et al. | Mar 2017 | A1 |
20170075510 | Bentz et al. | Mar 2017 | A1 |
20170075568 | Bentz et al. | Mar 2017 | A1 |
20170076263 | Bentz et al. | Mar 2017 | A1 |
20170102162 | Drees et al. | Apr 2017 | A1 |
20170102433 | Wenzel et al. | Apr 2017 | A1 |
20170102434 | Wenzel et al. | Apr 2017 | A1 |
20170102675 | Drees | Apr 2017 | A1 |
20170103483 | Drees et al. | Apr 2017 | A1 |
20170104332 | Wenzel et al. | Apr 2017 | A1 |
20170104336 | Elbsat et al. | Apr 2017 | A1 |
20170104337 | Drees | Apr 2017 | A1 |
20170104342 | Elbsat et al. | Apr 2017 | A1 |
20170104343 | Elbsat et al. | Apr 2017 | A1 |
20170104344 | Wenzel et al. | Apr 2017 | A1 |
20170104345 | Wenzel et al. | Apr 2017 | A1 |
20170104346 | Wenzel et al. | Apr 2017 | A1 |
20170104449 | Drees | Apr 2017 | A1 |
20170122613 | Sinha et al. | May 2017 | A1 |
20170122617 | Sinha et al. | May 2017 | A1 |
20170123391 | Sinha et al. | May 2017 | A1 |
20170124838 | Sinha et al. | May 2017 | A1 |
20170124842 | Sinha et al. | May 2017 | A1 |
20170131825 | Moore et al. | May 2017 | A1 |
20170295058 | Gottschalk et al. | Oct 2017 | A1 |
20170357607 | Cayemberg et al. | Dec 2017 | A1 |
Number | Date | Country |
---|---|---|
2466854 | Apr 2008 | CA |
2633200 | Jan 2011 | CA |
2633121 | Aug 2011 | CA |
2818356 | May 2012 | CA |
2818696 | May 2012 | CA |
2853041 | Apr 2013 | CA |
2853081 | Apr 2013 | CA |
2812567 | May 2014 | CA |
2886531 | Sep 2015 | CA |
2894359 | Dec 2015 | CA |
1784701 | Jun 2006 | CN |
101695126 | Apr 2010 | CN |
102088474 | Jun 2011 | CN |
102739478 | Oct 2012 | CN |
102763436 | Oct 2012 | CN |
103312583 | Sep 2013 | CN |
103536399 | Jan 2014 | CN |
104036699 | Sep 2014 | CN |
104510460 | Apr 2015 | CN |
104656530 | May 2015 | CN |
204394473 | Jun 2015 | CN |
204410794 | Jun 2015 | CN |
104767802 | Jul 2015 | CN |
204883329 | Dec 2015 | CN |
10 2004 005 962 | Aug 2005 | DE |
2 283 279 | Feb 2011 | EP |
2 738 478 | Jun 2014 | EP |
2 897 018 | Jul 2015 | EP |
2 988 188 | Feb 2016 | EP |
2 519 441 | Apr 2015 | GB |
WO-0022491 | Apr 2000 | WO |
WO-2006041599 | Jul 2006 | WO |
WO-2009006133 | Jan 2009 | WO |
WO-2009036764 | Mar 2009 | WO |
WO-2009058127 | May 2009 | WO |
WO-2010059143 | May 2010 | WO |
WO-2010078459 | Jul 2010 | WO |
WO-2010088663 | Aug 2010 | WO |
WO-2012042232 | Apr 2012 | WO |
WO-2012068436 | May 2012 | WO |
WO-2012068437 | May 2012 | WO |
WO-2012068459 | May 2012 | WO |
WO-2012068495 | May 2012 | WO |
WO-2012068503 | May 2012 | WO |
WO-2012068507 | May 2012 | WO |
WO-2012068517 | May 2012 | WO |
WO-2012068526 | May 2012 | WO |
WO-2012142477 | Oct 2012 | WO |
WO-2013033469 | Mar 2013 | WO |
WO-2013052389 | Apr 2013 | WO |
WO-2013052901 | Apr 2013 | WO |
WO-2013052905 | Apr 2013 | WO |
WO-2013058932 | Apr 2013 | WO |
WO-2013058933 | Apr 2013 | WO |
WO-2013058934 | Apr 2013 | WO |
WO-2013058968 | Apr 2013 | WO |
WO-2013058969 | Apr 2013 | WO |
WO-2013059684 | Apr 2013 | WO |
WO-2013153480 | Oct 2013 | WO |
WO-2014047501 | Mar 2014 | WO |
WO-2014051632 | Apr 2014 | WO |
WO-2014051635 | Apr 2014 | WO |
WO-2014055059 | Apr 2014 | WO |
WO-2014152301 | Sep 2014 | WO |
WO-2015012449 | Jan 2015 | WO |
WO-2015039178 | Mar 2015 | WO |
WO-2015054272 | Apr 2015 | WO |
WO-2015057698 | Apr 2015 | WO |
WO-2015099721 | Jul 2015 | WO |
WO-2015127499 | Sep 2015 | WO |
WO-2015127566 | Sep 2015 | WO |
WO-2015134755 | Sep 2015 | WO |
WO-2015195772 | Dec 2015 | WO |
WO-2016038374 | Mar 2016 | WO |
Entry |
---|
U.S. Appl. No. 14/543,354, filed Nov. 17, 2014, Vivint, Inc. |
U.S. Appl. No. 15/143,373, filed Apr. 29, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/143,134, filed May 4, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/146,202, filed May 4, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/146,649, filed May 4, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/146,749, filed May 4, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/146,763, filed May 4, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/179,894, filed Jun. 10, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/207,431, filed Jul. 11, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,777, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,784, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,788, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,793, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,844, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,869, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,872, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,873, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,875, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,879, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,880, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,881, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,883, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,885, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/247,886, filed Aug. 25, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/260,293, filed Sep. 8, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/260,295, filed Sep. 8, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/260,297, filed Sep. 8, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/260,299, filed Sep. 8, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/260,301, filed Sep. 8, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/336,789, filed Oct. 28, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/336,791, filed Oct. 28, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/336,792, filed Oct. 28, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/336,793, filed Oct. 28, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/338,215, filed Oct. 28, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/338,221, filed Oct. 28, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 15/397,722, filed Jan. 3, 2017, Johnson Controls Technology Company. |
U.S. Appl. No. 29/525,907, filed May 4, 2015, Johnson Controls Technology Company. |
U.S. Appl. No. 29/548,334, filed Dec. 11, 2015, Johnson Controls Technology Company. |
U.S. Appl. No. 29/563,447, filed May 4, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 29/576,515, filed Sep. 2, 2016, Johnson Controls Technology Company. |
U.S. Appl. No. 62/239,131, filed Oct. 8, 2015, Johnson Controls Technology Company. |
U.S. Appl. No. 62/239,231, filed Oct. 8, 2015, Johnson Controls Technology Company. |
U.S. Appl. No. 62/239,233, filed Oct. 8, 2015, Johnson Controls Technology Company. |
U.S. Appl. No. 62/239,245, filed Oct. 8, 2015, Johnson Controls Technology Company. |
U.S. Appl. No. 62/239,246, filed Oct. 8, 2015, Johnson Controls Technology Company. |
U.S. Appl. No. 62/239,249, filed Oct. 8, 2015, Johnson Controls Technology Company. |
International Search Report and Written Opinion for Application No. PCT/US2016/030291, dated Sep. 7, 2016, 11 pages. |
International Search Report and Written Opinion for Application No. PCT/US2016/030827, dated Sep. 7, 2016, 13 pages. |
International Search Report and Written Opinion for Application No. PCT/US2016/030829, dated Sep. 7, 2016, 15 pages. |
International Search Report and Written Opinion for Application No. PCT/US2016/030835, dated Sep. 7, 2016, 13 pages. |
International Search Report and Written Opinion for Application No. PCT/US2016/030836, dated Sep. 7, 2016, 11 pages. |
International Search Report and Written Opinion for Application No. PCT/US2016/030837, dated Sep. 7, 2016, 13 pages. |
International Search Report and Written Opinion for Application No. PCT/US2016/051176, dated Feb. 16, 2017, 20 pages. |
International Search Report and Written Opinion for Application No. PCT/US2017/012218, dated Mar. 31, 2017, 14 pages. |
International Search Report and Written Opinion for Application No. PCT/US2017/012221, dated Mar. 31, 2017, 13 pages. |
International Search Report and Written Opinion for Application No. PCT/US2017/030890, dated Jun. 21, 2017, 13 pages. |
International Search Report and Written Opinion for Application No. PCT/US2017/012217, dated Mar. 31, 2017, 14 pages. |
Office Action on CN 201780005745.9, dated Dec. 4, 2019, 24 pages with English translation. |
Unknown, National Semiconductor's Temperature Sensor Handbook, Nov. 1, 1997, retrieved from the Internet at http://shrubbery.net/˜heas/willem/PDF/NSC/temphb.pdf on Aug. 11, 2016, pp. 1-40. |
Written Opinion for Singapore Application No. 11201708996V, dated Dec. 27, 2017, 6 pages. |
Written Opinion for Singapore Application No. 11201708997W, dated Jan. 10, 2018, 9 pages. |
Written Opinion for Singapore Application No. 11201709002Y, dated Feb. 7, 2018, 5 pages. |
Johnson Controls Technology Company's Complaint for Patent Infringement. Case 1:20-cv-03692-LMM. United States District Court for the Northern District of Georgia, filed Sep. 9, 2020. 23 pages. |
Office Action on CN 201780005745.9, dated Aug. 25, 2020, 8 pages with English language translation. |
Exhibit 10—Defendants' Invalidity Contentions for U.S. Pat. No. 9,824,549, 12 pages. |
First Office Action on CN 202010698781.1, dated Mar. 12, 2021, 14 pages. |
Office Action on CN 201780005745.9, dated Mar. 2, 2021, 6 pages. |
Number | Date | Country | |
---|---|---|---|
20200128646 A1 | Apr 2020 | US |
Number | Date | Country | |
---|---|---|---|
62247672 | Oct 2015 | US | |
62274750 | Jan 2016 | US | |
62275199 | Jan 2016 | US | |
62275202 | Jan 2016 | US | |
62275204 | Jan 2016 | US | |
62275711 | Jan 2016 | US | |
62783580 | Dec 2018 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16030422 | Jul 2018 | US |
Child | 16717887 | US | |
Parent | 15338215 | Oct 2016 | US |
Child | 16030422 | US | |
Parent | 15338221 | Oct 2016 | US |
Child | 15338215 | US | |
Parent | 15336789 | Oct 2016 | US |
Child | 15338221 | US | |
Parent | 15397722 | Jan 2017 | US |
Child | 16030422 | Jul 2018 | US |
Parent | 15336791 | Oct 2016 | US |
Child | 15397722 | US | |
Parent | 15336792 | Oct 2016 | US |
Child | 16030422 | Jul 2018 | US |
Parent | 16717887 | US | |
Child | 16030422 | Jul 2018 | US |
Parent | 16246366 | Jan 2019 | US |
Child | 16717887 | US | |
Parent | 15338221 | Oct 2016 | US |
Child | 16246366 | US | |
Parent | 15397722 | Jan 2017 | US |
Child | 15338221 | US | |
Parent | 15336791 | Oct 2016 | US |
Child | 15397722 | US | |
Parent | 15336789 | Oct 2016 | US |
Child | 16246366 | Jan 2019 | US |
Parent | 15336792 | Oct 2016 | US |
Child | 15336789 | US | |
Parent | 15336793 | Oct 2016 | US |
Child | 15336792 | US | |
Parent | 16030422 | Jul 2018 | US |
Child | 15336793 | US | |
Parent | 15336789 | Oct 2016 | US |
Child | 16030422 | US | |
Parent | 15336792 | Oct 2016 | US |
Child | 15336789 | US | |
Parent | 15338215 | Oct 2016 | US |
Child | 15336792 | US | |
Parent | 15338221 | Oct 2016 | US |
Child | 15338215 | US | |
Parent | 15397722 | Jan 2017 | US |
Child | 15338221 | US |