Automation system user interface

Information

  • Patent Grant
  • Patent Number
    11,816,323
  • Date Filed
    Monday, March 15, 2021
  • Date Issued
    Tuesday, November 14, 2023
Abstract
Systems and methods include an automation network comprising a gateway at a premises. The gateway is coupled to a remote network and is configured to control components at the premises including at least one of a thermostat and a lock. A sensor user interface (SUI) is coupled to the gateway and presented to a user via remote client devices. The SUI includes display elements for managing and receiving data of the components agnostically across the remote client devices. The display elements include an interactive icon comprising control regions. Each control region is configured to control a state change of a corresponding component.
Description
TECHNICAL FIELD

The embodiments described herein relate generally to a method and apparatus for improving the capabilities of home automation systems in premises applications.


BACKGROUND

The field of home and small business security is dominated by technology suppliers who build comprehensive ‘closed’ security systems, where the individual components (sensors, security panels, keypads) operate solely within the confines of a single vendor solution. For example, a wireless motion sensor from vendor A cannot be used with a security panel from vendor B. Each vendor typically has developed sophisticated proprietary wireless technologies to enable the installation and management of wireless sensors, with little or no ability for the wireless devices to operate separately from the vendor's homogeneous system. Furthermore, these traditional systems are extremely limited in their ability to interface either to a local or wide area standards-based network (such as an IP network); most installed systems support only a low-bandwidth, intermittent connection utilizing phone lines or cellular (RF) backup systems. Wireless security technologies from providers such as GE Security, Honeywell, and DSC/Tyco are well known in the art and are examples of this proprietary approach to security systems for home and business.


Furthermore, with the proliferation of the internet, of Ethernet and WiFi local area networks (LANs), and of advanced wide area networks (WANs) that offer high bandwidth, low latency connections (broadband), as well as more advanced wireless WAN data networks (e.g., GPRS or CDMA 1×RTT), there increasingly exists the networking capability to extend these traditional security systems to offer enhanced functionality. In addition, the proliferation of broadband access has driven a corresponding increase in home and small business networking technologies and devices. It is desirable to extend traditional security systems to encompass enhanced functionality such as the ability to control and manage security systems from the world wide web, cellular telephones, or advanced function internet-based devices. Other desired functionality includes an open systems approach to interface home security systems to home and small business networks.


Due to the proprietary approach described above, the traditional vendors are the only ones capable of taking advantage of these new network functions. To date, even though the vast majority of home and business customers have broadband network access in their premises, most security systems do not offer the advanced capabilities associated with high speed, low-latency LANs and WANs. This is primarily because the proprietary vendors have not been able to deliver such technology efficiently or effectively. Solution providers attempting to address this need are becoming known in the art, including three categories of vendors: traditional proprietary hardware providers such as Honeywell and GE Security; third party hard-wired module providers such as Alarm.com, NextAlarm, and uControl; and new proprietary systems providers such as InGrid.


A disadvantage of the prior art technologies of the traditional proprietary hardware providers arises due to the continued proprietary approach of these vendors. As these vendors develop technology in this area, it once again operates only with the hardware from that specific vendor, ignoring the need for a heterogeneous, cross-vendor solution. Yet another disadvantage of the prior art technologies of the traditional proprietary hardware providers arises due to the lack of experience and capability of these companies in creating open internet and web-based solutions, and consumer-friendly interfaces.


A disadvantage of the prior art technologies of the third party hard-wired module providers arises due to the installation and operational complexities and functional limitations associated with hardwiring a new component into existing security systems. Moreover, a disadvantage of the prior art technologies of the new proprietary systems providers arises due to the need to discard all prior technologies, and implement an entirely new form of security system to access the new functionalities associated with broadband and wireless data networks. There remains, therefore, a need for systems, devices, and methods that easily interface to and control the existing proprietary security technologies utilizing a variety of wireless technologies.


INCORPORATION BY REFERENCE

Each patent, patent application, and/or publication mentioned in this specification is herein incorporated by reference in its entirety to the same extent as if each individual patent, patent application, and/or publication was specifically and individually indicated to be incorporated by reference.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of the integrated security system, under an embodiment.



FIG. 2 is a block diagram of components of the integrated security system, under an embodiment.



FIG. 3 is a block diagram of the gateway software or applications, under an embodiment.



FIG. 4 is a block diagram of the gateway components, under an embodiment.



FIG. 5 (collectively FIGS. 5A and 5B) shows the orb icon and corresponding text summary display elements, under an embodiment.



FIG. 6 is a table of security state and the corresponding sensor status displayed on the SUI, under an embodiment.



FIG. 7 is a table of system state and the corresponding warning text displayed as system warnings on the SUI, under an embodiment.



FIG. 8 is a table of sensor state/sort order and the corresponding sensor name and status text of the SUI, under an embodiment.



FIG. 9 shows icons of the interesting sensors, under an embodiment.



FIG. 10 shows the quiet sensor icon, under an embodiment.



FIG. 11 is an example Home Management Mode (HMM) screen presented via the web portal SUI, under an embodiment.



FIG. 12 is an example Home Management Mode (HMM) screen presented via the mobile portal SUI, under an embodiment.



FIG. 13 is a block diagram of an iPhone® client device SUI, under an embodiment.



FIG. 14 is a first example iPhone® client device SUI, under an embodiment.



FIG. 15 is a second example iPhone® client device SUI, under an embodiment.



FIG. 16 is a block diagram of a mobile portal client device SUI, under an embodiment.



FIG. 17 is an example summary page or screen presented via the mobile portal SUI, under an embodiment.



FIG. 18 is an example security panel page or screen presented via the mobile portal SUI, under an embodiment.



FIG. 19 is an example sensor status page or screen presented via the mobile portal SUI, under an embodiment.



FIG. 20 is an example interface page or screen presented via the web portal SUI, under an embodiment.



FIG. 21 is an example summary page or screen presented via the touchscreen SUI, under an embodiment.



FIG. 22 is an example sensor status page or screen presented via the touchscreen SUI, under an embodiment.



FIG. 23 is an example Home View display, under an embodiment.



FIG. 24 shows a table of sensor icons displayed on the Home View floor plan, under an embodiment.



FIG. 25 shows example device icons of Home View, under an embodiment.



FIG. 26 shows a Home View display that includes indicators for multiple floors, under an embodiment.



FIG. 27 shows the system states along with the corresponding Home View display and system or orb icon, under an embodiment.



FIG. 28 shows a Home View floor display (disarmed) that includes a warning indicator, under an embodiment.



FIG. 29 shows an example of the Home View using the iPhone security tab, under an embodiment.



FIG. 30 shows an example screen for site Settings, under an embodiment.



FIG. 31 shows an example screen for Security Tab Options, under an embodiment.



FIG. 32 shows an example “Add Floor” screen for use in selecting a floor plan, under an embodiment.



FIG. 33 shows an “Edit Home View” screen of the editor, under an embodiment.



FIG. 34 shows an example of dragging a device icon during which a name of the device (“Front Door”) is displayed, under an embodiment.



FIG. 35 is an example of a U-shaped floor plan customized by changing interior tiles to define walls, under an embodiment.



FIG. 36 shows an example in which the zoom level is increased and dragging has been used to focus on a sensor location, under an embodiment.



FIG. 37 is an example “Add Floor” page, under an embodiment.



FIG. 38 is an example Edit Home View screen showing the floor thumbnails for use in selecting a floor, under an embodiment.



FIG. 39 shows the Edit Home View screen with a delete floor selector, under an embodiment.



FIG. 40 is an example Edit Home View screen displaying options to “Save” and “Don't Save” changes following selection of the Done button, under an embodiment.



FIG. 41 is an example of the floor grid data, under an embodiment.



FIG. 42 is an example sensor hash table for a single-floor site, under an embodiment.



FIG. 43 shows an example hash table mapping, under an embodiment.



FIG. 44 shows the twelve shapes of a tile set, under an embodiment.



FIG. 45 shows the tile shapes and corresponding fill options for rendered tiles, under an embodiment.



FIG. 46 is an example tile rendering for a room of a premise, under an embodiment.



FIG. 47 is an example popup display in response to hovering near/adjacent a sensor icon (e.g., “Garage” sensor), under an embodiment.



FIG. 48 shows a Home View display that includes a floor plan display 4800 of a selected floor along with indicators 4801/4802 for multiple floors, under an embodiment.



FIG. 49 shows an example of the Home View user interface displayed via a mobile device (e.g., iPhone), under an embodiment.


Home View is configured via site settings as described in detail herein. Each application retains or remembers the user's preferred mode across sessions.



FIG. 50 shows an example of a Settings page of Home View, under an embodiment.



FIG. 51 shows an example “Home View Setup” editor page 5100 for use in selecting a floor plan, under an embodiment.



FIG. 52 shows a “Home View Setup” editor screen 5200 with a selected floor plan 5201, under an embodiment.



FIG. 53 shows an example editor screen 5300 for which a label 5301 with a name of the device (“Front Door”) is displayed, under an embodiment.



FIG. 54 shows a Home View Setup page 5400 with a selected floor plan 5201 that has been edited to add numerous interior walls 5401, under an embodiment.



FIG. 55 shows a Home View Setup page with a label editing prompt 5501, under an embodiment.



FIG. 56 shows a Home View Setup page 5600 in a zoomed editing mode to zoom on one room 5601 in a building, under an embodiment.



FIG. 57 shows a Home View Setup page for adding at least one floor to a floor plan, under an embodiment.



FIG. 58 shows a Home View Setup page 5800 with editing for multiple floors, under an embodiment.


FIG. 59 shows a Home View Setup page 5900 with options displayed, under an embodiment.



FIG. 60 shows a Home View Setup page 6000 with editor exit option prompts 6001 displayed, under an embodiment.



FIG. 61 is an example floor plan, under an embodiment.



FIG. 62 is an example Home View one-story floor plan, under an embodiment.



FIG. 63 is an example Home View floor plan that includes two devices, under an embodiment.



FIG. 64 is an example Home View floor plan that includes two labels, under an embodiment.



FIG. 65 is a block diagram of IP device integration with a premise network, under an embodiment.



FIG. 66 is a block diagram of IP device integration with a premise network, under an alternative embodiment.



FIG. 67 is a block diagram of a touchscreen, under an embodiment.



FIG. 68 is an example screenshot of a networked security touchscreen, under an embodiment.



FIG. 69 is a block diagram of network or premise device integration with a premise network, under an embodiment.



FIG. 70 is a block diagram of network or premise device integration with a premise network, under an alternative embodiment.



FIG. 71 is a flow diagram for a method of forming a security network including integrated security system components, under an embodiment.



FIG. 72 is a flow diagram for a method of forming a security network including integrated security system components and network devices, under an embodiment.



FIG. 73 is a flow diagram for installation of an IP device into a private network environment, under an embodiment.



FIG. 74 is a block diagram showing communications among IP devices of the private network environment, under an embodiment.



FIG. 75 is a flow diagram of a method of integrating an external control and management application system with an existing security system, under an embodiment.



FIG. 76 is a block diagram of an integrated security system wirelessly interfacing to proprietary security systems, under an embodiment.



FIG. 77 is a flow diagram for wirelessly ‘learning’ the gateway into an existing security system and discovering extant sensors, under an embodiment.



FIG. 78 is a block diagram of a security system in which the legacy panel is replaced with a wireless security panel wirelessly coupled to a gateway, under an embodiment.



FIG. 79 is a block diagram of a security system in which the legacy panel is replaced with a wireless security panel wirelessly coupled to a gateway, and a touchscreen, under an alternative embodiment.



FIG. 80 is a block diagram of a security system in which the legacy panel is replaced with a wireless security panel connected to a gateway via an Ethernet coupling, under another alternative embodiment.



FIG. 81 is a flow diagram for automatic takeover of a security system, under an embodiment.



FIG. 82 is a flow diagram for automatic takeover of a security system, under an alternative embodiment.



FIG. 83 is an example status interface of Home View 3D, under an embodiment.



FIG. 84 is an example user interface of Home View 3D, under an embodiment.



FIG. 85 is an example user interface showing “enable” control of Home View 3D, under an embodiment.



FIG. 86 is an example user interface showing “disable” control of Home View 3D, under an embodiment.



FIG. 87 is an example editor interface with indicators of Home View 3D being enabled, under an embodiment.



FIG. 88 is an example user interface showing five floors, under an embodiment.



FIG. 89 is an example interface of Home View 3D showing variables, under an embodiment.



FIG. 90 shows example renderings for square, wide, and tall canvases, 2D single floor, and 2D multi floor, under an embodiment.



FIG. 91 is an example user interface showing a “heat map” of Home View 3D, under an embodiment.



FIG. 92 is an example user interface for configuring a “heat map” of Home View 3D, under an embodiment.



FIG. 93 is another example user interface for configuring a “heat map” of Home View 3D, under an embodiment.



FIG. 94 is an example UI screen, under an embodiment.



FIG. 95 shows an example Status Bar of the UI, under an embodiment.



FIG. 96 shows an example System Bar of the UI, under an embodiment.



FIG. 97 shows an example Tab Bar of the UI, under an embodiment.



FIG. 98 shows an example Details View of the UI, under an embodiment.



FIG. 99 shows two versions of an example Details Card in Home View of the UI, under an embodiment.



FIG. 100 shows an example List View of the UI, under an embodiment.



FIG. 101 shows an example List layout of List View of the UI, under an embodiment.



FIG. 102 shows a device list item of the UI, under an embodiment.



FIG. 103 shows an example Settings Menu of the UI, under an embodiment.



FIG. 104 shows an example Events History View of the UI, under an embodiment.



FIG. 105 shows example thermostat line graphs of the UI, under an embodiment.



FIG. 106 shows example versions of a dismissable message in a message bar of the UI, under an embodiment.



FIG. 107 shows example versions of a non-dismissable message in a message bar of the UI, under an embodiment.



FIG. 108 shows example versions of multiple messages presented by the UI, under an embodiment.



FIG. 109 shows example versions of a Home View (3D, multiple floors) screen or page of the UI, under an embodiment.



FIG. 110 shows an example Home View (2D, multiple floors) screen or page of the UI, under an embodiment.



FIG. 111 shows an example Home View device control screen or page of the UI, under an embodiment.



FIG. 112 shows an example Notable Events screen or page of the UI, under an embodiment.



FIG. 113 shows example versions of a sensor list screen or page of the UI, under an embodiment.



FIG. 114 shows example Sensor History screens or pages of the UI, under an embodiment.



FIG. 115 shows an example of arm options presented by the UI, under an embodiment.



FIG. 116 shows an example of arm protest presented by the UI, under an embodiment.



FIG. 117 shows an example of arm protest failed presented by the UI, under an embodiment.



FIG. 118 shows an example of arm dialogue presented by the UI, under an embodiment.



FIG. 119 shows an example of modes dialog presented by the UI, under an embodiment.



FIG. 120 shows examples of camera detail screens presented by the UI, under an embodiment.



FIG. 121 shows example camera device lists presented by the UI, under an embodiment.



FIG. 122 shows an example of camera full-screen live video presented by the UI, under an embodiment.



FIG. 123 shows camera capture options (e.g., “Take Picture”, “Take Video Clip”, etc.) presented by the UI, under an embodiment.



FIG. 124 shows a capture message (e.g., “Capturing Video Clip . . . ”) presented by the UI, under an embodiment.



FIG. 125 shows example camera history (clips and pictures) views presented by the UI, under an embodiment.



FIG. 126A shows an example binary switch icon (“off” state) presented by the UI, under an embodiment.



FIG. 126B shows an example binary switch icon (“on” state, indicated by different color than “off” state) presented by the UI, under an embodiment.



FIG. 127A shows an example UI page with a binary switch (e.g., coffee maker, etc.) icon (“off” state) presented by the UI, under an embodiment.



FIG. 127B shows an example UI page with a binary switch (e.g., coffee maker, etc.) icon (“on” state, indicated by different color than “off” state) presented by the UI, under an embodiment.



FIG. 128A shows an example dimmer switch icon (“off” state) presented by the UI, under an embodiment.



FIG. 128B shows an example dimmer switch icon (“on” state, indicated by different color than “off” state) presented by the UI, under an embodiment.



FIG. 128C shows an example dimmer switch icon (in use) presented by the UI, under an embodiment.



FIG. 129A shows an example UI page with a dimmer switch (e.g., light, etc.) icon (“off” state) presented by the UI, under an embodiment.



FIG. 129B shows an example UI page with a dimmer switch (e.g., light, etc.) icon (“on” state, indicated by different color than “off” state) presented by the UI, under an embodiment.



FIGS. 130A and 130B show example thermostat state icons presented by the UI, under an embodiment.



FIG. 131 shows set point drag and tap areas of a thermostat presented by the UI, under an embodiment.



FIG. 132 shows the thermostat set point (heat/cool) slider in use (top), and increment/decrement function in use (bottom) as presented by the UI, under an embodiment.



FIG. 133A shows example versions of thermostat details (auto mode) screens presented by the UI, under an embodiment.



FIG. 133B shows an example thermostat (actively heating) screen presented by the UI, under an embodiment.



FIG. 133C shows an example thermostat (actively cooling) screen presented by the UI, under an embodiment.



FIG. 133D shows an example thermostat (changing cool setpoint) screen presented by the UI, under an embodiment.



FIG. 133E shows an example thermostat (off) screen presented by the UI, under an embodiment.



FIG. 134 shows mode selection popups presented by the UI, under an embodiment.



FIG. 135 shows an example door lock control tap and drag control screen presented by the UI, under an embodiment.



FIG. 136 shows example lock icons (e.g., locked state, unlocked state, low battery) presented by the UI, under an embodiment.



FIG. 137A shows an example UI with door lock details icon (inactive) presented by the UI, under an embodiment.



FIG. 137B shows an example UI page with door lock details icon (active) presented by the UI, under an embodiment.



FIG. 138A shows an example UI page with garage door details icon (inactive) presented by the UI, under an embodiment.



FIG. 138B shows an example UI page with garage door details icon (active) presented by the UI, under an embodiment.



FIG. 139 shows an example energy meter details page of the UI, under an embodiment.





DETAILED DESCRIPTION

Systems and methods include an automation network comprising a gateway at a premises. The gateway is coupled to a remote network and is configured to control components at the premises including at least one of a thermostat and a lock. A sensor user interface (SUI) is coupled to the gateway and presented to a user via remote client devices. The SUI includes display elements for managing and receiving data of the components agnostically across the remote client devices. The display elements include an interactive icon comprising control regions. Each control region is configured to control a state change of a corresponding component.


An integrated security system is described that integrates broadband and mobile access and control with conventional security systems and premise devices to provide a tri-mode security network (broadband, cellular/GSM, POTS access) that enables users to remotely stay connected to their premises. The integrated security system, while delivering remote premise monitoring and control functionality to conventional monitored premise protection, complements existing premise protection equipment. The integrated security system integrates into the premise network and couples wirelessly with the conventional security panel, enabling broadband access to premise security systems. Automation devices (cameras, lamp modules, thermostats, etc.) can be added, enabling users to remotely see live video and/or pictures and control home devices via their personal web portal or webpage, mobile phone, and/or other remote client device. Users can also receive notifications via email or text message when events occur, or fail to occur, in their home.


Although the detailed description herein contains many specifics for the purposes of illustration, one of ordinary skill in the art will appreciate that many variations and alterations to the following details are within the scope of the embodiments described herein. Thus, the following illustrative embodiments are set forth without any loss of generality to, and without imposing limitations upon, the claimed invention.


In accordance with the embodiments described herein, a wireless system (e.g., radio frequency (RF)) is provided that enables a security provider or consumer to extend the capabilities of an existing RF-capable security system or a non-RF-capable security system that has been upgraded to support RF capabilities. The system includes an RF-capable Gateway device (physically located within RF range of the RF-capable security system) and associated software operating on the Gateway device. The system also includes a web server, application server, and remote database providing a persistent store for information related to the system.


The security systems of an embodiment, referred to herein as the iControl security system or integrated security system, extend the value of traditional home security by adding broadband access and the advantages of remote home monitoring and home control through the formation of a security network including components of the integrated security system integrated with a conventional premise security system and a premise local area network (LAN). With the integrated security system, conventional home security sensors, cameras, touchscreen keypads, lighting controls, and/or Internet Protocol (IP) devices in the home (or business) become connected devices that are accessible anywhere in the world from a web browser, mobile phone or through content-enabled touchscreens. The integrated security system experience allows security operators to both extend the value proposition of their monitored security systems and reach new consumers that include broadband users interested in staying connected to their family, home and property when they are away from home.


The integrated security system of an embodiment includes security servers (also referred to herein as iConnect servers or security network servers) and an iHub gateway (also referred to herein as the gateway, the iHub, or the iHub client) that couples or integrates into a home network (e.g., LAN) and communicates directly with the home security panel, in both wired and wireless installations. The security system of an embodiment automatically discovers the security system components (e.g., sensors, etc.) belonging to the security system and connected to a control panel of the security system and provides consumers with full two-way access via web and mobile portals. The gateway supports various wireless protocols and can interconnect with a wide range of control panels offered by security system providers. Service providers and users can then extend the system's capabilities with additional IP cameras, lighting modules or security devices such as interactive touchscreen keypads. The integrated security system adds enhanced value to these security systems by enabling consumers to stay connected through email and SMS alerts, photo push, event-based video capture and rule-based monitoring and notifications. This solution extends the reach of home security to households with broadband access.


The integrated security system builds upon the foundation afforded by traditional security systems by layering broadband and mobile access, IP cameras, interactive touchscreens, and an open approach to home automation on top of traditional security system configurations. The integrated security system is easily installed and managed by the security operator, and simplifies the traditional security installation process, as described below.


The integrated security system provides an open systems solution to the home security market. As such, the foundation of the integrated security system customer premises equipment (CPE) approach has been to abstract devices, allowing applications to manipulate and manage multiple devices from any vendor. The integrated security system DeviceConnect technology that enables this capability supports protocols, devices, and panels from GE Security and Honeywell, as well as consumer devices using Z-Wave, IP cameras (e.g., Ethernet, wifi, and Homeplug), and IP touchscreens. DeviceConnect is a device abstraction layer that enables any device or protocol layer to interoperate with integrated security system components. This architecture enables the addition of new devices supporting any of these interfaces, as well as the addition of entirely new protocols.
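
By way of illustration only, a device abstraction layer of this kind can be pictured as a single interface that vendor-specific drivers implement, so applications never touch vendor protocols directly. The Java sketch below is a minimal, hypothetical rendering of that idea; the interface, class names, and commands are invented and are not the actual DeviceConnect code.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a DeviceConnect-style abstraction layer:
// applications talk to the Device interface, never to vendor protocols.
interface Device {
    String id();
    String state();             // e.g., "open", "closed", "armed"
    void apply(String command); // each vendor driver translates the command
}

// One driver per vendor protocol; each hides its own wire format.
class ZWaveLampModule implements Device {
    private final String nodeId;
    private String state = "off";
    ZWaveLampModule(String nodeId) { this.nodeId = nodeId; }
    public String id() { return "zwave-" + nodeId; }
    public String state() { return state; }
    public void apply(String command) { state = command; /* send Z-Wave frame here */ }
}

class LegacyPanelSensor implements Device {
    private final String zone;
    LegacyPanelSensor(String zone) { this.zone = zone; }
    public String id() { return "panel-zone-" + zone; }
    public String state() { return "closed"; }
    public void apply(String command) { /* panel zones are typically read-only */ }
}

public class DeviceRegistry {
    public static void main(String[] args) {
        List<Device> devices = new ArrayList<>();
        devices.add(new ZWaveLampModule("12"));
        devices.add(new LegacyPanelSensor("3"));
        // The application manages both uniformly, regardless of vendor.
        for (Device d : devices) System.out.println(d.id() + " -> " + d.state());
    }
}
```

Adding a new protocol under this reading means adding another Device implementation, leaving application code untouched.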


The benefit of DeviceConnect is that it provides supplier flexibility. The same consistent touchscreen, web, and mobile user experience operates unchanged on whatever security equipment is selected by a security system provider, with the system provider's choice of IP cameras, backend data center and central station software.


The integrated security system provides a complete system that integrates or layers on top of a conventional host security system available from a security system provider. The security system provider therefore can select different components or configurations to offer (e.g., CDMA, GPRS, no cellular, etc.) as well as have iControl modify the integrated security system configuration for the system provider's specific needs (e.g., change the functionality of the web or mobile portal, add a GE or Honeywell-compatible TouchScreen, etc.).


The integrated security system integrates with the security system provider infrastructure for central station reporting directly via Broadband and GPRS alarm transmissions. Traditional dial-up reporting is supported via the standard panel connectivity. Additionally, the integrated security system provides interfaces for advanced functionality to the CMS, including enhanced alarm events, system installation optimizations, system test verification, video verification, 2-way voice over IP and GSM.


The integrated security system is an IP-centric system that includes broadband connectivity, so that the gateway augments the existing security system with broadband and GPRS connectivity. If broadband is down or unavailable, GPRS may be used, for example. The integrated security system supports GPRS connectivity using an optional wireless package that includes a GPRS modem in the gateway. The integrated security system treats the GPRS connection as a higher-cost though flexible option for data transfers. In an embodiment the GPRS connection is only used to route alarm events (e.g., for cost reasons); however, the gateway can be configured (e.g., through the iConnect server interface) to act as a primary channel and pass any or all events over GPRS. In either case, the integrated security system does not interfere with the current plain old telephone service (POTS) security panel interface. Alarm events can still be routed through POTS; however, the gateway also allows such events to be routed through a broadband or GPRS connection as well.


The integrated security system provides a web application interface to the CSR tool suite as well as XML web services interfaces for programmatic integration with the security system provider's existing call center products. The integrated security system includes, for example, APIs that allow the security system provider to integrate components of the integrated security system into a custom call center interface. The APIs include XML web service APIs for integration of existing security system provider call center applications with the integrated security system service. All functionality available in the CSR Web application is provided with these API sets. The Java and XML-based APIs of the integrated security system support provisioning, billing, system administration, CSR, central station, portal user interfaces, and content management functions, to name a few. The integrated security system can provide a customized interface to the security system provider's billing system, or alternatively can provide security system developers with APIs and support in the integration effort.
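
As an illustration of the routing policy just described, the following hypothetical Java sketch selects a channel per event: broadband when available, the costlier GPRS link reserved for alarms when broadband is down, and GPRS for everything when configured as the primary channel. The names and the HOLD behavior for non-alarm traffic during an outage are invented assumptions, not the actual gateway logic.

```java
// Hypothetical channel-selection policy for the gateway's event routing.
enum Route { BROADBAND, GPRS, HOLD }

public class EventRouter {
    private boolean broadbandUp = true;
    private boolean gprsIsPrimary = false; // e.g., set via the server interface

    Route routeFor(String eventType) {
        if (gprsIsPrimary) return Route.GPRS;    // pass any or all events over GPRS
        if (broadbandUp) return Route.BROADBAND; // normal, low-cost path
        // Broadband outage: reserve the costlier GPRS link for alarms and
        // defer bulk traffic until broadband returns (one plausible policy).
        return "alarm".equals(eventType) ? Route.GPRS : Route.HOLD;
    }

    public static void main(String[] args) {
        EventRouter router = new EventRouter();
        router.broadbandUp = false;
        System.out.println(router.routeFor("alarm"));     // GPRS
        System.out.println(router.routeFor("statusLog")); // HOLD
    }
}
```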


The integrated security system provides or includes business component interfaces for provisioning, administration, and customer care to name a few. Standard templates and examples are provided with a defined customer professional services engagement to help integrate OSS/BSS systems of a Service Provider with the integrated security system.


The integrated security system components support and allow for the integration of customer account creation and deletion with a security system. The iConnect APIs provide access to the provisioning and account management system in iConnect and provide full support for account creation, provisioning, and deletion. Depending on the requirements of the security system provider, the iConnect APIs can be used to completely customize any aspect of the integrated security system backend operational system.


The integrated security system includes a gateway that supports the following standards-based interfaces, to name a few: Ethernet IP communications via Ethernet ports on the gateway, employing standard XML/TCP/IP protocols and ports over secured SSL sessions; USB 2.0 via ports on the gateway; 802.11b/g/n IP communications; GSM/GPRS RF WAN communications; and CDMA 1×RTT RF WAN communications (optionally also supporting EVDO and 3G technologies).


The gateway supports the following proprietary interfaces, to name a few: interfaces including Dialog RF network (319.5 MHz) and RS485 Superbus 2000 wired interface; RF mesh network (908 MHz); and interfaces including RF network (345 MHz) and RS485/RS232bus wired interfaces.


Regarding security for the IP communications (e.g., authentication, authorization, encryption, anti-spoofing, etc.), the integrated security system uses SSL to encrypt all IP traffic, using server and client certificates for authentication, as well as authentication in the data sent over the SSL-encrypted channel. For encryption, the integrated security system issues public/private key pairs at the time/place of manufacture, and certificates are not stored in any online storage in an embodiment.
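
For illustration, the following minimal Java sketch shows the kind of digital signing of data described here, using the standard Java security APIs: a payload is signed with a private key and verified with the corresponding public key, so alteration in transit is detectable even inside an SSL tunnel. Generating the key pair at runtime is a simplification for the sketch; as noted above, the system issues key pairs at the time of manufacture.

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.Signature;

// Illustrative only: sign a message and verify the signature, the same
// authentication-and-integrity step described for the SSL channel above.
public class SignedMessageDemo {
    public static void main(String[] args) throws Exception {
        // Simplification: in the described system, keys are issued at
        // manufacture rather than generated on the fly.
        KeyPair pair = KeyPairGenerator.getInstance("RSA").generateKeyPair();
        byte[] payload = "setState zone3 armed".getBytes(StandardCharsets.UTF_8);

        Signature signer = Signature.getInstance("SHA256withRSA");
        signer.initSign(pair.getPrivate());
        signer.update(payload);
        byte[] sig = signer.sign();

        Signature verifier = Signature.getInstance("SHA256withRSA");
        verifier.initVerify(pair.getPublic());
        verifier.update(payload);
        System.out.println("signature valid: " + verifier.verify(sig));
    }
}
```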


The integrated security system does not need any special rules at the customer premise and/or at the security system provider central station because the integrated security system makes outgoing connections using TCP over the standard HTTP and HTTPS ports. Provided outbound TCP connections are allowed, no special firewall configuration is necessary.



FIG. 1 is a block diagram of the integrated security system 100, under an embodiment. The integrated security system 100 of an embodiment includes the gateway 102 and the security servers 104 coupled to the conventional home security system 110. At a customer's home or business, the gateway 102 connects and manages the diverse variety of home security and self-monitoring devices. The gateway 102 communicates with the iConnect Servers 104 located in the service provider's data center 106 (or hosted in an integrated security system data center), with the communication taking place via a communication network 108 or other network (e.g., cellular network, internet, etc.). These servers 104 manage the system integrations necessary to deliver the integrated system service described herein. The combination of the gateway 102 and the iConnect servers 104 enables a wide variety of remote client devices 120 (e.g., PCs, mobile phones, and PDAs), allowing users to remotely stay in touch with their home, business and family. In addition, the technology allows home security and self-monitoring information, as well as relevant third party content such as traffic and weather, to be presented in intuitive ways within the home, such as on advanced touchscreen keypads.


The integrated security system service (also referred to as iControl service) can be managed by a service provider via browser-based Maintenance and Service Management applications that are provided with the iConnect Servers. Or, if desired, the service can be more tightly integrated with existing OSS/BSS and service delivery systems via the iConnect web services-based XML APIs.


The integrated security system service can also coordinate the sending of alarms to the home security Central Monitoring Station (CMS) 199. Alarms are passed to the CMS 199 using standard protocols such as Contact ID or SIA and can be generated from the home security panel location as well as by iConnect server 104 conditions (such as lack of communications with the integrated security system). In addition, the link between the security servers 104 and CMS 199 provides tighter integration between home security and self-monitoring devices and the gateway 102. Such integration enables advanced security capabilities such as the ability for CMS personnel to view photos taken at the time a burglary alarm was triggered. For maximum security, the gateway 102 and iConnect servers 104 support the use of a mobile network (both GPRS and CDMA options are available) as a backup to the primary broadband connection.


The integrated security system service is delivered by hosted servers running software components that communicate with a variety of client types while interacting with other systems. FIG. 2 is a block diagram of components of the integrated security system 100, under an embodiment. Following is a more detailed description of the components.


The iConnect servers 104 support a diverse collection of clients 120 ranging from mobile devices, to PCs, to in-home security devices, to a service provider's internal systems. Most clients 120 are used by end-users, but there are also a number of clients 120 that are used to operate the service.


Clients 120 used by end-users of the integrated security system 100 include, but are not limited to, the following:

    • Clients based on gateway client applications 202 (e.g., a processor-based device running the gateway technology that manages home security and automation devices).
    • A web browser 204 accessing a Web Portal application, performing end-user configuration and customization of the integrated security system service as well as monitoring of in-home device status, viewing photos and video, etc. Device and user management can also be performed by this portal application.
    • A mobile device 206 (e.g., PDA, mobile phone, etc.) accessing the integrated security system Mobile Portal. This type of client 206 is used by end-users to view system status and perform operations on devices (e.g., turning on a lamp, arming a security panel, etc.) rather than for system configuration tasks such as adding a new device or user.
    • PC or browser-based “widget” containers 208 that present integrated security system service content, as well as other third-party content, in simple, targeted ways (e.g. a widget that resides on a PC desktop and shows live video from a single in-home camera). “Widget” as used herein means applications or programs in the system.
    • Touchscreen home security keypads 208 and advanced in-home devices that present a variety of content widgets via an intuitive touchscreen user interface.
    • Notification recipients 210 (e.g., cell phones that receive SMS-based notifications when certain events occur (or don't occur), email clients that receive an email message with similar information, etc.).
    • Custom-built clients (not shown) that access the iConnect web services XML API to interact with users' home security and self-monitoring information in new and unique ways. Such clients could include new types of mobile devices, or complex applications where integrated security system content is integrated into a broader set of application features.


In addition to the end-user clients, the iConnect servers 104 support PC browser-based Service Management clients that manage the ongoing operation of the overall service. These clients run applications that handle tasks such as provisioning, service monitoring, customer support and reporting.


There are numerous types of server components of the iConnect servers 104 of an embodiment including, but not limited to, the following: Business Components which manage information about all of the home security and self-monitoring devices; End-User Application Components which display that information for users and access the Business Components via published XML APIs; and Service Management Application Components which enable operators to administer the service (these components also access the Business Components via the XML APIs, and also via published SNMP MIBs).


The server components provide access to, and management of, the objects associated with an integrated security system installation. The top-level object is the “network.” It is a location where a gateway 102 is located, and is also commonly referred to as a site or premises; the premises can include any type of structure (e.g., home, office, warehouse, etc.) at which a gateway 102 is located. Users can only access the networks to which they have been granted permission. Within a network, every object monitored by the gateway 102 is called a device. Devices include the sensors, cameras, home security panels and automation devices, as well as the controller or processor-based device running the gateway applications.


Various types of interactions are possible between the objects in a system. Automations define actions that occur as a result of a change in state of a device. For example, take a picture with the front entry camera when the front door sensor changes to “open”. Notifications are messages sent to users to indicate that something has occurred, such as the front door going to “open” state, or has not occurred (referred to as an iWatch notification). Schedules define changes in device states that are to take place at predefined days and times. For example, set the security panel to “Armed” mode every weeknight at 11:00 pm.
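
These three interaction types can be pictured as small data objects. The Java sketch below is a hypothetical model of an automation, a notification, and a schedule using the examples from this paragraph; the record names and fields are invented for illustration and are not the system's actual object model.

```java
import java.util.function.Predicate;

// Hypothetical model of the three interaction types described above.
record Automation(String deviceId, String triggerState, Runnable action) {}
record Notification(String user, Predicate<String> eventFilter) {}
record Schedule(String when, Runnable action) {}

public class InteractionDemo {
    public static void main(String[] args) {
        Automation a = new Automation("frontDoorSensor", "open",
                () -> System.out.println("front entry camera: take picture"));
        Notification n = new Notification("owner", e -> e.contains("open"));
        Schedule s = new Schedule("weeknights 23:00",
                () -> System.out.println("security panel -> Armed mode"));

        // A device state change arrives; matching automations fire and
        // matching notification rules produce a message to the user.
        String event = "frontDoorSensor open";
        if (event.startsWith(a.deviceId()) && event.endsWith(a.triggerState())) {
            a.action().run();
        }
        if (n.eventFilter().test(event)) {
            System.out.println("notify " + n.user() + ": " + event);
        }
        s.action().run(); // a scheduler would invoke this at 11:00 pm weeknights
    }
}
```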


The iConnect Business Components are responsible for orchestrating all of the low-level service management activities for the integrated security system service. They define all of the users and devices associated with a network (site), analyze how the devices interact, and trigger associated actions (such as sending notifications to users). All changes in device states are monitored and logged. The Business Components also manage all interactions with external systems as required, including sending alarms and other related self-monitoring data to the home security Central Monitoring System (CMS) 199. The Business Components are implemented as portable Java J2EE Servlets, but are not so limited.


The following iConnect Business Components manage the main elements of the integrated security system service, but the embodiment is not so limited:

    • A Registry Manager 220 defines and manages users and networks. This component is responsible for the creation, modification and termination of users and networks. It is also where a user's access to networks is defined.
    • A Network Manager 222 defines and manages security and self-monitoring devices that are deployed on a network (site). This component handles the creation, modification, deletion and configuration of the devices, as well as the creation of automations, schedules and notification rules associated with those devices.
    • A Data Manager 224 manages access to current and logged state data for an existing network and its devices. This component specifically does not provide any access to network management capabilities, such as adding new devices to a network, which are handled exclusively by the Network Manager 222.
    • To achieve optimal performance for all types of queries, data for current device states is stored separately from historical state data (a.k.a. “logs”) in the database. A Log Data Manager 226 performs ongoing transfers of current device state data to the historical data log tables; a brief sketch of this transfer appears below.
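
As an illustration of that current-state/log split, the hypothetical Java sketch below reads “now” from a small current-state store while a periodic transfer job appends timestamped snapshots to an append-only log. The in-memory structures stand in for the database tables, and all names are invented.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch of keeping current state and historical logs apart:
// reads of "now" hit a small map; history only ever grows by appending.
public class LogTransferDemo {
    static final Map<String, String> currentState = new HashMap<>();
    static final List<String> historyLog = new ArrayList<>();

    // The Log Data Manager's periodic job, simplified to one pass.
    static void transferCurrentToLog() {
        long now = System.currentTimeMillis();
        for (Map.Entry<String, String> e : currentState.entrySet()) {
            historyLog.add(now + " " + e.getKey() + "=" + e.getValue());
        }
    }

    public static void main(String[] args) {
        currentState.put("frontDoor", "closed");
        currentState.put("thermostat", "68F");
        transferCurrentToLog();
        System.out.println(historyLog);
    }
}
```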


Additional iConnect Business Components handle direct communications with certain clients and other systems, for example:

    • An iHub Manager 228 directly manages all communications with gateway clients, including receiving information about device state changes, changing the configuration of devices, and pushing new versions of the gateway client to the hardware it is running on.
    • A Notification Manager 230 is responsible for sending all notifications to clients via SMS (mobile phone messages), email (via a relay server like an SMTP email server), etc.
    • An Alarm and CMS Manager 232 sends critical server-generated alarm events to the home security Central Monitoring Station (CMS) and manages all other communications of integrated security system service data to and from the CMS.
    • The Element Management System (EMS) 234 is an iControl Business Component that manages all activities associated with service installation, scaling and monitoring, and filters and packages service operations data for use by service management applications. The SNMP MIBs published by the EMS can also be incorporated into any third party monitoring system if desired.


The iConnect Business Components store information about the objects that they manage in the iControl Service Database 240 and in the iControl Content Store 242. The iControl Content Store is used to store media objects like video, photos and widget content, while the Service Database stores information about users, networks, and devices. Database interaction is performed via a JDBC interface. For security purposes, the Business Components manage all data storage and retrieval.


The iControl Business Components provide web services-based APIs that application components use to access the Business Components' capabilities. Functions of application components include presenting integrated security system service data to end-users, performing administrative duties, and integrating with external systems and back-office applications.


The primary published APIs for the iConnect Business Components include, but are not limited to, the following:

    • A Registry Manager API 252 provides access to the Registry Manager Business Component's functionality, allowing management of networks and users.
    • A Network Manager API 254 provides access to the Network Manager Business Component's functionality, allowing management of devices on a network.
    • A Data Manager API 256 provides access to the Data Manager Business Component's functionality, such as setting and retrieving (current and historical) data about device states.
    • A Provisioning API 258 provides a simple way to create new networks and configure initial default properties.


Each API of an embodiment includes two modes of access: Java API or XML API. The XML APIs are published as web services so that they can be easily accessed by applications or servers over a network. The Java APIs are a programmer-friendly wrapper for the XML APIs. Application components and integrations written in Java should generally use the Java APIs rather than the XML APIs directly.
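
The following hypothetical Java sketch contrasts the two access modes: a raw XML web-service call over HTTP, and a Java wrapper that hides the XML behind a typed method. The endpoint URL, method names, and XML shape are invented placeholders, not the actual iConnect API.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ApiModesDemo {
    // Mode 1: the XML API, a web-service call any networked client can make.
    static String rawXmlCall(String deviceId) throws Exception {
        String body = "<getDeviceState><id>" + deviceId + "</id></getDeviceState>";
        HttpRequest req = HttpRequest.newBuilder()
                .uri(URI.create("https://example.invalid/api/dataManager")) // placeholder URL
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        return HttpClient.newHttpClient()
                .send(req, HttpResponse.BodyHandlers.ofString())
                .body();
    }

    // Mode 2: the Java API, a programmer-friendly wrapper around mode 1.
    static String getDeviceState(String deviceId) throws Exception {
        String xml = rawXmlCall(deviceId);
        return xml; // a real wrapper would parse the XML into typed objects
    }

    public static void main(String[] args) {
        try {
            System.out.println(getDeviceState("frontDoor"));
        } catch (Exception e) {
            // Expected here: the URL above is a placeholder, not a live server.
            System.out.println("no live endpoint: " + e);
        }
    }
}
```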


The iConnect Business Components also have an XML-based interface 260 for quickly adding support for new devices to the integrated security system. This interface 260, referred to as DeviceConnect 260, is a flexible, standards-based mechanism for defining the properties of new devices and how they can be managed. Although the format is flexible enough to allow the addition of any type of future device, pre-defined XML profiles are currently available for adding common types of devices such as sensors (SensorConnect), home security panels (PanelConnect) and IP cameras (CameraConnect).
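
As a purely illustrative example, the sketch below defines a made-up SensorConnect-style XML profile and reads it with the standard Java DOM parser. The element names are assumptions; the actual profile schema is not published in this text.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

// Illustrative only: a hypothetical XML device profile and the few lines
// needed to read its properties into the system.
public class ProfileDemo {
    public static void main(String[] args) throws Exception {
        String profile =
            "<deviceProfile type='sensor'>" +
            "  <name>Front Door Contact</name>" +
            "  <states>open,closed</states>" +
            "</deviceProfile>";
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(
                        profile.getBytes(StandardCharsets.UTF_8)));
        String name = doc.getElementsByTagName("name").item(0).getTextContent();
        String states = doc.getElementsByTagName("states").item(0).getTextContent();
        System.out.println(name + " supports states: " + states);
    }
}
```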


The iConnect End-User Application Components deliver the user interfaces that run on the different types of clients supported by the integrated security system service. The components are written in portable Java J2EE technology (e.g., as Java Servlets, as JavaServer Pages (JSPs), etc.) and they all interact with the iControl Business Components via the published APIs.


The following End-User Application Components generate CSS-based HTML/JavaScript that is displayed on the target client. These applications can be dynamically branded with partner-specific logos and URL links (such as Customer Support, etc.). The End-User Application Components of an embodiment include, but are not limited to, the following:

    • An iControl Activation Application 270 delivers the first application that a user sees when they set up the integrated security system service. This wizard-based web browser application securely associates a new user with a purchased gateway and the other devices included with it as a kit (if any). It primarily uses functionality published by the Provisioning API.
    • An iControl Web Portal Application 272 runs on PC browsers and delivers the web-based interface to the integrated security system service. This application allows users to manage their networks (e.g. add devices and create automations) as well as to view/change device states, and manage pictures and videos. Because of the wide scope of capabilities of this application, it uses three different Business Component APIs that include the Registry Manager API, Network Manager API, and Data Manager API, but the embodiment is not so limited.
    • An iControl Mobile Portal 274 is a small-footprint web-based interface that runs on mobile phones and PDAs. This interface is optimized for remote viewing of device states and pictures/videos rather than network management. As such, its interaction with the Business Components is primarily via the Data Manager API.
    • Custom portals and targeted client applications can be provided that leverage the same Business Component APIs used by the above applications.
    • A Content Manager Application Component 276 delivers content to a variety of clients. It sends multimedia-rich user interface components to widget container clients (both PC and browser-based), as well as to advanced touchscreen keypad clients. In addition to providing content directly to end-user devices, the Content Manager 276 provides widget-based user interface components to satisfy requests from other Application Components such as the iControl Web 272 and Mobile 274 portals.


A number of Application Components are responsible for overall management of the service. These pre-defined applications, referred to as Service Management Application Components, are configured to offer off-the-shelf solutions for production management of the integrated security system service including provisioning, overall service monitoring, customer support, and reporting, for example. The Service Management Application Components of an embodiment include, but are not limited to, the following:

    • A Service Management Application 280 allows service administrators to perform activities associated with service installation, scaling and monitoring/alerting. This application interacts heavily with the Element Management System (EMS) Business Component to execute its functionality, and also retrieves its monitoring data from that component via protocols such as SNMP MIBs.
    • A Kitting Application 282 is used by employees performing service provisioning tasks. This application allows home security and self-monitoring devices to be associated with gateways during the warehouse kitting process.
    • A CSR Application and Report Generator 284 is used by personnel supporting the integrated security system service, such as CSRs resolving end-user issues and employees enquiring about overall service usage. Pushes of new gateway firmware to deployed gateways are also managed by this application.


The iConnect servers 104 also support custom-built integrations with a service provider's existing OSS/BSS, CSR and service delivery systems 290. Such systems can access the iConnect web services XML API to transfer data to and from the iConnect servers 104. These types of integrations can complement or replace the PC browser-based Service Management applications, depending on service provider needs.


As described above, the integrated security system of an embodiment includes a gateway, or iHub. The gateway of an embodiment includes a device that is deployed in the home or business and couples or connects the various third-party cameras, home security panels, sensors and devices to the iConnect server over a WAN connection as described in detail herein. The gateway couples to the home network and communicates directly with the home security panel in both wired and wireless sensor installations. The gateway is configured to be low-cost, reliable and thin so that it complements the integrated security system network-based architecture.


The gateway supports various wireless protocols and can interconnect with a wide range of home security control panels. Service providers and users can then extend the system's capabilities by adding IP cameras, lighting modules and additional security devices. The gateway is configurable to be integrated into many consumer appliances, including set-top boxes, routers and security panels. The small and efficient footprint of the gateway enables this portability and versatility, thereby simplifying and reducing the overall cost of the deployment.



FIG. 3 is a block diagram of the gateway 102 including gateway software or applications, under an embodiment. The gateway software architecture is relatively thin and efficient, thereby simplifying its integration into other consumer appliances such as set-top boxes, routers, touch screens and security panels. The software architecture also provides a high degree of security against unauthorized access. This section describes the various key components of the gateway software architecture.


The gateway application layer 302 is the main program that orchestrates the operations performed by the gateway. The Security Engine 304 provides robust protection against intentional and unintentional intrusion into the integrated security system network from the outside world (both from inside the premises as well as from the WAN). The Security Engine 304 of an embodiment comprises one or more sub-modules or components that perform functions including, but not limited to, the following:

    • Encryption including 128-bit SSL encryption for gateway and iConnect server communication to protect user data privacy and provide secure communication.
    • Bi-directional authentication between the gateway and iConnect server in order to prevent unauthorized spoofing and attacks. Data sent from the iConnect server to the gateway application (or vice versa) is digitally signed as an additional layer of security. Digital signing provides both authentication and validation that the data has not been altered in transit.
    • Camera SSL encapsulation because picture and video traffic offered by off-the-shelf networked IP cameras is not secure when traveling over the Internet. The gateway provides for 128-bit SSL encapsulation of the user picture and video data sent over the internet for complete user security and privacy.
    • 802.11b/g/n with WPA-2 security to ensure that wireless camera communications always take place using the strongest available protection.
    • A gateway-enabled device is assigned a unique activation key for activation with an iConnect server. This ensures that only valid gateway-enabled devices can be activated for use with the specific instance of iConnect server in use. Attempts to activate gateway-enabled devices by brute force are detected by the Security Engine. Partners deploying gateway-enabled devices have the knowledge that only a gateway with the correct serial number and activation key can be activated for use with an iConnect server. Stolen devices, devices attempting to masquerade as gateway-enabled devices, and malicious outsiders (or insiders acting as knowledgeable but nefarious customers) cannot affect other customers' gateway-enabled devices. A sketch of such an activation-key check follows this list.
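
The following Java sketch illustrates one way such an activation-key check with brute-force detection could work; the lockout threshold, the in-memory key store, and all names are invented assumptions rather than the Security Engine's actual mechanism.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical activation-key check: a serial must present the matching
// key, and repeated failures for a serial are treated as a brute-force
// attempt and locked out.
public class ActivationGuard {
    private final Map<String, String> keyBySerial = new HashMap<>();
    private final Map<String, Integer> failures = new HashMap<>();
    private static final int LOCKOUT_THRESHOLD = 5; // invented value

    ActivationGuard() { keyBySerial.put("GW-0001", "A1B2-C3D4"); }

    boolean activate(String serial, String key) {
        int failed = failures.getOrDefault(serial, 0);
        if (failed >= LOCKOUT_THRESHOLD) return false; // treat as an attack
        if (key.equals(keyBySerial.get(serial))) {
            failures.remove(serial);
            return true;
        }
        failures.put(serial, failed + 1);
        return false;
    }

    public static void main(String[] args) {
        ActivationGuard guard = new ActivationGuard();
        System.out.println(guard.activate("GW-0001", "WRONG"));     // false
        System.out.println(guard.activate("GW-0001", "A1B2-C3D4")); // true
    }
}
```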


As standards evolve, as new encryption and authentication methods are proven useful, and as older mechanisms are proven breakable, the security manager can be upgraded “over the air” to provide new and better security for communications between the iConnect server and the gateway application, and locally at the premises to remove any risk of eavesdropping on camera communications.


A Remote Firmware Download module 306 allows for seamless and secure updates to the gateway firmware through the iControl Maintenance Application on the server 104, providing a transparent, hassle-free mechanism for the service provider to deploy new features and bug fixes to the installed user base. The firmware download mechanism is tolerant of connection loss, power interruption and user interventions (both intentional and unintentional). Such robustness reduces down time and customer support issues. Gateway firmware can be remotely downloaded for one gateway at a time, for a group of gateways, or in batches.
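
As an illustration of interruption tolerance, the hypothetical Java sketch below resumes a download from a recorded byte offset rather than restarting, so a dropped connection or power loss costs only the unfinished chunk. The chunking, sizes, and names are invented; a real client would also verify a checksum before flashing.

```java
import java.io.ByteArrayOutputStream;
import java.util.Arrays;

// Illustrative sketch of a resumable firmware download: the gateway tracks
// how many bytes it already has and fetches the next range from there.
public class ResumableDownloadDemo {
    static final byte[] FIRMWARE = new byte[1000]; // stand-in for the image

    // Simulates fetching one byte range of the image from the server.
    static byte[] fetchRange(int offset, int maxLen) {
        int end = Math.min(FIRMWARE.length, offset + maxLen);
        return Arrays.copyOfRange(FIRMWARE, offset, end);
    }

    public static void main(String[] args) {
        ByteArrayOutputStream local = new ByteArrayOutputStream();
        int offset = 0;
        while (offset < FIRMWARE.length) {
            byte[] chunk = fetchRange(offset, 128); // resume point = offset
            local.write(chunk, 0, chunk.length);
            offset += chunk.length;
            // A power loss here is harmless: the next run resumes at offset.
        }
        System.out.println("downloaded " + local.size() + " bytes");
        // A real client would verify a checksum before flashing the image.
    }
}
```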


The Automations engine 308 manages the user-defined rules of interaction between the different devices (e.g., when the door opens, turn on the light). Though the automation rules are programmed and reside at the portal/server level, they are cached at the gateway level in order to provide short latency between device triggers and actions.
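
By way of illustration only, the following Python sketch shows one way such server-authored rules could be cached and evaluated locally at a gateway; the Rule structure, device names, and event vocabulary are assumptions made for the example, not the system's actual schema.

```python
# Illustrative sketch: rules authored at the server are cached locally so the
# gateway can react without a round trip to the portal.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    trigger_device: str   # hypothetical device name, e.g., "front_door"
    trigger_event: str    # hypothetical event name, e.g., "open"
    action: Callable[[], None]

class AutomationsEngine:
    def __init__(self) -> None:
        self._rules: List[Rule] = []

    def cache_rules(self, rules: List[Rule]) -> None:
        """Replace the local cache with rules downloaded from the server."""
        self._rules = list(rules)

    def on_device_event(self, device: str, event: str) -> None:
        """Invoked by the device layer; runs matching actions locally."""
        for rule in self._rules:
            if rule.trigger_device == device and rule.trigger_event == event:
                rule.action()

# Example rule: "when door opens, turn on the light"
engine = AutomationsEngine()
engine.cache_rules([Rule("front_door", "open", lambda: print("light: on"))])
engine.on_device_event("front_door", "open")  # prints "light: on"
```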


DeviceConnect 310 includes definitions of all supported devices (e.g., cameras, security panels, sensors, etc.) using a standardized plug-in architecture. The DeviceConnect module 310 offers an interface that can be used to quickly add support for any new device as well as enabling interoperability between devices that use different technologies/protocols. For common device types, pre-defined sub-modules have been defined, making supporting new devices of these types even easier. SensorConnect 312 is provided for adding new sensors, CameraConnect 316 for adding IP cameras, and PanelConnect 314 for adding home security panels.
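
The following Python sketch illustrates, under assumed names, the kind of standardized plug-in contract DeviceConnect describes: each device family implements a common interface, so new hardware is supported by adding a sub-module rather than changing the core. The DevicePlugin interface and its method names are assumptions for illustration, not the actual DeviceConnect API.

```python
# Illustrative sketch only; interface and method names are assumptions.
from abc import ABC, abstractmethod

class DevicePlugin(ABC):
    @abstractmethod
    def discover(self) -> list:
        """Return identifiers of devices of this type found on the network."""

    @abstractmethod
    def read_state(self, device_id: str) -> dict:
        """Return the current state of a single device."""

class DoorSensorPlugin(DevicePlugin):
    """A SensorConnect-style sub-module for door/window sensors."""
    def discover(self) -> list:
        return ["zone7"]                        # stubbed discovery result

    def read_state(self, device_id: str) -> dict:
        return {"id": device_id, "open": False}

# The core engine only knows the common interface, never the device details.
plugins = {"sensor": DoorSensorPlugin()}
print(plugins["sensor"].read_state("zone7"))    # {'id': 'zone7', 'open': False}
```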


The Schedules engine 318 is responsible for executing the user defined schedules (e.g., take a picture every five minutes; every day at 8 am set temperature to 65 degrees Fahrenheit, etc.). Though the schedules are programmed and reside at the iConnect server level they are sent to the scheduler within the gateway application. The Schedules Engine 318 then interfaces with SensorConnect 312 to ensure that scheduled events occur at precisely the desired time.
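
A minimal sketch of a gateway-side scheduler follows; the schedule-entry fields and the polling style are illustrative assumptions, not the Schedules engine's actual design.

```python
# Illustrative sketch: schedule entries pushed from the server are held
# locally and fired when their time matches.
import datetime

schedules = [
    {"hour": 8, "minute": 0, "action": "set_temperature", "value": 65},
    {"every_minutes": 5, "action": "take_picture", "value": "cam1"},
]

def run_due_schedules(now: datetime.datetime) -> None:
    for entry in schedules:
        if "every_minutes" in entry:
            due = now.minute % entry["every_minutes"] == 0
        else:
            due = (now.hour, now.minute) == (entry["hour"], entry["minute"])
        if due:
            print(f"executing {entry['action']} -> {entry['value']}")

run_due_schedules(datetime.datetime(2021, 3, 15, 8, 0))
# -> fires both the 8 am thermostat rule and the 5-minute camera rule
```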


The Device Management module 320 is in charge of all discovery, installation and configuration of both wired and wireless IP devices (e.g., cameras, etc.) coupled or connected to the system. Networked IP devices, such as those used in the integrated security system, require user configuration of many IP and security parameters—to simplify the user experience and reduce the customer support burden, the device management module of an embodiment handles the details of this configuration. The device management module also manages the video routing module described below.


The video routing engine 322 is responsible for delivering seamless video streams to the user with zero configuration. Using a multi-step, staged approach, the video routing engine combines UPnP port-forwarding, relay server routing, and STUN/TURN peer-to-peer routing.
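
One plausible reading of that staged approach is a try-then-fall-back chain, sketched below in Python with stubbed connection attempts; the ordering, helper names, and URL scheme are assumptions, not the engine's actual logic.

```python
# Illustrative sketch: try the cheaper/more direct paths first, then fall
# back, so the user sees video with zero configuration.
def open_video_stream(camera_id: str) -> str:
    # Each helper would attempt a real connection; stubs keep this runnable.
    def try_upnp(cam):      return None    # UPnP port-forwarding unavailable
    def try_stun_turn(cam): return None    # peer-to-peer path blocked
    def try_relay(cam):     return f"relay://server/{cam}"  # last resort

    for attempt in (try_upnp, try_stun_turn, try_relay):
        route = attempt(camera_id)
        if route is not None:
            return route
    raise RuntimeError("no route to camera")

print(open_video_stream("cam1"))  # -> relay://server/cam1
```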



FIG. 4 is a block diagram of components of the gateway 102, under an embodiment. Depending on the specific set of functionality desired by the service provider deploying the integrated security system service, the gateway 102 can use any of a number of processors 402, due to the small footprint of the gateway application firmware. In an embodiment, the gateway includes, for example, the Broadcom BCM5354 as the processor. In addition, the gateway 102 includes memory (e.g., FLASH 404, RAM 406, etc.) and any number of input/output (I/O) ports 408.


Referring to the WAN portion 410 of the gateway 102, the gateway 102 of an embodiment can communicate with the iConnect server using a number of communication types and/or protocols, for example Broadband 412, GPRS 414 and/or Public Switched Telephone Network (PSTN) 416, to name a few. In general, broadband communication 412 is the primary means of connection between the gateway 102 and the iConnect server 104, and the GPRS/CDMA 414 and/or PSTN 416 interfaces act as backup for fault tolerance in case the user's broadband connection fails for whatever reason, but the embodiment is not so limited.


Referring to the LAN portion 420 of the gateway 102, various protocols and physical transceivers can be used to communicate to off-the-shelf sensors and cameras. The gateway 102 is protocol-agnostic and technology-agnostic and as such can easily support almost any device networking protocol. The gateway 102 can, for example, support GE and Honeywell security RF protocols 422, Z-Wave 424, serial (RS232 and RS485) 426 for direct connection to security panels as well as WiFi 428 (802.11b/g) for communication to WiFi cameras.


The system of an embodiment uses or includes a system user interface (SUI) that provides an iconic, at-a-glance representation of security system status. The SUI is for use across all client types as described above with reference to FIG. 1. The SUI includes a number of display elements that are presented across all types of client devices used to monitor status of the security system. The clients of an embodiment include, but are not limited to, the iPhone, the iPad, a mobile portal, a web portal, and a touchscreen. The display elements of the SUI of an embodiment include, but are not limited to, an orb icon, text summary, security button, device and system warnings, interesting sensors, and quiet sensors, as described in detail below. The SUI thus provides system status summary information (e.g., security and sensors) uniformly across all clients. Additionally, the SUI provides consistent iconography, terminology, and display rules across all clients as well as consistent sensor and system detail across clients.


Following is a description of the various states of the iControl sensors, and how these states are indicated uniformly across all clients using the SUI and other sensor information displays such as sensor lists and timelines.


Regarding the display elements of the SUI, the orb icon visually indicates the current arm state and sensor status of a single site. FIG. 5 (collectively FIGS. 5A and 5B) shows the orb icon and corresponding text summary display elements, under an embodiment. Across all clients, when sensor detail is shown in a list or timeline, state is indicated using the proper icon, text summary and grouping. The orb icons and text summary elements of an embodiment generally represent system state 4001 to include the following states: “Disarmed” or “Subdisarmed”; “Armed (Doors and Windows, Stay, Away, All, Night Stay, Instant, Motion, Maximum)”; “Disarmed” or “Subdisarmed” (sensor absent; sensor tripped; sensor tampered; low battery; uncleared alarm); “Armed (Doors and Windows, Stay, Away, All, Night Stay, Instant, Motion, Maximum)” (sensor absent; sensor tripped; sensor tampered; low battery); “Alarm”; and “No iHub Connection” (broadband offline, etc.) (no security panel connection). In addition to representing system state, the orb icons and text summary elements of an embodiment generally represent system status 4002 to include the following status: “All Quiet”; “Motion”; “Open”; “Open & Motion”.


Using various combinations of system state 4001 and status 4002, the orb icons of an embodiment indicate or represent numerous system states.


When the system state 4001 is “Disarmed” or “Subdisarmed”, the orb icons of an embodiment indicate or represent status 4002 as follows: Disarmed (status: all quiet) 4010 (e.g., icon color is green); Disarmed (status: motion) 4011 (e.g., icon color is green); Disarmed, (number of sensors open) Sensor(s) Open (status: open) 4012 (e.g., icon color is green, bottom region for sensor number is yellow); Disarmed, (number of sensors open) Sensor(s) Open (status: open and motion) 4013 (e.g., icon color is green, bottom region for sensor number is yellow).


When the system state 4001 is “Armed (Doors and Windows, Stay, Away, All, Night Stay, Instant, Motion, Maximum)”, the orb icons of an embodiment indicate or represent status 4002 as follows: Armed Doors & Windows (status: all quiet) 4014 (e.g., icon color is red); Armed Doors & Windows (status: motion) 4015 (e.g., icon color is red); Armed Doors & Windows, (number of sensors open) Sensor(s) Open (status: open) 4016 (e.g., icon color is red, bottom region for sensor number is yellow); Armed Doors & Windows, (number of sensors open) Sensor(s) Open (status: open and motion) 4017 (e.g., icon color is red, bottom region for sensor number is yellow).


When the system state 4001 is “Disarmed”, or “Subdisarmed” (sensor absent; sensor tripped; sensor tampered; low battery; uncleared alarm), the orb icons of an embodiment indicate or represent status 4002 as follows: Disarmed, sensor problem (status: all quiet) 4018 (e.g., icon color is green, badge in top region with “!” symbol is red); Disarmed, sensor problem (status: motion) 4019 (e.g., icon color is green, badge in top region with “!” symbol is red); Disarmed, sensor problem (status: open) 4020 (e.g., icon color is green, badge in top region with “!” symbol is red, bottom region for sensor number is yellow); Disarmed, sensor problem (status: open and motion) 4021 (e.g., icon color is green, badge in top region with “!” symbol is red, bottom region for sensor number is yellow).


When the system state 4001 is “Armed (Doors and Windows, Stay, Away, All, Night Stay, Instant, Motion, Maximum)” (sensor absent; sensor tripped; sensor tampered; low battery), the orb icons of an embodiment indicate or represent status 4002 as follows: Armed Doors & Windows, sensor problem (status: all quiet) 4022 (e.g., icon color is red, badge in top region with “!” symbol is red); Armed Doors & Windows, sensor problem (status: motion) 4023 (e.g., icon color is red, badge in top region with “!” symbol is red); Armed Doors & Windows, sensor problem (status: open) 4024 (e.g., icon color is red, badge in top region with “!” symbol is red, bottom region for sensor number is yellow); Armed Doors & Windows, sensor problem (status: open & motion) 4025 (e.g., icon color is red, badge in top region with “!” symbol is red, bottom region for sensor number is yellow).


When the system state 4001 is “Alarm”, the orb icons of an embodiment indicate or represent status 4002 as follows: Armed Away/Stay, (alarm type) ALARM 4026 (e.g., icon color is red).


When the system state 4001 is “No iHub Connection” (broadband offline, etc.) (no security panel connection), the orb icons of an embodiment indicate or represent status 4002 as follows: Status Unavailable 4027 (e.g., icon color is grey).
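
The orb rules enumerated above reduce to a small lookup: the arm state picks the base color, the sensor status adds the yellow open-sensor region, and a sensor problem adds the red “!” badge. The following Python sketch is illustrative only; its function name and dictionary encoding are assumptions, not the SUI's actual rendering code.

```python
# Illustrative sketch condensing the orb state/status rules into a lookup.
def orb_appearance(state: str, status: str, sensor_problem: bool = False) -> dict:
    base = {"Disarmed": "green", "Subdisarmed": "green",
            "Armed": "red", "Alarm": "red",
            "No iHub Connection": "grey"}[state]
    return {
        "color": base,
        "open_region": "yellow" if "Open" in status else None,  # sensor count
        "badge": "red !" if sensor_problem else None,
    }

print(orb_appearance("Disarmed", "All Quiet"))
print(orb_appearance("Armed", "Open & Motion", sensor_problem=True))
```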


When the client of an embodiment is a touchscreen, a mini orb is presented at the bottom of the touch screen in all widgets and settings screens. The mini orb is green when the security panel is disarmed, and it is red when the security panel is armed, but is not so limited. The form factor of the mini orb, and the text corresponding to the mini orb, is the same or similar to that described above as corresponding to the orb icon on the home screen.


The orb icons of an embodiment include motion indicators that animate to indicate motion detected by a corresponding sensor or detector. Furthermore, the orb icons of an embodiment show an animation during the exit delay when arming the security system and, additionally, indicate a countdown time showing the time remaining before the security system is fully armed. Moreover, selection of the orb of an embodiment causes additional information (e.g., list of sensors, etc.) of the security system and/or premise to be displayed.


The text summary display element of the SUI includes or displays information including a direct description of the current state of the security system to support the visual appearance of the orb icon. In an embodiment, two phrases are shown, including a first phrase for security state and a second phrase for sensor status (e.g., “Armed Stay. All Quiet”), as described herein. FIG. 6 is a table of security state and the corresponding sensor status displayed on the SUI, under an embodiment. The possible values for the text summary are, in priority order: “Status Unavailable”; otherwise, if the security panel and control box are online and there are no current alarms, the text summary section is a combination of one phrase from each of the security state 4030 and the sensor status 4032. The security state 4030 of an embodiment is selected from among the following, but is not so limited: Armed Doors & Windows; Armed All; Armed Stay; Armed Away; Disarmed; Armed Maximum; Armed Night Stay; Armed Stay Instant; Armed Away Instant; Armed Motion; Subdisarmed. The sensor status 4032 of an embodiment is selected from among the following, but is not so limited: Uncleared Alarm; Sensor Tripped; Sensor Problem; Sensor(s) Bypassed; Motion; All Quiet; (number of sensors open) Sensor(s) Open.
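
A minimal sketch of that priority logic follows (illustrative Python; the function signature is an assumption made for the example).

```python
# Illustrative sketch of assembling the two-phrase text summary in the
# priority order described above.
def text_summary(online: bool, security_state: str, sensor_status: str) -> str:
    if not online:                  # highest-priority value wins outright
        return "Status Unavailable"
    return f"{security_state}. {sensor_status}"

print(text_summary(True, "Armed Stay", "All Quiet"))   # "Armed Stay. All Quiet"
print(text_summary(False, "Armed Stay", "All Quiet"))  # "Status Unavailable"
```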


The display elements of the SUI also include security buttons. The security buttons are used to control or arm/disarm the security panel. A single arm button (e.g., a button labeled “Arm”) can be used on the SUI of a first client device type (e.g., Touchscreen, iPhone, etc.). Two different buttons (e.g., buttons labeled “Arm Away/Arm Stay” or “Arm All/Doors and Windows”) can be used on the SUI of a second client device type (e.g., web portal, mobile portal, etc.). In either embodiment, when the system is armed, the arm button label (e.g., “Arm”, “Arm Stay”, and “Arm Away”) changes to a “Disarm” label. If the system is in the process of arming, the button is disabled.


The display elements of the SUI include system and device warnings, as described above. The system and device warnings are informational warnings that are not associated with specific sensors, and involve more detail than can be displayed in the text summary display element. FIG. 7 is a table of system state and the corresponding icons and warning text displayed as system warnings on the SUI, under an embodiment. Where an icon is displayed, an embodiment uses a red color for the icon, but it is not so limited. The system states/warnings of an embodiment include, but are not limited to, the following: primary connection is broadband, broadband is down, cellular is being used/using cellular connection; primary connection is broadband, broadband and cellular are down/no cellular connection; primary connection is broadband, broadband is down, no cellular backup installed/broadband connection unknown; primary connection is cellular, cellular is down/no cellular connection; security panel not connected to AC power/security panel AC power loss; security panel low battery/security panel low battery; security panel tampered/security panel tampered; sensor(s) bypassed/sensor bypassed.


The device warnings of an embodiment include, but are not limited to, the following: camera(s) offline; light(s) offline; thermostat(s) offline. The device and system warnings may be combined into one box, or indicated separately in respective regions or portions of the SUI, depending on a type of the client device (e.g., combined into one box on a web portal or a mobile portal, but indicated in separate boxes on a Touchscreen or iPhone® device).


The device and system warnings display element is cumulative (e.g., built up in a list), but is not so limited. On the web and mobile portals the system and device warnings of an embodiment are combined into one area, but are not so limited. On the touchscreen device and mobile phone (e.g., iPhone), device warnings are indicated separately so that, in an embodiment, the iPhone® tab bar and the touchscreen home screen indicate device warnings with icon badges, and system warnings are placed on the sensors screen.


The list of all sensors includes, but is not limited to, door/window sensors, motion detectors, smoke, flood, fire, glass break, etc. The list of all sensors of an embodiment does not include cameras or locks, or non-security related devices such as lights, thermostats, energy, water, etc. The list of sensors is split into groups that, in an embodiment, include interesting sensors as a first group, and quiet sensors as a second group. The interesting sensor group is positioned above or sorted to the top portion of the sensor list and the quiet sensors are sorted to the bottom portion of the sensor list. Any sensor that is triggered (e.g., open, motion, etc.) is categorized as an interesting sensor and placed in the interesting sensor group and list. Additionally, other sensor states such as tampered, tripped, absent, installing, low battery, or bypassed make a sensor “interesting” regardless of whether the sensor is currently triggered.



FIG. 8 is a table of sensor state/sort order and the corresponding icon, sensor name and status text of the SUI, under an embodiment. Generally, the list of interesting sensors is sorted according to the following categories: motion; open/tripped; tampered; low battery; offline; installing; bypassed. Sensors are sorted alphabetically by sensor name within each category or interest type when multiple interesting sensors have the same state. The sensor state/sort order of an embodiment includes, but is not limited to, the following: breached & any sensor state (e.g., red icon) (interesting sensor); tripped (smoke, water, gas, freeze, etc.) (e.g., red icon) (interesting sensor); tampered (e.g., red icon) (interesting sensor); low battery (e.g., red icon) (interesting sensor); offline/AWOL (e.g., red icon) (interesting sensor); unknown (if the iHub or Security Panel is offline, all sensors have a grey diamond icon and “Unknown” for the status text) (e.g., grey icon) (interesting sensor); installing (e.g., grey icon) (interesting sensor); open (e.g., yellow icon) (interesting sensor); motion (e.g., yellow icon) (interesting sensor); bypassed (e.g., yellow or green icon) (interesting sensor); okay, closed, no motion (e.g., green icon) (quiet sensor).
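
The sort just described can be expressed compactly; in this illustrative Python sketch the category labels and the (name, state) tuple representation are assumptions made for the example.

```python
# Illustrative sketch of the sensor list sort: interesting sensors first,
# ordered by the state categories above, then alphabetically by name.
SORT_ORDER = ["breached", "tripped", "tampered", "low_battery", "offline",
              "unknown", "installing", "open", "motion", "bypassed", "quiet"]

def sort_sensors(sensors: list) -> list:
    # Each sensor is a (name, state) pair; "quiet" sensors sink to the bottom.
    return sorted(sensors, key=lambda s: (SORT_ORDER.index(s[1]), s[0]))

sensors = [("Back Door", "quiet"), ("Hall Motion", "motion"),
           ("Front Door", "open"), ("Attic", "quiet")]
print(sort_sensors(sensors))
# [('Front Door', 'open'), ('Hall Motion', 'motion'),
#  ('Attic', 'quiet'), ('Back Door', 'quiet')]
```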


The interesting sensors are shown or displayed with an icon. FIG. 9 shows icons of the interesting sensors, under an embodiment. A red diamond bang icon represents tamper, offline, bypassed, installing, and/or low battery. A yellow triangle icon represents open or triggered. A wavy lines icon represents motion. It is possible for an interesting sensor to have a green/closed icon (e.g., any quiet sensor that has been bypassed).


Following the state icon and the sensor name, an embodiment displays status text. The status of an embodiment includes, but is not limited to, the following: ALARM, (sensor state); tripped; tampered, (sensor state); low battery, (sensor state); offline; unknown; installing; bypassed, (sensor state). If a sensor is offline or tampered, it shows that text; otherwise the status text shows the tripped state: open, motion, tripped, etc. In addition, if a sensor is bypassed, its state is “Bypassed, (sensor state)”. For example, a bypassed motion sensor that has recently detected motion would have the status: “Motion, bypassed”. If a sensor has a low battery its state does not change, but it still joins the interesting sensors group.
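
The status-text precedence described above might be sketched as follows (illustrative Python; the sensor dictionary fields are assumptions made for the example).

```python
# Illustrative sketch of the status-text precedence: offline/tampered text
# wins, otherwise the tripped state is shown, with a "bypassed" qualifier
# appended when applicable.
def status_text(sensor: dict) -> str:
    if sensor.get("offline"):
        text = "Offline"
    elif sensor.get("tampered"):
        text = "Tampered"
    else:
        text = sensor.get("tripped_state", "Quiet")  # e.g., Open, Motion
    if sensor.get("bypassed"):
        text += ", bypassed"
    return text

print(status_text({"tripped_state": "Motion", "bypassed": True}))
# -> "Motion, bypassed"
```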


The quiet sensors include the remaining sensors that are not currently active, and so are not categorized as interesting sensors. Quiet sensor states of an embodiment include closed, no motion or otherwise not tripped or faulted. FIG. 10 shows the quiet sensor icon, under an embodiment. A green circle icon is a quiet sensor icon in an embodiment, and represents closed/no motion/okay/quiet. In addition to the state icon and sensor name, each quiet sensor shows status text as follows: if a door/window sensor is closed its state is “closed”; if a motion sensor has not recently detected motion then its state is “no motion”; other sensors, such as a smoke detector, indicate “quiet” or “okay”. Quiet sensors are listed alphabetically.


The SUI of an embodiment includes control icons for a Home Management Mode (HMM). If the user deselects the “Set home management modes automatically” setting via the web portal, then the Home Management Mode (HMM) screen will appear in the web and mobile Portals. FIG. 11 is an example Home Management Mode (HMM) screen presented via the web portal SUI, under an embodiment. The HMM screen includes an orb icon and corresponding text summary display elements, along with security buttons that control or arm/disarm the security panel. Furthermore, the HMM screen includes sensor status information (e.g., “Door”, status is “open”, icon is yellow; “Basement Motion”, status is “motion”, icon is yellow; “Family Room North Motion”, status is “motion”, icon is yellow; “Water”, status is “okay”, icon is green).



FIG. 12 is an example Home Management Mode (HMM) screen presented via the mobile portal SUI, under an embodiment. The HMM screen of the mobile portal includes an orb icon and corresponding text summary display elements, along with security buttons that control or arm/disarm the security panel.


The SUI of an embodiment is supported on numerous client types, for example, mobile telephones (e.g., iPhone®, etc.), client access via mobile portal, client access via web portal, and touchscreen, to name a few. All client types supported in an embodiment have the same status-related sections, but their locations change slightly depending on the client. The status-related sections of an embodiment include the following: orb; arm state/sensor summary; change mode; device summary and system warnings; interesting sensors; and quiet sensors.



FIG. 13 is a block diagram of an iPhone® client device SUI, under an embodiment. The client interface of the iPhone®, as one example client, has the orb on the security page. The text summary is below the orb. The security button (e.g., arm, disarm, etc.) is below the text summary. A tab bar is presented at the bottom of the screen. The SUI of an embodiment represents device warnings by the icons in the bottom horizontal tab bar. If a camera, light, lock, or thermostat is offline then a red circle will badge the corresponding icon in the tab bar. The number of offline devices is shown in the badge. FIG. 14 is a first example iPhone® client device SUI, under an embodiment. In this first example screenshot, the security page indicates one camera is offline, as indicated by the “1” in a “circle” badge displayed corresponding to the “camera” icon in the tab bar.


System warnings appear as a group in an area (e.g., yellow area) at the top of the sensor status screen. This area at the top of the sensor status screen appears only when there is a device or system warning; otherwise, it is not presented. Multiple messages appear as a vertical list with one message on each line. The yellow bar will grow in length to fit additional messages. If there are no system warnings then the interesting sensors group is at the top of the sensor status screen. Interesting sensors are presented below system warnings. Quiet sensors are presented below interesting sensors. FIG. 15 is a second example iPhone® client device SUI, under an embodiment. In this second example screenshot, the sensor status page indicates at least one sensor is bypassed, as indicated by the “Sensor(s) bypassed” message displayed at the top of the sensor status screen.



FIG. 16 is a block diagram of a mobile portal client device SUI, under an embodiment. The mobile portal of an embodiment comprises three (3) pages or screens presented to the client, including a summary page (“summary”), a security panel page (“security panel”), and a sensor status page (“sensors status”), but the embodiment is not so limited. The client interface of the mobile portal, as one example client, has the orb at the top of the summary page below the site name. The text summary is below the orb. The security buttons (e.g., arm, disarm, etc.) (plural on the mobile portal) are on the security panel page (accessible via the “Security” link on the summary page). Device and system warnings are presented in an area (e.g., yellow area) below the text summary; in an embodiment this area is presented only when device or system warnings are present. Interesting sensors are presented at the top of the sensor status page. Quiet sensors are presented below interesting sensors on the sensor status page.



FIG. 17 is an example summary page or screen presented via the mobile portal SUI, under an embodiment. FIG. 18 is an example security panel page or screen presented via the mobile portal SUI, under an embodiment. FIG. 19 is an example sensor status page or screen presented via the mobile portal SUI, under an embodiment.



FIG. 20 is an example interface page or screen presented via the web portal SUI, under an embodiment. The client interface of the web portal, as one example client, has the orb in the center of the security widget. The text summary is below the orb. The security buttons (plural in the web portal) are adjacent to the orb's right side. System warnings are presented in an area (e.g., yellow area) below the text summary; in an embodiment this area is presented only when device or system warnings are present. Multiple system warning messages are presented as a vertical list with one message on each line, and the area dedicated to the system warnings grows in length to accommodate additional messages. Interesting sensors span across the entire security widget below the text summary. Quiet sensors span across the entire security widget below interesting sensors.



FIG. 21 is an example summary page or screen presented via the touchscreen SUI, under an embodiment. The summary page of the touchscreen, as one example, has the orb in the center of the security bar. The text summary is split into sections or parts on each side of the orb. The security button is presented on the right side of the security bar.


In addition to the orb, text summary, and security button, the summary page also includes one or more icons that enable a transfer of content to and from the remote network, as described in detail herein. The touchscreen integrates the content with access and control of the security system. The content includes interactive content in the form of internet widgets. The summary page of an embodiment also comprises at least one icon enabling communication and control of the premise devices coupled to the subnetwork. The summary page also comprises one or more icons enabling access to live video from a camera, wherein the camera is an Internet Protocol (IP) camera.



FIG. 22 is an example sensor status page or screen presented via the touchscreen SUI, under an embodiment. The sensor status page of the touchscreen, as one example, displays widget badges or icons representing device warnings. System warnings are at the top of the sensor status screen; in an embodiment this area is presented only when system warnings are present. Multiple system warning messages are presented as a vertical list with one message on each line, and the area dedicated to the system warnings grows in length to accommodate additional messages. Interesting sensors are below system warnings. Quiet sensors are below interesting sensors. The sensors screen also includes the mini-orb which indicates the arm state with text and color.


The integrated security system of an embodiment includes a component referred to herein as “Home View” that provides end users an at-a-glance representation of their home security status using the layout of their home. Like the System Icon or “orb” as described in detail herein, Home View is intended to complement a set of common elements including, but not limited to, the security text summary, arm/disarm button, system warnings, and sensor status list. These UI elements are in the primary display of every iControl client application, and Home View adds to that set of UI elements.


Home View can be an alternative to the System Icon, adding sensor location and information about other devices like lights, thermostats, cameras, locks, and energy devices, to name a few. Home View is an optional view, and users who set up Home View are able to switch between the System Icon view and Home View. Home View provides the user or installer a way to express the floor plans of their home, where the layout of Home View is representational and, as such, is not meant to be a precise rendering of a home. The rendering of Home View can vary on each device depending on screen size and display capabilities.



FIG. 23 is an example Home View display 4000, under an embodiment. Using this example, Home View 4000 expresses or represents with a display the floor plan 4002 of a relatively large premise (e.g., home) or structure (e.g., 5 rooms wide and 5 rooms tall). Home View accommodates multi-story homes or structures (e.g., 4 stories). This mechanism can also be used to express other parts of a property, such as outbuildings. Home View allows the user to see all devices 4010 present on a selected floor, along with indications of whether other floors have interesting/active devices (such as an open door, or a light that is on).


Home View information defined on one client affects all clients. In other words, if a change is made to the floor plans on one client, all clients display that change if they are using Home View. Home View is provided on the iPhone, and is also supported on one or more clients common to all users (web portal and/or touch screen).


Home View of an embodiment includes an editing tool that supports basic sensors and common devices. Using the sensor state display of Home View, and while editing, the user can position each sensor device on each floor, and the sensor icon is displayed over each floor plan.


Under an embodiment and as further described below, basic device states are represented by device and/or sensor state icons in Home View. FIG. 24 shows a table of sensor state icons displayed on the Home View floor plan, under an embodiment. The sensor states displayed in an embodiment include, but are not limited to, the following: breached or alarmed, tripped, or tampered (e.g., red icon) (interesting sensor); low battery (e.g., red icon) (interesting sensor); offline/AWOL (e.g., red icon) (interesting sensor); unknown (if the iHub or Security Panel is offline, all sensors have a grey diamond icon and “Unknown” for the status text) (e.g., grey icon) (interesting sensor); installing (e.g., grey icon) (interesting sensor); open door/window (e.g., yellow icon) (interesting sensor); motion sensor active (e.g., yellow icon) (interesting sensor); okay, closed, no motion (e.g., green icon) (quiet sensor). The states of each sensor icon of an embodiment are updated periodically (typically 15-30 seconds) to reflect their status.


A touch sensed anywhere in Home View navigates the UI to the sensor list available in System Icon view. The user can also touch any sensor icon in Home View and see a popup display showing the sensor name. The popup box is presented above the sensor with a connector pointing to and indicating the sensor selected. If the sensor is at the top of the screen, the popup box may appear below the sensor with a connector pointing up to and indicating the selected sensor. The popup box also includes a “more” button for navigating to detailed information about that sensor (in this case, sensor history). An embodiment presents the sensor icon, name, and status text, and the last event for that sensor, plus a navigation arrow (e.g., a blue circle on some UIs), the selection of which switches screens to the sensor detail or history (same as clicking the sensor name in each client).


Using the device state display of Home View, a set of device and/or sensor icons can be placed on each floor. FIG. 25 shows example sensor status and device icons of Home View, under an embodiment. The device icons include, but are not limited to, icons representing lights, thermostats, cameras, locks, and energy devices, to name a few. Each of the device icons changes states in the same way it changes in its device list. These states include offline, installing, quiet, and active states but are not so limited. In an embodiment, cameras do not indicate an active state with an icon change. When the user touches a device icon, the device name pops up or is displayed. The popup box includes a “more” button for navigating to more information about that device as follows: camera icon (the popup box “more” button jumps to live video for that camera; exiting live video returns to Home View); lights, thermostats, energy, locks icon (“more” button jumps to the detail screen for controlling each device; the back buttons from those screens behave as they always do).


Home View visually indicates changes in device state under an embodiment. Under one embodiment device icons represent an underlying device component and its current state by modeling the device itself. For example (and as set forth in FIG. 25), an iconic image of a lock represents an actual lock device. As another example (and as set forth in FIG. 25), an iconic image of a lamp represents an actual lamp device monitored/controlled by the integrated security system. Home View may then use the device icon itself to indicate change in state. For example, Home View may express an unlocked or open status of a lock device by replacing the symbol of a closed or engaged lock with a symbol clearly depicting a lock that is unlocked or disengaged. As another example, Home View may indicate an inactive lamp device by replacing an iconic lamp representation in an “on” state (i.e., indicating emanation of light) with a darkened lamp representation (using a darkened lamp shade) indicating an “off” status. In other words, change in appearance of the device icon expresses a change in state of the underlying device.


Under another embodiment a generic sensor icon may be used to represent a device and its operational status. For example, a user may use an edit feature of Home View (described in greater detail below) to place a generic sensor icon on the Home View floor plan. When the user touches the icon on an iPhone client or mouses over the icon in a web application, the name/type of device appears above the icon (along with other relevant information and options as further described herein). The icon itself then displays status by shifting to a state-specific status icon. As described above, Home View may use one of the status icons described in FIG. 24 as appropriate to the operational status of the represented device but is not so limited.


Under another embodiment, Home View may indicate a change in state of the device by simply replacing the device icon with a status icon. For example, a lock device may be offline at which time the Home View would replace the lock icon representation with a status icon representation that indicates an offline status. The offline status icon may correspond to the offline status symbol set forth in FIG. 24 but is not so limited.


Under another embodiment, Home View may visually superimpose or visually annotate a device icon with status representations. As an example, Home View may visually annotate a lock device icon with a status icon to indicate its current operational status. The Home View may use the status icons described in FIG. 24 to visually append status information to device representations but is not so limited. Under an embodiment, the Home View may use smaller representations of such icons to serve as status badges on a portion of the device icons. The Home View may also superimpose a partially transparent status icon as a palimpsest layer over the device icon or alternatively integrate a partially transparent status icon into the device icon as a watermark representation. Home View may use one of the status icons described in FIG. 24 as appropriate to the operational status of the represented device but is not so limited.


Under yet another embodiment, Home View may use any combination and/or manipulation of status/device icons to represent operational status of system components.


If more than one floor has been defined in Layout mode of Home View, thumbnails on a portion of the display indicate that there are floors above or below the current one, and a means provided to switch floors. FIG. 26 shows a Home View display 4100 that includes indicators 4101/4102 for multiple floors, under an embodiment. In this example, two icons are presented to indicate a first (lower) floor 4101 and a second (upper) floor 4102. The currently-displayed floor 4101 (e.g., first (lower) floor) is outlined in white or otherwise highlighted. The last-viewed floor will be remembered across sessions.


The display of indicators for multiple floors through a mobile portal includes numbered links on a portion of the display (e.g., right), starting from “1”. The currently-displayed floor is shown as bold, and not a link, as in:

    • Floor: 1 2 3


Like the System Icon, Home View indicates the overall system state by using background color. For accessibility, this may also be presented using corresponding text located adjacent to the icon. FIG. 27 shows the system states along with the corresponding Home View display and system or orb icon, under an embodiment. Across all clients, system state is indicated using a representative color. The disarmed or subdisarmed system state is displayed in Home View using a green background or green border 4202 on the floor plan. The armed (any type) system state is displayed in Home View using a red background or red border 4204 on the floor plan. The alarm system state is displayed in Home View using a red background (with or without black diagonal stripes) 4206 on the floor plan. The offline (iHub or panel) system state is displayed in Home View using a grey background 4208 on the floor plan.


The System Icon of some client device UIs (e.g., the iPhone, the Touch Screen) also includes a warning badge to indicate that there are warnings to see in the sensor list. In Home View, a general warning indicator 4302 is shown in a region (e.g., on one side) of the Home View floor display. FIG. 28 shows a Home View floor display (disarmed 4202) that includes a warning indicator 4302, under an embodiment. The Home View display and warning indicator correspond to the system icon or “orb” set forth in the upper left corner of FIG. 28.


The use of Home View as a user interface includes Summary Text as described in detail herein, and the Summary Text provides definitive information on the current arm state, and a summary of any sensor issues. Additionally, the system arm/disarm buttons are displayed separately. FIG. 29 shows an example of the Home View 4402 using the iPhone security tab, under an embodiment. System state information 4404 is displayed (“Disarmed. 1 Sensor Open”), and an “Arm” button 4406 is displayed by which a user arms the system.


Home View is an alternative to the System Icon, as described herein, and is configured via site settings. Each application retains the user's preferred mode across sessions. FIG. 30 shows an example screen for site Settings 4500, under an embodiment. The Settings screen 4500 includes a list of sites 4502 that can be selected, along with a Sign Out button 4504. The Settings screen 4500 also includes a Security Tab Options button 4506. Selection of the Security Tab Options button 4506 displays the Security Tab Options screen 4600.



FIG. 31 shows an example screen for Security Tab Options 4600, under an embodiment. The Security Tab Options screen 4600 displays a list of options 4602 to select what the security tab displays (i.e., the System Icon display or the Home View display), along with an Edit Home View button 4604. When the user first attempts to switch to Home View from the Security Tab Options screen 4600 the following modal dialog is displayed: “Home View must be set up before use.” This dialog includes but is not limited to the following two buttons: “Set Up Now” and “Cancel”.


Any time the user wants to alter their Home View floor plans or device positions, they can choose Settings 4500, then select the Security Tab Options button 4506, then the Edit Home View button 4604. If a device has been deleted, then the Home View display code removes it from the device settings table. If a device has been installed or added to the system, it does not automatically appear in Home View, but it will be available in Edit Home View mode, ready to be placed on a floor.


The Home View mode of an embodiment includes an editor or Edit Mode. On the Settings screen 4500, the user can select Security Tab Options 4506, then Edit Home View 4604, as described above. This puts the user in Edit mode, where they can make changes to device positions, floor plans, and add/remove floors, for example. When editing is complete, selection of a “Done” button on the screen returns a user to the Security Tab Options screen 4600. If the user has made changes, then a dialog slides up that includes buttons for “Save Changes”, “Don't Save”, and “Cancel”. Once saved, Home View data is saved on the iHub/iServer with other site settings, and can appear in any client that has Home View enabled for display.


When the user first enters Edit mode, the user selects a basic floor plan which defines the perimeter shape of each floor of the premise. FIG. 32 shows an example “Add Floor” screen for use in selecting a floor plan, under an embodiment. Numerous floor plan selections are presented in a region of the screen labeled “Select a floor plan” 4702, and the floor plan selections 4702 of an embodiment comprise, but are not limited to, the following: square; horizontal; vertical; four different L-shapes; four different U-shapes; four different zigzag shapes. The title bar of the “Add Floor” screen 4700 includes a Cancel button 4704. At the point when there are no floors, there are no other buttons.


Upon selection of a basic floor plan, the editor is displayed. FIG. 33 shows an “Edit Home View” screen 4800 of the editor, under an embodiment. The title bar includes an add floor button [+] 4802. In this example only one floor is defined, so there is no delete button (cannot delete the last floor). In addition to adding and deleting floors, the editor of an embodiment displays selection buttons 4810-4814 for three editing modes: Devices mode 4810 (used for placing devices on each floor); Walls mode 4812 (used for adding or changing walls); Erase mode 4814 (used for deleting walls). If the default floor plan matches the user's home, then the user has only to position devices on that floor. However, if the user wishes to modify a floor plan or define interiors then the Walls Mode and Erase mode are used to make changes.


Devices are represented by icons in the editor, and the icons can be positioned by dragging to the appropriate location on the floor plan 4804. Below the displayed floor 4804 is a dock area 4806 that includes all devices displayed in rows. The user can drag a device to any tile on the floor 4804 that does not already contain a device icon. Devices can also be dragged back off the floor 4804 and onto the dock 4806. To identify a device the user can tap a device icon or start dragging and the name will appear above the device icon. FIG. 34 shows an example of dragging a device icon during which a name of the device 4900 (“Front Door”) is displayed, under an embodiment. Devices are not required to be placed on floors, and any devices left in the dock 4806 are ignored when Home View is displayed. These can be added to any floor at a subsequent time. Newly installed devices are also left on the dock 4806, ready to be placed when editing.


The dock 4806 has a grid of tiles, similar to the floor plans. The user can move devices around on that grid. Upon exiting the editor and then returning, the dock is drawn in ordered rows. Devices of an embodiment are placed every-other-tile, up to 11 devices per row and up to 3 rows for a total of up to 33 devices on screen, but are not so limited. If the site has more than 33 devices in the dock, they are not shown until some devices are moved onto the floor, so that the dock condenses after each device is placed on a floor.


The selected floor plan provides a basic perimeter for the floor. If the user wishes to change the default perimeter walls or define interior walls, the user can switch to Walls mode. The user can tap any tile to customize that tile, and tapping a tile cycles the tile through twelve different tile shapes. Tile cycles start with the best-fit tile based on context, then cycle through all possible tile shapes in best-fit order. For example, if the user taps a blank tile with a horizontal line to the right and a vertical line below it, then the first tile drawn will be a corner tile that connects those lines, then a tile that connects one line, then the other line, etc.


For example, a typical task is to draw an interior wall. Each tile should require only one tap to draw as a user progresses across tiles of the floor plan. FIG. 35 is an example of a U-shaped floor plan 5000 customized by changing interior tiles to define walls 5002, under an embodiment.


The editor of an embodiment includes a Walls mode and an Erase mode, as described above. In the Walls mode and the Erase mode the device icons are hidden. Erase mode is used to change wall tiles into blank tiles, to remove mistakes, and/or begin to move a wall. For example, a user wanting to narrow a rectangular floor plan by moving an entire wall inward first switches to Erase mode and taps every tile of the vertical wall they wish to move, and then switches to Walls mode and taps every tile where they wish a wall to be placed.


An embodiment may adopt an alternative floor plan editing scheme in the form of a commercial diagramming tool. The alternative approach replaces the tile-based diagramming described above with a vector-based graphics approach. A user may choose design primitives to establish and subsequently manipulate (via touch/drag interactions or keyboard/mouse operations) basic floor plan shapes and representations. Such an approach may incorporate a “free hand” ability to trace lines or other floor plan elements (via touch/drag interactions or keyboard/mouse operations).


While editing tiles or positioning sensors, more precision may be needed, in which case the user can toggle the zoom level of the editor (including the dock) in any edit mode. To zoom to 300%, for example, the user taps the + magnifying glass 5004, and to return to 100% zoom, the user taps the − magnifying glass 5102. If there are multiple floors, tapping a floor thumbnail returns to 100% zoom. Once zoomed, the user scrolls around the floor by a dragging operation. FIG. 36 shows an example in which the zoom level is increased and dragging has been used to focus on a sensor location 5100, under an embodiment. When zoomed in, if the user touches and drags a device, the device moves and not the floor. If the user taps and drags a tile, the floor scrolls around and the tile is not altered.


Home View of an embodiment supports up to four (4) floors but is not so limited. These floors can also be used for other physical spaces, such as outbuildings or garages for example, so floor numbering is generally avoided. To define a new floor in Edit mode, the user touches a + button 4802 at the top of the screen and the Add Floor page appears. FIG. 37 is an example “Add Floor” page 5200, under an embodiment. If at least one floor has previously been defined, a new control appears to help add this new floor above (“Add Above” 5202) or below (“Add Below” 5204) the current floor. The default option adds the new floor above (“Add Above”) the current floor. By selecting a floor in Edit mode, touching +, and changing this control in the Add Floor page, the user can add basements, insert floors etc.


When more than one floor is defined in Home View, some differences appear on the Edit Home View screen. Among the changes, a column of floor thumbnails appears on the right portion of the screen. The currently selected floor thumbnail is highlighted, and the user can tap any floor to switch to that floor. For example, the user can move a device to the dock, switch floors by touching the other floor thumbnail, then drag the device onto the new floor. FIG. 38 is an example Edit Home View screen 5300 showing the floor thumbnails 5302/5304 for use in selecting a floor, under an embodiment.


An additional change displayed on the Edit Home View screen includes the display of a delete floor button [−] in the title bar of the editor, to the right of the add floor button [+]. If more than one floor is defined, the user selects the [−] button to delete the current floor. The user is then prompted with a warning offering the options to Delete Floor or Cancel 5404. FIG. 39 shows the Edit Home View screen 5400 with a delete floor selector 5402, under an embodiment.


Selection of the Done button on the Edit Home View screen allows the user to exit the editor. If upon selecting the Done button the user has made changes to the floors or device locations, the user is prompted to save the changes before exiting back to the Settings screen. FIG. 40 is an example Edit Home View screen 5500 displaying options to “Save” 5502 and “Don't Save” 5504 changes following selection of the Done button, under an embodiment.


For each premise, Home View allows users to define the floors of their home and the locations of all devices on those floors using the Edit Home View layout editor described above. The output of the layout editor includes two strings that are stored in site preferences on the server. All client applications share this static definition of the site layout, and locally combine it with the current state of the sensors and panel to produce a graphical view.


Home View is presented in an embodiment using tiles, and allows a user to define up to a pre-specified number of floors (e.g., four floors, etc.), but is not so limited. Each floor in Home View is presented as a layout of tiles in two layers or structures. A first layer, or bottom layer, is a static layout of a single floor (e.g., 19 tiles by 19 tiles, etc.). FIG. 41 is an example of the floor grid data, under an embodiment. A second layer, or top layer, is a set of sensor/device icons (whose states change) placed or overlaid over the grid (first layer). FIG. 42 is an example sensor hash table for a single-floor site, under an embodiment.


The server (e.g., iServer) of an embodiment stores the two structures in two variables in site preferences, but the embodiment is not so limited. A first variable comprises a series of floor layouts corresponding to the number of floors. Each floor layout is a floor grid represented by a single string of characters (e.g., 19×19 or 361 ASCII characters), with one character corresponding to each tile as described above.


The homeViewLayouts preference string represents between 1 and 4 tile grids. Each tile grid is 19 tiles by 19 tiles for a total of 361 tiles. The grids have odd dimensions to support centering of walls. The first 361 tiles represent the first floor of the premise. If there are multiple floors, the next 361 tiles represent the second floor of the premise. Therefore, the homeViewLayouts length is 361 characters (premise having one floor), 722 characters (premise having two floors), 1083 characters (premise having three floors), or 1444 characters (premise having four floors). FIG. 43 shows an example homeViewLayouts string, under an embodiment.
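
A minimal sketch of recovering the per-floor grids from that flat string follows (illustrative Python; the all-empty sample string using the tile character “e” is an assumption consistent with the tile characters described below).

```python
# Illustrative sketch: slice the homeViewLayouts string into per-floor
# 19x19 grids, one character per tile.
GRID_SIDE = 19
TILES_PER_FLOOR = GRID_SIDE * GRID_SIDE  # 361

def parse_layouts(layouts: str) -> list:
    assert len(layouts) % TILES_PER_FLOOR == 0, "corrupt layout string"
    floors = []
    for f in range(len(layouts) // TILES_PER_FLOOR):
        flat = layouts[f * TILES_PER_FLOOR:(f + 1) * TILES_PER_FLOOR]
        # Split the flat floor string into 19 rows of 19 tile characters.
        floors.append([flat[r * GRID_SIDE:(r + 1) * GRID_SIDE]
                       for r in range(GRID_SIDE)])
    return floors

one_floor = "e" * TILES_PER_FLOOR     # an all-empty single-floor layout
print(len(parse_layouts(one_floor)))  # -> 1 floor of 19 rows
```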


A second variable comprises a hash table mapping specific tiles to sensors, separated by commas, and every sensor is represented. A homeViewDevices preference string represents such information and comprises key,value pairs separated by commas. An example homeViewDevices character string is as follows:

    • homeViewDevices=“3,zone2,74,zone5,88,zone1,129,zone2,166,cameraFront Door Cam,200,lightUpstairs Light 2,226,thermoUpstairs”.


The key of the key,value pair is an integer representing the absolute offset into the homeViewLayouts array. The value of the key,value pair represents a way to precisely identify the device. For sensors, this value is “zone” followed by the zone ID. For example, if the front door (zone id 7) is on the third tile over, then the key value pair is 2,zone7 (e.g., zero-based offset).
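
Parsing the homeViewDevices string amounts to splitting on commas and pairing alternating tokens; a sketch follows (illustrative Python; device identifiers may contain spaces but not commas, per the example above).

```python
# Illustrative sketch: turn the comma-separated key,value pairs of
# homeViewDevices into a tile-offset -> device-identifier map.
def parse_devices(pref: str) -> dict:
    parts = pref.split(",")
    # Tokens alternate: offset, identifier, offset, identifier, ...
    return {int(parts[i]): parts[i + 1] for i in range(0, len(parts), 2)}

devices = parse_devices("2,zone7,74,zone5")
print(devices[2])  # -> "zone7" (front door on the third tile, zero-based)
```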


Each tile set includes twelve basic shapes. The shapes of an embodiment include, but are not limited to the following: empty; horizontal wall; vertical wall; top left corner; top right corner; bottom left corner; bottom right corner; T-shape down; T-shape right; T-shape up; T-shape left; 4 corner shape. FIG. 44 shows the twelve shapes of a tile set, under an embodiment. Wall lines are centered within each tile to ensure alignment. The user draws the floor(s) of their premise using the shapes, and the set of tile shapes is used while editing (generally blue, like blueprints), and for two of the rendered states of the security system: when alarmed (red and black striped) and when offline (gray tiles).


As stated above, the user defines the walls of each floor of their home using twelve basic tile shapes. However, when a floor is rendered, the building exteriors should be readily distinguished from the interiors. For rendering Home View in armed and disarmed states, algorithms determine the interior of the home and compute which tiles are transparent and which tiles are filled. For perimeter walls, the algorithm clears the exterior side but not the interior side. A larger set of tiles is used to handle all possible transparent/filled tile renderings. FIG. 45 shows the tile shapes and corresponding fill options for rendered tiles, under an embodiment.


This distinction is achieved when the editor is exited and tiles exterior to each building are replaced with transparent tiles. Similarly, tiles with walls facing the exterior are replaced with tiles where the exterior portions are transparent.



FIG. 46 is an example tile rendering for a room of a premise, under an embodiment. In this example, there are two perimeter versions of the top-right corner tile “t”: one perimeter version is filled on the bottom right (tile “u”), and one perimeter version is filled on the top left (tile “U”).


A description follows of the operation of the algorithm for determining an interior and an exterior. The algorithm generates a list of all tiles on the edge of each floor that are empty (top row, bottom row, left column, right column; up to 19+19+2*17=72 tiles per floor). For each such tile, a function is called to clear the tile. In that function, the empty tile is changed to an empty exterior tile (for example, “e” changes to “E”). The algorithm then examines the four tiles on each side (top, right, bottom, left) of the current tile and, if they are non-empty, replaces them with tiles where the side facing the current tile is transparent. The algorithm then examines the four tiles diagonal to this exterior tile and, if they are non-empty and have a corner (T shapes, plus shape, corners), replaces them with tiles where the corner facing the current tile is transparent. A list is then generated comprising any of the four tiles on each side (top, right, bottom, left) of the current tile that are empty. For each such empty tile, a recursive function is called and the process repeats as described above.


In order to avoid stepping into “doors”, the algorithm does not call the recursive function in response to empty tiles if there are wall edges touching the current tile. For example, the process only recurses down to an empty tile if the tiles to the right and left are not horizontal tiles (or similar) touching the current tile. This works for doors one and two tiles wide; wider openings get filled.
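
An illustrative Python sketch of the exterior marking step follows; it implements only the core flood fill from the empty edge tiles (the wall-side clearing and the door-detection refinement described above are omitted), and the tile characters are assumptions consistent with the “e”/“E” example, with “h” standing in for a horizontal wall tile.

```python
# Illustrative sketch: starting from empty edge tiles, mark connected empty
# tiles as exterior ("e" -> "E"). An explicit stack replaces the recursion.
def mark_exterior(grid: list) -> None:
    rows, cols = len(grid), len(grid[0])
    stack = [(r, c) for r in range(rows) for c in range(cols)
             if (r in (0, rows - 1) or c in (0, cols - 1))
             and grid[r][c] == "e"]
    while stack:
        r, c = stack.pop()
        if grid[r][c] != "e":
            continue
        grid[r][c] = "E"                  # now known to be exterior
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == "e":
                stack.append((nr, nc))

grid = [list("eee"), list("ehe"), list("eee")]  # "h": hypothetical wall tile
mark_exterior(grid)
print(["".join(row) for row in grid])  # -> ['EEE', 'EhE', 'EEE']
```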


The fully computed floor definition is stored in the gateway (e.g., iHub) and/or server (e.g., iServer) but is not so limited. If the Home View editor is used, these computed tiles can be converted back to the twelve-tile set while editing. The Home View data output from Edit mode is checked to ensure integrity of parameters, for example: the number of tiles (and number of floors) is correct; the tile data only includes valid tile characters; all sensors and devices still exist. At the time Home View is rendered, the same checks are again performed to verify data integrity. If any checks fail, the user is presented a dialog, and the preference returns to the System Icon (the “orb”). Essentially the feature is turned off for display, but the data is still there until edited. If the user tries to edit Home View and the data is corrupted, they are given the option to reset the data and start over.


An alternative embodiment of Home View also provides methods for generating and presenting floor plans and icons representing sensors overlaid on a floor plan for a home, thereby enabling users to quickly see the state of each sensor (such as open doors, status of lights and thermostats, etc.), and click on any sensor to get more information about that sensor. As described in detail herein, FIG. 24 shows a table of sensor state icons displayed on the Home View floor plan, and FIG. 25 shows example sensor status and device icons of Home View, under an embodiment. The device icons include, but are not limited to, icons representing lights, thermostats, cameras, locks, and energy devices, to name a few. Each of the device icons changes states in the same way it changes in its device list. These states include offline, installing, quiet, and active states but are not so limited. The sensor states displayed in an embodiment include, but are not limited to, the following: breached or alarmed, tripped, or tampered (e.g., red icon) (interesting sensor); low battery (e.g., red icon) (interesting sensor); offline/AWOL (e.g., red icon) (interesting sensor); unknown (if the iHub or Security Panel is offline, all sensors have a grey diamond icon and “Unknown” for the status text) (e.g., grey icon) (interesting sensor); installing (e.g., grey icon) (interesting sensor); open door/window (e.g., yellow icon) (interesting sensor); motion sensor active (e.g., yellow icon) (interesting sensor); okay, closed, no motion (e.g., green icon) (quiet sensor). The states of each sensor icon of an embodiment are updated periodically (typically 15-30 seconds) to reflect their status.


A touch sensed anywhere in Home View navigates the user interface to the sensor list available in the System Icon view. The sensor icons of an embodiment update periodically (e.g., frequently) to reflect their current status (e.g., an open window). The sensor icon also represents the “health” of that sensor (offline, low battery, etc.). A user can hover over (in a desktop web browser) or tap (tablet/touch device) any sensor icon and see a popup display showing the name, state, and the last event for that sensor. FIG. 47 is an example popup display presented in response to hovering near/adjacent to a sensor icon (e.g., the “Garage” sensor), under an embodiment. If the device is at the very top of the screen, the popup box may appear below the sensor. Alternatively, if the device is on the edge of the screen, the popup box may be pushed inward or displayed in another portion of the interface. Clicking (desktop) or double-tapping (tablets) in regions of the display causes the system to navigate to sensor history. When the interface is displayed on an iPhone, for example, the popup box may also have a blue “more” button for that same navigation.


If more than one floor has been defined in Layout mode of Home View, the display includes thumbnails on a portion of the display that indicate the existence of floors above or below the current one, and a process to switch floors. FIG. 48 shows a Home View display that includes a floor plan display 4800 of a selected floor along with indicators 4801/4802 for multiple floors, under an embodiment. In this example, two icons are presented to indicate a first (lower) floor 4801 and a second (upper) floor 4802. Alternatively, other notations (e.g., dots, etc.) can be used to indicate multiple floors. The currently-displayed floor 4801 (e.g., first (lower) floor) is highlighted. The last-viewed floor will be remembered across sessions. When accessing Home View via a mobile portal, the display of indicators for multiple floors through the mobile portal includes numbered links on a portion of the display (e.g., right), starting from “1”. The currently-displayed floor is shown as bold, and not a link, for example:

    • Floor: 1 2 3


The use of Home View as a user interface includes a system icon or Summary Text that provides definitive information on the current arm state, and a summary of any sensor issues. Additionally, the system arm/disarm buttons are displayed separately. FIG. 49 shows an example of the Home View user interface displayed via a mobile device (e.g., iPhone), under an embodiment. The user interface 4900 includes a floor plan display 4901 of a selected floor along with indicators 4902 for selecting among corresponding multiple floors of a building. System state information is displayed 4903 (“Disarmed. All Quiet”), and an “Arm” button 4904 is displayed by which a user controls arming of the system. A toolbar 4905 is included by which a user selects a device type (e.g., security, cameras, lights, thermostats, etc.) for which status and control information is available via Home View.


Home View is configured via site settings as described in detail herein. Each application retains or remembers the user's preferred mode across sessions. FIG. 50 shows an example of a Settings page of Home View, under an embodiment. The Settings page includes a Sites list, a “Home View” button 5001, and a corresponding On/Off switch 5002. For site owners, there is also a “Set Up Home View” button (not shown), the selection of which directs the system to the editor. Once Home View is defined by a user, the interface presents the “Set Up Home View” button as an “Edit Home View” button 5003. In the web portal of an embodiment, Home View can be enabled and edited using a Customize link on the Summary tab. Users can check the box to show Home View, and site owners will have an Edit button.


Any time the user wants to alter their Home View floor plans or device positions, they can choose Settings and then select the Edit Home View button. If a device has been deleted, then the Home View display code removes it from the device settings table. If a device has been installed or added to the system, it does not automatically appear in Home View, but it will be available in Edit Home View mode, ready to be placed on a floor.


The Home View mode of an embodiment includes an editor or Edit Mode, as described in detail herein. On the Settings screen, the user can select the Edit Home View button, as described above. This puts the user in Edit mode, where they can make changes to device positions, floor plans, labels, and add/remove floors, for example. When editing is complete, selection of a “Done” button on the screen returns a user to the Security Tab Options screen. If the user has made changes, then a dialog slides up that includes buttons for “Save Changes”, “Don't Save”, and “Cancel”. Once saved, Home View data is saved on the iHub/iServer with other site settings, and can appear in any client that has Home View enabled for display.


When the user first enters Edit mode, the user selects a basic floor plan that defines the perimeter shape of each floor of the premises. FIG. 51 shows an example “Home View Setup” editor page 5100 for use in selecting a floor plan, under an embodiment. Numerous floor plan selections 5102 are presented in a region of the screen labeled “Select a floor plan”, and the floor plan selections of an embodiment comprise, but are not limited to, the following: square; horizontal; vertical; numerous different L-shapes; numerous different U-shapes; numerous different zigzag shapes. The title bar 5103 is labeled “Home View Setup” and includes a Back button 5104.


Upon selection of a basic floor plan, the selected floor plan is displayed. FIG. 52 shows a “Home View Setup” editor screen 5200 with a selected floor plan 5201, under an embodiment. The editor screen 5200 displays a selected floor plan 5201, and includes a device dock 5202, or dock 5202, that includes devices 5203 as represented by icons. The editor 5200 includes an “Options” 5204 icon, the selection of which presents editing options. For example, FIG. 59 shows a Home View Setup page 5900 with options displayed, under an embodiment. The editor 5200 includes numerous editing operations including, but not limited to, positioning devices (dragging device icons from the dock and placing devices on the floor), editing walls (adding new horizontal or vertical walls, or deleting existing walls), and adding or editing labels (changing or deleting room labels). If the default floor plan matches the user's home, then the user has only to position devices on that floor plan. Optionally, the user can add labels. If the user wishes to modify a floor plan or define interiors, however, then walls can be drawn or erased.


Devices are represented by icons that are presented in a device icon dock 5202 of the interface. The interface includes a dock area that includes device icons displayed in rows. Device icons are positioned on the floor plan by dragging them from the dock to the appropriate location on the floor plan. To identify a device the user can tap a device icon or start dragging the device and the name will appear above the device icon. Devices can also be dragged back off the floor and into the dock. Furthermore, labels can be added to devices of the home (e.g., front door 5301). FIG. 53 shows an example editor screen 5300 for which a label 5301 with a name of the device (“Front Door”) is displayed, under an embodiment.


There is no requirement under an embodiment for devices to be placed on floors, and any device left in the dock is ignored when Home View is displayed. The devices remaining in the dock can be added to any floor of a floor plan at a subsequent time. Newly installed devices are also left on the dock, ready to be placed when editing. The dock of an embodiment is rendered in ordered rows, and the dock can be scrolled vertically to access all devices in the dock.


The selected floor plan of Home View provides a basic perimeter for the floor, but is not so limited. A user wishing to draw new perimeter walls or define interior walls drags across the grid lines to create new walls. The user deletes walls in much the same way by dragging along the gridline over an existing wall. The process of erasing old walls then drawing new ones can be used to “move” a wall but the embodiment is not so limited. For example, the process of narrowing a rectangular floor plan by moving an entire wall inward includes dragging over the vertical wall that is to be moved and then dragging on the new gridline where the wall is to be placed. FIG. 54 shows a Home View Setup page 5400 with a selected floor plan 5201 that has been edited to add numerous interior walls 5401, under an embodiment.


A user can edit labels on any location of a floor plan, where editing includes adding, editing, and deleting labels. FIG. 55 shows a Home View Setup page with a label editing prompt 5501, under an embodiment. To add a new label, the user selects the option to add a room label and then touches a location for that label. In response the interface presents a label editing prompt 5501 for the label text. In order to edit an existing label, the user taps that location and the same label editing prompt 5501 is presented for use in editing the label. To delete a label the user clears the text.


The floor plan editing of an embodiment includes zoom editing in order to offer increased precision when editing. FIG. 56 shows a Home View Setup page 5600 in a zoomed editing mode to zoom on one room 5601 in a building, under an embodiment. The user edits in a zoomed mode by tapping a magnifying glass icon 5206 displayed on Home View Setup. When using zoom editing, the magnifying glass icon 5206 of the Home View Setup page is replaced with a floor plan icon 5602 displaying the entire floor plan with an overlay 5603 showing the region of the floor plan on which the user has zoomed. Once zoomed, the user scrolls around the floor by dragging the view rectangle in the zoom thumbnail area. Tapping the zoom thumbnail area returns the display to full zoom. When zoom editing, the touching and dragging of a device results in the device being moved instead of the floor. When the user draws a wall and drags the wall, the editor scrolls the floor automatically.


Home View of an embodiment supports the addition of multiple floors, and these floors can also be used for other physical spaces (e.g., outbuildings, garages, etc.). FIG. 57 shows a Home View Setup page for adding at least one floor to a floor plan, under an embodiment. In order to define a new floor in Edit mode, the user touches the Options button 5204 at the top of the Home View Setup page and chooses Add Floor Above (e.g., FIG. 59, element 5902). In response, the Add Floor page 5700 appears. In addition to the predefined floor plans, the current user floor is also available for copying to a new floor. The Add Floor page 5700 presents a prompt 5703 to select a floor plan along with numerous floor plans 5704 available for selection.


The Home View editor supports editing with multiple floors. FIG. 58 shows a Home View Setup page 5800 with editing for multiple floors, under an embodiment. When more than one floor is defined, the editor has a few changes. For example, a column of floor thumbnails 5802 appears in a portion of the interface, and the currently selected floor thumbnail 5801 is highlighted. At any time, the user can tap any floor to switch to that floor. As another example, a Remove Floor option is available in the Options menu (see FIG. 59, element 5902).


The Home View editor enables the setting of a default floor when multiple floors are included. Generally, the first floor is drawn first on any client. However, if multiple floors are included and the bottom floor is not the default (e.g., a basement is included), Home View enables changing of this default. The default floor is changed, for example, by tapping the icon for the second floor and then choosing the option “Set As Default Floor” (see FIG. 59, element 5902).


The Home View editor supports the moving of devices between floors when multiple floors are included. At any time, the user can move a device to the dock, switch floors by touching the floor thumbnail corresponding to the desired floor, then drag the device onto the new floor.


The Home View editor of an embodiment includes auto-fill interiors. By default, the interiors of each floor of an embodiment are “filled” to look different from the exteriors, and the interior walls are less prominent than the exterior walls. The auto-fill interiors can be selectively enabled.


The Home View editor is exited by tapping a “Done” button. If changes have been introduced to the floors, device locations, or labels during an editing session, the editor prompts the user to save the changes before exiting back to the Settings screen. FIG. 60 shows a Home View Setup page 6000 with editor exit option prompts 6001 displayed, under an embodiment.


Home View of an embodiment includes or couples to a common data model. For each site, the site owner can use the Edit Home View layout editor to define the floors of the home, label the rooms of the home, and indicate the locations of the devices in the home. FIG. 61 is an example floor plan, under an embodiment. The output of the layout editor of an embodiment is represented using compact ASCII strings stored in site preferences on the server, but is not so limited. This storage scheme uses a virtual grid, and stores simple vector and x,y locations on that grid. For example, given a single-story home, the data describes the visual components as follows: the lighter-shade interior tile areas are described as two large rectangles; the stronger, exterior walls are described as four horizontal and three vertical vectors; the lighter interior walls are described as one horizontal and one vertical vector; the two device icons are each described with an x,y coordinate plus device identifier; the two room labels are each described with an x,y coordinate plus the text.


This static ASCII data model of the home is stored by the editor so that client applications can fetch the static data model and combine it locally with the current state of their devices to render a graphical view. The only elements that subsequently change are the device icons, which update as users take actions that affect the status of devices (e.g., open doors, turn on lights, etc.).


The data model strings are stored in three variables in site preferences on the server. The three variables include homeview/floors, homeview/devices, and homeview/labels. The variable homeview/floors specifies where the walls should be drawn for each floor, and whether interior floor space should be filled. The variable homeview/floors includes a single floor, or multiple floors (separated in the data by semicolons). If multiple floors are included, a default floor can be indicated so apps will display the default floor first.


The variable homeview/devices includes a list of floor locations and device IDs to draw on those locations. For multi-floor homes, per-floor data is separated by semicolons, but is not so limited. The list of floor locations and device IDs may be a subset of devices (the data model does not include information about devices that have not been placed on a floor).


The variable homeview/labels includes a list of locations, and text labels to draw centered on those locations. For multi-floor homes the data per floor is separated by semicolons.
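

Taken together, the three variables could be modeled on a client as in the following TypeScript sketch. The interface name is an assumption; the example values are the single-story strings from the examples described below (FIGS. 62-64).

    // Hypothetical client-side shape of the three site-preference variables;
    // each is a compact ASCII string with per-floor data separated by ";".
    interface HomeViewPreferences {
      'homeview/floors': string;  // wall vectors and interior-fill data
      'homeview/devices': string; // placed-device locations plus device IDs
      'homeview/labels': string;  // label locations plus bracketed text
    }

    // Values taken from the single-story examples described below.
    const prefs: HomeViewPreferences = {
      'homeview/floors': '28;taappap}n happ vhui Haappphxpfa}} Vaa}pap}pn',
      'homeview/devices': 't{qE5VER1 h{w0FEBED',
      'homeview/labels': 'vhg[Bedroom] tsv[Living%20Room]',
    };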


Home View of an embodiment includes a compact method for storing numbers wherein, throughout this model, numbers such as x,y coordinates and vector lengths are compactly represented using an ASCII-offset model starting with the lowercase alphabet (plus a few characters that follow z in ASCII for >26), as follows:

    • a=0, b=1, c=2, . . . , x=23, y=24, z=25, {=26, |=27, }=28


      The use of this model enables specification of any (x,y) coordinate using two characters. For example, a horizontal line drawn from x,y position 2,5 with a length of 20 (2,5,20) can be represented by storing the “2” as “c”, storing “5” as “f”, and storing 20 as “u”, compactly storing the line as “cfu”.
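

A pair of one-line helpers is enough to implement this offset model. The following TypeScript sketch (helper names assumed) shows the round trip, including the (2,5,20) line example above.

    // 'a' (ASCII 97) represents 0; the characters {, |, } follow z in ASCII
    // and extend the range to 26-28.
    const BASE = 'a'.charCodeAt(0); // 97

    function encodeNum(n: number): string {
      return String.fromCharCode(BASE + n); // 0..28 -> 'a'..'}'
    }

    function decodeNum(ch: string): number {
      return ch.charCodeAt(0) - BASE;
    }

    // The horizontal line (2,5,20) from the example stores as "cfu".
    const line = encodeNum(2) + encodeNum(5) + encodeNum(20); // "cfu"
    console.log(decodeNum(line[2])); // 20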


The homeview/floors variable includes specific data elements, but the embodiments are not so limited. The data elements of an embodiment include the following: [max # of tiles across] [optional flag: don't autofill interiors]; [floorplan data for 1st floor] [; floorplan data for 2nd floor] [; 3rd floor] [; 4th floor] [; 5th floor].


The data element “max # of tiles across” is saved as 28 by default. The result is that the user can draw a floor plan containing up to 28 “tiles” across (bounded by up to 29 vertical wall lines), which supports a house with up to five rooms across.


The data element optional flag to prevent autofill interiors, when included, instructs the Home View editor never to fill any floor interiors when exporting the floor data. The data may not include any interior tiles in any case, depending on how the walls were drawn, but this flag prevents any interior tiles from being computed by the editor.


The data element “semicolon” separates the general settings from the first floor data.


The data element “floorplan data for a single floor” includes an optional flag plus a number of blocks of text representing vectors to draw, each block separated by spaces. The first character of each block indicates the type of vector to draw, and the characters that follow represent the vectors. When a floor should be shown first, the flag “default” is added before the vector data for that floor. Generally, the first floor is the default, so in that case (or in a single-floor house) this flag is not needed. The blocks of text representing the vectors include but are not limited to an H block, V block, h block, v block, and t block.


The H block, when there are horizontal exterior walls to draw, starts with a capital H, followed by three characters for each horizontal wall to draw (startX, startY, length). For example, a 15-tile wall drawn from the top corner is represented as H(0,0,15), which is compactly represented as Haap. A second horizontal wall drawn elsewhere appends another block of three coordinates. So Haap might become Haappph if there are two horizontal exterior walls. In the full example there are four exterior walls to draw so the data block is H followed by 4 triples: Haappphxpfa}}.


The V block, when there are vertical exterior walls to draw, starts with a capital V, followed by three characters for each vertical wall to draw (startX, startY, length). For example, a vertical exterior wall drawn down the left side, V(0,0,28), is compactly represented as Vaa}. Again, another three characters are added for each additional vertical exterior wall to draw.


The h block is similar to the H block except these are rendered as horizontal interior walls. This block starts with the letter h, followed by three characters for each horizontal line to draw (startX, startY, length). For example, a 15-tile line drawn in the middle is represented as h(0,15,15), which is compactly represented as happ. Another wall drawn in another area appends another block of three coordinates for each additional wall.


The v block is similar to the V block except these are rendered as vertical interior walls. This block starts with the letter v, followed by three characters for each vertical wall to draw (startX, startY, length).


The t block, when there are interior tiles to draw, starts with the letter t, followed by four characters for each rectangle to draw (x, y, width, height). For example, a 15-tile square drawn in the corner is represented as t(0,0,15,15), which is compactly represented as taapp. Another rectangle of tiles drawn in another area appends another block of four coordinates. So taapp might become taappap}n.


If there are multiple floors, a semicolon is added and then another block of floor plan data can be added. For an empty floor, nothing appears between the semicolons. For example, a three-story house with nothing defined for the middle floor is represented as follows: 28; Haapppgxpfa Vaa}pap}pn;; Haapppgxpfa Vaa}pap}pn.



FIG. 62 is an example Home View one-story floor plan, under an embodiment. This floor plan is represented in an embodiment as follows: 28 (draw on a grid 28 tiles wide by 28 tiles tall); taappap}n (draw interior tiles as two large rectangles (x,y,w,h): (0,0,15,15) and (0,15,28,13)); happ (draw an interior horizontal wall (x,y,w): (0,15,15)); vhui (draw an interior vertical wall (x,y,h): (7,20,8)); Haappphxpfa}} (draw 4 exterior horizontal walls); Vaa}pap}pn (draw 3 vertical exterior walls). The complete homeview/floors data for this single-story home would be: 28;taappap}n happ vhui Haappphxpfa}} Vaa}pap}pn.
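

For illustration, a client-side parser for one floor's block list might look like the following TypeScript sketch; the type and function names are assumptions. Applied to the FIG. 62 string, it recovers the two tile rectangles and the interior and exterior walls enumerated above.

    interface Vec { x: number; y: number; len: number }
    interface Rect { x: number; y: number; w: number; h: number }
    interface FloorPlan {
      hExterior: Vec[]; vExterior: Vec[]; hInterior: Vec[]; vInterior: Vec[];
      tiles: Rect[]; isDefault: boolean;
    }

    const num = (s: string, i: number) => s.charCodeAt(i) - 'a'.charCodeAt(0);

    function parseFloor(data: string): FloorPlan {
      const plan: FloorPlan = {
        hExterior: [], vExterior: [], hInterior: [], vInterior: [],
        tiles: [], isDefault: false,
      };
      const walls: Record<string, Vec[]> = {
        H: plan.hExterior, V: plan.vExterior, h: plan.hInterior, v: plan.vInterior,
      };
      for (const block of data.trim().split(/\s+/)) {
        if (block === 'default') { plan.isDefault = true; continue; }
        const type = block[0], body = block.slice(1);
        if (type === 't') {          // interior tile rectangles (x, y, w, h)
          for (let i = 0; i + 3 < body.length; i += 4)
            plan.tiles.push({ x: num(body, i), y: num(body, i + 1),
                              w: num(body, i + 2), h: num(body, i + 3) });
        } else {
          const list = walls[type];  // wall vectors (startX, startY, length)
          if (!list) continue;
          for (let i = 0; i + 2 < body.length; i += 3)
            list.push({ x: num(body, i), y: num(body, i + 1), len: num(body, i + 2) });
        }
      }
      return plan;
    }

    // FIG. 62: grid width 28, then one floor of blocks.
    const [width, floorData] = '28;taappap}n happ vhui Haappphxpfa}} Vaa}pap}pn'.split(';');
    const floor = parseFloor(floorData); // 2 tiles, 1+1 interior, 4+3 exterior walls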


The homeview/devices variable includes specific data elements, but the embodiments are not so limited. The data elements of an embodiment include the following: [device location+id on 1st floor] [another device location+id on 1st floor] [ . . . ] [; device data for 2nd floor] [; 3rd floor] [; 4th floor] [; 5th floor].


Regarding the device location data element, each device location starts with a letter indicating location type: t (center the device over the middle of a tile); h (center the device over the middle of a horizontal segment); v (center the device over the middle of a vertical segment). The device location is followed by two characters that specify the (x,y) location of that tile or wall segment. For example, to place a device in the center of the first tile an embodiment uses t(0,0), represented as taa.


The device identifier data element is the unique identifier for the device. Note that some IDs can be long, so an embodiment only stores the last six characters of the device ID. For example, if the identifier is “ZONE12VER1”, an embodiment stores “12VER1”, and if the identifier is “ZONE5VER1” the embodiment stores “E5VER1”.


A complete device location+id element is a minimum of four characters (type, x, y, id) and can be up to nine characters. An example of a complete device location and identification is as follows: Draw camera “SCOFEBED” centered on the third horizontal wall segment across the top: h+(2, 0)+SCOFEBED, stored compactly as hca0FEBED. Another example of a complete device location and identification is as follows: Draw a z-wave light with ID “7” centered over the vertical wall segment 11 across and 5 down: vke7.


Data for multiple floors are separated by semicolons as described herein. Therefore, for a three-story house with just two devices on the third floor, the data is as follows: ;;t{qE5VER1 h{w0FEBED.



FIG. 63 is an example Home View floor plan that includes two devices, under an embodiment. This floor plan is represented in an embodiment as follows: t{qE5VER1: draw a motion sensor “ZONE5VER1” centered over the tile at x,y location (26, 16); h{w0FEBED: draw camera “SCOFEBED” centered over the horizontal wall at x,y location (26, 22). The complete homeview/devices data for this single-story home is: t{qE5VER1 h{w0FEBED.
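

A corresponding parser for one floor's worth of homeview/devices data could be sketched as follows (names assumed). Note that only the stored six-character ID suffix is recovered, per the truncation rule described above.

    interface PlacedDevice {
      anchor: 't' | 'h' | 'v'; // tile, horizontal segment, or vertical segment
      x: number; y: number;
      idSuffix: string;        // last six (or fewer) characters of the device ID
    }

    function parseDevices(floorData: string): PlacedDevice[] {
      return floorData.trim().split(/\s+/).filter(Boolean).map(el => ({
        anchor: el[0] as 't' | 'h' | 'v',
        x: el.charCodeAt(1) - 97,
        y: el.charCodeAt(2) - 97,
        idSuffix: el.slice(3), // e.g. "E5VER1" for device ID "ZONE5VER1"
      }));
    }

    // FIG. 63: a motion sensor on tile (26,16) and a camera on the
    // horizontal wall segment at (26,22).
    const devices = parseDevices('t{qE5VER1 h{w0FEBED');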


The homeview/labels variable includes specific data elements, but the embodiments are not so limited. The data elements of an embodiment include the following: [label location+label text on 1st floor] [another location+label on 1st floor] [ . . . ] [; label data for 2nd floor] [; 3rd floor] [; 4th floor] [; 5th floor].


Each label location data element starts with a letter indicating location type: t (center the label over the middle of a tile); h (center the label over the middle of a horizontal segment); v (center the label over the middle of a vertical segment). The label location data element is followed by two characters that specify the (x,y) location of that tile or wall segment. For example, to place a label in the center of the first tile, an embodiment uses t(0,0), represented as taa.


The label text data element can be almost any string, enclosed in brackets [ ]. The text encoding of an embodiment follows the W3C definition for the encodeURIComponent( ) method in JavaScript, which encodes everything except ˜!*( ). The only characters not allowed in labels are the brackets themselves ([ ]). These should be stripped out when labels are defined in the editor.


Empty labels should not be stored. A complete label location+text element includes a minimum of six characters (type, x, y, [text]), as in vhg[Bedroom].



FIG. 64 is an example Home View floor plan that includes two labels, under an embodiment. This floor plan is represented in an embodiment as follows: vhg[Bedroom]: draw the label “Bedroom” centered over the vertical wall at x,y location (7, 6); tsv[Living%20Room]: draw the label “Living Room” centered over the tile at x,y location (18, 21). The complete homeview/labels data for this single-story home is: vhg[Bedroom] tsv[Living%20Room].
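

Label elements parse the same way, with the bracketed text decoded per the encodeURIComponent( ) convention noted above. The following TypeScript sketch (names assumed) recovers the two labels of FIG. 64.

    interface PlacedLabel { anchor: string; x: number; y: number; text: string }

    function parseLabels(floorData: string): PlacedLabel[] {
      const labels: PlacedLabel[] = [];
      // Brackets are disallowed inside label text, so a simple pattern works.
      for (const m of floorData.matchAll(/([thv])(.)(.)\[([^\]]*)\]/g)) {
        labels.push({
          anchor: m[1],
          x: m[2].charCodeAt(0) - 97,
          y: m[3].charCodeAt(0) - 97,
          text: decodeURIComponent(m[4]), // "Living%20Room" -> "Living Room"
        });
      }
      return labels;
    }

    const labels = parseLabels('vhg[Bedroom] tsv[Living%20Room]');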


As described in detail herein, the user defines the walls of each floor of a home by drawing basic vectors. However, when a floor is rendered, the building exteriors should be readily distinguished from the interiors. For rendering Home View, an embodiment includes algorithms that determine the interior of the home and compute which tiles should be transparent and which are filled. Perimeter walls are rendered to be more vivid than interior walls. The user may draw openings in the external walls.


The algorithm of an embodiment for determining interior and exterior walls begins by marking all tiles as interior tiles. A list is generated of tiles on the edge of each floor that are empty (top row, bottom row, left column, right column), and a function is called to clear each tile having no outside wall. Any edge tiles having no walls outside of them are marked as exterior tiles. For each exterior tile, the algorithm recursively searches the surrounding tiles. If there are no walls separating that tile from the next, then the next one is also marked as exterior.


In this way, Home View recursively crawls into the house from the edges, marking tiles as “exterior” as the operation proceeds. Once all exterior tiles are determined, walls adjacent to them are also considered “exterior”, and any walls bounded by interior tiles are considered “interior”. The algorithm identifies small openings, before recursing from one exterior tile to an adjacent tile, by examining the nearby walls to ensure the opening is wide enough before proceeding. This interior/exterior computation is performed by the Home View editor, and stored with the floor data on the server. Client renderers have an easier job since the data indicates interior/exterior information as defined above in homeview/floors.


The Home View data output from Edit mode is checked to ensure integrity by verifying the following: the home vectors fit within the bounds of each floor; all sensors and devices still exist. At the time of rendering of the home view, the same checks are repeated to verify data integrity. If any check fails, a dialog is presented to the user, and the preference returns to the System Icon (the “orb”). The feature is therefore turned off for display, but the data remains until subsequently edited; if a user attempts to edit Home View and the data is corrupted, the user is given the option to reset the data and start over.


The integrated security system includes couplings or connections among a variety of IP devices or components, and the device management module is in charge of the discovery, installation and configuration of the IP devices coupled or connected to the system, as described above. The integrated security system of an embodiment uses a “sandbox” network to discover and manage all IP devices coupled or connected as components of the system. The IP devices of an embodiment include wired devices, wireless devices, cameras, interactive touchscreens, and security panels, to name a few. These devices can be wired (via Ethernet cable) or wireless (via WiFi), all of which are secured within the sandbox network. The “sandbox” network is described in detail below.



FIG. 65 is a block diagram 500 of network or premise device integration with a premise network 250, under an embodiment. In an embodiment, network devices 255-257 are coupled to the gateway 102 using a secure network coupling or connection such as SSL over an encrypted 802.11 link (utilizing for example WPA-2 security for the wireless encryption). The network coupling or connection between the gateway 102 and the network devices 255-257 is a private coupling or connection in that it is segregated from any other network couplings or connections. The gateway 102 is coupled to the premise router/firewall 252 via a coupling with a premise LAN 250. The premise router/firewall 252 is coupled to a broadband modem 251, and the broadband modem 251 is coupled to a WAN 200 or other network outside the premise. The gateway 102 thus enables or forms a separate wireless network, or sub-network, that includes some number of devices and is coupled or connected to the LAN 250 of the host premises. The gateway sub-network can include, but is not limited to, any number of other devices like WiFi IP cameras, security panels (e.g., IP-enabled), and security touchscreens, to name a few. The gateway 102 manages or controls the sub-network separately from the LAN 250 and transfers data and information between components of the sub-network and the LAN 250/WAN 200, but is not so limited. Additionally, other network devices 254 can be coupled to the LAN 250 without being coupled to the gateway 102.



FIG. 66 is a block diagram 600 of network or premise device integration with a premise network 250, under an alternative embodiment. The network or premise devices 255-257 are coupled to the gateway 102. The network coupling or connection between the gateway 102 and the network devices 255-257 is a private coupling or connection in that it is segregated from any other network couplings or connections. The gateway 102 is coupled or connected between the premise router/firewall 252 and the broadband modem 251. The broadband modem 251 is coupled to a WAN 200 or other network outside the premise, while the premise router/firewall 252 is coupled to a premise LAN 250. As a result of its location between the broadband modem 251 and the premise router/firewall 252, the gateway 102 can be configured or function as the premise router routing specified data between the outside network (e.g., WAN 200) and the premise router/firewall 252 of the LAN 250. As described above, the gateway 102 in this configuration enables or forms a separate wireless network, or sub-network, that includes the network or premise devices 255-257 and is coupled or connected between the LAN 250 of the host premises and the WAN 200. The gateway sub-network can include, but is not limited to, any number of network or premise devices 255-257 like WiFi IP cameras, security panels (e.g., IP-enabled), and security touchscreens, to name a few. The gateway 102 manages or controls the sub-network separately from the LAN 250 and transfers data and information between components of the sub-network and the LAN 250/WAN 200, but is not so limited. Additionally, other network devices 254 can be coupled to the LAN 250 without being coupled to the gateway 102.


The examples described above with reference to FIGS. 65 and 66 are presented only as examples of IP device integration. The integrated security system is not limited to the type, number and/or combination of IP devices shown and described in these examples, and any type, number and/or combination of IP devices is contemplated within the scope of this disclosure as capable of being integrated with the premise network.


The integrated security system of an embodiment includes a touchscreen (also referred to as the iControl touchscreen or integrated security system touchscreen), as described above, which provides core security keypad functionality, content management and presentation, and embedded systems design. The networked security touchscreen system of an embodiment enables a consumer or security provider to easily and automatically install, configure and manage the security system and touchscreen located at a customer premise. Using this system the customer may access and control the local security system, local IP devices such as cameras, local sensors and control devices (such as lighting controls or pipe freeze sensors), as well as the local security system panel and associated security sensors (such as door/window, motion, and smoke detectors). The customer premise may be a home, business, and/or other location equipped with a wired or wireless broadband IP connection.


The system of an embodiment includes a touchscreen with a configurable software user interface and/or a gateway device (e.g., iHub) that couples or connects to a premise security panel through a wired or wireless connection, and a remote server that provides access to content and information from the premises devices to a user when they are remote from the home. The touchscreen supports broadband and/or WAN wireless connectivity. In this embodiment, the touchscreen incorporates an IP broadband connection (e.g., Wifi radio, Ethernet port, etc.), and/or a cellular radio (e.g., GPRS/GSM, CDMA, WiMax, etc.). The touchscreen described herein can be used as one or more of a security system interface panel and a network user interface (UI) that provides an interface to interact with a network (e.g., LAN, WAN, internet, etc.).


The touchscreen of an embodiment provides an integrated touchscreen and security panel as an all-in-one device. Once integrated using the touchscreen, the touchscreen and a security panel of a premise security system become physically co-located in one device, and the functionality of both may even be co-resident on the same CPU and memory (though this is not required).


The touchscreen of an embodiment also provides an integrated IP video and touchscreen UI. As such, the touchscreen supports one or more standard video CODECs/players (e.g., H.264, Flash Video, MOV, MPEG4, M-JPEG, etc.). The touchscreen UI then provides a mechanism (such as a camera or video widget) to play video. In an embodiment the video is streamed live from an IP video camera. In other embodiments the video comprises video clips or photos sent from an IP camera or from a remote location.


The touchscreen of an embodiment provides a configurable user interface system that includes a configuration supporting use as a security touchscreen. In this embodiment, the touchscreen utilizes a modular user interface that allows components to be modified easily by a service provider, an installer, or even the end user. Examples of such a modular approach include using Flash widgets, HTML-based widgets, or other downloadable code modules such that the user interface of the touchscreen can be updated and modified while the application is running. In an embodiment the touchscreen user interface modules can be downloaded over the internet. For example, a new security configuration widget can be downloaded from a standard web server; the touchscreen then loads the configuration widget into memory and inserts it in place of the old security configuration widget. The touchscreen of an embodiment is configured to provide a self-install user interface.


Embodiments of the networked security touchscreen system described herein include a touchscreen device with a user interface that includes a security toolbar providing one or more functions including arm, disarm, panic, medic, and alert. The touchscreen therefore includes at least one screen having a separate region of the screen dedicated to a security toolbar. The security toolbar of an embodiment is present in the dedicated region at all times that the screen is active.


The touchscreen of an embodiment includes a home screen having a separate region of the screen allocated to managing home-based functions. The home-based functions of an embodiment include managing, viewing, and/or controlling IP video cameras. In this embodiment, regions of the home screen are allocated in the form of widget icons; these widget icons (e.g., for cameras, thermostats, lighting, etc.) provide functionality for managing home systems. So, for example, a displayed camera icon, when selected, launches a Camera Widget, and the Camera Widget in turn provides access to video from one or more cameras, as well as providing the user with relevant camera controls (take a picture, focus the camera, etc.).


The touchscreen of an embodiment includes a home screen having a separate region of the screen allocated to managing, viewing, and/or controlling internet-based content or applications. For example, the Widget Manager UI presents a region of the home screen (up to and including the entire home screen) where internet widget icons such as weather, sports, etc. may be accessed. Each of these icons may be selected to launch its respective content service.


The touchscreen of an embodiment is integrated into a premise network using the gateway, as described above. The gateway as described herein functions to enable a separate wireless network, or sub-network, that is coupled, connected, or integrated with another network (e.g., WAN, LAN of the host premises, etc.). The sub-network enabled by the gateway optimizes the installation process for IP devices, like the touchscreen, that couple or connect to the sub-network by segregating these IP devices from other such devices on the network. This segregation of the IP devices of the sub-network further enables separate security and privacy policies to be implemented for these IP devices so that, where the IP devices are dedicated to specific functions (e.g., security), the security and privacy policies can be tailored specifically for the specific functions. Furthermore, the gateway and the sub-network it forms enables the segregation of data traffic, resulting in faster and more efficient data flow between components of the host network, components of the sub-network, and between components of the sub-network and components of the network.


The touchscreen of an embodiment includes a core functional embedded system that includes an embedded operating system, required hardware drivers, and an open system interface to name a few. The core functional embedded system can be provided by or as a component of a conventional security system (e.g., security system available from GE Security). These core functional units are used with components of the integrated security system as described herein. Note that portions of the touchscreen description below may include reference to a host premise security system (e.g., GE security system), but these references are included only as an example and do not limit the touchscreen to integration with any particular security system.


As an example, regarding the core functional embedded system, a reduced memory footprint version of embedded Linux forms the core operating system in an embodiment, and provides basic TCP/IP stack and memory management functions, along with a basic set of low-level graphics primitives. A set of device drivers is also provided or included that offers low-level hardware and network interfaces. In addition to the standard drivers, an interface to the RS-485 bus is included that couples or connects to the security system panel (e.g., GE Concord panel). The interface may, for example, implement the Superbus 2000 protocol, which can then be utilized by the more comprehensive transaction-level security functions implemented in PanelConnect technology (e.g., SetAlarmLevel(int level, int partition, char *accessCode)). Power control drivers are also provided.



FIG. 67 is a block diagram of a touchscreen 700 of the integrated security system, under an embodiment. The touchscreen 700 generally includes an application/presentation layer 702 with a resident application 704, and a core engine 706. The touchscreen 700 also includes one or more of the following, but is not so limited: applications of premium services 710, widgets 712, a caching proxy 714, network security 716, network interface 718, security object 720, applications supporting devices 722, PanelConnect API 724, a gateway interface 726, and one or more ports 728.


More specifically, the touchscreen, when configured as a home security device, includes but is not limited to the following application or software modules: RS-485 and/or RS-232 bus security protocols to a conventional home security system panel (e.g., GE Concord panel); functional home security classes and interfaces (e.g., Panel ARM state, Sensor status, etc.); Application/Presentation layer or engine; Resident Application; Consumer Home Security Application; Installer Home Security Application; Core Engine; and System Bootloader/Software Updater. The core Application engine and system bootloader can also be used to support other advanced content and applications. This provides a seamless interaction between the premise security application and other optional services such as weather widgets or IP cameras.


An alternative configuration of the touchscreen includes a first Application engine for premise security and a second Application engine for all other applications. The integrated security system application engine supports content standards such as HTML, XML, Flash, etc. and enables a rich consumer experience for all ‘widgets’, whether security-based or not. The touchscreen thus provides service providers the ability to use web content creation and management tools to build and download any ‘widgets’ regardless of their functionality.


As discussed above, although the Security Applications have specific low-level functional requirements in order to interface with the premise security system, these applications make use of the same fundamental application facilities as any other ‘widget’, application facilities that include graphical layout, interactivity, application handoff, screen management, and network interfaces, to name a few.


Content management in the touchscreen provides the ability to leverage conventional web development tools, performance optimized for an embedded system, service provider control of accessible content, content reliability in a consumer device, and consistency between ‘widgets’ and a seamless widget operational environment. In an embodiment of the integrated security system, widgets are created by web developers and hosted on the integrated security system Content Manager (and stored in the Content Store database). In this embodiment the server component caches the widgets and offers them to consumers through the web-based integrated security system provisioning system. The servers interact with the advanced touchscreen using HTTPS interfaces controlled by the core engine and dynamically download widgets and updates as needed to be cached on the touchscreen. In other embodiments widgets can be accessed directly over a network such as the Internet without needing to go through the iControl Content Manager.


Referring to FIG. 67, the touchscreen system is built on a tiered architecture, with defined interfaces between the Application/Presentation Layer (the Application Engine) on the top, the Core Engine in the middle, and the security panel and gateway APIs at the lower level. The architecture is configured to provide maximum flexibility and ease of maintenance.


The application engine of the touchscreen provides the presentation and interactivity capabilities for all applications (widgets) that run on the touchscreen, including both core security function widgets and third party content widgets. FIG. 68 is an example screenshot 800 of a networked security touchscreen, under an embodiment. This example screenshot 800 includes three interfaces or user interface (UI) components 802-806, but is not so limited. A first UI 802 of the touchscreen includes icons by which a user controls or accesses functions and/or components of the security system (e.g., “Main”, “Panic”, “Medic”, “Fire”, state of the premise alarm system (e.g., disarmed, armed, etc.), etc.); the first UI 802, which is also referred to herein as a security interface, is always presented on the touchscreen. A second UI 804 of the touchscreen includes icons by which a user selects or interacts with services and other network content (e.g., clock, calendar, weather, stocks, news, sports, photos, maps, music, etc.) that is accessible via the touchscreen. The second UI 804 is also referred to herein as a network interface or content interface. A third UI 806 of the touchscreen includes icons by which a user selects or interacts with additional services or components (e.g., intercom control, security, cameras coupled to the system in particular regions (e.g., front door, baby, etc.)) available via the touchscreen.


A component of the application engine is the Presentation Engine, which includes a set of libraries that implement the standards-based widget content (e.g., XML, HTML, JavaScript, Flash) layout and interactivity. This engine provides the widget with interfaces to dynamically load both graphics and application logic from third parties, and supports high-level data description languages as well as standard graphic formats. The set of web content-based functionality available to a widget developer is extended by specific touchscreen functions implemented as local web services by the Core Engine.


The resident application of the touchscreen is the master service that controls the interaction of all widgets in the system, and enforces the business and security rules required by the service provider. For example, the resident application determines the priority of widgets, thereby enabling a home security widget to override resource requests from a less critical widget (e.g. a weather widget). The resident application also monitors widget behavior, and responds to client or server requests for cache updates.


The core engine of the touchscreen manages interaction with other components of the integrated security system, and provides an interface through which the resident application and authorized widgets can get information about the home security system, set alarms, install sensors, etc. At the lower level, the Core Engine's main interactions are through the PanelConnect API, which handles all communication with the security panel, and the gateway Interface, which handles communication with the gateway. In an embodiment, both the iHub Interface and PanelConnect API are resident and operating on the touchscreen. In another embodiment, the PanelConnect API runs on the gateway or other device that provides security system interaction and is accessed by the touchscreen through a web services interface.


The Core Engine also handles application and service level persistent and cached memory functions, as well as the dynamic provisioning of content and widgets, including but not limited to: flash memory management, local widget and content caching, widget version management (download, cache flush new/old content versions), as well as the caching and synchronization of user preferences. As a portion of these services the Core engine incorporates the bootloader functionality that is responsible for maintaining a consistent software image on the touchscreen, and acts as the client agent for all software updates. The bootloader is configured to ensure full update redundancy so that unsuccessful downloads cannot corrupt the integrated security system.


Video management is provided as a set of web services by the Core Engine. Video management includes the retrieval and playback of local video feeds as well as remote control and management of cameras (all through iControl CameraConnect technology).


Both the high level application layer and the mid-level core engine of the touchscreen can make calls to the network. Any call to the network made by the application layer is automatically handed off to a local caching proxy, which determines whether the request should be handled locally. Many of the requests from the application layer are web services API requests; although such requests could be satisfied by the iControl servers, they are handled directly by the touchscreen and the gateway. Requests that get through the caching proxy are checked against a white list of acceptable sites, and, if they match, are sent off through the network interface to the gateway. Included in the Network Subsystem is a set of network services including HTTP, HTTPS, and server-level authentication functions to manage the secure client-server interface. Storage and management of certificates is incorporated as a part of the network services layer.
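

As an illustration of this request path, a caching proxy might first consult its local store and then enforce the white list before forwarding a request toward the gateway. Everything named in this TypeScript sketch (the cache object, the host names) is an assumption, not the embodiment's actual interface.

    // Hypothetical local cache interface.
    declare const localCache: { lookup(url: string): Promise<Response | null> };

    const whiteList = new Set(['portal.example.com', 'content.example.com']); // illustrative

    async function proxyFetch(url: string): Promise<Response> {
      const cached = await localCache.lookup(url);
      if (cached) return cached;                 // request handled locally
      const host = new URL(url).hostname;
      if (!whiteList.has(host)) {
        throw new Error(`blocked by white list: ${host}`);
      }
      return fetch(url);                         // forwarded through the gateway
    }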


Server components of the integrated security system servers support interactive content services on the touchscreen. These server components include, but are not limited to, the content manager, registry manager, network manager, and global registry, each of which is described herein.


The Content Manager oversees aspects of handling widget data and raw content on the touchscreen. Once created and validated by the service provider, widgets are ‘ingested’ to the Content Manager, and then become available as downloadable services through the integrated security system Content Management APIs. The Content manager maintains versions and timestamp information, and connects to the raw data contained in the backend Content Store database. When a widget is updated (or new content becomes available) all clients registering interest in a widget are systematically updated as needed (a process that can be configured at an account, locale, or system-wide level).


The Registry Manager handles user data and account provisioning, including information about the widgets the user has decided to install and the user preferences for these widgets.


The Network Manager handles getting and setting state for all devices on the integrated security system network (e.g., sensors, panels, cameras, etc.). The Network manager synchronizes with the gateway, the advanced touchscreen, and the subscriber database.


The Global Registry is a primary starting point server for all client services, and is a logical referral service that abstracts specific server locations/addresses from clients (touchscreen, gateway 102, desktop widgets, etc.). This approach enables easy scaling/migration of server farms.


The touchscreen of an embodiment operates wirelessly with a premise security system. The touchscreen of an embodiment incorporates an RF transceiver component that either communicates directly with the sensors and/or security panel over the panel's proprietary RF frequency, or the touchscreen communicates wirelessly to the gateway over 802.11, Ethernet, or other IP-based communications channel, as described in detail herein. In the latter case the gateway implements the PanelConnect interface and communicates directly to the security panel and/or sensors over wireless or wired networks as described in detail above.


The touchscreen of an embodiment is configured to operate with multiple security systems through the use of an abstracted security system interface. In this embodiment, the PanelConnect API can be configured to support a plurality of proprietary security system interfaces, either simultaneously or individually as described herein. In one embodiment of this approach, the touchscreen incorporates multiple physical interfaces to security panels (e.g. GE Security RS-485, Honeywell RF, etc.) in addition to the PanelConnect API implemented to support multiple security interfaces. The change needed to support this in PanelConnect is a configuration parameter specifying the panel type connection that is being utilized.


So for example, the setARMState( ) function is called with an additional parameter (e.g., Armstate=setARMState(type=“ARM STAY|ARM AWAY|DISARM”, Parameters=“ExitDelay=30|Lights=OFF”, panelType=“GE Concord4 RS485”)). The ‘panelType’ parameter is used by the setARMState function (and in practice by all of the PanelConnect functions) to select an algorithm appropriate to the specific panel out of a plurality of algorithms.
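

One plausible realization of this dispatch, sketched here in TypeScript, keeps one algorithm per panel type behind a common signature; the handler registry and the second panel string are assumptions of the sketch.

    type ArmHandler = (type: string, parameters: string) => number;

    // Hypothetical per-panel algorithms registered by panelType string.
    const panelHandlers: Record<string, ArmHandler> = {
      'GE Concord4 RS485': (type, parameters) => {
        // ... panel-specific (e.g., Superbus 2000) transaction sequence ...
        return 0;
      },
      'Honeywell RF': (type, parameters) => {
        // ... panel-specific RF transaction sequence ...
        return 0;
      },
    };

    function setARMState(type: string, parameters: string, panelType: string): number {
      const handler = panelHandlers[panelType];
      if (!handler) throw new Error(`unsupported panel type: ${panelType}`);
      return handler(type, parameters); // algorithm selected by panelType
    }

    // e.g. setARMState('ARM STAY', 'ExitDelay=30|Lights=OFF', 'GE Concord4 RS485');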


The touchscreen of an embodiment is self-installable. Consequently, the touchscreen provides a ‘wizard’ approach similar to that used in traditional computer installations (e.g. InstallShield). The wizard can be resident on the touchscreen, accessible through a web interface, or both. In one embodiment of a touchscreen self-installation process, the service provider can associate devices (sensors, touchscreens, security panels, lighting controls, etc.) remotely using a web-based administrator interface.


The touchscreen of an embodiment includes a battery backup system for a security touchscreen. The touchscreen incorporates a standard Li-ion or other battery and charging circuitry to allow continued operation in the event of a power outage. In an embodiment the battery is physically located and connected within the touchscreen enclosure. In another embodiment the battery is located as a part of the power transformer, or in between the power transformer and the touchscreen.


The example configurations of the integrated security system described above with reference to FIGS. 65 and 66 include a gateway that is a separate device, and the touchscreen couples to the gateway. However, in an alternative embodiment, the gateway device and its functionality can be incorporated into the touchscreen so that the device management module, which is now a component of or included in the touchscreen, is in charge of the discovery, installation and configuration of the IP devices coupled or connected to the system, as described above. The integrated security system with the integrated touchscreen/gateway uses the same “sandbox” network to discover and manage all IP devices coupled or connected as components of the system.


The touchscreen of this alternative embodiment integrates the components of the gateway with the components of the touchscreen as described herein. More specifically, the touchscreen of this alternative embodiment includes software or applications described above with reference to FIG. 3. In this alternative embodiment, the touchscreen includes the gateway application layer 302 as the main program that orchestrates the operations performed by the gateway. A Security Engine 304 of the touchscreen provides robust protection against intentional and unintentional intrusion into the integrated security system network from the outside world (both from inside the premises as well as from the WAN). The Security Engine 304 of an embodiment comprises one or more sub-modules or components that perform functions including, but not limited to, the following:

    • Encryption including 128-bit SSL encryption for gateway and iConnect server communication to protect user data privacy and provide secure communication.
    • Bi-directional authentication between the touchscreen and iConnect server in order to prevent unauthorized spoofing and attacks. Data sent from the iConnect server to the gateway application (or vice versa) is digitally signed as an additional layer of security. Digital signing provides both authentication and validation that the data has not been altered in transit.
    • Camera SSL encapsulation because picture and video traffic offered by off-the-shelf networked IP cameras is not secure when traveling over the Internet. The touchscreen provides for 128-bit SSL encapsulation of the user picture and video data sent over the internet for complete user security and privacy.
    • 802.11b/g/n with WPA-2 security to ensure that wireless camera communications always take place using the strongest available protection.
    • A touchscreen-enabled device is assigned a unique activation key for activation with an iConnect server. This ensures that only valid gateway-enabled devices can be activated for use with the specific instance of iConnect server in use. Attempts to activate gateway-enabled devices by brute force are detected by the Security Engine. Partners deploying touchscreen-enabled devices have the knowledge that only a gateway with the correct serial number and activation key can be activated for use with an iConnect server. Stolen devices, devices attempting to masquerade as gateway-enabled devices, and malicious outsiders (or insiders, such as knowledgeable but nefarious customers) cannot affect other customers' gateway-enabled devices.


As standards evolve, and new encryption and authentication methods are proven to be useful, and older mechanisms proven to be breakable, the security manager can be upgraded “over the air” to provide new and better security for communications between the iConnect server and the gateway application, and locally at the premises to remove any risk of eavesdropping on camera communications.


A Remote Firmware Download module 306 of the touchscreen allows for seamless and secure updates to the gateway firmware through the iControl Maintenance Application on the server 104, providing a transparent, hassle-free mechanism for the service provider to deploy new features and bug fixes to the installed user base. The firmware download mechanism is tolerant of connection loss, power interruption and user interventions (both intentional and unintentional). Such robustness reduces down time and customer support issues. Touchscreen firmware can be remotely downloaded for one touchscreen at a time, for a group of touchscreens, or in batches.
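

The download mechanism is characterized above only as tolerant of connection loss and power interruption. The following is a minimal sketch, in Python, of one way to obtain that tolerance using HTTP range requests to resume a partial transfer; the firmware URL handling and the use of the third-party requests library are assumptions of this example, not part of the described system.

    import os

    import requests  # third-party HTTP client

    def resume_firmware_download(url: str, dest: str) -> None:
        """Fetch a firmware image, resuming from a partial file if a
        prior attempt was interrupted."""
        offset = os.path.getsize(dest) if os.path.exists(dest) else 0
        headers = {"Range": f"bytes={offset}-"} if offset else {}
        with requests.get(url, headers=headers, stream=True,
                          timeout=30) as resp:
            resp.raise_for_status()
            # 206 means the server honored the range; otherwise restart.
            mode = "ab" if offset and resp.status_code == 206 else "wb"
            with open(dest, mode) as fh:
                for chunk in resp.iter_content(chunk_size=64 * 1024):
                    fh.write(chunk)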


The Automations engine 308 of the touchscreen manages the user-defined rules of interaction between the different devices (e.g., when a door opens, turn on the light). Though the automation rules are programmed and reside at the portal/server level, they are cached at the gateway level in order to provide short latency between device triggers and actions.
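

For illustration, a minimal sketch in Python of gateway-level rule caching follows. The rule format and trigger naming are not specified by this description and are assumed here for the example.

    from dataclasses import dataclass
    from typing import Callable, Dict, List

    @dataclass
    class Rule:
        trigger: str                      # e.g. "front_door.open"
        action: Callable[[], None]

    class AutomationsEngine:
        """Caches server-authored rules at the gateway so that a device
        trigger produces its action without a server round trip."""
        def __init__(self) -> None:
            self._rules: Dict[str, List[Rule]] = {}

        def cache_rule(self, rule: Rule) -> None:
            self._rules.setdefault(rule.trigger, []).append(rule)

        def on_event(self, trigger: str) -> None:
            for rule in self._rules.get(trigger, []):
                rule.action()

    # When the door opens, turn on the light.
    engine = AutomationsEngine()
    engine.cache_rule(Rule("front_door.open", lambda: print("light on")))
    engine.on_event("front_door.open")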


DeviceConnect 310 of the touchscreen includes definitions of all supported devices (e.g., cameras, security panels, sensors, etc.) using a standardized plug-in architecture. The DeviceConnect module 310 offers an interface that can be used to quickly add support for any new device as well as enabling interoperability between devices that use different technologies/protocols. For common device types, pre-defined sub-modules have been defined, making support for new devices of these types even easier. SensorConnect 312 is provided for adding new sensors, CameraConnect 316 for adding IP cameras, and PanelConnect 314 for adding home security panels.
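

A minimal sketch, in Python, of a standardized plug-in architecture of the kind described above follows. The two interface methods shown are assumptions made for this example; the actual DeviceConnect interface is not specified here.

    from abc import ABC, abstractmethod

    class DeviceConnectPlugin(ABC):
        """Standardized plug-in interface; supporting a new device type
        means supplying one subclass."""
        @abstractmethod
        def discover(self) -> list: ...

        @abstractmethod
        def configure(self, device_id: str, settings: dict) -> None: ...

    class SensorPlugin(DeviceConnectPlugin):
        """Pre-defined sub-module for sensors (cf. SensorConnect)."""
        def discover(self) -> list:
            return []  # a real module would scan its radio/bus here

        def configure(self, device_id: str, settings: dict) -> None:
            pass  # a real module would push settings to the sensor

    # CameraConnect- and PanelConnect-style sub-modules would follow the
    # same shape, differing only in their discovery and configuration.
    registry = {"sensor": SensorPlugin()}
    print(registry["sensor"].discover())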


The Schedules engine 318 of the touchscreen is responsible for executing the user-defined schedules (e.g., take a picture every five minutes; every day at 8 am set temperature to 65 degrees Fahrenheit, etc.). Though the schedules are programmed and reside at the iConnect server level, they are sent to the scheduler within the gateway application of the touchscreen. The Schedules Engine 318 then interfaces with SensorConnect 312 to ensure that scheduled events occur at precisely the desired time. The Device Management module 320 of the touchscreen is in charge of all discovery, installation and configuration of both wired and wireless IP devices (e.g., cameras, etc.) coupled or connected to the system. Networked IP devices, such as those used in the integrated security system, require user configuration of many IP and security parameters, and the device management module of an embodiment handles the details of this configuration. The device management module also manages the video routing module described below.
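

For illustration, a minimal sketch in Python of a gateway-resident scheduler executing a “take a picture every five minutes” schedule follows; the helper names are illustrative, and the actual scheduler implementation is not specified by this description.

    import sched
    import time

    scheduler = sched.scheduler(time.time, time.sleep)

    def take_picture() -> None:
        print("camera: snapshot")

    def schedule_every(interval_s: float, action) -> None:
        """Re-arm the action after every run, e.g. take a picture every
        five minutes (300 seconds)."""
        def wrapper() -> None:
            action()
            scheduler.enter(interval_s, 1, wrapper)
        scheduler.enter(interval_s, 1, wrapper)

    schedule_every(300, take_picture)
    # scheduler.run()  # blocking loop; left commented for illustration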


The video routing engine 322 of the touchscreen is responsible for delivering seamless video streams to the user with zero configuration. Through a multi-step, staged approach, the video routing engine uses a combination of UPnP port-forwarding, relay server routing, and STUN/TURN peer-to-peer routing. The video routing engine is described in detail in the Related Applications.
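

The precise ordering of the routing stages is not specified above. The following minimal sketch, in Python, assumes direct UPnP port-forwarding is attempted first, STUN/TURN peer-to-peer routing second, and relay-server routing last, with each strategy stubbed; all names are illustrative.

    from typing import Optional

    def try_upnp_forwarding(camera: str) -> Optional[str]:
        """Attempt a direct route via UPnP port-forwarding (stubbed)."""
        return None  # pretend the router refused the mapping

    def try_stun_turn(camera: str) -> Optional[str]:
        """Attempt STUN/TURN peer-to-peer routing (stubbed)."""
        return None

    def use_relay_server(camera: str) -> Optional[str]:
        """Last resort: route the stream through a relay server."""
        return f"relay://{camera}"

    def open_video_stream(camera: str) -> str:
        """Multi-step, staged routing: try each strategy in order until
        one yields a usable stream, giving zero-configuration behavior."""
        for strategy in (try_upnp_forwarding, try_stun_turn,
                         use_relay_server):
            stream = strategy(camera)
            if stream is not None:
                return stream
        raise ConnectionError("no viable video route")

    print(open_video_stream("cam-1"))  # -> relay://cam-1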



FIG. 69 is a block diagram 900 of network or premise device integration with a premise network 250, under an embodiment. In an embodiment, network devices 255, 256, 957 are coupled to the touchscreen 902 using a secure network connection such as SSL over an encrypted 802.11 link (utilizing for example WPA-2 security for the wireless encryption), and the touchscreen 902 is coupled to the premise router/firewall 252 via a coupling with a premise LAN 250. The premise router/firewall 252 is coupled to a broadband modem 251, and the broadband modem 251 is coupled to a WAN 200 or other network outside the premise. The touchscreen 902 thus enables or forms a separate wireless network, or sub-network, that includes some number of devices and is coupled or connected to the LAN 250 of the host premises. The touchscreen sub-network can include, but is not limited to, any number of other devices like WiFi IP cameras, security panels (e.g., IP-enabled), and IP devices, to name a few. The touchscreen 902 manages or controls the sub-network separately from the LAN 250 and transfers data and information between components of the sub-network and the LAN 250/WAN 200, but is not so limited. Additionally, other network devices 254 can be coupled to the LAN 250 without being coupled to the touchscreen 902.



FIG. 70 is a block diagram 1000 of network or premise device integration with a premise network 250, under an alternative embodiment. The network or premise devices 255, 256, 1057 are coupled to the touchscreen 1002, and the touchscreen 1002 is coupled or connected between the premise router/firewall 252 and the broadband modem 251. The broadband modem 251 is coupled to a WAN 200 or other network outside the premise, while the premise router/firewall 252 is coupled to a premise LAN 250. As a result of its location between the broadband modem 251 and the premise router/firewall 252, the touchscreen 1002 can be configured or function as the premise router routing specified data between the outside network (e.g., WAN 200) and the premise router/firewall 252 of the LAN 250. As described above, the touchscreen 1002 in this configuration enables or forms a separate wireless network, or sub-network, that includes the network or premise devices 255, 256, 1057 and is coupled or connected between the LAN 250 of the host premises and the WAN 200. The touchscreen sub-network can include, but is not limited to, any number of network or premise devices 255, 256, 1057 like WiFi IP cameras, security panels (e.g., IP-enabled), and security touchscreens, to name a few. The touchscreen 1002 manages or controls the sub-network separately from the LAN 250 and transfers data and information between components of the sub-network and the LAN 250/WAN 200, but is not so limited. Additionally, other network devices 254 can be coupled to the LAN 250 without being coupled to the touchscreen 1002.


The gateway of an embodiment, whether a stand-alone component or integrated with a touchscreen, enables couplings or connections and thus the flow or integration of information between various components of the host premises and various types and/or combinations of IP devices, where the components of the host premises include a network (e.g., LAN) and/or a security system or subsystem to name a few. Consequently, the gateway controls the association between and the flow of information or data between the components of the host premises. For example, the gateway of an embodiment forms a sub-network coupled to another network (e.g., WAN, LAN, etc.), with the sub-network including IP devices. The gateway further enables the association of the IP devices of the sub-network with appropriate systems on the premises (e.g., security system, etc.). Therefore, for example, the gateway can form a sub-network of IP devices configured for security functions, and associate the sub-network only with the premises security system, thereby segregating the IP devices dedicated to security from other IP devices that may be coupled to another network on the premises.



FIG. 71 is a flow diagram for a method 1100 of forming a security network including integrated security system components, under an embodiment. Generally, the method comprises coupling 1102 a gateway comprising a connection management component to a local area network in a first location and a security server in a second location. The method comprises forming 1104 a security network by automatically establishing a wireless coupling between the gateway and a security system using the connection management component. The security system of an embodiment comprises security system components located at the first location. The method comprises integrating 1106 communications and functions of the security system components into the security network via the wireless coupling.



FIG. 72 is a flow diagram for a method 1200 of forming a security network including integrated security system components and network devices, under an embodiment. Generally, the method comprises coupling 1202 a gateway to a local area network located in a first location and a security server in a second location. The method comprises automatically establishing 1204 communications between the gateway and security system components at the first location, the security system including the security system components. The method comprises automatically establishing 1206 communications between the gateway and premise devices at the first location. The method comprises forming 1208 a security network by electronically integrating, via the gateway, communications and functions of the premise devices and the security system components.


FIG. 73 is a flow diagram 1300 for integration or installation of an IP device into a private network environment, under an example embodiment. The IP device includes any IP-capable device that, for example, includes the touchscreen of an embodiment. The variables of an embodiment set at time of installation include, but are not limited to, one or more of a private SSID/Password, a gateway identifier, a security panel identifier, a user account TS, and a Central Monitoring Station account identification.


An embodiment of the IP device discovery and management begins with a user or installer activating 1302 the gateway and initiating 1304 the install mode of the system. This places the gateway in an install mode. Once in install mode, the gateway shifts to a default (Install) Wifi configuration. This setting will match the default setting for other integrated security system-enabled devices that have been pre-configured to work with the integrated security system. The gateway will then begin to provide 1306 DHCP addresses for these IP devices. Once the devices have acquired a new DHCP address from the gateway, those devices are available for configuration into a new secured Wifi network setting.


The user or installer of the system selects 1308 all devices that have been identified as available for inclusion into the integrated security system. The user may select these devices by their unique IDs via a web page, Touchscreen, or other client interface. The gateway provides 1310 data as appropriate to the devices. Once selected, the devices are configured 1312 with appropriate secured Wifi settings, including SSID and WPA/WPA-2 keys that are used once the gateway switches back to the secured sandbox configuration from the “Install” settings. Other settings are also configured as appropriate for that type of device. Once all devices have been configured, the user is notified and the user can exit install mode. At this point all devices will have been registered 1314 with the integrated security system servers.


The installer switches 1316 the gateway to an operational mode, and the gateway instructs or directs 1318 all newly configured devices to switch to the “secured” Wifi sandbox settings. The gateway then switches 1320 to the “secured” Wifi settings. Once the devices identify that the gateway is active on the “secured” network, they request new DHCP addresses from the gateway which, in response, provides 1322 the new addresses. The devices with the new addresses are then operational 1324 on the secured network.
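

For illustration, the two-phase discovery flow above is reduced to a minimal Python sketch, with the Wi-Fi radios and DHCP server represented as in-memory state; all names and address ranges are illustrative assumptions, not part of the described system.

    class Device:
        def __init__(self, device_id: str) -> None:
            self.device_id = device_id
            self.settings: dict = {}
            self.address = ""

    class Gateway:
        def __init__(self, secured_settings: dict) -> None:
            self.secured = secured_settings
            self.wifi: dict = {}
            self.leases: dict = {}

        def enter_install_mode(self) -> None:
            """Shift to the default (Install) Wifi configuration."""
            self.wifi = {"ssid": "install-default", "key": "factory"}

        def lease_address(self, device: Device) -> None:
            """Provide a DHCP address so the device is configurable."""
            device.address = f"10.0.0.{len(self.leases) + 2}"
            self.leases[device.device_id] = device.address

        def configure(self, device: Device) -> None:
            """Push the secured SSID and WPA/WPA-2 keys for later use."""
            device.settings = dict(self.secured)

        def enter_operational_mode(self, devices: list) -> None:
            """Switch to the secured sandbox settings and re-lease."""
            self.wifi = dict(self.secured)
            self.leases.clear()
            for device in devices:
                self.lease_address(device)

    gw = Gateway({"ssid": "sandbox-7f3a", "key": "wpa2-secret"})
    cam = Device("cam-01")
    gw.enter_install_mode()
    gw.lease_address(cam)
    gw.configure(cam)
    gw.enter_operational_mode([cam])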


In order to ensure the highest level of security on the secured network, the gateway can create or generate a dynamic network security configuration based on the unique ID and private key in the gateway, coupled with a randomizing factor that can be based on online time or other inputs. This guarantees the uniqueness of the gateway secured network configuration.
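

The derivation function and randomizing factor are not specified above. The following minimal sketch, in Python, assumes an HMAC-SHA256 over the gateway's unique ID and an uptime-based nonce, keyed with the gateway's private key; the field lengths are illustrative.

    import hashlib
    import hmac
    import time

    def derive_network_config(unique_id: str, private_key: bytes) -> dict:
        """Generate a per-gateway Wifi configuration; the randomizing
        factor here is online time, though other inputs are allowed."""
        nonce = str(time.monotonic()).encode()
        digest = hmac.new(private_key, unique_id.encode() + nonce,
                          hashlib.sha256).hexdigest()
        return {
            "ssid": f"sandbox-{digest[:8]}",
            "wpa2_psk": digest[8:40],  # 32 hex characters as the PSK
        }

    print(derive_network_config("GW-0001", b"gateway-private-key"))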


To enable the highest level of performance, the gateway analyzes the RF spectrum of the 802.11x network and determines which frequency band/channel it should select to run.
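

For illustration, selecting the least-congested channel from a spectrum scan can be as simple as the following Python sketch; gathering the per-channel congestion metric from the 802.11x radio is hardware-specific and not shown.

    def pick_channel(scan_results: dict) -> int:
        """Choose the channel with the lowest observed congestion.
        scan_results maps channel number to a congestion metric."""
        return min(scan_results, key=scan_results.get)

    print(pick_channel({1: 0.70, 6: 0.20, 11: 0.45}))  # -> 6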


An alternative embodiment of the camera/IP device management process leverages the local ethernet connection of the sandbox network on the gateway. This alternative process is similar to the Wifi discovery embodiment described above, except the user connects the targeted device to the ethernet port of the sandbox network to begin the process. This alternative embodiment accommodates devices that have not been pre-configured with the default “Install” configuration for the integrated security system.


This alternative embodiment of the IP device discovery and management begins with the user/installer placing the system into install mode. The user is instructed to attach an IP device to be installed to the sandbox Ethernet port of the gateway. The IP device requests a DHCP address from the gateway which, in response to the request, provides the address. The user is presented with the device and asked whether he/she wants to install the device. If yes, the system configures the device with the secured Wifi settings and other device-specific settings (e.g., camera settings for video length, image quality, etc.). The user is next instructed to disconnect the device from the ethernet port. The device is now available for use on the secured sandbox network.



FIG. 74 is a block diagram showing communications among integrated IP devices of the private network environment, under an embodiment. The IP devices of this example include a security touchscreen 1403, gateway 1402 (e.g., “iHub”), and security panel (e.g., “Security Panel 1”, “Security Panel 2”, “Security Panel n”), but the embodiment is not so limited. In alternative embodiments any number and/or combination of these three primary component types may be combined with other components including IP devices and/or security system components. For example, a single device which comprises an integrated gateway, touchscreen, and security panel is merely another embodiment of the integrated security system described herein. The description that follows includes an example configuration that includes a touchscreen hosting particular applications. However, the embodiment is not limited to the touchscreen hosting these applications, and the touchscreen should be thought of as representing any IP device.


Referring to FIG. 74, the touchscreen 1403 incorporates an application 1410 that is implemented as computer code resident on the touchscreen operating system, or as a web-based application running in a browser, or as another type of scripted application (e.g., Flash, Java, Visual Basic, etc.). The touchscreen core application 1410 represents this application, providing the user interface and logic for the end user to manage their security system or to gain access to networked information or content (Widgets). The touchscreen core application 1410 in turn accesses a library or libraries of functions to control the local hardware (e.g., screen display, sound, LEDs, memory, etc.) as well as specialized libraries to couple or connect to the security system.


In an embodiment of this security system connection, the touchscreen 1403 communicates with the gateway 1402, and has no direct communication with the security panel. In this embodiment, the touchscreen core application 1410 accesses the remote service APIs 1412, which provide security system functionality (e.g., ARM/DISARM panel, sensor state, get/set panel configuration parameters, initiate or get alarm events, etc.). In an embodiment, the remote service APIs 1412 implement one or more of the following functions, but the embodiment is not so limited:

    • armState=setARMState(type=“ARM STAY|ARM AWAY|DISARM”, parameters=“ExitDelay=30|Lights=OFF”)
    • sensorState=getSensors(type=“ALL|SensorName|SensorNameList”)
    • result=setSensorState(SensorName, parameters=“Option1, Option2, . . . Option n”)
    • interruptHandler=SensorEvent( )
    • interruptHandler=alarmEvent( )
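

For illustration, the function list above can be sketched in Python as follows. The method names are adapted from the listed signatures; the transport to the gateway-resident PanelConnect API is stubbed with in-memory state, and all of that plumbing is an assumption of this example.

    from typing import Callable, Dict, List

    class RemoteServiceAPI:
        """Sketch of the remote service APIs 1412; panel transport is
        stubbed."""
        def __init__(self) -> None:
            self._sensors: Dict[str, str] = {"FrontDoor": "CLOSED"}
            self._handlers: List[Callable[[str], None]] = []

        def set_arm_state(self, state: str, parameters: str = "") -> str:
            assert state in ("ARM STAY", "ARM AWAY", "DISARM")
            # A real implementation forwards this call, with its
            # parameters string, to the PanelConnect API on the gateway.
            return state

        def get_sensors(self, which: str = "ALL") -> Dict[str, str]:
            if which == "ALL":
                return dict(self._sensors)
            return {which: self._sensors[which]}

        def on_sensor_event(self, handler: Callable[[str], None]) -> None:
            """Register an interrupt-style handler for sensor or alarm
            events (cf. SensorEvent/alarmEvent)."""
            self._handlers.append(handler)

    api = RemoteServiceAPI()
    print(api.set_arm_state("ARM STAY", parameters="ExitDelay=30|Lights=OFF"))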


Functions of the remote service APIs 1412 of an embodiment use a remote PanelConnect API 1424 which resides in memory on the gateway 1402. The touchscreen 1403 communicates with the gateway 1402 through a suitable network interface such as an Ethernet or 802.11 RF connection, for example. The remote PanelConnect API 1424 provides the underlying Security System Interfaces 1426 used to communicate with and control one or more types of security panel via wired link 1430 and/or RF link 3. The PanelConnect API 1424 provides responses and input to the remote service APIs 1412, and in turn translates function calls and data to and from the specific protocols and functions supported by a specific implementation of a Security Panel (e.g. a GE Security Simon XT or Honeywell Vista 20P). In an embodiment, the PanelConnect API 1424 uses a 345 MHz RF transceiver or receiver hardware/firmware module to communicate wirelessly to the security panel and directly to a set of 345 MHz RF-enabled sensors and devices, but the embodiment is not so limited.


The gateway of an alternative embodiment communicates over a wired physical coupling or connection to the security panel using the panel's specific wired hardware (bus) interface and the panel's bus-level protocol.


In an alternative embodiment, the Touchscreen 1403 implements the same PanelConnect API 1414 locally on the Touchscreen 1403, communicating directly with the Security Panel 2 and/or Sensors 2 over the proprietary RF link or over a wired link for that system. In this embodiment the Touchscreen 1403, instead of the gateway 1402, incorporates the 345 MHz RF transceiver to communicate directly with Security Panel 2 or Sensors 2 over the RF link 2. In the case of a wired link the Touchscreen 1403 incorporates the real-time hardware (e.g. a PIC chip and RS232-variant serial link) to physically connect to and satisfy the specific bus-level timing requirements of the SecurityPanel2.


In yet another alternative embodiment, either the gateway 1402 or the Touchscreen 1403 implements the remote service APIs. This embodiment includes a Cricket device (“Cricket”) which comprises but is not limited to the following components: a processor (suitable for handling 802.11 protocols and processing, as well as the bus timing requirements of SecurityPanel1); an 802.11 (WiFi) client IP interface chip; and, a serial bus interface chip that implements variants of RS232 or RS485, depending on the specific Security Panel.


The Cricket also implements the full PanelConnect APIs such that it can perform the same functions as the case where the gateway implements the PanelConnect APIs. In this embodiment, the touchscreen core application 1410 calls functions in the remote service APIs 1412 (such as setArmState( )). These functions in turn couple or connect to the remote Cricket through a standard IP connection (“Cricket IP Link”) (e.g., Ethernet, Homeplug, the gateway's proprietary Wifi network, etc.). The Cricket in turn implements the PanelConnect API, which responds to the request from the touchscreen core application, and performs the appropriate function using the proprietary panel interface. This interface uses either the wireless or wired proprietary protocol for the specific security panel and/or sensors.



FIG. 75 is a flow diagram of a method of integrating an external control and management application system with an existing security system, under an embodiment. Operations begin when the system is powered on 1510, involving at a minimum the power-on of the gateway device, and optionally the power-on of the connection between the gateway device and the remote servers. The gateway device initiates 1520 a software and RF sequence to locate the extant security system. The gateway and installer initiate and complete 1530 a sequence to ‘learn’ the gateway into the security system as a valid and authorized control device. The gateway initiates 1540 another software and RF sequence of instructions to discover and learn the existence and capabilities of existing RF devices within the extant security system, and store this information in the system. These operations under the system of an embodiment are described in further detail below.


Unlike conventional systems that extend an existing security system, the system of an embodiment operates utilizing the proprietary wireless protocols of the security system manufacturer. In one illustrative embodiment, the gateway is an embedded computer with an IP LAN and WAN connection and a plurality of RF transceivers and software protocol modules capable of communicating with a plurality of security systems each with a potentially different RF and software protocol interface. After the gateway has completed the discovery and learning 1540 of sensors and has been integrated 1550 as a virtual control device in the extant security system, the system becomes operational. Thus, the security system and associated sensors are presented 1550 as accessible devices to a potential plurality of user interface subsystems.


The system of an embodiment integrates 1560 the functionality of the extant security system with other non-security devices including but not limited to IP cameras, touchscreens, lighting controls, and door locking mechanisms, which may be controlled via RF, wired, or powerline-based networking mechanisms supported by the gateway or servers.


The system of an embodiment provides a user interface subsystem 1570 enabling a user to monitor, manage, and control the system and associated sensors and security systems. In an embodiment of the system, a user interface subsystem is an HTML/XML/Javascript/Java/AJAX/Flash presentation of a monitoring and control application, enabling users to view the state of all sensors and controllers in the extant security system from a web browser or equivalent operating on a computer, PDA, mobile phone, or other consumer device.


In another illustrative embodiment of the system described herein, a user interface subsystem is an HTML/XML/Javascript/Java/AJAX presentation of a monitoring and control application, enabling users to combine the monitoring and control of the extant security system and sensors with the monitoring and control of non-security devices including but not limited to IP cameras, touchscreens, lighting controls, and door locking mechanisms.


In another illustrative embodiment of the system described herein, a user interface subsystem is a mobile phone application enabling users to monitor and control the extant security system as well as other non-security devices.


In another illustrative embodiment of the system described herein, a user interface subsystem is an application running on a keypad or touchscreen device enabling users to monitor and control the extant security system as well as other non-security devices.


In another illustrative embodiment of the system described herein, a user interface subsystem is an application operating on a TV or set-top box connected to a TV enabling users to monitor and control the extant security system as well as other non-security devices.



FIG. 76 is a block diagram of an integrated security system 1600 wirelessly interfacing to proprietary security systems, under an embodiment. A security system 1610 is coupled or connected to a Gateway 1620, and from Gateway 1620 coupled or connected to a plurality of information and content sources across a network 1630 including one or more web servers 1640, system databases 1650, and applications servers 1660. While in one embodiment network 1630 is the Internet, including the World Wide Web, those of skill in the art will appreciate that network 1630 may be any type of network, such as an intranet, an extranet, a virtual private network (VPN), a mobile network, or a non-TCP/IP based network.


Moreover, other elements of the system of an embodiment may be conventional, well-known elements that need not be explained in detail herein. For example, security system 1610 could be any type of home or business security system, including but not limited to a standalone RF home security system or a non-RF-capable wired home security system with an add-on RF interface module. In the integrated security system 1600 of this example, security system 1610 includes an RF-capable wireless security panel (WSP) 1611 that acts as the master controller for security system 1610. Well-known examples of such a WSP include the GE Security Concord, Networx, and Simon panels, the Honeywell Vista and Lynx panels, and similar panels from DSC and Napco, to name a few. A wireless module 1614 includes the RF hardware and protocol software necessary to enable communication with and control of a plurality of wireless devices 1613. WSP 1611 may also manage wired devices 1614 physically connected to WSP 1611 with an RS232 or RS485 or Ethernet connection or a similar wired interface.


In an implementation consistent with the systems and methods described herein, Gateway 1620 provides the interface between security system 1610 and LAN and/or WAN for purposes of remote control, monitoring, and management. Gateway 1620 communicates with an external web server 1640, database 1650, and application server 1660 over network 1630 (which may comprise WAN, LAN, or a combination thereof). In this example system, application logic, remote user interface functionality, as well as user state and account are managed by the combination of these remote servers. Gateway 1620 includes server connection manager 1621, a software interface module responsible for all server communication over network 1630. Event manager 1622 implements the main event loop for Gateway 1620, processing events received from device manager 1624 (communicating with non-security system devices including but not limited to IP cameras, wireless thermostats, or remote door locks). Event manager 1622 further processes events and control messages from and to security system 1610 by utilizing WSP manager 1623.


WSP manager 1623 and device manager 1624 both rely upon wireless protocol manager 1626 which receives and stores the proprietary or standards-based protocols required to support security system 1610 as well as any other devices interfacing with gateway 1620. WSP manager 1623 further utilizes the comprehensive protocols and interface algorithms for a plurality of security systems 1610 stored in the WSP DB client database associated with wireless protocol manager 1626. These various components implement the software logic and protocols necessary to communicate with and manage devices and security systems 1610. Wireless Transceiver hardware modules 1625 are then used to implement the physical RF communications link to such devices and security systems 1610. An illustrative wireless transceiver 1625 is the GE Security Dialog circuit board, implementing a 319.5 MHz two-way RF transceiver module. In this example, RF Link 1670 represents the 319.5 MHz RF communication link, enabling gateway 1620 to monitor and control WSP 1611 and associated wireless and wired devices 1613 and 1614, respectively.


In one embodiment, server connection manager 1621 requests and receives a set of wireless protocols for a specific security system 1610 (an illustrative example being that of the GE Security Concord panel and sensors) and stores them in the WSP DB portion of the wireless protocol manager 1626. WSP manager 1623 then utilizes such protocols from wireless protocol manager 1626 to initiate the sequence of processes detailed in FIG. 75 and FIG. 77 for learning gateway 1620 into security system 1610 as an authorized control device. Once learned in, as described with reference to FIG. 77 (and above), event manager 1622 processes all events and messages detected by the combination of WSP manager 1623 and the GE Security wireless transceiver module 1625.


In another embodiment, gateway 1620 incorporates a plurality of wireless transceivers 1625 and associated protocols managed by wireless protocol manager 1626. In this embodiment, events and control of multiple heterogeneous devices may be coordinated with WSP 1611, wireless devices 1613, and wired devices 1614. For example, a wireless sensor from one manufacturer may be utilized to control a device using a different protocol from a different manufacturer.


In another embodiment, gateway 1620 incorporates a wired interface to security system 1610, and incorporates a plurality of wireless transceivers 1625 and associated protocols managed by wireless protocol manager 1626. In this embodiment, events and control of multiple heterogeneous devices may be coordinated with WSP 1611, wireless devices 1613, and wired devices 1614.


Of course, while an illustrative embodiment of an architecture of the system of an embodiment is described in detail herein with respect to FIG. 76, one of skill in the art will understand that modifications to this architecture may be made without departing from the scope of the description presented herein. For example, the functionality described herein may be allocated differently between client and server, or amongst different server or processor-based components. Likewise, the entire functionality of the gateway 1620 described herein could be integrated completely within an existing security system 1610. In such an embodiment, the architecture could be directly integrated with a security system 1610 in a manner consistent with the currently described embodiments.



FIG. 77 is a flow diagram for wirelessly ‘learning’ the Gateway into an existing security system and discovering extant sensors, under an embodiment. This learning process interfaces gateway 1620 with security system 1610. Gateway 1620 powers up 1710 and initiates software sequences 1720 and 1725 to identify accessible WSPs 1611 and wireless devices 1613, respectively (e.g., one or more WSPs and/or devices within range of gateway 1620). Once identified, WSP 1611 is manually or automatically set into ‘learn mode’ 1730, and gateway 1620 utilizes available protocols to add 1740 itself as an authorized control device in security system 1610. Upon successful completion of this task, WSP 1611 is manually or automatically removed from ‘learn mode’ 1750.


Gateway 1620 utilizes the appropriate protocols to mimic 1760 the first identified device 1613. In this operation, gateway 1620 identifies itself using the unique or pseudo-unique identifier of the first found device 1613, and sends an appropriate change of state message over RF Link 1670. In the event that WSP 1611 responds to this change of state message, the device 1613 is then added 1770 to the system in database 1650. Gateway 1620 associates 1780 any other information (such as zone name or token-based identifier) with this device 1613 in database 1650, enabling gateway 1620, user interface modules, or any application to retrieve this associated information.


In the event that WSP 1611 does not respond to the change of state message, the device 1613 is not added 1770 to the system in database 1650; this device 1613 is flagged as not being a part of security system 1610, and is either ignored or added as an independent device, at the discretion of the system provisioning rules. Operations hereunder repeat 1785 operations 1760, 1770, 1780 for all devices 1613 if applicable. Once all devices 1613 have been tested in this way, the system begins operation 1790.
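

For illustration, the mimic-and-test loop of FIG. 77 reduces to the following Python sketch. The RF transmission and the panel's response are stubbed behind a callable, and the database is a dictionary; all names are illustrative assumptions.

    def enroll_discovered_devices(devices, panel_responds, database):
        """Impersonate each discovered device, send a change of state
        message, and record whether the WSP responded."""
        for device in devices:
            if panel_responds(device["id"]):
                database[device["id"]] = {"in_security_system": True,
                                          "zone": device.get("zone")}
            else:
                # Not part of the security system: flag it, then ignore
                # it or add it as an independent device per the system
                # provisioning rules.
                database[device["id"]] = {"in_security_system": False}

    db: dict = {}
    found = [{"id": "sensor-01", "zone": "front door"},
             {"id": "sensor-02"}]
    enroll_discovered_devices(found,
                              lambda dev_id: dev_id == "sensor-01", db)
    print(db)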


In another embodiment, gateway 1620 utilizes a wired connection to WSP 1611, but also incorporates a wireless transceiver 1625 to communicate directly with devices 1613. In this embodiment, operations under 1720 above are removed, and operations under 1740 above are modified so the system of this embodiment utilizes wireline protocols to add itself as an authorized control device in security system 1610.


A description of an example embodiment follows in which the Gateway (FIG. 76, element 1620) is the iHub available from iControl Networks, Palo Alto, Calif., and described in detail herein. In this example the gateway is “automatically” installed with a security system.


The automatic security system installation begins with the assignment of an authorization key to components of the security system (e.g., gateway, kit including the gateway, etc.). The assignment of an authorization key is done in lieu of creating a user account. An installer later places the gateway in a user's premises along with the premises security system. The installer uses a computer to navigate to a web portal (e.g., integrated security system web interface), logs in to the portal, and enters the authorization key of the installed gateway into the web portal for authentication. Once authenticated, the gateway automatically discovers devices at the premises (e.g., sensors, cameras, light controls, etc.) and adds the discovered devices to the system or “network”. The installer assigns names to the devices, and tests operation of the devices back to the server (e.g., did the door open, did the camera take a picture, etc.). The security device information is optionally pushed or otherwise propagated to a security panel and/or to the server network database. The installer finishes the installation, and instructs the end user on how to create an account, username, and password. At this time the user enters the authorization key which validates the account creation (uses a valid authorization key to associate the network with the user's account). New devices may subsequently be added to the security network in a variety of ways (e.g., user first enters a unique ID for each device/sensor and names it in the server, after which the gateway can automatically discover and configure the device).


A description of another example embodiment follows in which the security system (FIG. 76, element 1610) is a Dialog system and the WSP (FIG. 76, element 1611) is a SimonXT available from General Electric Security, and the Gateway (FIG. 76, element 1620) is the iHub available from iControl Networks, Palo Alto, Calif., and described in detail herein. Descriptions of the install process for the SimonXT and iHub are also provided below.


GE Security's Dialog network is one of the most widely deployed and tested wireless security systems in the world. The physical RF network is based on a 319.5 MHz unlicensed spectrum, with a bandwidth supporting up to 19 Kbps communications. Typical use of this bandwidth—even in conjunction with the integrated security system—is far less than that. Devices on this network can support either one-way communication (either a transmitter or a receiver) or two-way communication (a transceiver). Certain GE Simon, Simon XT, and Concord security control panels incorporate a two-way transceiver as a standard component. The gateway also incorporates the same two-way transceiver card. The physical link layer of the network is managed by the transceiver module hardware and firmware, while the coded payload bitstreams are made available to the application layer for processing.


Sensors in the Dialog network typically use a 60-bit protocol for communicating with the security panel transceiver, while security system keypads and the gateway use the encrypted 80-bit protocol. The Dialog network is configured for reliability, as well as low-power usage. Many devices are supervised, i.e. they are regularly monitored by the system ‘master’ (typically a GE security panel), while still maintaining excellent power usage characteristics. A typical door window sensor has a battery life in excess of 5-7 years.


The gateway has two modes of operation in the Dialog network: a first mode of operation is when the gateway is configured or operates as a ‘slave’ to the GE security panel; a second mode of operation is when the gateway is configured or operates as a ‘master’ to the system in the event a security panel is not present. In both configurations, the gateway has the ability to ‘listen’ to network traffic, enabling the gateway to continually keep track of the status of all devices in the system. Similarly, in both situations the gateway can address and control devices that support setting adjustments (such as the GE wireless thermostat).


In the configuration in which the gateway acts as a ‘slave’ to the security panel, the gateway is ‘learned into’ the system as a GE wireless keypad. In this mode of operation, the gateway emulates a security system keypad when managing the security panel, and can query the security panel for status and ‘listen’ to security panel events (such as alarm events).


The gateway incorporates an RF Transceiver manufactured by GE Security, but is not so limited. This transceiver implements the Dialog protocols and handles all network message transmissions, receptions, and timing. As such, the physical, link, and protocol layers of the communications between the gateway and any GE device in the Dialog network are totally compliant with GE Security specifications.


At the application level, the gateway emulates the behavior of a GE wireless keypad utilizing the GE Security 80-bit encrypted protocol, and only supported protocols and network traffic are generated by the gateway. Extensions to the Dialog RF protocol of an embodiment enable full control and configuration of the panel, and iControl can automate installation and sensor enrollment as well as direct configuration downloads for the panel under these protocol extensions.


As described above, the gateway participates in the GE Security network at the customer premises. Because the gateway has intelligence and a two-way transceiver, it can ‘hear’ all of the traffic on that network. The gateway makes use of the periodic sensor updates, state changes, and supervisory signals of the network to maintain a current state of the premises. This data is relayed to the integrated security system server (e.g., FIG. 2, element 260) and stored in the event repository for use by other server components. This usage of the GE Security RF network is completely non-invasive; there is no new data traffic created to support this activity.


The gateway can directly (or indirectly through the Simon XT panel) control two-way devices on the network. For example, the gateway can direct a GE Security Thermostat to change its setting to ‘Cool’ from ‘Off’, as well as request an update on the current temperature of the room. The gateway performs these functions using the existing GE Dialog protocols, with little to no impact on the network; a gateway device control or data request takes only a few dozen bytes of data in a network that can support 19 Kbps.


By enrolling with the Simon XT as a wireless keypad, as described herein, the gateway receives data or information for all alarm events, as well as state changes relevant to the security panel. This information is transferred to the gateway as encrypted packets in the same way that the information is transferred to all other wireless keypads on the network.


Because of its status as an authorized keypad, the gateway can also initiate the same panel commands that a keypad can initiate. For example, the gateway can arm or disarm the panel using the standard Dialog protocol for this activity. Other than the monitoring of standard alarm events like other network keypads, the only incremental data traffic on the network as a result of the gateway is the infrequent remote arm/disarm events that the gateway initiates, or infrequent queries on the state of the panel.


The gateway is enrolled into the Simon XT panel as a ‘slave’ device which, in an embodiment, is a wireless keypad. This enables the gateway for all necessary functionality for operating the Simon XT system remotely, as well as combining the actions and information of non-security devices such as lighting or door locks with GE Security devices. The only resource taken up by the gateway in this scenario is one wireless zone (sensor ID).


The gateway of an embodiment supports three forms of sensor and panel enrollment/installation into the integrated security system, but is not limited to this number of enrollment/installation options. The enrollment/installation options of an embodiment include installer installation, kitting, and panel, each of which is described below.


Under the installer option, the installer enters the sensor IDs at time of installation into the integrated security system web portal or iScreen. This technique is supported in all configurations and installations.


Kits can be pre-provisioned using integrated security system provisioning applications when using the kitting option. At kitting time, multiple sensors are automatically associated with an account, and at install time there is no additional work required.


In the case where a panel is installed with sensors already enrolled (i.e. using the GE Simon XT enrollment process), the gateway has the capability to automatically extract the sensor information from the system and incorporate it into the user account on the integrated security system server.


The gateway and integrated security system of an embodiment use an auto-learn process for sensor and panel enrollment. The deployment approach of an embodiment can use additional interfaces that GE Security is adding to the Simon XT panel. With these interfaces, the gateway has the capability to remotely enroll sensors in the panel automatically. The interfaces include, but are not limited to, the following: EnrollDevice(ID, type, name, zone, group); SetDeviceParameters(ID, type, name, zone, group); GetDeviceParameters(zone); and RemoveDevice(zone).
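

For illustration, the four interfaces named above can be sketched in Python as follows, with the GE RF transport to the Simon XT panel stubbed as a dictionary keyed by zone; everything but the interface names is an assumption of this example.

    panel: dict = {}  # stand-in for the Simon XT's zone table

    def enroll_device(ID: str, dev_type: str, name: str,
                      zone: int, group: int) -> None:
        panel[zone] = {"id": ID, "type": dev_type,
                       "name": name, "group": group}

    def set_device_parameters(ID: str, dev_type: str, name: str,
                              zone: int, group: int) -> None:
        enroll_device(ID, dev_type, name, zone, group)  # updated record

    def get_device_parameters(zone: int) -> dict:
        return panel[zone]

    def remove_device(zone: int) -> None:
        panel.pop(zone, None)

    enroll_device("ABC123", "door/window", "Front Door",
                  zone=1, group=10)
    print(get_device_parameters(1)["name"])  # -> Front Door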


The integrated security system incorporates these new interfaces into the system, providing the following install process. The install process can include integrated security system logistics to handle kitting and pre-provisioning. Pre-kitting and logistics can include a pre-provisioning kitting tool provided by integrated security system that enables a security system vendor or provider (“provider”) to offer pre-packaged initial ‘kits’. This is not required but is recommended for simplifying the install process. This example assumes a ‘Basic’ kit is preassembled and includes one (1) Simon XT, three (3) Door/window sensors, one (1) motion sensor, one (1) gateway, one (1) keyfob, two (2) cameras, and ethernet cables. The kit also includes a sticker page with all Zones (1-24) and Names (full name list).


The provider uses the integrated security system kitting tool to assemble ‘Basic’ kit packages. The contents of different types of starter kits may be defined by the provider. At the distribution warehouse, a worker uses a bar code scanner to scan each sensor and the gateway as it is packed into the box. An ID label is created that is attached to the box. The scanning process automatically associates all the devices with one kit, and the new ID label is the unique identifier of the kit. These boxes are then sent to the provider for distribution to installer warehouses. Individual sensors, cameras, etc. are also sent to the provider installer warehouse. Each is labeled with its own barcode/ID.
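

For illustration, the scan-and-associate step can be sketched in Python as follows; the kit label format and the in-memory store are assumptions of this example.

    import uuid

    def assemble_kit(scanned_device_ids, kits: dict) -> str:
        """Bind every scanned device ID to one kit; the returned kit ID
        is the unique identifier printed on the box's ID label."""
        kit_id = str(uuid.uuid4())[:8]
        kits[kit_id] = list(scanned_device_ids)
        return kit_id

    kits: dict = {}
    kit = assemble_kit(["simonxt-01", "dw-01", "dw-02", "dw-03",
                        "motion-01", "ihub-01", "fob-01",
                        "cam-01", "cam-02"], kits)
    print(kit, kits[kit])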


An installation and enrollment procedure of a security system including a gateway is described below as one example of the installation process.

  • 1. Order and Physical Install Process
    • a. Once an order is generated in the iControl system, an account is created and an install ticket is created and sent electronically to the provider for assignment to an installer.
    • b. The assigned installer picks up his/her ticket(s) and fills his/her truck with Basic and/or Advanced starter kits. He/she also keeps a stock of individual sensors, cameras, iHubs, Simon XTs, etc. Optionally, the installer can also stock homeplug adapters for problematic installations.
    • c. The installer arrives at the address on the ticket, and pulls out the Basic kit. The installer determines sensor locations from a tour of the premises and discussion with the homeowner. At this point assume the homeowner requests additional equipment including an extra camera, two (2) additional door/window sensors, one (1) glass break detector, and one (1) smoke detector.
    • d. Installer mounts SimonXT in the kitchen or other location in the home as directed by the homeowner, and routes the phone line to Simon XT if available. GPRS and phone numbers are pre-programmed in SimonXT to point to the provider Central Monitoring Station (CMS).
    • e. Installer places gateway in the home in the vicinity of a router and cable modem. Installer installs an ethernet line from gateway to router and plugs gateway into an electrical outlet.
  • 2. Associate and Enroll gateway into SimonXT
    • a. Installer uses either his/her own laptop plugged into the router, or the homeowner's computer, to go to the integrated security system web interface and log in with an installer ID/pass.
    • b. Installer enters ticket number into admin interface, and clicks ‘New Install’ button. Screen prompts installer for kit ID (on box's barcode label).
    • c. Installer clicks ‘Add SimonXT’. Instructions prompt installer to put Simon XT into install mode, and add gateway as a wireless keypad. It is noted that this step is for security only and can be automated in an embodiment.
    • d. Installer enters the installer code into the Simon XT and learns the gateway into the panel as a wireless keypad, as a group 1 device.
    • e. Installer goes back to Web portal, and clicks the ‘Finished Adding SimonXT’ button.
  • 3. Enroll Sensors into SimonXT via iControl
    • a. All devices in the Basic kit are already associated with the user's account.
    • b. For additional devices, Installer clicks ‘Add Device’ and adds the additional camera to the user's account (by typing in the camera ID/Serial #).
    • c. Installer clicks ‘Add Device’ and adds other sensors (two (2) door/window sensors, one (1) glass break sensor, and one (1) smoke sensor) to the account (e.g., by typing in IDs).
    • d. As part of Add Device, Installer assigns zone, name, and group to the sensor. Installer puts appropriate Zone and Name sticker on the sensor temporarily.
    • e. All sensor information for the account is pushed or otherwise propagated to the iConnect server, and is available to propagate to CMS automation software through the CMS application programming interface (API).
    • f. Web interface displays ‘Installing Sensors in System . . . ’ and automatically adds all of the sensors to the Simon XT panel through the GE RF link.
    • g. Web interface displays ‘Done Installing’->all sensors show green.
  • 4. Place and Test Sensors in Home
    • a. Installer physically mounts each sensor in its desired location, and removes the stickers.
    • b. Installer physically mounts WiFi cameras in their location and plugs into AC power. Optional fishing of low voltage wire through wall to remove dangling wires. Camera transformer is still plugged into outlet but wire is now inside the wall.
    • c. Installer goes to Web interface and is prompted for automatic camera install. Each camera is provisioned as a private, encrypted Wifi device on the gateway secured sandbox network, and firewall NAT traversal is initiated. Upon completion the customer is prompted to test the security system.
    • d. Installer selects the ‘Test System’ button on the web portal—the SimonXT is put into Test mode by the gateway over GE RF.
    • e. Installer manually tests the operation of each sensor, receiving an audible confirmation from SimonXT.
    • f. gateway sends test data directly to CMS over broadband link, as well as storing the test data in the user's account for subsequent report generation.
    • g. Installer exits test mode from the Web portal.
  • 5. Installer instructs customer on use of the Simon XT, and shows customer how to log into the iControl web and mobile portals. Customer creates a username/password at this time.
  • 6. Installer instructs customer how to change Simon XT user code from the Web interface. Customer changes user code which is pushed to SimonXT automatically over GE RF.


An installation and enrollment procedure of a security system including a gateway is described below as an alternative example of the installation process. This installation process is used for enrolling sensors into the SimonXT and integrated security system, and is compatible with all existing GE Simon panels.


The integrated security system supports all pre-kitting functionality described in the installation process above. However, for the purpose of the following example, no kitting is used.

    • 1. Order and Physical Install Process
      • a. Once an order is generated in the iControl system, an account is created and an install ticket is created and sent electronically to the security system provider for assignment to an installer.
      • b. The assigned installer picks up his/her ticket(s) and fills his/her truck with individual sensors, cameras, iHubs, Simon XTs, etc. Optionally, the installer can also stock homeplug adapters for problematic installations.
      • c. The installer arrives at the address on the ticket, and analyzes the house and talks with the homeowner to determine sensor locations. At this point assume the homeowner requests three (3) cameras, five (5) door/window sensors, one (1) glass break detector, one (1) smoke detector, and one (1) keyfob.
      • d. Installer mounts SimonXT in the kitchen or other location in the home. The installer routes a phone line to Simon XT if available. GPRS and Phone numbers are pre-programmed in SimonXT to point to the provider CMS.
      • e. Installer places gateway in home in the vicinity of a router and cable modem, and installs an ethernet line from gateway to the router, and plugs gateway into an electrical outlet.
    • 2. Associate and Enroll gateway into SimonXT
      • a. Installer uses either his/her own laptop plugged into the router, or the homeowner's computer, to go to the integrated security system web interface and log in with an installer ID/pass.
      • b. Installer enters ticket number into admin interface, and clicks ‘New Install’ button. Screen prompts installer to add devices.
      • c. Installer types in ID of gateway, and it is associated with the user's account.
      • d. Installer clicks ‘Add Device’ and adds the cameras to the user's account (by typing in the camera ID/Serial #).
      • e. Installer clicks ‘Add SimonXT’. Instructions prompt installer to put Simon XT into install mode, and add gateway as a wireless keypad.
      • f. Installer goes to the Simon XT and enters the installer code, learning the gateway into the panel as a wireless keypad (group 1 type sensor).
      • g. Installer returns to Web portal, and clicks the ‘Finished Adding SimonXT’ button.
      • h. Gateway now is alerted to all subsequent installs over the security system RF.
    • 3. Enroll Sensors into SimonXT via iControl
      • a. Installer clicks ‘Add Simon XT Sensors’—Displays instructions for adding sensors to Simon XT.
      • b. Installer goes to Simon XT and uses Simon XT install process to add each sensor, assigning zone, name, group. These assignments are recorded for later use.
      • c. The gateway automatically detects each sensor addition and adds the new sensor to the integrated security system.
      • d. Installer exits install mode on the Simon XT, and returns to the Web portal.
      • e. Installer clicks ‘Done Adding Devices’.
      • f. Installer enters zone/sensor naming from recorded notes into integrated security system to associate sensors to friendly names.
      • g. All sensor information for the account is pushed to the iConnect server, and is available to propagate to CMS automation software through the CMS API.
      • 4. Place and Test Sensors in Home
      • a. Installer physically mounts each sensor in its desired location.
      • b. Installer physically mounts Wifi cameras in their location and plugs into AC power. Optional fishing of low voltage wire through wall to remove dangling wires. Camera transformer is still plugged into outlet but wire is now inside the wall.
      • c. Installer puts SimonXT into Test mode from the keypad.
      • d. Installer manually tests the operation of each sensor, receiving an audible confirmation from SimonXT.
      • e. Installer exits test mode from the Simon XT keypad.
      • f. Installer returns to web interface and is prompted to automatically set up cameras. After waiting for completion cameras are now provisioned and operational.
    • 5. Installer instructs customer on use of the Simon XT, and shows customer how to log into the integrated security system web and mobile portals. Customer creates a username/password at this time.
    • 6. Customer and Installer observe that all sensors/cameras are green.
    • 7. Installer instructs customer how to change Simon XT user code from the keypad. Customer changes user code and stores in SimonXT.
    • 8. The first time the customer uses the web portal to Arm/Disarm system the web interface prompts the customer for the user code, which is then stored securely on the server. In the event the user code is changed on the panel the web interface once again prompts the customer.


The panel of an embodiment can be programmed remotely. The CMS pushes new programming to SimonXT over a telephone or GPRS link. Optionally, iControl and GE provide a broadband link or coupling to the gateway and then a link from the gateway to the Simon XT over GE RF.


In addition to the configurations described above, the gateway of an embodiment supports takeover configurations in which it is introduced or added into a legacy security system. A description of example takeover configurations follows in which the security system (FIG. 2, element 210) is a Dialog system and the WSP (FIG. 2, element 211) is a GE Concord panel (e.g., equipped with POTS, GE RF, and a Superbus 2000 RS485 interface; in the case of a Lynx takeover the Simon XT is used) available from General Electric Security. The gateway (FIG. 2, element 220) in the takeover configurations is an iHub (e.g., equipped with a built-in 802.11b/g router, Ethernet hub, GSM/GPRS card, RS485 interface, and iControl Honeywell-compatible RF card) available from iControl Networks, Palo Alto, Calif. While components of particular manufacturers are used in this example, the embodiments are not limited to these components or to components from these vendors.


The security system can optionally include RF wireless sensors (e.g., GE wireless sensors utilizing the GE Dialog RF technology), IP cameras, a GE-iControl Touchscreen (the touchscreen is assumed to be an optional component in the configurations described herein, and is thus treated separately from the iHub; in systems in which the touchscreen is a component of the base security package, the integrated iScreen (available from iControl Networks, Palo Alto, Calif.) can be used to combine iHub technology with the touchscreen in a single unit), and Z-Wave devices to name a few.


The takeover configurations described below assume takeover by a “new” system of an embodiment of a security system provided by another third party vendor, referred to herein as an “original” or “legacy” system. Generally, the takeover begins with removal of the control panel and keypad of the legacy system. A GE Concord panel is installed to replace the control panel of the legacy system, along with an iHub with a GPRS modem. The legacy system sensors are then connected or wired to the Concord panel, and a GE keypad or touchscreen is installed to replace the keypad of the legacy system. The iHub includes the iControl RF card, which is compatible with the legacy system. The iHub finds and manages the wireless sensors of the legacy system, and learns the sensors into the Concord by emulating the corresponding GE sensors. The iHub effectively acts as a relay for legacy wireless sensors.


Once takeover is complete, the new security system provides a homogeneous system that removes the compromises inherent in taking over or replacing a legacy system. For example, the new system provides a modern touchscreen that may include additional functionality, new services, and supports integration of sensors from various manufacturers. Furthermore, lower support costs can be realized because call centers, installers, etc. are only required to support one architecture. Additionally, there is minimal install cost because only the panel is required to be replaced as a result of the configuration flexibility offered by the iHub.


The system takeover configurations described below include but are not limited to a dedicated wireless configuration, a dedicated wireless configuration that includes a touchscreen, and a fished Ethernet configuration. Each of these configurations is described in detail below.



FIG. 78 is a block diagram of a security system in which the legacy panel is replaced with a GE Concord panel wirelessly coupled to an iHub, under an embodiment. All existing wired and RF sensors remain in place. The iHub is located near the Concord panel, and communicates with the panel via the 802.11 link, but is not so limited. The iHub manages cameras through a built-in 802.11 router. The iHub listens to the existing RF HW sensors, and relays sensor information to the Concord panel (emulating the equivalent GE sensor). The wired sensors of the legacy system are connected to the wired zones on the control panel.



FIG. 79 is a block diagram of a security system in which the legacy panel is replaced with a GE Concord panel wirelessly coupled to an iHub, and a GE-iControl Touchscreen, under an embodiment. All existing wired and RF sensors remain in place. The iHub is located near the Concord panel, and communicates with the panel via the 802.11 link, but is not so limited. The iHub manages cameras through a built-in 802.11 router. The iHub listens to the existing RF HW sensors, and relays sensor information to the Concord panel (emulating the equivalent GE sensor). The wired sensors of the legacy system are connected to the wired zones on the control panel.


The GE-iControl Touchscreen can be used with either of an 802.11 connection or Ethernet connection with the iHub. Because the takeover involves a GE Concord panel (or Simon XT), the touchscreen is always an option. No extra wiring is required for the touchscreen as it can use the 4-wire set from the replaced keypad of the legacy system. This provides power, battery backup (through Concord), and data link (RS485 Superbus 2000) between Concord and touchscreen. The touchscreen receives its broadband connectivity through the dedicated 802.11 link to the iHub.



FIG. 80 is a block diagram of a security system in which the legacy panel is replaced with a GE Concord panel connected to an iHub via an Ethernet coupling, under an embodiment. All existing wired and RF sensors remain in place. The iHub is located near the Concord panel, and wired to the panel using a 4-wire Superbus 2000 (RS485) interface, but is not so limited. The iHub manages cameras through a built-in 802.11 router. The iHub listens to the existing RF HW sensors, and relays sensor information to the Concord panel (emulating the equivalent GE sensor). The wired sensors of the legacy system are connected to the wired zones on the control panel.


The takeover installation process is similar to the installation process described above, except the control panel of the legacy system is replaced; therefore, only the differences from the installation described above are provided here. The takeover approach of an embodiment uses the existing RS485 control interfaces that GE Security and iControl support with the iHub, touchscreen, and Concord panel. With these interfaces, the iHub is capable of automatically enrolling sensors in the panel. The exception is the use of an iControl RF card, compatible with legacy systems, to ‘take over’ existing RF sensors. A description of the takeover installation process follows.


During the installation process, the iHub uses an RF Takeover Card to automatically extract all sensor IDs, zones, and names from the legacy panel. The installer removes the connections of the hardwired sensors at the legacy panel and labels each with its zone. The installer pulls the legacy panel and replaces it with the GE Concord panel. The installer also pulls the existing legacy keypad and replaces it with either a GE keypad or a GE-iControl touchscreen. The installer connects the legacy hardwired sensors to the appropriate wired zones (from the labels) on the Concord. The installer connects the iHub to the local network and connects the iHub RS485 interface to the Concord panel. The iHub automatically ‘enrolls’ legacy RF sensors into the Concord panel as GE sensors (maps IDs), and pushes or otherwise propagates other information gathered from the HW panel (zone, name, group). The installer performs a test of all sensors back to the CMS. In operation, the iHub relays legacy sensor data to the Concord panel, emulating equivalent GE sensor behavior and protocols.


The areas of the installation process particular to the legacy takeover include how the iHub extracts sensor info from the legacy panel and how the iHub automatically enrolls legacy RF sensors and populates Concord with wired zone information. Each of these areas is described below.


In having the iHub extract sensor information from the legacy panel, the installer ‘enrolls’ the iHub into the legacy panel as a wireless keypad (using the install code and house ID available from the panel). The iHub legacy RF Takeover Card is a compatible legacy RF transceiver. The installer uses the web portal to place the iHub into ‘Takeover Mode’, and the web portal then automatically instructs the iHub to begin extraction. The iHub queries the panel over the RF link (to get all zone information for all sensors, wired and RF). The iHub then stores the legacy sensor information received during the queries on the iConnect server.


The iHub also automatically enrolls legacy RF sensors and populates the Concord with wired zone information. In so doing, the installer selects ‘Enroll legacy Sensors into Concord’ (the next step in the ‘Takeover’ process on the web portal). The iHub automatically queries the iConnect server and downloads the legacy sensor information previously extracted. The downloaded information includes an ID mapping from legacy ID to ‘spoofed’ GE ID. This mapping is stored on the server as part of the sensor information (e.g., the iConnect server knows that the sensor is a legacy sensor acting in GE mode). The iHub instructs the Concord to go into install mode, and sends the appropriate Superbus 2000 commands for sensor learning to the panel. For each sensor, the ‘spoofed’ GE ID is loaded, and the zone, name, and group are set based on the information extracted from the legacy panel. Upon completion, the iHub notifies the server, and the web portal is updated to reflect the next phase of the Takeover (e.g., ‘Test Sensors’).


Sensors are tested in the same manner as described above. When a HW sensor is triggered, the signal is captured by the iHub legacy RF Takeover Card, translated to the equivalent GE RF sensor signal, and pushed to the panel as a sensor event on the SuperBus 2000 wires.
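A minimal sketch of this relay step follows, assuming a mapping table of the form described above; the names (legacyToGeId, panelBus, relayLegacySensorEvent) and the example IDs are hypothetical and not taken from the embodiment:

 var legacyToGeId = { "LGCY-00412": "GE-07731" }; //example legacy-to-'spoofed' GE ID mapping from the server
 var panelBus = { send: function (msg) { /* write the equivalent GE sensor event to the SuperBus 2000 */ } }; //stub

 function relayLegacySensorEvent(event) { //event captured by the legacy RF Takeover Card
  var geId = legacyToGeId[event.sensorId];
  if (!geId) return; //unknown legacy sensor: do not forward
  panelBus.send({ sensorId: geId, state: event.state, zone: event.zone }); //push as a GE sensor event
 }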


In support of remote programming of the panel, CMS pushes new programming to Concord over a phone line, or to the iConnect CMS/Alarm Server API, which in turn pushes the programming to the iHub. The iHub uses the Concord Superbus 2000 RS485 link to push the programming to the Concord panel.



FIG. 81 is a flow diagram for automatic takeover 2100 of a security system, under an embodiment. Automatic takeover includes establishing 2102 a wireless coupling between a takeover component running under a processor and a first controller of a security system installed at a first location. The security system includes some number of security system components coupled to the first controller. The automatic takeover includes automatically extracting 2104 security data of the security system from the first controller via the takeover component. The automatic takeover includes automatically transferring 2106 the security data to a second controller and controlling loading of the security data into the second controller. The second controller is coupled to the security system components and replaces the first controller.



FIG. 82 is a flow diagram for automatic takeover 2200 of a security system, under an alternative embodiment. Automatic takeover includes automatically forming 2202 a security network at a first location by establishing a wireless coupling between a security system and a gateway. The gateway of an embodiment includes a takeover component. The security system of an embodiment includes security system components. The automatic takeover includes automatically extracting 2204 security data of the security system from a first controller of the security system. The automatic takeover includes automatically transferring 2206 the security data to a second controller. The second controller of an embodiment is coupled to the security system components and replaces the first controller.


Home View as described herein enables users to quickly access devices, view their state, and control them from a single user experience. Home View provides an easy way for users to represent each floor of their home and indicate the location of security sensors, cameras, lights, thermostats, locks, and any other devices in the home automation system. Using this interface, users can easily check on the state of their home from anywhere using a mobile phone or web browser. To further enhance the “glanceable” experience of home management, the Home View of an embodiment includes a three-dimensional version referred to herein as “Home View 3D”. Home View 3D provides the added ability to see all locations in a multi-floor dwelling at once. For example, a user can instantly notice an open window upstairs, turn off a light, view the temperature on each floor, and access cameras outside with a single click, to name a few.



FIG. 83 is an example status interface of Home View 3D, under an embodiment. FIG. 84 is an example user interface of Home View 3D, under an embodiment.


To enable Home View 3D, the user can edit the representation of their home using one or more of a web browser, smart phone, and tablet computer, and select or click the Home View 3D option. That setting is saved in the cloud-based environment or other server environment, and changes the user's web and mobile devices to use a 3D view. Home View 3D provides a unique and powerful visualization of the home that lets the user feel connected to and in control of their home from anywhere in the world. FIG. 85 is an example user interface showing “enable” control of Home View 3D, under an embodiment.


Home View 3D is disabled by default, and a user can enable it in any editor of an embodiment. Home View 3D includes options in the editor menu to toggle the 3D option. These settings affect or are applied to all client devices that interface with the site (e.g., after next login, depending on caching). FIG. 86 is an example user interface showing “disable” control of Home View 3D, under an embodiment.


Additionally, when Home View 3D is enabled, the editor displays an indicator to that effect using the thumbnails, but the embodiment is not so limited. FIG. 87 is an example editor interface with indicators of Home View 3D being enabled, under an embodiment.


The 3D of an embodiment is a render-time feature, but is not so limited. The interaction with Home View 3D is as described in detail herein with a single-floor rendering (e.g., devices include popups indicating state, double-clicking devices causes navigation, etc.). In Home View 3D of an embodiment, if the canvas is non-square, the rendering stretches to fit the canvas (or display viewer). For example, on tablets the renderer can be wider than it is tall. Additionally, floating text for devices at the top edge of lower floors flips over to render below the device, just as they did for the 2D renderer described herein.


Regarding general rendering and scaling rules of an embodiment, Home View 3D primarily affects walls with isometric skewing to make them look tipped back. As an example, the front wall is full width, and the back wall is approximately 80% of normal width, giving the illusion of depth. Devices and text are not skewed and the device or text appears as if sitting upright on the tipped floors. Devices and text of an embodiment are scaled to match horizontal scaling. Specifically, devices and text on the front edge are approximately 100% normal size, and devices and text on the back edge are approximately 80% of normal size.
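A minimal sketch of this front-to-back scaling follows, assuming the row position is normalized so that 0 is the front edge and 1 is the back edge; the function name is hypothetical:

 //linearly interpolate device/text scale: 100% at the front edge, 80% at the back edge
 function deviceScaleForRow(rowFrac) { //rowFrac: 0 = front edge, 1 = back edge
  return 1 - 0.2 * rowFrac; //front: 1.0, halfway: 0.9, back: 0.8
 }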


Furthermore, floors are tapered so that a top floor is slightly wider than the bottom floor to add to the 3D illusion. Specifically, the front corners of the bottom floor render as they would in 2D (e.g., with a gutter on left/right), while the front corners of the top floor are approximately one pixel away from the canvas edge, but the embodiment is not so limited.


Home View 3D of an example embodiment supports between one and five floors, but is not so limited. FIG. 88 is an example user interface showing five floors, under an embodiment.


Home View 3D includes customization and branding but is not so limited. FIG. 89 is an example interface of Home View 3D showing variables, under an embodiment. Home View branding variables are as follows, but are not so limited:















 A. threeDScaleBackRowByPct = 0.8; //horizontally scale back wall (and icons and text) this %, 80% width of front edge
 B. threeDVertScaleSingleFloorPct = 0.75; //if rendering single floor 3D, scale vertically by this percent
 C. threeDVertFloorGapInTiles = 1.2; //insert vertical gap between floors, height is this many tiles
 D. threeDTopFloorColorStops = [{stop: 0, color: "rgb(180,180,180)"}, //color for back edge of top floor
 E.    {stop: 1, color: "rgb(180,180,180)"}]; //color for front edge of top floor
 F. threeDBotFloorColorStops = [{stop: 0, color: "rgb(180,180,180)"}, //color for back edge of shadow on lower floors
 G.    {stop: 0.75, color: "rgb(180,180,180)"}, //color for front edge of shadow on lower floors
 H.    {stop: 0.9, color: "rgb(180,180,180)"}, //color for back edge of lighted section of lower floors
 I.    {stop: 1, color: "rgb(180,180,180)"}]; //color for front edge of lighted section of lower floors
 J. threeDSubShadowGapInTiles = 2.5; //gap between bottom floor and sub-shadow; height is this many tiles
 K. threeDSubShadowColor = "rgba(0,0,0,0.15)"; //color and transparency of shadow (same shape as bottom floor)
 L. threeDSubShadowBlur = 20; //radius of blur for sub-shadow

Home View 3D presents more information when a device (e.g., tablet, phone, touch screen, etc.) is in landscape mode. When 3D is enabled and the host device is in landscape mode, the rendering of an embodiment is approximately 40% wider than it is tall, but the embodiment is not so limited. Further, the rendering should be centered both vertically and horizontally. FIG. 90 shows example renderings for square, wide, and tall canvases, 3D single-floor premises, and 3D multi-floor premises, under an embodiment.


In addition to rendering 3D, Home View 3D includes historical activity data or information for sensors, like a “heat map” for history that fades with time. For example, if a door opens or closes, the device icon will have a bright glow around it that fades with time. At a glance the user can tell where there has been recent activity. FIG. 91 is an example user interface showing a “heat map” of Home View 3D, under an embodiment. In this example, sensors in the “family room” and “living room” are displayed with a bright glow indicating recent activity, but the embodiment is not so limited.


This feature is activated on each client when the user selects or taps the history icon and enables history view by choosing a time period. Once a time period is selected, that client shows a history glow for all sensors that have had activity within that time period. For example, with 1 Week selected, a sensor that has been tripped today will have a strong glow, a sensor tripped 3 days ago will be half faded, a sensor tripped 6 days ago will have a very faint glow, and a sensor tripped 7 days ago (or more) will have no glow at all.
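A minimal sketch of this fade computation follows, assuming a linear fade across the selected time period; the function name and millisecond units are assumptions:

 //glow opacity for a sensor, given its last trip time and the selected period
 function historyGlowAlpha(lastTripMs, nowMs, periodMs) {
  var elapsed = nowMs - lastTripMs;
  if (elapsed < 0 || elapsed >= periodMs) return 0; //outside the window: no glow
  return 1 - elapsed / periodMs; //tripped now: 1.0; halfway through the period: 0.5
 }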


The heat map feature includes three UI elements, but is not so limited. An icon is used to enable and set the feature. By default, the icon is a standard history icon (clock in circle), but if history view is enabled, the circle contains the time period shown (10M = 10 minutes, 1D = 1 day, etc.). Additionally, a popup dialog enables the user to enable the feature and select a time period. A glow ring is shown around sensors, and the glow ring is configured to fade with the passage of time. FIG. 92 is an example user interface for configuring a “heat map” of Home View 3D, under an embodiment. FIG. 93 is another example user interface for configuring a “heat map” of Home View 3D, under an embodiment.


Embodiments display activity for the premises devices based on the type of device, but are not so limited. For example, activity presented for sensors includes the last update for any point in the instance (e.g., open/close, low battery, trouble, tamper, bypass, alarms, etc.). Activity presented for door locks and garage door controllers includes the last or most recent update for any point in the instance (e.g., open/close, lock/unlock, low battery, trouble, etc.). Activity presented for lights (without energy reporting) includes the last or most recent update for any point in the instance (e.g., on/off, dimmer level changes, offline, etc.). Activity presented for lights that report energy is the same (energy changes and related points may be ignored). Activity presented for thermostats includes the last or most recent update for any point in the instance (e.g., heating/cooling, setpoint changes, mode changes, low battery, etc.). Activity presented for cameras includes the last or most recent update for the motion sensor (picture/clip captures may not be reported). Energy devices may not report activity.
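The per-device-type activity rules above can be summarized as a lookup table; a minimal sketch follows, with the point names assumed for illustration:

 //which updates count as "activity" for each device type (names assumed)
 var activityPoints = {
  sensor:     ["open/close", "lowBattery", "trouble", "tamper", "bypass", "alarm"],
  doorLock:   ["open/close", "lock/unlock", "lowBattery", "trouble"],
  garageDoor: ["open/close", "lock/unlock", "lowBattery", "trouble"],
  light:      ["on/off", "dimmerLevel", "offline"], //energy changes and related points may be ignored
  thermostat: ["heating/cooling", "setpoint", "mode", "lowBattery"],
  camera:     ["motion"] //picture/clip captures may not be reported
 };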


When computing coordinates in two dimensions (2D), an embodiment uses a two-dimensional array (28×28) comprising information about each “tile” in the data grid for each floor. Here, a block of numbers from the serial data is provided to draw a large rectangle of floor tiles:














 for (i=0; i<tilesArr.length; i++) {
  if (tilesArr[i].length > 4) {
   x = (tilesArr[i][1]);
   y = (tilesArr[i][2]);
   w = (tilesArr[i][3]);
   h = (tilesArr[i][4]);
   //save individual tile data for editing
   for (row=y; row < (y+h) && row<this.numTiles; row++) {
    for (col=x; col < (x+w) && col<this.numTiles; col++) this.t[row][col].shown=true; //turn on tile for each value in vector
   }
   //remember full tile blocks, ONLY for superfast rendering (not edit mode, where segs are being changed constantly)
   point0 = this.pSkewXY( x     *this.tileWidth + this.startPosX,  y     *this.tileWidth + this.startPosY);
   point1 = this.pSkewXY((x + w)*this.tileWidth + this.startPosX,  y     *this.tileWidth + this.startPosY);
   point2 = this.pSkewXY((x + w)*this.tileWidth + this.startPosX, (y + h)*this.tileWidth + this.startPosY);
   point3 = this.pSkewXY( x     *this.tileWidth + this.startPosX, (y + h)*this.tileWidth + this.startPosY);
   this.tFastRender.push([point0, point1, point2, point3]);
  }
 }









For example, if the data included taadc, that becomes an array [0,0,3,2], meaning draw a rectangle from the origin, three tiles wide and two tiles high. The above code computes the true pixel position for those locations, converting the parameters to four (x,y) corners of the rectangle to render:

      point0 . point1      (x0,y0) . (x1,y1)
      point3 . point2  =>  (x3,y3) . (x2,y2)


The actual pixel location of each (x,y) coordinate is obtained by taking the abstract grid location and turning it into pixels. Each location is multiplied by the tileWidth, then offset by the rendering start positions startPosX and startPosY, which account for gutters. For example, to compute an abstract position like (3,2), the parameters are multiplied by the pixel width of a tile, and offset by the pixel positions startPosX and startPosY, as in the worked example following the formula below.


pixelPosition for (x,y)=(x*this.tileWidth+this.startPosX, y*this.tileWidth+this.startPosY)
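As a worked example under assumed values (tileWidth = 20 px, startPosX = startPosY = 10 px), the abstract position (3,2) maps to pixels as follows:

 var tileWidth = 20, startPosX = 10, startPosY = 10; //assumed values
 var px = 3 * tileWidth + startPosX; //70
 var py = 2 * tileWidth + startPosY; //50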


For 2D rendering, the pSkewXY function does not alter these pixel positions, but returns them unchanged. For 3D rendering, each x,y position is altered in several ways as follows, but the embodiment is not so limited:

    • 1. If there are multiple floors, each y position is scaled vertically (for example, if there are 2 floors, every y value is divided by 2). The first floor would be drawn from the origin, but the 2nd floor would also be offset vertically so it draws halfway down. In addition, the vertical offset is altered to provide a gap between floors.
    • 2. If there is a single floor, each position is scaled vertically to 60% of its height and offset to be vertically centered. This is controlled by a ppref.
    • 3. All x positions are altered by shifting them toward the vertical midline. For example, in a 100 px canvas, an x value of 50 is unchanged. However, if x is 0, it needs to be skewed 20% toward the center. Since the back row is to be scaled to 80% width, x is brought to 80% of its distance from the vertical midline. In this example, x would change to (50−abs(x−50)*0.8), so an x at 0, shifted 20% toward the midline, becomes x=10. This effect is reduced as lower rows (toward the front edge of the floor) are rendered: the back row is squeezed to 80%, and the front row is not horizontally squeezed at all, remaining at 100% of its original position.
    • 4. A front-to-back scaling factor must be computed for later shrinking of device icons and label text. Devices in the back (top) row are scaled to 80%, devices halfway back to 90%, and devices at the front edge (bottom) to 100%.


An example of the core skewing algorithm of an embodiment follows, in code, but the embodiment is not so limited:














 //------------------------------------------------------------
 // pSkewXY
 // arguments: absolute canvas x and y positions
 // return: object with x and y properties with new, skewed values
 //
 // In general, y skew is scaled by # floors (2 floors means y = y/2). X skew is more subtle.
 // If x is about halfway across, it's unaffected. And if y is the max, x is the front row and
 // unaffected. But the farther "back" you go, the more skewed x is. For example, in the first
 // row, x==0 will be bent in by the 80% factor, or 10% increased towards the middle.
 //------------------------------------------------------------
 ic_hvwFloorData.prototype.pSkewXY = function (px, py) {
  var devScale = 1; //computed amount to scale devices for each location. 1 for front edge (bottom row), .80 for back edge (top row)
  try {
   if (this.render3D) { //skewing ONLY affects render mode, not editor
    if (!this.cache) { //to ensure this is fast, precompute everything possible, only once per floor
     var scaleBackRowByPct = this.threeDScaleBackRowByPct, //horizontally scale back wall (and icons and text) this %
         vertScaleSingleFloorPct = this.threeDVertScaleSingleFloorPct, //if rendering single floor 3D, scale vert by this %
         floorGapInTiles = this.threeDVertFloorGapInTiles, //vert gap btwn floors, height is this many tiles (scaled by # floors)
         gapBetweenFloors = (this.numFloors>1)? (floorGapInTiles*this.tileHeight/(this.numFloors)) : 0; //gap in pixels if 3D & >1 flr
     this.cache = { }; //create or clear cache object
     this.cache.xSkewFactor = (1-scaleBackRowByPct); //constant controls amount of skew, such as .8 = 80% horiz scale
     this.cache.ySkewFactor = (1 - (this.numFloors-1)*(floorGapInTiles/this.numTiles)) / this.numFloors;
     this.cache.drawWidth = this.tileWidth * this.numTiles;
     this.cache.drawHeight = this.tileHeight * this.numTiles;
     this.cache.yOffset = ((this.numFloors - 1) - this.floorNum) * //amount to shift each floor down
         ((this.cache.drawHeight/this.numFloors) + gapBetweenFloors); //offset by # floors + gap
     if (this.numFloors == 1) { //if single floor
      this.cache.ySkewFactor *= vertScaleSingleFloorPct; //scale vertically
      this.cache.yOffset = ((1 - vertScaleSingleFloorPct) / 2) * this.cache.drawHeight; //and offset vertically so centered
     }
     this.cache.halfDrawWidth = this.cache.drawWidth / 2; //precompute for speed
     this.cache.xSkewMultiplier = this.cache.drawHeight * (this.cache.xSkewFactor) / 2;
     this.cache.yScaleFactor = (1-(this.cache.xSkewFactor)*(this.cache.drawHeight - this.startPosY)/(this.cache.drawHeight));
    }
    //compute skewed x, y positions, and scale for this row
    devScale = py*this.cache.xSkewFactor/this.cache.drawHeight + this.cache.yScaleFactor; //device scale: compute before altering py
    px += (1 - (px-this.startPosX)/this.cache.halfDrawWidth) * //add normal X factor skewing
          (1 - (py-this.startPosY)/this.cache.drawHeight) *    //but diminished by Y factor
          this.cache.xSkewMultiplier;                          //then scale overall
    py = (py-this.startPosY)*this.cache.ySkewFactor + this.startPosY + this.cache.yOffset; //remove start pos, skew, then add back
   }
  }
  catch (ev) {
   //console.log("Home View: pSkew failed "+ev);
  }
  return {x:px, y:py, scaleFor3D:devScale};
 };









Tapering of the floors in Home View 3D, as described in detail herein, means that the top floor is rendered slightly wider than the bottom floor. Since the render naturally has vertical gutters on the left and right edges, and these gutters are wider than needed because the floors are skewed and smaller, the algorithm of an embodiment renders the bottom floor with its gutter unchanged, and reduces the top floor gutter to approximately 35% of its normal width, as an example.


Before computing all the locations for rendering a floor, an embodiment shrinks this gutter for the higher floors. For example, with 3 floors, the gutters are approximately 35%, 57%, and 100% of their typical width, but are not so limited. Since the gutters are smaller, the floors are wider, so an embodiment grows the tile widths by that same approximate percent. An example algorithm is as follows, but is not so limited:














 if (render3D) { //This block makes the higher floors a bit wider then tapers inward to enhance 3D illusion
  this.cache = null; //need to clear pre-computed cache from lower floors
  var gutterPct = 0.35 + 0.65*((numFloors-1)-floorNum)/((numFloors>1)?(numFloors-1):1); //top floor: 35% gutter, bottom floor: 100% gutter
  this.startPosX *= gutterPct; //shrink startPosX that % to shift closer to edge
  this.tileWidth *= 1 + (2*(startPosX-this.startPosX)/(this.numTiles*this.tileWidth)); //grow tileWidth by same percent gutter shrank
 }









Embodiments of the integrated system described herein include a user interface (UI) that is a cross-platform UI providing control over home automation and security systems and devices from client devices including but not limited to tablets, smart phones, iOS devices, and Android devices. The user interface, also referred to herein as “Alta”, includes Home View, which as described in detail herein and in the Related Applications provides a top-level view in which premises devices are displayed according to their actual position in the premises.



FIG. 94 is an example UI screen, under an embodiment. Using the UI, devices are displayed in Home View, List View, and Details View. Home View comprises a top-level view in which premises devices are displayed according to their actual position in the premises. Tap on a device icon in Home View to view an overlay for quick access to details and controls (when applicable) for that device.


The Tab Bar, located at the bottom of the screen in an example embodiment, provides navigation through the app. A tap detected on a device group's icon in the Tab Bar navigates to a Details View page of either the last viewed device in that group, or otherwise the first listed device in the group.


Details View displays information and controls for a device; this view does not exist for Sensors. Swipe left or right to move between devices.


The List icon to the right of the device name in Details View pulls up the device list for the current group (e.g., if previously viewing a thermostat, a list of installed thermostats is displayed). This List View appears over the content area in a vertically scrolling list. Tap a device in the list to view its details in Details View. Tap the X icon to the right of the list header to close the list and return to the previously-viewed device.


The navigation layers appear in the UI, from top to bottom for example, as follows: Full-Screen View and Status Bar (with mini system icon); Modal dialogs and Menus; Warnings/Messages and Status Bar (no mini system icon)/System Bars; History View; Tab Bar; List View; Detail View and Home View; Background.



FIG. 95 shows an example Status Bar of the UI, under an embodiment. The Status Bar of an example includes the following elements: System status text (e.g., “Disarmed”), and Sensor status text (e.g., “All Quiet”).



FIG. 96 shows an example System Bar of the UI, under an embodiment. The System Bar of an embodiment includes but is not limited to the following elements: Mode Button (shows current mode); System Icon (reflects current arm status by color, also shows badging for tripped sensors and system trouble); and Arm/Disarm Button (shows arm/disarm action). The System Bar shows current Mode and Arm status and allows security panel arm and disarm options and mode changes. Tap the Arm button to select arming options. Tap the Disarm button to disarm. Tap the Mode button to select a new mode. Press the system icon as a shortcut to the sensors list.



FIG. 97 shows an example Tab Bar of the UI, under an embodiment. The Tab Bar of an example provides navigation to each section in the app, and is pinned to the bottom of the screen and displayed on top of the content area, but is not so limited. The current section is highlighted. The touch area for each tab is larger than the icon. Sections marked with an asterisk (*) are shown only if devices in that group are installed: List; Home (Home View/Sensors: HV shown first if set up, and Sensors otherwise); Cameras*; Lights*; Thermostats*; Doors*; Energy*; and Settings. When sensors (in Home), cameras, or any lifestyle device group include at least one trouble state like offline, unknown or installing, a trouble badge appears on the top right corner of the icon.
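A minimal sketch of the trouble-badge condition follows, assuming each device reports one of the listed states; the names are hypothetical:

 //show a trouble badge on a group tab when any device in the group is in a trouble state
 function groupHasTrouble(devices) {
  var troubleStates = { offline: true, unknown: true, installing: true };
  return devices.some(function (d) { return !!troubleStates[d.state]; });
 }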


The UI of an embodiment includes content area positioned under the System Bar. When scrolling vertically, content scrolls under the Tab Bar at the bottom and under the System Bar and Status Bar at the top. The content area includes information about devices or other system information in several ways. Home View is a visual arrangement of security and lifestyle devices on a floor plan. Details View is a detailed device screen that fills the content area and can also be scrolled horizontally to move between devices. List View is a vertically scrolling list displayed in a sheet. Scroll horizontally to move between device groups. In Full Screen mode, the content area is the entire area beneath the Status Bar. The type of content displayed in the content area in Full Screen mode includes: live video; captured clips; captured pictures; third party content (widgets); and installer application.



FIG. 98 shows an example Details View of the UI, under an embodiment. Details View displays information and controls for a device (or other types of information such as settings or widget icons). Device details appear when tapping a device icon in Home View (swipe navigation disabled, no list icon, appears in a dark dialog box), tapping the tab icon for a device group (horizontal swiping enabled, list icon, floats in content area), or tapping a device from list view (horizontal swiping enabled, list icon, floats in content area). Tap the List icon to see a list of devices in the current device section.


On Home View, device details are presented in a card overlay. FIG. 99 shows two versions of an example Details Card in Home View of the UI, under an embodiment.



FIG. 100 shows an example List View of the UI, under an embodiment. FIG. 101 shows an example List layout of List View of the UI, under an embodiment. List View is a compact display of items in a vertically scrolling list. For lifestyle devices, List View is accessible by tapping the List icon in Details View. When a list is displayed, tap on the List icon to hide the list. The list is dismissed and the view returned to the previous view, as if the X icon was tapped. When viewing Home View, tap on the List icon to navigate to the Sensors list. If Home View is toggled off, the Sensors list becomes the default view when tapping on the Home icon. The types of information that may be displayed in a List View include sensors, cameras, lights, thermostats, doors, and energy, but embodiments are not so limited.



FIG. 102 shows a device list item of the UI, under an embodiment. Each device list item includes but is not limited to the following: Icon (indicates device type and status, vertically-centered, left-aligned); Name (bold text, left-aligned) in either portrait (listed on first row) or landscape (listed inline, left-aligned); Zone number (sensors only, normal text) in either portrait (listed on second row, left-aligned) or landscape (Inline, left-aligned following the device name); and Status text (sensor and device lists only, normal text, includes other secondary status such as “Stopped” or “Low battery”) in either portrait (vertically centered, right-aligned) or landscape (Inline, right-aligned). Tap an item in List View to close the list and see the Detail View for the selected item. In Settings, this may navigate to another List View menu.



FIG. 103 shows an example Settings Menu of the UI, under an embodiment. Menus of an embodiment include the same structure as lists, but may not have icons. Menus also cover the rest of the UI when they are displayed, though they maintain the same width as a list in landscape, and can be closed by tapping the X icon at the top right of the header. Menus follow the same vertical scroll behavior as lists.



FIG. 104 shows an example Events History View of the UI, under an embodiment. Event history views or lists scroll vertically under the header. Events are organized under Date headers and listed in reverse chronological order (with the most recent event at the top). The top header is pinned to the top and the entries for that day scroll under the date header, but the next header pushes the previous header under the sheet header and then is pinned until the next header pushes it away. Individual entries are time-stamped. Sensors, Cameras, and Notable Events present history as a list of events. Each sensor and camera has its own history, while Notable Events is a single history sheet.


History View is used when viewing a list of recent or past events for a system or device. These sheets slide up from the bottom and, when dismissed by tapping the X icon, slide down to reveal the UI underneath. The header includes the device name and an ‘X’ button to close the history view and return to the previous view. Side-swipe navigation is disabled in this view. History View may present history as a scrolling list of events or as a history graph.


The UI of an embodiment includes line graphs, but is not so limited. FIG. 105 shows example thermostat line graphs of the UI, under an embodiment. Thermostats (using line graphs) and Energy devices (using bar graphs) present history in graph form for each device.


System warnings and messages appear in a message bar, which is positioned beneath the System Bar in an embodiment. FIG. 106 shows example versions of a dismissable message in a message bar of the UI, under an embodiment. FIG. 107 shows example versions of a non-dismissable message in a message bar of the UI, under an embodiment.


The Message Bar appears whenever a message needs to be shown, regardless of context, except when in full-screen mode (third party apps, full-screen video or device manager for example). Two levels of message urgency are handled by the message bar, which include Info (grey) and Warning (yellow/red text/red icon). Messages may be hidden or dismissed as indicated by the icon on the right side of the message: Dismissable (has a “dismiss” X icon, and optionally a timeout); or Non-dismissable (“hide” icon, no timeout). Warning messages also add the appropriate badge to the System Icon.


Messages stack up as they accumulate until they are hidden or dismissed. They are displayed in chronological order, but sorted with warnings grouped at the top and information messages in a group below warnings. FIG. 108 shows example versions of multiple messages presented by the UI, under an embodiment.
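A minimal sketch of this ordering follows, assuming each message carries an urgency level and a timestamp; the field names are assumptions:

 //warnings grouped above info messages; chronological order within each group
 function sortMessages(messages) {
  var rank = { warning: 0, info: 1 }; //assumed urgency levels
  return messages.slice().sort(function (a, b) {
   if (rank[a.level] !== rank[b.level]) return rank[a.level] - rank[b.level];
   return a.timestamp - b.timestamp;
  });
 }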


Dismissable messages with a timeout value are automatically hidden when they expire. Otherwise, messages are individually dismissed or hidden. If the message is dismissable, tapping the dismiss icon dismisses the message permanently. If the message is non-dismissable, tapping the hide icon hides the message temporarily. When a non-dismissable message is hidden, the system icon badge bounces to reinforce the connection and serves as a reminder of where to find the message again.


To show hidden (non-dismissable) messages again, tap the system icon. The sensor list is displayed and the non-dismissable messages reappear.


Upon completing sign-in procedures via the UI, the Home section is loaded and includes both Home View and the Sensors list. If Home View is set up, the app defaults to Home View after the app is launched. On sites without Home View set up, the app defaults to the Sensors list on launch and the page dots are hidden. Page dots in this tab indicate Home View and the Sensors list. The Home tab icon displays the trouble badge based on the status of the sensors, not on the status of Home View. From Home View, swipe right to left to go to the Sensors list. From the Sensors list, swipe left to right to go to Home View.



FIG. 109 shows example versions of a Home View (3D, multiple floors) screen or page of the UI, under an embodiment. Home View allows users to view all installed premises devices at a glance. FIG. 110 shows an example Home View (2D, multiple floors) screen or page of the UI, under an embodiment.



FIG. 111 shows an example Home View device control screen or page of the UI, under an embodiment. Tap on a device to show its details in a card overlay on top of Home View. Cameras will open the live video window.



FIG. 112 shows an example Notable Events screen or page of the UI, under an embodiment. Tap on the History icon to view Notable Events. If a site has more than one floor and Home View is in 2D mode, switching between floors is handled by the Home View component: tap on a floor thumbnail to the right of the floor plan to navigate to a new floor.



FIG. 113 shows example versions of a sensor list screen or page of the UI, under an embodiment. Users without Home View enabled will see the Sensors list with the site name visible when navigating to the Home tab. This also becomes the default view when loading into the app as long as Home View is disabled. Each device list item includes but is not limited to the following: Icon (indicates device type and status, vertically-centered, left-aligned); Name (bold text, left-aligned) in either portrait (listed on first row) or landscape (listed inline, left-aligned); Zone number (sensors only, normal text) in either portrait (listed on second row, left-aligned) or landscape (inline, left-aligned following the device name); and status text (sensor and device lists only, normal text, includes other secondary status such as “Stopped” or “Low battery”) in either portrait (vertically centered, right-aligned) or landscape (inline, right-aligned). Tap on a sensor in the list to view that sensor's history.



FIG. 114 shows example Sensor History screens or pages of the UI, under an embodiment. The Sensor History screen includes history for the selected sensor, as well as controls (e.g., Bypass) for the sensor where applicable. This history sheet can appear by tapping an icon in Home View (scrolling disabled) or tapping on a sensor in List View (scrolling enabled, swiping shortcut between sheets enabled). Sensor History displays recent (since last reboot) activity for that sensor in a scrolling list.



FIG. 115 shows an example of arm options presented by the UI, under an embodiment. Press the Arm button to show the arm options dialog. FIG. 116 shows an example of arm protest presented by the UI, under an embodiment. Arm protest is shown if some sensors are open or troubled while attempting to arm; a list of those sensors is shown. FIG. 117 shows an example of arm protest failed presented by the UI, under an embodiment.


In the event of an alarm, the alarm dialog is shown. FIG. 118 shows an example of the alarm dialog presented by the UI, under an embodiment.



FIG. 119 shows an example of modes dialog presented by the UI, under an embodiment. The Modes button uses the name of the current mode for its label, or the partner-specified feature name otherwise. Press the Modes button to show a dialog including a list of modes. Select a mode to change the mode on the site and close the dialog or tap on the X icon to close the dialog without changing the mode.



FIG. 120 shows examples of camera detail screens presented by the UI, under an embodiment. Tap on the Cameras tab to see the first camera from the list in Details View. A recent thumbnail from the camera (scaled to fit) is displayed with a Play icon in the center. Tap on the image to view a full-screen landscape Live Video stream for the camera. If the camera is offline, the last available picture is displayed if available, or a gray placeholder box and the Play icon is replaced with text that reads “Offline”. On sites having multiple cameras, swipe left or right to navigate between cameras. A History icon navigates to a list of video clips and pictures recorded from the camera. A List icon navigates to a List View of installed cameras.


From the Details View, a tap on the List icon results in presentation of the Cameras list. FIG. 121 shows example camera device lists presented by the UI, under an embodiment. Cameras with exceptional states have status text, though all cameras display status through their status icon. Tap on a camera in the list to see the Details View for that camera. Tap on the X to return to the Details View for the last-viewed camera.



FIG. 122 shows an example of camera full-screen live video presented by the UI, under an embodiment. FIG. 123 shows camera capture options (e.g., “Take Picture”, “Take Video Clip”, etc.) presented by the UI, under an embodiment. FIG. 124 shows a capture message (e.g., “Capturing Video Clip . . . ”) presented by the UI, under an embodiment. Tap anywhere while viewing Live Video to bring up the Video Timeline, as well as a Capture button to capture clips or pictures and an X icon to close and return to the Details View of the current camera (also possible with the OS back button). Tap on a Capture button to present the following options: Take Picture (take a picture on the current camera and see the following message: “Your picture will be taken in a few seconds.”) or Capture Video Clip (capture a video clip on the current camera and see the following message: “Your video capture will begin in a few seconds.”). The message is displayed in an overlay for a couple seconds and then disappears.



FIG. 125 shows example camera history (clips and pictures) views presented by the UI, under an embodiment. Tap the History icon on the Details View of a camera to see history for that camera. Video clips and pictures are displayed together in a scrolling thumbnail grid list, and are sorted under Date headers (same as Sensor History). As a result, the timestamp under the photo now only displays the time when captured, not the date. Tap on a thumbnail to see the selected clip or picture in full-screen view.


Embodiments also include support for various types of switches. More particularly, four types of switches are supported, but embodiments are not so limited: ON/OFF functionality; ON/OFF functionality with variable dimming control options; ON/OFF functionality with Energy Monitor (Multifunction Device); and ON/OFF functionality with variable dimming control options and Energy Monitor (Multifunction Device).



FIG. 126A shows an example binary switch icon (“off” state) presented by the UI, under an embodiment. FIG. 126B shows an example binary switch icon (“on” state, indicated by different color than “off” state) presented by the UI, under an embodiment.



FIG. 127A shows an example UI page with a binary switch (e.g., coffee maker, etc.) icon (“off” state) presented by the UI, under an embodiment. FIG. 127B shows an example UI page with a binary switch (e.g., coffee maker, etc.) icon (“on” state, indicated by different color than “off” state) presented by the UI, under an embodiment.



FIG. 128A shows an example dimmer switch icon (“off” state) presented by the UI, under an embodiment. FIG. 128B shows an example dimmer switch icon (“on” state, indicated by a different color than the “off” state) presented by the UI, under an embodiment. FIG. 128C shows an example dimmer switch icon (in use) presented by the UI, under an embodiment. Regarding dimmer interaction, “Tap” means tapping in the center to turn the device on/off; tapping on the dimmer track jumps to the closest point on the track. “Drag” means dragging along the dimmer track: the control jumps to the finger location, and then follows the finger location at a pre-specified ratio (e.g., 1:1, etc.), as sketched below.
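A minimal sketch of the track-to-level mapping follows, assuming a horizontal track and a 1:1 follow ratio; the function name and geometry are assumptions:

 //map a touch position along the dimmer track to a dim level (percent)
 function dimmerLevelFromTouch(touchX, trackStartX, trackLengthPx) {
  var t = (touchX - trackStartX) / trackLengthPx; //normalized position along the track
  t = Math.max(0, Math.min(1, t)); //clamp to the track
  return Math.round(t * 100); //0 = off end, 100 = full brightness
 }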



FIG. 129A shows an example UI page with a dimmer switch (e.g., light, etc.) icon (“off” state) presented by the UI, under an embodiment. FIG. 129B shows an example UI page with a dimmer switch (e.g., light, etc.) icon (“on” state, indicated by a different color than the “off” state) presented by the UI, under an embodiment.


Additionally, embodiments include support for thermostats. FIGS. 130A and 130B show example thermostat state icons presented by the UI, under an embodiment. The thermostat state icons include, but are not limited to, auto mode cooling, auto mode heating, cool mode cooling, heat mode heating, fan not supported, auto mode (not heating/cooling), off, and offline. The thermostat control includes: current temperature; heating and cooling set point indicators; heating and cooling visual indicator; thermostat mode button (tap to display); and fan mode button (tap to display). Thermostats can have the following combinations of features but are not so limited: heat, cool and off thermostat modes; heat, cool, off and automatic thermostat modes; heat, cool and off thermostat modes plus on, auto fan modes; and heat, cool, off and auto thermostat modes plus on, auto fan modes.



FIG. 131 shows set point drag and tap areas of a thermostat presented by the UI, under an embodiment. “Tap” means tapping anywhere within the four quadrants of the control to change the set point in increments (e.g., one degree, etc.), as indicated by the +/− icons. “Drag” means dragging along either set point track: the set point jumps to the finger location (drag only, not tap), and then follows the finger location at a ratio (e.g., 1:1 ratio, etc.). A sketch of the quadrant behavior follows.
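A minimal sketch of the quadrant-tap behavior follows; the assignment of quadrants (heat on the left, cool on the right, increment above center, decrement below) is an assumption for illustration only:

 //change a set point by one degree based on which quadrant of the control was tapped
 function onThermostatTap(x, y, centerX, centerY, setpoints) {
  var target = (x < centerX) ? "heat" : "cool"; //assumed: heat quadrants on the left
  var delta = (y < centerY) ? +1 : -1; //assumed: increment above center, decrement below
  setpoints[target] += delta;
  return setpoints;
 }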



FIG. 132 shows the thermostat set point (heat/cool) slider in use (top), and increment/decrement function in use (bottom) as presented by the UI, under an embodiment. When the thermostat is actively heating, the heating indicator (center glow) and “Heating” label are shown. When the thermostat is actively cooling, the cooling indicator (center glow) and “Cooling” label are shown. When the thermostat is off, the temperature controls are hidden.



FIG. 133A shows example versions of thermostat details (auto mode) screens presented by the UI, under an embodiment. FIG. 133B shows an example thermostat (actively heating) screen presented by the UI, under an embodiment. FIG. 133C shows an example thermostat (actively cooling) screen presented by the UI, under an embodiment. FIG. 133D shows an example thermostat (changing cool setpoint) screen presented by the UI, under an embodiment. FIG. 133E shows an example thermostat (off) screen presented by the UI, under an embodiment.


Embodiments include a tap-to-popup UI for operation mode (Mode) and fan mode (fan icon). FIG. 134 shows mode selection popups presented by the UI, under an embodiment. Tapping Mode or fan icon shows a popup with mode selector+X icon to close.


Furthermore, embodiments include support for various types of door locks. FIG. 135 shows an example door lock control tap and drag control screen presented by the UI, under an embodiment. The door lock and garage door controls include current state (e.g., open, closed, locked, unlocked) and lock ring with drag handle.



FIG. 136 shows example lock icons (e.g., locked state, unlocked state, low battery) presented by the UI, under an embodiment. To lock or unlock the door, drag up, down, radially or diagonally within the upper-left portion of the control. The handle will not stay in a position other than 0° or 90°, but is not so limited. For example, the handle can be dragged to any location between 0° and 90° but on release, the handle will reset to the nearest position. Tap the center lock icon to toggle between locked/unlocked or open/closed.
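A minimal sketch of the release-snap behavior follows; the helper names are hypothetical:

 function animateHandleTo(angleDeg) { /* animate the handle to its rest position */ } //stub
 function setLockState(state) { /* send the lock/unlock command */ } //stub

 //on release, snap the drag handle to the nearest rest position (0 or 90 degrees)
 function onHandleRelease(angleDeg) {
  var snapped = (angleDeg < 45) ? 0 : 90;
  animateHandleTo(snapped);
  setLockState(snapped === 90 ? "locked" : "unlocked"); //assumed: 90 degrees = locked
 }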



FIG. 137A shows an example UI with door lock details icon (inactive) presented by the UI, under an embodiment. FIG. 137B shows an example UI page with door lock details icon (active) presented by the UI, under an embodiment.



FIG. 138A shows an example UI page with garage door details icon (inactive) presented by the UI, under an embodiment. FIG. 138B shows an example UI page with garage door details icon (active) presented by the UI, under an embodiment.


Embodiments also include support for various types of energy meters or devices. FIG. 139 shows an example energy meter details page of the UI, under an embodiment. Energy devices show current power usage on a semi-logarithmic scale but are not so limited.


Embodiments include a system comprising an automation network comprising a gateway at a premises. The gateway is coupled to a remote network. The gateway is configured to control a plurality of components at the premises including at least one of a thermostat and a lock. The system includes a sensor user interface (SUI) coupled to the gateway and presented to a user via a plurality of remote client devices. The SUI includes a plurality of display elements for managing and receiving data of the plurality of components agnostically across the plurality of remote client devices. The plurality of display elements includes an interactive icon comprising a plurality of control regions. Each control region is configured to control a state change of a corresponding component.


Embodiments include a system comprising: an automation network comprising a gateway at a premises, wherein the gateway is coupled to a remote network, wherein the gateway is configured to control a plurality of components at the premises including at least one of a thermostat and a lock; and a sensor user interface (SUI) coupled to the gateway and presented to a user via a plurality of remote client devices, wherein the SUI includes a plurality of display elements for managing and receiving data of the plurality of components agnostically across the plurality of remote client devices, wherein the plurality of display elements includes an interactive icon comprising a plurality of control regions, wherein each control region is configured to control a state change of a corresponding component.


The plurality of control regions of the interactive icon is configured to control a plurality of states of the corresponding component.


The interactive icon includes a thermostat icon corresponding to the thermostat, and the plurality of control regions includes a plurality of quadrants.


Each control region is configured to change a setting of the thermostat in response to an action received in the control region.


The thermostat icon includes a display region configured to display a plurality of state information of the corresponding environmental control system.


A first quadrant is configured to control an increase in a heat point setting of the corresponding thermostat.


A second quadrant is configured to control a decrease in a heat point setting of the corresponding thermostat.


A third quadrant is configured to control an increase in a cool point setting of the corresponding thermostat.


A fourth quadrant is configured to control a decrease in a cool point setting of the corresponding thermostat.


The interactive icon includes a lock icon corresponding to the lock.


Each control region is configured to change a setting of the lock in response to an action received in the control region.


The lock icon includes a display region configured to display a plurality of state information of the corresponding lock.


The plurality of control regions includes a first control region configured to toggle a state of the corresponding lock in response to a tap received in the first control region.


The plurality of control regions includes a second control region configured to control locking of the corresponding lock in response to a tap received in the second control region.


The plurality of control regions includes a third control region configured to control unlocking of the corresponding lock in response to a tap received in the third control region.


The plurality of control regions includes a handle icon.


The plurality of control regions includes a fourth control region configured to control unlocking of the corresponding lock in response to dragging and releasing the handle icon in the fourth control region.


The plurality of control regions includes a fifth control region configured to control locking of the corresponding lock in response to dragging and releasing the handle icon in the fifth control region.


The plurality of components includes a door, and the interactive icon includes a door icon corresponding to the door.


Each control region is configured to change a state of the door in response to an action received in the control region.


The door icon includes a display region configured to display a plurality of state information of the corresponding door.


The plurality of control regions includes a first control region configured to toggle a state of the corresponding door in response to a tap received in the first control region.


The plurality of control regions includes a second control region configured to control closing of the corresponding door in response to a tap received in the second control region.


The plurality of control regions includes a third control region configured to control opening of the corresponding door in response to a tap received in the third control region.


The plurality of control regions includes a handle icon.


The plurality of control regions includes a fourth control region configured to control opening of the corresponding door in response to dragging and releasing the handle icon in the fourth control region.


The plurality of control regions includes a fifth control region configured to control closing of the corresponding door in response to dragging and releasing the handle icon in the fifth control region.


The plurality of remote client devices includes one or more of a smart phone, a mobile phone, a cellular phone, a tablet computer, a personal computer, and a touchscreen device.


The controlling of the plurality of components at the premises includes controlling interoperability among the plurality of components.


The gateway is configured using data of the plurality of components.


At least one of the gateway and the plurality of remote devices is configured to perform a synchronization to associate the plurality of remote devices with the plurality of components.


The plurality of remote devices includes applications that receive the data from and transmit control instructions to the plurality of components via the gateway.
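
One plausible shape for this relay pattern, sketched under assumed names (Gateway, synchronize, control) and not taken from the disclosure, is a gateway object that synchronizes clients with the component inventory and then accepts control instructions only from synchronized clients:

```python
# Hedged sketch of the relay pattern: client applications exchange data
# and control instructions with premises components only via the gateway.

class Gateway:
    def __init__(self):
        self.components = {}   # component id -> current state
        self.clients = set()

    def register_component(self, cid, state):
        self.components[cid] = state

    def synchronize(self, client):
        """Associate a remote client with the premises components by
        handing it the current component inventory and states."""
        self.clients.add(client)
        return dict(self.components)

    def control(self, client, cid, new_state):
        """Accept a control instruction from a synchronized client."""
        if client not in self.clients:
            raise PermissionError("client not synchronized with gateway")
        self.components[cid] = new_state  # forward to the component
        return self.components[cid]


gw = Gateway()
gw.register_component("front-door-lock", "locked")
inventory = gw.synchronize(client="phone-app")
print(inventory)                                               # {'front-door-lock': 'locked'}
print(gw.control("phone-app", "front-door-lock", "unlocked"))  # unlocked
```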


The plurality of display elements includes display elements comprising a representation of a floor plan layout of the premises, wherein the floor plan layout includes representations of the plurality of components.


The interactive icon is configured as an overlay on the floor plan layout.


The floor plan layout visually and separately indicates a location and a state of the plurality of components, wherein the state includes current state and historical state.


The floor plan layout includes a three-dimensional representation of the floor plan.


The floor plan layout includes configuration data for each of the plurality of components.
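
A minimal sketch of a floor plan record carrying, for each component, its location, current state, state history, and configuration data might look as follows; all field names are assumptions for illustration.

```python
# Illustrative-only floor plan layout record: per-component location,
# current state, historical state, and configuration data.

from dataclasses import dataclass, field


@dataclass
class PlacedComponent:
    component_id: str
    location: tuple          # (x, y), or (x, y, z) for a 3-D floor plan
    current_state: str
    history: list = field(default_factory=list)   # historical states
    config: dict = field(default_factory=dict)    # per-component config data

    def update(self, new_state):
        self.history.append(self.current_state)
        self.current_state = new_state


floor_plan = [
    PlacedComponent("front-door", (0.0, 4.5), "closed",
                    config={"icon": "door", "overlay": True}),
    PlacedComponent("hall-thermostat", (3.2, 2.1, 1.5), "idle",
                    config={"icon": "thermostat", "overlay": True}),
]
floor_plan[0].update("open")
print(floor_plan[0].current_state, floor_plan[0].history)  # open ['closed']
```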


The plurality of components at the premises includes network devices and a security system comprising security system components.


Embodiments include a method comprising configuring an automation network to include a gateway at a premises. The gateway is coupled to a remote network. The method includes configuring the gateway to control a plurality of components at the premises including at least one of a thermostat and a lock. The method includes configuring a sensor user interface (SUI) to include a plurality of display elements for managing and receiving data of the plurality of components agnostically across a plurality of remote client devices. The SUI is coupled to the gateway and presented to a user via the plurality of remote client devices. The plurality of display elements includes an interactive icon comprising a plurality of control regions. Each control region is configured to control a state change of a corresponding component.


Embodiments include a method comprising: configuring an automation network to include a gateway at a premises, wherein the gateway is coupled to a remote network; configuring the gateway to control a plurality of components at the premises including at least one of a thermostat and a lock; configuring a sensor user interface (SUI) to include a plurality of display elements for managing and receiving data of the plurality of components agnostically across a plurality of remote client devices, wherein the SUI is coupled to the gateway and presented to a user via the plurality of remote client devices, wherein the plurality of display elements includes an interactive icon comprising a plurality of control regions, wherein each control region is configured to control a state change of a corresponding component.
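
Read as a configuration sequence, the method might be sketched as follows; the data layout and names are assumptions for illustration only, not the recited method's implementation.

```python
# Minimal, assumption-laden sketch of the configuration sequence: configure
# the gateway with its components, then build a sensor user interface (SUI)
# whose display elements map one interactive icon to each component.

def configure_automation_network():
    # Gateway configured using data of the plurality of components.
    gateway = {
        "remote_network": "wan-uplink",  # assumed uplink identifier
        "components": {
            "thermostat": {"heat": 68.0, "cool": 74.0},
            "front-door-lock": {"state": "locked"},
        },
    }
    # SUI display elements: one interactive icon per component, with the
    # control regions each icon exposes (quadrants for the thermostat,
    # tap/drag regions for the lock).
    sui = {
        "thermostat": {"icon": "thermostat", "regions": [1, 2, 3, 4]},
        "front-door-lock": {"icon": "lock", "regions": [1, 2, 3, 4, 5]},
    }
    return gateway, sui


gateway, sui = configure_automation_network()
print(sorted(sui))  # ['front-door-lock', 'thermostat']
```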


The method includes configuring the plurality of control regions of the interactive icon to control a plurality of states of the corresponding component.


The method includes configuring the interactive icon to include a thermostat icon corresponding to the thermostat, and configuring the plurality of control regions to include a plurality of quadrants.


The method includes configuring each control region to change a setting of the thermostat in response to an action received in the control region.


The method includes configuring the thermostat icon to include a display region configured to display a plurality of items of state information of the corresponding thermostat.


The method includes configuring a first quadrant to control an increase in a heat point setting of the corresponding thermostat.


The method includes configuring a second quadrant to control a decrease in a heat point setting of the corresponding thermostat.


The method includes configuring a third quadrant to control an increase in a cool point setting of the corresponding thermostat.


The method includes configuring a fourth quadrant to control a decrease in a cool point setting of the corresponding thermostat.


The method includes configuring the interactive icon to include a lock icon corresponding to the lock.


The method includes configuring each control region to change a setting of the lock in response to an action received in the control region.


The method includes configuring the lock icon to include a display region configured to display a plurality of items of state information of the corresponding lock.


The method includes configuring the plurality of control regions to include a first control region configured to toggle a state of the corresponding lock in response to a tap received in the first control region.


The method includes configuring the plurality of control regions to include a second control region configured to control locking of the corresponding lock in response to a tap received in the second control region.


The method includes configuring the plurality of control regions to include a third control region configured to control unlocking of the corresponding lock in response to a tap received in the third control region.


The method includes configuring the plurality of control regions to include a handle icon.


The method includes configuring the plurality of control regions to include a fourth control region configured to control unlocking of the corresponding lock in response to dragging and releasing the handle icon in the fourth control region.


The method includes configuring the plurality of control regions to include a fifth control region configured to control locking of the corresponding lock in response to dragging and releasing the handle icon in the fifth control region.


The plurality of components includes a door, and the interactive icon includes a door icon corresponding to the door.


The method includes configuring each control region to change a state of the door in response to an action received in the control region.


The method includes configuring the door icon to include a display region configured to display a plurality of items of state information of the corresponding door.


The method includes configuring the plurality of control regions to include a first control region configured to toggle a state of the corresponding door in response to a tap received in the first control region.


The method includes configuring the plurality of control regions to include a second control region configured to control closing of the corresponding door in response to a tap received in the second control region.


The method includes configuring the plurality of control regions to include a third control region configured to control opening of the corresponding door in response to a tap received in the third control region.


The method includes configuring the plurality of control regions to include a handle icon.


The method includes configuring the plurality of control regions to include a fourth control region configured to control opening of the corresponding door in response to dragging and releasing the handle icon in the fourth control region.


The method includes configuring the plurality of control regions to include a fifth control region configured to control closing of the corresponding door in response to dragging and releasing the handle icon in the fifth control region.


The method includes configuring at least one of the gateway and the plurality of remote devices to perform a synchronization to associate the plurality of remote devices with the plurality of components.


The plurality of remote devices includes applications that receive the data from and transmit control instructions to the plurality of components via the gateway.


The method includes configuring the plurality of display elements to include a representation of a floor plan layout of the premises, wherein the floor plan layout includes representations of the plurality of components.


As described above, computer networks suitable for use with the embodiments described herein include local area networks (LAN), wide area networks (WAN), Internet, or other connection services and network variations such as the world wide web, the public internet, a private internet, a private computer network, a public network, a mobile network, a cellular network, a value-added network, and the like. Computing devices coupled or connected to the network may be any microprocessor-controlled device that permits access to the network, including terminal devices, such as personal computers, workstations, servers, minicomputers, mainframe computers, laptop computers, mobile computers, palmtop computers, handheld computers, mobile phones, TV set-top boxes, or combinations thereof. The computer network may include one or more LANs, WANs, Internets, and computers. The computers may serve as servers, clients, or a combination thereof.


The integrated security system can be a component of a single system, multiple systems, and/or geographically separate systems. The integrated security system can also be a subcomponent or subsystem of a single system, multiple systems, and/or geographically separate systems. The integrated security system can be coupled to one or more other components (not shown) of a host system or a system coupled to the host system.


One or more components of the integrated security system and/or a corresponding system or application to which the integrated security system is coupled or connected includes and/or runs under and/or in association with a processing system. The processing system includes any collection of processor-based devices or computing devices operating together, or components of processing systems or devices, as is known in the art. For example, the processing system can include one or more of a portable computer, portable communication device operating in a communication network, and/or a network server. The portable computer can be any of a number and/or combination of devices selected from among personal computers, personal digital assistants, portable computing devices, and portable communication devices, but is not so limited. The processing system can include components within a larger computer system.


The processing system of an embodiment includes at least one processor and at least one memory device or subsystem. The processing system can also include or be coupled to at least one database. The term "processor" as generally used herein refers to any logic processing unit, such as one or more central processing units (CPUs), digital signal processors (DSPs), application-specific integrated circuits (ASICs), etc. The processor and memory can be monolithically integrated onto a single chip, distributed among a number of chips or components, and/or provided by some combination of algorithms. The methods described herein can be implemented in one or more of software algorithms, programs, firmware, hardware, components, and circuitry, in any combination.


The components of any system that includes the integrated security system can be located together or in separate locations. Communication paths couple the components and include any medium for communicating or transferring files among the components. The communication paths include wireless connections, wired connections, and hybrid wireless/wired connections. The communication paths also include couplings or connections to networks including local area networks (LANs), metropolitan area networks (MANs), wide area networks (WANs), proprietary networks, interoffice or backend networks, and the Internet. Furthermore, the communication paths include removable and fixed media such as floppy disks, hard disk drives, and CD-ROM disks, as well as flash RAM, Universal Serial Bus (USB) connections, RS-232 connections, telephone lines, buses, and electronic mail messages.


Aspects of the integrated security system and corresponding systems and methods described herein may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices (PLDs), such as field programmable gate arrays (FPGAs), programmable array logic (PAL) devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits (ASICs). Some other possibilities for implementing aspects of the integrated security system and corresponding systems and methods include: microcontrollers with memory (such as electronically erasable programmable read only memory (EEPROM)), embedded microprocessors, firmware, software, etc. Furthermore, aspects of the integrated security system and corresponding systems and methods may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types. Of course, the underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (MOSFET) technologies like complementary metal-oxide semiconductor (CMOS), bipolar technologies like emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, etc.


It should be noted that any system, method, and/or other components disclosed herein may be described using computer aided design tools and expressed (or represented), as data and/or instructions embodied in various computer-readable media, in terms of their behavioral, register transfer, logic component, transistor, layout geometries, and/or other characteristics. Computer-readable media in which such formatted data and/or instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media) and carrier waves that may be used to transfer such formatted data and/or instructions through wireless, optical, or wired signaling media or any combination thereof. Examples of transfers of such formatted data and/or instructions by carrier waves include, but are not limited to, transfers (uploads, downloads, e-mail, etc.) over the Internet and/or other computer networks via one or more data transfer protocols (e.g., HTTP, FTP, SMTP, etc.). When received within a computer system via one or more computer-readable media, such data and/or instruction-based expressions of the above described components may be processed by a processing entity (e.g., one or more processors) within the computer system in conjunction with execution of one or more other computer programs.


Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in a sense of “including, but not limited to.” Words using the singular or plural number also include the plural or singular number respectively. Additionally, the words “herein,” “hereunder,” “above,” “below,” and words of similar import, when used in this application, refer to this application as a whole and not to any particular portions of this application. When the word “or” is used in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list and any combination of the items in the list.


The above description of embodiments of the integrated security system and corresponding systems and methods is not intended to be exhaustive or to limit the systems and methods to the precise forms disclosed. While specific embodiments of, and examples for, the integrated security system and corresponding systems and methods are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the systems and methods, as those skilled in the relevant art will recognize. The teachings of the integrated security system and corresponding systems and methods provided herein can be applied to other systems and methods, not only for the systems and methods described above.


The elements and acts of the various embodiments described above can be combined to provide further embodiments. These and other changes can be made to the integrated security system and corresponding systems and methods in light of the above detailed description.

Claims
  • 1. A system comprising: a plurality of premises devices located at a premises, wherein at least one of the premises devices is associated with a door; and a computing device configured to: cause output of a user interface, wherein the user interface is associated with the plurality of premises devices, wherein the user interface comprises a handle icon associated with the door, and wherein the handle icon comprises a plurality of regions; cause first modification, based on a first input comprising a rotational dragging and releasing action in a first region of the plurality of regions of the handle icon, of a setting associated with the at least one of the premises devices associated with the door; and cause second modification, based on a second input comprising a rotational dragging and releasing action in a second region of the plurality of regions of the handle icon, of the setting associated with the at least one of the premises devices associated with the door.
  • 2. The system of claim 1, wherein the plurality of regions comprise quadrants.
  • 3. The system of claim 1, wherein the setting comprises a locking setting.
  • 4. The system of claim 3, wherein the plurality of regions comprises five regions.
  • 5. The system of claim 3, wherein the plurality of regions comprises four regions.
  • 6. The system of claim 1, wherein the first input causes locking of the door and the second input causes unlocking of the door.
  • 7. The system of claim 1, wherein the setting of at least one of the premises devices associated with the door comprises an opening setting of the at least one of the premises devices associated with the door.
  • 8. The system of claim 1, wherein the first input causes closing of the door and the second input causes opening of the door.
  • 9. The system of claim 1, wherein the computing device is configured to cause output of the user interface via one or more of a smart phone, a mobile phone, a cellular phone, a tablet computer, a personal computer, or a touchscreen device.
  • 10. The system of claim 1, wherein the user interface comprises a representation of a floor plan layout of a premises, wherein the floor plan layout comprises representations of a plurality of premises devices.
  • 11. The system of claim 10, wherein the plurality of regions comprise an overlay on the floor plan layout.
  • 12. The system of claim 10, wherein the floor plan layout indicates a location and a state of the plurality of premises devices, wherein the state comprises current state and historical state.
  • 13. The system of claim 10, wherein the floor plan layout comprises a three-dimensional representation of a floor plan.
  • 14. The system of claim 10, wherein the floor plan layout comprises configuration data for each of the plurality of premises devices.
  • 15. The system of claim 1, wherein the computing device is further configured to cause modification, based on a third input comprising a dragging and releasing action in a third region, of the setting of the door.
  • 16. A method comprising: causing output of a user interface, wherein the user interface is associated with a plurality of premises devices, wherein at least one of the plurality of premises devices is associated with a door, and wherein the user interface comprises a handle icon associated with the door, wherein the handle icon comprises a plurality of regions; causing first modification, based on a first input comprising a rotational dragging and releasing action in a first region of the plurality of regions of the handle icon, of a setting associated with the at least one premises device associated with the door; and causing second modification, based on a second input comprising a rotational dragging and releasing action in a second region of the plurality of regions of the handle icon, of the setting associated with the at least one premises device associated with the door.
  • 17. The method of claim 16, wherein the plurality of regions comprise quadrants.
  • 18. The method of claim 16, wherein the setting comprises a locking setting.
  • 19. The method of claim 16, wherein the first input causes locking of the door and the second input causes unlocking of the door.
  • 20. The method of claim 16, wherein the setting associated with the at least one of the premises devices associated with the door comprises an opening setting of the door.
  • 21. The method of claim 16, wherein the first input causes closing of the door and the second input causes opening of the door.
  • 22. The method of claim 16, wherein the plurality of regions comprises at least five regions.
  • 23. The method of claim 16, wherein the plurality of regions comprises at least four regions.
  • 24. The method of claim 16, wherein the user interface comprises a representation of a floor plan layout of a premises, wherein the floor plan layout comprises representations of a plurality of premises devices.
  • 25. A device comprising: one or more processors; and memory storing instructions that, when executed by the one or more processors, cause the device to: cause output of a user interface, wherein the user interface is associated with a plurality of premises devices, wherein at least one of the plurality of premises devices is associated with a door, and wherein the user interface comprises a handle icon associated with the door, wherein the handle icon comprises a plurality of regions; cause first modification, based on a first input comprising a rotational dragging and releasing action in a first region of the plurality of regions of the handle icon, of a setting associated with the at least one premises device associated with the door; and cause second modification, based on a second input comprising a rotational dragging and releasing action in a second region of the plurality of regions of the handle icon, of the setting associated with the at least one premises device associated with the door.
  • 26. The device of claim 25, wherein the setting comprises a locking setting.
  • 27. The device of claim 25, wherein the first input causes locking of the door and the second input causes unlocking of the door.
  • 28. The device of claim 25, wherein the setting associated with the at least one premises device associated with the door comprises an opening setting of the door.
  • 29. The device of claim 25, wherein the first input causes closing of the door and the second input causes opening of the door.
  • 30. The device of claim 25, wherein the plurality of regions comprise at least four regions.
  • 31. The device of claim 25, wherein the plurality of regions comprise at least five regions.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a divisional application of U.S. patent application Ser. No. 15/238,864, filed Aug. 17, 2016, now abandoned, which is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 15/238,864 claims the benefit of U.S. Patent Application No. 62/205,922, filed Aug. 17, 2015; and claims the benefit of U.S. Patent Application No. 62/205,872, filed Aug. 17, 2015; and is a continuation in part application of U.S. patent application Ser. No. 12/189,780, filed Aug. 11, 2008, now abandoned; and is a continuation in part application of U.S. patent application Ser. No. 13/531,757, filed Jun. 25, 2012, now abandoned; and is a continuation in part application of U.S. patent application Ser. No. 12/197,958, filed Aug. 25, 2008, now U.S. Pat. No. 10,721,087; and is a continuation in part application of U.S. patent application Ser. No. 13/334,998, filed Dec. 22, 2011, now U.S. Pat. No. 9,531,593; and is a continuation in part application of U.S. patent application Ser. No. 12/539,537, filed Aug. 11, 2009, now U.S. Pat. No. 10,156,959; and is a continuation in part application of U.S. patent application Ser. No. 14/645,808, filed Mar. 12, 2015, now U.S. Pat. No. 10,127,801; and is a continuation in part application of U.S. patent application Ser. No. 13/104,932, filed May 10, 2011, now abandoned; and is a continuation in part application of U.S. patent application Ser. No. 13/104,936, filed May 10, 2011, now U.S. Pat. No. 10,380,871; and is a continuation in part application of U.S. patent application Ser. No. 13/929,568, filed Jun. 27, 2013, now U.S. Pat. No. 10,444,964; and is a continuation in part application of U.S. patent application Ser. No. 14/704,045, filed May 5, 2015, now U.S. Pat. No. 10,365,810; and is a continuation in part application of U.S. patent application Ser. No. 14/704,098, filed May 5, 2015, now U.S. Pat. No. 10,348,575; and is a continuation in part application of U.S. patent application Ser. No. 14/704,127, filed May 5, 2015, now abandoned; and is a continuation in part application of U.S. patent application Ser. No. 14/628,651, filed Feb. 23, 2015, now U.S. Pat. No. 10,091,014; and is a continuation in part application of U.S. patent application Ser. No. 13/718,851, filed Dec. 18, 2012, now U.S. Pat. No. 10,156,831; and is a continuation in part application of U.S. patent application Ser. No. 12/972,740, filed Dec. 20, 2010, now U.S. Pat. No. 9,729,342; and is a continuation in part application of U.S. patent application Ser. No. 13/954,553, filed Jul. 30, 2013, now U.S. Pat. No. 11,582,065; and is a continuation in part application of U.S. patent application Ser. No. 14/943,162, filed Nov. 17, 2015, now U.S. Pat. No. 10,062,245; and is a continuation in part application of U.S. patent application Ser. No. 15/177,915, filed Jun. 9, 2016, now U.S. Pat. No. 11,316,958; and is a continuation in part application of U.S. patent application Ser. No. 15/177,448, filed Jun. 9, 2016; and is a continuation in part application of U.S. patent application Ser. No. 15/196,281, filed Jun. 29, 2016, now U.S. Pat. No. 11,368,327; and is a continuation in part application of U.S. patent application Ser. No. 15/198,531, filed Jun. 30, 2016, now abandoned; and is a continuation in part application of U.S. patent application Ser. No. 15/204,662, filed Jul. 7, 2016, now U.S. Pat. No. 10,522,026; and is a continuation in part application of U.S. patent application Ser. No. 15/237,873, filed Aug. 16, 2016.

20030009553 Benfield et al. Jan 2003 A1
20030010243 Roller Jan 2003 A1
20030023839 Burkhardt et al. Jan 2003 A1
20030025599 Monroe Feb 2003 A1
20030028294 Yanagi Feb 2003 A1
20030028398 Yamashita et al. Feb 2003 A1
20030030548 Kovacs et al. Feb 2003 A1
20030031165 O'Brien Feb 2003 A1
20030038730 Imafuku et al. Feb 2003 A1
20030038849 Craven et al. Feb 2003 A1
20030039242 Moore Feb 2003 A1
20030040813 Gonzales et al. Feb 2003 A1
20030041137 Horie et al. Feb 2003 A1
20030041167 French et al. Feb 2003 A1
20030046557 Miller et al. Mar 2003 A1
20030050731 Rosenblum Mar 2003 A1
20030050737 Osann Mar 2003 A1
20030051009 Shah et al. Mar 2003 A1
20030051026 Carter et al. Mar 2003 A1
20030052905 Gordon et al. Mar 2003 A1
20030052923 Porter Mar 2003 A1
20030056012 Modeste et al. Mar 2003 A1
20030056014 Verberkt et al. Mar 2003 A1
20030059005 Meyerson et al. Mar 2003 A1
20030060900 Lo et al. Mar 2003 A1
20030061344 Monroe Mar 2003 A1
20030061615 Van Der Meulen Mar 2003 A1
20030061621 Petty et al. Mar 2003 A1
20030062997 Naidoo et al. Apr 2003 A1
20030065757 Mentze et al. Apr 2003 A1
20030065784 Herrod Apr 2003 A1
20030065791 Garg et al. Apr 2003 A1
20030067923 Ju et al. Apr 2003 A1
20030069854 Hsu et al. Apr 2003 A1
20030069948 Ma et al. Apr 2003 A1
20030071724 D'Amico Apr 2003 A1
20030071840 Huang et al. Apr 2003 A1
20030073406 Benjamin et al. Apr 2003 A1
20030074090 Becka et al. Apr 2003 A1
20030081768 Caminschi May 2003 A1
20030084165 Kjellberg et al. May 2003 A1
20030090473 Joshi May 2003 A1
20030096590 Satoh May 2003 A1
20030101243 Donahue et al. May 2003 A1
20030101459 Edson May 2003 A1
20030103088 Dresti et al. Jun 2003 A1
20030105850 Lean et al. Jun 2003 A1
20030110262 Hasan et al. Jun 2003 A1
20030110302 Hodges et al. Jun 2003 A1
20030112866 Yu et al. Jun 2003 A1
20030113100 Hecht et al. Jun 2003 A1
20030115345 Chien et al. Jun 2003 A1
20030120593 Bansal et al. Jun 2003 A1
20030123419 Rangnekar et al. Jul 2003 A1
20030123634 Chee Jul 2003 A1
20030126236 Marl et al. Jul 2003 A1
20030128114 Quigley Jul 2003 A1
20030128115 Giacopelli et al. Jul 2003 A1
20030132018 Okita et al. Jul 2003 A1
20030134590 Suda et al. Jul 2003 A1
20030137426 Anthony et al. Jul 2003 A1
20030137991 Doshi et al. Jul 2003 A1
20030147534 Ablay et al. Aug 2003 A1
20030149671 Yamamoto et al. Aug 2003 A1
20030153325 Veerepalli et al. Aug 2003 A1
20030155757 Larsen et al. Aug 2003 A1
20030158609 Chiu Aug 2003 A1
20030158635 Pillar et al. Aug 2003 A1
20030159135 Hiller et al. Aug 2003 A1
20030163514 Waldschmidt Aug 2003 A1
20030169728 Choi Sep 2003 A1
20030172145 Nguyen Sep 2003 A1
20030174154 Yukie et al. Sep 2003 A1
20030174648 Wang et al. Sep 2003 A1
20030174717 Zabarski et al. Sep 2003 A1
20030177236 Goto et al. Sep 2003 A1
20030182396 Reich et al. Sep 2003 A1
20030182640 Alani et al. Sep 2003 A1
20030184436 Seales et al. Oct 2003 A1
20030187920 Redkar Oct 2003 A1
20030187938 Mousseau et al. Oct 2003 A1
20030189509 Hayes et al. Oct 2003 A1
20030193991 Lansford Oct 2003 A1
20030196115 Karp Oct 2003 A1
20030197847 Shinoda Oct 2003 A1
20030198938 Murray et al. Oct 2003 A1
20030200325 Krishnaswamy et al. Oct 2003 A1
20030201889 Zulkowski Oct 2003 A1
20030208610 Rochetti et al. Nov 2003 A1
20030210126 Kanazawa Nov 2003 A1
20030214775 Fukuta et al. Nov 2003 A1
20030216143 Roese et al. Nov 2003 A1
20030217110 Weiss Nov 2003 A1
20030217136 Cho et al. Nov 2003 A1
20030225883 Greaves et al. Dec 2003 A1
20030227382 Breed Dec 2003 A1
20030227439 Lee et al. Dec 2003 A1
20030229779 Morais et al. Dec 2003 A1
20030230934 Cordelli et al. Dec 2003 A1
20030233155 Slemmer et al. Dec 2003 A1
20030233332 Keeler et al. Dec 2003 A1
20030233429 Matte et al. Dec 2003 A1
20030233549 Hatakeyama et al. Dec 2003 A1
20030233583 Carley Dec 2003 A1
20030233594 Earl Dec 2003 A1
20030236841 Epshteyn Dec 2003 A1
20040003051 Krzyzanowski et al. Jan 2004 A1
20040003241 Sengodan et al. Jan 2004 A1
20040005039 White et al. Jan 2004 A1
20040008724 Devine et al. Jan 2004 A1
20040015572 Kang Jan 2004 A1
20040034697 Fairhurst et al. Feb 2004 A1
20040034798 Yamada et al. Feb 2004 A1
20040036615 Candela Feb 2004 A1
20040037295 Tanaka et al. Feb 2004 A1
20040039459 Daugherty et al. Feb 2004 A1
20040049321 Lehr et al. Mar 2004 A1
20040054789 Breh et al. Mar 2004 A1
20040056665 Iwanaga et al. Mar 2004 A1
20040064351 Mikurak Apr 2004 A1
20040068583 Monroe et al. Apr 2004 A1
20040068657 Alexander et al. Apr 2004 A1
20040068668 Lor et al. Apr 2004 A1
20040075738 Burke et al. Apr 2004 A1
20040078825 Murphy Apr 2004 A1
20040083015 Patwari Apr 2004 A1
20040086093 Schranz May 2004 A1
20040093492 Daude et al. May 2004 A1
20040095943 Korotin May 2004 A1
20040102859 Bennett May 2004 A1
20040103308 Paller May 2004 A1
20040107027 Boudrieau Jun 2004 A1
20040107299 Lee et al. Jun 2004 A1
20040111294 McNally et al. Jun 2004 A1
20040113770 Falk et al. Jun 2004 A1
20040113778 Script et al. Jun 2004 A1
20040113937 Sawdey et al. Jun 2004 A1
20040117068 Lee Jun 2004 A1
20040117330 Ehlers et al. Jun 2004 A1
20040117462 Bodin et al. Jun 2004 A1
20040117465 Bodin et al. Jun 2004 A1
20040125146 Gerlach et al. Jul 2004 A1
20040125782 Chang Jul 2004 A1
20040125931 Archer Jul 2004 A1
20040133689 Vasisht Jul 2004 A1
20040136386 Miller et al. Jul 2004 A1
20040137915 Diener et al. Jul 2004 A1
20040139227 Takeda Jul 2004 A1
20040143428 Rappaport et al. Jul 2004 A1
20040143602 Ruiz et al. Jul 2004 A1
20040143749 Tajalli et al. Jul 2004 A1
20040153171 Brandt et al. Aug 2004 A1
20040155757 Litwin et al. Aug 2004 A1
20040160309 Stilp Aug 2004 A1
20040163073 Krzyzanowski et al. Aug 2004 A1
20040163118 Mottur Aug 2004 A1
20040163705 Uhler Aug 2004 A1
20040169288 Hsieh et al. Sep 2004 A1
20040170120 Reunamaki et al. Sep 2004 A1
20040170155 Omar et al. Sep 2004 A1
20040172396 Vanska et al. Sep 2004 A1
20040172657 Phillips et al. Sep 2004 A1
20040177163 Casey et al. Sep 2004 A1
20040181693 Milliot et al. Sep 2004 A1
20040183756 Freitas et al. Sep 2004 A1
20040189471 Ciarcia et al. Sep 2004 A1
20040189871 Kurosawa et al. Sep 2004 A1
20040196844 Hagino Oct 2004 A1
20040198386 Dupray Oct 2004 A1
20040199645 Rouhi Oct 2004 A1
20040201472 McGunn et al. Oct 2004 A1
20040202351 Park et al. Oct 2004 A1
20040212494 Stilp Oct 2004 A1
20040212503 Stilp Oct 2004 A1
20040212687 Patwari Oct 2004 A1
20040213150 Krause et al. Oct 2004 A1
20040215694 Podolsky Oct 2004 A1
20040215700 Shenfield et al. Oct 2004 A1
20040215750 Stilp Oct 2004 A1
20040215955 Tamai et al. Oct 2004 A1
20040218591 Ogawa et al. Nov 2004 A1
20040220830 Moreton et al. Nov 2004 A1
20040223605 Donnelly Nov 2004 A1
20040225516 Bruskotter et al. Nov 2004 A1
20040225719 Kisley et al. Nov 2004 A1
20040225878 Costa-Requena et al. Nov 2004 A1
20040229569 Franz Nov 2004 A1
20040243714 Wynn et al. Dec 2004 A1
20040243835 Terzis et al. Dec 2004 A1
20040243996 Sheehy et al. Dec 2004 A1
20040246339 Ooshima et al. Dec 2004 A1
20040249613 Sprogis et al. Dec 2004 A1
20040249922 Hackman et al. Dec 2004 A1
20040253926 Gross Dec 2004 A1
20040257433 Lia et al. Dec 2004 A1
20040258032 Kawamura Dec 2004 A1
20040260407 Wimsatt Dec 2004 A1
20040260527 Stanculescu Dec 2004 A1
20040263314 Dorai et al. Dec 2004 A1
20040263625 Ishigami et al. Dec 2004 A1
20040263626 Piccionelli Dec 2004 A1
20040266493 Bahl et al. Dec 2004 A1
20040267385 Lingemann Dec 2004 A1
20040267937 Klemets Dec 2004 A1
20040268298 Miller et al. Dec 2004 A1
20050002335 Adamczyk et al. Jan 2005 A1
20050002408 Lee Jan 2005 A1
20050002417 Kelly et al. Jan 2005 A1
20050007967 Keskar et al. Jan 2005 A1
20050010866 Humpleman et al. Jan 2005 A1
20050015458 La Jan 2005 A1
20050015805 Iwamura Jan 2005 A1
20050021309 Alexander et al. Jan 2005 A1
20050021626 Prajapat et al. Jan 2005 A1
20050021826 Kumar Jan 2005 A1
20050022210 Zintel et al. Jan 2005 A1
20050023858 Bingle et al. Feb 2005 A1
20050024203 Wolfe Feb 2005 A1
20050030928 Virtanen et al. Feb 2005 A1
20050031108 Eshun et al. Feb 2005 A1
20050033513 Gasbarro Feb 2005 A1
20050038325 Moll Feb 2005 A1
20050038326 Mathur Feb 2005 A1
20050044061 Klemow Feb 2005 A1
20050048957 Casey et al. Mar 2005 A1
20050049746 Rosenblum Mar 2005 A1
20050050214 Nishiyama et al. Mar 2005 A1
20050052831 Chen Mar 2005 A1
20050055575 Evans et al. Mar 2005 A1
20050055716 Louie et al. Mar 2005 A1
20050057361 Giraldo et al. Mar 2005 A1
20050060163 Barsness et al. Mar 2005 A1
20050060411 Coulombe et al. Mar 2005 A1
20050066045 Johnson et al. Mar 2005 A1
20050066912 Korbitz et al. Mar 2005 A1
20050069098 Kalervo et al. Mar 2005 A1
20050071483 Motoyama Mar 2005 A1
20050075764 Horst et al. Apr 2005 A1
20050079855 Jethi et al. Apr 2005 A1
20050079863 Macaluso Apr 2005 A1
20050081161 MacInnes et al. Apr 2005 A1
20050086093 Hammad et al. Apr 2005 A1
20050086126 Patterson Apr 2005 A1
20050086211 Mayer Apr 2005 A1
20050086366 Luebke et al. Apr 2005 A1
20050088983 Wesslen et al. Apr 2005 A1
20050089023 Barkley et al. Apr 2005 A1
20050090915 Geiwitz Apr 2005 A1
20050091435 Han et al. Apr 2005 A1
20050091696 Wolfe et al. Apr 2005 A1
20050096753 Arling et al. May 2005 A1
20050101314 Levi May 2005 A1
20050102152 Hodges May 2005 A1
20050102497 Buer May 2005 A1
20050105530 Kono May 2005 A1
20050108091 Sotak et al. May 2005 A1
20050108369 Sather et al. May 2005 A1
20050111660 Hosoda May 2005 A1
20050114432 Hodges et al. May 2005 A1
20050114528 Suito May 2005 A1
20050114900 Ladd et al. May 2005 A1
20050117602 Carrigan et al. Jun 2005 A1
20050117732 Arpin Jun 2005 A1
20050119767 Kiwimagi et al. Jun 2005 A1
20050119913 Hornreich et al. Jun 2005 A1
20050120082 Hesselink et al. Jun 2005 A1
20050125083 Kiko Jun 2005 A1
20050128068 Winick et al. Jun 2005 A1
20050128083 Puzio et al. Jun 2005 A1
20050128093 Genova et al. Jun 2005 A1
20050128314 Ishino Jun 2005 A1
20050144044 Godschall et al. Jun 2005 A1
20050144312 Kadyk et al. Jun 2005 A1
20050144645 Casey et al. Jun 2005 A1
20050148356 Ferguson et al. Jul 2005 A1
20050149639 Vrielink et al. Jul 2005 A1
20050149746 Lu et al. Jul 2005 A1
20050154494 Ahmed Jul 2005 A1
20050154774 Giaffreda et al. Jul 2005 A1
20050155757 Paton Jul 2005 A1
20050156568 Yueh Jul 2005 A1
20050156737 Al-Khateeb Jul 2005 A1
20050159823 Hayes et al. Jul 2005 A1
20050159911 Funk et al. Jul 2005 A1
20050169288 Kamiwada et al. Aug 2005 A1
20050174229 Feldkamp et al. Aug 2005 A1
20050177515 Kalavade et al. Aug 2005 A1
20050179531 Tabe Aug 2005 A1
20050181196 Aylward et al. Aug 2005 A1
20050182681 Bruskotter et al. Aug 2005 A1
20050184865 Han Aug 2005 A1
20050185618 Friday et al. Aug 2005 A1
20050187677 Walker Aug 2005 A1
20050188315 Campbell et al. Aug 2005 A1
20050197847 Smith Sep 2005 A1
20050198216 Behera et al. Sep 2005 A1
20050200474 Behnke Sep 2005 A1
20050204076 Cumpson et al. Sep 2005 A1
20050207429 Akita et al. Sep 2005 A1
20050216302 Raji et al. Sep 2005 A1
20050216580 Raji et al. Sep 2005 A1
20050220123 Wybenga et al. Oct 2005 A1
20050222820 Chung Oct 2005 A1
20050222933 Wesby Oct 2005 A1
20050229016 Addy Oct 2005 A1
20050232242 Karaoguz et al. Oct 2005 A1
20050232284 Karaoguz et al. Oct 2005 A1
20050234568 Chung et al. Oct 2005 A1
20050237182 Wang Oct 2005 A1
20050246119 Koodali Nov 2005 A1
20050246408 Chung Nov 2005 A1
20050249199 Albert et al. Nov 2005 A1
20050253709 Baker Nov 2005 A1
20050256608 King et al. Nov 2005 A1
20050257013 Ma Nov 2005 A1
20050257260 Lenoir et al. Nov 2005 A1
20050259673 Lu et al. Nov 2005 A1
20050262241 Gubbi et al. Nov 2005 A1
20050266826 Vlad Dec 2005 A1
20050267605 Lee et al. Dec 2005 A1
20050270151 Winick Dec 2005 A1
20050273831 Slomovich et al. Dec 2005 A1
20050276389 Hinkson et al. Dec 2005 A1
20050277434 Tuomi et al. Dec 2005 A1
20050280964 Richmond et al. Dec 2005 A1
20050281196 Tornetta et al. Dec 2005 A1
20050282557 Mikko et al. Dec 2005 A1
20050285934 Carter Dec 2005 A1
20050285941 Haigh et al. Dec 2005 A1
20050286518 Park et al. Dec 2005 A1
20060007005 Yui et al. Jan 2006 A1
20060009863 Lingemann Jan 2006 A1
20060015943 Mahieu Jan 2006 A1
20060018328 Mody et al. Jan 2006 A1
20060018479 Chen Jan 2006 A1
20060022816 Yukawa Feb 2006 A1
20060023847 Tyroler et al. Feb 2006 A1
20060026017 Walker Feb 2006 A1
20060026301 Maeda et al. Feb 2006 A1
20060028997 McFarland Feb 2006 A1
20060031426 Mesarina et al. Feb 2006 A1
20060031436 Sakata et al. Feb 2006 A1
20060031852 Chu et al. Feb 2006 A1
20060036750 Ladd et al. Feb 2006 A1
20060041655 Holloway et al. Feb 2006 A1
20060045074 Lee Mar 2006 A1
20060050692 Petrescu et al. Mar 2006 A1
20060050862 Shen et al. Mar 2006 A1
20060051122 Kawazu et al. Mar 2006 A1
20060052884 Staples et al. Mar 2006 A1
20060053447 Krzyzanowski et al. Mar 2006 A1
20060053459 Simerly et al. Mar 2006 A1
20060053491 Khuti et al. Mar 2006 A1
20060058923 Kruk et al. Mar 2006 A1
20060063534 Kokkonen et al. Mar 2006 A1
20060064305 Alonso Mar 2006 A1
20060064478 Sirkin Mar 2006 A1
20060067344 Sakurai Mar 2006 A1
20060067356 Kim et al. Mar 2006 A1
20060067484 Elliot et al. Mar 2006 A1
20060071773 Ahmed et al. Apr 2006 A1
20060072470 Moore et al. Apr 2006 A1
20060075235 Renkis Apr 2006 A1
20060077254 Shu et al. Apr 2006 A1
20060078344 Kawazu et al. Apr 2006 A1
20060080380 Aizu et al. Apr 2006 A1
20060080465 Conzola et al. Apr 2006 A1
20060088092 Chen et al. Apr 2006 A1
20060093365 Dybsetter et al. May 2006 A1
20060094400 Beachem et al. May 2006 A1
20060101062 Godman et al. May 2006 A1
20060103510 Chen et al. May 2006 A1
20060104312 Friar May 2006 A1
20060105713 Zheng et al. May 2006 A1
20060106933 Huang et al. May 2006 A1
20060109113 Reyes et al. May 2006 A1
20060109860 Matsunaga et al. May 2006 A1
20060109966 Sasakura et al. May 2006 A1
20060111095 Weigand May 2006 A1
20060114842 Miyamoto et al. Jun 2006 A1
20060121924 Rengaraj et al. Jun 2006 A1
20060123212 Yagawa Jun 2006 A1
20060129837 Im et al. Jun 2006 A1
20060130004 Hughes et al. Jun 2006 A1
20060132302 Stilp Jun 2006 A1
20060133412 Callaghan Jun 2006 A1
20060136558 Sheehan et al. Jun 2006 A1
20060142968 Han et al. Jun 2006 A1
20060142978 Suenbuel et al. Jun 2006 A1
20060143268 Chatani Jun 2006 A1
20060145842 Stilp Jul 2006 A1
20060153122 Hinman et al. Jul 2006 A1
20060154642 Scannell, Jr. Jul 2006 A1
20060155851 Ma et al. Jul 2006 A1
20060159032 Ukrainetz et al. Jul 2006 A1
20060161270 Luskin et al. Jul 2006 A1
20060161662 Ng et al. Jul 2006 A1
20060161960 Benoit Jul 2006 A1
20060167784 Hoffberg Jul 2006 A1
20060167919 Hsieh Jul 2006 A1
20060168013 Wilson et al. Jul 2006 A1
20060168095 Sharma et al. Jul 2006 A1
20060168178 Hwang et al. Jul 2006 A1
20060168190 Johan et al. Jul 2006 A1
20060171307 Gopalakrishnan et al. Aug 2006 A1
20060176146 Krishan et al. Aug 2006 A1
20060176167 Dohrmann Aug 2006 A1
20060181406 Petite et al. Aug 2006 A1
20060182100 Li et al. Aug 2006 A1
20060187900 Akbar Aug 2006 A1
20060189311 Cromer et al. Aug 2006 A1
20060190458 Mishina et al. Aug 2006 A1
20060190529 Morozumi et al. Aug 2006 A1
20060197660 Luebke et al. Sep 2006 A1
20060200845 Foster et al. Sep 2006 A1
20060206220 Amundson Sep 2006 A1
20060206246 Walker Sep 2006 A1
20060208872 Yu et al. Sep 2006 A1
20060208880 Funk et al. Sep 2006 A1
20060209857 Hicks, III Sep 2006 A1
20060215650 Wollmershauser et al. Sep 2006 A1
20060217115 Cassett et al. Sep 2006 A1
20060218244 Rasmussen et al. Sep 2006 A1
20060218593 Afshary et al. Sep 2006 A1
20060221184 Vallone et al. Oct 2006 A1
20060222153 Tarkoff et al. Oct 2006 A1
20060226972 Smith Oct 2006 A1
20060229746 Ollis et al. Oct 2006 A1
20060230270 Goffin Oct 2006 A1
20060233372 Shaheen et al. Oct 2006 A1
20060235963 Wetherly et al. Oct 2006 A1
20060236050 Sugimoto et al. Oct 2006 A1
20060238372 Jung et al. Oct 2006 A1
20060238617 Tamir Oct 2006 A1
20060242395 Fausak Oct 2006 A1
20060244589 Schranz Nov 2006 A1
20060245369 Schimmelpfeng et al. Nov 2006 A1
20060246886 Benco et al. Nov 2006 A1
20060246919 Park et al. Nov 2006 A1
20060250235 Astrin Nov 2006 A1
20060250578 Pohl Nov 2006 A1
20060251255 Batta Nov 2006 A1
20060258342 Fok et al. Nov 2006 A1
20060259951 Forssell et al. Nov 2006 A1
20060265489 Moore Nov 2006 A1
20060271695 Lavian Nov 2006 A1
20060274764 Mah et al. Dec 2006 A1
20060281435 Shearer et al. Dec 2006 A1
20060282886 Gaug Dec 2006 A1
20060288288 Girgensohn et al. Dec 2006 A1
20060291507 Sarosi et al. Dec 2006 A1
20060293100 Walter Dec 2006 A1
20060294565 Walter Dec 2006 A1
20070001818 Small et al. Jan 2007 A1
20070002833 Bajic Jan 2007 A1
20070005736 Hansen et al. Jan 2007 A1
20070005957 Sahita et al. Jan 2007 A1
20070006177 Aiber et al. Jan 2007 A1
20070008099 Kimmel et al. Jan 2007 A1
20070014248 Fowlow Jan 2007 A1
20070027987 Tripp et al. Feb 2007 A1
20070043478 Ehlers et al. Feb 2007 A1
20070043954 Fox Feb 2007 A1
20070046462 Fancella Mar 2007 A1
20070047585 Gillespie et al. Mar 2007 A1
20070052675 Chang Mar 2007 A1
20070055770 Karmakar et al. Mar 2007 A1
20070058627 Smith et al. Mar 2007 A1
20070061018 Callaghan et al. Mar 2007 A1
20070061020 Bovee et al. Mar 2007 A1
20070061266 Moore et al. Mar 2007 A1
20070061430 Kim Mar 2007 A1
20070061878 Hagiu et al. Mar 2007 A1
20070063836 Hayden et al. Mar 2007 A1
20070063866 Webb Mar 2007 A1
20070064714 Bi et al. Mar 2007 A1
20070067780 Kumar et al. Mar 2007 A1
20070079012 Walker Apr 2007 A1
20070079151 Connor et al. Apr 2007 A1
20070079385 Williams et al. Apr 2007 A1
20070083668 Kelsey et al. Apr 2007 A1
20070090944 Du Breuil Apr 2007 A1
20070094716 Farino et al. Apr 2007 A1
20070096981 Abraham May 2007 A1
20070101345 Takagi May 2007 A1
20070105072 Koljonen May 2007 A1
20070106124 Kuriyama et al. May 2007 A1
20070106536 Moore May 2007 A1
20070106547 Agrawal May 2007 A1
20070109975 Reckamp et al. May 2007 A1
20070116020 Cheever et al. May 2007 A1
20070117464 Freeman May 2007 A1
20070118609 Mullan et al. May 2007 A1
20070126875 Miyamaki Jun 2007 A1
20070127510 Bossemeyer et al. Jun 2007 A1
20070132576 Kolavennu et al. Jun 2007 A1
20070136759 Zhang et al. Jun 2007 A1
20070140267 Yang Jun 2007 A1
20070142022 Madonna et al. Jun 2007 A1
20070142044 Fitzgerald et al. Jun 2007 A1
20070143400 Kelley et al. Jun 2007 A1
20070143440 Reckamp et al. Jun 2007 A1
20070146127 Stilp et al. Jun 2007 A1
20070146484 Horton et al. Jun 2007 A1
20070147419 Tsujimoto et al. Jun 2007 A1
20070150616 Baek et al. Jun 2007 A1
20070154010 Wong Jul 2007 A1
20070155325 Bambic et al. Jul 2007 A1
20070155423 Carmody et al. Jul 2007 A1
20070156689 Meek et al. Jul 2007 A1
20070160017 Meier et al. Jul 2007 A1
20070161372 Rogalski et al. Jul 2007 A1
20070162228 Mitchell Jul 2007 A1
20070162680 Mitchell Jul 2007 A1
20070164779 Weston et al. Jul 2007 A1
20070168860 Takayama et al. Jul 2007 A1
20070176766 Cheng Aug 2007 A1
20070182543 Luo Aug 2007 A1
20070182819 Monroe Aug 2007 A1
20070183345 Fahim et al. Aug 2007 A1
20070185989 Corbett et al. Aug 2007 A1
20070192486 Wilson et al. Aug 2007 A1
20070192863 Kapoor et al. Aug 2007 A1
20070197236 Ahn et al. Aug 2007 A1
20070198698 Boyd et al. Aug 2007 A1
20070200658 Yang Aug 2007 A1
20070208521 Petite et al. Sep 2007 A1
20070214262 Buchbinder et al. Sep 2007 A1
20070214264 Koister Sep 2007 A1
20070216764 Kwak Sep 2007 A1
20070216783 Ortiz et al. Sep 2007 A1
20070218895 Saito et al. Sep 2007 A1
20070223465 Wang et al. Sep 2007 A1
20070223500 Lee et al. Sep 2007 A1
20070226182 Sobotka et al. Sep 2007 A1
20070230415 Malik Oct 2007 A1
20070230744 Dronge Oct 2007 A1
20070245223 Siedzik et al. Oct 2007 A1
20070249323 Lee et al. Oct 2007 A1
20070253361 Pristas et al. Nov 2007 A1
20070255856 Reckamp et al. Nov 2007 A1
20070256105 Tabe Nov 2007 A1
20070257986 Ivanov et al. Nov 2007 A1
20070260713 Moorer et al. Nov 2007 A1
20070262857 Jackson Nov 2007 A1
20070263782 Stock et al. Nov 2007 A1
20070265866 Fehling et al. Nov 2007 A1
20070271398 Manchester et al. Nov 2007 A1
20070275703 Lim et al. Nov 2007 A1
20070277111 Bennett et al. Nov 2007 A1
20070282665 Buehler et al. Dec 2007 A1
20070283001 Spiess et al. Dec 2007 A1
20070283004 Buehler Dec 2007 A1
20070286210 Gutt et al. Dec 2007 A1
20070286369 Gutt et al. Dec 2007 A1
20070287405 Radtke Dec 2007 A1
20070288849 Moorer et al. Dec 2007 A1
20070288858 Pereira Dec 2007 A1
20070290830 Gurley Dec 2007 A1
20070291118 Shu et al. Dec 2007 A1
20070296814 Cooper et al. Dec 2007 A1
20070298772 Owens et al. Dec 2007 A1
20080001734 Stilp et al. Jan 2008 A1
20080013531 Elliott et al. Jan 2008 A1
20080013957 Akers et al. Jan 2008 A1
20080025487 Johan et al. Jan 2008 A1
20080027587 Nickerson et al. Jan 2008 A1
20080040272 Eskin Feb 2008 A1
20080042826 Hevia et al. Feb 2008 A1
20080043107 Coogan et al. Feb 2008 A1
20080046593 Ando et al. Feb 2008 A1
20080048975 Leibow Feb 2008 A1
20080052348 Adler et al. Feb 2008 A1
20080056212 Karaoguz et al. Mar 2008 A1
20080056261 Osborn et al. Mar 2008 A1
20080059533 Krikorian Mar 2008 A1
20080059622 Hite et al. Mar 2008 A1
20080065681 Fontijn et al. Mar 2008 A1
20080065685 Frank Mar 2008 A1
20080069121 Adamson et al. Mar 2008 A1
20080072244 Eker et al. Mar 2008 A1
20080074258 Bennett et al. Mar 2008 A1
20080074993 Vainola Mar 2008 A1
20080082186 Hood et al. Apr 2008 A1
20080084294 Zhiying et al. Apr 2008 A1
20080084296 Kutzik et al. Apr 2008 A1
20080086564 Putman et al. Apr 2008 A1
20080091793 Diroo et al. Apr 2008 A1
20080094204 Kogan et al. Apr 2008 A1
20080095339 Elliott et al. Apr 2008 A1
20080100705 Kister et al. May 2008 A1
20080102845 Zhao May 2008 A1
20080103608 Gough et al. May 2008 A1
20080104215 Excoffier et al. May 2008 A1
20080104516 Lee May 2008 A1
20080109302 Salokannel et al. May 2008 A1
20080109650 Shim et al. May 2008 A1
20080112340 Luebke May 2008 A1
20080112405 Cholas et al. May 2008 A1
20080117029 Dohrmann et al. May 2008 A1
20080117201 Martinez et al. May 2008 A1
20080117922 Cockrell et al. May 2008 A1
20080120405 Son et al. May 2008 A1
20080122575 Lavian et al. May 2008 A1
20080126535 Zhu et al. May 2008 A1
20080128444 Schininger et al. Jun 2008 A1
20080129484 Dahl et al. Jun 2008 A1
20080129821 Howarter et al. Jun 2008 A1
20080130949 Ivanov et al. Jun 2008 A1
20080133725 Shaouy Jun 2008 A1
20080134165 Anderson et al. Jun 2008 A1
20080134343 Pennington et al. Jun 2008 A1
20080137572 Park et al. Jun 2008 A1
20080140868 Kalayjian et al. Jun 2008 A1
20080141303 Walker et al. Jun 2008 A1
20080141341 Vinogradov et al. Jun 2008 A1
20080144884 Habibi Jun 2008 A1
20080147834 Quinn et al. Jun 2008 A1
20080151037 Kumarasamy et al. Jun 2008 A1
20080155080 Marlow et al. Jun 2008 A1
20080155470 Khedouri et al. Jun 2008 A1
20080162637 Adamczyk et al. Jul 2008 A1
20080163355 Chu Jul 2008 A1
20080165787 Xu et al. Jul 2008 A1
20080170511 Shorty et al. Jul 2008 A1
20080180240 Raji et al. Jul 2008 A1
20080181239 Wood et al. Jul 2008 A1
20080183483 Hart Jul 2008 A1
20080183842 Raji et al. Jul 2008 A1
20080189609 Larson et al. Aug 2008 A1
20080201468 Titus Aug 2008 A1
20080201723 Bottaro et al. Aug 2008 A1
20080208399 Pham Aug 2008 A1
20080209505 Ghai et al. Aug 2008 A1
20080209506 Ghai et al. Aug 2008 A1
20080215450 Gates et al. Sep 2008 A1
20080215613 Grasso Sep 2008 A1
20080219239 Bell et al. Sep 2008 A1
20080221715 Krzyzanowski et al. Sep 2008 A1
20080227460 David et al. Sep 2008 A1
20080229415 Kapoor et al. Sep 2008 A1
20080235326 Parsi et al. Sep 2008 A1
20080235600 Harper et al. Sep 2008 A1
20080239075 Mehrotra et al. Oct 2008 A1
20080240372 Frenette Oct 2008 A1
20080240696 Kucharyson Oct 2008 A1
20080259818 Balassanian Oct 2008 A1
20080261540 Rohani et al. Oct 2008 A1
20080262990 Kapoor et al. Oct 2008 A1
20080262991 Kapoor et al. Oct 2008 A1
20080263150 Childers et al. Oct 2008 A1
20080266080 Leung et al. Oct 2008 A1
20080266257 Chiang Oct 2008 A1
20080271150 Boerger et al. Oct 2008 A1
20080284580 Babich et al. Nov 2008 A1
20080284587 Saigh et al. Nov 2008 A1
20080284592 Collins et al. Nov 2008 A1
20080288639 Ruppert et al. Nov 2008 A1
20080294588 Morris et al. Nov 2008 A1
20080295172 Bohacek Nov 2008 A1
20080297599 Donovan et al. Dec 2008 A1
20080303903 Bentley et al. Dec 2008 A1
20080313316 Hite et al. Dec 2008 A1
20080316024 Chantelou et al. Dec 2008 A1
20090003172 Yahata et al. Jan 2009 A1
20090003252 Salomone et al. Jan 2009 A1
20090003820 Law et al. Jan 2009 A1
20090007596 Goldstein et al. Jan 2009 A1
20090013210 McIntosh et al. Jan 2009 A1
20090018850 Abhyanker Jan 2009 A1
20090019141 Bush et al. Jan 2009 A1
20090022362 Gagvani et al. Jan 2009 A1
20090036142 Yan Feb 2009 A1
20090036159 Chen Feb 2009 A1
20090041467 Carleton et al. Feb 2009 A1
20090042649 Hsieh et al. Feb 2009 A1
20090046664 Aso Feb 2009 A1
20090049094 Howell et al. Feb 2009 A1
20090049488 Stransky Feb 2009 A1
20090051769 Kuo et al. Feb 2009 A1
20090055760 Whatcott et al. Feb 2009 A1
20090057427 Geadelmann et al. Mar 2009 A1
20090063582 Anna et al. Mar 2009 A1
20090066534 Sivakkolundhu Mar 2009 A1
20090066789 Baum et al. Mar 2009 A1
20090067395 Curtis et al. Mar 2009 A1
20090070436 Dawes et al. Mar 2009 A1
20090070477 Baum et al. Mar 2009 A1
20090070681 Dawes et al. Mar 2009 A1
20090070682 Dawes et al. Mar 2009 A1
20090070692 Dawes et al. Mar 2009 A1
20090072988 Haywood Mar 2009 A1
20090074184 Baum et al. Mar 2009 A1
20090076211 Yang et al. Mar 2009 A1
20090076879 Sparks et al. Mar 2009 A1
20090077167 Baum et al. Mar 2009 A1
20090077622 Baum et al. Mar 2009 A1
20090077623 Baum et al. Mar 2009 A1
20090077624 Baum et al. Mar 2009 A1
20090079547 Oksanen et al. Mar 2009 A1
20090083167 Subbloie Mar 2009 A1
20090086660 Sood et al. Apr 2009 A1
20090086740 Al-Bakri et al. Apr 2009 A1
20090089822 Wada Apr 2009 A1
20090092283 Whillock et al. Apr 2009 A1
20090094671 Kurapati et al. Apr 2009 A1
20090100176 Hicks, III et al. Apr 2009 A1
20090100329 Espinoza Apr 2009 A1
20090100460 Hicks et al. Apr 2009 A1
20090100492 Hicks et al. Apr 2009 A1
20090109959 Elliott et al. Apr 2009 A1
20090113344 Nesse et al. Apr 2009 A1
20090119397 Neerdaels May 2009 A1
20090125708 Woodring et al. May 2009 A1
20090128365 Laskin May 2009 A1
20090134998 Baum et al. May 2009 A1
20090138600 Baum et al. May 2009 A1
20090138958 Baum et al. May 2009 A1
20090144237 Branam et al. Jun 2009 A1
20090146846 Grossman Jun 2009 A1
20090158189 Itani Jun 2009 A1
20090158292 Rattner et al. Jun 2009 A1
20090161609 Bergstrom Jun 2009 A1
20090165114 Baum et al. Jun 2009 A1
20090172443 Rothman et al. Jul 2009 A1
20090177298 McFarland et al. Jul 2009 A1
20090177906 Paniagua et al. Jul 2009 A1
20090180430 Fadell Jul 2009 A1
20090182868 McFate et al. Jul 2009 A1
20090187297 Kish et al. Jul 2009 A1
20090189981 Siann et al. Jul 2009 A1
20090193373 Abbaspour et al. Jul 2009 A1
20090197539 Shiba Aug 2009 A1
20090202250 Dizechi et al. Aug 2009 A1
20090204693 Andreev et al. Aug 2009 A1
20090221368 Yen et al. Sep 2009 A1
20090224875 Rabinowitz et al. Sep 2009 A1
20090228445 Gangal Sep 2009 A1
20090240353 Songkakul et al. Sep 2009 A1
20090240730 Wood Sep 2009 A1
20090240787 Denny Sep 2009 A1
20090240814 Brubacher et al. Sep 2009 A1
20090240946 Yeap et al. Sep 2009 A1
20090254960 Yarom et al. Oct 2009 A1
20090256708 Hsiao et al. Oct 2009 A1
20090259515 Belimpasakis et al. Oct 2009 A1
20090260052 Bathula et al. Oct 2009 A1
20090260083 Szeto et al. Oct 2009 A1
20090260430 Zamfes Oct 2009 A1
20090265042 Mollenkopf et al. Oct 2009 A1
20090265193 Collins et al. Oct 2009 A1
20090270090 Kawamura Oct 2009 A1
20090271042 Voysey Oct 2009 A1
20090289787 Dawson et al. Nov 2009 A1
20090289788 Leblond Nov 2009 A1
20090292909 Feder et al. Nov 2009 A1
20090303100 Zemany Dec 2009 A1
20090307255 Park Dec 2009 A1
20090307307 Igarashi Dec 2009 A1
20090313693 Rogers Dec 2009 A1
20090316671 Rolf et al. Dec 2009 A1
20090319361 Conrady Dec 2009 A1
20090322510 Berger et al. Dec 2009 A1
20090324010 Hou Dec 2009 A1
20090327483 Thompson et al. Dec 2009 A1
20090327510 Edelman et al. Dec 2009 A1
20100000791 Alberty Jan 2010 A1
20100001812 Kausch Jan 2010 A1
20100004949 O'Brien Jan 2010 A1
20100008274 Kneckt et al. Jan 2010 A1
20100009758 Twitchell, Jr. Jan 2010 A1
20100011298 Campbell et al. Jan 2010 A1
20100013917 Hanna et al. Jan 2010 A1
20100023865 Fulker et al. Jan 2010 A1
20100026481 Oh et al. Feb 2010 A1
20100026487 Hershkovitz Feb 2010 A1
20100030578 Siddique et al. Feb 2010 A1
20100030810 Marr Feb 2010 A1
20100039958 Ge et al. Feb 2010 A1
20100041380 Hewes et al. Feb 2010 A1
20100042954 Rosenblatt Feb 2010 A1
20100067371 Gogic et al. Mar 2010 A1
20100070618 Kim et al. Mar 2010 A1
20100071053 Ansari et al. Mar 2010 A1
20100074112 Derr et al. Mar 2010 A1
20100077111 Holmes et al. Mar 2010 A1
20100077347 Kirtane et al. Mar 2010 A1
20100082744 Raji et al. Apr 2010 A1
20100095111 Gutt et al. Apr 2010 A1
20100095369 Gutt et al. Apr 2010 A1
20100100269 Ekhaguere et al. Apr 2010 A1
20100102951 Rutledge Apr 2010 A1
20100121521 Kiribayashi May 2010 A1
20100122091 Huang et al. May 2010 A1
20100138758 Mizumori et al. Jun 2010 A1
20100138764 Hatambeiki et al. Jun 2010 A1
20100141762 Siann et al. Jun 2010 A1
20100145485 Duchene et al. Jun 2010 A1
20100150170 Lee et al. Jun 2010 A1
20100159898 Krzyzanowski et al. Jun 2010 A1
20100159967 Pounds et al. Jun 2010 A1
20100164736 Byers et al. Jul 2010 A1
20100165897 Sood Jul 2010 A1
20100174643 Schaefer et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100177750 Essinger et al. Jul 2010 A1
20100185857 Neitzel et al. Jul 2010 A1
20100191352 Quail Jul 2010 A1
20100197219 Issa et al. Aug 2010 A1
20100204839 Behm et al. Aug 2010 A1
20100210240 Mahaffey et al. Aug 2010 A1
20100212012 Touboul et al. Aug 2010 A1
20100218104 Lewis Aug 2010 A1
20100222069 Abraham et al. Sep 2010 A1
20100238286 Boghossian et al. Sep 2010 A1
20100241711 Ansari et al. Sep 2010 A1
20100241748 Ansari et al. Sep 2010 A1
20100248681 Phills Sep 2010 A1
20100267390 Lin et al. Oct 2010 A1
20100274366 Fata et al. Oct 2010 A1
20100275018 Pedersen Oct 2010 A1
20100277302 Cohn et al. Nov 2010 A1
20100279649 Thomas Nov 2010 A1
20100280635 Cohn et al. Nov 2010 A1
20100280637 Cohn et al. Nov 2010 A1
20100281135 Cohn et al. Nov 2010 A1
20100281161 Cohn et al. Nov 2010 A1
20100298024 Choi Nov 2010 A1
20100299556 Taylor et al. Nov 2010 A1
20100308990 Simon et al. Dec 2010 A1
20100321151 Matsuura et al. Dec 2010 A1
20100325107 Kenton et al. Dec 2010 A1
20100332164 Aisa et al. Dec 2010 A1
20110000521 Tachibana Jan 2011 A1
20110018998 Guzik Jan 2011 A1
20110029875 Milch Feb 2011 A1
20110030056 Tokunaga Feb 2011 A1
20110037593 Foisy et al. Feb 2011 A1
20110040415 Nickerson et al. Feb 2011 A1
20110040877 Foisy Feb 2011 A1
20110046792 Imes et al. Feb 2011 A1
20110051638 Jeon et al. Mar 2011 A1
20110058034 Grass Mar 2011 A1
20110061011 Hoguet Mar 2011 A1
20110068921 Shafer Mar 2011 A1
20110080267 Clare et al. Apr 2011 A1
20110087988 Ray et al. Apr 2011 A1
20110090334 Hicks et al. Apr 2011 A1
20110093799 Hatambeiki et al. Apr 2011 A1
20110096678 Ketonen Apr 2011 A1
20110102588 Trundle et al. May 2011 A1
20110107436 Cholas et al. May 2011 A1
20110125333 Gray May 2011 A1
20110125846 Ham et al. May 2011 A1
20110128378 Raji Jun 2011 A1
20110130112 Saigh et al. Jun 2011 A1
20110131226 Chandra et al. Jun 2011 A1
20110148572 Ku Jun 2011 A1
20110156914 Sheharri et al. Jun 2011 A1
20110169637 Siegler et al. Jul 2011 A1
20110187497 Chin Aug 2011 A1
20110197327 McElroy et al. Aug 2011 A1
20110200052 Mungo et al. Aug 2011 A1
20110208359 Duchene et al. Aug 2011 A1
20110212706 Uusilehto Sep 2011 A1
20110213869 Korsunsky et al. Sep 2011 A1
20110214157 Korsunsky et al. Sep 2011 A1
20110218777 Chen et al. Sep 2011 A1
20110219035 Korsunsky et al. Sep 2011 A1
20110230139 Nakahara Sep 2011 A1
20110230160 Felgate Sep 2011 A1
20110231510 Korsunsky et al. Sep 2011 A1
20110231564 Korsunsky et al. Sep 2011 A1
20110234392 Cohn et al. Sep 2011 A1
20110238660 Riggs Sep 2011 A1
20110238855 Korsunsky et al. Sep 2011 A1
20110246762 Adams et al. Oct 2011 A1
20110257953 Li et al. Oct 2011 A1
20110261195 Martin et al. Oct 2011 A1
20110276699 Pedersen Nov 2011 A1
20110283006 Ramamurthy Nov 2011 A1
20110289517 Sather et al. Nov 2011 A1
20110299546 Prodan et al. Dec 2011 A1
20110302497 Garrett et al. Dec 2011 A1
20110309929 Myers Dec 2011 A1
20110314515 Hernoud et al. Dec 2011 A1
20120001436 Sami et al. Jan 2012 A1
20120005276 Guo et al. Jan 2012 A1
20120014363 Hassan et al. Jan 2012 A1
20120016607 Cottrell et al. Jan 2012 A1
20120017268 Dispensa Jan 2012 A9
20120020060 Myer et al. Jan 2012 A1
20120023151 Bennett et al. Jan 2012 A1
20120030130 Smith et al. Feb 2012 A1
20120062370 Feldstein et al. Mar 2012 A1
20120066608 Sundermeyer et al. Mar 2012 A1
20120066632 Sundermeyer et al. Mar 2012 A1
20120075469 Oskin et al. Mar 2012 A1
20120081842 Ewing et al. Apr 2012 A1
20120084184 Raleigh et al. Apr 2012 A1
20120086552 Fast et al. Apr 2012 A1
20120092447 Jeong et al. Apr 2012 A1
20120143383 Cooperrider et al. Jun 2012 A1
20120150966 Fan et al. Jun 2012 A1
20120154126 Cohn et al. Jun 2012 A1
20120172027 Partheesh et al. Jul 2012 A1
20120182245 Hutton Jul 2012 A1
20120209951 Enns et al. Aug 2012 A1
20120214502 Qiang Aug 2012 A1
20120232788 Diao Sep 2012 A1
20120240185 Kapoor et al. Sep 2012 A1
20120242788 Chuang et al. Sep 2012 A1
20120257061 Edwards et al. Oct 2012 A1
20120259722 Mikurak Oct 2012 A1
20120265892 Ma et al. Oct 2012 A1
20120269199 Chan et al. Oct 2012 A1
20120278877 Baum et al. Nov 2012 A1
20120280790 Gerhardt et al. Nov 2012 A1
20120290740 Tewari et al. Nov 2012 A1
20120296486 Marriam et al. Nov 2012 A1
20120307646 Xia et al. Dec 2012 A1
20120309354 Du Dec 2012 A1
20120313781 Barker et al. Dec 2012 A1
20120314901 Hanson et al. Dec 2012 A1
20120315848 Smith et al. Dec 2012 A1
20120324566 Baum et al. Dec 2012 A1
20120327242 Barley et al. Dec 2012 A1
20120331109 Baum et al. Dec 2012 A1
20130002880 Levinson et al. Jan 2013 A1
20130007871 Meenan et al. Jan 2013 A1
20130038730 Peterson et al. Feb 2013 A1
20130038800 Yoo Feb 2013 A1
20130047123 May et al. Feb 2013 A1
20130057695 Huisking Mar 2013 A1
20130062951 Raji et al. Mar 2013 A1
20130073746 Singh et al. Mar 2013 A1
20130082835 Shapiro et al. Apr 2013 A1
20130082836 Watts Apr 2013 A1
20130085615 Barker Apr 2013 A1
20130086618 Klein et al. Apr 2013 A1
20130091213 Diab et al. Apr 2013 A1
20130094538 Wang Apr 2013 A1
20130103207 Ruff et al. Apr 2013 A1
20130111576 Devine et al. May 2013 A1
20130115972 Ziskind et al. May 2013 A1
20130120131 Hicks, III May 2013 A1
20130125157 Sharif-Ahmadi et al. May 2013 A1
20130136102 Macwan et al. May 2013 A1
20130147799 Hoguet Jun 2013 A1
20130154822 Kumar et al. Jun 2013 A1
20130155229 Thornton et al. Jun 2013 A1
20130162571 Tam Jun 2013 A1
20130163491 Singh et al. Jun 2013 A1
20130163757 Bellovin et al. Jun 2013 A1
20130173797 Poirer et al. Jul 2013 A1
20130174239 Kim et al. Jul 2013 A1
20130183924 Saigh et al. Jul 2013 A1
20130184874 Frader-Thompson et al. Jul 2013 A1
20130185026 Vanker et al. Jul 2013 A1
20130191755 Balog et al. Jul 2013 A1
20130205016 Dupre et al. Aug 2013 A1
20130218959 Sa et al. Aug 2013 A1
20130222133 Schultz et al. Aug 2013 A1
20130223279 Tinnakornsrisuphap et al. Aug 2013 A1
20130245837 Grohman Sep 2013 A1
20130257611 Lamb et al. Oct 2013 A1
20130258119 Kim et al. Oct 2013 A1
20130261821 Lu et al. Oct 2013 A1
20130266193 Tiwari et al. Oct 2013 A1
20130271270 Jamadagni et al. Oct 2013 A1
20130286942 Bonar et al. Oct 2013 A1
20130311146 Miller et al. Nov 2013 A1
20130314542 Jackson Nov 2013 A1
20130318231 Raji et al. Nov 2013 A1
20130318443 Bachman et al. Nov 2013 A1
20130325935 Kiley et al. Dec 2013 A1
20130331109 Dhillon et al. Dec 2013 A1
20130344875 Chowdhury Dec 2013 A1
20130346921 Shiplacoff Dec 2013 A1
20140006660 Frei Jan 2014 A1
20140024361 Poon et al. Jan 2014 A1
20140032034 Raptopoulos et al. Jan 2014 A1
20140033136 St. Clair Jan 2014 A1
20140035726 Schoner et al. Feb 2014 A1
20140053246 Huang et al. Feb 2014 A1
20140068486 Sellers et al. Mar 2014 A1
20140075464 McCrea Mar 2014 A1
20140095630 Wohlert et al. Apr 2014 A1
20140098247 Rao et al. Apr 2014 A1
20140108151 Bookstaff Apr 2014 A1
20140109130 Sugimoto et al. Apr 2014 A1
20140112405 Jafarian et al. Apr 2014 A1
20140126425 Burd et al. May 2014 A1
20140136242 Weekes et al. May 2014 A1
20140136847 Huang May 2014 A1
20140136936 Patel et al. May 2014 A1
20140140575 Wolf May 2014 A1
20140143695 Sundermeyer et al. May 2014 A1
20140143851 Baum et al. May 2014 A1
20140143854 Lopez et al. May 2014 A1
20140146171 Brady et al. May 2014 A1
20140153695 Yanagisawa et al. Jun 2014 A1
20140167928 Burd Jun 2014 A1
20140172957 Baum et al. Jun 2014 A1
20140176797 Silva et al. Jun 2014 A1
20140180968 Song et al. Jun 2014 A1
20140188290 Steinberg et al. Jul 2014 A1
20140188729 Hong Jul 2014 A1
20140201291 Russell Jul 2014 A1
20140208214 Stern Jul 2014 A1
20140218517 Kim et al. Aug 2014 A1
20140233951 Cook Aug 2014 A1
20140236325 Sasaki et al. Aug 2014 A1
20140245014 Tuck et al. Aug 2014 A1
20140245160 Bauer et al. Aug 2014 A1
20140254896 Zhou et al. Sep 2014 A1
20140265359 Cheng Sep 2014 A1
20140266678 Shapiro et al. Sep 2014 A1
20140266736 Cretu-Petra Sep 2014 A1
20140278281 Vaynriber et al. Sep 2014 A1
20140282048 Shapiro et al. Sep 2014 A1
20140282934 Miasnik et al. Sep 2014 A1
20140289384 Kao et al. Sep 2014 A1
20140289388 Ghosh et al. Sep 2014 A1
20140293046 Ni Oct 2014 A1
20140298467 Bhagwat et al. Oct 2014 A1
20140316616 Kugelmass Oct 2014 A1
20140317660 Cheung et al. Oct 2014 A1
20140319232 Gourlay et al. Oct 2014 A1
20140328161 Haddad et al. Nov 2014 A1
20140340216 Puskarich Nov 2014 A1
20140359101 Dawes et al. Dec 2014 A1
20140359524 Sasaki et al. Dec 2014 A1
20140368331 Cohn et al. Dec 2014 A1
20140369584 Fan et al. Dec 2014 A1
20140372599 Gutt et al. Dec 2014 A1
20140372811 Cohn et al. Dec 2014 A1
20140378110 Chingon et al. Dec 2014 A1
20150009325 Kardashov Jan 2015 A1
20150019714 Shaashua et al. Jan 2015 A1
20150022666 Kay et al. Jan 2015 A1
20150026796 Alan et al. Jan 2015 A1
20150054947 Dawes Feb 2015 A1
20150058250 Stanzione et al. Feb 2015 A1
20150074206 Baldwin Mar 2015 A1
20150074259 Ansari et al. Mar 2015 A1
20150077553 Dawes Mar 2015 A1
20150082414 Dawes Mar 2015 A1
20150088982 Johnson et al. Mar 2015 A1
20150097680 Fadell et al. Apr 2015 A1
20150097949 Ure et al. Apr 2015 A1
20150097961 Ure et al. Apr 2015 A1
20150100167 Sloo et al. Apr 2015 A1
20150106721 Cha et al. Apr 2015 A1
20150140954 Maier et al. May 2015 A1
20150142991 Zaloom May 2015 A1
20150143395 Reisman May 2015 A1
20150161875 Cohn et al. Jun 2015 A1
20150170447 Buzhardt Jun 2015 A1
20150192940 Silva et al. Jul 2015 A1
20150193127 Chai Jul 2015 A1
20150205297 Stevens Jul 2015 A1
20150205465 Robison et al. Jul 2015 A1
20150222601 Metz et al. Aug 2015 A1
20150227118 Wong Aug 2015 A1
20150256355 Pera et al. Sep 2015 A1
20150261427 Sasaki Sep 2015 A1
20150266577 Jones et al. Sep 2015 A1
20150287310 Deiiuliis et al. Oct 2015 A1
20150304804 Lotito Oct 2015 A1
20150319006 Plummer et al. Nov 2015 A1
20150319046 Plummer et al. Nov 2015 A1
20150325106 Dawes et al. Nov 2015 A1
20150331662 Lambourne Nov 2015 A1
20150334087 Dawes Nov 2015 A1
20150348554 Orr et al. Dec 2015 A1
20150350031 Burks Dec 2015 A1
20150350735 Kser Dec 2015 A1
20150358359 Ghai et al. Dec 2015 A1
20150365217 Scholten et al. Dec 2015 A1
20150365933 Lee et al. Dec 2015 A1
20150371512 Bennett et al. Dec 2015 A1
20150373149 Lyons Dec 2015 A1
20150379355 Kanga et al. Dec 2015 A1
20160004820 Moore Jan 2016 A1
20160012715 Raji et al. Jan 2016 A1
20160019763 Raji et al. Jan 2016 A1
20160019778 Raji et al. Jan 2016 A1
20160023475 Bevier et al. Jan 2016 A1
20160027295 Raji et al. Jan 2016 A1
20160036944 Kitchen et al. Feb 2016 A1
20160037389 Tagg et al. Feb 2016 A1
20160042637 Cahill Feb 2016 A1
20160055573 Chen et al. Feb 2016 A1
20160062624 Sundermeyer et al. Mar 2016 A1
20160063642 Luciani et al. Mar 2016 A1
20160065413 Sundermeyer et al. Mar 2016 A1
20160065414 Sundermeyer et al. Mar 2016 A1
20160065653 Chen et al. Mar 2016 A1
20160068264 Ganesh et al. Mar 2016 A1
20160077935 Zheng et al. Mar 2016 A1
20160080365 Baker et al. Mar 2016 A1
20160087933 Johnson et al. Mar 2016 A1
20160094421 Bali et al. Mar 2016 A1
20160100348 Cohn et al. Apr 2016 A1
20160107749 Mucci Apr 2016 A1
20160116914 Mucci Apr 2016 A1
20160127641 Gove May 2016 A1
20160147919 Yabe et al. May 2016 A1
20160150433 Bergquist et al. May 2016 A1
20160156941 Alao et al. Jun 2016 A9
20160161277 Park et al. Jun 2016 A1
20160163185 Ramasubbu et al. Jun 2016 A1
20160164923 Dawes Jun 2016 A1
20160171853 Naidoo et al. Jun 2016 A1
20160180719 Wouhaybi et al. Jun 2016 A1
20160183073 Saito et al. Jun 2016 A1
20160187995 Rosewall Jun 2016 A1
20160189509 Malhotra et al. Jun 2016 A1
20160189524 Poder et al. Jun 2016 A1
20160189527 Peterson et al. Jun 2016 A1
20160189549 Marcus Jun 2016 A1
20160191265 Cohn et al. Jun 2016 A1
20160191621 Oh et al. Jun 2016 A1
20160192461 Minsky Jun 2016 A1
20160196734 Hicks, III Jul 2016 A1
20160202695 Deroos et al. Jul 2016 A1
20160209072 Golden et al. Jul 2016 A1
20160225240 Voddhi et al. Aug 2016 A1
20160226732 Kim et al. Aug 2016 A1
20160231916 Dawes Aug 2016 A1
20160232780 Cohn et al. Aug 2016 A1
20160234075 Sirpal et al. Aug 2016 A1
20160241633 Overby et al. Aug 2016 A1
20160260135 Zomet et al. Sep 2016 A1
20160261932 Fadell et al. Sep 2016 A1
20160266579 Chen et al. Sep 2016 A1
20160267751 Fulker et al. Sep 2016 A1
20160274759 Dawes Sep 2016 A1
20160323731 Mohammed et al. Nov 2016 A1
20160363337 Steinberg et al. Dec 2016 A1
20160364089 Blackman et al. Dec 2016 A1
20160371961 Narang et al. Dec 2016 A1
20160371967 Narang et al. Dec 2016 A1
20160373453 Ruffner et al. Dec 2016 A1
20160378109 Raffa et al. Dec 2016 A1
20170004714 Rhee Jan 2017 A1
20170005818 Gould Jan 2017 A1
20170006107 Dawes et al. Jan 2017 A1
20170019644 K V et al. Jan 2017 A1
20170026440 Cockrell et al. Jan 2017 A1
20170039413 Nadler Feb 2017 A1
20170052513 Raji et al. Feb 2017 A1
20170054570 Hagins et al. Feb 2017 A1
20170054571 Kitchen et al. Feb 2017 A1
20170054594 Decenzo et al. Feb 2017 A1
20170063967 Kitchen et al. Mar 2017 A1
20170063968 Kitchen et al. Mar 2017 A1
20170068419 Sundermeyer et al. Mar 2017 A1
20170070361 Sundermeyer et al. Mar 2017 A1
20170070563 Sundermeyer et al. Mar 2017 A1
20170078298 Vlaminck et al. Mar 2017 A1
20170092138 Trundle et al. Mar 2017 A1
20170103646 Naidoo et al. Apr 2017 A1
20170109999 Cohn et al. Apr 2017 A1
20170111227 Papageorgiou et al. Apr 2017 A1
20170118037 Kitchen et al. Apr 2017 A1
20170124987 Kim May 2017 A1
20170127124 Wilson et al. May 2017 A9
20170154507 Dawes et al. Jun 2017 A1
20170155545 Baum et al. Jun 2017 A1
20170180198 Baum et al. Jun 2017 A1
20170180306 Gutt et al. Jun 2017 A1
20170185277 Sundermeyer et al. Jun 2017 A1
20170185278 Sundermeyer et al. Jun 2017 A1
20170185281 Park Jun 2017 A1
20170187993 Martch et al. Jun 2017 A1
20170192402 Karp et al. Jul 2017 A1
20170225336 Deyle Aug 2017 A1
20170227965 Decenzo et al. Aug 2017 A1
20170244573 Baum et al. Aug 2017 A1
20170255452 Barnes et al. Sep 2017 A1
20170257257 Dawes Sep 2017 A1
20170278407 Lemmey et al. Sep 2017 A1
20170279629 Raji Sep 2017 A1
20170289323 Gelvin et al. Oct 2017 A1
20170289360 Baum et al. Oct 2017 A1
20170301216 Cohn et al. Oct 2017 A1
20170302469 Cohn et al. Oct 2017 A1
20170303257 Yamada et al. Oct 2017 A1
20170310500 Dawes Oct 2017 A1
20170330466 Demetriades et al. Nov 2017 A1
20170331781 Gutt et al. Nov 2017 A1
20170332055 Henderson Nov 2017 A1
20170337806 Cohn et al. Nov 2017 A1
20170353324 Baum et al. Dec 2017 A1
20180004377 Kitchen et al. Jan 2018 A1
20180012460 Heitz, III et al. Jan 2018 A1
20180019890 Dawes Jan 2018 A1
20180027517 Noonan Jan 2018 A9
20180045159 Patel Feb 2018 A1
20180054774 Cohn et al. Feb 2018 A1
20180063248 Dawes et al. Mar 2018 A1
20180063259 Connelly et al. Mar 2018 A1
20180069862 Cholas et al. Mar 2018 A1
20180069932 Tiwari et al. Mar 2018 A1
20180082575 El-Mankabady Mar 2018 A1
20180083831 Baum et al. Mar 2018 A1
20180092046 Egan et al. Mar 2018 A1
20180095155 Soni et al. Apr 2018 A1
20180096568 Cohn et al. Apr 2018 A1
20180107196 Bian et al. Apr 2018 A1
20180152342 Karaoguz et al. May 2018 A1
20180183668 Caldwell et al. Jun 2018 A1
20180191720 Dawes Jul 2018 A1
20180191740 Decenzo et al. Jul 2018 A1
20180191741 Dawes et al. Jul 2018 A1
20180191742 Dawes Jul 2018 A1
20180191807 Dawes Jul 2018 A1
20180197387 Dawes Jul 2018 A1
20180198688 Dawes Jul 2018 A1
20180198755 Domangue et al. Jul 2018 A1
20180198756 Dawes Jul 2018 A1
20180198788 Helen et al. Jul 2018 A1
20180198802 Dawes Jul 2018 A1
20180198841 Chmielewski et al. Jul 2018 A1
20180278701 Diem Sep 2018 A1
20180307223 Peeters et al. Oct 2018 A1
20180322759 Devdas Nov 2018 A1
20190014413 Kallai et al. Jan 2019 A1
20190041547 Rolf et al. Feb 2019 A1
20190058720 Lindquist et al. Feb 2019 A1
20190073193 Krispin Mar 2019 A1
20190073534 Dvir et al. Mar 2019 A1
20190103030 Banga et al. Apr 2019 A1
20190176985 Mucci Jun 2019 A1
20190197256 Hardt et al. Jun 2019 A1
20190204836 Rezvani Jul 2019 A1
20190239008 Lambourne Aug 2019 A1
20190245798 Short et al. Aug 2019 A1
20190265694 Chen et al. Aug 2019 A1
20190347924 Trundle et al. Nov 2019 A1
20190386892 Sundermeyer et al. Dec 2019 A1
20190391545 Trundle et al. Dec 2019 A1
20200014675 Helms et al. Jan 2020 A1
20200026285 Perrone Jan 2020 A1
20200029339 Suzuki Jan 2020 A1
20200032887 McBurney et al. Jan 2020 A1
20200036635 Ohuchi Jan 2020 A1
20200076858 Apsangi et al. Mar 2020 A1
20200094963 Myslinski Mar 2020 A1
20200127891 Johnson et al. Apr 2020 A9
20200137125 Patnala et al. Apr 2020 A1
20200142574 Sundermeyer et al. May 2020 A1
20200159399 Sundermeyer et al. May 2020 A1
20200162890 Spencer et al. May 2020 A1
20200186612 Saint Clair Jun 2020 A1
20200196213 Cheng et al. Jun 2020 A1
20200257721 McKinnon et al. Aug 2020 A1
20200273277 Kerning et al. Aug 2020 A1
20200279626 Ansari et al. Sep 2020 A1
20200322577 Raffa et al. Oct 2020 A1
20200328880 Bolotin et al. Oct 2020 A1
20200328887 Kostiainen et al. Oct 2020 A1
20200329136 Gerhardt Oct 2020 A1
20200333780 Kerzner Oct 2020 A1
20200349786 Ho Nov 2020 A1
20200380851 Farrand et al. Dec 2020 A1
20210021710 Stepanian Jan 2021 A1
20210029547 Beachem et al. Jan 2021 A1
20210053136 Rappl et al. Feb 2021 A1
20210068034 Juhasz et al. Mar 2021 A1
20210081553 Lemmey et al. Mar 2021 A1
20210099753 Connelly et al. Apr 2021 A1
20210153001 Eisner May 2021 A1
20210180815 Shamoon et al. Jun 2021 A1
20210250726 Jones Aug 2021 A1
20210326451 Nunez Di Croce Oct 2021 A1
20210335123 Trundle et al. Oct 2021 A1
20220021552 Ansari et al. Jan 2022 A1
20220027051 Kant et al. Jan 2022 A1
20220029994 Choyi et al. Jan 2022 A1
20220038440 Boynton et al. Feb 2022 A1
20220073052 Zhou et al. Mar 2022 A1
20220159334 Wang et al. May 2022 A1
20220247624 Johnson et al. Aug 2022 A1
20220415104 McLachlan Dec 2022 A1
20230057193 Ansari et al. Feb 2023 A1
Foreign Referenced Citations (155)
Number Date Country
2005223267 Dec 2010 AU
2010297957 May 2012 AU
2011250886 Jan 2013 AU
2013284428 Feb 2015 AU
2011305163 Dec 2016 AU
2017201365 Mar 2017 AU
2017201585 Mar 2017 AU
1008939 Oct 1996 BE
2203813 Jun 1996 CA
2174482 Oct 1997 CA
2346638 Apr 2000 CA
2389958 Mar 2003 CA
2878117 Jan 2014 CA
2559842 May 2014 CA
2992429 Dec 2016 CA
2976682 Feb 2018 CA
2976802 Feb 2018 CA
1599999 Mar 2005 CN
102834818 Dec 2012 CN
102985915 Mar 2013 CN
102004027893 Jan 2006 DE
0295146 Dec 1988 EP
0308046 Mar 1989 EP
0591585 Apr 1994 EP
1117214 Jul 2001 EP
1119837 Aug 2001 EP
0978111 Nov 2001 EP
1738540 Jan 2007 EP
1881716 Jan 2008 EP
2112784 Oct 2009 EP
2188794 May 2010 EP
2191351 Jun 2010 EP
2327063 Jun 2011 EP
2483788 Aug 2012 EP
2569712 Mar 2013 EP
2619686 Jul 2013 EP
2868039 May 2015 EP
3031206 Jun 2016 EP
3285238 Feb 2018 EP
3308222 Apr 2018 EP
2584217 Jan 1987 FR
2661023 Oct 1991 FR
2793334 Nov 2000 FR
2222288 Feb 1990 GB
2273593 Jun 1994 GB
2286423 Aug 1995 GB
2291554 Jan 1996 GB
2319373 May 1998 GB
2320644 Jun 1998 GB
2324630 Oct 1998 GB
2325548 Nov 1998 GB
2335523 Sep 1999 GB
2349293 Oct 2000 GB
2370400 Jun 2002 GB
2442628 Apr 2008 GB
2442633 Apr 2008 GB
2442640 Apr 2008 GB
2428821 Jun 2008 GB
452015 Nov 2015 IN
042016 Jan 2016 IN
63-033088 Feb 1988 JP
05-167712 Jul 1993 JP
06-339183 Dec 1993 JP
08-227491 Sep 1996 JP
10-004451 Jan 1998 JP
11-234277 Aug 1999 JP
2000-006343 Jan 2000 JP
2000-023146 Jan 2000 JP
2000-278671 Oct 2000 JP
2001-006088 Jan 2001 JP
2001-006343 Jan 2001 JP
2001-069209 Mar 2001 JP
2002-055895 Feb 2002 JP
2002-185629 Jun 2002 JP
2003-085258 Mar 2003 JP
2003-141659 May 2003 JP
2003-281647 Oct 2003 JP
2004-192659 Jul 2004 JP
2006-094394 Apr 2006 JP
2007-529826 Oct 2007 JP
2009213107 Sep 2009 JP
2010-140091 Jun 2010 JP
10-2005-0051577 Jun 2005 KR
10-2005-0052826 Jun 2005 KR
10-2006-0021605 Mar 2006 KR
10-0771941 Oct 2007 KR
340934 Sep 1998 TW
I239176 Sep 2005 TW
201101243 Jan 2011 TW
201102976 Jan 2011 TW
201102978 Jan 2011 TW
I340934 Apr 2011 TW
201117141 May 2011 TW
I480839 Apr 2015 TW
I480840 Apr 2015 TW
I509579 Nov 2015 TW
I517106 Jan 2016 TW
8907855 Aug 1989 WO
8911187 Nov 1989 WO
9403881 Feb 1994 WO
9513944 May 1995 WO
9636301 Nov 1996 WO
9713230 Apr 1997 WO
9825243 Jun 1998 WO
9849663 Nov 1998 WO
9852343 Nov 1998 WO
9859256 Dec 1998 WO
9934339 Jul 1999 WO
0021053 Apr 2000 WO
0036812 Jun 2000 WO
0072598 Nov 2000 WO
0111586 Feb 2001 WO
0152478 Jul 2001 WO
0171489 Sep 2001 WO
0186622 Nov 2001 WO
0199078 Dec 2001 WO
0211444 Feb 2002 WO
0221300 Mar 2002 WO
0297584 Dec 2002 WO
2002100083 Dec 2002 WO
2003026305 Mar 2003 WO
0340839 May 2003 WO
03049379 Jun 2003 WO
0398908 Nov 2003 WO
2004004222 Jan 2004 WO
2004077307 Sep 2004 WO
2004098127 Nov 2004 WO
2004107710 Dec 2004 WO
2005047990 May 2005 WO
2005091218 Sep 2005 WO
2007038872 Apr 2007 WO
2007124453 Nov 2007 WO
2008056320 May 2008 WO
2009006670 Jan 2009 WO
2009023647 Feb 2009 WO
2009029590 Mar 2009 WO
2009029597 Mar 2009 WO
2009064795 May 2009 WO
2009145747 Dec 2009 WO
2010019624 Feb 2010 WO
2010025468 Mar 2010 WO
2010127009 Nov 2010 WO
2010127194 Nov 2010 WO
2010127200 Nov 2010 WO
2010127203 Nov 2010 WO
2011038409 Mar 2011 WO
2011063354 May 2011 WO
2011143273 Nov 2011 WO
2012040653 Mar 2012 WO
2014004911 Jan 2014 WO
2015021469 Feb 2015 WO
2015134520 Sep 2015 WO
2015176775 Nov 2015 WO
2016201033 Dec 2016 WO
201302668 Jun 2014 ZA
Non-Patent Literature Citations (321)
US Patent Application filed Jun. 24, 2020, entitled “Method and System for Processing Security Event Data”, U.S. Appl. No. 16/910,967.
US Patent Application filed Jun. 27, 2018, entitled “Activation Of Gateway Device”, U.S. Appl. No. 16/020,499.
US Patent Application filed Jul. 2, 2019, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 16/460,712.
US Patent Application filed Jul. 3, 2018, entitled “WIFI-To-Serial Encapsulation In Systems”, U.S. Appl. No. 16/026,703.
US Patent Application filed Jul. 9, 2020, entitled “Automation System With Mobile Interface”, U.S. Appl. No. 16/925,026.
US Patent Application filed Jul. 12, 2018, entitled “Integrated Security System with Parallel Processing Architecture”, U.S. Appl. No. 16/034,132.
US Patent Application filed Jul. 20, 2018, entitled “Cross-Client Sensor User Interface in an Integrated Security Network”, U.S. Appl. No. 16/041,291.
US Patent Application filed Jul. 26, 2019, entitled “Device Integration Framework”, U.S. Appl. No. 16/522,949.
US Patent Application filed Jul. 28, 2016, entitled “Method and System for Automatically Providing Alternate Network Access for Telecommunications”, U.S. Appl. No. 15/222,416.
US Patent Application filed Aug. 8, 2016, entitled “Security, Monitoring and Automation Controller Access and Use of Legacy Security Control Panel Information”, U.S. Appl. No. 15/231,273.
US Patent Application filed Aug. 9, 2016, entitled "Controller and Interface for Home Security, Monitoring and Automation Having Customizable Audio Alerts for SMA Events", U.S. Appl. No. 15/232,135.
US Patent Application filed Aug. 9, 2018, entitled “Method and System for Processing Security Event Data”, U.S. Appl. No. 16/059,833.
US Patent Application filed Aug. 21, 2018, entitled “Premises System Management Using Status Signal”, U.S. Appl. No. 16/107,568.
US Patent Application filed Aug. 23, 2019, entitled “Premises System Management Using Status Signal”, U.S. Appl. No. 16/549,837.
US Patent Application filed Aug. 26, 2020, entitled “Automation System User Interface With Three-Dimensional Display”, U.S. Appl. No. 17/003,550.
US Patent Application filed Sep. 6, 2018, entitled “Takeover of Security Network”, U.S. Appl. No. 16/123,695.
US Patent Application filed Sep. 10, 2020, entitled “Security System With Networked Touchscreen”, U.S. Appl. No. 17/017,519.
US Patent Application filed Sep. 11, 2020, entitled “Management Of Applications For A Device Located At A Premises”, U.S. Appl. No. 17/018,901.
US Patent Application filed Sep. 17, 2018, entitled “Integrated Security System With Parallel Processing Architecture”, U.S. Appl. No. 16/133,135.
US Patent Application filed Sep. 27, 2019, entitled “Control System User Interface”, U.S. Appl. No. 16/585,481.
US Patent Application filed Sep. 28, 2018, entitled “Control System User Interface”, U.S. Appl. No. 16/146,715.
US Patent Application filed Sep. 28, 2018, entitled “Forming a Security Network Including Integrated Security System Components and Network Devices”, U.S. Appl. No. 16/147,044.
US Patent Application filed Sep. 11, 2018, entitled "Premises Management Networking", U.S. Appl. No. 16/128,089.
US Patent Application filed Oct. 1, 2018, entitled “Integrated Security System With Parallel Processing Architecture”, U.S. Appl. No. 16/148,387.
US Patent Application filed Oct. 1, 2018, entitled “Integrated Security System with Parallel Processing Architecture”, U.S. Appl. No. 16/148,411.
US Patent Application filed Oct. 1, 2018, entitled “User Interface In A Premises Network”, U.S. Appl. No. 16/148,572.
US Patent Application filed Oct. 3, 2018, entitled “Activation of a Home Automation Controller”, U.S. Appl. No. 16/150,973.
US Patent Application filed Oct. 8, 2020, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 17/065,841.
US Patent Application filed Oct. 10, 2018, entitled “Method and System for Providing Alternate Network Access”, U.S. Appl. No. 16/156,448.
US Patent Application filed Oct. 12, 2020, entitled “Integrated Security System With Parallel Processing Architecture”, U.S. Appl. No. 17/068,584.
US Patent Application filed Oct. 13, 2017, entitled “Notification of Event Subsequent To Communication Failure With Security System”, U.S. Appl. No. 15/783,858.
US Patent Application filed Oct. 18, 2018, entitled “Generating Risk Profile Using Data Of Home Monitoring And Security System”, U.S. Appl. No. 16/164,114.
US Patent Application filed Oct. 18, 2019, entitled “Wifi-To-Serial Encapsulation in Systems”, U.S. Appl. No. 16/656,874.
US Patent Application filed Oct. 27, 2017, entitled “Security System With Networked Touchscreen”, U.S. Appl. No. 15/796,421.
US Patent Application filed Nov. 10, 2020, entitled “Integrated Cloud System for Premises Automation”, U.S. Appl. No. 17/094,120.
US Patent Application filed Nov. 19, 2019, entitled “Integrated Cloud System With Lightweight Gateway for Premises Automation”, U.S. Appl. No. 16/688,717.
US Patent Application filed Nov. 25, 2020, entitled “Premises Management Networking”, U.S. Appl. No. 17/105,235.
US Patent Application filed Nov. 26, 2019, entitled “Communication Protocols Over Internet Protocol (IP) Networks”, U.S. Appl. No. 16/696,657.
US Patent Application filed Nov. 28, 2017, entitled “Forming A Security Network Including Integrated Security System Components”, U.S. Appl. No. 15/824,503.
US Patent Application filed Nov. 29, 2018, entitled "Premise Management Systems And Methods", U.S. Appl. No. 16/204,442.
US Patent Application filed Nov. 30, 2017, entitled “Controller and Interface for Home Security, Monitoring and Automation Having Customizable Audio Alerts for SMA Events”, U.S. Appl. No. 15/828,030.
US Patent Application filed Dec. 9, 2020, entitled “Integrated Security System With Parallel Processing Architecture”, U.S. Appl. No. 17/115,936.
US Patent Application filed Dec. 14, 2018, entitled “Communication Protocols Over Internet Protocol (IP) Networks”, U.S. Appl. No. 16/221,299.
US Patent Application filed Dec. 27, 2018, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 16/233,913.
US Patent Application filed Dec. 27, 2019, entitled “Premises Management Systems”, U.S. Appl. No. 16/728,608.
Valtchev, D., and I. Frankov. “Service gateway architecture for a smart home.” Communications Magazine, IEEE 40.4 (2002): 126-132.
Visitalk, Communication with Vision, http://www.visitalk.jimbo.com; website accessed Jan. 10, 2018.
visitalk.com—communication with vision, http://www.visitalk.com.
Wang et al, “A Large Scale Video Surveillance System with Heterogeneous Information Fusion and Visualization for Wide Area Monitoring,” 2012 Eighth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, Piraeus, 2012, pp. 178-181.
Wilkinson, S.: "Logitech Harmony One Universal Remote," Ultimate AV magazine, May 2008, XP002597782. Retrieved from the Internet, Original URL: http://www.ultimateavmag.com/remotecontrols/508logi) [retrieved on Aug. 23, 2010], the whole document; Updated URL: https://www.soundandvision.com/content/logitech-harmony-one-universal-remote, retrieved from the Internet on Jan. 11, 2018.
Supplementary European Search Report for Application No. EP11827671, dated Mar. 10, 2015, 2 pages.
Supplementary Partial European Search Report for Application No. EP09807196, dated Nov. 17, 2014, 5 pages.
Supplementary European Search Report for Application No. EP2191351, dated Jun. 23, 2014, 2 pages.
Supplementary Non-Final Office Action dated Oct. 28, 2010 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Topalis E., et al., “A Generic Network Management Architecture Targeted to Support Home Automation Networks and Home Internet Connectivity, Consumer Electronics, IEEE Transactions,” 2000, vol. 46 (1), pp. 44-51.
United States Patent and Trademark Office—Before the Patent Trial and Appeal Board, Alarm.com (U.S. Pat. No. 8,350,694 B1) (Inventors Stephen Scott Trundle & Alison Jane Slavin) v. iControl Networks, Inc. (U.S. Appl. No. 13/311,365) (Inventors Paul J. Dawes, Jim Fulker, Carolyn Wales, Reza Raji, and Gerald Gutt), Patent Interference 106,001 (HHB) (Technology Center 2400), Mar. 31, 2015.
US Patent Application filed Jan. 3, 2019, entitled “Methods and Systems for Data Communication”, U.S. Appl. No. 16/239,114.
US Patent Application filed Jan. 11, 2021, entitled “Premise Management Systems and Methods”, U.S. Appl. No. 17/145,773.
US Patent Application filed Jan. 22, 2019, entitled “Data Model for Home Automation”, U.S. Appl. No. 16/254,535.
US Patent Application filed Jan. 22, 2019, entitled “Premises System Automation”, U.S. Appl. No. 16/254,480.
US Patent Application filed Jan. 23, 2020, entitled "Forming a Security Network Including Integrated Security System Components and Network Devices", U.S. Appl. No. 16/750,976.
US Patent Application filed Jan. 25, 2019, entitled "Communication Protocols in Integrated Systems", U.S. Appl. No. 16/257,706.
US Patent Application filed Jan. 28, 2019, entitled “Automation System User Interface With Three-Dimensional Display”, U.S. Appl. No. 16/258,858.
US Patent Application filed Feb. 6, 2020, entitled “Activation Of Gateway Device”, U.S. Appl. No. 16/784,159.
US Patent Application filed Feb. 9, 2021, entitled “Premises Management Networking”, U.S. Appl. No. 17/171,398.
US Patent Application filed Mar. 2, 2017, entitled “Generating Risk Profile Using Data of Home Monitoring and Security System”, U.S. Appl. No. 15/447,982.
US Patent Application filed Mar. 2, 2020, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 16/807,100.
US Patent Application filed Mar. 2, 2020, entitled “Coordinated Control of Connected Devices in a Premise”, U.S. Appl. No. 16/807,028.
US Patent Application filed Mar. 7, 2014, entitled “Activation of Gateway Device”, U.S. Appl. No. 14/201,162.
US Patent Application filed Mar. 7, 2014, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 14/200,921.
US Patent Application filed Mar. 7, 2014, entitled “Device Integration Framework”, U.S. Appl. No. 14/201,227.
US Patent Application filed Mar. 7, 2014, entitled “Integrated Security and Control System With Geofencing”, U.S. Appl. No. 14/201,189.
US Patent Application filed Mar. 7, 2014, entitled “Security System Integrated With Social Media Platform”, U.S. Appl. No. 14/201,133.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 14/202,573.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 14/202,592.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 14/202,627.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 14/202,685.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 14/203,077.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 14/203,084.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 14/203,128.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 14/203,141.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 14/203,219.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols Over Internet Protocol (IP) Networks”, U.S. Appl. No. 14/202,505.
US Patent Application filed Mar. 10, 2014, entitled “Communication Protocols Over Internet Protocol (IP) Networks”, U.S. Appl. No. 14/202,579.
US Patent Application filed Mar. 11, 2020, entitled “Management of a Security System at a Premises”, U.S. Appl. No. 16/816,134.
US Patent Application filed Mar. 17, 2021, entitled “Communication Protocols Over Internet Protocol (IP) Networks”, U.S. Appl. No. 17/204,068.
US Patent Application filed Mar. 18, 2019, entitled “Server-Based Notification of Alarm Event Subsequent To Communication Failure With Armed Security System”, U.S. Appl. No. 16/356,742.
US Patent Application filed Mar. 20, 2020, entitled “Security, Monitoring and Automation Controller Access and Use of Legacy Security Control Panel Information”, U.S. Appl. No. 16/825,099.
US Patent Application filed Apr. 17, 2020, entitled “Method and System for Providing Alternate Network Access”, U.S. Appl. No. 16/852,072.
US Patent Application filed Apr. 17, 2020, entitled “Networked Touchscreen With Integrated Interfaces”, U.S. Appl. No. 16/852,058.
US Patent Application filed Apr. 23, 2019, entitled “Control System User Interface”, U.S. Appl. No. 16/391,625.
US Patent Application filed Apr. 26, 2019, entitled “Custom Content for Premises Management”, U.S. Appl. No. 16/396,368.
US Patent Application filed May 2, 2018, entitled "Automation System With Mobile Interface", U.S. Appl. No. 15/969,514.
US Patent Application filed May 11, 2020, entitled “Control System User Interface”, U.S. Appl. No. 16/871,151.
US Patent Application filed May 12, 2020, entitled “IP Device Discovery Systems and Methods”, U.S. Appl. No. 15/930,029.
US Patent Application filed May 19, 2020, entitled “User Interface in a Premises Network”, U.S. Appl. No. 16/878,099.
US Patent Application filed May 23, 2018, entitled “Networked Touchscreen With Integrated Interfaces”, U.S. Appl. No. 15/987,638.
US Patent Application filed May 26, 2020, entitled “Premises Management Configuration and Control”, U.S. Appl. No. 16/882,876.
US Patent Application filed Jun. 1, 2012, entitled “Gateway Registry Methods and Systems”, U.S. Appl. No. 13/486,276.
US Patent Application filed Jun. 10, 2020, entitled “Method and System for Communicating With and Controlling an Alarm System From a Remote Server”, U.S. Appl. No. 16/898,146.
3rd Generation Partnership Project; Technical Specification Group Services and System Aspects; Architecture enhancements to facilitate communications with packet data networks and applications, Mar. 2015, 3GPP TS 23.682 V12.3.0, pp. 8-10. (Year: 2015).
Chapter 6, Securing TCP/IP, pp. 135-164, Oct. 12, 2004.
K. Lee, D. Murray, D. Hughes and W. Joosen, “Extending sensor networks into the Cloud using Amazon Web Services,” 2010 IEEE International Conference on Networked Embedded Systems for Enterprise Applications, 2010.
US Patent Application filed Mar. 22, 2021, entitled “Premises Management Configuration and Control”, U.S. Appl. No. 17/208,866.
US Patent Application filed Apr. 8, 2021, entitled “System For Data Routing In Networks”, U.S. Appl. No. 17/301,605.
US Patent Application filed May 10, 2021, entitled “Management of a Security System at a Premises”, U.S. Appl. No. 17/316,402.
US Patent Application filed Jun. 9, 2021, entitled “Premises Management Configuration and Control”, U.S. Appl. No. 17/343,315.
US Patent Application filed Jun. 18, 2021, entitled “Controlling Data Routing Among Networks”, U.S. Appl. No. 17/304,342.
US Patent Application filed Jul. 26, 2021, entitled “Notification of Event Subsequent To Communication Failure With Security System”, U.S. Appl. No. 17/443,427.
US Patent Application filed Jul. 30, 2021, entitled “Gateway Integrated With Premises Security System”, U.S. Appl. No. 17/390,222.
US Patent Application filed Aug. 10, 2021, entitled “Media Content Management”, U.S. Appl. No. 17/398,939.
US Patent Application filed Aug. 16, 2021, entitled “Control System User Interface”, U.S. Appl. No. 17/403,526.
US Patent Application filed Aug. 23, 2021, entitled “Method and System for Providing Alternate Network Access”, U.S. Appl. No. 17/409,528.
US Patent Application filed Aug. 31, 2021, entitled “Networked Touchscreen With Integrated Interfaces”, U.S. Appl. No. 17/463,267.
US Patent Application filed Sep. 7, 2021, entitled “Gateway Registry Methods and Systems”, U.S. Appl. No. 17/468,188.
US Patent Application filed Sep. 8, 2021, entitled “User Interface in a Premises Network”, U.S. Appl. No. 17/469,417.
US Patent Application filed Sep. 9, 2021, entitled “Premises System Management Using Status Signal”, U.S. Appl. No. 17/470,732.
US Patent Application filed Oct. 25, 2021, entitled “Forming a Security Network Including Integrated Security System Components and Network Devices”, U.S. Appl. No. 17/510,022.
US Patent Application filed Nov. 15, 2021, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 17/526,915.
US Patent Application filed Nov. 15, 2021, entitled “Integrated Cloud System With Lightweight Gateway for Premises Automation”, U.S. Appl. No. 17/455,005.
US Patent Application filed Nov. 23, 2021, entitled “Security, Monitoring and Automation Controller Access and Use of Legacy Security Control Panel Information”, U.S. Appl. No. 17/534,088.
US Patent Application filed Dec. 3, 2021, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 17/542,302.
US Patent Application filed Dec. 3, 2021, entitled “Control System User Interface”, U.S. Appl. No. 17/457,463.
US Patent Application filed Dec. 3, 2021, entitled “Method and System for Managing Communication Connectivity”, U.S. Appl. No. 17/542,310.
US Patent Application filed Dec. 17, 2021, entitled “Cross-Client Sensor User Interface in an Integrated Security Network”, U.S. Appl. No. 17/644,935.
US Patent Application filed Dec. 23, 2021, entitled “Defining and Implementing Sensor Triggered Response Rules”, U.S. Appl. No. 17/645,889.
“Associate”. Merriaim-Webster.com Dictionary, Merriam-Webster, https://web.archive.org/web/20061209213742/https://www.merriam-webster.com/dictionary/associate. Dec. 9, 2006.
“Dragging” The Authoritative Dictionary of IEEE Standard Terms. 7th ed. 2000, p. 337.
“File”, The Authoritative Dictionary of IEEE Standard Terms. 7th ed. 2000, pp. 432.
“Indicate”. Merriam-Webster.com Dictionary, Merriam-Webster, https://web.archive.org/web/20061209080613/https://www.merriam-webster.com/dictionary/indicate. Dec. 9, 2006.
“Application” The Authoritative Dictionary of IEEE Standard Terms. 7th ed. 2000.
“Icon”, Newton's Telecom Dictionary, 21st ed., Mar. 2005.
“Modular programming”, The Authoritative Dictionary of IEEE Standard Terms. 7th ed. 2000.
“Windows”. Newton's Telecom Dictionary, 21st ed., Mar. 2005.
6270 Touch Screen Keypad Notes, Honeywell, Sep. 2006.
Alarm.com—Interactive Security Systems, Elders [retrieved on Nov. 4, 2003], 1 page.
Alarm.com—Interactive Security Systems, Frequently Asked Questions [retrieved on Nov. 4, 2003], 3 pages.
Alarm.com—Interactive Security Systems, Overview [retrieved on Nov. 4, 2003], 2 pages.
Alarm.com—Interactive Security Systems, Product Advantages [retrieved on Nov. 4, 2003], 3 pages.
AU application filed on Feb. 28, 2017, entitled “Control System User Interface”, 2017201365.
AU application filed on Mar. 8, 2017, entitled “Integrated Security Network with Security Alarm Signaling System”, 2017201585.
CA application filed on Aug. 15, 2017, entitled “Automation System User Interface”, 2976682.
CA application filed on Aug. 16, 2017, entitled “Automation System User Interface”, 2976802.
Condry M et al., Open Service Gateway architecture overview, Industrial Electronics Society, 1999, IECON '99 Proceedings, The 25th Annual Conference of the IEEE, San Jose, CA, USA, Nov. 29-Dec. 3, 1999, Piscataway, NJ, USA, IEEE, US, vol. 2, Nov. 29, 1999 (Nov. 29, 1999), pp. 735-742, XP010366642.
Control Panel Standard—Features for False Alarm Reduction, The Security Industry Association, SIA 2009, pp. 1-48.
CorAccess Systems, Companion 6 User Guide, Jun. 17, 2002.
Court action filed for U.S. Pat. No. 7,262,690; U.S. Pat. No. 7,911,341; U.S. Pat. No. 8,073,931; U.S. Pat. No. 8,335,842; U.S. Pat. No. 8,473,619; U.S. Pat. No. 8,478,844 in U.S. District Court, Eastern District of Virginia, Case No. 1:13-CV-00834, between iControl Networks, Inc. (Plaintiff) v. Alarm.com Incorporated et al. (Defendant) on Jul. 10, 2013.
Diaz Redondo, R. P., et al., Enhancing Residential Gateways: OSGI Service Composition, IEEE Transactions on Consumer Electronics, IEEE Service Center, New York, NY, US, vol. 53, No. 1, Feb. 1, 2007 (Feb. 1, 2007), pp. 87-95, XP011381790.
Elwahab et al., Device, System and . . . Customer Premises Gateways, Sep. 27, 2001, WO 01/71489.
EP application filed on Jun. 9, 2016, entitled, “Data Model for Home Automation”, 16808247.7.
EP application filed on Aug. 16, 2017, entitled, “Automation System User Interface”, 17186497.8.
EP examination report issued in EP08797646.0, dated May 17, 2017, 11 pages.
Examination Report under Section 18(3) for UK Patent Application No. GB0620362.4, dated Aug. 13, 2007.
Examination Report under Section 18(3) for UK Patent Application No. GB0724248.0, dated Jun. 4, 2008.
Examination Report under Section 18(3) for UK Patent Application No. GB0724248.0, dated Jan. 30, 2008.
Examination Report under Section 18(3) for UK Patent Application No. GB0724760.4, dated Jan. 30, 2008.
Examination Report under Section 18(3) for UK Patent Application No. GB0800040.8, dated Jan. 30, 2008.
Faultline, “AT&T Targets video home security as next broadband market”; Nov. 2, 2006; The Register; 2 Pages.
Final Office Action dated Aug. 1, 2011 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Final Office Action dated Jun. 1, 2009 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Final Office Action dated Jun. 5, 2012 for U.S. Appl. No. 12/771,071, filed Apr. 30, 2010.
Final Office Action dated May 9, 2013 for U.S. Appl. No. 12/189,780, filed Aug. 11, 2008.
Final Office Action dated May 9, 2013 for U.S. Appl. No. 12/952,080, filed Nov. 22, 2010.
Final Office Action dated Jan. 10, 2011 for U.S. Appl. No. 12/189,785, filed Aug. 11, 2008.
Final Office Action dated Jun. 10, 2011 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Final Office Action dated Jan. 13, 2011 for U.S. Appl. No. 12/189,780, filed Aug. 11, 2008.
Final Office Action dated Oct. 17, 2012 for U.S. Appl. No. 12/637,671, filed Dec. 14, 2009.
Final Office Action dated Sep. 17, 2012 for U.S. Appl. No. 12/197,958, filed Aug. 25, 2008.
Final Office Action dated Mar. 21, 2013 for U.S. Appl. No. 12/691,992, filed Jan. 22, 2010.
Final Office Action dated Jul. 23, 2013 for U.S. Appl. No. 13/531,757, filed Jun. 25, 2012.
Final Office Action dated Feb. 26, 2013 for U.S. Appl. No. 12/771,471, filed Apr. 30, 2010.
Final Office Action dated Jun. 29, 2012 for U.S. Appl. No. 12/539,537, filed Aug. 11, 2009.
Final Office Action dated Dec. 31, 2012 for U.S. Appl. No. 12/770,365, filed Apr. 29, 2010.
Final Office Action dated Oct. 31, 2012 for U.S. Appl. No. 12/771,624, filed Apr. 30, 2010.
Final Office Action dated Feb. 16, 2011 for U.S. Appl. No. 12/019,568, filed Jan. 24, 2008.
Final Office Action dated Jul. 12, 2010 for U.S. Appl. No. 12/019,554, filed Jan. 24, 2008.
US Patent Application filed Jan. 14, 2022, entitled “Mobile Premises Automation Platform”, U.S. Appl. No. 17/576,336.
US Patent Application filed Feb. 8, 2022, entitled “Server-Based Notification of Alarm Event Subsequent To Communication Failure With Armed Security System”, U.S. Appl. No. 17/650,324.
US Patent Application filed Mar. 10, 2022, entitled “Virtual Device Systems and Methods”, U.S. Appl. No. 17/691,774.
"Windows", Newton's Telecom Dictionary, 21st ed., Mar. 2005, pp. 937-938.
Wireless, Battery-Powered Smoke Detectors, Brochure, SafeNight Technology, Inc. Roanoke, VA, 1995.
WLS906 Photoelectric Smoke Alarm, Data Sheet, DSC Security Products, Ontario, Canada, Jan. 1998.
X10—ActiveHome, Home Automation Made Easy [retrieved on Nov. 4, 2003], 3 pages.
Yanni Zhai et al., Design of Smart Home Remote Monitoring System Based on Embedded System, 2011 IEEE 2nd International Conference on Computing, Control and Industrial Engineering, vol. 2, pp. 41-44.
Final Office Action dated Sep. 14, 2011 for U.S. Appl. No. 12/197,931, filed Aug. 25, 2008.
Foreign communication from a related counterpart application—International Preliminary Examination Report, App No. PCT/US02/14450, dated Mar. 2, 2004, 4 pgs.
Foreign communication from a related counterpart application—International Search Report, App No. PCT/US02/14450, dated Dec. 17, 2002, 6 pgs.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US05/08766,” dated May 23, 2006, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US08/72831,” datd Nov. 4, 2008, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US08/74246,” dated Nov. 14, 2008, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US08/74260,” dated Nov. 13, 2008, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US08/83254,” dated Jan. 14, 2009, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US09/53485,” dated Oct. 22, 2009, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US09/55559,” dated Nov. 12, 2009, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US10/50585,” dated Dec. 30, 2010, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US10/57674,” dated Mar. 2, 2011, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US11/34858,” dated Oct. 3, 2011, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US11/35994,” dated Sep. 28, 2011, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US11/53136,” dated Jan. 5, 2012, 2 pages.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion fo the International Searching Authority, or the Declaration for the Application No. PCT/US08/74260,” dated Nov. 13, 2008, 1 page.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion for the International Searching Authority, or the Declaration for the Application No. PCT/US08/74260,” dated Nov. 13, 2008, 1 page.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion ofthe International Searching Authority, or the Declaration for the Application No. PCT/US09/53485,” dated Oct. 22, 2009, 1 page.
Form PCT/ISA/220, "PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for the Application No. PCT/US05/08766," dated May 23, 2006, 1 page.
Form PCT/ISA/237, “PCT Written Opinion ofthe International Searching Authority for the Application No. PCT/US05/08766,” dated May 23, 2006, 5 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US08/72831,” dated Nov. 4, 2008, 6 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US08/74246,” dated Nov. 14, 2008, 6 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US09/53485,” dated Oct. 22, 2009, 8 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US09/55559,” dated Nov. 12, 2009, 6 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US10/50585,” dated Dec. 30, 2010, 7 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US10/57674,” dated Mar. 2, 2011, 6 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US11/34858,” dated Oct. 3, 2011, 8 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US11/35994,” dated Sep. 28, 2011, 11 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US11/53136,” dated Jan. 5, 2012.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority of the Application No. PCT/US08/83254,” dated Jan. 14, 2009, 7 pages.
Gateway Registry Methods and Systems, U.S. Appl. No. 13/486,276.
Genex OmniEye, http://www.genextech.com/prod01.htm, 1999, 5 pages.
Genex Technologies, Genex OmniEye, www.av-iq.com/avcat/images/documents/pdfs/omnieye%20nightwatchbrochure.pdf; webpage accessed Jan. 10, 2018.
Gong, Li, A Software architecture for open service gateways, Internet Computing, IEEE 5.1, Jan.-Feb. 2001, 64-70.
GrayElectronics, “Digitizing TV cameras on TCP/IP Computer Networks,” http://www.grayelectronics.com/default.htm, printed on Oct. 12, 1999 (2 pages).
GrayElectronics, http://www.grayelectronics.com; webpage accessed on Jan. 10, 2018.
GTI Genex Technologies, Inc., OmniEye™ Product Brochure, Sep. 14, 1999 (5 pages).
Gutierrez J.A., “On the Use of IEEE 802.15.4 to Enable Wireless Sensor Networks in Building Automation,” Personal, Indoor and Mobile Radio Communications (PIMRC), 15th IEEE International Symposium, 2004, vol. 3, pp. 1865-1869.
Indian Patent App. No. 10698/DELNP/2012, corresponds to WO2011/143273.
Indian Patent App. No. 3687/DELNP/2012, corresponds to WO2011/038409.
International Search Report for Application No. PCT/US13/48324, dated Jan. 14, 2014, 2 pages.
International Search Report for Application No. PCT/US2014/050548, dated Mar. 18, 2015, 4 pages.
J. David Eisenberg, SVG Essentials: Producing Scalable Vector Graphics with XML. O'Reilly & Associates, Inc., Sebastopol, CA 2002.
Lagotek Wireless Home Automation System, May 2006 [retrieved on Aug. 22, 2012].
Network Working Group, Request for Comments, H. Schulzrinne, Apr. 1998.
Non-Final Office Action dated Apr. 4, 2013 for U.S. Appl. No. 12/197,931, filed Aug. 25, 2008.
Non-Final Office Action dated Mar. 4, 2013 for U.S. Appl. No. 13/400,477, filed Feb. 20, 2012.
Non-Final Office Action dated May 5, 2010 for U.S. Appl. No. 12/189,780, filed Aug. 11, 2008.
Non-Final Office Action dated May 5, 2010 for U.S. Appl. No. 12/189,785, filed Aug. 11, 2008.
Non-Final Office Action dated Feb. 7, 2012 for U.S. Appl. No. 12/637,671, filed Dec. 14, 2009.
Non-Final Office Action dated Feb. 7, 2013 for U.S. Appl. No. 12/970,313, filed Dec. 16, 2010.
Non-Final Office Action dated Feb. 8, 2012 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Non-Final Office Action dated Apr. 9, 2012 for U.S. Appl. No. 12/771,624, filed Apr. 30, 2010.
Non-Final Office Action dated Dec. 9, 2008 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Non-Final Office Action dated Aug. 10, 2012 for U.S. Appl. No. 12/771,471, filed Apr. 30, 2010.
Non-Final Office Action dated Apr. 12, 2012 for U.S. Appl. No. 12/770,365, filed Apr. 29, 2010.
Non-Final Office Action dated Jul. 12, 2012 for U.S. Appl. No. 12/691,992, filed Jan. 22, 2010.
Non-Final Office Action dated Oct. 12, 2012 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Non-Final Office Action dated Sep. 12, 2012 for U.S. Appl. No. 12/952,080, filed Nov. 22, 2010.
Non-Final Office Action dated Jul. 13, 2010 for U.S. Appl. No. 12/019,568, filed Jan. 24, 2008.
Non-Final Office Action dated Nov. 14, 2012 for U.S. Appl. No. 13/531,757, filed Jun. 25, 2012.
Non-Final Office Action dated Sep. 14, 2010 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Non-Final Office Action dated Sep. 16, 2011 for U.S. Appl. No. 12/539,537, filed Aug. 11, 2009.
Non-Final Office Action dated Sep. 17, 2012 for U.S. Appl. No. 12/189,780, filed Aug. 11, 2008.
Non-Final Office Action dated Aug. 18, 2011 for U.S. Appl. No. 12/197,958, filed Aug. 25, 2008.
Non-Final Office Action dated Feb. 18, 2011 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Non-Final Office Action dated Jan. 18, 2012 for U.S. Appl. No. 12/771,071, filed Apr. 30, 2010.
Non-Final Office Action dated Jul. 21, 2010 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Non-Final Office Action dated Dec. 22, 2010 for U.S. Appl. No. 12/197,931, filed Aug. 25, 2008.
Non-Final Office Action dated Jul. 22, 2013 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Non-Final Office Action dated Jan. 26, 2012 for U.S. Appl. No. 12/019,568, filed Jan. 24, 2008.
Non-Final Office Action dated Nov. 26, 2010 for U.S. Appl. No. 12/197,958, filed Aug. 25, 2008.
Non-Final Office Action dated Jun. 27, 2013 for U.S. Appl. No. 12/019,568, filed Jan. 24, 2008.
Non-Final Office Action dated Dec. 30, 2009 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Non-Final Office Action dated May 30, 2008 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Non-Final Office Action dated Apr. 13, 2010 for U.S. Appl. No. 11/761,745, filed Jun. 12, 2007.
Non-Final Office Action dated Feb. 21, 2013 for U.S. Appl. No. 12/771,372, filed Apr. 30, 2010.
Non-Final Office Action dated Jan. 5, 2010 for U.S. Appl. No. 12/019,554, filed Jan. 24, 2008.
Non-Final Office Action dated May 23, 2013 for U.S. Appl. No. 13/104,932, filed May 10, 2011.
Non-Final Office Action dated May 23, 2013 for U.S. Appl. No. 13/104,936, filed May 10, 2011.
Notice of Allowance dated May 14, 2013 for U.S. Appl. No. 12/637,671, filed Dec. 14, 2009.
Notice of Allowance dated Oct. 25, 2012 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Oxford Dictionary, Definition of “application”, 2021, 2 pages (Year: 2021).
PCT Application filed on Jun. 9, 2016, entitled “Virtual Device Systems and Methods”, PCT/US2016/036674.
PCT Application filed on Jun. 29, 2016, entitled “Integrated Cloud System for Premises Automation”, PCT/US2016/040046.
PCT Application filed on Jun. 30, 2016, entitled “Integrated Cloud System with Lightweight Gateway for Premises Automation”, PCT/US2016/040451.
PCT Application filed on Jul. 7, 2016, entitled “Automation System User Interface with Three-Dimensional Display”, PCT/US2016/041353.
PCT Application filed on Aug. 16, 2016, entitled “Automation System User Interface”, PCT/US2016/047172.
PCT Application filed on Aug. 17, 2016, entitled “Automation System User Interface”, PCT/US2016/047262.
PCT Application filed on Oct. 13, 2016, entitled “Coordinated Control of Connected Devices in a Premise”, PCT/US2016/056842.
PCT Application filed on Nov. 17, 2016, entitled “Mobile Premises Automation Platform”, PCT/US2016/062519.
Requirement for Restriction/Election dated Jan. 22, 2013 for U.S. Appl. No. 13/104,932, filed May 10, 2011.
Requirement for Restriction/Election dated Jan. 22, 2013 for U.S. Appl. No. 13/104,936, filed May 10, 2011.
Requirement for Restriction/Election dated Oct. 24, 2012 for U.S. Appl. No. 12/750,470, filed Mar. 30, 2010.
Security For The Future, Introducing 5804B0—Advanced two-way wireless remote technology, Advertisement, ADEMCO Group, Syosset, NY, circa 1997.
Shang, Wei-lai, Study on Application of Embedded Intelligent Area System, Journal of Anyang Institute of Technology, vol. 9, No. 6, pp. 56-57 and 65.
South African Patent App. No. 2013/02668, corresponds to WO2012/040653.
Supplemental European Search Report for Application No. EP05725743.8 dated Sep. 14, 2010, 2 pages.
Supplementary European Search Report for Application No. EP10819658, dated Mar. 10, 2015, 2 pages.
US Patent Application filed Aug. 3, 2022, entitled “Premises Management Networking”, U.S. Appl. No. 17/817,210.
US Patent Application filed Aug. 11, 2022, entitled “Security Network Integrating Security System and Network Devices”, U.S. Appl. No. 17/819,083.
US Patent Application filed Jun. 1, 2022, entitled “Integrated Cloud System for Premises Automation”, U.S. Appl. No. 17/804,941.
US Patent Application filed Jun. 8, 2022, entitled “Methods and Systems for Data Communication”, U.S. Appl. No. 17/835,394.
US Patent Application filed Jun. 10, 2022, entitled “Media Content Management”, U.S. Appl. No. 17/838,046.
US Patent Application filed Jun. 10, 2022, entitled “Method, System and Apparatus for Automated Reporting of Account and Sensor Zone Information to a Central Station”, U.S. Appl. No. 17/806,341.
US Patent Application filed Jun. 22, 2022, entitled “Activation of Gateway Device”, U.S. Appl. No. 17/808,146.
US Patent Application filed Jun. 22, 2022, entitled “Automation System User Interface With Three-Dimensional Display”, U.S. Appl. No. 17/808,275.
US Patent Application filed Jun. 22, 2022, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 17/808,118.
US Patent Application filed Jul. 1, 2022, entitled “Forming a Security Network Including Integrated Security System Components”, U.S. Appl. No. 17/856,448.
US Patent Application filed Sep. 22, 2022, entitled “Forming a Security Network Including Integrated Security System Components and Network Devices”, U.S. Appl. No. 17/934,443.
US Patent Application filed Nov. 29, 2022, entitled “Communication Protocols Over Internet Protocol (IP) Networks”, U.S. Appl. No. 18/059,604.
US Patent Application filed Nov. 30, 2022, entitled “Custom Content for Premises Management”, U.S. Appl. No. 18/060,374.
US Patent Application filed Dec. 1, 2022, entitled “Controlling Data Routing in Premises Management Systems”, U.S. Appl. No. 18/073,514.
US Patent Application filed Apr. 4, 2022, entitled “Control System User Interface”, U.S. Appl. No. 17/712,911.
US Patent Application filed Apr. 6, 2022, entitled “Hardware Configurable Security, Monitoring and Automation Controller Having Modular Communication Protocol Interfaces”, U.S. Appl. No. 17/714,499.
US Patent Application filed Apr. 14, 2022, entitled “Premises Management Configuration and Control”, U.S. Appl. No. 17/659,259.
US Patent Application filed Apr. 14, 2022, entitled “Premises System Automation”, U.S. Appl. No. 17/721,192.
US Patent Application filed Apr. 18, 2022, entitled “Method and System for Processing Security Event Data”, U.S. Appl. No. 17/723,101.
US Patent Application filed Apr. 22, 2022, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 17/727,470.
US Patent Application filed May 4, 2022, entitled “Premises Management Configuration and Control”, U.S. Appl. No. 17/736,408.
US Patent Application filed May 16, 2022, entitled “Automation System With Mobile Interface”, U.S. Appl. No. 17/744,858.
US Patent Application filed May 23, 2022, entitled “Premise Management Systems and Methods”, U.S. Appl. No. 17/664,524.
US Patent Application filed May 1, 2023, entitled “Premises System Management Using Status Signal”, U.S. Appl. No. 18/310,294.
US Patent Application filed May 8, 2023, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 18/314,002.
US Patent Application filed May 8, 2023, entitled “Integrated Cloud System With Lightweight Gateway for Premises Automation”, U.S. Appl. No. 18/313,728.
US Patent Application filed May 8, 2023, entitled “Security Network Integrating Security System and Network Devices”, U.S. Appl. No. 18/313,817.
US Patent Application filed May 12, 2023, entitled “Virtual Device Systems and Methods”, U.S. Appl. No. 18/316,580.
Fujii et al., “Community security platform for individually maintained home computers: The Vigilante Network Project”, Proceedings of the 21st IEEE Instrumentation and Measurement Technology Conference, 2004, vol. 2, pp. 891-894.
Kobayashi et al., “Creating worldwide community safety with present technology and privacy protection: The e-JIKEI Network project”, Procedia-Social and Behavioral Sciences, 2010, vol. 2, pp. 6-13.
Prashyanusorn et al., “Sustainable tourism using security cameras with privacy protecting ability”, Journal of Information Security, 2010, vol. 1, pp. 68-73.
X. Li, R. Lu, X. Liang, X. Shen, J. Chen and X. Lin, “Smart community: an internet of things application,” in IEEE Communications Magazine, vol. 49, No. 11, pp. 68-75, Nov. 2011, doi: 10.1109/MCOM.2011.6069711. (Year: 2011).
U.S. Patent Application filed Jul. 13, 2023, entitled “Methods and Systems for Data Communication”, U.S. Appl. No. 18/351,636.
U.S. Patent Application filed Jul. 14, 2023, entitled “Bidirectional Security Sensor Communication for a Premises Security System”, U.S. Appl. No. 18/352,803.
U.S. Patent Application filed Jul. 21, 2023, entitled “Communication Protocols in Integrated Systems”, U.S. Appl. No. 18/356,337.
U.S. Patent Application filed Aug. 16, 2023, entitled “Mobile Premises Automation Platform”, U.S. Appl. No. 18/450,878.
U.S. Patent Application filed Aug. 25, 2023, entitled “Automation System With Mobile Interface”, U.S. Appl. No. 18/456,355.
Related Publications (1)
Number Date Country
20210200430 A1 Jul 2021 US
Provisional Applications (2)
Number Date Country
62205922 Aug 2015 US
62205872 Aug 2015 US
Divisions (1)
Number Date Country
Parent 15238864 Aug 2016 US
Child 17202279 US
Continuation in Parts (23)
Number Date Country
Parent 12189780 Aug 2008 US
Child 15238864 US
Parent 13531757 Jun 2012 US
Child 12189780 US
Parent 12197958 Aug 2008 US
Child 13531757 US
Parent 13334998 Dec 2011 US
Child 12197958 US
Parent 12539537 Aug 2009 US
Child 13334998 US
Parent 14645808 Mar 2015 US
Child 12539537 US
Parent 13104932 May 2011 US
Child 14645808 US
Parent 13104969 May 2011 US
Child 13104932 US
Parent 13929568 Jun 2013 US
Child 13104969 US
Parent 14704045 May 2015 US
Child 13929568 US
Parent 14704098 May 2015 US
Child 14704045 US
Parent 14704127 May 2015 US
Child 14704098 US
Parent 14628651 Feb 2015 US
Child 14704127 US
Parent 13718851 Dec 2012 US
Child 14628651 US
Parent 12972740 Dec 2010 US
Child 13718851 US
Parent 13954553 Jul 2013 US
Child 12972740 US
Parent 14943162 Nov 2015 US
Child 13954553 US
Parent 15177915 Jun 2016 US
Child 14943162 US
Parent 15177448 Jun 2016 US
Child 15177915 US
Parent 15196281 Jun 2016 US
Child 15177448 US
Parent 15198531 Jun 2016 US
Child 15196281 US
Parent 15204662 Jul 2016 US
Child 15198531 US
Parent 15237873 Aug 2016 US
Child 15204662 US