System, method and apparatus for remote monitoring

Abstract
A monitoring unit for security and automation in a premises is described. The monitoring unit uses remote configuration and control to enable monitoring of a premises. The monitoring unit provides multiple monitoring functions to respond to events within the space and alert a user at a remote device such as a smartphone. An image sensor provides a wide field of view which can be segmented on the mobile device to enable specific areas to be viewed, enabling the user to view the area and be alerted when an event occurs based upon a rules-based configuration.
Description
TECHNICAL FIELD

The present disclosure relates to monitoring systems and in particular to a web-based integrated monitoring system for small spaces.


BACKGROUND

Home security and monitoring systems require professional installation as well as professional management and monitoring. For small spaces such as apartments, the installation of a monitoring system is not practical due to the investment required to install the system and the on-going expense to monitor and maintain it. In addition, in the rental apartment market, landlords do not want the additional expense of a monitoring system and renters do not want to install a system they cannot take with them. Existing solutions require multiple components to be installed and provide limited controllability and monitoring capability. Accordingly, an improved monitoring system remains highly desirable.


INCORPORATION BY REFERENCE

Each patent, patent application, and/or publication mentioned in this specification is herein incorporated by reference in its entirety to the same extent as if each individual patent, patent application, and/or publication was specifically and individually indicated to be incorporated by reference.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1A-1F show views of a monitoring unit, under an embodiment.



FIG. 2A is a system overview of the monitoring system, under an embodiment.



FIG. 2B is a system overview of the monitoring system 200A, under an alternative embodiment.



FIG. 3 is a hardware block diagram of the monitoring system, under an embodiment.



FIG. 4 is a method of operation of the monitoring unit, under an embodiment.



FIG. 5 shows a block diagram of the monitoring unit firmware, under an embodiment.



FIGS. 6A-6B show an example monitoring system dashboard user interface on a mobile device, under an embodiment.



FIG. 7 is an example monitoring system night stand mode user interface on a mobile device, under an embodiment.



FIG. 8 is an example monitoring system status screen on a mobile device, under an embodiment.



FIG. 9 is an example live streaming user interface display on a mobile device, under an embodiment.



FIG. 10 is an example live streaming multi-view user interface display on a mobile device, under an embodiment.



FIG. 11 shows monitoring system accessory control on a live streaming user interface on a mobile device, under an embodiment.



FIGS. 12A-12C show control and scheduling screens for the monitoring unit on a mobile device, under an embodiment.



FIGS. 13A-13B show a user interface for configuring rules for the monitoring system unit on a mobile device, under an embodiment.



FIG. 14A is a front perspective view of the monitoring unit with a detachable stand, under an embodiment.



FIG. 14B is a rear perspective view of the monitoring unit with a detachable stand, under an embodiment.



FIG. 15A is a front perspective view of the monitoring unit with a detachable wall bracket, under an embodiment.



FIG. 15B is a rear perspective view of the monitoring unit with a detachable wall bracket, under an embodiment.





DETAILED DESCRIPTION

Embodiments of a monitoring system are described herein. The monitoring system is configured for use in smaller spaces, for example, but is not so limited. The architecture of the monitoring system includes numerous layers that operate in concert to provide security, notification, video streaming, and home automation functions, as described in detail herein.



FIGS. 1A-1F show views of a monitoring unit 100, under an embodiment. FIG. 1A shows a front view of the monitoring unit 100. The monitoring unit 100 integrates multiple devices and/or operations into a compact form factor to provide monitoring functions. The monitoring unit 100 comprises a lens (e.g., fish-eye lens, etc.) with a camera 102 that operates to view a wide target area from a single location. A motion sensor 104 is included to detect motion within the target area. The monitoring unit 100 of an embodiment includes one or more environmental sensors for monitoring parameters of the local environment. For example, the monitoring unit 100 includes an ambient light sensor 106 and/or a temperature and humidity sensor 114. The monitoring unit 100 also includes an indicator 108 (e.g., LED indicator, etc.) to provide visual status on the operation of the monitoring unit 100. A microphone 110, a speaker 112, and a siren 116 are also included to detect sounds in the environment, provide feedback, enable two-way communications, and produce alert sounds. A stand 118 is coupled or connected to the monitoring unit 100 but may be removed for wall-mounting applications.



FIG. 1B is a perspective view of the monitoring unit 100, under an embodiment. FIG. 1C is a rear view of the monitoring unit 100, under an embodiment. FIG. 1D is a side view of the monitoring unit 100, under an embodiment. FIG. 1E is a top view of the monitoring unit 100, under an embodiment. FIG. 1F is a bottom view of the monitoring unit 100, under an embodiment.



FIG. 2A is a system overview of the monitoring system 200, under an embodiment. The monitoring system 200 includes the monitoring unit 100 along with one or more additional components as appropriate to an installation of the monitoring unit 100. For example, the monitoring unit 100 of an embodiment is coupled to a wide area network (e.g., the Internet) via a Wi-Fi router and/or cellular data coupling or connection. The system also includes an application for installation on a user's smartphone or tablet, and a web application accessible from any computer. Additionally, the system 200 includes a cloud-based back-end supporting the mobile application, web application, and device firmware.



FIG. 2B is a system overview of the monitoring system 200A, under an alternative embodiment. The monitoring system 200A includes a plurality of monitoring units 100-1/100-N (collectively referred to as 100) along with one or more additional components as appropriate to an installation of the monitoring units 100. For example, the monitoring units 100 of an embodiment are coupled to a wide area network (e.g., the Internet) via a Wi-Fi router and/or cellular data coupling or connection. The system also includes an application for installation on a user's smartphone or tablet, and a web application accessible from any computer. Additionally, the system 200A includes a cloud-based back-end supporting the mobile application, web application, and device firmware of the monitoring units 100. The monitoring system 200A comprising a plurality of monitoring units is described in detail herein.


The monitoring unit is installed in a user's home (e.g., on the wall, using the provided wall-mount, on a flat surface, etc.) and is powered by a power source. The power source of an embodiment includes a Direct Current (DC) wall adapter, for example, but is not so limited. Battery-backup is available to ensure the security aspects of the system remain functional if there is a loss of power. The monitoring unit of an embodiment uses encryption for all outgoing and incoming data. A user can set up multiple monitoring units to monitor and secure several areas.


Users interact with the environment in which a monitoring unit is installed in a number of ways. The monitoring unit's mobile and web applications show current and historical environmental readings of its surroundings by reporting on temperature, humidity, ambient light and sound. This information is periodically uploaded by the monitoring unit and stored in the cloud infrastructure using a time-series database. It is presented using current values and graphs in the Vitals section of the mobile and web applications.
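
By way of a hedged illustration only (the endpoint, field names, and authentication scheme below are assumptions rather than the product's actual API), the periodic upload of environmental readings to a time-series store could resemble the following sketch:

    # Hypothetical sketch: periodically upload environmental readings as
    # timestamped records suitable for a time-series database. The endpoint,
    # field names, and token scheme are illustrative assumptions.
    import time
    import requests

    VITALS_ENDPOINT = "https://cloud.example.com/api/v1/vitals"  # assumed URL

    def read_sensors():
        # Placeholder for the firmware's sensor-polling service.
        return {"temperature_c": 21.4, "humidity_pct": 43.0,
                "ambient_light_lux": 180, "sound_level_db": 35}

    def post_vitals(unit_id, api_token, interval_s=300):
        while True:
            record = {"unit_id": unit_id,
                      "timestamp": int(time.time()),   # epoch seconds
                      "readings": read_sensors()}
            try:
                requests.post(VITALS_ENDPOINT, json=record,
                              headers={"Authorization": f"Bearer {api_token}"},
                              timeout=10)
            except requests.RequestException:
                pass  # a real unit would queue the record for a later retry
            time.sleep(interval_s)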


The monitoring unit's internal sensors provide a wealth of information or data about the device and its surroundings, for use in security and notification scenarios. The data provided includes but is not limited to information representing one or more of the following: unexpected motion within monitoring unit's field of view (FOV); temperature changes within the space; humidity changes within the space; physical movement of monitoring unit (due to vibrations or tampering); loud, unexpected sounds; and changes in ambient light.


The monitoring unit of an embodiment can be coupled or connected to various remote or peripheral devices or sensors. The monitoring unit includes a home area network (HAN) radio, or personal area network (PAN), to control and receive information from paired accessories such as sensors, switches, and dimmers. HAN-connected accessories can be controlled by way of security rules, a programmable schedule and internal sensor triggers such as ambient light and temperature. The HAN devices include but are not limited to IEEE 802.11 Wireless Local Area Network devices and IEEE 802.15 Wireless Personal Area Network devices, for example Wi-Fi, Zigbee and/or Z-Wave based devices.


The monitoring unit includes couplings or connections among a variety of remote components like remote sensors and other devices at the premises, and supports discovery, installation and configuration of the remote devices coupled or connected to the system, as described in detail herein. The monitoring unit uses this self-generated sub-network to discover and manage the remote devices at the premises. The monitoring unit thus enables or forms a separate wireless and/or wired network, or sub-network, that includes some number of devices and is coupled or connected to the LAN or WAN of the host premises. The monitoring unit sub-network can include, but is not limited to, any number of other devices like wired devices, wireless devices, sensors, cameras, actuators, interactive devices, WiFi devices, and security devices to name a few. The monitoring unit manages or controls the sub-network separately or privately from other communications and transfers data and information between components of the sub-network and the LAN and/or WAN, but is not so limited.


The monitoring unit also provides a coupling or connection to a central monitoring station (CMS) for remote monitoring of the premises. The data of an embodiment is provided to the CMS and to the remote device, but in other embodiments is provided to one of the remote device and the CMS. Under this embodiment, one or more monitoring units are coupled to the CMS via a network (e.g., one or more of WiFi, LAN, WAN, cellular, etc.). Alternatively, the monitoring units are coupled to the CMS via the network (e.g., WAN, cellular, etc.) and an intermediate server or device (e.g., remote server, etc.). In operation, the monitoring unit transmits collected data and information to the CMS based upon a user-selected state of the monitoring unit. The data transmitted by the monitoring unit includes data of the monitoring unit as well as data of and data received from devices coupled to the monitoring unit via the local sub-network. The monitoring unit automatically delivers data of one or more onboard and/or coupled devices to the CMS. The interactions and notifications between the monitoring unit and the remote CMS of an embodiment are controlled or managed by the mobile application running on the mobile device. As such, the user interface presented by the mobile application provides controls for enabling or disabling remote monitoring by the CMS; for example, a user can activate monitoring at the CMS via the mobile application when leaving town on a trip, and can deactivate CMS monitoring upon his/her return. The monitoring unit and remote server therefore provide a mechanism to activate and deactivate monitoring by the remote CMS.


An embodiment of this mechanism is an Application Programming Interface (API) using an interface technology such as REST or SOAP, for example, to send monitoring activation and deactivation messages to the CMS and to receive acknowledgements from the CMS. Other embodiments such as the client application, monitoring device, or remote server utilize the user selection to enable/disable the delivery of activity messages to the CMS, where the CMS is always available and uses the presence of messages to trigger monitoring periods. The current invention also anticipates the integration of the CMS billing system into the service to enable on-demand billing of monitoring services, and/or to offer time-based monitoring of the system (e.g. the CMS monitoring is active for a specific period of time).
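
As a non-limiting sketch of such an API under an embodiment, activation and deactivation messages might be sent to the CMS over REST as follows; the paths, payload fields, and acknowledgement format are illustrative assumptions rather than a defined interface:

    # Hypothetical REST sketch of the CMS activation/deactivation mechanism.
    # The base URL, payload fields, and response handling are assumptions; the
    # embodiment only requires that activation and deactivation messages be
    # sent to the CMS and acknowledged.
    import requests

    CMS_BASE = "https://cms.example.com/api/v1"  # assumed CMS endpoint

    def set_cms_monitoring(account_id, unit_id, active, api_token):
        """Send an activation (active=True) or deactivation message to the CMS."""
        payload = {"account": account_id, "unit": unit_id,
                   "monitoring": "active" if active else "inactive"}
        resp = requests.post(f"{CMS_BASE}/monitoring", json=payload,
                             headers={"Authorization": f"Bearer {api_token}"},
                             timeout=10)
        resp.raise_for_status()
        # The CMS acknowledges the request; the mobile application can then
        # reflect the new monitoring state in its user interface.
        return resp.json().get("acknowledged", False)

    # Example: enable CMS monitoring before leaving on a trip.
    # set_cms_monitoring("acct-123", "unit-7", True, token)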


Users can place the monitoring unit into one of a plurality of security modes (e.g., Home, Away, Vacation, Off) using the mobile application, thereby activating and deactivating the various security preferences (defined as rules). Other security modes, such as ‘CMS Monitoring mode’ for example, can be utilized as well to effectuate different behaviors for the device and/or for its monitoring. Rules and other configurations may be stored on the monitoring unit's firmware and as such do not require a centralized server environment. In another embodiment these rules and configurations are stored on a remote server or backed up to a remote server to facilitate replacement of a defective unit.


The monitoring unit's mobile application allows users to set rules for each security mode pertaining to notifications, home-automation actions and alarms based on a set of scenarios. Under a scenario, the monitoring unit's various sensors (both internal and externally paired) can alert a user to activity within their environment, using data from the sensors. The notification options of an embodiment include but are not limited to mobile push, SMS messages, telephone calls, and electronic mail to name but a few.


Under another scenario, sensors and their corresponding actions are configured by way of the mobile application. The monitoring unit can also leverage the use of externally paired HAN sensors to drive actions and notifications. The HAN sensors can include one or more of thermostats, door sensors, actuators, door locks, garage openers, window sensors, light dimmers or switches, to name a few.


The monitoring unit under yet another scenario allows rules associated with sensors (whether externally-paired or internal) to control connected appliances by way of paired HAN dimmers and switches. Furthermore, the monitoring unit can control the state of HAN-connected appliances by way of a configurable schedule, based on time and/or sunrise/sunset based on installed location.


The monitoring unit allows a user to set up notification-only rules that are outside the scope of any security modes. These rules can result in mobile push notifications derived from the same sensors that trigger security mode rules.


The monitoring unit can alert the surrounding environment to a potential breach of security by way of a very loud siren, driven by rules associated with sensors, both internal and externally paired. The siren can also be triggered by external parties such as the CMS and/or the user from a remote device. This capability allows a remote entity to interact with the device to warn occupants or deter an intruder. Moreover, the monitoring unit's system receives weather and other environmental information. This can influence rules and also provide additional environmental status to the mobile and web applications.


The monitoring unit allows users to connect to a live stream of video and audio via their mobile application or any remote device. This video is captured and streamed using a very wide field-of-view (FOV) lens, allowing the user to electronically pan, tilt, and zoom within their space. Additionally, multiple angles of the captured live stream can be viewed at once, in a segmented fashion. Each segment represents a distinct view of the monitoring unit's surroundings and the direction and zoom level chosen by the user are retained when a user returns to the live stream.


Conventional video cameras using a wide-angle lens and local on-camera ‘de-warping’ removed the distortion imparted by the wide-angle lens locally on the camera processor to produce a flattened image, and then streamed portions or all of the flattened image to a remote device. In these systems the remote device displayed the de-warped video, but had no ability to simulate the raw video data being presented by the lens. These conventional systems therefore were optimized for lower-end remote devices that were not capable of advanced video processing.


In contrast to these conventional camera technologies, the monitoring unit described herein comprises ‘Immersive 3D video streaming’, which transmits lens-warped video data collected at the camera to the remote device where it is de-warped. In an embodiment, the raw lens-warped video data collected at the camera is transmitted or streamed to the remote device in a highly compressed format; in various alternative embodiments the warped video data collected at the camera can be clipped or processed in some manner before being compressed and transmitted or streamed. Regardless of any pre-processing technique applied to the video data collected at the camera, the embodiments described herein transmit or stream warped video data from the monitoring unit to the remote device, and the remote device performs de-warping of the video data. However, other alternative embodiments can de-warp the video data at the monitoring unit prior to transmitting the data stream to a remote device.


The local processor of the remote device manipulates the received video data to provide an optimal user experience. A key distinction in this approach is the ability to rely upon the high performance video decoding and three-dimensional (3D) manipulation capabilities present in state of the art remote devices, which include but are not limited to smart phones, tablet computers, personal computers, and other mobile and/or portable processor-based devices. Generally, the immersive 3D video streaming process executing on the remote device decodes and decrypts the video stream to a raw video frame buffer, creates a 3D space that emulates the specific lens geometry of the camera, and maps the video frame buffer to the 3D space providing an ‘immersive 3D video view’ that allows the remote device to zoom, pan, and move around in the 3D space giving the perception of being ‘inside the lens looking around’.


The monitoring unit of an embodiment generates an Immersive 3D video stream using components comprising a lens with a wide-angle geometry, as described in detail herein, that ‘warps’ or distorts the video to obtain the wide-angle view. The monitoring unit includes an image encoder that encodes the video image into a compressed streaming format. The monitoring unit of an embodiment stores the compressed streaming format with warped video to a local storage device coupled to the monitoring unit. Alternatively, the monitoring unit stores the compressed streaming format with warped video at a remote server or other remote processing component or memory to which it is coupled via a network coupling (e.g., LAN, WAN, Internet, etc.). Alternative embodiments may use other devices (e.g., a local Digital Video Recorder (DVR), etc.) to accomplish the process of encoding, compression, and storage separately from the monitoring device itself.


The monitoring unit streams the compressed video to a remote device (e.g., smart phones, tablet computers, personal computers, other mobile and portable processor-based devices, etc.). The monitoring unit of an embodiment streams the compressed video directly to the remote device. Alternatively, however, the monitoring unit streams the compressed video to the remote device via an intermediate server (e.g., relay server, intermediate DVR, etc.).


The remote device decompresses the received compressed video stream. The remote device decompresses the video stream and then further processes the resulting decompressed video images using data of the camera lens geometry, more specifically the wide-angle geometry (de-warping) of the camera lens. For example, the remote device of an embodiment decompresses the video stream using a software codec (e.g., FFmpeg) executing on a processor. The remote device of an alternative embodiment decompresses the video stream using a hardware codec. The remote device uses 3D rendering technology to map the warped video to a 3D space replicating the lens geometry. The data of the lens geometry used by the remote device to process the received video stream is used by the mobile application, under an embodiment, and is one or more of received dynamically from a remote server or monitoring unit, included in a mapping table at the mobile device, and known a priori, but is not so limited. In an alternative embodiment the lens geometry is specified as part of the data interchange associated with the video feed setup. In yet another alternative embodiment the mobile application stores data of a plurality of known lens geometries associated with the camera types supported by the application.


The remote device ‘maps’ the decompressed warped image to the 3D space representing the lens geometry and displays this 3D view using a display that is a component of or coupled to the remote device. The remote device includes a user interface that enables a user to ‘move’ around the environment of the monitoring unit by panning and zooming around the 3D space and the mapped video image. The user interface of an embodiment is generated by the mobile application, but is not so limited. The user interface of an embodiment enables a user to navigate the 3D space using pinching gestures and swiping gestures when the remote device includes a touchscreen display. Additionally, the remote device enables playing of and interacting with stored video clips generated by the monitoring unit, where the stored video clips are stored at local storage or remote server in the same way.


By way of example in an embodiment, the process for a remote device to receive the Immersive 3D video stream from the monitoring unit comprises the remote device creating a tunnel (e.g., Secure Sockets Layer (SSL), etc.) to the monitoring unit by coupling or connecting to an external port that was configured by the monitoring unit (e.g., using Universal Plug and Play (UPnP), etc.). The monitoring unit encodes the raw video image (e.g., 1280×1070) using an encoder application (e.g., H.264 with High Profile, etc.). The monitoring unit sends the encoded video stream to the mobile device using the tunnel (e.g., sends video stream using RTP tunneled in RTSP).


The mobile device of this example embodiment includes a multimedia data library (e.g., FFmpeg library, etc.) that decodes packets of the video stream (e.g., Real-time Transport Protocol (RTP) packets of an H.264 stream) to an image buffer (e.g., a YUV color space image) in memory. The size of the memory buffer of an example is 1280×1070, but is not so limited. The mobile device creates a virtual surface (e.g., Open Graphics Library (OpenGL)) through an API for rendering graphics. The virtual surface of this example embodiment is YUV for rendering the pan/zoomed image, where the image size is based on the mobile device, but the embodiment is not so limited. The mobile device user interface includes controls (e.g., pinch with fingers, zoom with fingers, etc.) for selecting a position of the rendered image on the display of the mobile device. Based on the selected position on the image, the mobile device takes a portion of the decoded image (e.g., YUV) and executes a de-warping and scaling algorithm (e.g., OpenGL) to produce a rendered subset image into the image surface.
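
The following sketch illustrates, in simplified CPU form, the de-warping geometry described above: it assumes an equidistant fisheye lens model (r = f·θ) and renders a pan/tilt/zoom perspective sub-view from the decoded (still warped) frame using nearest-neighbor sampling. A production implementation would perform the equivalent mapping on the GPU (e.g., as an OpenGL texture lookup); the lens model, base field of view, and parameter names here are assumptions for illustration only:

    # Minimal CPU sketch of the de-warp step, assuming an equidistant fisheye
    # model (r = f * theta). The input is the decoded frame as a NumPy array
    # (e.g., a grayscale or RGB conversion of the YUV buffer).
    import numpy as np

    def dewarp(fisheye, yaw=0.0, pitch=0.0, zoom=1.0,
               out_size=(480, 640), lens_fov_deg=180.0):
        """Render a perspective sub-view of a warped fisheye frame.

        fisheye: HxW (or HxWx3) array holding the decoded, still-warped frame.
        yaw/pitch: viewing direction in radians; zoom > 1 narrows the view.
        """
        src_h, src_w = fisheye.shape[:2]
        cx, cy = src_w / 2.0, src_h / 2.0
        # Equidistant model: the image radius corresponds to half the lens FOV.
        f = (min(src_w, src_h) / 2.0) / np.radians(lens_fov_deg / 2.0)

        out_h, out_w = out_size
        view_fov = np.radians(90.0) / zoom          # assumed base view of 90 deg
        focal = (out_w / 2.0) / np.tan(view_fov / 2.0)

        # Rays through each output pixel (virtual camera looks down +z).
        u, v = np.meshgrid(np.arange(out_w) - out_w / 2.0,
                           np.arange(out_h) - out_h / 2.0)
        rays = np.stack([u, v, np.full_like(u, focal)], axis=-1)
        rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

        # Rotate the rays by pitch (about x) and then yaw (about y).
        rx = np.array([[1, 0, 0],
                       [0, np.cos(pitch), -np.sin(pitch)],
                       [0, np.sin(pitch),  np.cos(pitch)]])
        ry = np.array([[ np.cos(yaw), 0, np.sin(yaw)],
                       [0, 1, 0],
                       [-np.sin(yaw), 0, np.cos(yaw)]])
        rays = rays @ (ry @ rx).T

        # Project the rays into the fisheye image with the equidistant model.
        theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))
        phi = np.arctan2(rays[..., 1], rays[..., 0])
        r = f * theta
        map_x = np.clip(cx + r * np.cos(phi), 0, src_w - 1).astype(int)
        map_y = np.clip(cy + r * np.sin(phi), 0, src_h - 1).astype(int)
        return fisheye[map_y, map_x]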


Users interact with their premises during a live-streaming session by controlling appliances and speaking into their space through the monitoring unit's built-in two-way audio functionality, which streams live audio from the mobile application to the monitoring unit's speaker. The monitoring unit supports two-way voice sessions under numerous embodiments. For example, a remote device of an embodiment initiates a two-way voice session with one or more monitoring units at a premises. Similarly, a third-party monitoring station initiates a two-way voice session with one or more monitoring units at a premises. Additionally, the monitoring unit provides live video contemporaneously with or as a component of the two-way voice session with the remote device and/or third party monitoring station. The two-way voice sessions include sessions over a WAN (e.g., Internet Protocol (IP) via WiFi, etc.) and/or sessions over a cellular network (e.g., cellular voice, IP data, etc.), but are not so limited.


The monitoring unit can record video of events associated with rules triggered by internal and externally paired sensors. The monitoring unit of an embodiment continuously records video and audio in a loop, enabling it to report on an event by presenting footage before and after it occurs. Users can review these recorded events on the mobile and web applications and perform electronic pan, tilt, and zoom operations within the captured video, as though they were streaming it in real time.


The monitoring unit records a longer video if subsequent events happen in rapid succession. A single video is created which encapsulates the several events in question. The system understands that this particular video maps to several events, and vice-versa.


The monitoring unit allows users to record video in an on-demand fashion, triggered from the mobile and web applications. As with event-driven footage, users can perform electronic pan, tilt, and zoom operations within the captured on-demand video recordings via the mobile and web applications. The monitoring unit also includes smart sound detection, allowing captured loud sounds to be fed to characterization software which helps identify them (e.g., a smoke alarm, barking dog, etc.).


The monitoring unit periodically transmits a heartbeat signal or message to the cloud or other network infrastructure, allowing the user to be notified when the monitoring unit disconnects from the Internet. The user is also notified when the monitoring unit reconnects and continues to post heartbeat information. If connectivity issues arise and events trigger video recording, the monitoring unit saves videos locally and queues them for later upload, when connectivity resumes.
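
A minimal sketch of the heartbeat task and the offline video queue is shown below; the endpoints, directory, and timing values are assumptions and are not part of the described interface:

    # Hypothetical sketch of the heartbeat task and the local queue used when
    # connectivity is lost. Endpoint URLs and paths are illustrative assumptions.
    import os
    import time
    import requests

    HEARTBEAT_URL = "https://cloud.example.com/api/v1/heartbeat"  # assumed
    UPLOAD_URL = "https://cloud.example.com/api/v1/videos"        # assumed
    PENDING_DIR = "/var/spool/monitor/pending"                    # assumed

    def send_heartbeat(unit_id):
        try:
            requests.post(HEARTBEAT_URL,
                          json={"unit_id": unit_id, "ts": int(time.time())},
                          timeout=5)
            return True
        except requests.RequestException:
            return False

    def heartbeat_loop(unit_id, interval_s=60):
        while True:
            if send_heartbeat(unit_id):
                flush_pending_videos()   # connectivity restored: upload backlog
            time.sleep(interval_s)

    def queue_video_locally(path):
        # Called when an event recording finishes while the unit is offline.
        os.makedirs(PENDING_DIR, exist_ok=True)
        os.rename(path, os.path.join(PENDING_DIR, os.path.basename(path)))

    def flush_pending_videos():
        # Upload any recordings that were queued while connectivity was down.
        if not os.path.isdir(PENDING_DIR):
            return
        for name in sorted(os.listdir(PENDING_DIR)):
            full = os.path.join(PENDING_DIR, name)
            with open(full, "rb") as fh:
                requests.post(UPLOAD_URL, files={"clip": fh}, timeout=60)
            os.remove(full)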


The cloud infrastructure of an embodiment comprises one or more of a number of components. The components include, for example, front-end web services that expose or include an Application Programming Interface (API) for both the mobile application and the monitoring unit firmware. This public traffic is encrypted and authentication is performed against strong firmware and user credentials. Additionally, back-end databases include user account information and settings, and monitoring unit configuration and time-series sensor and event data. Back-end in-memory databases house real-time sensor history, cached data and heartbeat information.


The cloud infrastructure components also include notification services and worker services. The notification services send information to users by way of direct and third-party assisted methods. These include email, mobile push notifications, SMS and voice calls. The worker services process uploaded content, check in-memory real-time data for connected devices' heartbeats, trigger notifications, start on-demand recordings for users and perform additional infrastructure and product-related functions. File storage services provide fast and reliable disk space for the entire infrastructure.
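
As an illustrative sketch of the notification fan-out only (the channel senders below are stubs standing in for email, push, SMS, and voice providers, and the data shapes are assumptions), a single event could be dispatched over each channel a user has enabled as follows:

    # Hypothetical notification-service sketch: one event is fanned out over
    # every channel the user has enabled. The senders are placeholders.
    def send_email(address, text):     print(f"email to {address}: {text}")
    def send_push(device_token, text): print(f"push to {device_token}: {text}")
    def send_sms(number, text):        print(f"SMS to {number}: {text}")
    def place_call(number, text):      print(f"voice call to {number}: {text}")

    CHANNELS = {"email": send_email, "push": send_push,
                "sms": send_sms, "voice": place_call}

    def notify(user_prefs, event_text):
        """user_prefs maps channel name -> destination, e.g. {'sms': '+1555...'}."""
        for channel, destination in user_prefs.items():
            sender = CHANNELS.get(channel)
            if sender:
                sender(destination, event_text)

    # Example:
    # notify({"push": "device-abc", "email": "user@example.com"},
    #        "Motion detected in Living Room at 14:02")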


Additionally, the cloud infrastructure components include infrastructure backup services, multi-site disaster recovery for database, worker and web services, and redundancy for each component. High availability for all worker and web services is provided by way of application-level clustering and load balancing for incoming web service requests from mobile apps and firmware.


Multiple monitoring units are installed at various independent locations at a premises, and when so installed function as a distributed premises security system, under an embodiment and as described herein. The monitoring units of this collective installation automatically discover and couple to each other and share data over a device network (e.g., IP, WiFi, wired connection, Z-Wave, etc.) that is separate and independent from a LAN and/or WAN to which the monitoring units are coupled at the premises. In an embodiment the monitoring units utilize the LAN (Wifi, Ethernet, etc.) to couple to each other and share data. In another embodiment the monitoring units utilize a WAN (such as a cellular or broadband network) to couple to each other and share data. In yet another embodiment the monitoring devices are installed in physically remote locations (such as a home and an office) and are coupled via a WAN, but can still share data and form a distributed security network. The monitoring units thereby combine logically to form an integrated monitoring or security network at the premises or between premises. Each monitoring unit includes an automatic installation process for adding and removing itself from this integrated network. The monitoring units are configured to repeat at least one message between devices in the integrated security network formed by the coupling of the devices. When the installation includes multiple monitoring units, the collection of monitoring units are controlled or monitored from a single mobile application and are associated and managed with a single user account. Similarly, the collection of monitoring units is coupled to a remote service and monitored at the CMS.
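
One possible realization of the automatic discovery step between monitoring units on a shared LAN is a simple UDP broadcast announcement, sketched below; the port, message format, and account check are assumptions, and other embodiments may rely on the discovery mechanisms native to the device network in use (e.g., Z-Wave, WiFi, etc.):

    # Hypothetical discovery sketch: each unit broadcasts its presence and
    # listens for announcements from peers on the same user account.
    import json
    import socket
    import time

    DISCOVERY_PORT = 49152  # assumed port

    def announce(unit_id, account_id):
        """Broadcast this unit's presence so peers can add it to the network."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        msg = json.dumps({"unit": unit_id, "account": account_id,
                          "ts": int(time.time())}).encode()
        sock.sendto(msg, ("255.255.255.255", DISCOVERY_PORT))
        sock.close()

    def listen_for_peers(account_id, timeout_s=30):
        """Collect announcements from peer units associated with the same account."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", DISCOVERY_PORT))
        sock.settimeout(timeout_s)
        peers = {}
        try:
            while True:
                data, addr = sock.recvfrom(1024)
                info = json.loads(data.decode())
                if info.get("account") == account_id:
                    peers[info["unit"]] = addr[0]   # unit id -> IP address
        except socket.timeout:
            pass
        finally:
            sock.close()
        return peers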



FIG. 3 is a hardware block diagram of the monitoring system, under an embodiment. The monitoring unit's hardware architecture comprises but is not limited to one or more of the following components: ARM System-on-chip 302; DDR Memory 304; Flash storage 306; Home area network RF module with antenna 308; Wi-Fi (or local area network technology) module with antenna 310; Cellular data module with antenna 312 if provisioned; Camera system comprising a multi-megapixel CMOS sensor and very wide FOV lens 314; Audio system comprising a microphone 316 and speaker 318; Alarm siren 320; Passive infrared (PIR) motion sensor with a very wide FOV Fresnel lens 322; Temperature sensor 324; Relative humidity sensor 326; Accelerometer 328; Ambient light sensor 330; Power system 332 comprising a DC jack 334 and battery compartment 336; RGB LED indicator 338; Push-button 340.



FIG. 4 is a method of operation of the monitoring unit, under an embodiment. The monitoring unit's firmware is based upon an operating system such as Linux, for example, but is not so limited. Specialized software or applications along with this operating system provide the services, API and functionality for setup and use of the monitoring unit's features in concert with the cloud infrastructure and mobile and web applications.


During the user's initial setup of the monitoring unit, the following tasks are performed by the firmware:

    • a. The monitoring unit's firmware boots.
    • b. Since no existing device information is present, the monitoring unit creates a Wi-Fi access point for setup functions.
    • c. User launches the mobile application and, after creating an account using their information, begins the setup process.
    • d. User connects to monitoring unit's Wi-Fi access point and submits Wi-Fi credentials for their home network.
    • e. The monitoring unit attempts to connect with the home network using the provided Wi-Fi credentials.
    • f. The monitoring unit registers itself to the cloud back-end, associates with the current user and attempts to open ports on the user's Internet router (for incoming connections) using Universal Plug and Play (UPnP) or Network Address Translation (NAT) Port Mapping Protocol (NAT-PMP), depending on the type of router present.
    • g. Once fully connected, the monitoring unit turns off its Wi-Fi access point and begins normal operation.
    • h. In the cases where a new Wi-Fi router is present, the monitoring unit has moved to a new environment, or connectivity to the existing router fails, the monitoring unit can accept new Wi-Fi credentials in a similar fashion to the initial setup process.


Embodiments described herein include a setup or enrollment process that comprises determining geolocation of the monitoring unit during installation at the premises. The monitoring unit of an embodiment incorporates a Wifi module (processor and radio (802.11)), and during enrollment the monitoring unit puts the WiFi module into ‘Access Point mode’. The mobile device running the mobile application described in detail herein enrolls as a WiFi client to the monitoring unit access point. The mobile application then provides new WiFi credentials (e.g., service set identification (SSID), password (optional), etc.) to the monitoring unit via the Wifi access point; subsequently, the mobile application automatically switches the mobile device over to the same WiFi SSID, or the user manually switches the mobile device to that SSID using a network configuration utility. Upon receipt of the new WiFi credentials, the monitoring unit automatically switches its WiFi processor to enroll as a client at the new WiFi SSID (using the optional password). Either the monitoring unit or the mobile application initiates a process to store the WiFi credentials on a remote server or other remote device. The monitoring unit of an embodiment restores the WiFi credentials from a remote server, but the remote server of an alternative embodiment initiates restoration of the Wifi credentials of the monitoring unit.
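
From the mobile application's point of view, the credential hand-off over the temporary access point could resemble the following sketch; the setup address, path, and payload fields are assumptions rather than the actual interface of the monitoring unit:

    # Hypothetical enrollment sketch: the mobile application, having joined the
    # monitoring unit's temporary WiFi access point, posts the home-network
    # credentials to the unit. Address, path, and fields are assumptions.
    import requests

    SETUP_ADDR = "http://192.168.4.1"  # assumed address of the unit in AP mode

    def provision_wifi(ssid, password=None, setup_token=None):
        """Send home-network credentials to the unit over its setup access point."""
        payload = {"ssid": ssid}
        if password:
            payload["password"] = password
        resp = requests.post(f"{SETUP_ADDR}/setup/wifi", json=payload,
                             headers={"X-Setup-Token": setup_token or ""},
                             timeout=10)
        resp.raise_for_status()
        # On success the unit switches its WiFi radio from access-point mode to
        # client mode and joins the provided SSID; the mobile application then
        # switches the phone back to the same SSID.
        return resp.json().get("status") == "joining"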


The mobile application of an embodiment provides numerous operations, but is not so limited. For example, the mobile application provides a user interface that enables a user to switch the monitoring unit to the access point mode in order to change the SSID. The mobile application provides authentication directly to the camera (e.g. username, password, etc.). Alternatively, the mobile application provides authentication against a remote server.


The mobile application provides to one or more monitoring units location information corresponding to the monitoring unit installation, where the location information corresponding to the monitoring unit is location data determined at the mobile device. The monitoring unit then provides its location data to the remote server. Alternatively, the mobile application provides the location data of the monitoring unit installation directly to a remote server or other remote device. The monitoring unit of an embodiment includes an administrative tool that provides information about numerous monitoring units and their respective physical locations.


In an alternative embodiment the monitoring unit is temporarily coupled or connected via a physical connector (e.g. a USB cable) to a mobile device running the mobile application. In this embodiment the mobile application delivers the Wifi SSID and password over the wired connection, and the monitoring device then switches to the Wifi access point as described above.


Generally, the monitoring unit's operating state comprises but is not limited to the following:

    • a. Sensor polling is running and receiving raw data from sensors,
    • b. The rules engine is running and can interface with sensors,
    • c. The audio and video service and RTSP server are running and are ready to accept incoming connections, record footage in a loop and detect loud sounds,
    • d. The PIR motion sensor service is running and able to detect movement within the monitoring unit's FOV,
    • e. Automated tasks run at their pre-defined intervals and perform, but are not limited to, one or more of the following: maintain contact or communication between the monitoring unit and the cloud back-end and ensure incoming ports remain open on the user's Internet router; check for updates to the monitoring unit's firmware; post status updates about the current environment around the monitoring unit; post heartbeats periodically to inform the cloud backend of the monitoring unit's state.


      Sensors and Rules


The sensor polling service reads from internal sensors (e.g., temperature, humidity, ambient light, acceleration/motion, etc.) and sends the data to the rules engine. It can also receive a signal from any other part of the firmware to force an immediate read of the sensors. All sensor data is sent to the rules engine.


The PIR motion sensor service reads from the PIR software driver directly, but is not so limited. The motion sensor, which implements a Bessel filter in order to eliminate false positives, issues a message to the Rules engine if a threshold for motion is exceeded.
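
A simplified illustration of this filtering step is sketched below: a low-pass Bessel filter smooths the raw PIR samples before thresholding, suppressing short spikes that would otherwise register as motion. The filter order, cutoff, sample rate, and threshold are assumptions chosen only for illustration:

    # Illustrative sketch of the PIR filtering step described above.
    import numpy as np
    from scipy.signal import bessel, lfilter

    def motion_detected(pir_samples, sample_rate_hz=10.0,
                        cutoff_hz=1.0, threshold=0.5):
        """Return True if the filtered PIR signal exceeds the motion threshold."""
        b, a = bessel(N=4, Wn=cutoff_hz / (sample_rate_hz / 2.0),
                      btype="low", analog=False)
        filtered = lfilter(b, a, np.asarray(pir_samples, dtype=float))
        return bool(np.max(np.abs(filtered)) > threshold)

    # A brief single-sample spike is attenuated by the filter, while a
    # sustained rise (genuine motion) still crosses the threshold and causes a
    # message to be issued to the rules engine.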


When loud sound above a predefined threshold is detected, a signal is passed to the rules engine. When appropriate, the loud sound in question is passed through characterization software to help identify it (e.g., a smoke alarm, barking dog, etc.).


The rules engine loads a list of rules for notifications for the current security mode (home, away, or vacation) from a database into memory. The rules engine also loads any HAN control rules that can be controlled via schedule, ambient light, temperature or any other sensors, be they internal or external. Notification-only rules are processed in parallel to mode-based security rules.


The rules engine saves the data with a timestamp in the monitoring unit's firmware database. The data is also sent to each active rule/control in order to determine what action, if any, should be taken (e.g. turn on an appliance, sound the siren, notify the user etc.).
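
A minimal sketch of this dispatch loop follows; the rule structure, mode names, and action hooks are illustrative assumptions rather than the firmware's actual data model:

    # Hypothetical rules-engine sketch: each incoming reading is persisted and
    # then evaluated against every rule active in the current security mode.
    import time

    RULES = [
        # Each rule: active security modes, sensor it watches, trigger test, actions.
        {"modes": {"away", "vacation"}, "sensor": "motion",
         "test": lambda v: v is True,
         "actions": ["record_video", "notify_user", "sound_siren"]},
        {"modes": {"home", "away", "vacation"}, "sensor": "temperature_c",
         "test": lambda v: v > 35.0,
         "actions": ["notify_user"]},
    ]

    ACTIONS = {
        "record_video": lambda ctx: print("recording clip for", ctx),
        "notify_user":  lambda ctx: print("push notification:", ctx),
        "sound_siren":  lambda ctx: print("siren on"),
    }

    def process_reading(current_mode, sensor, value, history):
        """Persist the timestamped reading, then evaluate every matching rule."""
        history.append((time.time(), sensor, value))   # stands in for the firmware DB
        for rule in RULES:
            if current_mode in rule["modes"] and rule["sensor"] == sensor \
                    and rule["test"](value):
                for action in rule["actions"]:
                    ACTIONS[action]({"sensor": sensor, "value": value})

    # Example: process_reading("away", "motion", True, history=[])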


Audio and Video


The audio and video service is responsible for streaming media, saving to a file and detecting loud sounds.


For saving footage to a file, audio and video are encoded and placed into a circular buffer spanning a configurable period of time. This enables the monitoring unit to capture video and audio from before an event has occurred. This buffer operates whenever the system is on, but is not so limited.
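
A sketch of such a pre-event buffer is shown below: encoded segments are pushed into a fixed-length queue so that, when a rule fires, the unit already holds the footage leading up to the event. The segment length and buffer depth are assumptions:

    # Hypothetical pre-event buffer sketch for the audio/video service.
    from collections import deque

    class PreEventBuffer:
        def __init__(self, seconds_before=30, segment_seconds=1):
            self.segment_seconds = segment_seconds
            self.buffer = deque(maxlen=seconds_before // segment_seconds)

        def push_segment(self, encoded_segment):
            """Called continuously by the encoder; old segments fall off the end."""
            self.buffer.append(encoded_segment)

        def snapshot(self):
            """When an event fires, return the buffered 'before' footage; the
            recorder then appends live segments for the 'after' portion."""
            return list(self.buffer)

    # Usage: the audio/video service calls push_segment() once per segment in a
    # loop; the rules engine calls snapshot() at the moment an event triggers.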


For streaming, video and audio are encoded and served via RTP/RTSP to a user's mobile application. The streaming is encrypted and supports multiple clients at once.



FIG. 5 shows a block diagram of the monitoring unit firmware, under an embodiment. The mobile application is the user's interface to the monitoring unit. The mobile application is executed on a smartphone or other personal or mobile electronic device. Within the application, the user's account is created, security and notification rules are defined, environmental readings are displayed, live streaming takes place and other settings are submitted to the cloud back-end and the monitoring unit's firmware. The application also serves as a tool to set up the monitoring unit's hardware, enabling the monitoring unit to pair with the user's home Wi-Fi network.


Key functions are accessed from the application's tab bar at the bottom of the screen, once the user has logged into their account.



FIGS. 6A-6B show an example monitoring system dashboard user interface on a mobile device, under an embodiment. The Dashboard provides an at-a-glance view of the monitoring unit's system and provides access to one or more of the following functions: the monitoring unit's current security mode; the temperature near the monitoring unit; the monitoring unit's Wi-Fi signal strength; current status of HAN-connected accessories; weather alerts, if present; access to Events, Recordings, and Settings; access to the Night Stand.



FIG. 7 is an example monitoring system Night Stand mode user interface on a mobile device, under an embodiment. The user can activate the Night Stand mode (available from the Dashboard), providing access to the various HAN-connected control accessories, a clock, weather information, and a panic button. Pressing the panic button activates the monitoring unit's siren.



FIG. 8 is an example monitoring system status screen on a mobile device, under an embodiment. The Vitals section displays monitoring unit's internal sensor readings and external weather information in an easy-to-understand format. Historical information is displayed using graphs allowing the user to see trends for each reading.



FIG. 9 is an example live streaming user interface display on a mobile device, under an embodiment. Tapping or selecting the center icon of the mobile application's tab bar launches live streaming and couples or connects the user securely to the monitoring unit's built-in wide FOV camera. The user can then pan, tilt, and zoom the live stream of monitoring unit's surroundings, control connected HAN plug-in modules (e.g. turning on a light) and stream audio from the microphone of their mobile device to the user's space by way of monitoring unit's built-in two-way audio feature.



FIG. 10 is an example live streaming multi-view user interface display on a mobile device, under an embodiment. Users can also choose to switch to a multi-view version of the live stream, which allows them to look at different areas of their space at the same time. This is achieved by presenting the user with several smaller views of the pan, tilt, and zoom video. The video image is segmented allowing multiple areas within the field of view of the camera to be isolated. The image captured by the camera has a wide-angle viewing area, such as that provided by a fish-eye lens. The image may be de-warped and post-processed to provide a more viewable image. The monitoring unit remembers the last-used settings of the live stream, including the direction the user was looking and the zoom level of the video.



FIG. 11 shows monitoring system accessory control on a live streaming user interface on a mobile device, under an embodiment. If the user has paired HAN control accessories to monitoring unit, they can be accessed and controlled from within the live video screen. This allows a user to turn an appliance on or off and see the results in real time, if desired.



FIGS. 12A-12C show control and scheduling screens for the monitoring unit on a mobile device, under an embodiment. The Controls section allows configuration and control of HAN-connected accessories. These accessories can be configured via one or more of a timed schedule, sunrise/sunset, ambient light level, and temperature.



FIGS. 13A-13B show a user interface for configuring rules for the monitoring system unit on a mobile device, under an embodiment. The rules section allows the user to set security-related actions for motion, loud sound and temperature change triggers. Actions can be set for each security mode and include but are not limited to one or more of the following: record video and audio of the event; notifications; push message; electronic mail; phone call; SMS message; notification to a user's Trusted Circle members; the sounding of the monitoring unit's built-in siren; control of any connected HAN switches.


Additionally, notification-only options are present which allow the user to be informed of events outside the scope of the current security mode.


Additional functionality may be provided by the camera, such as motion detection and ambient light detection. The processor may use image processing to determine characteristics of the image for use in motion detection, face recognition or light detection. In addition, the microphone may be used to provide a voice recognition function for the monitoring unit.



FIG. 14A is a front perspective view of the monitoring unit with a detachable stand, under an embodiment. FIG. 14B is a rear perspective view of the monitoring unit with a detachable stand, under an embodiment.



FIG. 15A is a front perspective view of the monitoring unit with a detachable wall bracket, under an embodiment. FIG. 15B is a rear perspective view of the monitoring unit with a detachable wall bracket, under an embodiment.


Embodiments described herein include a monitoring unit comprising a camera. The monitoring unit comprises a network interface. The monitoring unit comprises a processor coupled to the camera and the network interface. The monitoring unit comprises at least one application executing on the processor. The processor receives an image from the camera. The processor receives sensor data from at least one sensor coupled to the processor. The processor generates an alert based upon a change in at least one of the image and the sensor data. The alert is sent via the network interface to a mobile device.


Embodiments described herein include a monitoring unit comprising: a camera; a network interface; a processor coupled to the camera and the network interface; and at least one application executing on the processor, wherein the processor receives an image from the camera, wherein the processor receives sensor data from at least one sensor coupled to the processor, wherein the processor generates an alert based upon a change in at least one of the image and the sensor data, wherein the alert is sent via the network interface to a mobile device.


The monitoring unit of an embodiment comprises at least one memory device coupled to the processor.


The monitoring unit of an embodiment comprises at least one communication module coupled to the processor.


The at least one communication module comprises a home area network (HAN) radio frequency (RF) module.


The at least one communication module comprises a Wi-Fi module.


The executing of the at least one application generates an enrollment process.


The enrollment process automatically places the WiFi module into an Access Point mode.


The mobile device comprises a mobile application, wherein the mobile application enrolls as a client to the Access Point.


The mobile application provides WiFi credentials to the processor via the Access Point.


At least one of the mobile application and the processor initiate storage of the WiFi credentials on a remote server.


At least one of the remote server and the processor restore the WiFi credentials from the remote storage device.


The mobile application provides authentication against at least one of the processor and a remote server.


The processor automatically switches the WiFi module to enroll as a client using the WiFi credentials.


The mobile application automatically switches the mobile device to enroll using the WiFi credentials.


The mobile application provides a user interface that includes at least one control for switching the processor to the Access Point mode to change the WiFi credentials.


The mobile application provides to a device location information corresponding to installation of the monitoring unit.


The device comprises a remote server.


The device comprises at least one of the monitoring unit and at least one additional monitoring unit.


The monitoring unit of an embodiment comprises an administrative application that provides information about at least one monitoring unit that includes the location information.


The at least one communication module comprises a local area network (LAN) module.


The at least one communication module comprises a cellular data module.


The at least one communication module is coupled to a remote device to communicate with the remote device.


The at least one communication module is coupled to a remote device to communicate over an Internet Protocol (IP) channel.


The at least one communication module is coupled to a remote device to communicate over a cellular channel.


The communication comprises a two-way voice session with the remote device.


The communication comprises a data session, wherein video images are transmitted during the data session.


The remote device comprises the mobile device.


The remote device comprises a central monitoring station.


The communication module automatically establishes a coupling with the at least one sensor.


The communication module automatically establishes a coupling with a local area network (LAN) at the premises.


The at least one application transfers data between at least one device on the LAN.


The communication module forms a sub-network at the premises.


The sub-network is a private network.


The at least one sensor is coupled to the sub-network.


Devices couple to the sub-network and communicate over the sub-network, wherein the devices include at least one of wireless devices, wired devices, and IP devices.


The monitoring unit of an embodiment comprises a remote server including a user account coupled to the processor.


The camera comprises an image sensor and a lens.


The camera comprises a lens including a wide-angle geometry, wherein the camera generates images including warped images, wherein the camera generates the images using a wide-angle view mapped to the geometry of the lens from collected images.


The camera comprises an encoder that encodes collected images to generate a processed data stream.


The processed data stream is a compressed video stream.


The monitoring unit of an embodiment comprises memory coupled to the camera, wherein the camera stores to the memory the processed data stream that includes warped video.


The memory is local to the camera.


The memory is remote to the camera.


The camera streams the processed data stream to a remote device, wherein the remote device comprises at least one of a mobile device and a server.


The camera streams the processed data stream directly to the remote device.


The camera streams the processed data stream to the remote device via at least one intermediary device.


The remote device processes the processed data stream using knowledge of a wide-angle geometry of the lens.


The processing comprises decompressing the processed data stream.


The remote device comprises a software codec, wherein the software codec decompresses the processed data stream.


The remote device comprises a hardware codec, wherein the hardware codec decompresses the processed data stream.


The processing comprises using three-dimensional (3D) rendering and mapping warped video to a 3D space representing at least a portion of the lens geometry.


The processing comprises displaying a 3D view of the collected images via a display coupled to the remote device.


The remote device comprises a user interface comprising control gestures for navigating around the 3D view presented via the display.


The navigating comprises at least one of panning and zooming around the 3D view presented via the display, wherein the control gestures comprise at least one of pinching gestures and swiping gestures.


The camera comprises at least one of a video camera and an imaging camera.


The camera comprises a CMOS sensor and very wide FOV lens.


The monitoring unit of an embodiment comprises an audio system.


The monitoring unit of an embodiment comprises an alarm siren.


The at least one sensor comprises a motion sensor.


The motion sensor comprises a passive infrared (PIR) motion sensor with a very wide FOV Fresnel lens.


The at least one sensor comprises an environmental sensor.


The environmental sensor comprises at least one of a temperature sensor and a humidity sensor.


The at least one sensor comprises an accelerometer.


The at least one sensor comprises an ambient light sensor.


The monitoring unit of an embodiment comprises a power system.


The monitoring unit of an embodiment comprises at least one indicator coupled to the processor.


The at least one application generates at least one notification.


The at least one notification comprises one or more of a push message, an electronic mail, a telephone call, a Short-Message-Service (SMS) message, and a notification to at least one contact.


The monitoring unit is coupled to one or more accessories.


The accessories are controlled by at least one of a timed schedule, a sunrise/sunset event, an ambient light level, and a temperature.


The monitoring unit of an embodiment comprises a rules engine executing on the processor.


The monitoring unit of an embodiment comprises a mobile application installed on the mobile device.


The mobile application generates a user interface presented on the mobile device, wherein the user interface provides access to at least one of the image and the sensor data.


The at least one application is at least one of accessed and controlled using the user interface.


The at least one sensor is controlled via the user interface.


The monitoring unit of an embodiment comprises at least one actuator coupled to the processor, wherein the at least one actuator is controlled via the user interface.


The monitoring unit of an embodiment comprises a heartbeat signal generated by the processor and transmitted to a remote device.


The monitoring unit of an embodiment comprises at least one remote server coupled to the network interface.


The coupling comprises at least one of a wide area network and a cellular network.


The at least one remote server comprises a central monitoring station.


The processor transmits to the central monitoring station at least one of the image and the sensor data.


The processor transmits to the central monitoring station a message comprising information representing at least one of the image and the sensor data.


The mobile device comprises a mobile application.


The mobile application comprises an interface for enabling and disabling remote monitoring by the central monitoring station.


The mobile application comprises an interface for controlling characteristics of the message and transmission of the message.


Embodiments described herein include a monitoring unit comprising a plurality of sensors. The plurality of sensors includes an image sensor. The monitoring unit comprises a network interface. The monitoring unit comprises a processor coupled to the plurality of sensors and the network interface. The monitoring unit comprises at least one application executing on the processor. The processor receives sensor data from the plurality of sensors. The processor generates an alert based upon a change in the sensor data. The alert is sent via the network interface to a mobile device associated with a user.


Embodiments described herein include a monitoring unit comprising: a plurality of sensors, wherein the plurality of sensors include an image sensor; a network interface; a processor coupled to the plurality of sensors and the network interface; and at least one application executing on the processor, wherein the processor receives sensor data from the plurality of sensors, wherein the processor generates an alert based upon a change in the sensor data, wherein the alert is sent via the network interface to a mobile device associated with a user.


Embodiments described herein include a system for remote monitoring. The system comprises a monitoring unit at a premises. The monitoring unit comprises a processor coupled to a plurality of sensors. The plurality of sensors includes an image sensor. The processor includes at least one application executing on the processor. The processor receives sensor data from the plurality of sensors and generates monitoring unit data. The system includes a server and a database located remote to the premises and coupled to the monitoring unit via a network coupling. The server receives the sensor data and the monitoring unit data and stores the sensor data and the monitoring unit data in the database. The server provides access to the sensor data and the monitoring unit data via a mobile device.


Embodiments described herein include a system for remote monitoring, the system comprising: a monitoring unit at a premises, the monitoring unit comprising a processor coupled to a plurality of sensors, wherein the plurality of sensors include an image sensor, wherein the processor includes at least one application executing on the processor, wherein the processor receives sensor data from the plurality of sensors and generates monitoring unit data; and a server and a database located remote to the premises and coupled to the monitoring unit via a network coupling, wherein the server receives the sensor data and the monitoring unit data and stores the sensor data and the monitoring unit data in the database, wherein the server provides access to the sensor data and the monitoring unit data via a mobile device.


The processor generates an alert based upon a change in at least one of the sensor data and the monitoring unit data, wherein the alert is sent to the mobile device.


The system of an embodiment comprises at least one communication module coupled to the processor.


The at least one communication module comprises a home area network (HAN) radio frequency (RF) module.


The at least one communication module comprises a Wi-Fi module.


The executing of the at least one application generates an enrollment process.


The enrollment process automatically places the WiFi module into an Access Point mode.


The mobile device comprises a mobile application, wherein the mobile application enrolls as a client to the Access Point.


The mobile application provides WiFi credentials to the processor via the Access Point.


At least one of the mobile application and the processor initiate storage of the WiFi credentials on the server.


At least one of the server and the processor restore the WiFi credentials from the server.


The mobile application provides authentication against at least one of the processor and a server.


The processor automatically switches the WiFi module to enroll as a client using the WiFi credentials.


The mobile application automatically switches the mobile device to enroll using the WiFi credentials.


The mobile application provides a user interface that includes at least one control for switching the processor to the Access Point mode to change the WiFi credentials.
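
For illustration only, the following Python sketch outlines one possible ordering of the Wi-Fi enrollment steps described above: the unit starts in Access Point mode, the mobile application joins that access point and supplies the home network credentials, the credentials are optionally backed up to a server so they can later be restored, and the unit then joins the home network as a Wi-Fi client. The helper names (start_access_point, wait_for_credentials, connect_as_client, backup_server) are assumptions, not the disclosed implementation.

# Hypothetical enrollment sketch; the wifi driver, local http_server endpoint,
# and backup_server client are stand-ins for whatever the unit actually uses.
import json

def enroll_monitoring_unit(wifi, http_server, backup_server=None):
    # Enter Access Point mode so the mobile application can reach the unit
    # before the unit has any home-network credentials.
    wifi.start_access_point(ssid="MONITOR-SETUP", passphrase=None)

    # Block until the mobile application, enrolled as a client of the access
    # point, posts the home network credentials (SSID and passphrase).
    request = http_server.wait_for_credentials()
    credentials = json.loads(request.body)

    if backup_server is not None:
        # Store the credentials remotely so they can later be restored.
        backup_server.put("/units/credentials", credentials)

    # Leave Access Point mode and enroll as an ordinary Wi-Fi client.
    wifi.stop_access_point()
    wifi.connect_as_client(credentials["ssid"], credentials["passphrase"])
    return credentials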


The mobile application provides to a device location information corresponding to installation of the monitoring unit.


The device comprises the server.


The device comprises at least one of the monitoring unit and at least one additional monitoring unit.


The system of an embodiment comprises an administrative application that provides information about at least one monitoring unit that includes the location information.


The at least one communication module comprises a local area network (LAN) module.


The at least one communication module comprises a cellular data module.


The at least one communication module is coupled to a remote device to communicate with the remote device.


The at least one communication module is coupled to a remote device to communicate over an Internet Protocol (IP) channel.


The at least one communication module is coupled to a remote device to communicate over a cellular channel.


The communication comprises a two-way voice session with the remote device.


The communication comprises a data session, wherein video images are transmitted during the data session.


The remote device comprises the mobile device.


The remote device comprises a central monitoring station.


The communication module automatically establishes a coupling with the plurality of sensors.


The communication module automatically establishes a coupling with a local area network (LAN) at the premises.


The at least one application transfers data to and from at least one device on the LAN.


The communication module forms a sub-network at the premises.


The sub-network is a private network.


The plurality of sensors are coupled to the sub-network.


Devices couple to the sub-network and communicate over the sub-network, wherein the devices include at least one of wireless devices, wired devices, and IP devices.


The server includes a user account.


The image sensor comprises a camera including a lens.


The camera comprises a lens including a wide-angle geometry, wherein the camera generates images including warped images, wherein the camera generates the images using a wide-angle view mapped to the geometry of the lens from collected images.


The camera comprises an encoder that encodes collected images to generate a processed data stream.


The processed data stream is a compressed video stream.


The system of an embodiment comprises memory coupled to the camera, wherein the camera stores to the memory the processed data stream that includes warped video.


The memory is local to the camera.


The memory is remote to the camera.


The camera streams the processed data stream to a remote device, wherein the remote device comprises at least one of the mobile device and the server.


The camera streams the processed data stream directly to the remote device.


The camera streams the processed data stream to the remote device via at least one intermediary device.


The remote device processes the processed data stream using knowledge of a wide-angle geometry of the lens.


The processing comprises decompressing the processed data stream.


The remote device comprises a software codec, wherein the software codec decompresses the processed data stream.


The remote device comprises a hardware codec, wherein the hardware codec decompresses the processed data stream.


The processing comprises using three-dimensional (3D) rendering and mapping warped video to a 3D space representing at least a portion of the lens geometry.


The processing comprises displaying a 3D view of the collected images via a display coupled to the remote device.


The remote device comprises a user interface comprising control gestures for navigating around the 3D view presented via the display.


The navigating comprises at least one of panning and zooming around the 3D view presented via the display, wherein the control gestures comprise at least one of pinching gestures and swiping gestures.
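
For illustration only, the following NumPy sketch shows one way a remote device could render a pannable, zoomable view from warped wide-angle video using knowledge of the lens geometry, as described above. It assumes an equidistant fisheye model (r = f * theta) and nearest-neighbor sampling; the real lens geometry, codec handling, and GPU-based 3D rendering are not reproduced here, and the function name dewarp_view is hypothetical. Pinch and swipe gestures on the mobile device would simply adjust the zoom, yaw, and pitch arguments between frames.

# Hypothetical dewarping sketch for an equidistant fisheye lens (r = f * theta).
import numpy as np

def dewarp_view(fisheye, f_fish, yaw, pitch, zoom, out_size=(480, 640)):
    """fisheye: HxWx3 warped frame; f_fish: fisheye focal length in pixels;
    yaw, pitch: view direction in radians; zoom: virtual focal length in pixels."""
    h_out, w_out = out_size
    h, w = fisheye.shape[:2]
    cx, cy = w / 2.0, h / 2.0

    # Rays through each pixel of the virtual perspective camera.
    xs = np.arange(w_out) - w_out / 2.0
    ys = np.arange(h_out) - h_out / 2.0
    x, y = np.meshgrid(xs, ys)
    z = np.full_like(x, float(zoom))
    rays = np.stack([x, y, z], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the rays by the requested pan (yaw about Y) and tilt (pitch about X).
    cy_, sy_ = np.cos(yaw), np.sin(yaw)
    cp_, sp_ = np.cos(pitch), np.sin(pitch)
    rot_y = np.array([[cy_, 0.0, sy_], [0.0, 1.0, 0.0], [-sy_, 0.0, cy_]])
    rot_x = np.array([[1.0, 0.0, 0.0], [0.0, cp_, -sp_], [0.0, sp_, cp_]])
    rays = rays @ (rot_y @ rot_x).T

    # Map each ray back into the warped image using the lens geometry.
    theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))
    phi = np.arctan2(rays[..., 1], rays[..., 0])
    u = np.clip(cx + f_fish * theta * np.cos(phi), 0, w - 1).astype(int)
    v = np.clip(cy + f_fish * theta * np.sin(phi), 0, h - 1).astype(int)
    return fisheye[v, u]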


The camera comprises at least one of a video camera and an imaging camera.


The camera comprises a CMOS sensor and a very wide FOV lens.


The system of an embodiment comprises an audio system coupled to the processor.


The system of an embodiment comprises an alarm siren coupled to the processor.


The plurality of sensors comprises a motion sensor.


The motion sensor comprises a passive infrared (PIR) motion sensor with a very wide FOV Fresnel lens.


The plurality of sensors comprises an environmental sensor.


The environmental sensor comprises at least one of a temperature sensor and a humidity sensor.


The plurality of sensors comprises an accelerometer.


The plurality of sensors comprises an ambient light sensor.


The at least one application generates at least one notification corresponding to the alert.


The at least one notification comprises a notification to at least one contact.


The at least one notification comprises one or more of a push message, an electronic mail, a telephone call, and a Short-Message-Service (SMS) message.


The monitoring unit is coupled to one or more accessories.


The accessories are controlled by the monitoring unit data.


The accessories are controlled by a schedule.


The system of an embodiment comprises a rules engine executing on the processor.
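
For illustration only, the following Python sketch shows a simple rules engine of the kind described above: each rule pairs a predicate over incoming sensor or monitoring unit data with one or more actions, such as notifying a contact by push message or SMS, or controlling an accessory. The Rule and RulesEngine names and the example predicate are hypothetical.

# Hypothetical rules-engine sketch executing on the processor.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    predicate: Callable[[Dict], bool]                       # condition over an event
    actions: List[Callable[[Dict], None]] = field(default_factory=list)

class RulesEngine:
    def __init__(self, rules: List[Rule]):
        self.rules = rules

    def evaluate(self, event: Dict) -> None:
        # Fire every action of every rule whose predicate matches the event.
        for rule in self.rules:
            if rule.predicate(event):
                for action in rule.actions:
                    action(event)

# Example wiring: notify a contact when motion is detected in low ambient light.
def push_notify(event): print("push:", event)
def sms_notify(event): print("sms:", event)

engine = RulesEngine([
    Rule(
        name="motion-at-night",
        predicate=lambda e: e.get("sensor") == "motion" and e.get("value")
                            and e.get("ambient_light", 1.0) < 0.1,
        actions=[push_notify, sms_notify],
    ),
])
engine.evaluate({"sensor": "motion", "value": True, "ambient_light": 0.02})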


The system of an embodiment comprises a mobile application installed on the mobile device.


The mobile application generates a user interface presented on the mobile device, wherein the user interface provides access to at least one of the image and the sensor data.


At least one of the server, the database, and the at least one application are at least one of accessed and controlled using the user interface.


The plurality of sensors are controlled via the user interface.


The system of an embodiment comprises at least one actuator coupled to the processor, wherein the at least one actuator is controlled via the user interface.


The system of an embodiment comprises a heartbeat signal generated by the processor and transmitted to at least one of the server and the mobile device.
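
For illustration only, the following Python sketch shows a heartbeat of the kind described above as a periodic, timestamped message posted to a server, so that a missed beat can itself be noticed by the server or the mobile device. The endpoint URL, payload fields, and period are assumptions.

# Hypothetical heartbeat sketch; the URL and payload are placeholders.
import json
import time
import urllib.request

def send_heartbeat(unit_id, url="https://example.invalid/heartbeat"):
    payload = json.dumps({"unit": unit_id, "ts": time.time()}).encode("utf-8")
    request = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(request, timeout=5)

def heartbeat_loop(unit_id, period_s=30.0):
    while True:
        try:
            send_heartbeat(unit_id)
        except OSError:
            pass  # transient network failure; the receiver notices the missed beat
        time.sleep(period_s)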


The system of an embodiment comprises at least one remote server coupled to at least one of the monitoring unit, the server, and the mobile device.


The coupling comprises at least one of a wide area network and a cellular network.


The at least one remote server comprises a central monitoring station.


The processor transmits to the central monitoring station the monitoring unit data.


The processor transmits to the central monitoring station a message comprising information representing the monitoring unit data.


The mobile device comprises a mobile application.


The mobile application comprises an interface for enabling and disabling remote monitoring by the central monitoring station.


The mobile application comprises an interface for controlling characteristics of the message and transmission of the message.


The system of an embodiment comprises at least one additional monitoring unit at the premises.


The at least one additional monitoring unit is physically separated at the premises from the monitoring unit.


The at least one additional monitoring unit is coupled to the monitoring unit.


The coupling includes at least one of a wired coupling, wireless coupling, WiFi coupling, and IP coupling.


The system of an embodiment comprises forming an integrated security network at the premises by logically combining the at least one additional monitoring unit and the monitoring unit.


At least one of the monitoring unit and the at least one additional monitoring unit comprise an automatic installation process that automatically controls at least one of adding and removing a monitoring unit to the integrated network.


The system of an embodiment comprises a central monitoring station coupled to at least one of the server, the monitoring unit, and the at least one additional monitoring unit.


The monitoring unit and the at least one additional monitoring unit are monitored and controlled from the mobile device.


The server comprises a user account that corresponds to the monitoring unit and the at least one additional monitoring unit.


Embodiments described herein include a system for remote monitoring. The system comprises a monitoring unit at a premises. The monitoring unit comprises a processor coupled to a camera and a network interface. The processor includes at least one application executing on the processor. The processor receives monitoring unit data that includes images from the camera and sensor data from one or more sensors. The system includes a server located remote to the premises and coupled to the monitoring unit via the network interface. The server is coupled to a database. The server receives monitoring unit data from the monitoring unit and stores the monitoring unit data in the database. The server provides access to the monitoring unit data via a mobile device associated with a user.


Embodiments described herein include a system for remote monitoring, the system comprising: a monitoring unit at a premises, the monitoring unit comprising a processor coupled to a camera and a network interface, wherein the processor includes at least one application executing on the processor, wherein the processor receives monitoring unit data that includes images from the camera and sensor data from one or more sensors; and a server located remote to the premises and coupled to the monitoring unit via the network interface, wherein the server is coupled to a database, wherein the server receives monitoring unit data from the monitoring unit and stores the monitoring unit data in the database, wherein the server provides access to the monitoring unit data via a mobile device associated with a user.


Embodiments described herein include a system for remote monitoring. The system comprises a monitoring unit at a premises. The monitoring unit comprises a processor coupled to a plurality of sensors. The plurality of sensors includes an image sensor. The processor includes at least one application executing on the processor. The processor receives sensor data from the plurality of sensors and generates monitoring unit data. The system includes a server and a database located remote to the premises and coupled to the monitoring unit via a network coupling. The server receives the sensor data and the monitoring unit data and stores the sensor data and the monitoring unit data in the database. The server provides access to the sensor data and the monitoring unit data via a mobile device.


Embodiments described herein include a system for remote monitoring, the system comprising: a monitoring unit at a premises, the monitoring unit comprising a processor coupled to a plurality of sensors, wherein the plurality of sensors include an image sensor, wherein the processor includes at least one application executing on the processor, wherein the processor receives sensor data from the plurality of sensors and generates monitoring unit data; and a server and a database located remote to the premises and coupled to the monitoring unit via a network coupling, wherein the server receives the sensor data and the monitoring unit data and stores the sensor data and the monitoring unit data in the database, wherein the server provides access to the sensor data and the monitoring unit data via a mobile device.


Although certain methods, apparatus, computer readable memory, and articles of manufacture have been described herein, the scope of coverage of this disclosure is not limited thereto. To the contrary, this disclosure covers all methods, apparatus, computer readable memory, and articles of manufacture fairly falling within the scope of the appended claims either literally or under the doctrine of equivalents.


Although the following discloses example methods, systems, and apparatus including, among other components, software executed on hardware, it should be noted that such methods, systems, and apparatus are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of these hardware and software components could be embodied exclusively in hardware, exclusively in software, exclusively in firmware, or in any combination of hardware, software, and/or firmware. Accordingly, while the following describes example methods and apparatus, persons having ordinary skill in the art will readily appreciate that the examples provided are not the only way to implement such methods, systems, and apparatus.


Computer networks suitable for use with the embodiments described herein include local area networks (LAN), wide area networks (WAN), Internet, or other connection services and network variations such as the world wide web, the public internet, a private internet, a private computer network, a public network, a mobile network, a cellular network, a value-added network, and the like. Computing devices coupled or connected to the network may be any microprocessor-controlled device that permits access to the network, including terminal devices, such as personal computers, workstations, servers, minicomputers, mainframe computers, laptop computers, mobile computers, palm-top computers, hand-held computers, mobile phones, TV set-top boxes, or combinations thereof. The computer network may include one or more LANs, WANs, Internets, and computers. The computers may serve as servers, clients, or a combination thereof.


The embodiments described herein can be a component of a single system, multiple systems, and/or geographically separate systems. The embodiments described herein can also be a subcomponent or subsystem of a single system, multiple systems, and/or geographically separate systems. The embodiments described herein can be coupled to one or more other components (not shown) of a host system or a system coupled to the host system.


One or more components of the embodiments described herein and/or a corresponding system or application to which the embodiments described herein is coupled or connected includes and/or runs under and/or in association with a processing system. The processing system includes any collection of processor-based devices or computing devices operating together, or components of processing systems or devices, as is known in the art. For example, the processing system can include one or more of a portable computer, portable communication device operating in a communication network, and/or a network server. The portable computer can be any of a number and/or combination of devices selected from among personal computers, personal digital assistants, portable computing devices, and portable communication devices, but is not so limited. The processing system can include components within a larger computer system.


The processing system of an embodiment includes at least one processor and at least one memory device or subsystem. The processing system can also include or be coupled to at least one database. The term “processor” as generally used herein refers to any logic processing unit, such as one or more central processing units (CPUs), digital signal processors (DSPs), application-specific integrated circuits (ASICs), etc. The processor and memory can be monolithically integrated onto a single chip, distributed among a number of chips or components, and/or provided by some combination of algorithms. The methods described herein can be implemented in one or more of software algorithm(s), programs, firmware, hardware, components, and circuitry, in any combination.


The components of any system that includes the embodiments described herein can be located together or in separate locations. Communication paths couple the components and include any medium for communicating or transferring files among the components. The communication paths include wireless connections, wired connections, and hybrid wireless/wired connections. The communication paths also include couplings or connections to networks including local area networks (LANs), metropolitan area networks (MANs), wide area networks (WANs), proprietary networks, interoffice or backend networks, and the Internet. Furthermore, the communication paths include removable fixed media like floppy disks, hard disk drives, and CD-ROM disks, as well as flash RAM, Universal Serial Bus (USB) connections, RS-232 connections, telephone lines, buses, and electronic mail messages.


Aspects of the embodiments described herein and corresponding systems and methods described herein may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices (PLDs), such as field programmable gate arrays (FPGAs), programmable array logic (PAL) devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits (ASICs). Some other possibilities for implementing aspects of the embodiments described herein and corresponding systems and methods include: microcontrollers with memory (such as electronically erasable programmable read only memory (EEPROM)), embedded microprocessors, firmware, software, etc. Furthermore, aspects of the embodiments described herein and corresponding systems and methods may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types. Of course, the underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field effect transistor (MOSFET) technologies like complementary metal-oxide semiconductor (CMOS), bipolar technologies like emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, etc.


It should be noted that any system, method, and/or other components disclosed herein may be described using computer aided design tools and expressed (or represented), as data and/or instructions embodied in various computer-readable media, in terms of their behavioral, register transfer, logic component, transistor, layout geometries, and/or other characteristics. Computer-readable media in which such formatted data and/or instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media) and carrier waves that may be used to transfer such formatted data and/or instructions through wireless, optical, or wired signalling media or any combination thereof. Examples of transfers of such formatted data and/or instructions by carrier waves include, but are not limited to, transfers (uploads, downloads, e-mail, etc.) over the Internet and/or other computer networks via one or more data transfer protocols (e.g., HTTP, FTP, SMTP, etc.). When received within a computer system via one or more computer-readable media, such data and/or instruction-based expressions of the above described components may be processed by a processing entity (e.g., one or more processors) within the computer system in conjunction with execution of one or more other computer programs.


Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in a sense of “including, but not limited to.” Words using the singular or plural number also include the plural or singular number respectively. Additionally, the words “herein,” “hereunder,” “above,” “below,” and words of similar import, when used in this application, refer to this application as a whole and not to any particular portions of this application. When the word “or” is used in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.


The above description of embodiments and corresponding systems and methods is not intended to be exhaustive or to limit the systems and methods to the precise forms disclosed. While specific embodiments of, and examples for, the embodiments and corresponding systems and methods are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the systems and methods, as those skilled in the relevant art will recognize. The teachings of the embodiments described herein and corresponding systems and methods provided herein can be applied to other systems and methods, not only for the systems and methods described above.


The elements and acts of the various embodiments described above can be combined to provide further embodiments. These and other changes can be made to the integrated security system and corresponding systems and methods in light of the above detailed description.


In general, in the following claims, the terms used should not be construed to limit the integrated security system and corresponding systems and methods to the specific embodiments disclosed in the specification and the claims, but should be construed to include all systems that operate under the claims. Accordingly, the integrated security system and corresponding systems and methods is not limited by the disclosure, but instead the scope is to be determined entirely by the claims.


While certain aspects of the embodiments described herein and corresponding systems and methods are presented below in certain claim forms, the inventors contemplate the various aspects of the embodiments described herein and corresponding systems and methods in any number of claim forms. Accordingly, the inventors reserve the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the embodiments described herein and corresponding systems and methods.

Claims
  • 1. A system comprising one or more computers and one or more storage devices on which are stored instructions that are operable, when executed by the one or more computers, to cause the one or more computers to perform operations comprising: presenting, in a rule configuration user interface for configuring rules for a monitoring system at a property, a first user interface element for creation of a security mode specific rule that only applies when the monitoring system has a particular security mode and a second user interface element for creation of a notification rule that applies irrespective of a security mode of the monitoring system; receiving, through the rule configuration user interface, user input defining a rule that causes presentation of content from a video stream that depicts at least a portion of a three-dimensional space at the property when the monitoring system detects particular sensor data captured by at least one sensor at the property; and storing, in a rule database, data defining the rule that causes presentation of content from the video stream that depicts at least the portion of the three-dimensional space at the property when the monitoring system detects particular sensor data captured by at least one sensor at the property.
  • 2. The system of claim 1, wherein: presenting the first user interface element and the second user interface element comprises presenting, by a user device, the rule configuration user interface that includes the first user interface element and the second user interface element; and receiving the user input comprises receiving, by the user device through the rule configuration user interface, the user input defining the rule that causes presentation, by the user device, of the content from the video stream that depicts at least the portion of the three-dimensional space at the property when the monitoring system detects the particular sensor data captured by the at least one sensor at the property.
  • 3. The system of claim 2, the operations comprising: presenting, by the user device and in a user interface for the three-dimensional space, a video presentation region that includes content from the video stream that depicts at least the portion of the three-dimensional space and one or more controls that enable interaction with the content included in the video presentation region; receiving input data to activate one of the one or more controls; and in response to receiving the input data to activate the one of the one or more controls, adjusting, in the user interface, presentation of the content for the video stream of the portion of the three-dimensional space in the video presentation region of the user interface.
  • 4. The system of claim 1, wherein receiving the user input comprises: receiving, through the rule configuration user interface, first user input identifying a connected appliance at the property; and receiving, through the rule configuration user interface, second user input defining the rule that causes presentation of the content from the video stream that depicts at least the portion of the three-dimensional space at the property that includes the connected appliance when the monitoring system detects the particular sensor data captured by a sensor for the connected appliance.
  • 5. The system of claim 1, the operations comprising: selecting, using an operational state of the monitoring system and from a plurality of three or more rules, a set of rules that comprises one or more notification-only rules and one or more security rules; determining that sensor data collected by one or more sensors located in the property is (i) logically related to a notification-only rule included in the one or more notification-only rules and (ii) not logically related to the one or more security rules; and transmitting a notification without triggering an alarm condition at the property in response to determining that the sensor data is (i) logically related to the notification-only rule included in the one or more notification-only rules and (ii) not logically related to the one or more security rules.
  • 6. A computer-implemented method comprising: receiving, by a user device, camera specific video data for a video stream that depicts at least a portion of a three-dimensional space captured with a camera using one or more properties of the camera; converting, by the user device and using the one or more properties of the camera, the camera specific video data into camera agnostic video data that is not specific to the one or more properties of the camera; and presenting, by the user device and in a user interface for the three-dimensional space using the one or more properties for the camera, content from the camera agnostic video data in a video presentation region that includes content for the video stream that depicts at least the portion of the three-dimensional space and one or more controls that enable interaction with the content included in the video presentation region; receiving input data to activate one of the one or more controls; and in response to receiving the input data to activate the one of the one or more controls, adjusting, in the user interface, presentation of the content for the video stream of the portion of the three-dimensional space in the video presentation region of the user interface.
  • 7. The method of claim 6, wherein presenting the content from the camera agnostic video data in the video presentation region is responsive to receiving the camera specific video data for the video stream.
  • 8. The method of claim 6, comprising: receiving, through a second user interface, user input defining one or more parameters for a notification-only rule that, when satisfied, triggers data transmission to a device and does not trigger data transmission to a security system; and providing, to a monitoring system for the three-dimensional space, data a) defining the notification-only rule b) that includes the one or more parameters.
  • 9. The method of claim 6, wherein the input data to activate the one of the one or more controls comprises data for selecting a position of the presentation of the data for the video stream in the video presentation region of the user interface, and wherein adjusting, in the user interface, the presentation of the content for the video stream of the portion of the three-dimensional space in the video presentation region of the user interface comprises: converting, by the user device, the camera specific video data into the camera agnostic video data using the position of the presentation.
  • 10. One or more non-transitory computer storage media encoded with instructions that, when executed by one or more computers, cause the one or more computers to perform operations comprising: presenting, in a rule configuration user interface for configuring rules for a monitoring system at a property, a first user interface element for creation of a security mode specific rule that only applies when the monitoring system has a particular security mode and a second user interface element for creation of a notification rule that applies irrespective of a security mode of the monitoring system; receiving, through the rule configuration user interface, user input defining a rule that causes presentation of content from a video stream that depicts at least a portion of a three-dimensional space at the property when the monitoring system detects particular sensor data captured by at least one sensor at the property; and storing, in a rule database, data defining the rule that causes presentation of content from the video stream that depicts at least the portion of the three-dimensional space at the property when the monitoring system detects particular sensor data captured by at least one sensor at the property.
  • 11. The non-transitory computer storage media of claim 10, wherein: presenting the first user interface element and the second user interface element comprises presenting, by a user device, the rule configuration user interface that includes the first user interface element and the second user interface element; and receiving the user input comprises receiving, by the user device through the rule configuration user interface, the user input defining the rule that causes presentation, by the user device, of the content from the video stream that depicts at least the portion of the three-dimensional space at the property when the monitoring system detects the particular sensor data captured by the at least one sensor at the property.
  • 12. The non-transitory computer storage media of claim 11, the operations comprising: presenting, by the user device and in a user interface for the three-dimensional space, a video presentation region that includes content from the video stream that depicts at least the portion of the three-dimensional space and one or more controls that enable interaction with the content included in the video presentation region; receiving input data to activate one of the one or more controls; and in response to receiving the input data to activate the one of the one or more controls, adjusting, in the user interface, presentation of the content for the video stream of the portion of the three-dimensional space in the video presentation region of the user interface.
  • 13. The non-transitory computer storage media of claim 12, the operations comprising receiving, by the user device and from a system configured to transmit data for the video stream, the data for the video stream of the three-dimensional space, wherein presenting the video presentation region is responsive to receiving the data for the video stream.
  • 14. The non-transitory computer storage media of claim 12, the operations comprising: receiving, through a second user interface, user input defining one or more parameters for a notification-only rule that, when satisfied, triggers data transmission to a device and does not trigger data transmission to a security system; and providing, to the monitoring system for the three-dimensional space, data a) defining the notification-only rule b) that includes the one or more parameters.
  • 15. The non-transitory computer storage media of claim 12, the operations comprising: receiving, by the user device, camera specific video data for the video stream captured with a camera; converting, by the user device and using one or more properties for the camera, the camera specific video data into camera agnostic video data; and presenting, in the user interface and using the one or more properties for the camera, the content from the camera agnostic video data in the video presentation region.
  • 16. The non-transitory computer storage media of claim 15, wherein the input data to activate the one of the one or more controls comprises data for selecting a position of the presentation of the data for the video stream in the video presentation region of the user interface, and wherein converting, by the user device and using the one or more properties for the camera, the camera specific video data into the camera agnostic video data comprises: converting, by the user device, the camera specific video data into the camera agnostic video data using the position of the presentation.
  • 17. The non-transitory computer storage media of claim 10, wherein receiving the user input comprises: receiving, through the rule configuration user interface, first user input identifying a connected appliance at the property; and receiving, through the rule configuration user interface, second user input defining the rule that causes presentation of the content from the video stream that depicts at least the portion of the three-dimensional space at the property that includes the connected appliance when the monitoring system detects the particular sensor data captured by a sensor for the connected appliance.
  • 18. The non-transitory computer storage media of claim 10, the operations comprising: selecting, using an operational state of the monitoring system and from a plurality of three or more rules, a set of rules that comprises one or more notification-only rules and one or more security rules; determining that sensor data collected by one or more sensors located in the property is (i) logically related to a notification-only rule included in the one or more notification-only rules and (ii) not logically related to the one or more security rules; and transmitting a notification without triggering an alarm condition at the property in response to determining that the sensor data is (i) logically related to the notification-only rule included in the one or more notification-only rules and (ii) not logically related to the one or more security rules.
RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 17/888,741, filed Aug. 16, 2022, which is a continuation of U.S. application Ser. No. 16/866,521, filed May 4, 2020, now U.S. Pat. No. 11,438,553, issued Sep. 6, 2022, which is a continuation of U.S. application Ser. No. 14/456,449, filed Aug. 11, 2014, now U.S. Pat. No. 10,645,347, issued May 5, 2020, which claims the benefit of U.S. Provisional Application No. 61/864,248, filed Aug. 9, 2013. All of these prior applications are incorporated by reference in their entirety.

20100159967 Pounds Jun 2010 A1
20100185857 Neitzel Jul 2010 A1
20100197219 Issa Aug 2010 A1
20100210240 Mahaffey Aug 2010 A1
20100212012 Touboul Aug 2010 A1
20100218104 Lewis Aug 2010 A1
20100245107 Fulker Sep 2010 A1
20100267390 Lin Oct 2010 A1
20100280635 Cohn Nov 2010 A1
20100280637 Cohn Nov 2010 A1
20100298024 Choi Nov 2010 A1
20100321151 Matsuura Dec 2010 A1
20100332164 Aisa Dec 2010 A1
20110000521 Tachibana Jan 2011 A1
20110007159 Camp Jan 2011 A1
20110029875 Milch Feb 2011 A1
20110040415 Nickerson Feb 2011 A1
20110040877 Foisy Feb 2011 A1
20110087988 Ray et al. Apr 2011 A1
20110096678 Ketonen Apr 2011 A1
20110102588 Trundle May 2011 A1
20110197327 McElroy Aug 2011 A1
20110234392 Cohn Sep 2011 A1
20110283006 Ramamurthy Nov 2011 A1
20110286437 Austin et al. Nov 2011 A1
20120023151 Bennett, III Jan 2012 A1
20120066608 Sundermeyer Mar 2012 A1
20120081842 Ewing Apr 2012 A1
20120150966 Fan et al. Jun 2012 A1
20120154138 Cohn Jun 2012 A1
20120162423 Xiao et al. Jun 2012 A1
20120182245 Hutton Jul 2012 A1
20120242788 Chuang Sep 2012 A1
20120260184 Dawes Oct 2012 A1
20120278877 Baum Nov 2012 A1
20120309354 Du Dec 2012 A1
20120315848 Smith et al. Dec 2012 A1
20120327242 Barley Dec 2012 A1
20130002880 Levinson et al. Jan 2013 A1
20130010111 Laforte Jan 2013 A1
20130120134 Hicks, III May 2013 A1
20130121239 Hicks, III May 2013 A1
20130136102 MacWan et al. May 2013 A1
20130174239 Kim et al. Jul 2013 A1
20130223279 Tinnakornsrisuphap et al. Aug 2013 A1
20130258047 Morimoto et al. Oct 2013 A1
20140009568 Stec et al. Jan 2014 A1
20140143695 Sundermeyer May 2014 A1
20140143854 Lopez May 2014 A1
20140153695 Yanagisawa Jun 2014 A1
20140369584 Fan et al. Dec 2014 A1
20150009325 Kardashov Jan 2015 A1
20150088982 Johnson Mar 2015 A1
20150097949 Ure et al. Apr 2015 A1
20150142991 Zaloom May 2015 A1
Foreign Referenced Citations (47)
Number Date Country
2005223267 Dec 2010 AU
2010297957 May 2012 AU
2011250886 Jan 2013 AU
2011305163 May 2013 AU
2559842 May 2014 CA
0295146 Dec 1988 EP
0308046 Mar 1989 EP
0591585 Apr 1994 EP
0978111 Nov 2001 EP
2112784 Oct 2009 EP
2584217 Jan 1987 FR
2661023 Oct 1991 FR
2793334 Nov 2000 FR
2222288 Feb 1990 GB
2273593 Jun 1994 GB
2319373 May 1998 GB
2324630 Oct 1998 GB
2335523 Sep 1999 GB
2349293 Oct 2000 GB
2370400 Jun 2002 GB
8227491 Sep 1996 JP
2002055895 Feb 2002 JP
2003085258 Mar 2003 JP
2003141659 May 2003 JP
2004192659 Jul 2004 JP
20060021605 Mar 2006 KR
WO 8907855 Aug 1989 WO
WO 9403881 Feb 1994 WO
WO 9636301 Nov 1996 WO
WO 9849663 Nov 1998 WO
WO 9934339 Jul 1999 WO
WO 0152478 Jul 2001 WO
WO 0199078 Dec 2001 WO
WO 0221300 Mar 2002 WO
WO 02097584 Dec 2002 WO
WO 03040839 May 2003 WO
WO 2004004222 Jan 2004 WO
WO 2004098127 Nov 2004 WO
WO 2004107710 Dec 2004 WO
WO 2005091218 Sep 2005 WO
WO 2005091218 Jul 2006 WO
WO 2007038872 Apr 2007 WO
WO 2007124453 Nov 2007 WO
WO 2009006670 Jan 2009 WO
WO 2009145747 Dec 2009 WO
WO 2011060385 May 2011 WO
WO 2013051916 Apr 2013 WO
Non-Patent Literature Citations (206)
Entry
Alarm.com—Interactive Security Systems, Elders [retrieved on Nov. 4, 2003], 1 page.
Alarm.com—Interactive Security Systems, Frequently Asked Questions [retrieved on Nov. 4, 2003], 3 pages.
Alarm.com—Interactive Security Systems, Overview [retrieved on Nov. 4, 2003], 2 pages.
Alarm.com—Interactive Security Systems, Product Advantages [retrieved on Nov. 4, 2003], 3 pages.
Australian Patent App. No. 2010297957.
Australian Patent App. No. 2011250886.
Australian Patent App. No. 2011305163.
Canadian Patent App. No. 2559842.
Chinese Patent App. No. 201080053845.7.
Chinese Patent App. No. 201180034090.0.
Control Panel Standard—Features for False Alarm Reduction, The Security Industry Association, SIA 2009, pp. 1-48.
Co-pending U.S. Appl. No. 11/761,745, filed Jun. 12, 2007.
Co-pending U.S. Appl. No. 12/019,568, filed Jan. 24, 2008.
Co-pending U.S. Appl. No. 12/189,780, filed Aug. 11, 2008.
Co-pending U.S. Appl. No. 12/189,785, filed Aug. 11, 2008.
Co-pending U.S. Appl. No. 12/197,931, filed Aug. 25, 2008.
Co-pending U.S. Appl. No. 12/197,946, filed Aug. 25, 2008.
Co-pending U.S. Appl. No. 12/197,958, filed Aug. 25, 2008.
Co-pending U.S. Appl. No. 12/198,039, filed Aug. 25, 2008.
Co-pending U.S. Appl. No. 12/198,051, filed Aug. 25, 2008.
Co-pending U.S. Appl. No. 12/198,060, filed Aug. 25, 2008.
Co-pending U.S. Appl. No. 12/198,066, filed Aug. 25, 2008.
Co-pending U.S. Appl. No. 12/269,735, filed Nov. 12, 2008.
Co-pending U.S. Appl. No. 12/539,537, filed Aug. 11, 2009.
Co-pending U.S. Appl. No. 12/568,718, filed Sep. 29, 2009.
Co-pending U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Co-pending U.S. Appl. No. 12/691,992, filed Jan. 22, 2010.
Co-pending U.S. Appl. No. 12/718,385, filed Mar. 5, 2010.
Co-pending U.S. Appl. No. 12/732,879, filed Mar. 26, 2010.
Co-pending U.S. Appl. No. 12/750,470, filed Mar. 30, 2010.
Co-pending U.S. Appl. No. 12/770,253, filed Apr. 29, 2010.
Co-pending U.S. Appl. No. 12/770,365, filed Apr. 29, 2010.
Co-pending U.S. Appl. No. 12/771,071, filed Apr. 30, 2010.
Co-pending U.S. Appl. No. 12/771,372, filed Apr. 30, 2010.
Co-pending U.S. Appl. No. 12/771,471, filed Apr. 30, 2010.
Co-pending U.S. Appl. No. 12/771,624, filed Apr. 30, 2010.
Co-pending U.S. Appl. No. 12/892,303, filed Sep. 28, 2010.
Co-pending U.S. Appl. No. 12/892,801, filed Sep. 28, 2010.
Co-pending U.S. Appl. No. 12/952,080, filed Nov. 22, 2010.
Co-pending U.S. Appl. No. 12/970,313, filed Dec. 16, 2010.
Co-pending U.S. Appl. No. 12/971,282, filed Dec. 17, 2010.
Co-pending U.S. Appl. No. 12/972,740, filed Dec. 20, 2010.
Co-pending U.S. Appl. No. 13/099,293, filed May 2, 2011.
Co-pending U.S. Appl. No. 13/104,932, filed May 10, 2011.
Co-pending U.S. Appl. No. 13/104,936, filed May 10, 2011.
Co-pending U.S. Appl. No. 13/153,807, filed Jun. 6, 2011.
Co-pending U.S. Appl. No. 13/244,008, filed Sep. 23, 2011.
Co-pending U.S. Appl. No. 13/311,365, filed Dec. 5, 2011.
Co-pending U.S. Appl. No. 13/334,998, filed Dec. 22, 2011.
Co-pending U.S. Appl. No. 13/335,279, filed Dec. 22, 2011.
Co-pending U.S. Appl. No. 13/400,477, filed Feb. 20, 2012.
Co-pending U.S. Appl. No. 13/406,264, filed Feb. 27, 2012.
Co-pending U.S. Appl. No. 13/486,276, filed Jun. 1, 2012.
Co-pending U.S. Appl. No. 13/531,757, filed Jun. 25, 2012.
Co-pending U.S. Appl. No. 13/718,851, filed Dec. 18, 2012.
Co-pending U.S. Appl. No. 13/725,607, filed Dec. 21, 2012.
Co-pending U.S. Appl. No. 13/925,181, filed Jun. 24, 2013.
Co-pending U.S. Appl. No. 13/929,568, filed Jun. 27, 2013.
Co-pending U.S. Appl. No. 13/932,816, filed Jul. 1, 2013.
Co-pending U.S. Appl. No. 13/932,837, filed Jul. 1, 2013.
Co-pending U.S. Appl. No. 29/419,628, filed Apr. 30, 2012.
Co-pending U.S. Appl. No. 29/420,377, filed May 8, 2012.
EP Examination Report in European Application No. 14833805.6, dated Jan. 14, 2019, 5 pages.
European Patent App. No. 05725743.8.
European Patent App. No. 08797646.0.
European Patent App. No. 08828613.3.
European Patent App. No. 09807196.2.
European Patent App. No. 10819658.5.
European Patent App. No. 11781184.4.
European Patent App. No. 11827671.6.
Examination Report under Section 18(3) for UK Patent Application No. GB0620362.4, mailed on Aug. 13, 2007.

Examination Report under Section 18(3) for UK Patent Application No. GB0724248.0, mailed on Jun. 4, 2008.

Examination Report under Section 18(3) for UK Patent Application No. GB0724248.0, mailed on Jan. 30, 2008.

Examination Report under Section 18(3) for UK Patent Application No. GB0724760.4, mailed on Jan. 30, 2008.

Examination Report under Section 18(3) for UK Patent Application No. GB0800040.8, mailed on Jan. 30, 2008.
Faultline, “AT&T Targets Video Home Security as Next Broadband Market,” The Register, Nov. 2, 2006, 2 pages.
Final Office Action mailed Aug. 1, 2011 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Final Office Action mailed Jun. 1, 2009 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Final Office Action mailed Jun. 5, 2012 for U.S. Appl. No. 12/771,071, filed Apr. 30, 2010.
Final Office Action mailed May 9, 2013 for U.S. Appl. No. 12/189,780, filed Aug. 11, 2008.
Final Office Action mailed May 9, 2013 for U.S. Appl. No. 12/952,080, filed Nov. 22, 2010.
Final Office Action mailed Jan. 10, 2011 for U.S. Appl. No. 12/189,785, filed Aug. 11, 2008.
Final Office Action mailed Jun. 10, 2011 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Final Office Action mailed Jul. 12, 2010 for U.S. Appl. No. 12/019,554, filed Jan. 24, 2008.
Final Office Action mailed Jan. 13, 2011 for U.S. Appl. No. 12/189,780, filed Aug. 11, 2008.
Final Office Action mailed Sep. 14, 2011 for U.S. Appl. No. 12/197,931, filed Aug. 25, 2008.
Final Office Action mailed Feb. 16, 2011 for U.S. Appl. No. 12/019,568, filed Jan. 24, 2008.
Final Office Action mailed Oct. 17, 2012 for U.S. Appl. No. 12/637,671, filed Dec. 14, 2009.
Final Office Action mailed Sep. 17, 2012 for U.S. Appl. No. 12/197,958, filed Aug. 25, 2008.
Final Office Action mailed Mar. 21, 2013 for U.S. Appl. No. 12/691,992, filed Jan. 22, 2010.
Final Office Action mailed Jul. 23, 2013 for U.S. Appl. No. 13/531,757, filed Jun. 25, 2012.
Final Office Action mailed Feb. 26, 2013 for U.S. Appl. No. 12/771,471, filed Apr. 30, 2010.
Final Office Action mailed Jun. 29, 2012 for U.S. Appl. No. 12/539,537, filed Aug. 11, 2009.
Final Office Action mailed Dec. 31, 2012 for U.S. Appl. No. 12/770,365, filed Apr. 29, 2010.
Final Office Action mailed Oct. 31, 2012 for U.S. Appl. No. 12/771,624, filed Apr. 30, 2010.
Final Office Action in U.S. Appl. No. 14/456,377, dated Jan. 7, 2020, 18 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US05/08766 (ICON.P020),” May 23, 2006, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US08/72831 (ICON.P001WO),” Nov. 4, 2008, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US08/74246 (ICON.P003WO),” Nov. 14, 2008, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US08/74260 (ICON.P002WO),” Nov. 13, 2008, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US09/53485 (ICON.P011WO),” Oct. 22, 2009, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US09/55559 (ICON.P012WO),” Nov. 12, 2009, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US10/50585 (ICON.P014WO),” Dec. 30, 2010, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US10/57674 (ICON.P015WO),” Mar. 2, 2011, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US11/34858 (ICON.P017WO),” Oct. 3, 2011, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US11/35994 (ICON.P016WO),” Sep. 28, 2011, 2 pages.
Form PCT/ISA/210, “PCT International Search Report for the Application No. PCT/US11/53136 (ICON.P019WO),” Jan. 5, 2012, 2 pages.
Form PCT/ISA/210, “PCT International Search Report of the Application No. PCT/US08/83254 (ICON.P005WO),” Jan. 14, 2009, 2 pages.
Form PCT/ISA/220, ICON.P014WO, "PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration," 1 pg.

Form PCT/ISA/220, ICON.P015WO, "PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration," 1 pg.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for the Application No. PCT/US05/08766 (ICON.P020),” May 23, 2006, 1 page.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for the Application No. PCT/US08/72831 (ICON.P001WO),” Nov. 4, 2008, 1 page.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for the Application No. PCT/US08/74246 (ICON.P003WO)” Nov. 14, 2008, 1 page.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for the Application No. PCT/US08/74260 (ICON.P002WO),” Nov. 13, 2008, 1 page.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for the Application No. PCT/US09/53485 (ICON.P011WO),” Oct. 22, 2009, 1 page.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for the Application No. PCT/US09/55559 (ICON.P012WO),” Nov. 12, 2009, 1 page.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for the Application No. PCT/US10/50585 (ICON.P014WO),” Dec. 30, 2010, 1 page.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for the Application No. PCT/US10/57674 (ICON.P015WO),” Mar. 2, 2011, 1 page.
Form PCT/ISA/220, “PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for the Application No. PCT/US11/35994 (ICON.P016WO),” Sep. 28, 2011, 1 page.
Form PCT/ISA/220, "PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration of the Application No. PCT/US08/83254 (ICON.P005WO)," Jan. 14, 2009, 1 page.

Form PCT/ISA/237, "PCT Written Opinion of the International Searching Authority for the Application No. PCT/US05/08766 (ICON.P020)," May 23, 2006, 5 pages.
Form PCT/ISA/237, ICON.P015WO, “PCT Written Opinion of the International Searching Authority,” 6 pgs.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US08/72831 (ICON.P001WO),” Nov. 4, 2008, 6 pages.
Form PCT/ISA/237, "PCT Written Opinion of the International Searching Authority for the Application No. PCT/US08/74246 (ICON.P003WO)," Nov. 14, 2008, 6 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US08/74260 (ICON.P002WO),” Nov. 13, 2008, 6 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US09/53485 (ICON.P011WO),” Oct. 22, 2009, 8 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US09/55559 (ICON.P012WO),” Nov. 12, 2009, 6 pages.
Form PCT/ISA/237, "PCT Written Opinion of the International Searching Authority for the Application No. PCT/US10/50585 (ICON.P014WO)," Dec. 30, 2010, 7 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US10/57674 (ICON.P015WO),” Mar. 2, 2011, 6 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US11/34858 (ICON.P017WO),” Oct. 3, 2011, 8 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US11/35994 (ICON.P016WO),” Sep. 28, 2011, 11 pages.
Form PCT/ISA/237, “PCT Written Opinion of the International Searching Authority for the Application No. PCT/US11/53136 (ICON.P019WO),” Jan. 5, 2012.
Form PCT/ISA/237, "PCT Written Opinion of the International Searching Authority of the Application No. PCT/US08/83254 (ICON.P005WO)," Jan. 14, 2009, 7 pages.
Gutierrez J.A., “On the Use of IEEE 802.15.4 to Enable Wireless Sensor Networks in Building Automation,” Personal, Indoor and Mobile Radio Communications (PIMRC), 15th IEEE International Symposium, 2004, vol. 3, pp. 1865-1869.
Haldas, Mike, "Zavio IP Security Cameras Support Push Video Notification to Mobile App.," Security Camera & Video Surveillance Blog, Jul. 8, 2013, pp. 1-7.
Hugemann, Wolfgang, “Correcting Lens Distortions in Digital Photographs,” 2010, EVU, pp. 1-12.
Indian Patent App. No. 10698/DELNP/2012.
Indian Patent App. No. 3687/DELNP/2012.
International Patent Application No. PCT/US2013/048324.
International Search Report for Application No. PCT/US13/48324, mailed on Jan. 14, 2014, 2 pages.
International Search Report for Application No. PCT/US2014/050548, mailed on Mar. 18, 2015, 4 pages.
Lagotek Wireless Home Automation System, May 2006 [retrieved on Aug. 22, 2012].
Non-Final Office Action mailed Apr. 4, 2013 for U.S. Appl. No. 12/197,931, filed Aug. 25, 2008.
Non-Final Office Action mailed Mar. 4, 2013 for U.S. Appl. No. 13/400,477, filed Feb. 20, 2012.
Non-Final Office Action mailed Jan. 5, 2010 for U.S. Appl. No. 12/019,554, filed Jan. 24, 2008.
Non-Final Office Action mailed May 5, 2010 for U.S. Appl. No. 12/189,780, filed Aug. 11, 2008.
Non-Final Office Action mailed May 5, 2010 for U.S. Appl. No. 12/189,785, filed Aug. 11, 2008.
Non-Final Office Action mailed Feb. 7, 2012 for U.S. Appl. No. 12/637,671, filed Dec. 14, 2009.
Non-Final Office Action mailed Feb. 7, 2013 for U.S. Appl. No. 12/970,313, filed Dec. 16, 2010.
Non-Final Office Action mailed Feb. 8, 2012 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Non-Final Office Action mailed Apr. 9, 2012 for U.S. Appl. No. 12/771,624, filed Apr. 30, 2010.
Non-Final Office Action mailed Dec. 9, 2008 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Non-Final Office Action mailed Aug. 10, 2012 for U.S. Appl. No. 12/771,471, filed Apr. 30, 2010.
Non-Final Office Action mailed Oct. 11, 2012 for U.S. Appl. No. 12/019,568, filed Jan. 24, 2008.
Non-Final Office Action mailed Apr. 12, 2012 for U.S. Appl. No. 12/770,365, filed Apr. 29, 2010.
Non-Final Office Action mailed Jul. 12, 2012 for U.S. Appl. No. 12/691,992, filed Jan. 22, 2010.
Non-Final Office Action mailed Oct. 12, 2012 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Non-Final Office Action mailed Sep. 12, 2012 for U.S. Appl. No. 12/952,080, filed Nov. 22, 2010.
Non-Final Office Action mailed Apr. 13, 2010 for U.S. Appl. No. 11/761,745, filed Jun. 12, 2007.
Non-Final Office Action mailed Jul. 13, 2010 for U.S. Appl. No. 12/019,568, filed Jan. 24, 2008.
Non-Final Office Action mailed Nov. 14, 2012 for U.S. Appl. No. 13/531,757, filed Jun. 25, 2012.
Non-Final Office Action mailed Sep. 14, 2010 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Non-Final Office Action mailed Sep. 16, 2011 for U.S. Appl. No. 12/539,537, filed Aug. 11, 2009.
Non-Final Office Action mailed Sep. 17, 2012 for U.S. Appl. No. 12/189,780, filed Aug. 11, 2008.
Non-Final Office Action mailed Aug. 18, 2011 for U.S. Appl. No. 12/197,958, filed Aug. 25, 2008.
Non-Final Office Action mailed Feb. 18, 2011 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Non-Final Office Action mailed Jan. 18, 2012 for U.S. Appl. No. 12/771,071, filed Apr. 30, 2010.
Non-Final Office Action mailed Feb. 21, 2013 for U.S. Appl. No. 12/771,372, filed Apr. 30, 2010.
Non-Final Office Action mailed Jul. 21, 2010 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Non-Final Office Action mailed Dec. 22, 2010 for U.S. Appl. No. 12/197,931, filed Aug. 25, 2008.
Non-Final Office Action mailed Jul. 22, 2013 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Non-Final Office Action mailed May 23, 2013 for U.S. Appl. No. 13/104,932, filed May 10, 2011.
Non-Final Office Action mailed May 23, 2013 for U.S. Appl. No. 13/104,936, filed May 10, 2011.
Non-Final Office Action mailed Jan. 26, 2012 for U.S. Appl. No. 12/019,568, filed Jan. 24, 2008.
Non-Final Office Action mailed Nov. 26, 2010 for U.S. Appl. No. 12/197,958, filed Aug. 25, 2008.
Non-Final Office Action mailed Jun. 27, 2013 for U.S. Appl. No. 12/019,568, filed Jan. 24, 2008.
Non-Final Office Action mailed Dec. 30, 2009 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Non-Final Office Action mailed May 30, 2008 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Non-Final Office Action issued in U.S. Appl. No. 14/456,377, dated May 30, 2019, 16 pages.
Notice of Allowance mailed May 14, 2013 for U.S. Appl. No. 12/637,671, filed Dec. 14, 2009.
Notice of Allowance mailed Oct. 25, 2012 for U.S. Appl. No. 11/084,232, filed Mar. 16, 2005.
Requirement for Restriction/Election mailed Jan. 22, 2013 for U.S. Appl. No. 13/104,932, filed May 10, 2011.
Requirement for Restriction/Election mailed Jan. 22, 2013 for U.S. Appl. No. 13/104,936, filed May 10, 2011.
Requirement for Restriction/Election mailed Oct. 24, 2012 for U.S. Appl. No. 12/750,470, filed Mar. 30, 2010.
Security For The Future, Introducing 5804BD—Advanced two-way wireless remote technology, Advertisement, ADEMCO Group, Syosset, NY, circa 1997.
Sheehan, “Dropcam HD is the Ultimate Home Monitoring Webcam Now with HD Video, Night Vision & 2-Way Audio”, dated Jul. 20, 2012, pp. 1-12.
South African Patent App. No. 2013/02668.
Supplemental European Search Report for Application No. EP05725743.8 mailed on Sep. 14, 2010, 2 pages.
Supplementary European Search Report for Application No. EP10819658, mailed on Mar. 10, 2015, 2 pages.
Supplementary European Search Report for Application No. EP11827671, mailed on Mar. 10, 2015, 2 pages.
Supplementary European Search Report for Application No. EP2191351, mailed on Jun. 23, 2014, 2 pages.
Supplementary Non-Final Office Action mailed Oct. 28, 2010 for U.S. Appl. No. 12/630,092, filed Dec. 3, 2009.
Supplementary Partial European Search Report for Application No. EP09807196, mailed on Nov. 17, 2014, 5 pages.
Taiwanese Patent App. No. 99113848.
Taiwanese Patent App. No. 99113853.
Taiwanese Patent App. No. 99113855.
Taiwanese Patent App. No. 99113856.
Topalis E., et al., "A Generic Network Management Architecture Targeted to Support Home Automation Networks and Home Internet Connectivity," IEEE Transactions on Consumer Electronics, 2000, vol. 46 (1), pp. 44-51.
United Kingdom Patent No. 2428821.
United Kingdom Patent No. 2442628.
United Kingdom Patent No. 2442633.
United Kingdom Patent No. 2442640.
Wireless, Battery-Powered Smoke Detectors, Brochure, SafeNight Technology, Inc. Roanoke, VA, 1995.
WLS906 Photoelectric Smoke Alarm, Data Sheet, DSC Security Products, Ontario, Canada, Jan. 1998.
X10—ActiveHome, Home Automation Made Easy [retrieved on Nov. 4, 2003], 3 pages.
European Office action in European Application No. EP 14833805, dated May 25, 2017, 10 pages.
Related Publications (1)
Number Date Country
20230336896 A1 Oct 2023 US
Provisional Applications (1)
Number Date Country
61864248 Aug 2013 US
Continuations (3)
Number Date Country
Parent 17888741 Aug 2022 US
Child 18211054 US
Parent 16866521 May 2020 US
Child 17888741 US
Parent 14456449 Aug 2014 US
Child 16866521 US