Electronic devices connected to a common home automation network may be controlled via a common host computer system. In some cases, it may be desirable to control certain electronic devices based on weather conditions. However, weather reports, such as those generated using data from nearby weather stations and subsequently broadcast via radio and/or television, are often generalized reports that are inaccurate for a particular operating location of the electronic devices. There is a need for a more localized and accurate assessment of weather conditions at the operating location of the electronic devices. This application is intended to address these and other issues, and to provide related advantages.
In general, the systems and methods disclosed herein are directed to controlling electronic devices, and more specifically, to controlling electronic devices based on detected weather events.
In one aspect, a method for controlling a device in a home automation network based on detection of a weather event is provided. The method includes receiving, by a host system, image data from a camera in operative communication with the host system, where the image data is representative of an outdoor weather event that is captured by the camera. The method may include analyzing, by the host system, the image data to identify the outdoor weather event, and/or determining, by the host system, a home automation rule based on the identified outdoor weather event, where the home automation rule includes an operational setting of a home automation device. Further, the method may include instructing, by the host system, the home automation device based on the determined home automation rule via a home automation network.
Various features of the present method may be contemplated. For instance, the image data may include a still image that is captured by the camera at predetermined time intervals, and/or a moving image that is captured over a predefined length of time by the camera at predetermined time intervals. The method may include receiving, by the host system, audio data from a microphone in operative communication with the host system, where the audio data corresponds to the received image data, and/or analyzing, by the host system, the image data with the audio data to identify the outdoor weather event. The method may include comparing, by the host system, the image data to baseline image data to identify the outdoor weather event in real-time. Further, the method may include, based on the comparison, determining, by the host system, one or more descriptive tags representative of the identified outdoor weather event, and/or determining, by the host system, the home automation rule based on a match of the home automation rule with the one or more descriptive tags. Still further, the method may include sending, by the host system, an alert to a mobile device in operative communication with the host system, where the alert is indicative of the image data and the identified outdoor weather event. The method may also include receiving, by the host system, a third-party weather forecast comprising current conditions for a geographic location of the host system, and determining, by the host system, whether the identified outdoor weather event is consistent with the third-party weather forecast.
In another aspect, a system for controlling a device in a home automation network based on detection of a weather event includes a computer system, where the computer system is configured to receive image data from a camera in operative communication with the computer system. The image data may be representative of an outdoor weather event that is captured by the camera. The computer system may be configured to analyze the image data to identify the outdoor weather event and/or determine a home automation rule based on the identified outdoor weather event, where the home automation rule includes an operational setting of a home automation device. Further, the computer system may be configured to instruct the home automation device based on the determined home automation rule via a home automation network.
Various features of the present system may be contemplated. The image data may include a still image that is captured by the camera at predetermined time intervals, and/or a moving image that is captured over a predefined length of time by the camera at predetermined time intervals. The computer system may be further configured to receive audio data from a microphone in operative communication with the computer system, where the audio data corresponds to the received image data, and analyze the image data with the audio data to identify the outdoor weather event. The computer system may also be configured to compare the image data to baseline image data to identify the outdoor weather event in real-time. Further, the computer system may be configured to, based on the comparison, determine one or more descriptive tags representative of the identified outdoor weather event, and determine the home automation rule based on a match of the home automation rule with the one or more descriptive tags. In other aspects, the computer system may be further configured to send an alert to a mobile device in operative communication with the computer system, where the alert is indicative of the image data and the identified outdoor weather event. Still further, the computer system may be configured to receive a third-party weather forecast comprising current conditions for a geographic location of the computer system, and determine whether the identified outdoor weather event is consistent with the third-party weather forecast.
In yet another aspect, a computer-readable medium having instructions stored thereon for controlling a device in a home automation network based on detection of a weather event is provided. The instructions may be executable by one or more processors for receiving image data from a camera, where the image data is representative of an outdoor weather event that is captured by the camera, and analyzing the image data to identify the outdoor weather event. The instructions may be executable by one or more processors for determining a home automation rule based on the identified outdoor weather event, where the home automation rule includes an operational setting of a home automation device. Further, the instructions may be executable by one or more processors for instructing the home automation device based on the determined home automation rule via a home automation network.
Various embodiments of the present computer-readable medium may be contemplated. For instance, the instructions of the computer-readable medium may be executable by one or more processors for receiving audio data from a microphone in operative communication with the host system, where the audio data corresponds to the received image data, and analyzing the image data with the audio data to identify the outdoor weather event. The instructions of the computer-readable medium may be executable by one or more processors for comparing the image data to baseline image data to identify the outdoor weather event in real-time. Further, the instructions of the computer-readable medium may be executable by one or more processors for, based on the comparison, determining one or more descriptive tags representative of the identified outdoor weather event and determining the home automation rule based on a match of the home automation rule with the one or more descriptive tags.
The present invention is described in conjunction with the appended figures:
In the appended figures, similar components and/or features may have the same numerical reference label. Further, various components of the same type may be distinguished by following the reference label by a letter that distinguishes among the similar components and/or features. If only the first numerical reference label is used in the specification, the description is applicable to any one of the similar components and/or features having the same first numerical reference label irrespective of the letter suffix.
In general, the systems and methods disclosed herein provide for controlling a device in a home automation network based on detection of a weather event. The systems and methods may be provided for by a host system, such as a television receiver, television, overlay device, and/or a combination thereof, that is further connected to one or more electronic devices via the home automation network. As described below, the present systems and methods may detect various weather events occurring at an operating location of the one or more electronic devices, determine instructions including operational settings and/or modes for the one or more electronic devices based on the detected weather event, and transmit such instructions to certain electronic devices in the network. Other features may be contemplated, as described further below.
As shown in
Still referring to
As shown in
Still in reference to
Referring again to
As further illustrated in
As shown in
Further shown in
Turning now to
As shown in
Referring to
Still referring to
As shown in
Still referring to
Referring again to
Still referring to
Still in reference to
As shown in
Further shown in
Still referring to
Still referring to
Referring again to
Still referring to
Further shown in
As shown in
As shown in
For instance, as shown in
Further shown in
Still shown in
Still referring to
Still shown in
As shown in
Also shown in
Still referring to
Further in reference to
Referring again to
Referring to
Referring again to
Further, as shown in
Referring to
Referring again to
Leak detection sensor 224 of
Further shown in
Appliance controller 226 of
Appliances and other electronic devices may also be monitored for electricity usage. For instance, US Pat. Pub. No. 2013/0318559, filed Nov. 19, 2012, to Crabtree, entitled “Apparatus for Displaying Electrical Device Usage Information on a Television Receiver,” which is hereby incorporated by reference, may allow for information regarding the electricity usage of one or more devices (e.g., other home automation devices or circuits within a home that are monitored) to be determined. Control of one or more home automation devices may be dependent on electrical usage and stored electrical rates. For instance, a washing machine may be activated in the evening when rates are lower. Additionally or alternatively, operation of devices may be staggered to help prevent consuming too much power at a given time. For instance, an electric heater may not be activated until a dryer powered via the same circuit is powered down.
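Merely by way of example, the following Python sketch illustrates how such rate-aware and circuit-aware control might be expressed. The `Appliance` class, rate thresholds, and circuit limit are illustrative assumptions, not details from the referenced publication.

```python
from dataclasses import dataclass

@dataclass
class Appliance:
    name: str
    circuit: int
    watts: int
    running: bool = False

def can_activate(appliance, appliances, rate_cents_kwh, max_rate_cents_kwh,
                 circuit_limit_watts):
    """Allow activation only when the current electrical rate is low enough
    and the appliance's shared circuit has headroom for its draw."""
    if rate_cents_kwh > max_rate_cents_kwh:
        return False  # wait for a cheaper rate period, e.g., the evening
    load = sum(a.watts for a in appliances
               if a.running and a.circuit == appliance.circuit)
    return load + appliance.watts <= circuit_limit_watts

washer = Appliance("washing machine", circuit=1, watts=1200)
dryer = Appliance("dryer", circuit=2, watts=3000, running=True)
heater = Appliance("electric heater", circuit=2, watts=1500)

# The heater stays off until the dryer on the same circuit powers down.
print(can_activate(heater, [washer, dryer, heater],
                   rate_cents_kwh=9.0, max_rate_cents_kwh=12.0,
                   circuit_limit_watts=3600))  # False while the dryer runs
```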
Garage door controller 228 of
Lock controller 230 of
A home security system 207 of
Irrigation controller 232 of
One or more motion sensors can be incorporated into one or more of the previously detailed home automation devices or as a stand-alone device. Such motion sensors may be used to determine if a structure is occupied. Such information may be used in conjunction with a determined location of one or more wireless devices. If some or all users are not present in the structure, home automation settings may be adjusted, such as by lowering the temperature setting of thermostat 222, shutting off lights via light controller 220, and determining, via door sensor 208, whether one or more doors are closed. In some embodiments, a user-defined script may be run when it is determined that no users or other persons are present within the structure.
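Merely by way of example, the occupancy logic described above might be sketched as follows; the device classes, method names, and setpoints are illustrative assumptions rather than details from this disclosure.

```python
class Thermostat:
    def set_target(self, celsius): self.target = celsius

class LightController:
    def all_off(self): self.lights_on = False

class DoorSensor:
    def __init__(self, name, closed): self.name, self.closed = name, closed

def structure_occupied(recent_motion_events, wireless_devices_at_home):
    """Occupied if any recent motion was seen or a known device is present."""
    return bool(recent_motion_events) or bool(wireless_devices_at_home)

def run_away_script(thermostat, lights, door_sensors):
    """Apply away-mode settings and report any doors left open."""
    thermostat.set_target(16.0)   # lower the thermostat (degrees C)
    lights.all_off()              # shut off lights via the light controller
    return [d.name for d in door_sensors if not d.closed]

if not structure_occupied([], []):
    open_doors = run_away_script(Thermostat(), LightController(),
                                 [DoorSensor("back door", closed=False)])
    print("doors left open:", open_doors)   # ['back door']
```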
Additional forms of sensors not illustrated in
The home automation functions detailed herein that are attributed to television receiver 150 may alternatively or additionally be incorporated into overlay device 251. As such, a separate overlay device 251 may be connected with display device 160 to provide home automation functionality.
Turning now to
As shown in
In other embodiments of television receiver 300, fewer or greater numbers of components may be present. It should be understood that the various components of television receiver 300 may be implemented using hardware, firmware, software, and/or some combination thereof. Functionality of components may be combined; for example, functions of descrambling engine 365 may be performed by tuning management processor 310-2. Further, functionality of components may be spread among additional components. For instance, the home automation settings database 347, home automation script database 348, and/or weather detection module 350 may be provided for, wholly or partly, in the overlay device 251.
In
Control processor 310-1 of
Control processor 310-1 of
Tuners 315 of
Network interface 320 of
Storage medium 325 of
Home automation settings database 347 of
Home automation settings database 347 of
Home automation script database 348 of
In some embodiments, home automation script database 348 of
EPG database 330 of
Decoder module 333 of
Television interface 335 of
Still referring to
DVR database 345 of
On-demand programming database 327 of
Referring back to tuners 315 of
Tuning management processor 310-2 of
Descrambling engine 365 of
In some embodiments, the television receiver 300 of
For simplicity, television receiver 300 of
While the television receiver 300 has been illustrated as a satellite-based television receiver, it is to be appreciated that techniques below may be implemented in other types of television receiving devices, such as cable receivers, terrestrial receivers, IPTV receivers, or the like. In some embodiments, the television receiver 300 may be configured as a hybrid receiving device, capable of receiving content from disparate communication networks, such as satellite and terrestrial television broadcasts. In some embodiments, the tuners may be in the form of network interfaces capable of receiving content from designated network locations. The home automation functions of television receiver 300 may be performed by an overlay device. When such an overlay device is used, television programming functions may still be provided by a television receiver that is not used to provide home automation functions.
Turning now to
The method 400 may include receiving data from a camera in operative communication with the host system (step 402). For example, the host system may receive image data from a camera, where the image data is representative of an outdoor weather event or condition that is captured by the camera. It is noted that the camera may include an indoor and/or an outdoor camera. In some cases, the image data includes one or more still images captured by the camera at predetermined time intervals. In other cases, the image data includes a moving image, such as a video file, that is captured over a predefined length of time by the camera at predetermined time intervals. In still other examples, the method 400 may include receiving audio data at the host system from a microphone in operative communication with the host system, where the audio data corresponds to the received image data. For instance, the audio data and the image data may be taken at approximately the same time, and/or during the same predetermined time intervals. In another aspect, the microphone may be provided by the camera, such that the image data and the audio data are sent to the host system together as a video stream. Other examples are possible.
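Merely by way of example, step 402 might be sketched as follows, assuming a camera object exposing a hypothetical `grab_frame()` method and an optional microphone exposing `record()`; the `Capture` record and the interval values are illustrative assumptions.

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class Capture:
    timestamp: float
    image: bytes                   # still frame or short video clip
    audio: Optional[bytes] = None  # sound recorded at roughly the same time

def poll(camera, microphone=None, interval_s=300, count=3):
    """Collect captures at a fixed interval, pairing audio with each image."""
    captures = []
    for _ in range(count):
        captures.append(Capture(
            timestamp=time.time(),
            image=camera.grab_frame(),
            audio=microphone.record(seconds=5) if microphone else None))
        time.sleep(interval_s)
    return captures
```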
Further, method 400 may include analyzing the received data for one or more weather events (step 404). For example, the method may include analyzing the image data and/or audio data to identify an outdoor weather event, such as various forms of precipitation, cloud coverage, wind, level of sunshine, and/or temperature data. In some cases, the data received by the host system for analysis may be detected by various sensors, such as temperature and/or humidity sensors in operative communication with the host system. In other cases, the image data may contain images of sensor readings, such as a temperature reading at a temperature gauge. Portions of such images may be analyzed and the temperature reading may be extracted therefrom.
In some cases, the method 400 includes comparing the image data to baseline image data to identify the outdoor weather event in real-time and/or during other predetermined time intervals. In other cases, the method 400 includes analyzing the image data along with the audio data to identify the outdoor weather events. Still, in some examples, the method 400 may include, based on the comparison, determining one or more descriptive tags representative of the identified outdoor weather event. Further, the method 400 may include receiving a third-party weather forecast that includes current conditions for a geographic location of the host system, such as a zip code of a home having the host system, and/or determining whether the identified outdoor weather event is consistent with the third-party weather forecast. In this way, the host system may confirm the outdoor weather event, and/or initiate additional alerts or instructions if the weather event is different from the third-party weather forecast.
As described herein, the method 400 may provide for weather detection via an analysis of images, such as a comparison of images to determine various weather conditions and events. It is contemplated that images received at the host system may be compared with one or more baseline images previously stored and/or taken at the host system. Such baseline images may be taken by static cameras, and/or movable cameras, which may allow a user to capture multiple different angles or fields of view throughout a period of time and/or time of day. In some cases, the baseline images may be taken at predetermined time intervals throughout a day and/or week and stored by the host system under various categories and/or digitally marked with various descriptive tags. Such categories may correspond to an angle or field of view for the image and/or seasons of the year. For instance, baseline images having colored leaves may be categorized as autumn and implemented for comparing images during autumnal months of the year, while other baseline images having no leaves may be categorized as late autumn and/or winter and applied accordingly. It is contemplated that the baseline images and subsequent images utilized for determining weather event(s) may be taken with the same camera in operative communication with the host system, although different and/or additional cameras may be utilized as well.
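Merely by way of example, the baseline lookup and comparison described above might be sketched as follows. The category keys, the nearest-hour selection, and the mean-absolute-difference test are illustrative assumptions, not the specific comparison technique of the host system.

```python
import numpy as np

# Baselines stored per (season, field-of-view) category; each entry pairs
# the hour of day it was captured with the image array.
baselines = {
    ("autumn", "front_yard"): [(9, np.full((4, 4), 120.0)),
                               (15, np.full((4, 4), 140.0))],
}

def baseline_for(season, view, hour):
    """Pick the stored baseline captured nearest the current hour of day."""
    candidates = baselines[(season, view)]
    return min(candidates, key=lambda c: abs(c[0] - hour))[1]

def differs_from_baseline(frame, baseline, threshold=25.0):
    """Flag a significant change via mean absolute pixel difference."""
    return np.abs(frame.astype(float) - baseline.astype(float)).mean() > threshold

frame = np.full((4, 4), 220.0)  # e.g., a suddenly white (snowy) scene
print(differs_from_baseline(frame, baseline_for("autumn", "front_yard", 10)))
```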
The descriptive tags may be automatically assigned, user-determined, and/or based on user assignment. For instance, during setup and/or review of baseline images, the user may determine if a particular image is representative of a sunny day and/or a cloudy day. The host system may tag the image with the appropriate descriptive tag, and utilize such determinations as baselines for analyzing subsequently captured images. In some cases, the baseline images are captured and designated as representative of a normal day, where the host system may compare subsequent images to the normal day to determine if extra sunshine and/or a low amount of sunshine is prevalent. Other examples are possible.
In some cases, the host system may provide multiple modes of operation for detecting weather events via imaging. For instance, in a lower-power standard mode, the host system may check incoming images from the camera against the appropriate baseline images at predetermined intervals of time that are less frequent than in real-time. In a real-time operation mode, the host system may implement a high-end, frame-by-frame comparison of all or a higher number of incoming images from the camera. It is contemplated that the host system may toggle between the two different modes, and/or other modes, based on time of day, upon determination of rapidly changing weather patterns, based on reception of national weather service warnings or alerts, and/or based on power availability and/or pricing.
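Merely by way of example, the mode selection described above might be sketched as follows; the trigger conditions, price threshold, and intervals shown are illustrative assumptions.

```python
def select_mode(nws_alert_active, recent_change_count, power_price_cents):
    """Return a (mode, check_interval_seconds) pair for weather imaging."""
    if nws_alert_active or recent_change_count >= 3:
        return ("real_time", 0)      # frame-by-frame comparison
    if power_price_cents > 20:
        return ("standard", 900)     # back off when power is expensive
    return ("standard", 300)         # default: check every five minutes

print(select_mode(nws_alert_active=False, recent_change_count=4,
                  power_price_cents=10))   # ('real_time', 0)
```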
Merely by way of example, the host system may be configured to take a preset number of baseline images over a day, week, and/or year in a standard mode, a low-power mode, and/or a combination of such modes. It is contemplated that a static-position camera may provide a smaller set of baseline images, while a movable-position camera may provide a larger set of baseline images in multiple positions. The host system may determine a positioning of the camera to mark the baseline images associated with the positioning, and/or determine a static or movable feature of the camera. In some examples, the host system may compare an image during predetermined time intervals, such as every few minutes, to the baseline image(s) to determine whether there is any significant change or difference. In another example, comparisons may be ramped up into real time with frame-by-frame comparisons. It is noted that the comparison of baseline images with incoming image data may contribute toward the determination of weather events and/or trigger additional levels of scrutiny and comparison. The method 400 may be utilized in conjunction with other weather-prediction and/or confirmation features, as described further below.
It is further noted that various techniques may be implemented by the host system for determining weather events with images. In some cases, determining the weather events includes detecting a reflectivity, sheen, and/or color of a surface, such as a sidewalk or sky captured in the image. For instance, the host system may determine from the image data that a typically grey concrete sidewalk is now white and that the current weather event is snow. The host system may determine from the image that the typically grey concrete sidewalk is now a darker shade of grey and/or more reflective. Based on the determination, the host system may further determine that the weather event is wet conditions and/or rain. Such weather events may further be provided as the descriptive tags. In another example, the host system may determine that a change in a sky portion of the image data indicates clouds and/or an incoming weather front. Other examples are possible. It is noted that such analyses may trigger higher levels of sensitivity and cause the host system to request statuses from other home automation system devices via the home automation network. Merely by way of example, upon determining that the sidewalk is wet, the host system may inquire if a sprinkler near the sidewalk is on or was recently on, to determine if the wet perception of the sidewalk is due to the sprinkler rather than rain.
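Merely by way of example, the surface-appearance heuristic and the sprinkler cross-check might be sketched as follows, assuming a sidewalk region of the frame is averaged and classified against a stored baseline grey level; the brightness thresholds and the sprinkler methods are illustrative assumptions.

```python
import numpy as np

def classify_sidewalk(region, baseline_grey=120.0):
    """Classify a sidewalk image region (H x W array) by mean brightness."""
    brightness = float(np.mean(region))
    if brightness > baseline_grey + 60:
        return "snow"   # the typically grey concrete now appears white
    if brightness < baseline_grey - 25:
        return "wet"    # darker and more reflective than the baseline
    return "dry"

def confirm_wet_cause(tag, sprinkler):
    """Cross-check a "wet" tag against sprinkler status over the network."""
    if tag == "wet" and (sprinkler.is_on() or sprinkler.ran_recently(minutes=30)):
        return "sprinkler"
    return "rain" if tag == "wet" else tag
```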
In some aspects, the host system may utilize overall image contrast for the weather detection. For example, the host system may determine that cloudy, overcast skies result in less contrast in images, and/or that heavy snow likewise reduces overall contrast. In other aspects, the host system may monitor position changes of objects via the image data. For example, the host system may detect that a tree branch shifts position or otherwise changes from frame to frame, which indicates wind. In another aspect, the host system may identify that a patio chair position has changed, which may indicate a level of wind, such as high wind conditions.
In some examples, the camera and/or host system provides high video resolution and/or frame rate, such that wind speed and/or direction may be estimated based on tracking a horizontal, vertical, or other directional movement of precipitation, such as hail, snow, rain, and so on. For instance, the host system may analyze the image for a change in distance and/or direction of such precipitation. In other examples, the host system may receive user input for scaling and/or otherwise measuring an object, such as a general size of a rain drop or hail. The host system may utilize such information to determine rainfall in inches per hour and/or estimate total precipitation. The host system may maintain logs of precipitation and/or total rainfall, which may be utilized for setting patterns of sprinkler systems and/or watering a garden.
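Merely by way of example, the drift-based wind estimate and the rainfall log might be sketched as follows, assuming the user-supplied pixels-per-meter scale described above; the frame-to-frame tracking itself is left abstract, and the numbers are placeholders.

```python
import math

def wind_from_drift(dx_pixels, dy_pixels, frame_dt_s, pixels_per_meter):
    """Estimate the fall angle and horizontal drift speed of precipitation
    tracked between two consecutive frames."""
    angle_deg = math.degrees(math.atan2(dx_pixels, dy_pixels))
    speed_m_s = (dx_pixels / pixels_per_meter) / frame_dt_s
    return angle_deg, speed_m_s

def log_rainfall(log, gauge_mm, hour):
    """Append an hourly rainfall reading; the running total can feed
    sprinkler and garden-watering rules."""
    log.append((hour, gauge_mm))
    return sum(mm for _, mm in log)

angle, speed = wind_from_drift(dx_pixels=12, dy_pixels=40,
                               frame_dt_s=1 / 30, pixels_per_meter=800)
print(f"{angle:.0f} degrees off vertical, ~{speed:.2f} m/s horizontal drift")
```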
It is noted that the method 400 may integrate other measurements and/or data taken from other devices to determine one or more weather events that are represented in the image data. For example, the host system may communicate with a motion detector to look for indications of other possibly confounding activities, such as kids outside playing with water and wetting the sidewalk. In another example, the host system may determine, via the home automation network, whether a sprinkler system is on. In yet another example, the method 400 may integrate weather broadcasts to determine and/or confirm if rain is in the area. Still, other examples are possible. For instance, the host system may activate a lighting device and/or a light on the camera for capturing evening and/or low-light images. In some examples, security cameras may include built-in lights or infrared lights, which may be activated by the host system at the predetermined intervals of time when image data is captured.
As further described herein, the method 400 may provide for weather detection via an analysis of sound, which may identify and/or confirm a particular weather event. For instance, such sound data may be detected by a microphone from a security camera, cell phone camera, and/or any other camera in operative communication with the host system. The microphone may capture sounds of rain, hail, high wind, and so on, which may be analyzed by the host system to confirm other possible triggers and/or weather events. In another aspect, the host system may provide an audio decibel comparison to confirm or determine various characteristics of weather events. For example, the host system may determine a general direction and/or position of a weather event, such as whether a storm is approaching the home and/or moving further away from the home. In another aspect, the host system may determine, based on the sound data, whether the storm is increasing in intensity, decreasing in intensity, and/or has already passed.
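Merely by way of example, the decibel-trend analysis might be sketched as follows; the peak-level inputs and the 3 dB margin are illustrative assumptions.

```python
def storm_trend(peak_db_windows, margin_db=3.0):
    """Classify the trend of storm loudness across capture windows."""
    if len(peak_db_windows) < 2:
        return "unknown"
    delta = peak_db_windows[-1] - peak_db_windows[0]
    if delta > margin_db:
        return "approaching_or_intensifying"
    if delta < -margin_db:
        return "receding_or_weakening"
    return "steady"

print(storm_trend([52.0, 55.5, 61.0]))  # approaching_or_intensifying
```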
Still further, the method 400 may include determining one or more rules based on the analysis of the data (step 406). For example, the method 400 may include determining a home automation rule based on the identified outdoor weather event, where the home automation rule includes an operational setting of a home automation device. In some cases, the method 400 may include determining the home automation rule based on a match of the home automation rule with the one or more descriptive tags associated with the identified outdoor weather event.
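Merely by way of example, the tag matching of step 406 might be sketched as follows; the rule set, device names, and settings are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Rule:
    tags: frozenset        # descriptive tags that trigger this rule
    device: str            # home automation device to instruct
    setting: dict = field(default_factory=dict)  # operational setting

RULES = [
    Rule(frozenset({"rain"}), "garage_door", {"state": "closed"}),
    Rule(frozenset({"hot", "sunny"}), "west_shades", {"state": "closed"}),
    Rule(frozenset({"rain"}), "sprinklers", {"delay_hours": 24}),
]

def matching_rules(event_tags):
    """Return every rule whose tags all appear on the identified event."""
    tags = set(event_tags)
    return [r for r in RULES if r.tags <= tags]

for rule in matching_rules({"rain", "wind"}):
    print(rule.device, rule.setting)   # garage_door and sprinklers fire
```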
Further, the method 400 may include instructing home automation devices based on the determined rules (step 408). For instance, the host system may instruct the home automation device with particular operational settings and/or operational parameters via the home automation network.
Still further, the method 400 may include sending an alert to a mobile device (step 410), such as a laptop, smart phone, and/or a tablet device. For instance, the alert may be indicative of or otherwise include the image data, the identified outdoor weather event, and/or one or more descriptive tags associated with the identified outdoor weather event. In another aspect, the alert may be provided via a desktop webpage and/or a mobile webpage, email, text message, and/or other channels of communication for relaying information from the home automation network to outside devices.
It is noted that the method 400 may include automatic weather detection and/or be user-initiated. For example, the user may press or otherwise activate a button for “show me the current weather at home” from the mobile application on the user's mobile device. Mobile alerts may be sent from the host system to a television for display via a display screen, to a cellular phone, and/or to a tablet device, among other devices. The mobile alert may include a video clip of the weather event, still images of the weather event, and/or current conditions that are determined based on the analysis of the image data, such as descriptive tags determined for the image data.
In another example, the home automation rules may include rules that trigger various devices to operate and/or respond to various weather events. Such rules may be user-defined and/or modified further via the mobile application. Merely by way of example, the home automation rules may include: closing a garage door upon detection of a weather event indicating rain; opening and/or closing automated doors, windows, and/or skylights based on detection of rain, sun, and/or temperature; closing automated shades on a west side of the house if the detected weather event indicates a hot and sunny afternoon; delaying an automated watering system if the weather event indicates rain that lasts for longer than a predetermined period of time, such as two minutes; turning on outdoor and/or indoor lights based on a detected level of natural light; adjusting a thermostat for heating and/or air conditioning systems, for instance, to prevent pipes from freezing when the detected weather event indicates extreme temperatures and weather; and so on. Further, the host system may provide weather triggers that are activated when certain descriptive tags and/or weather events are detected, such as rain, snow, hail, high wind, and/or any other user-defined situation. Such weather triggers may include or call out specific home automation rules associated with a start of a weather event, and/or an end of the weather event. Such weather triggers may further include a minimum amount of time or duration for the event, and/or a minimum threshold amount of precipitation. Other examples are possible.
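Merely by way of example, such a weather trigger might be represented as follows; the field names, durations, and thresholds are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WeatherTrigger:
    tag: str                        # e.g., "rain", "snow", "high_wind"
    min_duration_s: int             # event must persist this long to fire
    min_precip_mm: Optional[float]  # ignore events below this accumulation
    on_start: str                   # rule to run when the event begins
    on_end: str                     # rule to run when the event ends

TRIGGERS = [
    WeatherTrigger("rain", 120, 0.5, "close_garage", "resume_watering"),
    WeatherTrigger("high_wind", 60, None, "retract_awning", "extend_awning"),
]

def should_fire(trigger, tag, duration_s, precip_mm):
    """True when an observed event satisfies the trigger's conditions."""
    if tag != trigger.tag or duration_s < trigger.min_duration_s:
        return False
    return trigger.min_precip_mm is None or precip_mm >= trigger.min_precip_mm
```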
In yet another example, the host system may send merchandise reminders to mobile devices based on the detected weather event and/or a future forecasted weather event. For instance, the host system may provide purchase suggestions to buy shovels and/or salt based on a detected wintry weather condition, and/or a fan based on a detected hot weather condition. In some cases, such merchandise reminders may be transmitted along with advertisements for the suggested products via text message and/or email to the user's mobile device. In another example, the merchandise reminder may include a suggestion to restock food if a storm is arriving. In yet another example, the host system may determine that an air conditioning unit is broken, and send a reminder to the user's mobile device to have it repaired. Such reminders may include relevant information, such as a phone number and/or address for picking up products and/or setting up repairs. In further aspects, the host system may integrate calendar events; for instance, if the host system receives weather reports indicating unfavorable conditions for an upcoming scheduled barbeque, an alert may be sent to the user. In another example, the host system may monitor weather conditions more closely and/or switch to a real-time weather detection mode on the day of the scheduled event to stay up-to-date and transmit alerts promptly. Still, other examples are possible.
Turning now to
As shown in
As further shown in
Turning now to
The method 600 may include receiving weather-related data that is collected by a sensor (step 602). It is contemplated that such weather-related data may include still images, video files, and/or sounds recorded by a camera and/or microphone. In other aspects, other weather-related data may be collected by various other sensors, such as temperature by a thermometer, amount of rainfall by a rain gauge, and so on. The method 600 may include analyzing the data, such as analyzing images to determine a weather event (step 604) and/or analyzing sound to determine a weather event (step 606). The method 600 may include comparing, consolidating, and/or otherwise compiling the determined weather events and sending the determined weather events to a mobile application (step 608). In an example, sending the determined weather events may include pushing such events to the mobile application each time weather events are determined and available, sending the determined weather events in response to a user input that indicates interest in receiving the weather events, and/or pushing the determined weather events to the mobile application hourly or at other time intervals. In another aspect, at step 608, the method 600 includes providing the determined weather events through a cloud network and/or otherwise making the determined weather events available for retrieval by any mobile device and/or desktop device. Other examples are possible.
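Merely by way of example, the flow of steps 602 through 608 might be sketched as follows, with trivial stand-ins for the image and sound analyses described earlier; the function names are illustrative assumptions.

```python
def detect_and_publish(image, audio, analyze_image, analyze_sound, push):
    """Consolidate image- and sound-derived weather events and publish them."""
    events = set()
    events.update(analyze_image(image))   # step 604: image-derived events
    events.update(analyze_sound(audio))   # step 606: sound-derived events
    if events:
        push(sorted(events))              # step 608: send to the mobile app
    return events

# Example wiring with trivial stand-ins:
found = detect_and_publish(
    image=b"...", audio=b"...",
    analyze_image=lambda img: {"rain"},
    analyze_sound=lambda aud: {"thunder"},
    push=print)                           # prints ['rain', 'thunder']
```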
The method 600 may further include receiving user input via the mobile application for controlling one or more devices, such as home automation devices in a home automation network (step 610). It is contemplated that the user may be located remotely and therefore permitted to remotely control the home automation network and/or devices via the mobile application. It is further contemplated that the user input may be based, at least in part, on the determined weather conditions that are provided at the mobile application. Based on the user input, the method 600 may include instructing one or more devices with certain operational settings via the home automation network (step 612).
Turning now to
The computer device 700 is shown comprising hardware elements that may be electrically coupled via a bus 702 (or may otherwise be in communication, as appropriate). The hardware elements may include a processing unit with one or more processors 704, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 706, which may include without limitation a remote control, a mouse, a keyboard, and/or the like; and one or more output devices 708, which may include without limitation a presentation device (e.g., television), a printer, and/or the like.
The computer system 700 may further include (and/or be in communication with) one or more non-transitory storage devices 710, which may comprise, without limitation, local and/or network accessible storage, and/or may include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory, and/or a read-only memory, which may be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
The computer device 700 might also include a communications subsystem 712, which may include without limitation a modem, a network card (wireless and/or wired), an infrared communication device, a wireless communication device and/or a chipset such as a Bluetooth™ device, 802.11 device, WiFi device, WiMax device, cellular communication facilities such as GSM (Global System for Mobile Communications), W-CDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), etc., and/or the like. The communications subsystem 712 may permit data to be exchanged with a network (such as the network described below, to name one example), other computer systems, and/or any other devices described herein. In many embodiments, the computer system 700 will further comprise a working memory 714, which may include a random access memory and/or a read-only memory device, as described above.
The computer device 700 also may comprise software elements, shown as being currently located within the working memory 714, including an operating system 716, device drivers, executable libraries, and/or other code, such as one or more application programs 718, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. By way of example, one or more procedures described with respect to the method(s) discussed above, and/or system components might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions may be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
A set of these instructions and/or code might be stored on a non-transitory computer-readable storage medium, such as the storage device(s) 710 described above. In some cases, the storage medium might be incorporated within a computer system, such as computer system 700. In other embodiments, the storage medium might be separate from a computer system (e.g., a removable medium, such as flash memory), and/or provided in an installation package, such that the storage medium may be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer device 700 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 700 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
It will be apparent that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
As mentioned above, in one aspect, some embodiments may employ a computer system (such as the computer device 700) to perform methods in accordance with various embodiments of the disclosure. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 700 in response to processor 704 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 716 and/or other code, such as an application program 718) contained in the working memory 714. Such instructions may be read into the working memory 714 from another computer-readable medium, such as one or more of the storage device(s) 710. Merely by way of example, execution of the sequences of instructions contained in the working memory 714 may cause the processor(s) 704 to perform one or more procedures of the methods described herein.
The terms “machine-readable medium” and “computer-readable medium,” as used herein, may refer to any non-transitory medium that participates in providing data that causes a machine to operate in a specific fashion. In an embodiment implemented using the computer device 700, various computer-readable media might be involved in providing instructions/code to processor(s) 704 for execution and/or might be used to store and/or carry such instructions/code. In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take the form of non-volatile media or volatile media. Non-volatile media may include, for example, optical and/or magnetic disks, such as the storage device(s) 710. Volatile media may include, without limitation, dynamic memory, such as the working memory 714.
Example forms of physical and/or tangible computer-readable media may include a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium; a compact disc or any other optical medium; ROM, RAM, or any other memory chip or cartridge; or any other medium from which a computer may read instructions and/or code. Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 704 for execution. By way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 700.
The communications subsystem 712 (and/or components thereof) generally will receive signals, and the bus 702 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 714, from which the processor(s) 704 retrieves and executes the instructions. The instructions received by the working memory 714 may optionally be stored on a non-transitory storage device 710 either before or after execution by the processor(s) 704.
It should further be understood that the components of computer device 700 can be distributed across a network. For example, some processing may be performed in one location using a first processor while other processing may be performed by another processor remote from the first processor. Other components of computer system 700 may be similarly distributed. As such, computer device 700 may be interpreted as a distributed computing system that performs processing in multiple locations. In some instances, computer system 700 may be interpreted as a single computing device, such as a distinct laptop, desktop computer, or the like, depending on the context.
The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various method steps or procedures, or system components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
Specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those of skill with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
Also, configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations may be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
Furthermore, the example embodiments described herein may be implemented as logical operations in a computing device in a networked computing system environment. The logical operations may be implemented as: (i) a sequence of computer implemented instructions, steps, or program modules running on a computing device; and (ii) interconnected logic or hardware modules running within a computing device.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Number | Name | Date | Kind |
---|---|---|---|
4127966 | Schmidt | Dec 1978 | A |
4386436 | Kocher et al. | May 1983 | A |
4581606 | Mallory | Apr 1986 | A |
4694607 | Ishida et al. | Sep 1987 | A |
4728949 | Platte et al. | Mar 1988 | A |
4959713 | Morotomi et al. | Sep 1990 | A |
5400246 | Wilson et al. | Mar 1995 | A |
5770896 | Nakajima | Jun 1998 | A |
5805442 | Crater et al. | Sep 1998 | A |
5822012 | Jeon et al. | Oct 1998 | A |
5886638 | Tanguay | Mar 1999 | A |
5894331 | Yang | Apr 1999 | A |
5926090 | Taylor et al. | Jul 1999 | A |
5970030 | Dimitri et al. | Oct 1999 | A |
6081758 | Parvulescu | Jun 2000 | A |
6104334 | Allport | Aug 2000 | A |
6107918 | Klein et al. | Aug 2000 | A |
6107935 | Comerford et al. | Aug 2000 | A |
6111517 | Atick et al. | Aug 2000 | A |
6119088 | Ciluffo | Sep 2000 | A |
6142913 | Ewert | Nov 2000 | A |
6182094 | Humpleman et al. | Jan 2001 | B1 |
6225938 | Hayes et al. | May 2001 | B1 |
6286764 | Garvey et al. | Sep 2001 | B1 |
6330621 | Bakke et al. | Dec 2001 | B1 |
6337899 | Alcendor et al. | Jan 2002 | B1 |
6377858 | Koeppe | Apr 2002 | B1 |
6405284 | Bridge | Jun 2002 | B1 |
6415257 | Jungua et al. | Jul 2002 | B1 |
6502166 | Cassidy | Dec 2002 | B1 |
6529230 | Chong | Mar 2003 | B1 |
6543051 | Manson et al. | Apr 2003 | B1 |
6553375 | Huang et al. | Apr 2003 | B1 |
6646676 | DaGraca et al. | Nov 2003 | B1 |
6662282 | Cochran | Dec 2003 | B2 |
6663375 | Ulcej | Dec 2003 | B1 |
6744771 | Barber et al. | Jun 2004 | B1 |
6748343 | Alexander et al. | Jun 2004 | B2 |
6751657 | Zothner | Jun 2004 | B1 |
6756998 | Bilger | Jun 2004 | B1 |
6876889 | Lortz | Apr 2005 | B1 |
6891838 | Petite et al. | May 2005 | B1 |
6931104 | Foster et al. | Aug 2005 | B1 |
6976187 | Arnott et al. | Dec 2005 | B2 |
6989731 | Kawai et al. | Jan 2006 | B1 |
7009528 | Griep | Mar 2006 | B2 |
7010332 | Irvin et al. | Mar 2006 | B1 |
7088238 | Karaoguz et al. | Aug 2006 | B2 |
7103545 | Furuta | Sep 2006 | B2 |
7143298 | Wells et al. | Nov 2006 | B2 |
7216002 | Anderson | May 2007 | B1 |
7234074 | Cohn et al. | Jun 2007 | B2 |
7260538 | Calderone et al. | Aug 2007 | B2 |
7346917 | Gatto et al. | Mar 2008 | B2 |
7372370 | Stults et al. | May 2008 | B2 |
7386666 | Beauchamp et al. | Jun 2008 | B1 |
7391319 | Walker | Jun 2008 | B1 |
7395369 | Sepez et al. | Jul 2008 | B2 |
7395546 | Asmussen | Jul 2008 | B1 |
7529677 | Wittenberg | May 2009 | B1 |
7574494 | Mayernick et al. | Aug 2009 | B1 |
7579945 | Richter et al. | Aug 2009 | B1 |
7590703 | Cashman et al. | Sep 2009 | B2 |
7640351 | Reckamp et al. | Dec 2009 | B2 |
7659814 | Chen et al. | Feb 2010 | B2 |
7694005 | Reckamp et al. | Apr 2010 | B2 |
7739718 | Young et al. | Jun 2010 | B1 |
7861034 | Yamamoto et al. | Dec 2010 | B2 |
7870232 | Reckamp et al. | Jan 2011 | B2 |
7945297 | Philipp | May 2011 | B2 |
7969318 | White et al. | Jun 2011 | B2 |
8013730 | Oh et al. | Sep 2011 | B2 |
8042048 | Wilson | Oct 2011 | B2 |
8086757 | Chang | Dec 2011 | B2 |
8106768 | Neumann | Jan 2012 | B2 |
8156368 | Chambliss et al. | Apr 2012 | B2 |
8171148 | Lucas et al. | May 2012 | B2 |
8180735 | Ansari et al. | May 2012 | B2 |
8201261 | Barfield et al. | Jun 2012 | B2 |
8221290 | Vincent et al. | Jul 2012 | B2 |
8275143 | Johnson | Sep 2012 | B2 |
8289157 | Patenaude et al. | Oct 2012 | B2 |
8290545 | Terlizzi | Oct 2012 | B2 |
8310335 | Sivakkolundhu | Nov 2012 | B2 |
8316413 | Crabtree | Nov 2012 | B2 |
8320578 | Kahn et al. | Nov 2012 | B2 |
8335312 | Gerhardt et al. | Dec 2012 | B2 |
8350694 | Trundle et al. | Jan 2013 | B1 |
8413204 | White | Apr 2013 | B2 |
8436902 | Kuehnle | May 2013 | B2 |
8498572 | Schooley et al. | Jul 2013 | B1 |
8516087 | Wilson et al. | Aug 2013 | B2 |
8539567 | Logue et al. | Sep 2013 | B1 |
8550368 | Butler et al. | Oct 2013 | B2 |
8619136 | Howarter et al. | Dec 2013 | B2 |
8620841 | Filson et al. | Dec 2013 | B1 |
8644525 | Bathurst et al. | Feb 2014 | B2 |
8645327 | Falkenburg et al. | Feb 2014 | B2 |
8667529 | Taxier | Mar 2014 | B2 |
8750576 | Huang et al. | Jun 2014 | B2 |
8780201 | Scalisi et al. | Jul 2014 | B1 |
8781508 | Blakely | Jul 2014 | B1 |
8786698 | Chen et al. | Jul 2014 | B2 |
8799413 | Taylor et al. | Aug 2014 | B2 |
8818898 | Schlossberg et al. | Aug 2014 | B2 |
8898709 | Crabtree | Nov 2014 | B2 |
8923823 | Wilde | Dec 2014 | B1 |
8930700 | Wielopolski | Jan 2015 | B2 |
8948793 | Birkhold et al. | Feb 2015 | B1 |
8965170 | Benea et al. | Feb 2015 | B1 |
9019111 | Sloo et al. | Apr 2015 | B1 |
9049567 | Le Guen et al. | Jun 2015 | B2 |
9191804 | Paczkowski et al. | Nov 2015 | B1 |
9246921 | Vlaminck et al. | Jan 2016 | B1 |
9258593 | Chen et al. | Feb 2016 | B1 |
9286482 | Dumont et al. | Mar 2016 | B1 |
9353500 | Andreski | May 2016 | B1 |
9443142 | Reynolds, Jr. | Sep 2016 | B2 |
9462041 | Hagins et al. | Oct 2016 | B1 |
9495860 | Lett | Nov 2016 | B2 |
9511259 | Mountain | Dec 2016 | B2 |
9589448 | Schneider et al. | Mar 2017 | B1 |
9599981 | Crabtree | Mar 2017 | B2 |
9621959 | Mountain | Apr 2017 | B2 |
9628286 | Nguyen et al. | Apr 2017 | B1 |
9632746 | Keipert et al. | Apr 2017 | B2 |
9633186 | Ingrassia, Jr. et al. | Apr 2017 | B2 |
9729989 | Marten | Aug 2017 | B2 |
9769522 | Richardson | Sep 2017 | B2 |
9772612 | McCarthy et al. | Sep 2017 | B2 |
9798309 | Tirpak | Oct 2017 | B2 |
20010012998 | Jouet et al. | Aug 2001 | A1 |
20020003493 | Durst et al. | Jan 2002 | A1 |
20020019725 | Petite | Feb 2002 | A1 |
20020063633 | Park | May 2002 | A1 |
20020080238 | Ohmura | Jun 2002 | A1 |
20020193989 | Geilhufe et al. | Dec 2002 | A1 |
20030005431 | Shinohara | Jan 2003 | A1 |
20030052789 | Colmenarez et al. | Mar 2003 | A1 |
20030097452 | Kim et al. | May 2003 | A1 |
20030126593 | Mault | Jul 2003 | A1 |
20030133551 | Kahn | Jul 2003 | A1 |
20030140352 | Kim | Jul 2003 | A1 |
20030154242 | Hayes et al. | Aug 2003 | A1 |
20030192600 | Ford | Oct 2003 | A1 |
20030201900 | Bachinski et al. | Oct 2003 | A1 |
20040019489 | Funk et al. | Jan 2004 | A1 |
20040036579 | Megerle | Feb 2004 | A1 |
20040117038 | Karaoguz et al. | Jun 2004 | A1 |
20040117843 | Karaoguz et al. | Jun 2004 | A1 |
20040121725 | Matsui | Jun 2004 | A1 |
20040128034 | Lenker et al. | Jul 2004 | A1 |
20040143838 | Rose | Jul 2004 | A1 |
20040148419 | Chen et al. | Jul 2004 | A1 |
20040148632 | Park et al. | Jul 2004 | A1 |
20040260407 | Wimsatt | Dec 2004 | A1 |
20040266419 | Arling et al. | Dec 2004 | A1 |
20050038875 | Park | Feb 2005 | A1 |
20050049862 | Choi et al. | Mar 2005 | A1 |
20050106267 | Frykman et al. | May 2005 | A1 |
20050159823 | Hayes et al. | Jul 2005 | A1 |
20050188315 | Campbell et al. | Aug 2005 | A1 |
20050200478 | Koch et al. | Sep 2005 | A1 |
20050245292 | Bennett et al. | Nov 2005 | A1 |
20050252622 | Reid | Nov 2005 | A1 |
20050264698 | Eshleman | Dec 2005 | A1 |
20050289614 | Baek et al. | Dec 2005 | A1 |
20060011145 | Kates | Jan 2006 | A1 |
20060059977 | Kates | Mar 2006 | A1 |
20060087428 | Wolfe et al. | Apr 2006 | A1 |
20060115156 | Nakajima et al. | Jun 2006 | A1 |
20060136968 | Han et al. | Jun 2006 | A1 |
20060143679 | Yamada et al. | Jun 2006 | A1 |
20060155389 | Pessolano et al. | Jul 2006 | A1 |
20060192680 | Scuka et al. | Aug 2006 | A1 |
20060244624 | Wang et al. | Nov 2006 | A1 |
20060253894 | Bookman et al. | Nov 2006 | A1 |
20070044119 | Sullivan et al. | Feb 2007 | A1 |
20070078910 | Bopardikar | Apr 2007 | A1 |
20070129220 | Bardha | Jun 2007 | A1 |
20070135225 | Nieminen et al. | Jun 2007 | A1 |
20070142022 | Madonna et al. | Jun 2007 | A1 |
20070146545 | Iwahashi | Jun 2007 | A1 |
20070150460 | Evans | Jun 2007 | A1 |
20070157258 | Jung et al. | Jul 2007 | A1 |
20070192486 | Wilson et al. | Aug 2007 | A1 |
20070194922 | Nathan et al. | Aug 2007 | A1 |
20070256085 | Reckamp et al. | Nov 2007 | A1 |
20070271518 | Tischer et al. | Nov 2007 | A1 |
20070275670 | Chen et al. | Nov 2007 | A1 |
20070279244 | Haughawout et al. | Dec 2007 | A1 |
20070280504 | Badawy | Dec 2007 | A1 |
20080019392 | Lee | Jan 2008 | A1 |
20080021971 | Halgas | Jan 2008 | A1 |
20080022322 | Grannan et al. | Jan 2008 | A1 |
20080046930 | Smith et al. | Feb 2008 | A1 |
20080062258 | Bentkovski et al. | Mar 2008 | A1 |
20080062965 | Silva et al. | Mar 2008 | A1 |
20080092199 | McCarthy et al. | Apr 2008 | A1 |
20080109095 | Braithwaite et al. | May 2008 | A1 |
20080114963 | Cannon et al. | May 2008 | A1 |
20080120639 | Walter et al. | May 2008 | A1 |
20080123825 | Abramson et al. | May 2008 | A1 |
20080140736 | Jarno | Jun 2008 | A1 |
20080144884 | Habibi | Jun 2008 | A1 |
20080163330 | Sparrell | Jul 2008 | A1 |
20080179053 | Kates | Jul 2008 | A1 |
20080236214 | Han | Oct 2008 | A1 |
20080278635 | Hardacker et al. | Nov 2008 | A1 |
20080284905 | Chuang | Nov 2008 | A1 |
20080288876 | Fleming | Nov 2008 | A1 |
20080297660 | Shioya | Dec 2008 | A1 |
20090023554 | Shim | Jan 2009 | A1 |
20090027225 | Farley | Jan 2009 | A1 |
20090033505 | Jones et al. | Feb 2009 | A1 |
20090040013 | Ebrom et al. | Feb 2009 | A1 |
20090066320 | Posey | Mar 2009 | A1 |
20090069038 | Olague et al. | Mar 2009 | A1 |
20090083374 | Saint Clair | Mar 2009 | A1 |
20090112541 | Anderson et al. | Apr 2009 | A1 |
20090138507 | Burckart et al. | May 2009 | A1 |
20090146834 | Huang | Jun 2009 | A1 |
20090165069 | Kirchner | Jun 2009 | A1 |
20090167555 | Kohanek | Jul 2009 | A1 |
20090190040 | Watanabe et al. | Jul 2009 | A1 |
20090235992 | Armstrong | Sep 2009 | A1 |
20090249428 | White et al. | Oct 2009 | A1 |
20090270065 | Hamada et al. | Oct 2009 | A1 |
20090271203 | Resch et al. | Oct 2009 | A1 |
20090286654 | Rice | Nov 2009 | A1 |
20090307715 | Santamaria et al. | Dec 2009 | A1 |
20100031286 | Gupta et al. | Feb 2010 | A1 |
20100045471 | Meyers | Feb 2010 | A1 |
20100046918 | Takao et al. | Feb 2010 | A1 |
20100083371 | Bennetts et al. | Apr 2010 | A1 |
20100097225 | Petricoin, Jr. | Apr 2010 | A1 |
20100102082 | Ebrom et al. | Apr 2010 | A1 |
20100122284 | Yoon et al. | May 2010 | A1 |
20100131280 | Bogineni | May 2010 | A1 |
20100138007 | Clark et al. | Jun 2010 | A1 |
20100138858 | Velazquez et al. | Jun 2010 | A1 |
20100146445 | Kraut | Jun 2010 | A1 |
20100161082 | Ebrom et al. | Jun 2010 | A1 |
20100164732 | Wedig et al. | Jul 2010 | A1 |
20100211546 | Grohman et al. | Aug 2010 | A1 |
20100277300 | Cohn et al. | Nov 2010 | A1 |
20100283579 | Kraus et al. | Nov 2010 | A1 |
20100309004 | Grundler et al. | Dec 2010 | A1 |
20100321151 | Matsuura et al. | Dec 2010 | A1 |
20110003665 | Burton et al. | Jan 2011 | A1 |
20110018693 | Lim et al. | Jan 2011 | A1 |
20110030016 | Pino et al. | Feb 2011 | A1 |
20110032423 | Jing et al. | Feb 2011 | A1 |
20110093126 | Toba et al. | Apr 2011 | A1 |
20110119325 | Paul et al. | May 2011 | A1 |
20110139076 | Pu et al. | Jun 2011 | A1 |
20110140832 | Vinkenvleugel et al. | Jun 2011 | A1 |
20110150432 | Paul et al. | Jun 2011 | A1 |
20110156862 | Langer | Jun 2011 | A1 |
20110157468 | Dai | Jun 2011 | A1 |
20110167250 | Dicks et al. | Jul 2011 | A1 |
20110187928 | Crabtree | Aug 2011 | A1 |
20110187930 | Crabtree | Aug 2011 | A1 |
20110187931 | Kim | Aug 2011 | A1 |
20110202956 | Connelly et al. | Aug 2011 | A1 |
20110267180 | Ferringo et al. | Nov 2011 | A1 |
20110270549 | Jeansonne et al. | Nov 2011 | A1 |
20110282837 | Gounares et al. | Nov 2011 | A1 |
20110283311 | Luong | Nov 2011 | A1 |
20110285528 | Weinstein et al. | Nov 2011 | A1 |
20110295396 | Chinen et al. | Dec 2011 | A1 |
20110296463 | Suslov | Dec 2011 | A1 |
20120019388 | Kates et al. | Jan 2012 | A1 |
20120047083 | Qiao et al. | Feb 2012 | A1 |
20120047532 | McCarthy | Feb 2012 | A1 |
20120059495 | Weiss et al. | Mar 2012 | A1 |
20120069246 | Thornberry et al. | Mar 2012 | A1 |
20120092183 | Corbett et al. | Apr 2012 | A1 |
20120094696 | Ahn et al. | Apr 2012 | A1 |
20120105724 | Candelore | May 2012 | A1 |
20120124245 | Reeves et al. | May 2012 | A1 |
20120124456 | Perez et al. | May 2012 | A1 |
20120154108 | Sugaya | Jun 2012 | A1 |
20120154138 | Cohn et al. | Jun 2012 | A1 |
20120164975 | Dodeja et al. | Jun 2012 | A1 |
20120167646 | Sharma et al. | Jul 2012 | A1 |
20120226366 | Lee et al. | Sep 2012 | A1 |
20120226768 | Gaines et al. | Sep 2012 | A1 |
20120271472 | Brunner et al. | Oct 2012 | A1 |
20120271670 | Zaloom | Oct 2012 | A1 |
20120280802 | Yoshida et al. | Nov 2012 | A1 |
20120291068 | Khushoo et al. | Nov 2012 | A1 |
20120314713 | Singh et al. | Dec 2012 | A1 |
20120316876 | Jang et al. | Dec 2012 | A1 |
20120326835 | Cockrell et al. | Dec 2012 | A1 |
20130006400 | Caceres et al. | Jan 2013 | A1 |
20130013106 | Carelli et al. | Jan 2013 | A1 |
20130031037 | Brandt et al. | Jan 2013 | A1 |
20130046800 | Assi et al. | Feb 2013 | A1 |
20130049950 | Wohlert | Feb 2013 | A1 |
20130053063 | McSheffrey | Feb 2013 | A1 |
20130060358 | Li et al. | Mar 2013 | A1 |
20130070044 | Naidoo et al. | Mar 2013 | A1 |
20130074061 | Averbuch et al. | Mar 2013 | A1 |
20130090213 | Amini et al. | Apr 2013 | A1 |
20130120137 | Lehmann | May 2013 | A1 |
20130124192 | Lindmark et al. | May 2013 | A1 |
20130138757 | Ferron | May 2013 | A1 |
20130147604 | Jones et al. | Jun 2013 | A1 |
20130152139 | Davis et al. | Jun 2013 | A1 |
20130158717 | Zywicki et al. | Jun 2013 | A1 |
20130166073 | Pine et al. | Jun 2013 | A1 |
20130179926 | White et al. | Jul 2013 | A1 |
20130185750 | Ayoub | Jul 2013 | A1 |
20130204408 | Thiruvengada et al. | Aug 2013 | A1 |
20130219482 | Brandt | Aug 2013 | A1 |
20130238326 | Kim et al. | Sep 2013 | A1 |
20130242074 | Sekiguchi et al. | Sep 2013 | A1 |
20130247117 | Yamada et al. | Sep 2013 | A1 |
20130249688 | Nguyen et al. | Sep 2013 | A1 |
20130267383 | Watterson | Oct 2013 | A1 |
20130278828 | Todd | Oct 2013 | A1 |
20130289788 | Gupta et al. | Oct 2013 | A1 |
20130300576 | Sinsuan et al. | Nov 2013 | A1 |
20130318559 | Crabtree | Nov 2013 | A1 |
20130321637 | Frank et al. | Dec 2013 | A1 |
20130324247 | Esaki et al. | Dec 2013 | A1 |
20130325150 | Bury | Dec 2013 | A1 |
20140022051 | Levien et al. | Jan 2014 | A1 |
20140025798 | Apte et al. | Jan 2014 | A1 |
20140028546 | Jeon et al. | Jan 2014 | A1 |
20140070959 | Bhargava et al. | Mar 2014 | A1 |
20140089671 | Logue et al. | Mar 2014 | A1 |
20140095684 | Nonaka et al. | Apr 2014 | A1 |
20140101465 | Wang et al. | Apr 2014 | A1 |
20140135993 | Kang et al. | May 2014 | A1 |
20140140575 | Wolf | May 2014 | A1 |
20140142724 | Park et al. | May 2014 | A1 |
20140160360 | Hsu et al. | Jun 2014 | A1 |
20140167969 | Wedig et al. | Jun 2014 | A1 |
20140168277 | Ashley et al. | Jun 2014 | A1 |
20140192197 | Hanko et al. | Jul 2014 | A1 |
20140192997 | Niu et al. | Jul 2014 | A1 |
20140201315 | Jacob et al. | Jul 2014 | A1 |
20140215505 | Balasubramanian et al. | Jul 2014 | A1 |
20140217905 | Clayton et al. | Aug 2014 | A1 |
20140218517 | Kim et al. | Aug 2014 | A1 |
20140222634 | Gordon et al. | Aug 2014 | A1 |
20140223548 | Wassingbo | Aug 2014 | A1 |
20140266669 | Fadell et al. | Sep 2014 | A1 |
20140266684 | Poder et al. | Sep 2014 | A1 |
20140282653 | Ariantaj et al. | Sep 2014 | A1 |
20140297001 | Silverman | Oct 2014 | A1 |
20140306833 | Ricci | Oct 2014 | A1 |
20140310075 | Ricci | Oct 2014 | A1 |
20140313014 | Huh et al. | Oct 2014 | A1 |
20140333529 | Kim et al. | Nov 2014 | A1 |
20140351832 | Cho et al. | Nov 2014 | A1 |
20140362201 | Nguyen et al. | Dec 2014 | A1 |
20140373074 | Hwang et al. | Dec 2014 | A1 |
20150008846 | Chen et al. | Jan 2015 | A1 |
20150015401 | Wedig et al. | Jan 2015 | A1 |
20150029096 | Ishihara | Jan 2015 | A1 |
20150054910 | Offen et al. | Feb 2015 | A1 |
20150061859 | Matsuoka et al. | Mar 2015 | A1 |
20150066173 | Ellis et al. | Mar 2015 | A1 |
20150074259 | Ansari et al. | Mar 2015 | A1 |
20150082225 | Shearer | Mar 2015 | A1 |
20150084770 | Xiao et al. | Mar 2015 | A1 |
20150085184 | Vidal et al. | Mar 2015 | A1 |
20150097689 | Logue et al. | Apr 2015 | A1 |
20150100167 | Sloo et al. | Apr 2015 | A1 |
20150105880 | Slupik et al. | Apr 2015 | A1 |
20150106866 | Fujita | Apr 2015 | A1 |
20150113571 | Cholas et al. | Apr 2015 | A1 |
20150116113 | Caine et al. | Apr 2015 | A1 |
20150127712 | Fadell et al. | May 2015 | A1 |
20150131500 | Xie et al. | May 2015 | A1 |
20150137967 | Wedig et al. | May 2015 | A1 |
20150142991 | Zaloom | May 2015 | A1 |
20150143406 | Cho et al. | May 2015 | A1 |
20150143408 | Sallas | May 2015 | A1 |
20150145643 | Fadell et al. | May 2015 | A1 |
20150154850 | Fadell et al. | Jun 2015 | A1 |
20150156030 | Fadell et al. | Jun 2015 | A1 |
20150156612 | Vemulapalli | Jun 2015 | A1 |
20150159401 | Patrick et al. | Jun 2015 | A1 |
20150160623 | Holley | Jun 2015 | A1 |
20150160634 | Smith et al. | Jun 2015 | A1 |
20150160635 | Schofield | Jun 2015 | A1 |
20150160636 | McCarthy et al. | Jun 2015 | A1 |
20150160663 | McCarthy et al. | Jun 2015 | A1 |
20150160935 | Nye | Jun 2015 | A1 |
20150161452 | McCarthy et al. | Jun 2015 | A1 |
20150161882 | Lett | Jun 2015 | A1 |
20150162006 | Kummer | Jun 2015 | A1 |
20150163411 | McCarthy, III et al. | Jun 2015 | A1 |
20150163412 | Holley | Jun 2015 | A1 |
20150163535 | McCarthy et al. | Jun 2015 | A1 |
20150172742 | Richardson | Jun 2015 | A1 |
20150180708 | Jacob et al. | Jun 2015 | A1 |
20150192914 | Slupik | Jul 2015 | A1 |
20150198941 | Pederson | Jul 2015 | A1 |
20150241860 | Raid | Aug 2015 | A1 |
20150260424 | Fadell et al. | Sep 2015 | A1 |
20150281824 | Nguyen et al. | Oct 2015 | A1 |
20150309487 | Lyman | Oct 2015 | A1 |
20150325096 | Hatch | Nov 2015 | A1 |
20150334069 | Winston et al. | Nov 2015 | A1 |
20150341599 | Carey | Nov 2015 | A1 |
20150347910 | Fadell et al. | Dec 2015 | A1 |
20150365787 | Farrell | Dec 2015 | A1 |
20160029153 | Linn et al. | Jan 2016 | A1 |
20160041565 | Edwards | Feb 2016 | A1 |
20160047569 | Fadell et al. | Feb 2016 | A1 |
20160063854 | Burton et al. | Mar 2016 | A1 |
20160066046 | Mountain | Mar 2016 | A1 |
20160091471 | Benn | Mar 2016 | A1 |
20160098309 | Kim | Apr 2016 | A1 |
20160100696 | Palashewski et al. | Apr 2016 | A1 |
20160109864 | Lonn | Apr 2016 | A1 |
20160121161 | Mountain | May 2016 | A1 |
20160123741 | Mountain | May 2016 | A1 |
20160163168 | Brav et al. | Jun 2016 | A1 |
20160182249 | Lea | Jun 2016 | A1 |
20160189527 | Peterson et al. | Jun 2016 | A1 |
20160191912 | Lea et al. | Jun 2016 | A1 |
20160191990 | McCarthy | Jun 2016 | A1 |
20160195856 | Spero | Jul 2016 | A1 |
20160196731 | Aich et al. | Jul 2016 | A1 |
20160203700 | Bruhn et al. | Jul 2016 | A1 |
20160234034 | Mahar et al. | Aug 2016 | A1 |
20160256485 | Wager et al. | Sep 2016 | A1 |
20160260135 | Zomet et al. | Sep 2016 | A1 |
20160285644 | Lu et al. | Sep 2016 | A1 |
20160286327 | Marten | Sep 2016 | A1 |
20160323548 | Khot et al. | Nov 2016 | A1 |
20160334811 | Marten | Nov 2016 | A1 |
20160335423 | Beals | Nov 2016 | A1 |
20160338179 | Aliakseyeu et al. | Nov 2016 | A1 |
20160342379 | Keipert et al. | Nov 2016 | A1 |
20160366746 | van de Ven et al. | Dec 2016 | A1 |
20170005822 | Gao | Jan 2017 | A1 |
20170041886 | Baker et al. | Feb 2017 | A1 |
20170048476 | Freiin von Kapri et al. | Feb 2017 | A1 |
20170051925 | Stefanski et al. | Feb 2017 | A1 |
20170054615 | Wilson | Feb 2017 | A1 |
20170065433 | Singh et al. | Mar 2017 | A1 |
20170082987 | Reddy et al. | Mar 2017 | A1 |
20170127124 | Wilson | May 2017 | A9 |
20170146964 | Beals | May 2017 | A1 |
20170168469 | Marten et al. | Jun 2017 | A1 |
20170176961 | Tirpak | Jun 2017 | A1 |
20170187993 | Martch et al. | Jun 2017 | A1 |
20170191693 | Bruhn et al. | Jul 2017 | A1 |
20170191695 | Bruhn et al. | Jul 2017 | A1 |
20170195130 | Landow et al. | Jul 2017 | A1 |
Number | Date | Country |
---|---|---|
2 267 988 | Apr 1998 | CA |
105814555 | Jul 2016 | CN |
2 736 027 | May 2014 | EP |
3 080 677 | Oct 2016 | EP |
3 080 710 | Oct 2016 | EP |
2 304 952 | Mar 1997 | GB |
2008148016 | Jun 2008 | JP |
9320544 | Oct 1993 | WO |
2004068386 | Aug 2004 | WO |
2011095567 | Aug 2011 | WO |
2014068556 | May 2014 | WO |
2015179120 | Nov 2015 | WO |
2016034880 | Mar 2016 | WO |
2016066399 | May 2016 | WO |
2016066442 | May 2016 | WO |
2016182696 | Nov 2016 | WO |
2017116533 | Jul 2017 | WO |
Entry |
---|
International Search Report and Written Opinion for PCT/US2016/028126 dated Jun. 3, 2016, all pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Non-Final Office Action dated Jun. 16, 2016, 30 pages. |
U.S. Appl. No. 14/528,739, filed Oct. 30, 2014 Notice of Allowance dated Jun. 23, 2016, 34 pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014, Non-Final Rejection dated Jun. 17, 2016, 29 pages. |
U.S. Appl. No. 14/710,331, filed May 12, 2015, Non-Final Rejection dated May 20, 2016, 42 pages. |
International Preliminary Report on Patentability for PCT/US2014/055441 dated Jun. 14, 2016, 8 pages. |
International Preliminary Report on Patentability for PCT/US2014/053876 dated Jun. 14, 2016, 7 pages. |
International Preliminary Report on Patentability for PCT/US2014/055476 dated Jun. 14, 2016, 9 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Notice of Allowance dated Nov. 8, 2016, all pages. |
U.S. Appl. No. 14/567,765, filed Dec. 11, 2014, First Action Interview dated Oct. 18, 2016, all pages. |
U.S. Appl. No. 14/584,075, filed Dec. 29, 2014, Final Rejection dated Oct. 6, 2016, all pages. |
U.S. Appl. No. 14/566,977, filed Dec. 11, 2014, Non-Final Rejection dated Oct. 3, 2016, all pages. |
U.S. Appl. No. 14/567,754, filed Dec. 11, 2014, Non-Final Rejection dated Nov. 4, 2016, all pages. |
U.S. Appl. No. 14/567,770, filed Dec. 11, 2014, Non-Final Rejection dated Nov. 4, 2016, all pages. |
U.S. Appl. No. 14/671,299, filed Mar. 27, 2015, Non-Final Rejection dated Oct. 28, 2016, all pages. |
U.S. Appl. No. 14/476,377, filed Sep. 3, 2014, Non-Final Rejection dated Nov. 7, 2016, all pages. |
Office Action for EP14868928.4 dated Sep. 23, 2016, all pages. |
U.S. Appl. No. 14/470,352, filed Aug. 27, 2014 Non-Final Office Action dated Nov. 20, 2015, 28 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Final Office Action dated Oct. 26, 2015, 19 pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014, Pre-Interview First Office Action dated Oct. 1, 2015, 10 pages. |
Fong, A.C.M., et al., "Indoor air quality control for asthma patients using smart home technology," Consumer Electronics (ISCE), 2011 IEEE 15th International Symposium on, IEEE, Jun. 14, 2011, pp. 18-19, XP032007803, DOI: 10.1109/ISCE.2011.5973774, ISBN: 978-1-61284-843-3, Abstract and sections 3 and 4. |
Shunfeng Cheng et al., “A Wireless Sensor System for Prognostics and Health Management,” IEEE Sensors Journal, IEEE Service Center, New York, NY, US, vol. 10, No. 4, Apr. 1, 2010, pp. 856-862, XP011304455, ISSN: 1530-437X, Sections 2 and 3. |
International Search Report and Written Opinion for PCT/EP2015/070286 dated Nov. 5, 2015, 13 pages. |
International Search Report and Written Opinion for PCT/GB2015/052544 dated Oct. 6, 2015, 10 pages. |
International Search Report and Written Opinion for PCT/GB2015/052457 dated Nov. 13, 2015, 11 pages. |
International Search Report and Written Opinion for PCT/EP2015/073299 dated Jan. 4, 2016, 12 pages. |
International Search Report and Written Opinion for PCT/EP2015/073936 dated Feb. 4, 2016, all pages. |
U.S. Appl. No. 14/107,132, filed Dec. 16, 2013, Final Rejection dated Dec. 16, 2015, 32 pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014, Final Rejection dated Feb. 23, 2016, 22 pages. |
U.S. Appl. No. 14/567,348, filed Dec. 11, 2014, Pre-Interview First Office Action dated Jan. 20, 2016, 23 pages. |
U.S. Appl. No. 14/470,352, filed Aug. 27, 2014 Final Office Action dated Mar. 17, 2016, all pages. |
U.S. Appl. No. 14/567,765, filed Dec. 11, 2014, Pre-Interview First Office Action dated Apr. 8, 2016, 30 pages. |
U.S. Appl. No. 14/577,717, filed Dec. 19, 2014, Pre-Interview First Office Action dated Apr. 4, 2016, 29 pages. |
U.S. Appl. No. 14/584,075, filed Dec. 29, 2014, Non-Final Rejection dated Apr. 1, 2016, 40 pages. |
U.S. Appl. No. 14/470,352, filed Aug. 27, 2014 Non-Final Office Action dated Aug. 26, 2016, all pages. |
U.S. Appl. No. 14/107,132, filed Dec. 16, 2013, Non-Final Office Action dated Jul. 18, 2016, all pages. |
U.S. Appl. No. 14/567,783, filed Dec. 11, 2014, Non-Final Rejection dated Aug. 23, 2016, all pages. |
U.S. Appl. No. 14/715,248, filed May 18, 2015, Non-Final Rejection dated Jul. 19, 2016, 34 pages. |
U.S. Appl. No. 14/470,352, filed Aug. 27, 2014 Notice of Allowance dated Dec. 2, 2016, all pages. |
U.S. Appl. No. 15/050,958, filed Feb. 23, 2016 Notice of Allowance dated Dec. 6, 2016, all pages. |
U.S. Appl. No. 15/289,395, filed Oct. 10, 2016 Non-Final Rejection dated Dec. 2, 2016, all pages. |
U.S. Appl. No. 14/107,132, filed Dec. 16, 2013, Notice of Allowance dated Jan. 18, 2017, all pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014, Final Rejection dated Nov. 25, 2016, 22 pages. |
U.S. Appl. No. 14/577,717, filed Dec. 19, 2014, Final Office Action dated Dec. 19, 2016, all pages. |
U.S. Appl. No. 14/567,783, filed Dec. 11, 2014, Final Rejection dated Dec. 20, 2016, all pages. |
U.S. Appl. No. 15/075,412, filed Mar. 21, 2016, Non-Final Rejection dated Dec. 21, 2016, all pages. |
“Acoustic/Ultrasound Ultrasonic Flowmeter Basics,” Questex Media Group LLC, accessed on Dec. 16, 2014, 4 pages. Retrieved from http://www.sensorsmag.com/sensors/acoustic-ultrasound/ultrasonic-flowmeter-basics-842. |
Author Unknown, “Voice Activated TV using the Amulet Remote for Media Center,” AmuletDevices.com, accessed on Jul. 14, 2014, 1 page. Retrieved from http://www.amuletdevices.com/index.php/Features/television.html. |
Author Unknown, "App for Samsung Smart TV®," Crestron Electronics, Inc., accessed on Jul. 14, 2014, 3 pages. Retrieved from http://www.crestron.com/products/smart_tv_television_apps/. |
Author Unknown, “AllJoyn Onboarding Service Frameworks,” Qualcomm Connected Experiences, Inc., accessed on Jul. 15, 2014, 9 pages. Retrieved from https://www.alljoyn.org. |
“Do you want to know how to find water leaks? Use a Bravedo Water Alert Flow Monitor to find out!”, Bravedo.com, accessed Dec. 16, 2014, 10 pages. Retrieved from http://bravedo.com/. |
"International Building Code Excerpts, Updated with recent code changes that impact electromagnetic locks," Securitron, Assa Abloy, IBC/IFC 2007 Supplement and 2009, "Finally—some relief and clarification", 2 pages. Retrieved from: www.securitron.com/Other/.../New_IBC-IFC_Code_Language.pdf. |
"Introduction to Ultrasonic Doppler Flowmeters," OMEGA Engineering Inc., accessed on Dec. 16, 2014, 3 pages. Retrieved from http://www.omega.com/prodinfo/ultrasonicflowmeters.html. |
"Flow Pulse®, Non-invasive clamp-on flow monitor for pipes," Pulsar Process Measurement Ltd, accessed on Dec. 16, 2014, 2 pages. Retrieved from http://www.pulsar-pm.com/product-types/flow/flow-pulse.aspx. |
Lamonica, M., “CES 2010 Preview: Green comes in many colors,” retrieved from CNET.com (http://ces.cnet.com/8301-31045_1-10420381-269.html), Dec. 22, 2009, 2 pages. |
Robbins, Gordon, Deputy Chief, “Addison Fire Department Access Control Installation,” 2006 International Fire Code, Section 1008.1.3.4, 4 pages. |
“Ultrasonic Flow Meters,” RS Hydro Ltd, accessed on Dec. 16, 2014, 3 pages. Retrieved from http://www.rshydro.co.uk/ultrasonic-flowmeter.shtml. |
Wang et al., “Mixed Sound Event Verification on Wireless Sensor Network for Home Automation,” IEEE Transactions on Industrial Informatics, vol. 10, No. 1, Feb. 2014, 10 pages. |
International Search Report and Written Opinion for PCT/EP2011/051608 dated May 30, 2011, 13 pages. |
International Preliminary Report on Patentability for PCT/EP2011/051608 dated Aug. 16, 2012, 8 pages. |
International Search Report and Written Opinion for PCT/US2014/053876 dated Nov. 26, 2014, 8 pages. |
International Search Report and Written Opinion for PCT/US2014/055441 dated Dec. 4, 2014, 10 pages. |
International Search Report and Written Opinion for PCT/US2014/055476 dated Dec. 30, 2014, 10 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010, Office Action dated May 4, 2012, 15 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010, Final Office Action dated Oct. 10, 2012, 16 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Non-Final Office Action dated Apr. 1, 2013, 16 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Non-Final Office Action dated Oct. 15, 2013, 15 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Final Office Action dated Feb. 28, 2014, 17 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Non-Final Office Action dated Aug. 14, 2014, 18 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Non-Final Office Action dated Mar. 11, 2015, 35 pages. |
U.S. Appl. No. 12/700,408, filed Feb. 4, 2010, Notice of Allowance dated Jul. 28, 2012, 8 pages. |
U.S. Appl. No. 13/680,934, filed Nov. 19, 2012, Non-Final Office Action dated Oct. 2, 2013, 7 pages. |
U.S. Appl. No. 13/680,934, filed Nov. 19, 2012, Final Office Action dated Feb. 10, 2014, 13 pages. |
U.S. Appl. No. 13/680,934, filed Nov. 19, 2012, Notice of Allowance dated Apr. 30, 2014, 9 pages. |
U.S. Appl. No. 13/680,934, filed Nov. 19, 2012, Notice of Allowance dated Jul. 25, 2014, 12 pages. |
U.S. Appl. No. 14/107,132, filed Dec. 16, 2013 Non-Final Office Action dated May 27, 2015, 26 pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014 Pre-Interview First Office Action dated Jul. 29, 2015, 20 pages. |
International Preliminary Report on Patentability for PCT/GB2015/052544 dated Mar. 7, 2017, all pages. |
International Search Report and Written Opinion for PCT/US2016/057729 dated Mar. 28, 2017, all pages. |
European Search Report for EP 16 20 0422 dated Jan. 13, 2017, all pages. |
Bdejong_Cree, "Cannot remove last user of a group even though members still exist," Microsoft Visual Studio forum site, Topic ID #580405, Response by Microsoft, Dec. 17, 2010, retrieved on Apr. 6, 2017 from: https://connect.microsoft.com/VisualStudio/feedback/details/580405/tfs-2010-cannont-remove-last-user-of-a-group-even-though-members-still-exists. |
International Preliminary Report on Patentability for PCT/GB2015/052457 dated Feb. 28, 2017, all pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014, Non-Final Rejection dated Apr. 19, 2017, all pages. |
U.S. Appl. No. 14/567,765, filed Dec. 11, 2014, Final Rejection dated Feb. 16, 2017, all pages. |
U.S. Appl. No. 14/485,038, filed Sep. 12, 2014, Non-Final Rejection dated Apr. 6, 2017, all pages. |
U.S. Appl. No. 14/584,075, filed Dec. 29, 2014, Non-Final Rejection dated Mar. 10, 2017, all pages. |
U.S. Appl. No. 14/710,331, filed May 12, 2015, Non-Final Rejection dated Mar. 10, 2017, all pages. |
U.S. Appl. No. 14/566,977, filed Dec. 11, 2014, Final Rejection dated Feb. 10, 2017, all pages. |
U.S. Appl. No. 14/671,299, filed Mar. 27, 2015, Notice of Allowance dated Apr. 17, 2017, all pages. |
U.S. Appl. No. 14/565,853, filed Dec. 10, 2014, Non-Final Rejection dated Mar. 10, 2017, all pages. |
U.S. Appl. No. 15/075,412, filed Mar. 21, 2016, Final Rejection dated Apr. 17, 2017, all pages. |
U.S. Appl. No. 14/497,130, filed Sep. 25, 2014, Non-Final Rejection dated Feb. 8, 2017, all pages. |
U.S. Appl. No. 14/528,402, filed Oct. 30, 2014, Non-Final Rejection dated Apr. 11, 2017, all pages. |
U.S. Appl. No. 14/475,252, filed Sep. 2, 2014, Non-Final Rejection dated Apr. 12, 2017, all pages. |
U.S. Appl. No. 14/832,821, filed Aug. 21, 2015, Non-Final Rejection dated Apr. 24, 2017, all pages. |
U.S. Appl. No. 14/981,501, filed Dec. 28, 2015, Pre-Interview First Office Action dated Apr. 20, 2017, all pages. |
U.S. Appl. No. 15/289,395, filed Oct. 10, 2016 Non-Final Rejection dated Jun. 19, 2017, all pages. |
U.S. Appl. No. 14/497,130, filed Sep. 25, 2014, Final Rejection dated Aug. 4, 2017, all pages. |
U.S. Appl. No. 14/981,501, filed Dec. 28, 2015, First Action Interview Office Action dated Jul. 19, 2017, all pages. |
U.S. Appl. No. 14/567,502, filed Dec. 11, 2014, Final Rejection dated Aug. 7, 2017, all pages. |
Notification of Publication of European Application No. 15763643.2 as EP 3189511 on Jul. 12, 2017, 1 page. |
Notification of Publication of Brazilian Application No. BR 11 2016 011203 2 dated Aug. 8, 2017, 2 pages. |
Notification of Publication of Brazilian Application No. BR 11 2016 010376 9 dated Aug. 8, 2017, 1 page. |
Supplementary European Search Report for EP 14868928 dated Jul. 7, 2017, 11 pages. |
Supplementary European Search Report for EP 14870507 dated Jun. 28, 2017, all pages. |
"Plug-In Carbon Monoxide & Natural Gas Alarm with Backup Battery Protection," Universal Security Instruments, Inc., 2011, 12 pages. |
U.S. Appl. No. 14/584,075, filed Dec. 29, 2014, Final Rejection dated Sep. 9, 2017, all pages. |
U.S. Appl. No. 14/952,580, filed Nov. 25, 2015, Non-Final Rejection dated Sep. 20, 2017, all pages. |
U.S. Appl. No. 15/189,775, filed Jun. 22, 2016, Notice of Allowance dated Sep. 11, 2017, all pages. |
U.S. Appl. No. 14/986,496, filed Dec. 31, 2015, Non-Final Rejection dated Sep. 26, 2017, all pages. |
U.S. Appl. No. 14/710,331, filed May 12, 2015, Final Rejection dated Aug. 16, 2017, all pages. |
U.S. Appl. No. 14/553,763, filed Nov. 25, 2014 Pre-Interview First Office Action dated Oct. 6, 2017, all pages. |
Notification of Publication of European Application No. 16200422.0 as EP 3166308 on May 10, 2017, 2 pages. |
U.S. Appl. No. 14/567,765, filed Dec. 11, 2014, Notice of Allowance dated May 24, 2017, all pages. |
U.S. Appl. No. 14/567,754, filed Dec. 11, 2014, Final Rejection dated May 26, 2017, all pages. |
U.S. Appl. No. 14/567,770, filed Dec. 11, 2014, Final Rejection dated Jun. 1, 2017, all pages. |
U.S. Appl. No. 14/476,377, filed Sep. 3, 2014, Notice of Allowance dated May 19, 2017, all pages. |
Mark Edward Soper, "Absolute Beginner's Guide to Home Automation," 2005, Que Publishing, pp. 57 and 121. |
U.S. Appl. No. 14/982,366, filed Dec. 29, 2015, Non-Final Rejection dated Nov. 1, 2017, all pages. |
U.S. Appl. No. 15/246,079, filed Aug. 24, 2016, Non-Final Rejection dated Oct. 19, 2017, all pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014, Final Rejection dated Oct. 25, 2017, all pages. |
U.S. Appl. No. 14/485,038, filed Sep. 12, 2014, Notice of Allowance dated Nov. 13, 2017, all pages. |
U.S. Appl. No. 14/528,402, filed Oct. 30, 2014, Final Rejection dated Oct. 31, 2017, all pages. |
U.S. Appl. No. 14/981,501, filed Dec. 28, 2015, Final Office Action dated Oct. 10, 2017, all pages. |
U.S. Appl. No. 14/986,483, filed Dec. 31, 2015, Non-Final Rejection dated Dec. 1, 2017, all pages. |
Number | Date | Country
---|---|---|
20160334811 A1 | Nov 2016 | US |