Audio output from an electronic device may be covered up by various sounds coming from other devices in its vicinity. For example, a phone may ring during a television show, but the ring may not be heard by a television viewer if the show is louder than the ring. In that case, the viewer may miss the phone call. In other cases, the television viewer may be engrossed in watching the show and unaware of the presence of other events, such as the ringing. This application is intended to address these and other issues, and to provide related advantages.
In general, the systems and methods disclosed herein are directed to controlling electronic devices, and more specifically, to controlling volume levels of electronic devices.
In one aspect, the present disclosure provides a method for adjusting a volume level of a home automation device. The method may include receiving, by a television receiver, a user input indicative of triggering a remote find operation to locate a remote control. The method may include, in response to the user input, sending, by the television receiver, a remote find instruction to the remote control, whereby the remote find instruction triggers the remote control to emit a sound. Further, the method may include, in response to the user input, sending, by the television receiver, a temporary quiet instruction to the home automation device. The quiet instruction may initiate a lowering of the volume level of the home automation device while the remote control is emitting the sound as a part of the remote find operation.
Various embodiments of the present method may include one or more of the following features. The method may include, in response to the user input, determining, by the television receiver, a sound level being output by the home automation device. The method may include comparing, by the television receiver, the sound level to a threshold sound level, and generating, by the television receiver, the quiet instruction based on the comparison. In another aspect, the method may include, in response to the user input, determining, by the television receiver, one or more additional home automation devices that are currently outputting sound, and sending, by the television receiver, additional temporary quiet instructions to the determined additional home automation devices. The method may include, in response to the user input, determining, by the television receiver, one or more additional home automation devices that are currently outputting sound above a threshold level, and sending, by the television receiver, additional temporary quiet instructions to the determined additional home automation devices that are currently outputting sound above the threshold level.
Other embodiments of the present method may include one or more of the following features. The method may include determining, by the television receiver, a defined period of time has passed since sending the remote find instruction to the remote control, and based on the determination, sending, by the television receiver, a resume instruction to the home automation device to return the lowered volume level to an original volume level. The method may include, based on the determination, sending, by the television receiver, a cease instruction to the remote control to stop emitting a sound. Further, the method may include receiving, by the television receiver, an indication from the remote control to cease the remote find operation, and in response to the indication, sending, by the television receiver, a resume instruction to the home automation device to return the lowered volume level to an original volume level. The quiet instruction may initiate the lowering of the volume level of the home automation device to a muted volume level.
In another aspect, a system for adjusting a volume level of a home automation device includes a computer system, whereby the computer system may be configured to receive a user input indicative of triggering a remote find operation to locate a remote control. The computer system may be configured to, in response to the user input, send a remote find instruction to the remote control, whereby the remote find instruction triggers the remote control to emit a sound. The computer system may be configured to, in response to the user input, send a temporary quiet instruction to the home automation device. The quiet instruction may initiate a lowering of the volume level of the home automation device while the remote control is emitting the sound as a part of the remote find operation.
Various embodiments of the present system may include one or more of the following features. The computer system may be configured to, in response to the user input, determine a sound level being output by the home automation device, compare the sound level to a threshold sound level, and generate the quiet instruction based on the comparison. The computer system may be configured to, in response to the user input, determine one or more additional home automation devices that are currently outputting sound, and send additional temporary quiet instructions to the determined additional home automation devices. The computer system may be configured to, in response to the user input, determine one or more additional home automation devices that are currently outputting sound above a threshold level, and send additional temporary quiet instructions to the determined additional home automation devices that are currently outputting sound above the threshold level.
Other embodiments of the present system may include one or more of the following features. The computer system may be configured to determine a defined period of time has passed since sending the remote find instruction to the remote control, and based on the determination, send a resume instruction to the home automation device to return the lowered volume level to an original volume level. The computer system may be configured to, based on the determination, send a cease instruction to the remote control to stop emitting a sound. The computer system may be configured to receive an indication from the remote control to cease the remote find operation, and in response to the indication, send a resume instruction to the home automation device to return the lowered volume level to an original volume level. The quiet instruction may initiate the lowering of the volume level of the home automation device to a muted volume level.
In yet another aspect, a computer-readable medium having instructions stored thereon for adjusting a volume level of a home automation device is provided, whereby the instructions are executable by one or more processors for receiving a user input indicative of triggering a remote find operation to locate a remote control. The instructions are executable by one or more processors for, in response to the user input, sending a remote find instruction to the remote control, whereby the remote find instruction triggers the remote control to emit a sound, and in response to the user input, sending a temporary quiet instruction to the home automation device. The quiet instruction may initiate a lowering of the volume level of the home automation device while the remote control is emitting the sound as a part of the remote find operation.
Various embodiments of the present computer-readable medium may include one or more of the following features. The instructions are executable by one or more processors for, in response to the user input, determining a sound level being output by the home automation device. Further, the instructions are executable by one or more processors for comparing the sound level to a threshold sound level, and generating the quiet instruction based on the comparison. The instructions are executable by one or more processors for, in response to the user input, determining one or more additional home automation devices that are currently outputting sound, and sending additional temporary quiet instructions to the determined additional home automation devices. The instructions are executable by one or more processors for determining a defined period of time has passed since sending the remote find instruction to the remote control, and based on the determination, sending a resume instruction to the home automation device to return the lowered volume level to an original volume level.
The present invention is described in conjunction with the appended figures.
In the appended figures, similar components and/or features may have the same numerical reference label. Further, various components of the same type may be distinguished by following the reference label by a letter that distinguishes among the similar components and/or features. If only the first numerical reference label is used in the specification, the description is applicable to any one of the similar components and/or features having the same first numerical reference label irrespective of the letter suffix.
In general, the systems and methods disclosed herein provide for adjusting a volume level of an electronic device, and more specifically, for adjusting the volume level of sound being output from the electronic device when presence of another sound is detected. Merely by way of example, upon detection of a desirable sound, such as a remote-control beeping during a remote-control-find-operation, a television and/or speakers outputting other sounds may be muted and/or otherwise turned down to prevent the desired sound from being masked and therefore missed. In some aspects, the systems and methods disclosed herein may be implemented in conjunction with a home automation system, whereby multiple devices are in operative communication with one another via various home automation network protocols, engines, and so on. Such home automation features may be provided for by a television, television receiver, and/or an overlay device which may be in operative communication with the television receiver, to generate and send volume-level adjustment instructions to multiple electronic devices via the network. Other examples are possible. For instance, devices may receive volume adjustment instructions via other wired and/or wireless communication channels, and/or a combination of such channels.
In practice, the present systems and methods may offer various benefits. For example, present systems and methods may provide automatic muting and/or volume adjustments upon detection of specific, predetermined events. In an example described hereinbelow, the systems and methods may aid in a remote control find operation for locating a lost remote control, or any other electronic device in which an audial emitter, finder, and/or locator feature is provided. In the present example, a user may press a button on the television receiver, overlay device, and/or television to relay instructions to the remote control to instantiate a beeping or other sound output, which may last until the find operation is turned off by a user via a button on the remote control, television receiver, television, and/or overlay device. In some aspects, initiation of the find operation may instantiate other alerts, alternatively and/or additionally, such as flashing lights being emitted from the remote control. The present systems and methods may aid the find operation by lowering volume levels of other devices while the desired sound is being output. It is noted that any context may be contemplated for implementing the automatic volume adjustments described herein, and that any specific examples being provided are not intended to be limiting.
Leak detection sensor 224 of
Appliance controller 226 of
Appliances and other electronic devices may also be monitored for electricity usage. For instance, U.S. Pat. Pub. No. 2013/0318559, filed Nov. 19, 2012, to Crabtree, entitled “Apparatus for Displaying Electrical Device Usage Information on a Television Receiver,” which is hereby incorporated by reference, may allow for information regarding the electricity usage of one or more devices (e.g., other home automation devices or circuits within a home that are monitored) to be determined. Control of one or more home automation devices may be dependent on electrical usage and stored electrical rates. For instance, a washing machine may be activated in the evening when rates are lower. Additionally or alternatively, operation of devices may be staggered to help prevent consuming too much power at a given time. For instance, an electric heater may not be activated until a dryer powered via the same circuit is powered down.
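The staggering behavior described above can be sketched as a simple per-circuit load check; the `Circuit` class, device names, and wattage limits below are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch: defer activating a device while another device on the
# same circuit is drawing too much power. Names and wattages are hypothetical.

class Circuit:
    def __init__(self, max_watts):
        self.max_watts = max_watts
        self.active = {}  # device name -> watts currently drawn

    def request_power(self, device, watts):
        """Activate the device only if the circuit can absorb its load."""
        if sum(self.active.values()) + watts <= self.max_watts:
            self.active[device] = watts
            return True
        return False  # deferred; retry after another device powers down

    def power_down(self, device):
        self.active.pop(device, None)

circuit = Circuit(max_watts=2000)
assert circuit.request_power("dryer", 1800) is True
assert circuit.request_power("heater", 1500) is False  # deferred while dryer runs
circuit.power_down("dryer")
assert circuit.request_power("heater", 1500) is True   # activated after dryer stops
```

In this sketch the heater's activation request is simply refused and retried later, mirroring the example in which the electric heater waits for the dryer on the same circuit to power down.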
Garage door controller 228 of
Lock controller 230 of
A home security system 207 of
Irrigation controller 232 of
One or more motion sensors can be incorporated into one or more of the previously detailed home automation devices or as a stand-alone device. Such motion sensors may be used to determine if a structure is occupied. Such information may be used in conjunction with a determined location of one or more wireless devices. If some or all users are not present in the structure, home automation settings may be adjusted, such as by lowering a temperature of thermostat 222, shutting off lights via light controller 220, and determining if one or more doors are closed by door sensor 208. In some embodiments, a user-defined script may be run when it is determined that no users or other persons are present within the structure.
Additional forms of sensors not illustrated in
The home automation functions detailed herein that are attributed to television receiver 150 may alternatively or additionally be incorporated into overlay device 251. As such, a separate overlay device 251 may be connected with display device 160 to provide home automation functionality.
In other embodiments of television receiver 300, fewer or greater numbers of components may be present. It should be understood that the various components of television receiver 300 may be implemented using hardware, firmware, software, and/or some combination thereof. Functionality of components may be combined; for example, functions of descrambling engine 365 may be performed by tuning management processor 310-2. Further, functionality of components may be spread among additional components. For instance, the home automation settings database 347, home automation script database 348, and/or volume controls engine 350 may be provided for, wholly or partly, in the overlay device 251.
Control processor 310-1 of
Control processor 310-1 of
Tuners 315 of
Network interface 320 of
Storage medium 325 of
Home automation settings database 347 of
Home automation settings database 347 of
Home automation script database 348 of
In some embodiments, home automation script database 248 of
EPG database 330 of
Decoder module 333 of
Television interface 335 of
DVR database 345 of
On-demand programming database 327 of
Referring back to tuners 315 of
Tuning management processor 310-2 of
Descrambling engine 365 of
In some embodiments, the television receiver 300 of
For simplicity, television receiver 300 of
While the television receiver 300 has been illustrated as a satellite-based television receiver, it is to be appreciated that the techniques described herein may be implemented in other types of television receiving devices, such as cable receivers, terrestrial receivers, IPTV receivers, or the like. In some embodiments, the television receiver 300 may be configured as a hybrid receiving device, capable of receiving content from disparate communication networks, such as satellite and terrestrial television broadcasts. In some embodiments, the tuners may be in the form of network interfaces capable of receiving content from designated network locations. The home automation functions of television receiver 300 may be performed by an overlay device. If such an overlay device is used, television programming functions may still be provided by a television receiver that is not used to provide home automation functions.
The method 400 may include receiving user input for a remote find operation (step 402), where the user input may be indicative of triggering the remote find operation to locate a remote control. The user input may be received upon a user's manual depression or selection of a dedicated button that is provided on the television receiver, television, and/or overlay device. In other examples, the user input may be received wirelessly at the television receiver via communication with a mobile device, such as a smartphone, tablet, and/or laptop computer. For instance, the user may instantiate the remote find operation via a mobile application that is provided on the mobile device. The mobile application may communicate, wirelessly via a local wireless network and/or cellular networks, with the television receiver to activate the remote find operation. In other examples, user input may be received via activation of other dedicated buttons, such as an external trigger button on a key fob that is dedicated to triggering the remote find operation, and/or a separate mountable trigger button. Merely by way of example, such buttons may be physically mounted to an underside of an end table or coffee table and configured to communicate with the television receiver, television, and/or overlay device upon user activation. Such communication pathways may include infrared, radio frequency, Bluetooth, and/or other technologies.
Further, the method 400 may include instructing the remote control to emit a sound (step 404). For example, in response to receiving the user input, the method 400 may include sending, by the television receiver, a remote find instruction to the remote control, whereby the remote find instruction triggers the remote control to emit a beeping sound. In some examples, the television receiver may trigger the remote control to emit other signals, such as flashing lights and/or a particular sound profile. Such sound profiles may be preset to indicate a location or distance of the remote control from the television receiver. For instance, the television receiver may instruct the remote control to output short, quick beeps upon determining that the remote control is within a predefined close range of the television receiver, such as in the same room as the television receiver. In another example, the television receiver may instruct the remote control to output long, slow beeps upon determining that the remote control is outside the predefined close range, or in another room. In another example, the television receiver may instruct the remote control to emit the sound at a particular decibel or volume level. Such volume levels may be dependent on a measure of a room's ambient noise, which may be determined by the television receiver, and/or on a distance of the remote control from the television receiver. Further, the television receiver may continue to send subsequent instructions regarding the sound emitted from the remote control. For example, if the remote control has not been located within a predetermined period of time, the television receiver may instruct the remote control to increase the volume level of its sound emission. Other examples are possible.
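The distance-dependent sound profiles described above can be sketched as a small selection function; the thresholds, timing values, and function name below are hypothetical assumptions chosen purely for illustration.

```python
# Hypothetical sketch of choosing a beep profile from the remote control's
# estimated distance. The 5-meter close range and the timing values are
# illustrative, not taken from the disclosure.

def beep_profile(distance_m, close_range_m=5.0):
    """Return (beep_duration_s, pause_s) for the remote find sound.

    Short, quick beeps when the remote is within the predefined close
    range (e.g., the same room); long, slow beeps when it is farther away.
    """
    if distance_m <= close_range_m:
        return (0.1, 0.2)   # short, quick beeps
    return (0.8, 1.0)       # long, slow beeps

assert beep_profile(2.0) == (0.1, 0.2)   # same room: quick beeps
assert beep_profile(9.0) == (0.8, 1.0)   # another room: slow beeps
```

A fuller sketch might also scale an output volume with measured ambient noise, as the passage above contemplates.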
The method 400 may include instructing other electronic devices, such as another home automation device, to reduce sound during the remote find operation (step 406). For example, the method 400 may include, in response to receiving the user input, sending, by the television receiver, a temporary quiet instruction to the home automation device. In some examples, sending the temporary quiet instruction may be performed simultaneously with, immediately before, and/or immediately after sending the remote find instruction to the remote control at step 404. The quiet instruction may initiate a lowering of the volume level of the home automation device such that any sound being output from the home automation device does not cover up or otherwise interfere with sound being emitted from the remote control as a part of the remote find operation. In some cases, the quiet instruction initiates the lowering of the volume level of the home automation device to a muted volume level. In another example, the temporary quiet instruction includes instructions for muting or otherwise maintaining the lowered level of sound for a predetermined period of time, such that after passage of that time, the home automation device resumes sound output at an original, preceding volume level. In still another example, upon passage of the predetermined period of time, the television receiver may send additional instructions to the home automation device to reinitiate muting of the volume level and/or to further decrease the volume level of the home automation device. Other examples are possible.
It is contemplated that a variety of methods for communicating the temporary quiet instructions to the home automation device, and/or to a plurality of home automation devices, may be utilized. In a particular example, the television receiver may be in operative, bi-directional communication with a network of home automation devices via a home automation network. In other examples, the television receiver may be connected to the home automation network via the overlay device. It is contemplated that both wireless and wired communication pathways may be implemented. For instance, the television receiver may relay HDMI-CEC commands or instructions via an HDMI link between the television receiver, speakers, and/or television. In another example, the instructions may be relayed to one or more devices via Bluetooth communications. In still other examples, instructions may be relayed by the television receiver via a plurality of different communication pathways.
Further, it is noted that the television receiver may transmit temporary quiet instructions to every device in a network or plurality of networks. In some cases, the temporary quiet instructions are global and transmitted to every device in the network(s), and in other cases, the temporary quiet instructions are unique for each device in the network(s). For instance, the quiet instructions may include muting for particular devices, and a decreased volume setting for other devices. Still, in other examples, the quiet instructions may call for pausing an operation and/or shutting down a device, for instance, if sound output is necessarily tied to the nature of the device's operation. The television receiver may send temporary quiet instructions to only those devices generating sound. In other cases, the television receiver may transmit temporary quiet instructions to only those devices that generate sound above a threshold sound level. Still, in other cases, the television receiver may transmit quiet instructions to devices generating sound in a particular location and/or area of the house. It is noted that the temporary quiet instructions may include any combination of the foregoing and that other examples are possible.
In some cases, the method 400 may include determining which home automation devices are outputting sound and/or their current volume level. For instance, the television receiver may, in response to the user input, determine a sound level being output by a home automation device. The television receiver may compare the sound level to a threshold sound level, and/or generate the quiet instruction based on the comparison. Further, the method 400 may include, in response to the user input, determining one or more additional home automation devices that are currently outputting sound, and/or sending, by the television receiver, additional temporary quiet instructions to the determined additional home automation devices. In still other cases, the method 400 may include, in response to the user input, determining, by the television receiver, one or more additional home automation devices that are currently outputting sound above a threshold level, and sending, by the television receiver, additional temporary quiet instructions to the determined additional home automation devices that are currently outputting sound above the threshold level.
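The threshold comparison just described can be sketched as follows; the device names, decibel values, and instruction dictionary format are hypothetical assumptions used only to illustrate the idea of generating quiet instructions from a comparison.

```python
# Illustrative sketch: compare each device's reported output level to a
# threshold and build temporary quiet instructions only for the devices
# above it. Names and the 40 dB threshold are hypothetical.

def build_quiet_instructions(device_levels_db, threshold_db=40.0):
    """Map device -> temporary quiet instruction for devices louder than
    the threshold; quieter devices receive no instruction."""
    return {
        device: {"action": "lower_volume", "target": "mute"}
        for device, level in device_levels_db.items()
        if level > threshold_db
    }

levels = {"tv": 65.0, "speaker": 55.0, "fridge": 30.0}
instructions = build_quiet_instructions(levels)
assert set(instructions) == {"tv", "speaker"}  # fridge is below the threshold
```

Swapping the `"target"` value per device would model the earlier example in which some devices are muted while others merely receive a decreased volume setting.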
In some cases, the method 400 may include instructing electronic devices to resume sound (step 410). For instance, the method may include determining, by the television receiver, that a defined period of time has passed since sending the remote find instruction to the remote control. Based on the determination, the television receiver may send a resume instruction to the home automation device to return the lowered volume level to an original volume level. In another example, based on an indication from a user to end the remote find operation, the method 400 may include sending, by the television receiver, the resume instruction to the home automation device to return the lowered or muted volume level to an original volume level. Such resume instructions may be transmitted to the home automation device utilizing the same communication pathways used for transmitting the quiet instructions at step 406. It is noted that this step, and/or any of the other steps presented in any method of this disclosure, may be optional. For instance, the temporary quiet instructions may indicate a predetermined period of time for maintaining the lowered or muted volume level at the home automation device, such that after passage of the predetermined period of time, the home automation device automatically resumes its original volume output. In that case, resume instructions may not be sent from the television receiver. Other examples are possible.
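The timed-resume behavior can be sketched as a simple elapsed-time check; the function name, timestamp convention, and instruction format are illustrative assumptions.

```python
# Hypothetical sketch of the resume logic: once a defined period has passed
# since the remote find instruction was sent, issue resume instructions that
# restore each device's original volume level. Timing values are illustrative.

def maybe_resume(sent_at, now, defined_period_s, original_levels):
    """Return resume instructions once the defined period has elapsed,
    or None while the find operation should keep devices quiet."""
    if now - sent_at < defined_period_s:
        return None  # find operation still active; keep devices quiet
    return {device: {"action": "set_volume", "level": level}
            for device, level in original_levels.items()}

originals = {"tv": 18, "speaker": 12}
assert maybe_resume(0.0, 30.0, 60.0, originals) is None   # still quiet
resumed = maybe_resume(0.0, 61.0, 60.0, originals)
assert resumed["tv"]["level"] == 18                        # original restored
```

Caching `original_levels` before sending the quiet instructions is what makes returning to "an original volume level" possible in this sketch; a device-side timer, as the passage notes, could achieve the same result without any resume message.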
The method 500 may start (step 502) with receiving user input to initiate a remote find operation (step 504). In response to receiving the user input, the method 500 may include instructing the remote control to emit a sound (step 506). In some cases, instructions transmitted to the remote control may define a particular sound profile, such as a type of sound and/or sound sequence, and/or a predetermined length of time for emitting the sound. Further, the method 500 may include determining a location of the remote control and/or a distance of the remote control from the television receiver. The instructions transmitted to the remote control may include a volume level of the sound to be emitted. For example, the television receiver may determine that the remote control is located farther away and/or in a different room, and based on the determination, instruct the remote control to emit a louder and/or higher-pitched sound.
The method 500 may include determining whether one or more electronic devices, such as home automation devices, are presently emitting sound, and/or identifying which electronic devices are presently emitting sound (step 508). If a particular device is not emitting sound, the example method 500 may end for that particular device, although it is noted that the remote find operation may continue and that other devices may still receive quiet instructions. If the particular device is emitting sound, the method 500 may determine if the sound is above a threshold level (step 512). If the sound is not above the threshold level, the method 500 may end (step 514) for that particular device while the remote find operation remains active. However, if the sound is above the threshold level, the method 500 may include instructing the electronic device to reduce its sound output (step 516). It is noted that the method 500 may include sending quiet instructions to all devices regardless of their individual levels of sound output. In some cases, the quiet instructions may prevent devices that were not outputting sound from outputting sound later on if the remote find operation is still active. In some cases, the quiet instructions may prevent such devices from outputting sounds above the threshold level and/or from outputting any sound at all.
Further, the method 500 may include determining if a predefined period of time has passed since initiating the remote find operation (step 518). If not, the method 500 may include determining if the remote control has been found (step 520), e.g., if a user input from the remote control has been received by the television receiver. If the remote control has not been found, e.g., no user input from the remote control has been received, the method 500 may loop back to monitoring for passage of the predetermined period of time at step 518. If the remote control has been found at step 520 and/or the predetermined period of time has passed at step 518, the method 500 may include instructing the electronic device to resume its sound output (step 522). In some cases, the instructions may include resuming the sound output at an original level of sound. In another example, the method 500 may include instructing the remote control to stop emitting sound (step 524) and ending (step 526) the remote find operation.
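The per-device decision flow of method 500 can be rendered compactly in code; the function below is a hypothetical sketch, with the step numbers from the description noted in comments, and its instruction names are illustrative assumptions.

```python
# Hypothetical rendering of the method 500 flow for one device: quiet it only
# if it emits sound above the threshold, then resume once the remote is found
# or the predefined period passes. Instruction names are illustrative.

def run_find_operation(device_level_db, threshold_db, found, time_passed):
    """Return the sequence of instructions sent for one device.

    device_level_db may be None if the device is not emitting sound.
    """
    sent = []
    if device_level_db is not None and device_level_db > threshold_db:
        sent.append("quiet")                   # step 516
    if found or time_passed:                   # steps 518 / 520
        if "quiet" in sent:
            sent.append("resume")              # step 522
        sent.append("stop_remote_sound")       # steps 524 / 526
    return sent

assert run_find_operation(50.0, 40.0, found=True, time_passed=False) == \
    ["quiet", "resume", "stop_remote_sound"]
assert run_find_operation(30.0, 40.0, found=True, time_passed=False) == \
    ["stop_remote_sound"]   # below threshold: never quieted, nothing to resume
```

Note that a device quieted at step 516 receives a resume instruction, while a device that was below the threshold never does, matching the branches of the flow described above.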
It is noted that although a remote control find operation is discussed herein, any instance where a desired sound is being output from a device while in the presence of other sound output from other devices, such as televisions and/or speakers, may benefit from the systems and methods described herein.
The method 600 may include determining an onset of a desired sound (step 602). For instance, the method 600 may include determining that a remote control find operation, and/or any other audial-based device finder, is initiated and/or about to emit sound for locating the device. In another example, the audial-based device finder may include a key fob that emits the sound in response to activation of a key locator button or switch, which may be provided on a television receiver, remote control, television, overlay device, and/or any mobile device. Still, in other examples, the method 600 may include identifying events categorized as having desired sounds, such as detecting incoming cellular phone calls, incoming VoIP calls, activity of a baby monitor, and so on. Such detections may be sensed by microphones and/or cameras in operative communication with the television receiver, and/or based on detection of various incoming and/or outgoing signals that are picked up by the television receiver and/or in a vicinity of the television receiver. It is noted that, preceding step 602, the user may activate the method 600 to listen for particular events, and/or the method 600 may be activated based on a timer and/or time of day.
Further, the method 600 may include broadcasting one or more quiet instructions to one or more electronic devices that are not emitting the desired sound (step 604). For instance, the method 600 may determine one or more electronic devices that are outputting other sounds that may potentially mask output of the desired sound. The method 600 may include transmitting quiet instructions to mute and/or otherwise decrease volume levels of those devices. In another example, the method 600 includes sending quiet instructions to all devices to mute and/or decrease volume levels thereof for a predetermined period of time, or while the desired sound is present.
Still further, the method 600 may include determining a cessation of the desired sound (step 606). For instance, the method 600 may include determining that a volume level of the desired sound has diminished, and/or that an incoming/outgoing signal corresponding to the desired sound is no longer present. Still, the method 600 may include detecting an input indicating cessation of the desired sound, e.g. manual user input directed to ending the desired sound. Subsequently, the method 600 may include broadcasting resume instruction(s) to the electronic device(s) to resume an original sound output and/or sound level (step 608). Other examples are possible.
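The onset/cessation cycle of method 600 can be sketched as a small coordinator that saves and restores device volume levels; the class, device names, and volume values below are hypothetical, and real instructions would travel over the home automation network rather than a Python dictionary.

```python
# Hypothetical sketch of method 600: on onset of a desired sound (step 602),
# broadcast quiet instructions to every device except the one emitting it
# (step 604); on cessation (step 606), broadcast resume instructions that
# restore original levels (step 608). Names and levels are illustrative.

class QuietCoordinator:
    def __init__(self, devices):
        self.devices = devices   # device name -> current volume level
        self.saved = None        # snapshot of levels before quieting

    def on_desired_sound_onset(self, source):
        """Mute every device except the source of the desired sound."""
        self.saved = dict(self.devices)
        for device in self.devices:
            if device != source:
                self.devices[device] = 0  # mute

    def on_desired_sound_cessation(self):
        """Restore the original sound levels."""
        if self.saved is not None:
            self.devices.update(self.saved)
            self.saved = None

coord = QuietCoordinator({"tv": 20, "speaker": 15, "phone": 10})
coord.on_desired_sound_onset("phone")  # phone ring is the desired sound
assert coord.devices == {"tv": 0, "speaker": 0, "phone": 10}
coord.on_desired_sound_cessation()
assert coord.devices == {"tv": 20, "speaker": 15, "phone": 10}
```

Snapshotting the levels before quieting is what allows the coordinator to return each device to its original output, in line with the resume behavior described above.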
Turning now to FIG. 7, an embodiment of a computer device 700 upon which the methods described herein may be implemented is illustrated.
The computer device 700 is shown comprising hardware elements that may be electrically coupled via a bus 702 (or may otherwise be in communication, as appropriate). The hardware elements may include a processing unit with one or more processors 704, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 706, which may include without limitation a remote control, a mouse, a keyboard, and/or the like; and one or more output devices 708, which may include without limitation a presentation device (e.g., television), a printer, and/or the like.
The computer system 700 may further include (and/or be in communication with) one or more non-transitory storage devices 710, which may comprise, without limitation, local and/or network accessible storage, and/or may include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory, and/or a read-only memory, which may be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
The computer device 700 might also include a communications subsystem 712, which may include without limitation a modem, a network card (wireless and/or wired), an infrared communication device, a wireless communication device and/or a chipset such as a Bluetooth™ device, 802.11 device, WiFi device, WiMax device, cellular communication facilities such as GSM (Global System for Mobile Communications), W-CDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), etc., and/or the like. The communications subsystem 712 may permit data to be exchanged with a network (such as the network described below, to name one example), other computer systems, and/or any other devices described herein. In many embodiments, the computer system 700 will further comprise a working memory 714, which may include a random access memory and/or a read-only memory device, as described above.
The computer device 700 also may comprise software elements, shown as being currently located within the working memory 714, including an operating system 716, device drivers, executable libraries, and/or other code, such as one or more application programs 718, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. By way of example, one or more procedures described with respect to the method(s) discussed above, and/or system components might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions may be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.
A set of these instructions and/or code might be stored on a non-transitory computer-readable storage medium, such as the storage device(s) 710 described above. In some cases, the storage medium might be incorporated within a computer system, such as computer system 700. In other embodiments, the storage medium might be separate from a computer system (e.g., a removable medium, such as flash memory), and/or provided in an installation package, such that the storage medium may be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer device 700 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 700 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
It will be apparent that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
As mentioned above, in one aspect, some embodiments may employ a computer system (such as the computer device 700) to perform methods in accordance with various embodiments of the disclosure. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 700 in response to processor 704 executing one or more sequences of one or more instructions (which might be incorporated into the operating system 716 and/or other code, such as an application program 718) contained in the working memory 714. Such instructions may be read into the working memory 714 from another computer-readable medium, such as one or more of the storage device(s) 710. Merely by way of example, execution of the sequences of instructions contained in the working memory 714 may cause the processor(s) 704 to perform one or more procedures of the methods described herein.
The terms “machine-readable medium” and “computer-readable medium,” as used herein, may refer to any non-transitory medium that participates in providing data that causes a machine to operate in a specific fashion. In an embodiment implemented using the computer device 700, various computer-readable media might be involved in providing instructions/code to processor(s) 704 for execution and/or might be used to store and/or carry such instructions/code. In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take the form of non-volatile media or volatile media. Non-volatile media may include, for example, optical and/or magnetic disks, such as the storage device(s) 710. Volatile media may include, without limitation, dynamic memory, such as the working memory 714.
Example forms of physical and/or tangible computer-readable media may include a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a compact disc, any other optical medium, ROM, RAM, etc., any other memory chip or cartridge, or any other medium from which a computer may read instructions and/or code. Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 704 for execution. By way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 700.
The communications subsystem 712 (and/or components thereof) generally will receive signals, and the bus 702 then might carry the signals (and/or the data, instructions, etc. carried by the signals) to the working memory 714, from which the processor(s) 704 retrieves and executes the instructions. The instructions received by the working memory 714 may optionally be stored on a non-transitory storage device 710 either before or after execution by the processor(s) 704.
It should further be understood that the components of computer device 700 can be distributed across a network. For example, some processing may be performed in one location using a first processor while other processing may be performed by another processor remote from the first processor. Other components of computer system 700 may be similarly distributed. As such, computer device 700 may be interpreted as a distributed computing system that performs processing in multiple locations. In some instances, computer system 700 may be interpreted as a single computing device, such as a distinct laptop, desktop computer, or the like, depending on the context.
The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various method steps or procedures, or system components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
Specific details are given in the description to provide a thorough understanding of example configurations (including implementations). However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those of skill with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
Also, configurations may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations may be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
Furthermore, the example embodiments described herein may be implemented as logical operations in a computing device in a networked computing system environment. The logical operations may be implemented as: (i) a sequence of computer implemented instructions, steps, or program modules running on a computing device; and (ii) interconnected logic or hardware modules running within a computing device.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Number | Name | Date | Kind |
---|---|---|---|
4386436 | Kocher et al. | May 1983 | A |
4581606 | Mallory | Apr 1986 | A |
4728949 | Platte et al. | Mar 1988 | A |
4959713 | Morotomi et al. | Sep 1990 | A |
5400246 | Wilson et al. | Mar 1995 | A |
5770896 | Nakajima | Jun 1998 | A |
5805442 | Crater et al. | Sep 1998 | A |
5822012 | Jeon et al. | Oct 1998 | A |
5894331 | Yang | Apr 1999 | A |
5926090 | Taylor | Jul 1999 | A |
5970030 | Dimitri et al. | Oct 1999 | A |
6081758 | Parvulescu | Jun 2000 | A |
6104334 | Allport | Aug 2000 | A |
6107918 | Klein et al. | Aug 2000 | A |
6107935 | Comerford et al. | Aug 2000 | A |
6119088 | Ciluffo | Sep 2000 | A |
6182094 | Humpleman et al. | Jan 2001 | B1 |
6330621 | Bakke et al. | Dec 2001 | B1 |
6337899 | Alcendor et al. | Jan 2002 | B1 |
6377858 | Koeppe | Apr 2002 | B1 |
6405284 | Bridge | Jun 2002 | B1 |
6415257 | Jungua et al. | Jul 2002 | B1 |
6502166 | Cassidy | Dec 2002 | B1 |
6529230 | Chong | Mar 2003 | B1 |
6553375 | Huang et al. | Apr 2003 | B1 |
6662282 | Cochran | Dec 2003 | B2 |
6756998 | Bilger | Jun 2004 | B1 |
6931104 | Foster et al. | Aug 2005 | B1 |
6976187 | Arnott et al. | Dec 2005 | B2 |
6989731 | Kawai et al. | Jan 2006 | B1 |
7009528 | Griep | Mar 2006 | B2 |
7010332 | Irvin et al. | Mar 2006 | B1 |
7088238 | Karaoguz et al. | Aug 2006 | B2 |
7103545 | Furuta | Sep 2006 | B2 |
7143298 | Wells et al. | Nov 2006 | B2 |
7234074 | Cohn et al. | Jun 2007 | B2 |
7260538 | Calderone et al. | Aug 2007 | B2 |
7346917 | Gatto et al. | Mar 2008 | B2 |
7372370 | Stults et al. | May 2008 | B2 |
7386666 | Beauchamp et al. | Jun 2008 | B1 |
7395369 | Sepez et al. | Jul 2008 | B2 |
7395546 | Asmussen | Jul 2008 | B1 |
7529677 | Wittenberg | May 2009 | B1 |
7574494 | Mayernick et al. | Aug 2009 | B1 |
7590703 | Cashman et al. | Sep 2009 | B2 |
7640351 | Reckamp et al. | Dec 2009 | B2 |
7694005 | Reckamp et al. | Apr 2010 | B2 |
7739718 | Young et al. | Jun 2010 | B1 |
7861034 | Yamamoto et al. | Dec 2010 | B2 |
7870232 | Reckamp et al. | Jan 2011 | B2 |
7945297 | Philipp | May 2011 | B2 |
7969318 | White et al. | Jun 2011 | B2 |
8013730 | Oh et al. | Sep 2011 | B2 |
8086757 | Chang | Dec 2011 | B2 |
8106768 | Neumann | Jan 2012 | B2 |
8156368 | Chambliss et al. | Apr 2012 | B2 |
8171148 | Lucas et al. | May 2012 | B2 |
8180735 | Ansari et al. | May 2012 | B2 |
8201261 | Barfield et al. | Jun 2012 | B2 |
8221290 | Vincent et al. | Jul 2012 | B2 |
8275143 | Johnson | Sep 2012 | B2 |
8289157 | Patenaude et al. | Oct 2012 | B2 |
8290545 | Terlizzi | Oct 2012 | B2 |
8310335 | Sivakkolundhu | Nov 2012 | B2 |
8316413 | Crabtree | Nov 2012 | B2 |
8320578 | Kahn et al. | Nov 2012 | B2 |
8335312 | Gerhardt et al. | Dec 2012 | B2 |
8413204 | White et al. | Apr 2013 | B2 |
8498572 | Schooley et al. | Jul 2013 | B1 |
8516087 | Wilson et al. | Aug 2013 | B2 |
8550368 | Butler et al. | Oct 2013 | B2 |
8619136 | Howarter et al. | Dec 2013 | B2 |
8644525 | Bathurst | Feb 2014 | B2 |
8645327 | Falkenburg et al. | Feb 2014 | B2 |
8667529 | Taxier | Mar 2014 | B2 |
8750576 | Huang et al. | Jun 2014 | B2 |
8780201 | Scalisi et al. | Jul 2014 | B1 |
8786698 | Chen et al. | Jul 2014 | B2 |
8799413 | Taylor et al. | Aug 2014 | B2 |
8898709 | Crabtree | Nov 2014 | B2 |
8930700 | Wielopolski | Jan 2015 | B2 |
8965170 | Benea et al. | Feb 2015 | B1 |
9019111 | Sloo et al. | Apr 2015 | B1 |
9049567 | Le Guen et al. | Jun 2015 | B2 |
9246921 | Vlaminck et al. | Jan 2016 | B1 |
9462041 | Hagins et al. | Oct 2016 | B1 |
9495860 | Lett | Nov 2016 | B2 |
9511259 | Mountain | Dec 2016 | B2 |
20010012998 | Jouet et al. | Aug 2001 | A1 |
20020019725 | Petite | Feb 2002 | A1 |
20020063633 | Park | May 2002 | A1 |
20020080238 | Ohmura | Jun 2002 | A1 |
20020193989 | Geilhufe et al. | Dec 2002 | A1 |
20030005431 | Shinohara | Jan 2003 | A1 |
20030052789 | Colmenarez et al. | Mar 2003 | A1 |
20030097452 | Kim et al. | May 2003 | A1 |
20030126593 | Mault | Jul 2003 | A1 |
20030133551 | Kahn | Jul 2003 | A1 |
20030140352 | Kim | Jul 2003 | A1 |
20030201900 | Bachinski et al. | Oct 2003 | A1 |
20040019489 | Funk et al. | Jan 2004 | A1 |
20040117038 | Karaoguz et al. | Jun 2004 | A1 |
20040117843 | Karaoguz et al. | Jun 2004 | A1 |
20040121725 | Matsui | Jun 2004 | A1 |
20040128034 | Lenker et al. | Jul 2004 | A1 |
20040143838 | Rose | Jul 2004 | A1 |
20040148419 | Chen et al. | Jul 2004 | A1 |
20040148632 | Park et al. | Jul 2004 | A1 |
20040260407 | Wimsatt | Dec 2004 | A1 |
20040266419 | Arling et al. | Dec 2004 | A1 |
20050038875 | Park | Feb 2005 | A1 |
20050049862 | Choi et al. | Mar 2005 | A1 |
20050188315 | Campbell et al. | Aug 2005 | A1 |
20050200478 | Koch | Sep 2005 | A1 |
20050245292 | Bennett et al. | Nov 2005 | A1 |
20050264698 | Eshleman | Dec 2005 | A1 |
20050289614 | Baek et al. | Dec 2005 | A1 |
20060011145 | Kates | Jan 2006 | A1 |
20060087428 | Wolfe et al. | Apr 2006 | A1 |
20060136968 | Han et al. | Jun 2006 | A1 |
20060143679 | Yamada et al. | Jun 2006 | A1 |
20060155389 | Pessolano et al. | Jul 2006 | A1 |
20070044119 | Sullivan et al. | Feb 2007 | A1 |
20070078910 | Bopardikar | Apr 2007 | A1 |
20070129220 | Bardha | Jun 2007 | A1 |
20070135225 | Nieminen et al. | Jun 2007 | A1 |
20070142022 | Madonna et al. | Jun 2007 | A1 |
20070146545 | Iwahashi | Jun 2007 | A1 |
20070157258 | Jung et al. | Jul 2007 | A1 |
20070192486 | Wilson et al. | Aug 2007 | A1 |
20070256085 | Reckamp et al. | Nov 2007 | A1 |
20070271518 | Tischer et al. | Nov 2007 | A1 |
20070275670 | Chen et al. | Nov 2007 | A1 |
20080021971 | Halgas | Jan 2008 | A1 |
20080022322 | Grannan et al. | Jan 2008 | A1 |
20080062258 | Bentkovski et al. | Mar 2008 | A1 |
20080062965 | Silva | Mar 2008 | A1 |
20080109095 | Braithwaite | May 2008 | A1 |
20080114963 | Cannon et al. | May 2008 | A1 |
20080123825 | Abramson et al. | May 2008 | A1 |
20080140736 | Jarno | Jun 2008 | A1 |
20080163330 | Sparrell | Jul 2008 | A1 |
20080278635 | Hardacker | Nov 2008 | A1 |
20080284905 | Chuang | Nov 2008 | A1 |
20080288876 | Fleming | Nov 2008 | A1 |
20080297660 | Shioya | Dec 2008 | A1 |
20090023554 | Shim | Jan 2009 | A1 |
20090069038 | Olague et al. | Mar 2009 | A1 |
20090112541 | Anderson et al. | Apr 2009 | A1 |
20090138507 | Burckart | May 2009 | A1 |
20090146834 | Huang | Jun 2009 | A1 |
20090165069 | Kirchner | Jun 2009 | A1 |
20090167555 | Kohanek | Jul 2009 | A1 |
20090190040 | Watanabe et al. | Jul 2009 | A1 |
20090249428 | White et al. | Oct 2009 | A1 |
20090271203 | Resch et al. | Oct 2009 | A1 |
20100031286 | Gupta et al. | Feb 2010 | A1 |
20100046918 | Takao et al. | Feb 2010 | A1 |
20100083371 | Bennetts et al. | Apr 2010 | A1 |
20100097225 | Petricoin, Jr. | Apr 2010 | A1 |
20100122284 | Yoon et al. | May 2010 | A1 |
20100131280 | Bogineni | May 2010 | A1 |
20100138007 | Clark et al. | Jun 2010 | A1 |
20100138858 | Velazquez et al. | Jun 2010 | A1 |
20100146445 | Kraut | Jun 2010 | A1 |
20100211546 | Grohman et al. | Aug 2010 | A1 |
20100283579 | Kraus et al. | Nov 2010 | A1 |
20100321151 | Matsuura et al. | Dec 2010 | A1 |
20110030016 | Pino et al. | Feb 2011 | A1 |
20110032423 | Jing et al. | Feb 2011 | A1 |
20110093126 | Toba et al. | Apr 2011 | A1 |
20110119325 | Paul et al. | May 2011 | A1 |
20110150432 | Paul et al. | Jun 2011 | A1 |
20110156862 | Langer | Jun 2011 | A1 |
20110187928 | Crabtree | Aug 2011 | A1 |
20110187930 | Crabtree | Aug 2011 | A1 |
20110187931 | Kim | Aug 2011 | A1 |
20110202956 | Connelly et al. | Aug 2011 | A1 |
20110270549 | Jeansonne et al. | Nov 2011 | A1 |
20110282837 | Gounares et al. | Nov 2011 | A1 |
20110283311 | Luong | Nov 2011 | A1 |
20110295396 | Chinen | Dec 2011 | A1 |
20120019388 | Kates et al. | Jan 2012 | A1 |
20120047532 | McCarthy | Feb 2012 | A1 |
20120059495 | Weiss | Mar 2012 | A1 |
20120069246 | Thornberry et al. | Mar 2012 | A1 |
20120094696 | Ahn et al. | Apr 2012 | A1 |
20120124456 | Perez et al. | May 2012 | A1 |
20120154108 | Sugaya | Jun 2012 | A1 |
20120271670 | Zaloom | Oct 2012 | A1 |
20120280802 | Yoshida et al. | Nov 2012 | A1 |
20120291068 | Khushoo et al. | Nov 2012 | A1 |
20120316876 | Jang et al. | Dec 2012 | A1 |
20120326835 | Cockrell et al. | Dec 2012 | A1 |
20130046800 | Assi et al. | Feb 2013 | A1 |
20130053063 | McSheffrey | Feb 2013 | A1 |
20130060358 | Li et al. | Mar 2013 | A1 |
20130070044 | Naidoo et al. | Mar 2013 | A1 |
20130074061 | Averbuch et al. | Mar 2013 | A1 |
20130090213 | Amini et al. | Apr 2013 | A1 |
20130124192 | Lindmark et al. | May 2013 | A1 |
20130138757 | Ferron | May 2013 | A1 |
20130152139 | Davis et al. | Jun 2013 | A1 |
20130204408 | Thiruvengada et al. | Aug 2013 | A1 |
20130267383 | Watterson | Oct 2013 | A1 |
20130300576 | Sinsuan et al. | Nov 2013 | A1 |
20130318559 | Crabtree | Nov 2013 | A1 |
20130321637 | Frank et al. | Dec 2013 | A1 |
20130324247 | Esaki et al. | Dec 2013 | A1 |
20140095684 | Nonaka et al. | Apr 2014 | A1 |
20140101465 | Wang et al. | Apr 2014 | A1 |
20140142724 | Park et al. | May 2014 | A1 |
20140160360 | Hsu et al. | Jun 2014 | A1 |
20140168277 | Ashley et al. | Jun 2014 | A1 |
20140192197 | Hanko et al. | Jul 2014 | A1 |
20140192997 | Niu et al. | Jul 2014 | A1 |
20140215505 | Balasubramanian et al. | Jul 2014 | A1 |
20140218517 | Kim et al. | Aug 2014 | A1 |
20140266669 | Fadell et al. | Sep 2014 | A1 |
20140266684 | Poder et al. | Sep 2014 | A1 |
20140310075 | Ricci | Oct 2014 | A1 |
20140333529 | Kim et al. | Nov 2014 | A1 |
20140351832 | Cho et al. | Nov 2014 | A1 |
20140362201 | Nguyen et al. | Dec 2014 | A1 |
20140373074 | Hwang et al. | Dec 2014 | A1 |
20150029096 | Ishihara | Jan 2015 | A1 |
20150054910 | Offen et al. | Feb 2015 | A1 |
20150066173 | Ellis et al. | Mar 2015 | A1 |
20150084770 | Xiao et al. | Mar 2015 | A1 |
20150106866 | Fujita | Apr 2015 | A1 |
20150143408 | Sallas | May 2015 | A1 |
20150156612 | Vemulapalli | Jun 2015 | A1 |
20150159401 | Patrick et al. | Jun 2015 | A1 |
20150160623 | Holley | Jun 2015 | A1 |
20150160634 | Smith et al. | Jun 2015 | A1 |
20150160635 | Schofield et al. | Jun 2015 | A1 |
20150160636 | McCarthy et al. | Jun 2015 | A1 |
20150160663 | McCarthy et al. | Jun 2015 | A1 |
20150161452 | McCarthy et al. | Jun 2015 | A1 |
20150161882 | Lett | Jun 2015 | A1 |
20150162006 | Kummer | Jun 2015 | A1 |
20150163411 | McCarthy, III et al. | Jun 2015 | A1 |
20150163412 | Holley et al. | Jun 2015 | A1 |
20150163535 | McCarthy et al. | Jun 2015 | A1 |
20150172742 | Richardson | Jun 2015 | A1 |
20150198941 | Pederson | Jul 2015 | A1 |
20150281824 | Nguyen et al. | Oct 2015 | A1 |
20150309487 | Lyman | Oct 2015 | A1 |
20150341599 | Carey | Nov 2015 | A1 |
20160063854 | Burton et al. | Mar 2016 | A1 |
20160066046 | Mountain | Mar 2016 | A1 |
20160091471 | Benn | Mar 2016 | A1 |
20160109864 | Lonn | Apr 2016 | A1 |
20160121161 | Mountain | May 2016 | A1 |
20160123741 | Mountain | May 2016 | A1 |
20160163168 | Brav et al. | Jun 2016 | A1 |
20160182249 | Lea | Jun 2016 | A1 |
20160191912 | Lea et al. | Jun 2016 | A1 |
20160191990 | McCarthy | Jun 2016 | A1 |
20160203700 | Bruhn et al. | Jul 2016 | A1 |
20160286327 | Marten | Sep 2016 | A1 |
20160334811 | Marten | Nov 2016 | A1 |
20160335423 | Beals | Nov 2016 | A1 |
Number | Date | Country |
---|---|---|
2 267 988 | Apr 1998 | CA |
105814555 | Jul 2016 | CN |
2 736 027 | May 2014 | EP |
3 080 677 | Oct 2016 | EP |
3 080 710 | Oct 2016 | EP |
2 304 952 | Mar 1997 | GB |
2008148016 | Jun 2008 | JP |
9320544 | Oct 1993 | WO |
2004068386 | Aug 2004 | WO |
2011095567 | Aug 2011 | WO |
2014068556 | May 2014 | WO |
2016034880 | Mar 2016 | WO |
2016066442 | May 2016 | WO |
2016066399 | May 2016 | WO |
2016182696 | Nov 2016 | WO |
Entry |
---|
U.S. Appl. No. 14/470,352, filed Aug. 27, 2014 Non Final Office Action mailed Nov. 20, 2015, 28 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Final Office Action mailed Oct. 26, 2015, 19 pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014, Pre-Interview First Office Action mailed Oct. 1, 2015, 10 pages. |
Fong A.C.M. et al, “Indoor air quality control for asthma patients using smart home technology,” Consumer Electronics (ISCE), 2011 IEEE 15th International Symposium on, IEEE, Jun. 14, 2011, pp. 18-19, XP032007803, DOI: 10.1109/ISCE.2011.5973774, ISBN: 978-1-61284-843-3, Abstract and sections 3 and 4. |
Shunfeng Cheng et al., “A Wireless Sensor System for Prognostics and Health Management,” IEEE Sensors Journal, IEEE Service Center, New York, NY, US, vol. 10, No. 4, Apr. 1, 2010, pp. 856-862, XP011304455, ISSN: 1530-437X, Sections 2 and 3. |
International Search Report and Written Opinion for PCT/EP2015/070286 mailed Nov. 5, 2015, 13 pages. |
International Search Report and Written Opinion for PCT/GB2015/052544 mailed Oct. 6, 2015, 10 pages. |
International Search Report and Written Opinion for PCT/GB2015/052457 mailed Nov. 13, 2015, 11 pages. |
International Search Report and Written Opinion for PCT/EP2015/073299 mailed Jan. 4, 2016, 12 pages. |
International Search Report and Written Opinion for PCT/EP2015/073936 mailed Feb. 4, 2016, all pages. |
U.S. Appl. No. 14/107,132, filed Dec. 16, 2013, Final Rejection mailed Dec. 16, 2015, 32 pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014, Final Rejection mailed Feb. 23, 2016, 22 pages. |
U.S. Appl. No. 14/567,348, filed Dec. 11, 2014, Preinterview first office action mailed Jan. 20, 2016, 23 pages. |
U.S. Appl. No. 14/470,352, filed Aug. 27, 2014 Final Office Action mailed Mar. 17, 2016, all pages. |
U.S. Appl. No. 14/567,765, filed Dec. 11, 2014, Preinterview first office action mailed Apr. 8, 2016, 30 pages. |
U.S. Appl. No. 14/577,717, filed Dec. 19, 2014, Preinterview first office action mailed Apr. 4, 2016, 29 pages. |
U.S. Appl. No. 14/584,075, filed Dec. 29, 2014, Non-Final Rejection mailed Apr. 1, 2016, 40 pages. |
“Acoustic/Ultrasound Ultrasonic Flowmeter Basics,” Questex Media Group LLC, accessed on Dec. 16, 2014, 4 pages. Retrieved from http://www.sensorsmag.com/sensors/acoustic-ultrasound/ultrasonic-flowmeter-basics-842. |
Author Unknown, “Voice Activated TV using the Amulet Remote for Media Center,” AmuletDevices.com, accessed on Jul. 14, 2014, 1 page. Retrieved from http://www.amuletdevices.com/index.php/Features/television.html. |
Author Unknown, “App for Samsung Smart TV®,” Crestron Electronics, Inc., accessed on Jul. 14, 2014, 3 pages. Retrieved from http://www.crestron.com/products/smart tv television apps/. |
Author Unknown, “AllJoyn Onboarding Service Frameworks,” Qualcomm Connected Experiences, Inc., accessed on Jul. 15, 2014, 9 pages. Retrieved from https://www.alljoyn.org. |
“Do you want to know how to find water leaks? Use a Bravedo Water Alert Flow Monitor to find out!”, Bravedo.com, accessed Dec. 16, 2014, 10 pages. Retrieved from http://bravedo.com/. |
“International Building Code Excerpts, Updated with recent code changes that impact electromagnetic locks,” Securitron, Assa Abloy, IBC/IFC 2007 Supplement and 2009, “Finally-some relief and clarification”, 2 pages.Retrieved from: www.securitron.com/Other/.../New—IBC-IFC—Code—Language.pdf. |
“Introduction to Ultrasonic Doppler Flowmeters,” Omega Engineering inc., accessed on Dec. 16, 2014, 3 pages. Retrieved from http://www.omega.com/prodinfo/ultrasonicflowmeters.html. |
“Flow Pulse®, Non-invasive clamp-on flow monitor for pipes,” Pulsar Process Measurement Ltd, accessed on Dec. 16, 2014, 2 pages. Retrieved from http://www.pulsar-pm.com/product-types/flow/flow-pulse.aspx. |
Lamonica, M., “CES 2010 Preview: Green comes in many colors,” retrieved from CNET.com (http://ces.cnet.com/8301-31045—1-10420381-269.html), Dec. 22, 2009, 2 pages. |
Robbins, Gordon, Deputy Chief, “Addison Fire Department Access Control Installation,” 2006 International Fire Code, Section 1008.1.3.4, 4 pages. |
“Ultrasonic Flow Meters,” RS Hydro Ltd, accessed on Dec. 16, 2014, 3 pages. Retrieved from http://www.rshydro.co.uk/ultrasonic-flowmeter.shtml. |
Wang et al., “Mixed Sound Event Verification on Wireless Sensor Network for Home Automation,” IEEE Transactions on Industrial Informatics, vol. 10, No. 1, Feb. 2014, 10 pages. |
International Search Report and Written Opinion for PCT/EP2011/051608 mailed on May 30, 2011, 13 pages. |
International Preliminary Report on Patentability for PCT/EP2011/051608 mailed Aug. 16, 2012, 8 pages. |
International Search Report and Written Opinion for PCT/US2014/053876 mailed Nov. 26, 2014, 8 pages. |
International Search Report and Written Opinion for PCT/US2014/055441 mailed Dec. 4, 2014, 10 pages. |
International Search Report and Written Opinion for PCT/US2014/055476 mailed Dec. 30, 2014, 10 pages. |
Mexican Institute of Industrial Property Notice of Allowance dated Feb. 10, 2014, for Mex. Patent Appln No. MX/a/2012/008882, 1 page. |
Mexican Institute of Industrial Property Office Action dated Nov. 1, 2013, for Mex. Patent Appln No. MX/a/2012/008882 is not translated into English, 3 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010, Office Action mailed May 4, 2012, 15 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010, Final Office Action mailed Oct. 10, 2012, 16 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Non-Final Office Action mailed Apr. 1, 2013, 16 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Non-Final Office Action mailed Oct. 15, 2013, 15 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Final Office Action mailed Feb. 28, 2014, 17 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Non-Final Office Action mailed Aug. 14, 2014, 18 pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Non-Final Office Action mailed Mar. 11, 2015, 35 pages. |
U.S. Appl. No. 12/700,408, filed Feb. 4, 2010, Notice of Allowance mailed Jul. 28, 2012, 8 pages. |
U.S. Appl. No. 13/680,934, filed Nov. 19, 2012,Non-Final Office Action mailed Oct. 2, 2013, 7 pages. |
U.S. Appl. No. 13/680,934, filed Nov. 19, 2012,Final Office Action mailed Feb. 10, 2014, 13 pages. |
U.S. Appl. No. 13/680,934, filed Nov. 19, 2012,Notice of Allowance mailed Apr. 30, 2014, 9 pages. |
U.S. Appl. No. 13/680,934, filed Nov. 19, 2012, Notice of Allowance mailed Jul. 25, 2014, 12 pages. |
U.S. Appl. No. 14/107,132, filed Dec. 16, 2013 Non Final Office Action mailed May 27, 2015, 26 pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014 Pre-Interview First Office Action mailed Jul. 29, 2015, 20 pages. |
U.S. Appl. No. 14/470,352, filed Aug. 27, 2014 Non Final Office Action mailed Aug. 26, 2016, all pages. |
U.S. Appl. No. 14/107,132, filed Dec. 16, 2013, Non Final Office Action mailed Jul. 18, 2016, all pages. |
U.S. Appl. No. 14/567,783, filed Dec. 11, 2014, Non Final Rejection mailed Aug. 23, 2016, all pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Notice of Allowance mailed Nov. 8, 2016, all pages. |
U.S. Appl. No. 14/567,765, filed Dec. 11, 2014, First Action interview mailed Oct. 18, 2016, all pages. |
U.S. Appl. No. 14/584,075, filed Dec. 29, 2014, Final Rejection mailed Oct. 6, 2016, all pages. |
U.S. Appl. No. 14/566,977, filed Dec. 11, 2014, Non Final Rejection mailed Oct. 3, 2016, all pages. |
U.S. Appl. No. 14/567,754, filed Dec. 11, 2014, Non Final Rejection mailed Nov. 4, 2016, all pages. |
U.S. Appl. No. 14/567,770, filed Dec. 11, 2014, Non Final Rejection mailed Nov. 4, 2016, all pages. |
U.S. Appl. No. 14/671,299, filed Mar. 27, 2015, Non Final Rejection mailed Oct. 28, 2016, all pages. |
U.S. Appl. No. 14/476,377, filed Sep. 3, 2014, Non-Final Rejection mailed Nov. 7, 2016, all pages. |
Office Action for EP14868928.4 dated Sep. 23, 2016, all pages. |
U.S. Appl. No. 14/470,352, filed Aug. 27, 2014 Notice of Allowance mailed Dec. 2, 2016, all pages. |
U.S. Appl. No. 15/050,958, filed Feb. 23, 2016 Notice of Allowance mailed Dec. 6, 2016, all pages. |
U.S. Appl. No. 15/289,395, filed Oct. 10, 2016 Non-Final Rejection mailed Dec. 2, 2016, all pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014, Final Rejection mailed Nov. 25, 2016, 22 pages. |
International Search Report and Written Opinion for PCT/US2016/028126 mailed Jun. 3, 2016, all pages. |
U.S. Appl. No. 12/700,310, filed Feb. 4, 2010 Non-Final Office Action mailed Jun. 16, 2016, 30 pages. |
U.S. Appl. No. 14/528,739, filed Oct. 30, 2014 Notice of Allowance mailed Jun. 23, 2016, 34 pages. |
U.S. Appl. No. 14/485,188, filed Sep. 12, 2014, Non-Final Rejection mailed Jun. 17, 2016, 29 pages. |
U.S. Appl. No. 14/710,331, filed May 12, 2015, Non-Final Rejection mailed May 20, 2016, 42 pages. |
International Preliminary Report on Patentability for PCT/US2014/055441 issued Jun. 14, 2016, 8 pages. |
International Preliminary Report on Patentability for PCT/US2014/053876 issued Jun. 14, 2016, 7 pages. |
International Preliminary Report on Patentability for PCT/US2014/055476 issued Jun. 14, 2016, 9 pages. |
Number | Date | Country | |
---|---|---|---|
20160342379 A1 | Nov 2016 | US |