In many residential, commercial, and/or industrial environments, occupants may control lights, room temperature, and/or alarm/security systems inside and/or outside buildings or rooms thereof. Occupants may control lights via individual or grouped light switches. Room temperature may be controlled via remote or local thermostatic controllers. Occupants can control and/or monitor alarm/security systems, television systems, and/or music systems via controllers/monitors located in various rooms and/or centrally in a building.
One or more devices, systems, and/or methods may implement one or more techniques for implementing an automation platform (e.g., an OMNI Smart Home) and/or may include a network of proprietary devices installed in (e.g., primary) rooms throughout the site. These devices, processes, and/or techniques may form a neural network that may work together to recognize patterns in the daily routines of people, pets, and/or other repeatable activities, such as, for example, identifying/recognizing individual users and/or user profiles.
The platform can be utilized in residential (e.g., home), commercial, industrial, military, and/or potentially any other site for which on-site automation for a variety of applications may be useful.
One or more techniques may be used in one or more targeted industries/environments such as Residential, Commercial, Smart Home Automation, Home Services, and/or IoT, and/or the like.
The embodiments and other features, advantages and disclosures contained herein, and the manner of attaining them, will become apparent and the present disclosure will be better understood by reference to the following description of various examples of the present disclosure taken in conjunction with the accompanying drawings, wherein:
For the purposes of promoting an understanding of the principles of the present disclosure, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of this disclosure is thereby intended.
The computing device 104 may take the form of a laptop computer, a desktop computer, a computer mainframe, a server, a terminal, a tablet, a smartphone, a cloud-based computing device (e.g., at least partially), a light switch, and/or a modular component, and/or the like.
The processor 132 may be a general-purpose processor, a special-purpose processor, a conventional processor, a digital-signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, one or more Application Specific Integrated Circuits (ASICs), one or more Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), and/or a finite-state machine, and/or the like. The processor 132 may perform signal coding, data processing, power control, sensor control, interface control, video control, audio control, input/output processing, and/or any other functionality that enables the computing device 104 to serve as and/or perform as (e.g., at least partially) one or more of the devices, methods, and/or systems disclosed herein.
The processor 132 may be connected to the transceiver 112, which may be connected to the transmit/receive element 114. The processor 132 and the transceiver 112 may operate as connected separate components (as shown). The processor 132 and the transceiver 112 may be integrated together in an electronic package or chip (not shown).
The transmit/receive element 114 may be configured to transmit signals to, and/or receive signals from, one or more wireless transmit/receive sources (not shown). For example, the transmit/receive element 114 may be an antenna configured to transmit and/or receive RF signals. The transmit/receive element 114 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. The transmit/receive element 114 may be configured to transmit and/or receive RF and/or light signals. The transmit/receive element 114 may be configured to transmit and/or receive any combination of wireless signals.
Although the transmit/receive element 114 is shown as a single element, the computing device 104 may include any number of transmit/receive elements 114 (e.g., the same as for any of the elements 112-150). The computing device 104 may employ Multiple-Input and Multiple-Output (MIMO) technology. For example, the computing device 104 may include two or more transmit/receive elements 114 for transmitting and/or receiving wireless signals.
The transceiver 112 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 114 and/or to demodulate the signals that are received by the transmit/receive element 114. The transceiver 112 may include multiple transceivers for enabling the computing device 104 to communicate via one or more, or multiple, radio access technologies, such as Universal Terrestrial Radio Access (UTRA), Evolved UTRA (E-UTRA), and/or IEEE 802.11, for example. The transceiver 112 may include transceivers for communicating via WiFi, ZigBee, Z-Wave, Bluetooth, and/or proprietary RF.
The processor 132 may be connected to, may receive user input data from, and/or may send (e.g., as output) user data to: the speaker 116, microphone 118, the keypad/keyboard 122, and/or the display/touchpad/touchscreen 126 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit, among others). The processor 132 may retrieve information/data from and/or store information/data in, any type of suitable memory, such as the in-place memory 144 and/or the removable memory 146. The in-place memory 144 may include random-access memory (RAM), read-only memory (ROM), a register, cache memory, semiconductor memory devices, and/or a hard disk, and/or any other type of memory storage device.
The removable memory 146 may include a subscriber identity module (SIM) card, a portable hard drive, a memory stick, and/or a secure digital (SD) memory card, and/or the like. The processor 132 may retrieve information/data from, and/or store information/data in, memory that might not be physically located on the computing device 104, such as on a server, the cloud, and/or a home computer (not shown).
One or more of the elements 112-146 may receive power from the in-place power source 148. In-place power source 148 may be configured to distribute and/or control the power to one or more of the elements 112-146 of the computing device 104. The in-place power source 148 may be any suitable device for powering the computing device 104. For example, the in-place power source 148 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, and/or fuel cells, and/or the like.
Power interface 150 may include a receptacle and/or a power adapter (e.g., transformer, regulator, and/or rectifier) that may receive externally sourced power via one or more AC and/or DC power cables, and/or via wireless power transmission. Any power received via power interface 150 may energize one or more of the elements 112-146 of computing device 104, perhaps for example exclusively or in parallel with in-place power source 148. Any power received via power interface 150 may be used to charge in-place power source 148.
The processor 132 may be connected to the GPS/location circuitry 130, which may be configured to provide location information (e.g., longitude and/or latitude) regarding the current location of the computing device 104. The computing device 104 may acquire location information by way of any suitable location-determination technique.
The processor 132 may be connected to the one or more input/output devices 124, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired and/or wireless connectivity. For example, the one or more input/output devices 124 may include a digital camera (e.g., for photographs and/or video), a hands free headset, a digital music player, a media player, a frequency modulated (FM) radio unit, an Internet browser, and/or a video game player module, and/or the like.
The processor 132 may be connected to the one or more sensor devices 128, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired and/or wireless connectivity. For example, the one or more sensor devices 128 may include an accelerometer, an e-compass, and/or a vibration device, and/or the like.
The processor 132 may be connected to the network interface 134, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wireless and/or wired connectivity. For example, the network interface 134 may include a Network Interface Controller (NIC) module, a Local Area Network (LAN) module, an Ethernet module, a Physical Network Interface (PNI) module, and/or an IEEE 802 module (e.g., one or more IEEE 802.11 standard series protocols), and/or the like.
The processor 132 may be connected to the video interface 136, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired and/or wireless connectivity. For example, the video interface 136 may include a High-Definition Multimedia Interface (HDMI) module, a Digital Visual Interface (DVI) module, a Super Video Graphics Array (SVGA) module, and/or a Video Graphics Array (VGA) module, and/or the like.
The processor 132 may be connected to the USB interface 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired and/or wireless connectivity. For example, the USB interface 138 may include a universal serial bus (USB) port, and/or the like.
The processor 132 may be connected to the optical interface 140, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired and/or wireless connectivity. For example, the optical interface 140 may include a read/write Compact Disc module, a read/write Digital Versatile Disc (DVD) module, and/or a read/write Blu-ray™ disc module, and/or the like.
The processor 132 may be connected to the wireless interface 142, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wireless connectivity. For example, the wireless interface 142 may include a Bluetooth® module, an Ultra-Wideband (UWB) module, a Z-Wave module, a cellular module, a ZigBee module, and/or a Wi-Fi (IEEE 802.11) module, and/or the like.
One or more devices, systems, and/or methods may implement one or more techniques for the home (e.g., residential) market. An Automation Platform (e.g., an OMNI Smart Home Automation Platform) may be installed and/or integrated into a home, commercial location, service location, business location, industrial location, and/or military location. One or more techniques may use a state-of-the-art intuitive user interface for setup and/or day-to-day operation. An (e.g., a single) application may connect one or more devices, and/or everything, that may be found within the home.
There may be full WiFi coverage throughout the home, perhaps with no more “dead zones.” Home automation platforms (e.g., OMNI) may learn and/or remember the location of people, pets, and/or objects inside the home. Perhaps using one or more (e.g., proprietary) Artificial Intelligence (AI) algorithms, the (e.g., Omni Core Home) Automation Platform may recognize patterns and/or may program itself with repeatable patterns over time. The control device/home automation platform (e.g., OMNI device) may provide and/or extend WiFi and/or cellular coverage in the home and/or business.
There are three levels of device being offered: Basic Touch, Sensor Touch, and Display Touch. Each device is connected to a 120V power supply, is modular in design, and is similar in size and shape to a standard light switch plate. The device includes a fixed base (e.g., a Control Unit/Processor) and user-removable input/output faceplates, and has multiple functional capabilities.
The Basic Touch unit consists of a capacitive touch faceplate that fits over an existing light switch cover (name) and connects to a Control Unit/Processor that is housed behind the wall in the space where a standard light switch box would normally exist.
The Basic Touch unit contains one or more of the following functional capabilities:
The Sensor Touch unit contains one or more, or all, of the components and features of the Basic Touch unit and adds one or more of the following components and functionality:
The Display Touch unit contains one or more, or all, of the components and features of the Basic Touch plus Sensor Touch units and adds one or more of the following components and functionality:
There may be one or more (e.g., unique) inputs and/or outputs to the algorithm. There may be one or more Training Dynamic Automation (Learning Models):
There may be one or more Machine Learning models that run locally, even without internet connectivity. There may be sensor-stream metadata.
There may be Audio signature ID recognition: an ability to recognize a sound in the environment and to take an associated action when that sound is detected.
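A minimal sketch of this kind of audio signature matching, assuming sounds have already been reduced to feature vectors; the stored signatures, similarity threshold, and action names below are hypothetical illustrations, not the platform's actual recognizer:

```python
import math

# Hypothetical stored sound signatures (coarse spectral band energies)
# and the actions associated with each recognized sound.
STORED_SIGNATURES = {
    "doorbell": [0.9, 0.1, 0.0, 0.0],
    "glass_break": [0.1, 0.2, 0.3, 0.4],
}
ACTIONS = {
    "doorbell": "notify_user",
    "glass_break": "trigger_alarm",
}

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def recognize(signature, threshold=0.95):
    """Return the action for the best-matching stored sound, or None."""
    best_id, best_score = None, 0.0
    for sound_id, stored in STORED_SIGNATURES.items():
        score = cosine_similarity(signature, stored)
        if score > best_score:
            best_id, best_score = sound_id, score
    return ACTIONS.get(best_id) if best_score >= threshold else None
```

A near-exact match to a stored signature triggers the associated action; dissimilar sounds fall below the threshold and produce no action.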
There may be Software distributed ledger encryption, for example, in-network evaluation of the ledger.
There may be In-home user [geo]location:
There may be device triangulation and/or trilateration:
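The trilateration mentioned above can be sketched as follows, assuming three fixed nodes with known positions and distance estimates (e.g., derived from RF signal strength). This is the standard two-dimensional geometric solution, shown for illustration rather than as the platform's specific implementation:

```python
def trilaterate(p1, p2, p3, r1, r2, r3):
    """2-D trilateration: locate a point from distances to three fixed nodes.

    Subtracting the circle equation centered at p1 from those at p2 and p3
    yields a 2x2 linear system, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("nodes are collinear; position is ambiguous")
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

For example, with nodes at (0, 0), (4, 0), and (0, 4), distance estimates to a device at (1, 1) recover that position.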
There may be Yes/No user input that ties directly into the Machine Learning training data. There may be Complete Voice Configuration, including transmission of credentials as audio to the device.
There may be a transfer of content, profiles, etc. from room-to-room as a device/user moves throughout the house. For example, a user's lighting preferences follow them throughout the residence. For example, information (e.g., messages, multimedia, and/or content, etc.) that may follow users throughout the residence.
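One way such a room-to-room handoff could work, assuming a hypothetical in-memory map of rooms, occupants, and lighting preferences (the data layout and function name are illustrative assumptions):

```python
def handoff_profile(rooms, user, from_room, to_room):
    """Move a user's preference profile to the room they entered and apply it.

    `rooms` maps room names to {"occupants": {user: profile}, "lighting": ...};
    this schema is a hypothetical sketch, not the platform's actual state model.
    """
    profile = rooms[from_room]["occupants"].pop(user)
    rooms[to_room]["occupants"][user] = profile
    # Apply the user's lighting preference in the destination room.
    rooms[to_room]["lighting"] = profile["lighting"]
    return rooms
```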
There may be a display of only necessary data (and controllable devices) on the mobile app based on relevance as determined by ML (e.g., one or more, or all inputs including location).
There may be a use of a combination of sensors (accelerometer and microphone) to measure, identify health and alert a user of system status of ‘mechanical’ systems (e.g., a/c, washer/dryer, refrigerator). There may be other use cases that may gather sensor data, for example to determine a status of users (e.g., health and/or identification, etc.).
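A simple sketch of vibration-based appliance monitoring, assuming a learned baseline vibration level for each appliance; the RMS comparison and tolerance value are illustrative assumptions, not the platform's actual health algorithm:

```python
def rms(samples):
    """Root-mean-square amplitude of a list of sensor samples."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def appliance_status(baseline_rms, samples, tolerance=0.25):
    """Compare current vibration RMS to a learned baseline.

    Returns 'ok' when within tolerance, else 'alert' (e.g., a washer
    that is out of balance vibrates harder than its learned baseline).
    """
    deviation = abs(rms(samples) - baseline_rms) / baseline_rms
    return "ok" if deviation <= tolerance else "alert"
```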
There may be use of calendar data to run in-home systems (e.g., lights, the security system, etc., when occupants are away), such as:
There may be data obfuscation (misinformation) for system security. A stream of data that leaves the home may be in a completely unique format that may never be decipherable again. There may be devices that self-configure as front faceplates are interchanged. In addition, interchanging faceplates can change the features and usability of the device (e.g., and/or the capabilities of the light switch).
There may be one or more unique product designs. The Product Design may include Snap-on/modular front face design.
There may be one or more all-in-one solutions for single-pole, 3-way, and/or 4-way switching, including software-enabled/disabled dimming. This may include monitoring power on multiple inputs to determine the n-way switching mode.
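One hypothetical heuristic for inferring the switching mode from which inputs carry power; the mapping from energized-input count to mode is an assumption for illustration, not a disclosed detection method:

```python
def detect_switching_mode(live_inputs):
    """Map the count of energized inputs to a switching configuration.

    Hypothetical heuristic: one live input suggests a single-pole install,
    two suggest traveler wires of a 3-way circuit, and more suggest 4-way.
    """
    n = sum(1 for live in live_inputs if live)
    if n <= 1:
        return "single-pole"
    if n == 2:
        return "3-way"
    return "4-way"
```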
There may be one or more home automation systems (e.g., entirely) housed in one or more light switches (e.g., a WiFi access point via the switch-location device). There may be integration of a path-light into the light switch (e.g., using ubiquitous motion sensors as predictive controls). There may be interior pet-fencing capability (e.g., using sonic frequencies as deterrents).
There may be one or more System Software Architectures. The device can gather information from the local switch sensors, mobile device sensors, the Cloud, and/or servers, and send it to the CORE for system processing. The system can take several actions, such as, for example, self-healing, load balancing, and/or redundancy. For example, the CORE may be virtual and/or can be located on any of the devices (e.g., OMNI devices) in the system.
There may be Immediate Action. The software can cause the system to respond in a variety of ways based on the analysis of the input, for example: Turn lights on/off, dim lights, adjust temperature, trigger alarms, etc.
There may be Time or Date based Action. Based on pre-programmed input, the system can take action based on a specific time or date:
There may be Delayed Action, for example Pre-programmable capability for events at later times.
All sensor data is used to provide a dataset for conventional actions based on IF/THEN, AND, and/or OR logic.
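Such IF/THEN rule evaluation over sensor data could be sketched as follows; the sensor names and the condition tuple format are hypothetical:

```python
import operator

# Hypothetical mapping of condition operators to comparison functions.
_OPS = {">": operator.gt, "<": operator.lt, "==": operator.eq}

def evaluate_rule(sensors, conditions, combine="AND"):
    """Evaluate IF/THEN conditions over a snapshot of sensor readings.

    `conditions` is a list of (sensor_name, operator, threshold) tuples;
    `combine` selects AND (all must hold) or OR (any may hold) semantics.
    """
    results = [_OPS[op](sensors[name], value) for name, op, value in conditions]
    return all(results) if combine == "AND" else any(results)
```

For example, "IF motion detected AND ambient light is low THEN turn on the light" reduces to one call with two conditions.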
The system analyzes streams of metadata (to/from cloud) for machine learning (AI) to enable actionable patterns or to take appropriate action.
The system may store information as machine-learned patterns for comparison against new observations, perhaps to add new learned patterns.
The system may be programmable by the user based on stored thresholds, which can be changed through a user interface.
Machine learning provides questions to the user such as, for example: Is this (e.g., condition, action) correct? YES or NO.
Placing the user in the feedback loop may be based on the answers to the questions. One or more answers may provide some of the actions that get stored to be executed (e.g., creating a pattern). For example, the pattern then becomes a macro to perform the specified action (e.g., turn on a light).
A feedback answer/response gets built into a program to take action, and is also added to the machine learning side, specifically the training set, and/or may revise the learning based on responses.
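The yes/no feedback loop described above, in which an answer both extends the training set and may create a replayable macro, could be sketched as follows (the class and attribute names are illustrative assumptions):

```python
class FeedbackTrainer:
    """Sketch: yes/no answers extend training data and create action macros."""

    def __init__(self):
        self.training_set = []  # (observed_pattern, yes_no_answer) pairs
        self.macros = {}        # confirmed pattern -> action to replay

    def ask(self, pattern, proposed_action, answer):
        """Record the user's yes/no answer to 'Is this action correct?'."""
        # Every answer, yes or no, feeds the training set.
        self.training_set.append((pattern, answer))
        # Only confirmed patterns become macros the system replays itself.
        if answer:
            self.macros[pattern] = proposed_action
        return answer
```

A "yes" turns an observed pattern into a macro the system can initiate on its own; a "no" still improves the training data.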
The process of updating the machine learning (training) based on simple user interactions is the central aspect of the software.
The user experience may shift from the current state, where the user initiates an action, to a new state, where the system initiates the action based on a learned response or validation from the user.
There may be System Security. The security of the system uses a distributed ledger (e.g., Blockchain) method to validate the software. Such validation may include one or more of:
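Distributed-ledger validation of this kind can be illustrated with a simple hash chain; the block layout and payloads below are assumptions for illustration, not the platform's actual ledger format:

```python
import hashlib

def block_hash(prev_hash, payload):
    """SHA-256 hash linking a block's payload to the previous block."""
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

def build_ledger(payloads, genesis="0" * 64):
    """Build a hash-chained ledger of (e.g., software/configuration) records."""
    ledger, prev = [], genesis
    for payload in payloads:
        h = block_hash(prev, payload)
        ledger.append({"payload": payload, "prev": prev, "hash": h})
        prev = h
    return ledger

def validate_ledger(ledger, genesis="0" * 64):
    """In-network validation: recompute each hash and check the chain links."""
    prev = genesis
    for block in ledger:
        if block["prev"] != prev or block_hash(prev, block["payload"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True
```

Tampering with any recorded payload breaks every subsequent link, so any node holding the ledger can detect the change.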
In one or more scenarios, one or more techniques and/or devices may be used (e.g., entirely) without cloud services, perhaps for example so that no user data may need (e.g., be required) to be sent outside the physical structure. System security via a distributed computing structure may permit that no user information may leave (e.g., may need to leave) the LAN. Distributed computing may allow individual “nodes” (e.g., control units, sensor touches, and/or display touches) to be upgraded via replacing the hardware and/or to be auto-configured into the system.
Table 1 lists some functions/attributes of the control system, for example outside a user's home.
Table 2 lists some functions/attributes of the control system, for example in a family room and/or bedroom.
Table 3 lists some functions/attributes of the control system, for example in a kitchen.
In one or more scenarios, perhaps with just a few devices (e.g., each under $100) with the control system, you could have security, lighting control, and/or control of other devices on one app.
In one or more scenarios, the control system can do this, perhaps for example via a network of “light switches” that may provide Security, Lighting, and Control of smart devices, all through one (or more) app, for example. In one or more scenarios, the control unit/processor may be installed in any standard US light switch location.
In one or more scenarios, the control system may use interior triangulation based on system devices. In one or more scenarios, geo-fencing of interior space may be used for “pet fencing” purposes, for example, triangulation and/or trilateration of pets, perhaps with the system providing a “sonic deterrent” or other limitations to pet movement. In one or more scenarios, distributed computing may be processed through one or more devices.
In one or more scenarios, personalization of users for preferences, tracking etc. may be configured. In one or more scenarios, predictive path lighting may be configured.
In one or more scenarios, an outlet version device may be used, a plug-in version may be used, and/or a 120V smoke detector replacement version may be used.
While the present disclosure has been illustrated and described in detail in the drawings and foregoing description, the same is to be considered as illustrative and not restrictive in character, it being understood that only certain embodiments have been shown and described, and that all changes and modifications that come within the spirit of the present disclosure are desired to be protected.
The present application is a nonprovisional patent application, which claims the priority benefit of U.S. Application Ser. No. 62/931,558, filed Nov. 6, 2019, the text and drawings of which are hereby incorporated by reference in their entirety.
Number | Date | Country
---|---|---
62931558 | Nov 2019 | US