SYSTEMS AND METHODS FOR LOCATION CONTROL BASED ON AIR QUALITY METRICS

Information

  • Patent Application
  • Publication Number
    20250020351
  • Date Filed
    July 10, 2023
  • Date Published
    January 16, 2025
  • CPC
    • F24F11/63
    • F24F2130/00
  • International Classifications
    • F24F11/63
Abstract
Disclosed are systems and methods that provide a novel framework for personalized location management. The framework can provide dynamically determined, customized location control based on activities of the user and particular environment characteristics that can impact such activities. The framework provides a comprehensive location optimization system that delves deep into the factors that can impact a user's actions by providing personalized insights and recommendations for controlling a real-world location. Accordingly, the framework can manipulate, control, modify and/or manage real-world and/or digital components of the location respective to a performed activity in line with determined environmental factors.
Description
FIELD OF THE DISCLOSURE

The present disclosure is generally related to a location monitoring and control system, and more particularly, to a decision intelligence (DI)-based computerized framework for automatically and dynamically managing a real-world location and its components therein in real-time based on user activity and environmental characteristics.


BACKGROUND

Conventional mechanisms for monitoring and controlling a location are tied to security panels and/or thermostats. That is, for example, a thermostat can receive data about a home, for example, and modify how the heat, ventilation and air conditioning (HVAC) system operates to manage the temperature within the home.


SUMMARY OF THE DISCLOSURE

Conventional systems, however, lack the robust and comprehensive understanding of a location and how environmental characteristics can impact planned and/or spontaneous activities of users within the location. Indeed, conventional systems lack personalized, data-driven insights which can enable optimized real-world environments to be curated for users.


To that end, according to some embodiments, the disclosed systems and methods provide a novel computerized framework that addresses such current shortcomings, among others, and provides dynamically determined, personalized location recommendations and/or optimized location configurations that enable users to engage in their activities as customized to their specific needs.


According to some embodiments, as discussed herein, the disclosed systems and methods provide a comprehensive location optimization system that delves deep into the factors that influence a user's activities, which can include, for example, which activities a user is performing as well as the environmental factors that can impact such performance.


According to some embodiments, as discussed herein, the disclosed framework can collect data about a location and user(s) at the location, which can be from any device associated with a user as well as from a variety of data resources (e.g., cloud-hosted data, for example), as discussed in more detail below in relation to at least FIGS. 1 and 3-4. For example, data collected from a user's smart ring, smart phone, sensors at a location and/or network-based data stored in a cloud can provide the data from which the disclosed framework can learn and/or determine/develop a personalized, configured environment for the location for the user(s).


According to some embodiments, a location should be understood to correspond to any definable geographic area, such as, but not limited to, a home, office, patio, building (or other type of structure), and the like. In some embodiments, a location can correspond to sub-parts of a larger location—for example, an apartment in a building, and the like.


According to some embodiments, a method is disclosed for a DI-based computerized framework for automatically and dynamically managing a location in real-time based on user and environmental characteristics. In accordance with some embodiments, the present disclosure provides a non-transitory computer-readable storage medium for carrying out the above-mentioned technical steps of the framework's functionality. The non-transitory computer-readable storage medium has tangibly stored thereon, or tangibly encoded thereon, computer readable instructions that when executed by a device cause at least one processor to perform a method for automatically and dynamically managing a location in real-time based on user and environmental characteristics.


In accordance with one or more embodiments, a system is provided that includes one or more processors and/or computing devices configured to provide functionality in accordance with such embodiments. In accordance with one or more embodiments, functionality is embodied in steps of a method performed by at least one computing device. In accordance with one or more embodiments, program code (or program logic) executed by a processor(s) of a computing device to implement functionality in accordance with one or more such embodiments is embodied in, by and/or on a non-transitory computer-readable medium.





DESCRIPTIONS OF THE DRAWINGS

The features and advantages of the disclosure will be apparent from the following description of embodiments as illustrated in the accompanying drawings, in which reference characters refer to the same parts throughout the various views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating principles of the disclosure:



FIG. 1 is a block diagram of an example configuration within which the systems and methods disclosed herein could be implemented according to some embodiments of the present disclosure;



FIG. 2 is a block diagram illustrating components of an exemplary system according to some embodiments of the present disclosure;



FIG. 3 illustrates an exemplary workflow according to some embodiments of the present disclosure;



FIG. 4 illustrates an exemplary workflow according to some embodiments of the present disclosure;



FIG. 5 depicts an exemplary implementation of an architecture according to some embodiments of the present disclosure;



FIG. 6 depicts an exemplary implementation of an architecture according to some embodiments of the present disclosure; and



FIG. 7 is a block diagram illustrating a computing device showing an example of a client or server device used in various embodiments of the present disclosure.





DETAILED DESCRIPTION

The present disclosure will now be described more fully hereinafter with reference to the accompanying drawings, which form a part hereof, and which show, by way of non-limiting illustration, certain example embodiments. Subject matter may, however, be embodied in a variety of different forms and, therefore, covered or claimed subject matter is intended to be construed as not being limited to any example embodiments set forth herein; example embodiments are provided merely to be illustrative. Likewise, a reasonably broad scope for claimed or covered subject matter is intended. Among other things, for example, subject matter may be embodied as methods, devices, components, or systems. Accordingly, embodiments may, for example, take the form of hardware, software, firmware or any combination thereof (other than software per se). The following detailed description is, therefore, not intended to be taken in a limiting sense.


Throughout the specification and claims, terms may have nuanced meanings suggested or implied in context beyond an explicitly stated meaning. Likewise, the phrase “in one embodiment” as used herein does not necessarily refer to the same embodiment and the phrase “in another embodiment” as used herein does not necessarily refer to a different embodiment. It is intended, for example, that claimed subject matter include combinations of example embodiments in whole or in part.


In general, terminology may be understood at least in part from usage in context. For example, terms, such as “and”, “or”, or “and/or,” as used herein may include a variety of meanings that may depend at least in part upon the context in which such terms are used. Typically, “or” if used to associate a list, such as A, B or C, is intended to mean A, B, and C, here used in the inclusive sense, as well as A, B or C, here used in the exclusive sense. In addition, the term “one or more” as used herein, depending at least in part upon context, may be used to describe any feature, structure, or characteristic in a singular sense or may be used to describe combinations of features, structures or characteristics in a plural sense. Similarly, terms, such as “a,” “an,” or “the,” again, may be understood to convey a singular usage or to convey a plural usage, depending at least in part upon context. In addition, the term “based on” may be understood as not necessarily intended to convey an exclusive set of factors and may, instead, allow for existence of additional factors not necessarily expressly described, again, depending at least in part on context.


The present disclosure is described below with reference to block diagrams and operational illustrations of methods and devices. It is understood that each block of the block diagrams or operational illustrations, and combinations of blocks in the block diagrams or operational illustrations, can be implemented by means of analog or digital hardware and computer program instructions. These computer program instructions can be provided to a processor of a general purpose computer to alter its function as detailed herein, a special purpose computer, ASIC, or other programmable data processing apparatus, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, implement the functions/acts specified in the block diagrams or operational block or blocks. In some alternate implementations, the functions/acts noted in the blocks can occur out of the order noted in the operational illustrations. For example, two blocks shown in succession can in fact be executed substantially concurrently or the blocks can sometimes be executed in the reverse order, depending upon the functionality/acts involved.


For the purposes of this disclosure a non-transitory computer readable medium (or computer-readable storage medium/media) stores computer data, which data can include computer program code (or computer-executable instructions) that is executable by a computer, in machine readable form. By way of example, and not limitation, a computer readable medium may include computer readable storage media, for tangible or fixed storage of data, or communication media for transient interpretation of code-containing signals. Computer readable storage media, as used herein, refers to physical or tangible storage (as opposed to signals) and includes without limitation volatile and non-volatile, removable and non-removable media implemented in any method or technology for the tangible storage of information such as computer-readable instructions, data structures, program modules or other data. Computer readable storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, optical storage, cloud storage, magnetic storage devices, or any other physical or material medium which can be used to tangibly store the desired information or data or instructions and which can be accessed by a computer or processor.


For the purposes of this disclosure the term “server” should be understood to refer to a service point which provides processing, database, and communication facilities. By way of example, and not limitation, the term “server” can refer to a single, physical processor with associated communications and data storage and database facilities, or it can refer to a networked or clustered complex of processors and associated network and storage devices, as well as operating software and one or more database systems and application software that support the services provided by the server. Cloud servers are examples.


For the purposes of this disclosure a “network” should be understood to refer to a network that may couple devices so that communications may be exchanged, such as between a server and a client device or other types of devices, including between wireless devices coupled via a wireless network, for example. A network may also include mass storage, such as network attached storage (NAS), a storage area network (SAN), a content delivery network (CDN) or other forms of computer or machine-readable media, for example. A network may include the Internet, one or more local area networks (LANs), one or more wide area networks (WANs), wire-line type connections, wireless type connections, cellular or any combination thereof. Likewise, sub-networks, which may employ differing architectures or may be compliant or compatible with differing protocols, may interoperate within a larger network.


For purposes of this disclosure, a “wireless network” should be understood to couple client devices with a network. A wireless network may employ stand-alone ad-hoc networks, mesh networks, Wireless LAN (WLAN) networks, cellular networks, or the like. A wireless network may further employ a plurality of network access technologies, including Wi-Fi, Long Term Evolution (LTE), WLAN, Wireless Router mesh, or 2nd, 3rd, 4th or 5th generation (2G, 3G, 4G or 5G) cellular technology, mobile edge computing (MEC), Bluetooth, 802.11b/g/n, or the like. Network access technologies may enable wide area coverage for devices, such as client devices with varying degrees of mobility, for example.


In short, a wireless network may include virtually any type of wireless communication mechanism by which signals may be communicated between devices, such as a client device or a computing device, between or within a network, or the like.


A computing device may be capable of sending or receiving signals, such as via a wired or wireless network, or may be capable of processing or storing signals, such as in memory as physical memory states, and may, therefore, operate as a server. Thus, devices capable of operating as a server may include, as examples, dedicated rack-mounted servers, desktop computers, laptop computers, set top boxes, integrated devices combining various features, such as two or more features of the foregoing devices, or the like.


For purposes of this disclosure, a client (or user, entity, subscriber or customer) device may include a computing device capable of sending or receiving signals, such as via a wired or a wireless network. A client device may, for example, include a desktop computer or a portable device, such as a cellular telephone, a smart phone, a display pager, a radio frequency (RF) device, an infrared (IR) device, a Near Field Communication (NFC) device, a Personal Digital Assistant (PDA), a handheld computer, a tablet computer, a phablet, a laptop computer, a set top box, a wearable computer, smart watch, an integrated or distributed device combining various features, such as features of the foregoing devices, or the like.


A client device may vary in terms of capabilities or features. Claimed subject matter is intended to cover a wide range of potential variations; for example, a web-enabled client device or the previously mentioned devices may include a high-resolution screen (HD or 4K, for example), one or more physical or virtual keyboards, mass storage, one or more accelerometers, one or more gyroscopes, global positioning system (GPS) or other location-identifying type capability, or a display with a high degree of functionality, such as a touch-sensitive color 2D or 3D display, for example.


Certain embodiments and principles will be discussed in more detail with reference to the figures. With reference to FIG. 1, system 100 is depicted which includes user equipment (UE) 102 (e.g., a client device, as mentioned above and discussed below in relation to FIG. 7), access point (AP) device 112, network 104, cloud system 106, database 108, sensors 110 and location engine 200. It should be understood that while system 100 is depicted as including such components, it should not be construed as limiting, as one of ordinary skill in the art would readily understand that varying numbers of UEs, AP devices, peripheral devices, sensors, cloud systems, databases and networks can be utilized; however, for purposes of explanation, system 100 is discussed in relation to the example depiction in FIG. 1.


According to some embodiments, UE 102 can be any type of device, such as, but not limited to, a mobile phone, tablet, laptop, sensor, IoT device, autonomous machine, and any other device equipped with a cellular or wireless or wired transceiver. For example, UE 102 can be a smart ring, which as discussed below in more detail, can enable the identification and/or collection of vitals of the wearing user. In some embodiments, such vitals can correspond to, but not be limited to, heart rate, heart rate variability (HRV), blood oxygen levels, blood pressure, hydration, temperature, pulse, motion, sleep, and/or any other type of biometric for a person, or some combination thereof.


In some embodiments, peripheral device (not shown) can be connected to UE 102, and can be any type of peripheral device, such as, but not limited to, a wearable device (e.g., smart ring or smart watch), printer, speaker, sensor, and the like. In some embodiments, peripheral device can be any type of device that is connectable to UE 102 via any type of known or to be known pairing mechanism, including, but not limited to, WiFi, Bluetooth™, Bluetooth Low Energy (BLE), NFC, and the like. For example, the peripheral device can be a smart ring that connectively pairs with UE 102, which is a user's smart phone.


According to some embodiments, AP device 112 is a device that creates a wireless local area network (WLAN) for the location. According to some embodiments, the AP device 112 can be, but is not limited to, a router, switch, hub and/or any other type of network hardware that can project a WiFi signal to a designated area. For example, an AP device 112 can be a Plume Pod™, and the like. In some embodiments, UE 102 may be an AP device.


According to some embodiments, sensors 110 (or sensor devices 110) can correspond to any type of device, component and/or sensor associated with a location of system 100 (referred to, collectively, as “sensors”). In some embodiments, the sensors 110 can be any type of device that is capable of sensing and capturing data/metadata related to a user and/or activity of the location. For example, the sensors 110 can include, but not be limited to, cameras, motion detectors, door and window contacts, heat and smoke detectors, passive infrared (PIR) sensors, time-of-flight (ToF) sensors, and the like.


In some embodiments, sensors 110 can be associated with devices associated with the location of system 100, such as, for example, lights, smart locks, garage doors, smart appliances (e.g., thermostat, refrigerator, television, personal assistants (e.g., Alexa®, Nest®, for example)), smart rings, smart phones, smart watches or other wearables, tablets, personal computers, and the like, and some combination thereof. For example, the sensors 110 can include the sensors on UE 102 (e.g., smart phone) and/or peripheral device (e.g., a paired smart watch). In another example, sensors 110 can correspond to the sensors on a user's smart ring.


In some embodiments, sensors 110 can provide functionality for detecting air quality metrics, characteristics and/or attributes, which can correspond to specific types of air quality measurements. For example, such sensors can include sensors to measure data related to, but not limited to, temperature, humidity, air pressure, pollen, ultraviolet light (UV), carbon monoxide, nitrogen oxide, ozone, particulates, sulfur dioxide, and the like, or some combination thereof. In some embodiments, such sensors can be embodied as, but not limited to, air flow sensors, volatile organic compound (VOC) sensors, particulate matter (PM) sensors, gas sensors, temperature and humidity sensors, air quality monitors, and the like, or some combination thereof.
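To illustrate how such air quality readings might be organized for downstream analysis, a minimal sketch in Python (the class and field names here are hypothetical and used only for illustration; they are not part of the disclosure):

```python
from dataclasses import dataclass

@dataclass
class AirQualityReading:
    """One sample from an air quality sensor (hypothetical field names)."""
    sensor_id: str
    metric: str      # e.g., "temperature", "humidity", "pm2_5", "co", "voc"
    value: float
    unit: str

def group_by_metric(readings):
    """Index readings by metric type so later steps can query each attribute."""
    grouped = {}
    for r in readings:
        grouped.setdefault(r.metric, []).append(r)
    return grouped

readings = [
    AirQualityReading("garage-1", "temperature", 72.0, "F"),
    AirQualityReading("garage-1", "humidity", 40.0, "%"),
    AirQualityReading("garage-2", "temperature", 73.5, "F"),
]
by_metric = group_by_metric(readings)
```

Grouping by metric type mirrors how the framework can treat each air quality attribute (temperature, humidity, particulates, and so on) as a separate input signal.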


In some embodiments, network 104 can be any type of network, such as, but not limited to, a wireless network, cellular network, the Internet, and the like (as discussed above). Network 104 facilitates connectivity of the components of system 100, as illustrated in FIG. 1.


According to some embodiments, cloud system 106 may be any type of cloud operating platform and/or network based system upon which applications, operations, and/or other forms of network resources may be located. For example, system 106 may be a service provider and/or network provider from which services and/or applications may be accessed, sourced or executed. For example, system 106 can represent the cloud-based architecture associated with a smart home or network provider, which has associated network resources hosted on the internet or private network (e.g., network 104), which enables (via engine 200) the location management and control discussed herein.


In some embodiments, cloud system 106 may include a server(s) and/or a database of information which is accessible over network 104. In some embodiments, a database 108 of cloud system 106 may store a dataset of data and metadata associated with local and/or network information related to a user(s) of the components of system 100 and/or each of the components of system 100 (e.g., UE 102, AP device 112, sensors 110, and the services and applications provided by cloud system 106 and/or location engine 200).


In some embodiments, for example, cloud system 106 can provide a private/proprietary management platform, whereby engine 200, discussed infra, corresponds to the novel functionality system 106 enables, hosts and provides to a network 104 and other devices/platforms operating thereon.


Turning to FIGS. 5 and 6, in some embodiments, the exemplary computer-based systems/platforms, the exemplary computer-based devices, and/or the exemplary computer-based components of the present disclosure may be specifically configured to operate in a cloud computing/architecture 106 such as, but not limited to: infrastructure as a service (IaaS) 610, platform as a service (PaaS) 608, and/or software as a service (SaaS) 606 using a web browser, mobile app, thin client, terminal emulator or other endpoint 604. FIGS. 5 and 6 illustrate schematics of non-limiting implementations of the cloud computing/architecture(s) in which the exemplary computer-based systems for administrative customizations and control of network-hosted application program interfaces (APIs) of the present disclosure may be specifically configured to operate.


Turning back to FIG. 1, according to some embodiments, database 108 may correspond to a data storage for a platform (e.g., a network hosted platform, such as cloud system 106, as discussed supra) or a plurality of platforms. Database 108 may receive storage instructions/requests from, for example, engine 200 (and associated microservices), which may be in any type of known or to be known format, such as, for example, Structured Query Language (SQL). According to some embodiments, database 108 may correspond to any type of known or to be known storage, for example, a memory or memory stack of a device, a distributed ledger of a distributed network (e.g., blockchain, for example), a look-up table (LUT), and/or any other type of secure data repository.


Location engine 200, as discussed above and further below in more detail, can include components for the disclosed functionality. According to some embodiments, location engine 200 may be a special purpose machine or processor, and can be hosted by a device on network 104, within cloud system 106, on AP device 112 and/or on UE 102. In some embodiments, engine 200 may be hosted by a server and/or set of servers associated with cloud system 106.


According to some embodiments, as discussed in more detail below, location engine 200 may be configured to implement and/or control a plurality of services and/or microservices, where each of the plurality of services/microservices are configured to execute a plurality of workflows associated with performing the disclosed location management and control. Non-limiting embodiments of such workflows are provided below in relation to at least FIGS. 3-4.


According to some embodiments, as discussed above, location engine 200 may function as an application provided by cloud system 106. In some embodiments, engine 200 may function as an application installed on a server(s), network location and/or other type of network resource associated with system 106. In some embodiments, engine 200 may function as an application installed and/or executing on UE 102 and/or sensors 110 (and/or AP device 112, in some embodiments). In some embodiments, such application may be a web-based application accessed by AP device 112, UE 102 and/or devices associated with sensors 110 over network 104 from cloud system 106. In some embodiments, engine 200 may be configured and/or installed as an augmenting script, program or application (e.g., a plug-in or extension) to another application or program provided by cloud system 106 and/or executing on AP device 112, UE 102 and/or sensors 110.


As illustrated in FIG. 2, according to some embodiments, location engine 200 includes identification module 202, analysis module 204, determination module 206 and control module 208. It should be understood that the engine(s) and modules discussed herein are non-exhaustive, as additional or fewer engines and/or modules (or sub-modules) may be applicable to the embodiments of the systems and methods discussed. More detail of the operations, configurations and functionalities of engine 200 and each of its modules, and their role within embodiments of the present disclosure will be discussed below.


Turning to FIG. 3, Process 300 provides non-limiting example embodiments for the disclosed location management framework. According to some embodiments, as discussed herein, engine 200 can enable the tracking of data related to activity and environmental characteristics to manage a location for a user(s).


According to some embodiments, Step 302 of Process 300 can be performed by identification module 202 of location engine 200; Steps 304 and 308 can be performed by analysis module 204; Steps 306 and 310 can be performed by determination module 206; and Step 312 can be performed by control module 208.


According to some embodiments, Process 300 begins with Step 302 where engine 200 can monitor the location to detect, determine or otherwise identify activity of a user (e.g., movement or non-movement). For example, is the user in their bedroom sleeping, in the basement exercising, in the kitchen cooking, and the like. In some embodiments, engine 200 can monitor the location continuously and/or according to a predetermined time interval. In some embodiments, the monitoring of the location can be performed via the components of FIG. 1, as discussed above (e.g., UE 102, AP device 112, sensors 110, for example). In some embodiments, the monitoring can involve periodically pinging each or a portion of the devices at the location and awaiting a reply. In some embodiments, the monitoring can involve push and/or fetch protocols to collect user data from each sensor and/or from cloud storage (e.g., cloud system 106, discussed supra).


According to some embodiments, the monitoring in Step 302 can involve collecting data related to the location and/or activities of the user. In some embodiments, engine 200 can collect the data based on detected events. In some embodiments, the type and/or quantity of collected data may be directly tied to the type of device performing such data collection. For example, a sensor associated with lights within the user's bedroom may only collect user data when a light in that room is turned on and/or turned off. For example, such data can indicate, but is not limited to, the identity of the light/room, time of toggling, duration of being turned on, frequency of being turned on, and the like, or some combination thereof. In another non-limiting example, a gyroscope sensor on a user's smartphone and/or smart ring can detect when a user is moving, as well as the type and/or metrics of such movements. And, in yet another example, a humidity sensor can provide an indication of a measurement of the humidity at the location (or in a specific room, depending on placement of the sensor, for example).
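The event-driven light-sensor example above can be sketched as follows; the class, field names, and timestamps are illustrative assumptions, not part of the disclosure:

```python
import time

class LightSensor:
    """Hypothetical event-driven collector: emits a record only when the
    light's state toggles, rather than sampling continuously."""

    def __init__(self, room):
        self.room = room
        self.is_on = False
        self.turned_on_at = None
        self.on_count = 0
        self.events = []

    def toggle(self, now=None):
        now = now if now is not None else time.time()
        self.is_on = not self.is_on
        if self.is_on:
            self.on_count += 1
            self.turned_on_at = now
            duration = None
        else:
            duration = now - self.turned_on_at
        self.events.append({
            "room": self.room,            # identity of the light/room
            "state": "on" if self.is_on else "off",
            "time": now,                  # time of toggling
            "duration_on": duration,      # filled in on the "off" event
            "on_count": self.on_count,    # frequency of being turned on
        })

sensor = LightSensor("bedroom")
sensor.toggle(now=100.0)   # light turned on
sensor.toggle(now=160.0)   # light turned off after 60 seconds
```

Collecting only on state changes keeps the type and quantity of data tied to the device, as the paragraph above describes.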


In some embodiments, the collected data may be derived and/or mined from stored user data within an associated or third party cloud. For example, engine 200 can be associated with a cloud, which can store collected network traffic and/or collected user data for the user in an associated account of the user. Thus, in some embodiments, Step 302 can involve querying the cloud for information about the user, which can be based on criteria that can include, but is not limited to, a time, date, activity, event, other collected data, and the like, or some combination thereof.
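Such a criteria-based query over stored user records might look like the following sketch (record fields and criteria names are illustrative assumptions):

```python
def query_user_data(records, **criteria):
    """Return stored records matching every given criterion
    (e.g., time, date, activity, event)."""
    return [r for r in records
            if all(r.get(k) == v for k, v in criteria.items())]

# Hypothetical cloud-stored records for a user account.
records = [
    {"user_id": "u1", "date": "2023-07-10", "activity": "exercising"},
    {"user_id": "u1", "date": "2023-07-10", "activity": "sleeping"},
    {"user_id": "u1", "date": "2023-07-11", "activity": "exercising"},
]
hits = query_user_data(records, date="2023-07-10", activity="exercising")
```

In practice the query would be issued against the cloud-hosted store (e.g., as SQL against database 108), but the filtering logic is the same.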


In some embodiments, the collected data in Step 302 can be stored in database 108 in association with an identifier (ID) of a user, an ID of the location and/or an ID of an account of the user/location.


In Step 304, based on the monitoring of the location and collection of data therefrom, engine 200 can detect an event corresponding to a type and/or quantity of activity. Accordingly, in some embodiments, engine 200 can analyze the collected data, which can be performed via any type of known or to be known computational analysis technique, algorithm, mechanism or technology to analyze the collected user data from Step 302.


In some embodiments, engine 200 may include a specific trained artificial intelligence/machine learning model (AI/ML), a particular machine learning model architecture, a particular machine learning model type (e.g., convolutional neural network (CNN), recurrent neural network (RNN), autoencoder, support vector machine (SVM), and the like), or any other suitable definition of a machine learning model or any suitable combination thereof.


In some embodiments, engine 200 may be configured to utilize one or more AI/ML techniques chosen from, but not limited to, computer vision, feature vector analysis, decision trees, boosting, support-vector machines, neural networks, nearest neighbor algorithms, Naive Bayes, bagging, random forests, logistic regression, and the like.


In some embodiments and, optionally, in combination of any embodiment described above or below, a neural network technique may be one of, without limitation, feedforward neural network, radial basis function network, recurrent neural network, convolutional network (e.g., U-net) or other suitable network. In some embodiments and, optionally, in combination of any embodiment described above or below, an implementation of Neural Network may be executed as follows:

    • a. define Neural Network architecture/model,
    • b. transfer the input data to the neural network model,
    • c. train the model incrementally,
    • d. determine the accuracy for a specific number of timesteps,
    • e. apply the trained model to process the newly-received input data,
    • f. optionally and in parallel, continue to train the trained model with a predetermined periodicity.
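Steps (a)-(f) above can be sketched as a minimal feedforward network trained incrementally. This is a toy example in plain NumPy under assumed choices (architecture, learning rate, squared-error loss, and OR-style toy labels standing in for activity classes), not the disclosed implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# (a) Define the neural network architecture/model: 2 inputs -> 4 hidden -> 1 output.
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    # (b) Transfer the input data to the neural network model.
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

def train_step(X, y, lr=1.0):
    # (c) Train the model incrementally: one gradient step on squared error.
    global W1, b1, W2, b2
    h, out = forward(X)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 = W2 - lr * h.T @ d_out
    b2 = b2 - lr * d_out.sum(axis=0)
    W1 = W1 - lr * X.T @ d_h
    b1 = b1 - lr * d_h.sum(axis=0)

# Toy training data: a simple OR-style pattern standing in for activity labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [1]], dtype=float)

for _ in range(5000):          # (f) training can continue with a set periodicity
    train_step(X, y)

# (d) Determine the accuracy after the chosen number of timesteps.
_, preds = forward(X)
accuracy = float(((preds > 0.5) == (y > 0.5)).mean())

# (e) Apply the trained model to process newly received input data.
_, new_pred = forward(np.array([[1.0, 0.0]]))
```

The loop structure is the point here: the same trained parameters serve inference on new inputs while training can continue in parallel with a predetermined periodicity.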


In some embodiments and, optionally, in combination of any embodiment described above or below, the trained neural network model may specify a neural network by at least a neural network topology, a series of activation functions, and connection weights. For example, the topology of a neural network may include a configuration of nodes of the neural network and connections between such nodes. In some embodiments and, optionally, in combination of any embodiment described above or below, the trained neural network model may also be specified to include other parameters, including but not limited to, bias values/functions and/or aggregation functions. For example, an activation function of a node may be a step function, sine function, continuous or piecewise linear function, sigmoid function, hyperbolic tangent function, or other type of mathematical function that represents a threshold at which the node is activated. In some embodiments and, optionally, in combination of any embodiment described above or below, the aggregation function may be a mathematical function that combines (e.g., sum, product, and the like) input signals to the node. In some embodiments and, optionally, in combination of any embodiment described above or below, an output of the aggregation function may be used as input to the activation function. In some embodiments and, optionally, in combination of any embodiment described above or below, the bias may be a constant value or function that may be used by the aggregation function and/or the activation function to make the node more or less likely to be activated.
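The node-level behavior described above can be shown for a single node: the aggregation function combines the input signals (here a weighted sum), the bias shifts how easily the node activates, and the aggregated value feeds the activation function (here a sigmoid). The specific weights and inputs are arbitrary illustrative values:

```python
import math

def node_output(inputs, weights, bias):
    """One neural network node: aggregation (weighted sum plus bias)
    followed by a sigmoid activation function."""
    aggregated = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-aggregated))

# Aggregated input is 2.0*0.5 + 1.0*(-1.0) + 0.0 = 0.0, the sigmoid's midpoint.
out = node_output([0.5, -1.0], [2.0, 1.0], bias=0.0)
```

Swapping the sigmoid for a step, piecewise linear, or hyperbolic tangent function changes only the activation stage; the aggregation stage is unchanged.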


In Step 306, based on the analysis in Step 304, engine 200 can determine characteristics related to the event. In some embodiments, the characteristics can correspond to, but are not limited to, a type of activity, time, date, participating user or users, area within the location (or around location), duration of activity, and the like, or some combination thereof.


In some embodiments, such determined characteristics can be stored in a similar manner as discussed above.


In Step 308, engine 200 can collect and analyze characteristics of the location corresponding to a time proximate to the event. That is, in some embodiments, Step 308 (and Step 310, discussed infra) can be performed simultaneously (or substantially simultaneously) with Steps 304-306, whereby, upon detection of an event, in addition to the activity information being collected and analyzed, environmental factors related to such activity can be collected and analyzed.


According to some embodiments, Step 308 can involve collecting information from sensors 110, discussed above in relation to FIG. 1, whereby air quality metrics and measurements can be compiled. For example, as discussed supra, data from air flow sensors, VOC sensors, particulate matter (PM) sensors, gas sensors, temperature and humidity sensors, air quality monitors, and the like, can provide air quality metrics (or information) corresponding to specific types of air quality attributes related to the event. Thus, such air quality metrics can provide information related to, but not limited to, temperature, humidity, air pressure, pollen, UV, carbon monoxide, nitrogen oxide, ozone, particulates, sulfur dioxide, and the like, or some combination thereof.
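A minimal sketch of compiling such sensor readings into air quality metrics might look like the following; the metric names, units, and simple averaging strategy are assumptions of this illustration, not part of the disclosure:

```python
from dataclasses import dataclass, field
from statistics import mean

# Illustrative compilation of per-sensor samples into air quality
# metrics; each metric accumulates raw readings, which are then
# summarized into a single compiled value per metric.

@dataclass
class AirQualityMetrics:
    readings: dict = field(default_factory=dict)  # metric -> samples

    def record(self, metric, value):
        """Append one sensor sample for a given metric."""
        self.readings.setdefault(metric, []).append(value)

    def summary(self):
        """Average each metric's samples into one compiled value."""
        return {m: mean(v) for m, v in self.readings.items()}

aq = AirQualityMetrics()
aq.record("temperature_f", 71.0)
aq.record("temperature_f", 73.0)
aq.record("pm2_5_ugm3", 12.0)
print(aq.summary())  # → {'temperature_f': 72.0, 'pm2_5_ugm3': 12.0}
```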


In some embodiments, such determined air quality information can be stored in a similar manner as discussed above.


In Step 310, based on the collected air quality metrics and/or measurements, engine 200 can determine the environmental characteristics for the location that correspond to the event. In some embodiments, air quality thresholds can be leveraged to determine which metrics are of value in controlling the location and/or alerting the user. For example, if the temperature within a location's garage is 72 degrees Fahrenheit and the user is exercising, then this may be considered to be within an acceptable temperature range for the location/user; however, if the temperature is 88 degrees Fahrenheit, then this may be considered too warm/hot for the type of activity the user is performing (e.g., additionally based on the location of the user in the garage).
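The threshold logic of Step 310 could be sketched as follows; the per-activity comfort ranges are illustrative assumptions chosen to match the garage example above:

```python
# Sketch of per-activity temperature thresholds: a reading is compared
# against a comfort range for the detected activity, and the result
# decides whether to trigger control of the location or a user alert.
# The ranges below are assumptions, not values from the disclosure.

COMFORT_RANGES_F = {
    "exercising": (60.0, 75.0),
    "sleeping": (62.0, 70.0),
}

def evaluate_temperature(activity, temp_f):
    """Classify a temperature reading relative to the activity's range."""
    low, high = COMFORT_RANGES_F[activity]
    if temp_f < low:
        return "too_cold"
    if temp_f > high:
        return "too_warm"
    return "within_range"

print(evaluate_temperature("exercising", 72.0))  # → within_range
print(evaluate_temperature("exercising", 88.0))  # → too_warm
```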


In some embodiments, engine 200 can compile an input comprising information related to the event, the collected environmental data (from Step 308), as well as other information related to the location (from Step 304, in some embodiments), and analyze such input data via the AI/ML modelling discussed above. Accordingly, engine 200 can determine a comprehensive understanding of the characteristics of the environment in accordance with the event occurring at the location.


In some embodiments, such determined characteristics information can be stored in a similar manner as discussed above.


In Step 312, engine 200 can determine (or generate) a location-based recommendation(s) based on the characteristics related to the event (from Step 306) and the environment (from Step 310). According to some embodiments, engine 200 can compile the event information and environment information as input to an AI/ML model(s), as discussed above, whereby such recommendations can be output. For example, if the user is exercising in the garage and the temperature is above a threshold (e.g., about 75 degrees), then the recommendation can be to open the garage door at least partially so as to enable at least a particular amount of air flow.


In some embodiments, the location-based recommendation(s) can provide a set of instructions that can be implemented and/or executed so as to manipulate, control or manage a location and its components therein, as well as alert a user as to certain environment circumstances that currently exist in line with their activity. For example, a recommendation can include information related to, but not limited to, screen time limits, Internet traffic limits, television limits, preferred temperatures in the location, lighting instructions, recommendations for amount of blankets on the bed as per the user's ideal sleep temperature (against the indoor temperature and outside climate), and the like, or some combination thereof. For example, continuing the above example, instructions can be sent to alert the user to open the garage door and turn on a fan in the garage; and/or, the instructions can automatically engage the components of the location to turn the fan on and open the garage door automatically.
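Continuing the garage example, a location-based recommendation carrying both a user alert and an executable set of instructions might be sketched as follows; the component names and instruction schema are hypothetical assumptions of this sketch:

```python
# Sketch of a location-based recommendation as a data structure holding
# (i) alerts for the user and (ii) instructions that can be executed to
# automatically engage components of the location (fan, garage door).

def build_recommendation(temp_f, threshold_f=75.0):
    """Generate alerts/instructions when the temperature is too high."""
    if temp_f <= threshold_f:
        return {"alerts": [], "instructions": []}
    return {
        "alerts": [f"Garage is {temp_f:.0f}F; improve air flow."],
        "instructions": [
            {"component": "garage_door", "action": "open_partial"},
            {"component": "garage_fan", "action": "turn_on"},
        ],
    }

def execute(recommendation, controller):
    # controller is any callable that actuates (or simulates) a component.
    for step in recommendation["instructions"]:
        controller(step["component"], step["action"])

log = []
execute(build_recommendation(88.0), lambda c, a: log.append((c, a)))
print(log)  # → [('garage_door', 'open_partial'), ('garage_fan', 'turn_on')]
```

Passing a controller callable in keeps the sketch decoupled from any particular smart-home API; a real deployment would substitute actual device actuation.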


In some embodiments, such recommendation can be compiled as a data structure and stored in database 108, in a similar manner as discussed above.


In Step 314, engine 200 can cause the location-based recommendation to output to the user. For example, the location-based recommendation can be displayed on a device of the user, which can be in the form of an application notification, electronic message, and the like. In some embodiments, such notifications can be push notifications that can alert the user to certain circumstances at the location. For example, if the pollen count in the air is at or above a threshold, the user can be sent a notification in an application executing on their smart phone that indicates the pollen count, advises to change the filters in the home and close the windows (rather than run the air conditioning).


In some embodiments, the location-based recommendation can visibly display a set of steps and/or instructions for the user to perform. In some embodiments, the location-based recommendation can visibly display steps/instructions that engine 200 will automatically control, and in some embodiments, can provide interactive features for the user to approve and/or decline such optimizations.


According to some embodiments, particular devices, appliances, and/or other real-world and/or digital components of the location can be controlled via the location-based recommendations determined in Step 312 and output in Step 314. For example, as discussed above, network traffic, devices and/or other types of known or to be known electronic devices can be managed according to the location-based recommendation. In another non-limiting example, as discussed above, lights, temperature, humidity and/or other types of known or to be known real-world environmental items can be managed according to the location-based recommendation.


Thus, according to some embodiments, Step 314 can involve engine 200 controlling the location and/or components of the location, as well as causing information related to such control, and the reasons for such control, to be displayed within application interfaces.


For example, a user's HomePass® application can be controlled to notify the user that certain environment conditions exist that can be improved upon, and provide recommendations for such improvement. Further discussion of how this can be optimized and performed is discussed below at least in relation to FIG. 4.


Accordingly, upon performing Step 314, engine 200 may then continue monitoring the user and/or location. In some embodiments, the monitoring can continue running in the backend, while certain modules of engine 200 execute to optimize the location for the activity of the user.


According to some embodiments, a user and/or location can have a dedicated engine 200 model so that the location protocols discussed herein can be specific to the events and patterns learned and detected for the user and/or at that location. In some embodiments, the model can be specific to a user or set of users (e.g., users that live at a certain location, such as a house).


In some embodiments, as evident from the above discussion, patterns of user activity can be learned and predicted; and, as a result, upon a predicted activity being performed, engine 200 can generate recommendations (or retrieve stored recommendations) that can manipulate or control the location to optimize the projected activity/event.


Turning to FIG. 4, provided is Process 400 for serving or providing related digital media content based on the information associated with an event at a location, as discussed above. For example, if the UV index is high (e.g., at a 9), and the user is predicted or planning to go outside and perform an activity (e.g., barbeque on the patio, which can be monitored via the sensors 110, AP 112 and/or UE 102, for example), engine 200 can leverage this information to provide alerts and recommend products, services and/or activities to remedy any dangers associated with the activity. For example, engine 200 can retrieve/receive and provide digital content for purchasing sunscreen or an awning for the location, as discussed below. In some embodiments, the provided content can be associated with or comprise advertisements (e.g., digital advertisement content). Such information can be referred to as “event information” for reference purposes only.


As discussed above, reference to an “advertisement” should be understood to include, but not be limited to, digital media content that provides information provided by another user, service, third party, entity, and the like. Such digital ad content can include any type of known or to be known media renderable by a computing device, including, but not limited to, video, text, audio, images, and/or any other type of known or to be known multi-media. In some embodiments, the digital ad content can be formatted as hyperlinked multi-media content that provides deep-linking features and/or capabilities. Therefore, while the content is referred to as an advertisement, it is still a digital media item that is renderable by a computing device, and such digital media item comprises digital content relaying promotional content provided by a network associated third party.


In Step 402, event information is identified. This information can be derived, determined, based on or otherwise identified from the steps of Process 300, as discussed above. For example, event information can refer to, but is not limited to, an activity, a user, a location, a component at the location, an environmental characteristic, a recommendation, and the like, or some combination thereof.


In Step 404, a context is determined based on the identified event information. This context forms a basis for serving content related to the event. For example, if the event involves the user sleeping, yet the temperature during the winter is always too cold for the user given the lack of insulation in their home, the context can correspond to “blankets” or “cold weather sleeping gear.” Accordingly, such context can be leveraged to identify digital content for purchasing such gear.


In another non-limiting example, if the user's activity is being on video conference calls in the evening, yet her children use most of the bandwidth via their streaming apps, the context may correspond to “network accessibility.” Thus, it may be recommended that she upgrade or increase her Mbps or network plan, and/or purchase another extender so she can work in another location and not be bothered by the noise of her children.
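A simple rule-based sketch of the context determination in Step 404, covering the two examples above, might look like this; the rules, field names, and context keywords are illustrative assumptions:

```python
# Sketch of Step 404: map event + environment characteristics to a
# context keyword that forms the basis for serving related content.
# A real system could learn these mappings via the AI/ML modelling
# discussed above; simple rules are used here for clarity.

def determine_context(event):
    """Return a context keyword derived from event characteristics."""
    activity = event.get("activity")
    if activity == "sleeping" and event.get("indoor_temp_f", 70) < 62:
        return "cold weather sleeping gear"
    if activity == "video_call" and event.get("bandwidth_utilization", 0) > 0.8:
        return "network accessibility"
    return "general home comfort"

print(determine_context({"activity": "sleeping", "indoor_temp_f": 58}))
print(determine_context({"activity": "video_call",
                         "bandwidth_utilization": 0.9}))
```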


In some embodiments, the identification of the context from Step 404 can occur before, during and/or after the analysis detailed above with respect to FIG. 3, or it can be a separate process altogether, or some combination thereof.


In Step 406, the determined context can be communicated (or shared) with a content providing platform. For example, the platform can correspond to cloud system 106, a third party platform, or any other type of platform that includes a server and database (e.g., content server and content database, and/or advertisement server and ad database). Upon receipt of the context, the server can be caused to perform (e.g., as per instructions received from the device executing the engine 200) a search for relevant digital content within the associated database. The search for the content is based at least on the identified context.


In Step 408, the server can be caused to search the database for a digital content item(s) that matches the identified context. In Step 410, a content item can be selected (or retrieved) based on the results of Step 408.
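Steps 406-410 could be sketched with an in-memory stand-in for the content database; the tagging scheme, item structure, and matching logic are assumptions of this illustration:

```python
# Sketch of Steps 406-410: the determined context is used to search a
# content database (here, a list of tagged items) and a matching item
# is selected. A real platform would query a content/ad server instead.

CONTENT_DB = [
    {"id": 1, "tags": {"blankets", "cold weather sleeping gear"}},
    {"id": 2, "tags": {"network accessibility", "wifi extender"}},
    {"id": 3, "tags": {"sunscreen", "uv protection"}},
]

def search_content(context):
    """Step 408: return all items whose tags match the context."""
    return [item for item in CONTENT_DB if context in item["tags"]]

def select_content(context):
    """Step 410: select one item from the search results, if any."""
    results = search_content(context)
    return results[0] if results else None

print(select_content("network accessibility")["id"])  # → 2
```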


In some embodiments, the selected content item can be modified to conform to attributes or capabilities of a device, application, browser user interface (UI), video, page, interface, platform or method upon which a user will be viewing the content item and/or recommendation. Accordingly, in some embodiments, the selected content item can be shared or communicated via the device and/or application being used by a user (Step 412). In some embodiments, the selected content item can be sent directly to a user computing device for display on a device and/or within an application user interface (UI) displayed on the device's display (e.g., within the application or browser window and/or within an inbox of a high-security network property). In some embodiments, the selected content item can be displayed within a portion of the interface or within an overlaying or pop-up interface associated with a rendering interface displayed on the device.



FIG. 7 is a schematic diagram illustrating an example embodiment of a client device that may be used within the present disclosure. Client device 700 may include many more or fewer components than those shown in FIG. 7. However, the components shown are sufficient to disclose an illustrative embodiment for implementing the present disclosure. Client device 700 may represent, for example, UE 102 discussed above at least in relation to FIG. 1.


As shown in the figure, in some embodiments, Client device 700 includes a processing unit (CPU) 722 in communication with a mass memory 730 via a bus 724. Client device 700 also includes a power supply 726, one or more network interfaces 750, an audio interface 752, a display 754, a keypad 756, an illuminator 758, an input/output interface 760, a haptic interface 762, an optional global positioning systems (GPS) receiver 764 and a camera(s) or other optical, thermal or electromagnetic sensors 766. Device 700 can include one camera/sensor 766, or a plurality of cameras/sensors 766, as understood by those of skill in the art. Power supply 726 provides power to Client device 700.


Client device 700 may optionally communicate with a base station (not shown), or directly with another computing device. In some embodiments, network interface 750 is sometimes known as a transceiver, transceiving device, or network interface card (NIC).


Audio interface 752 is arranged to produce and receive audio signals such as the sound of a human voice in some embodiments. Display 754 may be a liquid crystal display (LCD), gas plasma, light emitting diode (LED), or any other type of display used with a computing device. Display 754 may also include a touch sensitive screen arranged to receive input from an object such as a stylus or a digit from a human hand.


Keypad 756 may include any input device arranged to receive input from a user. Illuminator 758 may provide a status indication and/or provide light.


Client device 700 also includes input/output interface 760 for communicating with external devices. Input/output interface 760 can utilize one or more communication technologies, such as USB, infrared, Bluetooth™, or the like in some embodiments. Haptic interface 762 is arranged to provide tactile feedback to a user of the client device.


Optional GPS transceiver 764 can determine the physical coordinates of Client device 700 on the surface of the Earth, which typically outputs a location as latitude and longitude values. GPS transceiver 764 can also employ other geo-positioning mechanisms, including, but not limited to, triangulation, assisted GPS (AGPS), E-OTD, CI, SAI, ETA, BSS or the like, to further determine the physical location of client device 700 on the surface of the Earth. In one embodiment, however, Client device 700 may, through other components, provide other information that may be employed to determine a physical location of the device, including, for example, a MAC address, Internet Protocol (IP) address, or the like.


Mass memory 730 includes a RAM 732, a ROM 734, and other storage means. Mass memory 730 illustrates another example of computer storage media for storage of information such as computer readable instructions, data structures, program modules or other data. Mass memory 730 stores a basic input/output system (“BIOS”) 740 for controlling low-level operation of Client device 700. The mass memory also stores an operating system 741 for controlling the operation of Client device 700.


Memory 730 further includes one or more data stores, which can be utilized by Client device 700 to store, among other things, applications 742 and/or other information or data. For example, data stores may be employed to store information that describes various capabilities of Client device 700. The information may then be provided to another device based on any of a variety of events, including being sent as part of a header (e.g., index file of the HLS stream) during a communication, sent upon request, or the like. At least a portion of the capability information may also be stored on a disk drive or other storage medium (not shown) within Client device 700.


Applications 742 may include computer executable instructions which, when executed by Client device 700, transmit, receive, and/or otherwise process audio, video, images, and enable telecommunication with a server and/or another user of another client device. Applications 742 may further include a client that is configured to send, to receive, and/or to otherwise process gaming, goods/services and/or other forms of data, messages and content hosted and provided by the platform associated with engine 200 and its affiliates.


According to some embodiments, certain aspects of the instant disclosure can be embodied via functionality discussed herein, as disclosed supra. According to some embodiments, some non-limiting aspects can include, but are not limited to the below method aspects, which can additionally be embodied as system, apparatus and/or device functionality:


Aspect 1. A method comprising:

    • collecting, by a device, data at a location, the location data comprising information related to an event involving a user and an environment of the location in relation to the user;
    • analyzing, by the device, the collected data;
    • determining, by the device, based on the analysis, characteristics related to an activity of the event;
    • determining, by the device, based on the analysis, characteristics related to the environment of the location;
    • generating, by the device, a location-based recommendation based on the characteristics of the event and the characteristics of the location, the location-based recommendation comprising electronic instructions for controlling real-world and digital components of the location; and
    • communicating, over a network, the location-based recommendation to at least one device at the location, the communication causing the at least one device to execute the electronic instructions so as to manipulate the real-world and digital components in relation to performance of the activity of the event by the user.


      Aspect 2. The method of Aspect 1, wherein the manipulation of the real-world and digital components comprises automatic control of electronic devices at the location, the automatic control causing the electronic devices to operate in a manner as directed by the electronic instructions.


      Aspect 3. The method of Aspect 1, further comprising:
    • compiling a notification related to the location-based recommendation; and
    • communicating, over the network, the notification to the at least one device for display on a display associated with the at least one device.


Aspect 4. The method of Aspect 3, wherein the notification is visibly displayed upon communication, wherein the visible display provides visibly displayed indications for the user related to how the user is recommended to act based on the electronic instructions.


Aspect 5. The method of Aspect 4, wherein the notification further comprises information indicating at least the determined environment characteristics.


Aspect 6. The method of Aspect 1, further comprising:

    • determining, based on the environmental and event characteristics, a context, the context corresponding to a manner in which the activity is performed at the location; and
    • determining, based on the context, a digital content item, the digital content item comprising content related to a product or service that corresponds to managing the location in line with the environmental and event characteristics.


Aspect 7. The method of Aspect 6, further comprising:

    • communicating, to the at least one device, the digital content item, the communication causing the digital content item to be displayed and enabling access to the product or service.


      Aspect 8. The method of Aspect 1, wherein the environmental characteristics correspond to air quality metrics of the location, wherein the location-based recommendation is directed to improving the air quality metrics during the event.


      Aspect 9. The method of Aspect 1, wherein the data related to the event corresponds to screen usage, motion or activity data at the location, cloud-hosted data about the user, biometrics of the user, and network data related to a network at the location.


As used herein, the terms “computer engine” and “engine” identify at least one software component and/or a combination of at least one software component and at least one hardware component which are designed/programmed/configured to manage/control other software and/or hardware components (such as the libraries, software development kits (SDKs), objects, and the like).


Examples of hardware elements may include processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate arrays (FPGA), logic gates, registers, semiconductor devices, chips, microchips, chip sets, and so forth. In some embodiments, the one or more processors may be implemented as Complex Instruction Set Computer (CISC) or Reduced Instruction Set Computer (RISC) processors; x86 instruction set compatible processors; multi-core processors; or any other microprocessor or central processing unit (CPU). In various implementations, the one or more processors may be dual-core processor(s), dual-core mobile processor(s), and so forth.


Computer-related systems, computer systems, and systems, as used herein, include any combination of hardware and software. Examples of software may include software components, programs, applications, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computer code, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints.


For the purposes of this disclosure a module is a software, hardware, or firmware (or combinations thereof) system, process or functionality, or component thereof, that performs or facilitates the processes, features, and/or functions described herein (with or without human interaction or augmentation). A module can include sub-modules. Software components of a module may be stored on a computer readable medium for execution by a processor. Modules may be integral to one or more servers, or be loaded and executed by one or more servers. One or more modules may be grouped into an engine or an application.


One or more aspects of at least one embodiment may be implemented by representative instructions stored on a machine-readable medium which represents various logic within the processor, which when read by a machine causes the machine to fabricate logic to perform the techniques described herein. Such representations, known as “IP cores,” may be stored on a tangible, machine readable medium and supplied to various customers or manufacturing facilities to load into the fabrication machines that make the logic or processor. Of note, various embodiments described herein may, of course, be implemented using any appropriate hardware and/or computing software languages (e.g., C++, Objective-C, Swift, Java, JavaScript, Python, Perl, QT, and the like).


For example, exemplary software specifically programmed in accordance with one or more principles of the present disclosure may be downloadable from a network, for example, a website, as a stand-alone product or as an add-in package for installation in an existing software application. For example, exemplary software specifically programmed in accordance with one or more principles of the present disclosure may also be available as a client-server software application, or as a web-enabled software application. For example, exemplary software specifically programmed in accordance with one or more principles of the present disclosure may also be embodied as a software package installed on a hardware device.


For the purposes of this disclosure the terms “user,” “subscriber,” “consumer” or “customer” should be understood to refer to a user of an application or applications as described herein and/or a consumer of data supplied by a data provider. By way of example, and not limitation, the term “user” or “subscriber” can refer to a person who receives data provided by the data or service provider over the Internet in a browser session, or can refer to an automated software application which receives the data and stores or processes the data. Those skilled in the art will recognize that the methods and systems of the present disclosure may be implemented in many manners and as such are not to be limited by the foregoing exemplary embodiments and examples. In other words, functional elements may be performed by single or multiple components, in various combinations of hardware and software or firmware, and individual functions may be distributed among software applications at either the client level or server level or both. In this regard, any number of the features of the different embodiments described herein may be combined into single or multiple embodiments, and alternate embodiments having fewer than, or more than, all of the features described herein are possible.


Functionality may also be, in whole or in part, distributed among multiple components, in manners now known or to become known. Thus, myriad software/hardware/firmware combinations are possible in achieving the functions, features, interfaces and preferences described herein. Moreover, the scope of the present disclosure covers conventionally known manners for carrying out the described features and functions and interfaces, as well as those variations and modifications that may be made to the hardware or software or firmware components described herein as would be understood by those skilled in the art now and hereafter.


Furthermore, the embodiments of methods presented and described as flowcharts in this disclosure are provided by way of example in order to provide a more complete understanding of the technology. The disclosed methods are not limited to the operations and logical flow presented herein. Alternative embodiments are contemplated in which the order of the various operations is altered and in which sub-operations described as being part of a larger operation are performed independently.


While various embodiments have been described for purposes of this disclosure, such embodiments should not be deemed to limit the teaching of this disclosure to those embodiments. Various changes and modifications may be made to the elements and operations described above to obtain a result that remains within the scope of the systems and processes described in this disclosure.

Claims
  • 1. A method comprising: collecting, by a device, data at a location, the location data comprising information related to an event involving a user and an environment of the location in relation to the user;analyzing, by the device, the collected data;determining, by the device, based on the analysis, characteristics related to an activity of the event;determining, by the device, based on the analysis, characteristics related to the environment of the location;generating, by the device, a location-based recommendation based on the characteristics of the event and the characteristics of the location, the location-based recommendation comprising electronic instructions for controlling real-world and digital components of the location; andcommunicating, over a network, the location-based recommendation to at least one device at the location, the communication causing the at least one device to execute the electronic instructions so as to manipulate the real-world and digital components in relation to performance of the activity of the event by the user.
  • 2. The method of claim 1, wherein the manipulation of the real-world and digital components comprises automatic control of electronic devices at the location, the automatic control causing the electronic devices to operate in a manner as directed by the electronic instructions.
  • 3. The method of claim 1, further comprising: compiling a notification related to the location-based recommendation; andcommunicating, over the network, the notification to the at least one device for display on a display associated with the at least one device.
  • 4. The method of claim 3, wherein the notification is visibly displayed upon communication, wherein the visible display provides visibly displayed indications for the user related to how the user is recommended to act based on the electronic instructions.
  • 5. The method of claim 4, wherein the notification further comprises information indicating at least the determined environment characteristics.
  • 6. The method of claim 1, further comprising: determining, based on the environmental and event characteristics, a context, the context corresponding to a manner in which the activity is performed at the location; anddetermining, based on the context, a digital content item, the digital content item comprising content related to a product or service that corresponds to managing the location in line with the environmental and event characteristics.
  • 7. The method of claim 6, further comprising: communicating, to the at least one device, the digital content item, the communication causing the digital content item to be displayed and enabling access to the product or service.
  • 8. The method of claim 1, wherein the environmental characteristics correspond to air quality metrics of the location, wherein the location-based recommendation is directed to improving the air quality metrics during the event.
  • 9. The method of claim 1, wherein the data related to the event corresponds to screen usage, motion or activity data at the location, cloud-hosted data about the user, biometrics of the user, and network data related to a network at the location.
  • 10. A device comprising: a processor configured to: collect data at a location, the location data comprising information related to an event involving a user and an environment of the location in relation to the user; analyze the collected data; determine, based on the analysis, characteristics related to an activity of the event; determine, based on the analysis, characteristics related to the environment of the location; generate a location-based recommendation based on the characteristics of the event and the characteristics of the location, the location-based recommendation comprising electronic instructions for controlling real-world and digital components of the location; and communicate, over a network, the location-based recommendation to at least one device at the location, the communication causing the at least one device to execute the electronic instructions so as to manipulate the real-world and digital components in relation to performance of the activity of the event by the user.
  • 11. The device of claim 10, wherein the manipulation of the real-world and digital components comprises automatic control of electronic devices at the location, the automatic control causing the electronic devices to operate in a manner as directed by the electronic instructions.
  • 12. The device of claim 10, wherein the processor is further configured to: compile a notification related to the location-based recommendation; and communicate, over the network, the notification to the at least one device for display on a display associated with the at least one device.
  • 13. The device of claim 12, wherein the notification is visibly displayed upon communication, wherein the visible display provides visibly displayed indications for the user related to how the user is recommended to act based on the electronic instructions.
  • 14. The device of claim 13, wherein the notification further comprises information indicating at least the determined environment characteristics.
  • 15. The device of claim 10, wherein the processor is further configured to: determine, based on the environmental and event characteristics, a context, the context corresponding to a manner in which the activity is performed at the location; determine, based on the context, a digital content item, the digital content item comprising content related to a product or service that corresponds to managing the location in line with the environmental and event characteristics; and communicate, to the at least one device, the digital content item, the communication causing the digital content item to be displayed and enabling access to the product or service.
  • 16. The device of claim 10, wherein: the environmental characteristics correspond to air quality metrics of the location, wherein the location-based recommendation is directed to improving the air quality metrics during the event, and the data related to the event corresponds to screen usage, motion or activity data at the location, cloud-hosted data about the user, biometrics of the user, and network data related to a network at the location.
  • 17. A non-transitory computer-readable storage medium tangibly encoded with computer-executable instructions that when executed by a device, perform a method comprising: collecting, by a device, data at a location, the location data comprising information related to an event involving a user and an environment of the location in relation to the user; analyzing, by the device, the collected data; determining, by the device, based on the analysis, characteristics related to an activity of the event; determining, by the device, based on the analysis, characteristics related to the environment of the location; generating, by the device, a location-based recommendation based on the characteristics of the event and the characteristics of the location, the location-based recommendation comprising electronic instructions for controlling real-world and digital components of the location; and communicating, over a network, the location-based recommendation to at least one device at the location, the communication causing the at least one device to execute the electronic instructions so as to manipulate the real-world and digital components in relation to performance of the activity of the event by the user.
  • 18. The non-transitory computer-readable storage medium of claim 17, wherein the manipulation of the real-world and digital components comprises automatic control of electronic devices at the location, the automatic control causing the electronic devices to operate in a manner as directed by the electronic instructions.
  • 19. The non-transitory computer-readable storage medium of claim 17, further comprising: compiling a notification related to the location-based recommendation; and communicating, over the network, the notification to the at least one device for display on a display associated with the at least one device, wherein the notification is visibly displayed upon communication, wherein the visible display provides visibly displayed indications for the user related to how the user is recommended to act based on the electronic instructions, wherein the notification further comprises information indicating at least the determined environment characteristics.
  • 20. The non-transitory computer-readable storage medium of claim 17, further comprising: determining, based on the environmental and event characteristics, a context, the context corresponding to a manner in which the activity is performed at the location; determining, based on the context, a digital content item, the digital content item comprising content related to a product or service that corresponds to managing the location in line with the environmental and event characteristics; and communicating, to the at least one device, the digital content item, the communication causing the digital content item to be displayed and enabling access to the product or service.
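For readers tracing the claimed flow, the independent claims above (10 and 17) recite the same pipeline: collect location and event data, determine activity and environment characteristics, generate a location-based recommendation containing electronic instructions, and communicate those instructions to a device for execution. The sketch below is purely illustrative and is not the patented implementation; every name (`LocationData`, `Recommendation`, the PM2.5 threshold, the instruction strings) is a hypothetical assumption introduced only to make the claimed steps concrete.

```python
from dataclasses import dataclass

# Hypothetical data containers; the claims do not prescribe any schema.

@dataclass
class LocationData:
    event: dict          # e.g., {"activity": "yoga"} -- event/activity signals
    environment: dict    # e.g., {"pm2_5": 42} -- sensed environment metrics

@dataclass
class Recommendation:
    instructions: list   # "electronic instructions" for location components
    summary: str         # human-readable note (cf. notification claims 3-5)

def analyze(data: LocationData):
    """Determine activity and environment characteristics from collected data."""
    activity = {"name": data.event.get("activity", "unknown")}
    # Assumed threshold for illustration only (claim 8: air quality metrics).
    env = {"air_quality_poor": data.environment.get("pm2_5", 0) > 35}
    return activity, env

def generate_recommendation(activity: dict, env: dict) -> Recommendation:
    """Generate a location-based recommendation with device instructions."""
    instructions = []
    if env["air_quality_poor"]:
        instructions += ["air_purifier:on", "hvac:increase_ventilation"]
    return Recommendation(instructions, f"Prepare location for {activity['name']}")

def communicate(rec: Recommendation, device_execute):
    """Send instructions to a device, which executes them to control components."""
    return [device_execute(instr) for instr in rec.instructions]

# Usage: an event with poor sensed air quality triggers purifier/HVAC control.
data = LocationData(event={"activity": "yoga"}, environment={"pm2_5": 42})
act, env = analyze(data)
rec = generate_recommendation(act, env)
executed = communicate(rec, lambda instr: f"executed {instr}")
```

Running this toy pipeline, the poor-air-quality branch fires and both illustrative instructions are "executed" by the stand-in device callback.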