Embodiments described herein generally relate to systems and methods for improved interfacing between one or more human users and one or more computing devices.
Human users rely on computing devices to perform a wide variety of tasks, including tasks that until recently would have been completed through direct human-to-human transactions. For example, computing devices are now used to manage financial matters, make reservations for travel or entertainment, make electronic purchases, and control other smart machines, such as thermostats, refrigerators, etc.
In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. Some embodiments are illustrated by way of example, and not of limitation, in the figures of the accompanying drawings.
Various examples are directed to systems and methods for interfacing one or more computing devices with a plurality of human users. When multiple users utilize a computing device interface, there is a potential for conflicts between the users about the way that the computing device is or should be used. Various interface examples described herein are arranged to detect and remediate conflicts between multiple users.
In multi-user scenarios, unmediated conflicts between users can lead to inconsistent and inefficient operation of computing devices. For example, a first user may instruct a computing device to perform a first action. A second user may instruct the computing device to reverse the first action and/or to execute a second action inconsistent with the first action. In this way, a computing device and/or system of computing devices operates inefficiently.
To avoid inefficient operation, various examples herein include one or more computing devices that are programmed to detect conflicts between multiple human users and to remediate those conflicts. Detecting and remediating conflicts among human users in this manner may lead to faster, more efficient use of computing resources.
An example interface system may receive goal data describing one or more shared goals held in common by a set of one, two, or more users. Shared goals may be of any suitable type, such as financial goals, business goals, etc. In some examples, a family may have a common financial goal to save for retirement, a child's education, etc. An example of a user business goal could be meeting a periodic budget, meeting a sales goal, etc.
The interface system may also receive transaction data and audio data. Transaction data may describe one or more transactions made by one or more of the users. Transactions may include financial transactions, such as purchases of goods or services, product sales, trades in securities, etc. Audio data may indicate voice data from one or more of the users. Voice data may indicate, for example, voice commands provided by one or more users to the interface system, conversation between the users, etc. The interface system may analyze the audio data to extract additional information such as, for example, word data indicating key words spoken by the users, tone data indicating the users' emotional state, etc.
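The goal, transaction, and audio data described above may be represented in any suitable form. The following sketch shows one hypothetical representation in Python; the class and field names are illustrative assumptions and not part of the disclosure:

```python
from dataclasses import dataclass, field

# Hypothetical shapes for the goal data, transaction data, and audio-derived
# data described above. Field names are illustrative assumptions.

@dataclass
class Goal:
    goal_id: str
    goal_type: str          # e.g., "budget", "savings", "sales"
    target_amount: float
    period_days: int        # period over which the goal is measured

@dataclass
class Transaction:
    user_id: str
    amount: float
    category: str

@dataclass
class AudioFeatures:
    user_id: str
    keywords: list = field(default_factory=list)  # key words extracted from speech
    mean_volume_db: float = 0.0                   # tone proxy for emotional state

goal = Goal("g1", "budget", 500.0, 30)
tx = Transaction("user_a", 120.0, "dining")
audio = AudioFeatures("user_a", ["unfair", "overspending"], -18.0)
print(goal.goal_type, tx.amount, audio.keywords)
```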
The interface system may utilize the received data to detect a goal conflict between the set of users. For example, transaction data may indicate that one user is spending at a level that is inconsistent with a financial goal. Also, voice data may include keywords that indicate that one or more users are not in agreement with one or more goals of the group. Voice data may also show that one or more of the users are speaking with raised tones, which may indicate a goal conflict.
In response to detecting a goal conflict, the interface system may select and execute a mediation routine. The mediation routine may prompt the set of users to resolve the goal conflict, for example, by modifying the goal and/or by modifying the behavior of one or more of the set of users that is inconsistent with the goal. In one example mediation routine, the interface system determines a modified goal for the set of users that is consistent with the users' behavior and/or that minimizes the behavioral changes that the users are to make to be consistent with the goal. The interface system may select a mediation routine, for example, based on a type of goal conflict detected. For example, some goal conflicts may arise because one or more of the users do not agree with the goal. Goal conflicts of this type may be remediated, for example, with a mediation routine that takes the users through a process of selecting a new goal that can be agreed to by all or most of the users in the set of users. Another type of goal conflict may arise if one or more users are having difficulty abiding by a goal. Goal conflicts of this type may be remediated, for example, with a mediation routine that is directed to identifying ways for the wayward users to conform to the goal (e.g., by helping the users budget, by finding savings in one area that would offset excess spending in another area, etc.).
The interface system 102 may be or include any suitable computing device or devices. In some examples, the interface system 102 includes one or more servers or other suitable computing devices. The interface system 102 may be implemented at one or more computing devices at a single location and/or at multiple computing devices distributed over different locations.
User computing devices 108, 110 may be utilized by users 106A, 106B to access functionality of the environment 100. For example, user computing devices 108, 110 may include one or more mobile telephones, smart speaker devices, tablet computers, laptop computers, desktop computers, etc. User computing devices 108, 110 may be configured with various input/output (I/O) devices for receiving input from and providing output to users 106A, 106B. For example, the user computing devices 108, 110 may include one or more microphones or other audio sensors to receive audio data. Audio data may describe, for example, the users' speech. User computing devices 108, 110 may also include one or more speakers for providing audio output to the users 106A, 106B. Some user computing devices 108, 110 may also include a display or other output device for providing visual output. Further details of example user computing devices are provided herein with respect to
In some examples, user computing devices 108, 110 may execute interface applications 114A, 114B. Interface applications 114A, 114B may provide the users 106A, 106B with access to computer functionality executed at the user computing devices 108, 110, at the interface system 102, and/or at another system. In some examples, interface applications 114A, 114B implement a virtual assistant that provides an audio interface between the users 106A, 106B and one or more computing devices.
Smart machines 128 may include any suitable household or other device that is network-enabled to provide usage data to the interface system 102. Example smart machines 128 in a household setting may include a thermostat, a hot water heater, etc. Example smart machines 128 in a business setting may include, for example, industrial equipment, a stock or supply dispensing machine, etc. Smart machines 128 may provide the interface system 102 with usage data 136 describing usage of the smart machines 128 by users 106A, 106B. For example, a network-enabled thermostat may provide usage data 136 describing thermostat settings including, in some examples, a user 106A, 106B who initiated a change to the thermostat settings. In another example, a network-enabled Computer Numerical Control (CNC) machine may provide usage data 136 describing the user 106A, 106B who uses the CNC machine as well as a description of the type of work performed (e.g., work pieces used, etc.). In some examples, smart machines may also include one or more machines that measure biometric data of the user 106A, 106B, such as heart rate, skin temperature, etc. For example, a smart machine may be a wearable computing device worn by a user 106A, 106B, a remote sensor mounted in the user's environment (e.g., on a wall, on an appliance), etc.
One or more media systems 130 may include any system that records communications of the user 106A, 106B, for example, utilizing social media. For example, media systems 130 may include one or more servers from social media providers, such as Facebook, Inc., Twitter, Inc., etc. Media systems 130 may provide media data 138 that may include, for example, social media feeds of one or more of the users 106A, 106B. Biometric parameters may also indicate conflict. For example, an elevated heart rate, temperature, etc., may indicate that the user 106A, 106B is angry, and therefore indicate a conflict.
One or more account management systems 132 may be associated, for example, with a financial services institution that maintains one or more financial accounts on behalf of a user 106A, 106B and/or a group of users. Financial accounts may include, for example, checking accounts, savings accounts, credit accounts, etc. Account management systems 132 may provide transaction data 140 describing transactions on one or more accounts including, for example, the user 106A, 106B initiating the transactions, an amount of the transactions, etc.
One or more advisor systems 134 may be associated, for example, with a financial advisor or other advisor that, manually or automatically, facilitates the creation of goals for the set of users 106A, 106B. Advisor systems 134 may provide goal data 142 describing one or more goals for the set of users 106A, 106B.
The interface system 102 may execute a conflict detection application 116 that may receive the various data 136, 138, 140, 142 as well as audio and other user input data from the user computing devices 108, 110 and use the data to detect a goal conflict and determine a mediation routine 120A, 120B, 120C for remediating the goal conflict. The conflict detection application 116 may detect a goal conflict, for example, when the actions or words of one or more of the users 106A, 106B are inconsistent with at least one goal, for example, described by goal data 142.
In some examples, the interface system 102 utilizes database processing to detect goal conflicts and identify mediation routines. For example, the interface system 102 may be in communication with a database 104 including data for detecting goal conflicts among users 106A, 106B and selecting a mediation routine 120A, 120B, 120C. For example, the database 104 may include various tables 122, 124, 126 including records. A record, for example, may be stored in a row of a database table. A record may include a location for storing data corresponding to a set of columns. In some examples, each record is capable of storing a value for each column of the database table; however, not all records include a value for each column.
In the example of
At conflict type columns, records at the keywords database table 122 may reference corresponding records at the conflicts database table 124. For example, conflict type columns at the keywords database table 122 may include foreign keys referring to records at the conflicts database table 124. The conflicts database table 124 may include records that indicate a conflict type and also one or more conflict parameters indicating other descriptors of the conflict type. For example, a conflicts database table 124 record (a conflict record) may indicate a type of conflict and various conflict parameters. The interface system 102 may detect a conflict and/or classify a conflict by testing the conflict parameters indicated at the conflicts database table 124.
Conflict parameters, in some examples, may refer to other factors that are likely to be present if the indicated conflict type is present. For example, some conflict parameters may refer to other keywords, to transaction goal conflicts, to tone goal conflicts, to biometric data from users 106A, 106B indicative of a conflict, etc., for example, as described herein. Keywords records may be utilized to detect goal conflicts in isolation, or in conjunction with other factors. For example, if all or a sufficient number or portion of the conflict parameters of the conflicts database table 124 are present, the interface system 102 may determine that a conflict exists. Also, in some examples, keyword records may be used as one factor in a multi-factor determination, for example, as described herein with respect to
A mediation database table 126 may include records for different types of goal conflicts as well as references to one or more mediation routines 120A, 120B, 120C that may be executed to remediate the indicated type of goal conflict. For example, when the interface system 102 detects a goal conflict and classifies the goal conflict as a particular type, it may refer to the mediation database table 126 to find the mediation record corresponding to the conflict type. The mediation record may indicate one or more mediation routines 120A, 120B, 120C that may be suitable for addressing the goal conflict. In some examples, the mediation record may also indicate one or more mediation parameters that the interface system 102 may test to select one or more mediation routines 120A, 120B, 120C to execute. The various tables 122, 124, 126 described herein may have additional columns and/or may omit one or more of the columns shown.
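The tables 122, 124, 126 and their cross-references may be realized with any suitable relational database. The following sketch shows one possible realization using SQLite; all table names, column names, and example values are illustrative assumptions and not part of the disclosure:

```python
import sqlite3

# Illustrative sketch of the keywords (122), conflicts (124), and mediation
# (126) tables. Column names and example rows are assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE conflicts (
    conflict_type TEXT PRIMARY KEY,
    parameters    TEXT            -- factors to test, e.g. comma-separated
);
CREATE TABLE keywords (
    keyword       TEXT PRIMARY KEY,
    conflict_type TEXT REFERENCES conflicts(conflict_type)  -- foreign key
);
CREATE TABLE mediation (
    conflict_type TEXT REFERENCES conflicts(conflict_type),
    routine_name  TEXT,           -- e.g., a routine such as 120A, 120B, 120C
    parameters    TEXT
);
""")
conn.execute("INSERT INTO conflicts VALUES ('disagreement', 'raised_tone,keywords')")
conn.execute("INSERT INTO keywords VALUES ('unfair', 'disagreement')")
conn.execute("INSERT INTO mediation VALUES ('disagreement', 'goal_reselection', 'cool_down=0')")

# Keyword lookup -> conflict type -> candidate mediation routine
row = conn.execute("""
    SELECT m.routine_name FROM keywords k
    JOIN mediation m ON m.conflict_type = k.conflict_type
    WHERE k.keyword = ?""", ("unfair",)).fetchone()
print(row[0])  # goal_reselection
```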
At operation 304, the interface system (e.g., the conflict detection application) may receive transaction data describing one or more transactions made by users of the set of users. Transactions may include, for example, purchase transactions made from checking accounts, credit accounts, etc.; deposit transactions to savings accounts, checking accounts, etc.; payments (e.g., bill payments) from checking accounts, credit accounts, etc.
At operation 306, the interface system (e.g., the conflict detection application 116) may receive audio data. The audio data may be received, directly or indirectly, from a user computing device such as one of user computing devices 108, 110 of
At operation 308, the interface system may determine if a goal conflict is detected. The determination at operation 308 may be based, at least in part, on the transaction data received at operation 304 and on the audio data received at operation 306. In some examples, the determining at operation 308 may also be based on additional data such as, for example, media data described herein. Additional examples describing how a goal conflict may be detected are described herein including, for example, with reference to
If a goal conflict is detected at operation 308, the interface system may select a mediation routine at operation 310. The mediation routine may be selected, for example, based on the transaction data, audio data, or other suitable data. At operation 312, the interface system may execute the selected mediation routine.
At operation 402, the interface system may determine whether the received data describes one or more transaction conflict indicators. A transaction conflict indicator is an indicator of a goal conflict that is based on one or more transactions. For example, a transaction conflict indicator may occur if one or more transactions described by transaction data deviates from one or more of the goals of the set of users. A transaction may deviate from a goal, for example, if a transaction or sum of a set of transactions is inconsistent with the goal. Depending on the type of goal, a transaction or set of transactions may deviate from the goal when it is too low or too high. For example, if the goal is a budgeted level of spending, a transaction may deviate from the goal if the transaction is above a budgeted amount for the transaction. Also, if the sum of a set of transactions over a period of time (e.g., a month) exceeds the budgeted spending for that period, then the set of transactions during the period may constitute a transaction goal conflict. In some examples, a transaction or set of transactions may deviate from a goal when the transaction or set of transactions is too low, for example, where a goal is a level of sales, a level of savings, etc.
If a transaction conflict indicator is detected at operation 402, the interface system may write a description of the transaction conflict indicator, for example, to a conflict record. The description may be stored at any suitable data storage including, for example, a table of a database, a memory, etc. In some examples, the interface system may also determine a weighting for one or more determined transaction goal conflicts. The weighting may indicate the severity of the goal conflict. For example, if a transaction or set of transactions exceeds a spending goal by 50%, the transaction conflict indicator may be given a large weight. On the other hand, if the transaction or set of transactions exceeds the spending goal by 1%, the resulting transaction conflict indicator may be given a smaller weight. In some examples, each transaction that is inconsistent with the goal may be considered a separate goal conflict. Accordingly, the interface system may store a number of goal conflicts from the conflict data that are transaction conflict indicators.
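The deviation check and severity weighting described above may be sketched, for example, as follows; the weighting scheme (proportional to the fractional overrun) and the ceiling/floor distinction are illustrative assumptions:

```python
# Sketch of the transaction-deviation check: sum transaction amounts over a
# period, compare against the goal's target, and weight the indicator by the
# size of the deviation (a 50% overrun weighs more than a 1% overrun).

def transaction_conflict_indicator(transactions, target, goal_is_ceiling=True):
    """Return (is_conflict, weight) for a set of transaction amounts."""
    total = sum(transactions)
    if goal_is_ceiling:                    # e.g., a budgeted spending level
        deviation = (total - target) / target
    else:                                  # e.g., a savings or sales goal
        deviation = (target - total) / target
    if deviation <= 0:
        return False, 0.0
    # Weight grows with the fractional deviation, capped at 1.0.
    return True, min(deviation, 1.0)

conflict, weight = transaction_conflict_indicator([300.0, 450.0], target=500.0)
print(conflict, round(weight, 2))  # True 0.5 (spent 750 against a 500 budget)
```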
After the indication of a transaction conflict is written at operation 404 (and/or if no transaction conflict is detected at operation 402), the interface system may determine if there are one or more tone conflict indicators at operation 406. A tone conflict indicator may occur if the received audio data shows that one or more users are using a raised tone of voice. For example, the interface system may analyze the received audio data to detect raised tones. Raised tones may be detected in any suitable manner. In some examples, the interface system may detect changes in volume. For example, if one user's voice becomes louder by more than a threshold, then a tone conflict may be detected. Also, in some examples, the interface system may detect a change in frequency of a user's voice. For example, if the user's voice becomes more high-pitched, it may be a tone conflict indicator. In some examples, the interface system may assign a weight to detected tone conflict indicators. For example, a higher weight may be assigned to tone conflict indicators involving multiple users with raised voices. If a tone conflict indicator is detected, the interface system may store the indicator, for example, to the conflict record, at operation 408.
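Raised-tone detection based on a volume-change threshold may be sketched as follows; the decibel threshold and per-user baseline scheme are illustrative assumptions, and a fuller implementation might also track pitch changes:

```python
# Sketch of raised-tone detection: flag a tone conflict indicator when any
# audio frame exceeds the user's recent baseline volume by a threshold.

def tone_conflict_indicator(volume_db_frames, baseline_db, rise_threshold_db=6.0):
    """Return True if any frame's volume rises above baseline by the threshold."""
    return any(v - baseline_db > rise_threshold_db for v in volume_db_frames)

frames = [-20.0, -19.5, -11.0, -18.0]  # one loud frame at -11 dB
print(tone_conflict_indicator(frames, baseline_db=-20.0))  # True (9 dB rise)
```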
If no tone conflict indicator is detected (or after one or more tone conflict indicators are written at operation 408), the interface system may, at operation 410, determine if one or more keyword conflict indicators are detected. A keyword conflict indicator may occur, for example, if the audio data indicates that one or more of the set of users has used a keyword or keywords indicating goal conflict. In some examples, the interface system may utilize a keyword database table and conflict database table, such as the tables 122 and 124, to detect keyword goal conflicts. For example, if a particular word detected in the audio data has a keyword record in the keyword database table, the interface system may determine conflict parameters, either from the keyword database table or from a conflict type database table. The interface system may evaluate the conflict parameters to determine if the indicated goal conflict or type of goal conflict is present. If a keyword conflict indicator is present, the interface system may write a description of the keyword conflict indicator at operation 412.
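The keyword lookup and conflict-parameter test at operation 410 may be sketched as follows, with the table contents held in dictionaries for illustration; all keywords, conflict types, and parameter names are assumptions:

```python
# Sketch of the keyword check: look up each detected word in a keyword table,
# fetch the conflict parameters for its conflict type, and count the keyword
# as an indicator only if the expected accompanying factors are observed.

KEYWORD_TABLE = {"unfair": "disagreement", "waste": "overspending"}
CONFLICT_PARAMS = {
    "disagreement": ["raised_tone"],          # factors expected with the keyword
    "overspending": ["transaction_conflict"],
}

def keyword_conflict_indicators(words, observed_factors):
    indicators = []
    for word in words:
        conflict_type = KEYWORD_TABLE.get(word.lower())
        if conflict_type is None:
            continue                          # word has no keyword record
        params = CONFLICT_PARAMS[conflict_type]
        if all(f in observed_factors for f in params):
            indicators.append((word, conflict_type))
    return indicators

hits = keyword_conflict_indicators(["Unfair", "budget"], {"raised_tone"})
print(hits)  # [('Unfair', 'disagreement')]
```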
If no keyword conflict indicator is detected (or after the indication of one or more keyword goal conflicts is written at operation 412), the interface system may, at operation 414, determine if the count of conflict indicators detected at operations 402, 406, and 410 is greater than a threshold. In some examples, the count utilized at operation 414 may be a weighted count. For example, some indicators may be weighted, as described above. Also, in some examples, different categories of indicators may be weighted differently. For example, tone conflict indicators may be weighted higher than keyword conflict indicators. If the count is higher than the threshold, the interface system may determine that there is a goal conflict, at operation 418. If the count is not higher than the threshold, then the interface system may determine that there is no goal conflict at operation 416.
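The weighted-count determination at operation 414 may be sketched as follows; the per-category weights and the threshold value are illustrative assumptions:

```python
# Sketch of the multi-factor decision: sum the weighted indicators gathered
# at operations 402, 406, and 410, with tone indicators weighted more heavily,
# and declare a goal conflict when the score exceeds a threshold.

CATEGORY_WEIGHTS = {"transaction": 1.0, "tone": 1.5, "keyword": 1.0}

def goal_conflict_detected(indicators, threshold=2.0):
    """indicators: list of (category, weight) tuples from the earlier operations."""
    score = sum(CATEGORY_WEIGHTS[cat] * w for cat, w in indicators)
    return score > threshold

found = goal_conflict_detected([("transaction", 0.5), ("tone", 1.0), ("keyword", 1.0)])
print(found)  # True: 0.5 + 1.5 + 1.0 = 3.0 > 2.0
```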
At operation 502, the interface system may classify users from the set of users that is in conflict. Classifying the users from the set of users may include, for example, determining the positions on the conflict of some or all of the users of the set of users. For example, one or more users who have engaged in transactions that are inconsistent with a group goal may be classified to one position on the goal conflict while one or more users who have not may be classified to another position on the goal conflict. In some examples, users may be classified to different positions of a goal conflict based on keywords and/or tone used by the users. For example, if two users utilize conflict-indicating keywords and/or tone at about the same time, those users may be classified to different sides of the goal conflict.
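The classification at operation 502 may be sketched, for example, as a partition of users into positions based on whether their transactions deviated from the goal; the identifiers and position labels are illustrative assumptions:

```python
# Sketch of user classification: users whose transactions were inconsistent
# with the group goal go to one position; the remaining users to the other.

def classify_users(users, deviating_user_ids):
    positions = {"deviating": [], "adhering": []}
    for user in users:
        side = "deviating" if user in deviating_user_ids else "adhering"
        positions[side].append(user)
    return positions

positions = classify_users(["user_a", "user_b"], {"user_a"})
print(positions)  # {'deviating': ['user_a'], 'adhering': ['user_b']}
```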
At operation 504, the interface system may determine a severity of the goal conflict. The severity of the goal conflict may be measured in any suitable manner. Referring to
At operation 506, the interface system may select a mediation routine for the goal conflict. The mediation routine may be selected based on, for example, the conflict type, the classifications of the conflicted users, the severity of the goal conflict and, in some examples, conflict indicators. For example, for a goal conflict characterized by the use of harsh keywords and tone conflict indicators, the interface system may select a mediation routine that includes a waiting or “cool-down” period between the time that the mediation routine begins and the time that one or more of the set of users is contacted. Also, in some examples, for a goal conflict characterized by having a large majority of the set of users on one side, the interface system may select a mediation routine that is directed towards mediating the goal conflict by changing the goal that is the source of the goal conflict.
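The selection logic at operation 506 may be sketched as follows; the routine names and decision thresholds are illustrative assumptions and not part of the disclosure:

```python
# Sketch of mediation-routine selection keyed on conflict characteristics:
# harsh tone triggers a cool-down routine; a lopsided user classification
# favors renegotiating the goal itself; otherwise route by severity.

def select_mediation_routine(has_tone_conflict, majority_fraction, severity):
    if has_tone_conflict:
        return "cool_down_then_mediate"   # delay contact, then mediate
    if majority_fraction >= 0.75:
        return "goal_modification"        # most users are on one side of the goal
    if severity > 0.5:
        return "behavior_adjustment"      # help wayward users conform to the goal
    return "reminder_only"

print(select_mediation_routine(False, 0.8, 0.3))  # goal_modification
```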
The processor unit 610 may be coupled, either directly or via appropriate intermediary hardware, to a display 650 and to one or more input/output (I/O) devices 660, such as a keypad, a touch panel sensor, a microphone, and the like. Such I/O devices 660 may include a touch sensor for capturing fingerprint data, a camera for capturing one or more images of the user, a retinal scanner, or any other suitable devices. The I/O devices 660 may be used to implement I/O channels, as described herein. In some examples, the I/O devices 660 may also include sensors.
Similarly, in some examples, the processor unit 610 may be coupled to a transceiver 670 that interfaces with an antenna 690. The transceiver 670 may be configured to both transmit and receive cellular network signals, wireless data signals, or other types of signals via the antenna 690, depending on the nature of the computing device implemented by the architecture 600. Although one transceiver 670 is shown, in some examples, the architecture 600 includes additional transceivers. For example, a wireless transceiver may be utilized to communicate according to an IEEE 802.11 specification, such as Wi-Fi and/or a short-range communication medium. Some short-range communication mediums, such as NFC, may utilize a separate, dedicated transceiver. Further, in some configurations, a Global Positioning System (GPS) receiver 680 may also make use of the antenna 690 to receive GPS signals. In addition to or instead of the GPS receiver 680, any suitable location-determining sensor may be included and/or used, including, for example, a Wi-Fi positioning system. In some examples, the architecture 600 (e.g., the processor unit 610) may also support a hardware interrupt. In response to a hardware interrupt, the processor unit 610 may pause its processing and execute an interrupt service routine (ISR).
The representative hardware layer 704 comprises one or more processing units 706 having associated executable instructions 708. The executable instructions 708 represent the executable instructions of the software architecture 702, including implementation of the methods, modules, components, and so forth of
In the example architecture of
The operating system 714 may manage hardware resources and provide common services. The operating system 714 may include, for example, a kernel 728, services 730, and drivers 732. The kernel 728 may act as an abstraction layer between the hardware and the other software layers. For example, the kernel 728 may be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and so on. The services 730 may provide other common services for the other software layers. In some examples, the services 730 include an interrupt service. The interrupt service may detect the receipt of a hardware or software interrupt and, in response, cause the software architecture 702 to pause its current processing and execute an ISR when an interrupt is received. The ISR may generate an alert.
The drivers 732 may be responsible for controlling or interfacing with the underlying hardware. For instance, the drivers 732 may include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), Wi-Fi® drivers, NFC drivers, audio drivers, power management drivers, and so forth depending on the hardware configuration.
The libraries 716 may provide a common infrastructure that may be utilized by the applications 720 and/or other components and/or layers. The libraries 716 typically provide functionality that allows other software modules to perform tasks in an easier fashion than by interfacing directly with the underlying operating system 714 functionality (e.g., kernel 728, services 730, and/or drivers 732). The libraries 716 may include system libraries 734 (e.g., C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematic functions, and the like. In addition, the libraries 716 may include API libraries 736 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG), graphics libraries (e.g., an OpenGL framework that may be used to render 2D and 3D graphic content on a display), database libraries (e.g., SQLite that may provide various relational database functions), web libraries (e.g., WebKit that may provide web browsing functionality), and the like. The libraries 716 may also include a wide variety of other libraries 738 to provide many other APIs to the applications 720 and other software components/modules.
The frameworks 718 (also sometimes referred to as middleware) may provide a higher-level common infrastructure that may be utilized by the applications 720 and/or other software components/modules. For example, the frameworks 718 may provide various graphical user interface (GUI) functions, high-level resource management, high-level location services, and so forth. The frameworks 718 may provide a broad spectrum of other APIs that may be utilized by the applications 720 and/or other software components/modules, some of which may be specific to a particular operating system or platform.
The applications 720 include built-in applications 740 and/or third-party applications 742. Examples of representative built-in applications 740 may include, but are not limited to, a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, and/or a game application. The third-party applications 742 may include any of the built-in applications 740 as well as a broad assortment of other applications. In a specific example, the third-party application 742 (e.g., an application developed using the Android™ or iOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as iOS™, Android™, Windows® Phone, or other computing device operating systems. In this example, the third-party application 742 may invoke the API calls 724 provided by the mobile operating system such as the operating system 714 to facilitate functionality described herein.
The applications 720 may utilize built-in operating system functions (e.g., kernel 728, services 730, and/or drivers 732), libraries (e.g., system libraries 734, API libraries 736, and other libraries 738), or frameworks/middleware 718 to create user interfaces to interact with users of the system. Alternatively, or additionally, in some systems, interactions with a user may occur through a presentation layer, such as the presentation layer 744. In these systems, the application/module “logic” can be separated from the aspects of the application/module that interact with a user.
Some software architectures utilize virtual machines. For example, systems described herein may be executed utilizing one or more virtual machines executed at one or more server computing machines. In the example of
The example architecture 800 includes a processor unit 802 comprising at least one processor (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both, processor cores, compute nodes, etc.). The architecture 800 may further comprise a main memory 804 and a static memory 806, which communicate with each other via a link 808 (e.g., bus). The architecture 800 can further include a video display unit 810, an alphanumeric input device 812 (e.g., a keyboard), and a UI navigation device 814 (e.g., a mouse). In some examples, the video display unit 810, alphanumeric input device 812, and UI navigation device 814 are incorporated into a touchscreen display. The architecture 800 may additionally include a storage device 816 (e.g., a drive unit), a signal generation device 818 (e.g., a speaker), a network interface device 820, and one or more sensors (not shown), such as a GPS sensor, compass, accelerometer, or other sensor.
In some examples, the processor unit 802 or another suitable hardware component may support a hardware interrupt. In response to a hardware interrupt, the processor unit 802 may pause its processing and execute an ISR, for example, as described herein.
The storage device 816 includes a machine-readable medium 822 on which is stored one or more sets of data structures and instructions 824 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 824 can also reside, completely or at least partially, within the main memory 804, within the static memory 806, and/or within the processor unit 802 during execution thereof by the architecture 800, with the main memory 804, the static memory 806, and the processor unit 802 also constituting machine-readable media. The instructions 824 stored at the machine-readable medium 822 may include, for example, instructions for implementing the software architecture 702, instructions for executing any of the features described herein, etc.
While the machine-readable medium 822 is illustrated in an example to be a single medium, the term “machine-readable medium” can include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 824. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media include non-volatile memory, including, but not limited to, by way of example, semiconductor memory devices (e.g., electrically programmable read-only memory (EPROM) and electrically erasable programmable read-only memory (EEPROM)) and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
The instructions 824 can further be transmitted or received over a communications network 826 using a transmission medium via the network interface device 820 utilizing any one of a number of well-known transfer protocols (e.g., hypertext transfer protocol (HTTP)). Examples of communication networks include a LAN, a WAN, the Internet, mobile telephone networks, plain old telephone service (POTS) networks, and wireless data networks (e.g., Wi-Fi, 3G, and 5G LTE/LTE-A or WiMAX networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
Various components are described in the present disclosure as being configured in a particular way. A component may be configured in any suitable manner. For example, a component that is or that includes a computing device may be configured with suitable software instructions that program the computing device. A component may also be configured by virtue of its hardware arrangement or in any other suitable manner.
The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more aspects thereof) can be used in combination with others. Other embodiments can be used, such as by one of ordinary skill in the art upon reviewing the above description. The Abstract is to allow the reader to quickly ascertain the nature of the technical disclosure, for example, to comply with 37 C.F.R. § 1.72(b) in the United States of America. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims.
Also, in the above Detailed Description, various features can be grouped together to streamline the disclosure. However, the claims cannot set forth every feature disclosed herein, as embodiments can feature a subset of said features. Further, embodiments can include fewer features than those disclosed in a particular example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment. The scope of the embodiments disclosed herein is to be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.