Non-invasive health monitoring devices are increasingly helping people to better monitor their health status. Some consumer wearable devices have incorporated sensors for gathering biometric data, such as a pulse oximeter, which can be used to generate a photoplethysmogram (PPG). A PPG is an optically obtained plethysmogram which can be used to detect blood volume changes in the microvascular bed of living tissue. A PPG can be obtained using a pulse oximeter which illuminates the skin and measures changes in light absorption.
A pulse oximeter monitors the perfusion of blood to the dermis and subcutaneous tissue of the skin. With each cardiac cycle the heart pumps blood to the periphery causing a pressure pulse that distends the arteries and arterioles in the subcutaneous tissue. A change in volume caused by the pressure pulse is detected by illuminating the skin with the light from a light-emitting diode (LED) and then measuring the amount of light either transmitted or reflected to a photodiode where each cardiac cycle appears as a peak.
A system is described for improving a heart rate prediction. The system may include at least one processor and a memory device including instructions that, when executed by the at least one processor, cause the system to obtain a PPG dataset generated by a PPG sensor included in a device. The instructions, when executed by the at least one processor, may cause the system to input the PPG dataset to a machine learning model trained to classify a PPG signal in the PPG dataset as a predicted heart rate, wherein the machine learning model classifies PPG signals contained in the PPG dataset as predicted heart rates. The instructions, when executed by the at least one processor, may cause the system to obtain accelerometer data generated by an accelerometer included in the device and input the accelerometer data to the machine learning model, which classifies an acceleration signal contained in the accelerometer data as a heart rate, indicating that movement represented by the acceleration signal mimics the heart rate. The instructions, when executed by the at least one processor, may cause the system to identify the PPG signal in the PPG dataset that corresponds to the acceleration signal that mimics the heart rate, and remove the PPG signal that corresponds to the acceleration signal from consideration and/or from the PPG dataset to improve the accuracy of the heart rate prediction.
A method is described for improving a heart rate prediction. The method may include receiving a PPG dataset generated by a PPG sensor and inputting the PPG dataset to a machine learning model to classify a PPG signal in the PPG dataset as a predicted heart rate, wherein the machine learning model has been trained using PPG training data to classify a signal from a generic data source as a heart rate. The method may include receiving accelerometer data generated by an accelerometer and inputting the accelerometer data to the machine learning model to classify an acceleration signal contained in the accelerometer data as a heart rate, wherein movement represented by the acceleration signal mimics the heart rate. The method may include identifying a heart rate confidence score output by the machine learning model that corresponds to the acceleration signal that mimics the heart rate, and removing that heart rate from consideration to improve the accuracy of the true heart rate prediction.
A non-transitory machine-readable storage medium including instructions embodied thereon is provided. The instructions, when executed by at least one processor, may receive a PPG dataset generated by a PPG sensor and input the PPG dataset to a machine learning model to classify a PPG signal in the PPG dataset as a heart rate prediction. The instructions, when executed by at least one processor, may receive accelerometer data generated by an accelerometer, wherein the accelerometer data corresponds to a time frame of the PPG dataset, and input the accelerometer data to the machine learning model to classify an acceleration signal contained in the accelerometer data as a heart rate, wherein movement represented by the acceleration signal mimics the heart rate. The instructions, when executed by at least one processor, may identify the PPG signal in the PPG dataset as corresponding to the acceleration signal that mimics the heart rate and remove the PPG signal that corresponds to the acceleration signal to improve the accuracy of the heart rate prediction.
Technologies are described for improving a heart rate prediction generated using a machine learning model, such as an artificial neural network (ANN) model, a decision tree model, a naive Bayes model, or another appropriate machine learning model. In one example configuration, a heart rate prediction can be improved by identifying a motion-based signal in a photoplethysmogram (PPG) signal dataset that mimics a heart rate, and the motion-based signal can be excluded as a heart rate candidate to prevent the motion-based signal from being selected as a predicted heart rate. For example, an ANN model can be trained using PPG data to classify a signal from a generic data source (e.g., a PPG sensor, an accelerometer, or other data source) as a predicted heart rate. Because some types of repetitive movements (e.g., running, swinging, bouncing, etc.) can generate motion-based signals, these motion-based signals can be introduced as noise into a PPG dataset. When the PPG dataset is input to the ANN model, a motion-based signal may be mistakenly classified as a predicted heart rate instead of a PPG signal that represents a true heart rate.
In order to prevent a motion-based signal from being classified as a predicted heart rate, the motion-based signal can be identified using external data generated by an accelerometer, and the motion-based signal can be excluded as a heart rate candidate. For example, in addition to classifying a PPG signal as a predicted heart rate, the ANN model can classify an acceleration signal included in accelerometer data as corresponding to a heart rate. As an example, accelerometer data generated by an accelerometer can be input to the ANN model, and the ANN model can determine whether an acceleration signal contained in the accelerometer data corresponds to one or more characteristics of a heart rate. In the case that the ANN model classifies the acceleration signal as a heart rate, a motion-based PPG signal (e.g., noise generated by movement) that corresponds to the acceleration signal can be identified in the PPG signal dataset, and the motion-based PPG signal can be removed to prevent the motion-based signal from being selected as the predicted heart rate, thereby improving the probability that a PPG signal that matches the true heart rate is selected. In one example, the motion-based PPG signal identified in the PPG dataset can be removed from consideration as a true heart rate (an actual heart rate of a subject) in order to allow the true heart rate signal in the PPG dataset to be selected as a heart rate prediction. For example, the output of the ANN model can be a heart rate confidence score that indicates whether a motion based signal corresponds to a heart rate, and the corresponding signal in the PPG dataset can be identified and excluded as a heart rate candidate. In another example, it is believed that the motion-based PPG signal can be removed from the PPG signal dataset, such that an actual heart rate signal remains in the PPG signal dataset and the actual heart rate signal can be selected as a heart rate prediction.
In one example configuration, the ANN model can include: a first series of convolution layers used to identify a PPG signal in the PPG data and remove artifacts contained in the PPG data, a fast Fourier transform (FFT) layer used to identify PPG frequencies in the PPG data, and a dense layer used to decode a heart rate value from the PPG frequencies. After training the ANN model, a PPG signal can be obtained from a PPG sensor (e.g., a pulse oximeter) and PPG data representing the PPG signal can be input to the ANN model which outputs a heart rate prediction that represents a heart rate extracted from the PPG signal. As will be appreciated, the ANN model can include additional components used to predict a heart rate.
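As an illustrative, non-limiting sketch of this layer ordering, the following example shows how such a network could be assembled using TensorFlow/Keras. The window length, filter counts, kernel sizes, and number of heart rate classes are assumed values chosen for illustration and are not prescribed by the present description.

    import tensorflow as tf

    WINDOW_SAMPLES = 250   # assumed: roughly a 10-second PPG window at 25 Hz
    NUM_HR_CLASSES = 272   # assumed: one class per BPM from 30-300, plus an "unknown" class

    def build_hr_model():
        inputs = tf.keras.Input(shape=(WINDOW_SAMPLES, 1))
        x = inputs
        # First series of convolution layers: identify the PPG signal and
        # suppress artifacts in the raw window.
        for filters in (16, 32, 32):   # assumed filter counts
            x = tf.keras.layers.Conv1D(filters, kernel_size=5, padding="same",
                                       activation="relu")(x)
        # FFT layer: the magnitude spectrum exposes fundamental and harmonic
        # frequencies and reduces the parameters passed to the dense layer.
        x = tf.keras.layers.Lambda(
            lambda t: tf.abs(tf.signal.rfft(tf.transpose(t, [0, 2, 1]))))(x)
        x = tf.keras.layers.Flatten()(x)
        # Dense decoding layer followed by a softmax over heart rate classes.
        x = tf.keras.layers.Dense(128, activation="relu")(x)
        outputs = tf.keras.layers.Dense(NUM_HR_CLASSES, activation="softmax")(x)
        return tf.keras.Model(inputs, outputs)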
The network architecture of the ANN model provides improvements in the accuracy of heart rate predictions obtained from a PPG signal over previous methods for computing a heart rate from a PPG signal. In particular, the accuracy of heart rate predictions output by the ANN model is improved by placing an FFT layer after a first series of convolution layers and providing the output of the FFT layer to a dense layer of the ANN model. Placement of the FFT layer in this way improves the accuracy of heart rate predictions by using the FFT layer to identify fundamental and harmonic frequencies of a PPG signal, thereby reducing a number of parameters provided to the dense layer of the ANN model.
To further describe the present technology, examples are now provided with reference to the figures.
The ANN model 104, in one example, is an end-to-end neural network. As described in more detail later, the architecture of the ANN model 104 can include a series of convolution layers followed by an FFT layer and a dense decoding layer. The ANN model 104 can be trained to classify a signal as a predicted heart rate using a training dataset of PPG data. After training, the ANN model 104 can be used to classify signals contained in any type of data that correspond to characteristics of a heart rate. For example, the ANN Model 104 can be used to classify signals contained in PPG data 102, and signals contained in accelerometer data 106, as predicted heart rates. A motion-based signal classified as a predicted heart rate can be compared to PPG signal classified heart rates to determine whether the motion-based signal mimics any of the PPG signal classified heart rates. Any motion-based signals identified as corresponding to PPG signal classified heart rates can be removed (e.g., removed from consideration and/or from the PPG data 102) to improve the accuracy of a heart rate probability generated by the ANN model 104. For example, repetitive movements, such as running, swinging, bouncing, and the like can generate a motion-based signal that is introduced as noise into PPG data 102. When the PPG data 102 is input to the ANN model 104, the motion-based signal may be mistakenly classified as a heart rate probability 122 instead of a PPG signal that represents a true heart rate.
A heart rate selection module 112 can be configured to improve the accuracy of a heart rate prediction output by the ANN model 104 by determining whether a PPG heart rate probability 108 output by the ANN model 104 contains a motion-based signal which could potentially be incorrectly selected as a heart rate probability 122.
The PPG data 102 can be input to the ANN model 104, which generates a PPG heart rate probability 108 from the PPG data 102. For example, the PPG heart rate probability 108 may be a PPG heartbeat confidence score having a value between zero (0) and one (1). The PPG heartbeat confidence score can represent a probability that there is a heartbeat-induced fluctuation in the PPG data 102 at that sample. In one example, the accelerometer data 106 can be input to the same ANN model 104, or a portion of the same ANN model 104, which generates a motion-based heart rate probability 110 from the accelerometer data 106. For example, the motion-based heart rate probability 110 may be a motion-based heart rate confidence score having a value between zero (0) and one (1). The motion-based heart rate confidence score can represent a probability that there is a motion-induced fluctuation in the accelerometer data 106 at that sample that mimics a heart rate. The motion-based heart rate probability 110 output by the ANN model 104 can be evaluated to determine whether a motion-based signal in the accelerometer data 106 mimics a heart rate. More specifically, as in block 114, a motion-based signal can be provided to the heart rate selection module 112 when the motion-based heart rate probability 110 indicates a high probability that the motion-based signal in the accelerometer data 106 simulates a heart rate. In the case of a low probability that the motion-based signal in the accelerometer data 106 simulates a heart rate, the motion-based signal is not provided to the heart rate selection module 112.
As in block 116, the heart rate selection module 112 can identify a PPG signal in the PPG data 102 that corresponds to a motion-based signal that mimics a true heart rate (as described in greater detail later) and, as in block 118, the heart rate selection module 112 can remove the PPG signal (as a heart rate candidate or from the PPG data 102). After removing the signal that corresponds to the motion-based signal, the heart rate selection module 112 can select a signal having the highest heart rate confidence score as the true heart rate, as in block 120. The heart rate selection module 112 can then provide the predicted heart rate along with the associated confidence score 122.
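As a hypothetical illustration of the selection performed in blocks 114 through 120, the sketch below assumes the model outputs a confidence score per candidate heart rate value (in BPM) for both the PPG window and the accelerometer window; the probability threshold and the BPM grid are assumed values rather than details from the description above.

    import numpy as np

    def select_heart_rate(ppg_scores, accel_scores, bpm_values, threshold=0.5):
        """Pick a heart rate after excluding motion-mimicking candidates."""
        ppg_scores = np.asarray(ppg_scores, dtype=float).copy()
        accel_scores = np.asarray(accel_scores, dtype=float)
        # Blocks 116/118: any candidate for which the accelerometer signal
        # mimics a heart rate is removed from consideration.
        ppg_scores[accel_scores >= threshold] = 0.0
        # Block 120: select the remaining candidate with the highest
        # heart rate confidence score.
        best = int(np.argmax(ppg_scores))
        return bpm_values[best], ppg_scores[best]

    # Hypothetical usage:
    # bpm_values = np.arange(30, 301)
    # heart_rate, confidence = select_heart_rate(ppg_scores, accel_scores, bpm_values)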
The ANN model 204, in one example, is an end-to-end neural network having an architecture that includes a series of convolution layers 206 followed by a fast Fourier transform (FFT) layer 208 and a dense decoding layer 210. As will be appreciated, the ANN model 204 can include additional components used to generate a heart rate probability. The PPG and accelerometer data 202 can be separately provided as input to the ANN model 204, and the architecture of the ANN model 204 can be configured to generate a heart rate probability 212. In one example, the accelerometer data can correspond to a time frame of the PPG data, such that the PPG data and the accelerometer data are captured during the same time frame (e.g., a ten, twenty, or sixty second time frame). In another example, a previously captured dataset of accelerometer data (for a specific user or from a plurality of users) can be used as input to the ANN model 204.
As illustrated, the architecture of the ANN model 204 includes a series of convolution layers 206. The series of convolution layers 206 can include any number of convolution layers. In a specific example of the architecture of the ANN model 204, the series of convolution layers 206 can include three convolution layers. In some examples, the series of convolution layers 206 may be a first series of convolution layers that precedes the FFT layer 208, and the architecture of the ANN model 204 can include a second series of convolution layers (not shown) located between the FFT layer 208 and the dense decoding layer 210. The second series of convolution layers can be used to identify and remove artifacts from a Fourier transform output by the FFT layer 208.
The convolution layers 206 of the PPG data trained ANN model 204 can be configured to identify a PPG signal in PPG data and an acceleration signal in accelerometer data. The PPG data can be obtained from a PPG sensor, such as a heart rate monitor device or pulse oximeter monitor device. A PPG is an optically obtained plethysmogram used to detect blood volume changes in the microvascular bed of tissue of a subject. A PPG sensor illuminates the skin and measures changes in light absorption to monitor the perfusion of blood to the dermis and subcutaneous tissue of the skin. The PPG sensor detects a change in blood volume and measures an amount of light either transmitted or reflected to a photodiode. The PPG sensor generates PPG data containing a PPG signal or PPG waveform where each cardiac cycle appears as a peak in the PPG signal. Accelerometer data can be obtained from an accelerometer device. The accelerometer device measures proper acceleration by detecting vibration, and can in some cases, detect both the magnitude and the direction of the proper acceleration. The convolution layers 206 of the PPG data trained ANN model 204 can analyze the PPG and accelerometer data 202 obtained from the PPG sensor and accelerometer to identify signals that correspond to heart rates.
Placing the FFT layer 208 after the series of convolution layers 206 and before the dense decoding layer 210 improves performance of predicting heart rates using the ANN model 204. In particular, applying a fast Fourier transform to a signal output by the series of convolution layers 206 reduces a number of parameters that are provided to the dense decoding layer 210 of the ANN model 204. By reducing the number of parameters provided as input to the dense decoding layer 210, an amount of data processed by the dense decoding layer 210 is decreased, which results in a shorter amount of time to generate heart rate probabilities 212.
Also, placing the FFT layer 208 after the series of convolution layers 206 and before the dense decoding layer 210 improves accuracy of heart rate predictions output by the ANN model 204. More specifically, applying a fast Fourier transform to signals output by the series of convolution layers 206 allows frequencies to be quantized, thereby restricting a number of possible frequency values that can be classified as heart rate values. The signals output by the FFT layer 208 provide signal representations that are easier to decode and increase the accuracy of a heart rate prediction as compared to using alternative techniques. For example, using the alternative technique of a mean squared error function as a loss function tends to pull values toward the mean, which creates bias in PPG and accelerometer data 202. Using a fast Fourier transform technique reduces the chance of this bias. For example, a fast Fourier transform technique allows frequencies output by the FFT layer 208 to be classified as a probability distribution of heart rates, and allows for a maximum likelihood estimation to be applied to the probability distribution of heart rates to determine a heart rate probability 212.
The dense decoding layer 210 included in the ANN model 204 architecture can be configured to decode frequency representations output by the FFT layer 208 into heart rate predictions. In one example, the dense decoding layer 210 decodes the frequency representations into heart rate information (e.g., beats per minute (BPM)) used to generate a heart rate prediction. As an example, the dense decoding layer 210 selects a frequency representation (e.g., a harmonic frequency) output by the FFT layer 208 and applies a mask to the PPG frequency representation which is used to visualize the frequency representation as a heart rate value (e.g., 131, 132, or 133 BPM). Thereafter, the heart rate values can be scored to create a probability distribution that indicates a maximum likelihood of a heart rate, which can be output as a heart rate probability 212. In one example, after scoring the heart rate values, the heart rate values can be input to a softmax layer (not shown) that has one neuron for each heart rate value. The softmax layer can normalize the heart rate values to sum to a value of one (1), creating a probability distribution of heart rate values that indicates a maximum likelihood of a heart rate value.
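As a brief, non-limiting illustration of how a spectral peak maps to a heart rate value and how scored heart rate values become a probability distribution, consider the following sketch; the sampling rate and window length are assumed values.

    import numpy as np

    FS = 25.0               # assumed PPG sampling rate in Hz
    WINDOW_SAMPLES = 250    # assumed window length

    def bin_to_bpm(fft_bin):
        # A spectral component at f Hz corresponds to f * 60 beats per minute,
        # e.g., bin 22 of a 250-sample window at 25 Hz -> 2.2 Hz -> 132 BPM.
        freq_hz = fft_bin * FS / WINDOW_SAMPLES
        return freq_hz * 60.0

    def scores_to_distribution(hr_scores):
        # Softmax: normalize scored heart rate values so they sum to one,
        # producing the probability distribution used for maximum likelihood.
        z = np.exp(hr_scores - np.max(hr_scores))
        return z / z.sum()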
The following example is an illustration of an end-to-end ANN architecture configured to generate heart rate probabilities based on PPG and accelerometer data input. As will be appreciated, the example artificial neural network architecture shown in Example 1 is merely representative of an ANN architecture and is not meant to be limiting.
The ANN model 204 can be trained to generate heart rate probabilities using a training dataset of PPG data. The training dataset can comprise PPG data collected from subjects using a PPG sensor. The PPG data can then be split into a training dataset and a test dataset. In some examples, synthetic PPG data can be generated to supplement the training dataset. For example, synthetic PPG data can be generated to have a uniform heart rate that is between 30 and 300 beats per minute (BPM). Also, additional synthetic PPG data containing noise and no PPG signal can be added to the training dataset to train the ANN model 204 to indicate an uncertainty of a true heart rate. As an example, synthetic PPG data can be labeled with a zero (0) heart rate to correspond to an unknown value. In one example, the ANN model 204 can be trained using categorical cross entropy to label PPG data in the training dataset to a heart rate category and an Adam optimizer to update the weights of the ANN model 204. As will be appreciated, techniques other than those described above can be used to train the ANN model 204.
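The following is a minimal sketch of the training setup described above, under the assumption of a Keras-style model such as the earlier architecture sketch; the sampling rate, window length, synthetic waveform shape, and noise levels are illustrative assumptions.

    import numpy as np
    import tensorflow as tf

    FS, WINDOW_SAMPLES = 25.0, 250   # assumed sampling rate and window length

    def synthetic_ppg_example():
        """One synthetic training window with a uniformly drawn heart rate."""
        t = np.arange(WINDOW_SAMPLES) / FS
        bpm = np.random.uniform(30, 300)
        # Fundamental plus a weaker harmonic, with additive noise.
        x = np.sin(2 * np.pi * (bpm / 60.0) * t)
        x += 0.3 * np.sin(2 * np.pi * 2 * (bpm / 60.0) * t)
        x += 0.1 * np.random.randn(WINDOW_SAMPLES)
        label = int(round(bpm)) - 29   # classes 1..271; class 0 is "unknown"
        return x.astype("float32"), label

    def noise_only_example():
        """Noise with no PPG signal, labeled zero to express uncertainty."""
        return np.random.randn(WINDOW_SAMPLES).astype("float32"), 0

    # model = build_hr_model()   # e.g., the earlier architecture sketch
    # model.compile(optimizer=tf.keras.optimizers.Adam(),
    #               loss="sparse_categorical_crossentropy")  # cross entropy with integer labels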
The various processes and/or other functionality described above may be executed on one or more processors that are in communication with one or more memory modules. The processing system 200 may include one or more computing devices. In some examples, the processing system 200 can include a plurality of data stores used to store PPG and accelerometer data 202 and/or heart rate probabilities 212 output by the PPG data trained ANN model 204. The term “data store” may refer to any device or combination of devices capable of storing, accessing, organizing and/or retrieving data. The storage system components of the data store may include storage systems such as volatile or non-volatile RAM, hard-drive type media, and a cloud storage network. The data store may be representative of a plurality of data stores as can be appreciated.
In some examples, the processing system 200 may include a network for transmitting data between servers, clients, and devices. The network may include any useful computing network, including an intranet, the Internet, a local area network, a wide area network, a wireless data network, or any other such network or combination thereof. Components utilized for such a system may depend at least in part upon the type of network and/or environment selected. Communication over the network may be enabled by wired or wireless connections and combinations thereof.
Preprocessing of data 302/312 can include one or more preprocessing steps. In one example, the preprocessing steps can include: (i) calculating a derivative of a signal to accentuate high frequency components in the signal, (ii) clipping the signal to remove outlier data included in the data 302/312, and (iii) normalizing the signal to a standard deviation.
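A minimal NumPy sketch of these three preprocessing steps follows; the clipping bound and the target of unit standard deviation are assumptions, since the description above does not fix specific values.

    import numpy as np

    def preprocess(signal, clip_sigma=4.0):
        # (i) Derivative accentuates high-frequency components of the signal.
        x = np.diff(np.asarray(signal, dtype=float))
        # (ii) Clip outliers, here at an assumed +/- 4 standard deviations.
        sd = x.std()
        if sd > 0:
            x = np.clip(x, -clip_sigma * sd, clip_sigma * sd)
        # (iii) Normalize the signal to (an assumed unit) standard deviation.
        sd = x.std()
        return x / sd if sd > 0 else x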
Prior predictions of heart rates can be used as part of generating a current heart rate probability in a number of ways. In one example, a series of sine waves corresponding to a fundamental frequency and harmonic frequencies of a prior heart rate prediction can be summed. The resulting sum provides a prior heart rate template 410 which can be passed to the ANN model 400. One method that can be used to pass a prior heart rate template 410 to the ANN model 400 includes concatenating a Fourier transform of the prior heart rate template 410 to the Fourier transform output by the FFT layer 406 of the ANN model 400. As an illustration, PPG data 402 included in a training dataset can be input to a series of convolution layers 404 to remove artifacts and clean up the PPG signal. The FFT layer 406 can be applied to the PPG signal to produce a Fourier transform of the PPG signal. A Fourier transform of the prior heart rate template 410 can be produced and concatenated 408 to the Fourier transform of the PPG signal. The resulting concatenated Fourier transform, comprising PPG frequency representations of the PPG data 402 and the prior heart rate template 410, can be input to a dense decoding layer 412 of the ANN model 400. The dense decoding layer 412 decodes the PPG frequency representations, as described earlier, and outputs a heart rate probability 414.
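A hypothetical sketch of the template construction and concatenation is shown below; the sampling rate, window length, and number of harmonics are assumed values.

    import numpy as np

    FS, WINDOW_SAMPLES = 25.0, 250   # assumed sampling rate and window length

    def prior_heart_rate_template(prior_bpm, n_harmonics=3):
        """Sum sine waves at the fundamental and harmonic frequencies of a prior prediction."""
        t = np.arange(WINDOW_SAMPLES) / FS
        f0 = prior_bpm / 60.0
        return sum(np.sin(2 * np.pi * k * f0 * t) for k in range(1, n_harmonics + 1))

    def concatenate_with_ppg_fft(ppg_fft_magnitude, prior_bpm):
        # Concatenate the Fourier transform of the prior heart rate template
        # to the Fourier transform output by the FFT layer; the result is the
        # input to the dense decoding layer.
        template_fft = np.abs(np.fft.rfft(prior_heart_rate_template(prior_bpm)))
        return np.concatenate([np.asarray(ppg_fft_magnitude, dtype=float), template_fft])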
The following example illustrates an end-to-end ANN architecture configured to incorporate prior heart rate information into an ANN model to generate a heart rate probability. As will be appreciated, the example artificial neural network architecture shown in Example 2 is merely representative of an ANN architecture and is not meant to be limiting.
Example 3 below illustrates an ANN architecture configured to generate a heartbeat confidence score. As will be appreciated, the example ANN architecture shown in Example 3 is merely representative and is not meant to be limiting.
After generating heartbeat confidence scores 512/514 from the PPG data 502 and the accelerometer data 504, a heart rate selection method can be used to identify and remove a motion-based signal from consideration as a true heart rate and/or remove the motion-based signal from the PPG data. For example, as in block 516, it is believed that the heart rate selection method can remove the motion-based signal from the PPG data that corresponds to the accelerometer heartbeat confidence scores.
One or more PPG signals contained in the PPG dataset input to the ANN model can be classified as predicted heart rates. The method 600 can be used to correct a heart rate prediction by identifying a motion-based signal included in the PPG dataset and removing the motion-based signal. More specifically, as in block 630, accelerometer data generated by an accelerometer included in the device can be obtained, and as in block 640, the accelerometer data can be input to the ANN model, which classifies an acceleration signal contained in the accelerometer data as a heart rate, thereby indicating that movement represented by the acceleration signal mimics a heart rate. Thereafter, as in block 650, a PPG signal included in the PPG dataset can be identified as corresponding to the acceleration signal that mimics the heart rate, and as in block 660, the PPG signal can be removed from consideration as a heart rate probability and/or removed from the PPG dataset to correct the heart rate prediction.
In one example configuration, a prior heart rate prediction (e.g., an immediate prior heart rate prediction or a prior heart rate prediction within a defined time frame) output by the ANN model can be obtained, and the prior heart rate prediction can be compared to a current heart rate prediction output by the ANN model to determine whether the current heart rate prediction is within a quality threshold of the prior heart rate prediction. For example, the prior heart rate prediction can be compared to the current heart rate prediction by (i) generating a heart rate distribution, wherein the current heart rate prediction is multiplied by a Gaussian function that has a mean value equal to the prior heart rate prediction, (ii) calculating an argmax of the heart rate distribution to produce the heart rate prediction, and/or (iii) accepting the heart rate prediction if its confidence score is greater than the quality threshold. In the case that a prior heart rate prediction is unavailable, a heart rate distribution can be generated by multiplying a current heart rate prediction by an identity vector, and calculating an argmax of the heart rate distribution to produce the heart rate prediction.
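A hypothetical NumPy sketch of steps (i) through (iii) follows; the Gaussian width and the quality threshold are assumed values, and the heart rate distribution is assumed to be indexed by BPM values.

    import numpy as np

    def refine_with_prior(current_dist, bpm_values, prior_bpm=None,
                          sigma=5.0, quality_threshold=0.2):
        """Weight a current heart rate distribution by a prior prediction."""
        current_dist = np.asarray(current_dist, dtype=float)
        bpm_values = np.asarray(bpm_values, dtype=float)
        if prior_bpm is None:
            # No prior prediction available: multiply by an identity vector.
            weights = np.ones_like(current_dist)
        else:
            # (i) Gaussian with a mean equal to the prior heart rate prediction.
            weights = np.exp(-0.5 * ((bpm_values - prior_bpm) / sigma) ** 2)
        weighted = current_dist * weights
        # (ii) Argmax of the weighted distribution produces the prediction.
        idx = int(np.argmax(weighted))
        prediction, confidence = bpm_values[idx], weighted[idx]
        # (iii) Accept the prediction only if the confidence exceeds the threshold.
        return (prediction, confidence) if confidence > quality_threshold else (None, confidence)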
A memory device 720 may contain modules 724 that are executable by the processor(s) 712 and data for the modules 724. The modules 724 can include ANN modules, convolution modules, fast Fourier transform modules, dense decoding modules, and other modules. The modules 724 may execute the functions described earlier. A data store 722 may also be located in the memory device 720 for storing data related to the modules 724 and other applications along with an operating system that is executable by the processor(s) 712.
Other applications may also be stored in the memory device 720 and may be executable by the processor(s) 712. Components or modules discussed in this description may be implemented in the form of software using high-level programming languages that are compiled, interpreted, or executed using a hybrid of the methods.
The computing device 710 may also have access to I/O (input/output) devices 714 that are usable by the computing device 710. One example of an I/O device is a display screen 730 that is accessible to the computing device 710. Networking devices 716 and similar communication devices may be included in the computing device 710. The networking devices 716 may be wired or wireless networking devices that connect to the internet, a LAN, WAN, or other computing network.
The components or modules that are shown as being stored in the memory device 720 may be executed by the processor(s) 712. The term “executable” may mean a program file that is in a form that may be executed by a processor 712. For example, a program in a higher level language may be compiled into machine code in a format that may be loaded into a random access portion of the memory device 720 and executed by the processor 712, or source code may be loaded by another executable program and interpreted to generate instructions in a random access portion of the memory device 720 to be executed by the processor(s) 712. The executable program may be stored in any portion or component of the memory device 720. For example, the memory device 720 may be random access memory (RAM), read only memory (ROM), flash memory, a solid state drive, memory card, a hard drive, optical disk, floppy disk, magnetic tape, or any other memory components.
The processor(s) 712 may represent multiple processors and the memory device 720 may represent multiple memory units that operate in parallel to the processing circuits. This may provide parallel processing channels for the processes and data in the computing device 710. The local communication interface 718 may be used as a network to facilitate communication between any of the multiple processors and multiple memories. The local communication interface 718 may use additional systems designed for coordinating communication such as load balancing, bulk data transfer, and similar systems.
While the flowcharts presented for this technology may imply a specific order of execution, the order of execution may differ from what is illustrated. For example, the order of two or more blocks may be rearranged relative to the order shown. Further, two or more blocks shown in succession may be executed in parallel or with partial parallelization. In some configurations, one or more blocks shown in the flow chart may be omitted or skipped. Any number of counters, state variables, warning semaphores, or messages might be added to the logical flow for purposes of enhanced utility, accounting, performance, measurement, troubleshooting, or for similar reasons.
Some of the functional units described in this specification have been labeled as modules in order to more particularly emphasize their implementation independence. For example, a module may be implemented as a hardware circuit comprising custom VLSI circuits or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components. A module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices, or the like.
Modules may also be implemented in software for execution by various types of processors. An identified module of executable code may, for instance, comprise one or more blocks of computer instructions, which may be organized as an object, procedure, or function. Nevertheless, the executables of an identified module need not be physically located together, but may comprise disparate instructions stored in different locations which comprise the module and achieve the stated purpose for the module when joined logically together.
Indeed, a module of executable code may be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices. Similarly, operational data may be identified and illustrated herein within modules and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices. The modules may be passive or active, including agents operable to perform desired functions.
The technology described here may also be stored on a computer readable storage medium that includes volatile and non-volatile, removable and non-removable media, implemented with any technology for the storage of information such as computer readable instructions, data structures, program modules, or other data. Computer readable storage media includes, but is not limited to, a non-transitory machine readable storage medium, such as RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tapes, magnetic disk storage or other magnetic storage devices, or any other computer storage medium which may be used to store the desired information and described technology.
The devices described herein may also contain communication connections or networking apparatus and networking connections that allow the devices to communicate with other devices. Communication connections are an example of communication media. Communication media typically embodies computer readable instructions, data structures, program modules and other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. A “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example and not limitation, communication media includes wired media such as a wired network or direct-wired connection and wireless media such as acoustic, radio frequency, infrared and other wireless media. The term computer readable media as used herein includes communication media.
Reference was made to the examples illustrated in the drawings and specific language was used herein to describe the same. It will nevertheless be understood that no limitation of the scope of the technology is thereby intended. Alterations and further modifications of the features illustrated herein and additional applications of the examples as illustrated herein are to be considered within the scope of the description.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more examples. In the preceding description, numerous specific details were provided, such as examples of various configurations to provide a thorough understanding of examples of the described technology. It will be recognized, however, that the technology may be practiced without one or more of the specific details, or with other methods, components, devices, etc. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring aspects of the technology.
Although the subject matter has been described in language specific to structural features and/or operations, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features and operations described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. Numerous modifications and alternative arrangements may be devised without departing from the spirit and scope of the described technology.
Related application data: Application No. 63049585, filed Jul. 2020, US.