Closed-loop wearable sensor and method

Abstract
Methods and electronic devices for measuring motion and acoustic signatures of physiological processes of a human subject. The method includes measuring motion and acoustic signatures of physiological processes of a human subject; calculating first feature-related data from the motion and acoustic signatures; sending the first feature-related data to a machine learning service; and determining a predicted detection of human scratching activity by the machine-learning service by performing a machine-learning operation on the feature-related data.
Description
Claims
  • 1. An electronic device for measuring motion and acoustic signatures of physiological processes of a human subject, comprising: an inertial measurement unit (IMU) for detecting the motion and acoustic signatures; a microcontroller unit (MCU) communicatively coupled to the IMU; wherein the IMU is operably in direct mechanical communication with the skin of the subject; wherein the MCU calculates feature-related data from the motion and acoustic signatures; wherein the MCU sends the feature-related data to a machine learning service executing on the MCU; and wherein the machine-learning service predicts the detection of human scratching activity by the human subject by performing a first machine-learning operation on the feature-related data.
  • 2. The device of claim 1, wherein the electronic device also comprises a vibratory motor for alerting the human subject.
  • 3. The device of claim 2, wherein the MCU turns on the vibratory motor when human scratching activity is detected.
  • 4. The device of claim 1, wherein the MCU calculates a second feature-related data from the motion and acoustic signatures.
  • 5. The device of claim 4, wherein the second feature-related data is used to predict that the human subject is asleep.
  • 6. The device of claim 5, wherein a second machine learning operation is used to detect scratching activity if the subject is predicted to be asleep.
  • 7. The device of claim 5, wherein a pattern of vibratory motor operation is determined based on whether the subject is predicted to be asleep.
  • 8. The device of claim 1, wherein the detection of human scratching activity comprises detecting a start of the scratching activity, associating time one with the start of the scratching activity, detecting an end of the scratching activity, and associating time two with the end of the scratching activity.
  • 9. The device of claim 8, wherein the MCU calculates a scratch duration by subtracting time one from time two.
  • 10. The device of claim 4, wherein the second feature-related data is used to determine the intensity of the detected scratching activity.
  • 11. A method, comprising: measuring motion and acoustic signatures of physiological processes of a human subject; calculating a first feature-related data from the motion and acoustic signatures; sending the first feature-related data to a machine learning service; and determining a predicted detection of human scratching activity by the machine-learning service by performing a machine-learning operation on the feature-related data.
  • 12. The method of claim 11, wherein a second feature-related data is calculated from the motion and acoustic signatures.
  • 13. The method of claim 12, wherein the second feature-related data is used to predict that the human subject is asleep.
  • 14. The method of claim 11 further comprising providing haptic feedback to the human subject upon detecting human scratching activity.
  • 15. The method of claim 11, wherein measuring the motion and acoustic signatures involves measuring the signatures by an inertial measurement unit, and determining the predicted detection involves executing the machine-learning operation on a microcontroller unit, wherein the inertial measurement unit and the microcontroller unit are located on the sensor.
  • 16. The method of claim 11 wherein the machine learning operation includes at least one of a neural network, a logistic regression classifier, and a random forest classifier.
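The processing pipeline recited in claims 11 and 16 can be illustrated with a minimal sketch: extract feature-related data from a window of IMU samples, apply a logistic-regression classifier (one of the operations named in claim 16), and compute scratch duration per claims 8 and 9. All function names, the chosen features, and the weights below are hypothetical illustrations, not part of the claims or any disclosed embodiment.

```python
import math

def extract_features(samples):
    """Illustrative feature-related data (mean, RMS, peak) computed
    from a window of IMU accelerometer magnitudes."""
    n = len(samples)
    mean = sum(samples) / n
    rms = math.sqrt(sum(x * x for x in samples) / n)
    peak = max(abs(x) for x in samples)
    return [mean, rms, peak]

def predict_scratching(features, weights, bias):
    """Logistic-regression classifier: returns the predicted
    probability that the window contains scratching activity."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

def scratch_duration(time_one, time_two):
    """Claim 9: duration is time two minus time one (start to end)."""
    return time_two - time_one
```

In practice the classifier weights would be learned offline from labeled IMU recordings and the probability thresholded on the MCU to decide whether to drive the vibratory motor of claims 2 and 3.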
Provisional Applications (1)
Number Date Country
63321737 Mar 2022 US