The present disclosure relates generally to the monitoring of subject motion and the processing of motion data. More specifically, the present disclosure relates to the processing of subject motion data to monitor subjects susceptible to epileptic events and to detect the occurrence of seizures.
Epilepsy may be characterized by episodes of disturbed brain activity that cause changes in attention or behavior. Increased heart rate, changes in electrocardiogram (ECG) data, changes in electroencephalography (EEG) data, and changes in movement may be correlated to an onset or an occurrence of a seizure. Information can be obtained from an EEG and other sources to characterize and measure the electrical activity of a subject's brain, and the information can be further analyzed to detect the occurrence of a seizure. Likewise, information can be obtained from an ECG, a heart rate monitor, and other sources to characterize and measure electrical activity of a subject's heart, and the information can be further analyzed to detect the occurrence of a seizure.
Seizure-related motion can be exhibited in a variety of body motions, ranging from an episode of no motion or minimal motion to an episode of severe shaking or other extreme movements. Examining motion to detect seizure-related motions is difficult because normal body motions include many motions that mimic or appear similar to seizures. The effective management of epilepsy often necessitates reliable long-term monitoring of seizures, usually over days and months. Although visual inspection of EEG signals is the current gold standard for seizure detection in supervised environments such as an epilepsy monitoring unit or an intensive care unit where the subject is mostly stationary, it is not practical to use this approach to objectively quantify long-term seizure frequency, especially when the subject is mobile. A current approach to tracking long-term seizure frequency is the maintenance of seizure diaries. However, it has been shown that self-reporting of seizure incidence is severely inaccurate. In this context, seizure detection via the detection of autonomic signatures, such as cardiac or motor signals that are altered by seizures, presents itself as a viable alternative for long-term monitoring of seizures. This approach becomes even more attractive for monitoring the pediatric epilepsy population, especially during the night when supervision is reduced and the risk of SUDEP (Sudden Unexplained Death in Epilepsy Patients) is high. Wearable devices that chronically monitor cardiac or motor signals associated with seizures can be implemented with greater ease than EEG-based devices, can significantly improve the overall quality of life of patients and caregivers, and can provide an objective way for physicians to track their patients' seizures. Seizures that express themselves in movements or seizures that disturb normal movement patterns can be detected. However, with motion-based seizure detection, motions at any point on the body can influence the motions detected at the point of measurement, and a multitude of normal motions can provide data that overlaps or obscures motion data that can be attributed to a seizure.
Accordingly, what are needed are methods and systems that provide improved measurements of body motion that facilitate the detection of seizures, and that provide and implement improved motion data processing techniques that can be used to identify seizure-related motions within motion data sets containing a multitude of normal motions. Also needed are methods and systems that provide improved resolution of subject motion data to distinguish seizure-related motion from normal motion so as to minimize false positives that may wrongly report the occurrence of a seizure. It is believed that an improved detection of seizure events with motion data and an improved processing of motion data to identify seizures and eliminate or reduce false positives will assist in the diagnosis and treatment of motion-affecting disease states, such as epilepsy, help persons suffering from epilepsy better manage their lives, and assist caregivers in the monitoring of people susceptible to seizures.
To address these and other unmet needs, the present disclosure provides, in exemplary non-limiting embodiments, systems, devices, and methods for effective seizure detection via the detection of the motion of a subject. In particular, the present disclosure is directed to, among other things, the use of a motion monitoring device to assess motion parameters selected to distinguish between seizure and non-seizure motions.
In at least one embodiment, described further below, a method of distinguishing between a first type of motion and a second type of motion of a subject is disclosed. The first and second types of motion may be characterized by a signal corresponding to the first and second types of motions. The method may include the step of receiving the signal at a processor with the signal being representative of subject motion data of the subject and with the subject motion data including subject position data and subject change-in-position data, and using the processor to analyze the subject motion data to distinguish between the first type of motion occurring over a first time period and the second type of motion occurring over a second time period. The method may characterize the first type of motion as having a first bandwidth that is inclusively within a first bandwidth range, as including subject position data that indicates that the subject is in a recumbent orientation throughout the first time period with the recumbent orientation defined by an initial calibration during which the subject is in the recumbent position while defining an offset angle between a subject axis extending from the subject and a vertical axis and with the recumbent orientation further defined by the subject axis remaining inclusively within an offset angle range throughout the first time period, and as including subject change-in-position data that indicates that a first rotation parameter of the subject change-in-position data is inclusively within a rotation range throughout the first time period. The method may also characterize the second type of motion as having a second bandwidth that is inclusively within a second bandwidth range, as including subject position data that indicates that the subject is in an upright orientation throughout the second time period with the upright orientation defined by the offset angle equaling or exceeding an offset angle threshold throughout the second time period, and as including subject change-in-position data that indicates that a second rotation parameter of the subject change-in-position data is greater than a rotation threshold throughout the second time period. The method may further include the generating of a first output from the processor in response to an identification of the first type of motion and the generating of a second output from the processor in response to an identification of the second type of motion.
In at least another embodiment, described further below, a method of detecting a neurological condition of a subject is disclosed. The method may include receiving a signal from the subject at a processor with the signal being representative of subject motion data of the subject and with the subject motion data including subject position data and subject change-in-position data, and may include using a processor to analyze the subject motion data to identify a seizure motion occurring over a first time period and a non-seizure motion occurring over a different second time period. The method may characterize the seizure motion as having a first bandwidth that is inclusively within a first bandwidth range, as including subject position data that indicates that the subject is in a recumbent orientation for at least a portion of the first time period with the recumbent orientation defined by an initial calibration during which the subject is in the recumbent position while defining an offset angle between a subject axis extending from the subject and a vertical axis and with the recumbent orientation further defined by the subject axis remaining inclusively within an offset angle range for at least a portion of the first time period and with subject change-in-position data indicating that a first rotation parameter of the subject change-in-position data is inclusively within a rotation range for at least a portion of the first time period. The method may also characterize the non-seizure motion as having a second bandwidth that is inclusively within a second bandwidth range, as including subject position data indicating that the subject is in an upright orientation for at least a portion of the second time period with the upright orientation defined by the offset angle equaling or exceeding an offset angle threshold for at least a portion of the second time period, and as including subject change-in-position data indicating that a second rotation parameter of the subject change-in-position data is greater than a rotation threshold for at least a portion of the second time period. The method may further include generating a first output from the processor in response to an identification of the seizure motion and generating a second output from the processor in response to an identification of the non-seizure motion.
In yet another embodiment, described further below, a motion monitoring system for monitoring a motion of a subject is disclosed. The motion monitoring system may include a housing, a mounting system configured to couple the housing to the subject, an accelerometer disposed on the housing with the accelerometer configured to obtain subject motion data and with the subject motion data including subject position data and subject change-in-position data, and a processor configured to analyze the subject motion data to distinguish between a first type of motion occurring over a first time period and a second type of motion occurring over a second time period. The motion monitoring system may characterize the first type of motion as having a first bandwidth that is inclusively within a first bandwidth range, as including subject position data indicating that the subject is in a recumbent orientation throughout the first time period with the recumbent orientation defined by an initial calibration during which the subject is in the recumbent position while defining an offset angle between a subject axis extending from the subject and a vertical axis and with the recumbent orientation further defined by the subject axis remaining inclusively within an offset angle range throughout the first time period, and as including subject change-in-position data indicating that a first rotation parameter of the subject change-in-position data is inclusively within a rotation range throughout the first time period. The motion monitoring system may further characterize the second type of motion as having a second bandwidth that is inclusively within a second bandwidth range, as including subject position data indicating that the subject is in an upright orientation throughout the second time period with the upright orientation defined by the offset angle equaling or exceeding an offset angle threshold throughout the second time period, and as including subject change-in-position data indicating that a second rotation parameter of the subject change-in-position data is greater than a rotation threshold throughout the second time period. The motion monitoring system may further include an interface that is responsive to the processor, with the interface providing a first output from the processor in response to an identification of the first type of motion and providing a second output from the processor in response to an identification of the second type of motion.
In each of these embodiments, and in others, described below, the first type or seizure type of motion and the second type or the non-seizure type of motion may be characterized or further characterized by one or more of five motion parameters provided in the subject motion data, including the amplitude or magnitude of the detected motion, the period or frequency of the detected motion, the bandwidth of the detected motion, the position, orientation, or posture of the subject during the detected motion, and changes in the position, orientation, or posture of the subject during the detected motion. Those features may be expressed as values that include amplitude, period, bandwidth, offset angle, and rotation, and that may further include magnitude and frequency. Those values may be compared to ranges or thresholds to determine whether the detected motion is a first type of motion, a second type of motion, a seizure motion, or a non-seizure motion. Those ranges may include an amplitude range, a period range, a bandwidth range, an offset angle range or threshold, and a rotation range or threshold, and may further include a range or threshold expressed as a magnitude or a frequency.
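By way of a non-limiting illustration, the five motion parameters described above may be gathered into a single record before being compared to the ranges and thresholds. The following sketch is not part of the disclosed embodiments; the field names are merely hypothetical labels for the parameters recited above.

```python
from dataclasses import dataclass

@dataclass
class MotionFeatures:
    """Hypothetical record of the five motion parameters discussed above."""
    amplitude_g: float       # amplitude or magnitude of the detected motion, in g
    period_ms: float         # period (inverse of frequency) of the detected motion, in ms
    bandwidth: float         # dimensionless bandwidth of the detected motion
    offset_angle_deg: float  # position/orientation: angle between subject axis and vertical axis
    rotation_deg: float      # change in position/orientation over the evaluation window
```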
The ranges and thresholds (for use in comparison to the detected motion) that are associated with the first type or the seizure type of motion may be preferred values that include: a first amplitude range that is at least one of 0.01 g to 0.60 g and 0.04 g to 0.48 g, a first period range that is at least one of 100 ms to 1000 ms and 160 ms to 750 ms, a first bandwidth range that is at least one of 0.05 to 0.60 and 0.10 to 0.50, an offset angle range that is at least one of zero degrees to 45 degrees and zero degrees to 60 degrees, and a rotation range that is at least one of zero degrees to 30 degrees and zero degrees to 20 degrees. The preferred ranges and thresholds associated with the first type or the seizure type of motion may be substituted by or used with alternative values that include: a first amplitude range that is at least one of 0.11 g to 0.50 g and 0.14 g to 0.38 g, a first period range that is at least one of 200 ms to 900 ms and 260 ms to 650 ms, a first bandwidth range that is at least one of 0.15 to 0.50 and 0.20 to 0.40, an offset angle range that is at least one of zero degrees to 35 degrees and zero degrees to 50 degrees, and a rotation range that is at least one of zero degrees to 20 degrees and zero degrees to 10 degrees.
The ranges and thresholds (for use in comparison to the detected motion) that are associated with the second type or the non-seizure type of motion may be preferred values that include: a second amplitude range that is at least one of 0.04 g to 1.00 g and 0.48 g to 1.00 g, a second period range that is at least one of 100 ms to 2000 ms and 100 ms to 1000 ms, a second bandwidth range that is at least one of zero to 0.80 and 0.10 to 0.80, an offset angle threshold that is at least one of 60 degrees and 45 degrees, and a rotation threshold that is at least one of 15 degrees and 30 degrees. The preferred ranges and thresholds associated with the second type or the non-seizure type of motion may be substituted by or used with alternative values that include: a second amplitude range that is at least one of 0.14 g to 0.90 g and 0.58 g to 0.90 g, a second period range that is at least one of 200 ms to 1900 ms and 200 ms to 900 ms, a second bandwidth range that is at least one of 0.10 to 0.70 and 0.20 to 0.70, an offset angle threshold that is at least one of 70 degrees and 55 degrees, and a rotation threshold that is at least one of 25 degrees and 40 degrees.
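For illustration only, the preferred ranges and thresholds recited above can be expressed as simple lookup structures against which a measured value is tested. The sketch below restates the preferred values from the two preceding paragraphs; the dictionary layout, the encoding of the one-sided non-seizure thresholds as ranges, and the helper function are assumptions rather than requirements of the disclosure.

```python
# Preferred ranges/thresholds restated from the text above; tuples are inclusive (low, high).
SEIZURE_RANGES = {
    "amplitude_g": (0.01, 0.60),       # alternatively (0.04, 0.48)
    "period_ms": (100.0, 1000.0),      # alternatively (160.0, 750.0)
    "bandwidth": (0.05, 0.60),         # alternatively (0.10, 0.50)
    "offset_angle_deg": (0.0, 45.0),   # alternatively (0.0, 60.0)
    "rotation_deg": (0.0, 30.0),       # alternatively (0.0, 20.0)
}

NON_SEIZURE_RANGES = {
    "amplitude_g": (0.04, 1.00),           # alternatively (0.48, 1.00)
    "period_ms": (100.0, 2000.0),          # alternatively (100.0, 1000.0)
    "bandwidth": (0.00, 0.80),             # alternatively (0.10, 0.80)
    "offset_angle_deg": (60.0, 180.0),     # threshold: offset angle equals or exceeds 60 degrees
    "rotation_deg": (15.0, float("inf")),  # threshold: rotation exceeds 15 degrees
}

def in_range(value: float, bounds: tuple) -> bool:
    """Inclusive range test applied to each of the five motion parameters."""
    low, high = bounds
    return low <= value <= high
```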
In each of these embodiments, and in others, described below, the detected motion may be compared to different sets of ranges and thresholds or combinations of ranges or thresholds to determine whether the motion is the first type or seizure type of motion or the second type or non-seizure type of motion. For example, subject motion data may be compared to the above-described ranges and thresholds relating to the preferred values for the first type of motion and compared to the above-described ranges and thresholds relating to the preferred values for the second type of motion to determine whether the detected motion is consistent with the first or second types of motion. In another example, subject motion data may be compared to the above-described ranges and thresholds relating to the alternative values for the first type of motion and compared to the above-described ranges and thresholds relating to the alternative values for the second type of motion to determine whether the detected motion is consistent with the first or second types of motion. In yet another example, subject motion data may be compared to the above-described ranges and thresholds relating to the preferred values for the first type of motion and compared to the above-described ranges and thresholds relating to the alternative values for the second type of motion to determine whether the detected motion is consistent with the first or second types of motion. In still another example, subject motion data may be compared to the above-described ranges and thresholds relating to the alternative values for the first type of motion and compared to the above-described ranges and thresholds relating to the preferred values for the second type of motion to determine whether the detected motion is consistent with the first or second types of motion. As further described below, the preferred and alternative ranges and thresholds can each include different sets of ranges or thresholds, thus providing two categories of preferred values and two categories of alternative values that can be selected when evaluating the detected motion. As can be appreciated, the ranges and thresholds provided in these categories can be used in their entireties or used piecemeal with values of different categories being used to evaluate motion. For example, subject motion data may be compared to the above-described ranges and thresholds relating to a first category of the preferred values for the first type of motion (noted as “first” amplitude, period, etc. in
In each of these embodiments, the comparison of the subject motion data to the selected ranges for the first/seizure type of motion and the second/non-seizure type of motion may be made to only a minimum or a maximum of the range instead of the entire range. The comparison may also be made over the entire duration of the motion time period, or for only a portion of the time period, or at only the beginning or end of the relevant time period, or using a combination of these comparison techniques. For example, the first amplitude may be compared to a minimum and/or a maximum of the first amplitude range, the first period may be compared to a minimum and/or a maximum of the first period range, the first bandwidth value may be compared to a minimum and/or a maximum of the first bandwidth range, the offset angle may be compared to a minimum and/or a maximum of the offset angle range, and the first rotation parameter may be compared to a minimum and/or a maximum of the rotation range. Likewise, in another example, the second amplitude may be compared to a minimum and/or a maximum of the second amplitude range, the second period may be compared to a minimum and/or a maximum of the second period range, and the second bandwidth may be compared to a minimum and/or a maximum of the second bandwidth range.
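A comparison made against only the minimum or only the maximum of a range, or over only a portion of the relevant time period, may be expressed with small helpers such as those sketched below; the helper names and the trailing-portion policy are assumptions made solely for illustration.

```python
def below_range_max(value: float, bounds: tuple) -> bool:
    """Compare against only the maximum of the range."""
    return value <= bounds[1]

def above_range_min(value: float, bounds: tuple) -> bool:
    """Compare against only the minimum of the range."""
    return value >= bounds[0]

def holds_throughout(samples: list, bounds: tuple) -> bool:
    """True if every sampled value over the time period lies inside the inclusive bounds."""
    low, high = bounds
    return all(low <= s <= high for s in samples)

def holds_over_tail(samples: list, bounds: tuple, tail_fraction: float = 0.25) -> bool:
    """True if the bounds hold over only a trailing portion of the time period
    (the 25% tail fraction is an assumed policy, not a requirement)."""
    start = int(len(samples) * (1.0 - tail_fraction))
    return holds_throughout(samples[start:], bounds)
```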
For each of these embodiments, an identification of a first type or a seizure type of motion, and an identification of a second type or a non-seizure type of motion may be provided as an output or as a report to the subject or to a caregiver.
The features, functions, and advantages of the disclosed embodiments can be achieved independently in various embodiments or may be combined in yet other embodiments, further details of which are disclosed with reference to the following description and drawings.
Illustrative embodiments are described herein. Particular illustrative embodiments of the present disclosure are described below with reference to the drawings. In the description, common elements are designated by common reference numbers throughout the drawings.
A medical device system may include a motion monitoring system to gather and monitor motion data associated with a subject and perform seizure detection using the subject motion data. The monitoring or sensor system may generate a signal corresponding to the motion data and communicate the motion data to a base station system. The base station system may comprise a stand-alone communication device, a cellphone (mobile phone) device, or a combination thereof. The motion data may be sent by the base station system to a remote computing device associated with a healthcare provider, or a manufacturer or distributor of the medical device system, to monitor the subject and perform additional medical diagnosis. The remote computing device may be associated with a caregiver (e.g., a relative or an emergency care facility) of the user. The motion data may be sent by the base station system to the remote computing device to alert the caregiver to a seizure event so that emergency medical services may be provided to the user. The motion data may be sent directly to a remote computing device to monitor, alert, or perform additional medical diagnoses. The motion data may also be processed at the sensor or at a portion of the sensor system containing or controlling the sensor, with the signal corresponding to the motion being transmitted to a base station or a remote computing device. Alternatively, the sensor or a portion of the sensor system containing or controlling the sensor may provide the data directly to the base station so as to perform most or all of the signal processing at the base station.
During operation, the sensor system 110 may be configured to detect and monitor movement of the user 108. For example, the sensor system 110 may include one or more accelerometers, as further described with reference to
A processor of the sensor system 110 may receive subject motion data associated with the subject 108 (e.g., from the accelerometer), or may receive accelerometer data from the accelerometer that includes subject motion data. The subject motion data or the acceleration data may include the subject position data and the subject change-in-position data. The subject motion data may be time sequenced and may relate to different periods of time. For example, the subject motion data may indicate acceleration data and timestamps associated with multiple measurement events. To illustrate, the accelerometer may detect the acceleration data periodically (e.g., at 10 millisecond intervals). The accelerometer data may be first and second accelerometer data and include, respectively, first subject motion data and first subject change-in-position data associated with a first timestamp (which may be expressed as a first time period) and second subject motion data and second change-in-position data associated with a second timestamp (which may be expressed as a second time period). The first timestamp may also indicate a first time period at which the first acceleration data is detected. The second timestamp may also indicate a second time period at which the second acceleration data is detected. As can be appreciated, the first acceleration data may correspond to a first type of motion by the subject that corresponds to a first period of time, and the second acceleration data may correspond to a second type of motion by the subject that corresponds to a second period of time. Furthermore, the first and second types of motions may be different types of motions and may concern different periods of time.
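As a non-limiting illustration, the timestamped acceleration samples described above may be buffered into consecutive time periods (windows) before further analysis. The 10 millisecond sample interval restates the example given above; the window length, class names, and grouping policy in the following sketch are assumptions.

```python
from dataclasses import dataclass
from typing import List

SAMPLE_INTERVAL_MS = 10  # accelerometer sampled periodically, e.g., at 10 ms intervals

@dataclass
class AccelSample:
    timestamp_ms: int  # timestamp at which the acceleration data was detected
    x: float           # acceleration along the x-axis, in g
    y: float           # acceleration along the y-axis, in g
    z: float           # acceleration along the z-axis, in g

def split_into_windows(samples: List[AccelSample], window_ms: int = 2000) -> List[List[AccelSample]]:
    """Group time-sequenced samples into consecutive time periods.

    A first window may hold data for a first type of motion and a second window
    data for a second type of motion; the 2-second window length is an assumption.
    """
    windows: List[List[AccelSample]] = []
    if not samples:
        return windows
    window_start = samples[0].timestamp_ms
    current: List[AccelSample] = []
    for sample in samples:
        if sample.timestamp_ms - window_start >= window_ms:
            windows.append(current)
            current = []
            window_start = sample.timestamp_ms
        current.append(sample)
    if current:
        windows.append(current)
    return windows
```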
The sensor system 110 may be configured to detect a body position (e.g., posture) of the subject 108 relative to a vertical axis 102 defined based on gravitational pull, and the body position of the subject can be evaluated to determine whether the subject is in a recumbent orientation or an upright orientation. In one embodiment, the subject 108 can be disposed on a plane or surface 104 supporting the subject and the plane 104 can be at a 90 degree or similar angle relative to the vertical axis 102 to allow determination of the subject's orientation. The sensor system 110 may determine the position or orientation of the user 108 based on an angle formed by a subject axis (e.g., a subject axis in a first orientation 106 or a subject axis in a second orientation 116) relative to the vertical axis 102 or relative to the plane 104. The subject axis 106, 116 can extend from the subject 108 or the sensor system 110 in a normal direction away from the subject 108 or sensor system 110 to provide a position or orientation of the subject 108 or sensor system 110 relative to a frame of reference defined by the vertical axis 102 or the plane 104. As can be appreciated, the subject axis 106, 116 may not be perpendicular or parallel to the vertical axis 102 or the plane 104 and may instead be offset by an angle that can be identified and accounted for when determining subject position or orientation relative to an external reference such as vertical axis 102 or plane 104. As can also be appreciated, the position or orientation of the subject axis 106, 116 and the offset or offset angle used to relate the subject axis 106, 116 to an external frame of reference may be based on an initial calibration of the sensor system 110 or the accelerometer or accelerometers supported by the sensor system 110 that is made when the subject 108 is in a known position or orientation to the vertical axis 102 or to the plane or surface 104. The calibration may provide a baseline orientation of the subject from which the offset or offset angle can be defined. The subject axis 106, 116 may be an axis extending from the subject 108 or the sensor system 110 that is compared to the vertical axis 102 or plane 104 to identify an offset or offset angle between the subject axis 106, 116 and the vertical axis 102 and to identify whether the subject is in a recumbent position or an upright position. The angle between the subject axis 106, 116 and the vertical axis 102 or the plane 104 may define an initial position of the subject axis 106, 116 relative to the vertical axis 102 or plane 104 and subsequent positions of the subject axis 106, 116 may be identified and compared to the vertical axis 102 or plane 104. Likewise, a similar comparison can be made to identify an initial position of the subject axis 106, 116 to determine any change in the subject's position over time. As can be appreciated, such change-in-position data can be compared to the first or second timestamps or to the first or second time periods to determine a rate at which the subject changes position.
The subject axis (e.g., the subject axis in the first orientation 106 or the subject axis in the second orientation 116) may extend away from the user 108 in a direction normal to a frontal plane of the user 108, which may be understood to be a coronal plane of the subject's body and may be further understood to be a plane that divides the subject's body into ventral and dorsal sections. The coronal plane of the subject can be parallel or nearly parallel to the plane 104 when the subject is in the recumbent orientation, and the coronal plane can extend in a direction that is parallel or nearly parallel to the vertical axis 102 when the subject is in the upright position. In a particular embodiment, the accelerometer may be coupled to the subject (e.g., the user 108) to define the subject axis. For example, the accelerometer may be coupled to a chest of the user 108 and the subject axis may extend away from the user 108 or the sensor system 110 in a direction perpendicular (normal) to the subject's chest. As another example, the accelerometer may be coupled to a back of the user 108 and the subject axis may extend away from the user in a direction perpendicular to the subject's back. In still another example, the accelerometer may be coupled to any portion of the subject to provide a subject axis extending away from the subject, and the subject axis established by the accelerometer may be calibrated with the vertical axis 102 or with some other reference point. The calibration may provide a baseline orientation of the subject from which the offset or offset angle can be defined. The accelerometer may detect acceleration about multiple axes to determine a relative orientation of the subject axis with respect to the vertical axis 102.
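One non-limiting way to realize the offset-angle determination described above is to compare the gravity direction currently sensed by the accelerometer with a reference direction captured during the initial calibration. The sketch below assumes that each accelerometer reading is dominated by gravity and uses hypothetical function names; it is illustrative only and is not part of the disclosed embodiments.

```python
import math

def _unit(v):
    magnitude = math.sqrt(sum(c * c for c in v))
    return tuple(c / magnitude for c in v)

def calibrate_reference(accel_xyz):
    """Record the subject-axis reference while the subject is held in a known
    orientation (e.g., recumbent), as in the initial calibration described above."""
    return _unit(accel_xyz)

def offset_angle_deg(accel_xyz, reference_xyz):
    """Angle, in degrees, between the gravity direction currently sensed by the
    accelerometer and the calibrated reference direction (which approximates the
    vertical axis when the calibration pose is recumbent)."""
    a = _unit(accel_xyz)
    r = _unit(reference_xyz)
    dot = max(-1.0, min(1.0, sum(ai * ri for ai, ri in zip(a, r))))
    return math.degrees(math.acos(dot))

# Illustrative use: reference captured while recumbent; a later reading tilted toward upright.
reference = calibrate_reference((0.02, 0.01, 0.99))
angle = offset_angle_deg((0.05, 0.95, 0.30), reference)  # roughly 70 degrees -> upright-like
```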
As illustrated in
The sensor system 110 may be configured to detect subject position (or posture) and subject change-in-position (or change of posture) over a time period or time window. The time period may be any suitable length of time extending between an initial determination of the subject position and a subsequent determination of the subject position and, as an example, can be 1 ms, 10 ms, 100 ms, 1 second, 10 seconds, 1 minute, or any increment between these values. The time period can be established to best suit the type of motion detected, the quality of the sensed motion data, the type of seizure to be detected, subject body type factors, the location of the motion monitoring system relative to the body, and the power levels and battery capacity of the system. As can be appreciated, a first time period may be associated with a first type of motion and a second time period may be associated with a second type of motion that may be a different type of motion than the first type of motion. As can be further appreciated, the time period can be constant for the first type of motion and the second type of motion, or the time window for the first type of motion can be different from the time window for the second type of motion. As can be further appreciated, the time periods for the first and second types of motion can be determined independently, be based on different parameters or values, share common parameters, or be constructed so that the time period for one type of motion is a function of the time period for another type of motion. The initial and subsequent determinations of subject position can each be relative to the vertical axis 102 or, alternatively, relative to each other. The initial and subsequent subject positions can define a varying angle between the initial and subsequent positions of the subject axis. The varying angle can have a value, in degrees, indicating how far the subject axis has moved over the period of time defined by the time period. For example, the posture of the user 108 may be changing from a recumbent or lying down orientation (e.g., as illustrated in
The sensor system 110 may be configured to detect an amplitude or magnitude 206 associated with the movement of the subject 108 and may be configured to generate a signal corresponding to the detected motion of the subject, such as the amplitude signal 1102 illustrated in
The sensor system 110 may be configured to detect a period or frequency 208 associated with the movement of the subject 108 and may be configured to generate a signal corresponding to the detected motion of the subject, such as the period signal 1104 illustrated in
The sensor system 110 may be configured to detect a bandwidth 210 associated with the movement of the subject 108 and may be configured to generate a signal corresponding to the detected motion of the subject, such as the bandwidth signal 1106 illustrated in
The sensor system 110 may be configured to detect a position or orientation (or posture) 212 of the subject 108 and may be configured to generate a signal corresponding to the detected position or orientation of the subject, such as the position signal 1108 illustrated in
The sensor system 110 may be configured to detect a change in position or a change in orientation (or change in posture) 214 of the subject 108 and may be configured to generate a signal corresponding to the detected change in position or change in orientation of the subject, such as the change-in-position signal 1110 illustrated in
In a particular embodiment, the processor may analyze the posture, the change of the posture, the amplitude, the period, the bandwidth, or a combination thereof, associated with a particular axis (e.g., the x-axis, the y-axis, or the z-axis) of the 3D domain. For example, the processor may determine an axis amplitude associated with each of the three axes. The processor may determine the axis amplitude based on a high peak and a low peak associated with each axis (e.g., the x-axis, the y-axis, or the z-axis) during a sample time window. As another example, the processor may determine an axis posture based on an axis orientation (e.g., an x-axis orientation, a y-axis orientation, or a z-axis orientation) relative to the vertical axis 102. As a further example, the processor may determine an axis change in posture based on a change of the axis orientation. As another example, the processor may determine an axis period based on a time difference associated with peaks or zero crossings corresponding to a particular axis (e.g., the x-axis, the y-axis, or the z-axis). As a further example, the processor may determine an axis bandwidth based on a ratio of a particular axis period and an average axis period.
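A simplified, non-limiting rendering of the per-axis analysis described above is provided below: axis amplitude from the high and low peaks within the sample window, axis period from the spacing of successive rising zero crossings, and axis bandwidth as a ratio involving a particular axis period and the average axis period. The mean removal, crossing detection, and exact bandwidth formula are assumptions, as the passage describes these computations only at a high level.

```python
from statistics import mean

def axis_amplitude(samples):
    """Axis amplitude from the high peak and low peak within the sample time window."""
    return max(samples) - min(samples)

def axis_period_ms(samples, sample_interval_ms=10.0):
    """Axis period estimated from the spacing of successive rising zero crossings
    of the mean-removed signal for one axis."""
    centered = [s - mean(samples) for s in samples]
    crossings = [i for i in range(1, len(centered))
                 if centered[i - 1] < 0.0 <= centered[i]]
    if len(crossings) < 2:
        return float("inf")  # no oscillation detected within this window
    spacings = [b - a for a, b in zip(crossings, crossings[1:])]
    return mean(spacings) * sample_interval_ms

def axis_bandwidth(axis_period_value, all_axis_periods):
    """One literal reading of 'a ratio of a particular axis period and an average
    axis period'; the disclosure does not pin down the exact bandwidth formula."""
    return axis_period_value / mean(all_axis_periods)
```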
In a particular embodiment, the processor may use axis measurements associated with a particular axis to distinguish between various types of motion (e.g., seizure motions and non-seizure motions). For example, when the user 108 is sleeping, one or more of the axis measurements (e.g., posture, change of posture, amplitude, period, or bandwidth) associated with the y-axis may have higher values than the axis measurements associated with the x-axis and the z-axis. The processor may use the axis measurements associated with the y-axis in distinguishing between the various types of motions, as further described with reference to
The motion of a subject can be monitored and measured in a variety of ways and with a variety of measurement devices. Preferably, the motion of a subject can be observed by a motion monitoring device or system oriented to receive data from the subject as the subject moves. The motion monitoring device can be configured to collect a single parameter of motion, such as velocity, or collect multiple parameters of motion, such as velocity and direction. The motion monitoring device can be configured to collect motion data that is combined with motion data obtained from another device, which can be another motion monitoring device or system, or a device or system that does not measure motion directly, such as a pressure sensor. The monitoring can be indirect, such as with a video or visual system that can record the motion of a subject from a distance. One example of an indirect motion monitoring device is a Kinect motion monitoring system provided with some Microsoft gaming systems, which remotely monitors the motion of persons operating the system. Other examples of indirect motion monitoring systems include a video camera and an RF motion detector. In still another example of an indirect motion monitoring system, the aforementioned monitoring systems can be mounted on the subject, e.g., as a camera on a helmet, and the indirect motion monitoring is derived from how the subject-mounted system moves relative to a stationary environment viewed by the monitoring system. The monitoring can also be direct, with the motion of the subject measured by the placement of a sensor on or in a defined relationship to the subject's body. Preferably, the direct motion monitoring system is an accelerometer affixed to the subject's body, preferably to a portion of the subject's body that is adjacent to the subject's skeleton. More preferably, the direct motion monitoring system is an accelerometer affixed to the surface of the subject's skin at the subject's chest, placed over the rib cage or over the sternum. The direct motion monitoring system may be coupled to the subject's limbs, such as the ankle or wrist. The direct motion monitoring system may be affixed to the subject with an adhesive or held fast with tape or a strap, or the system may be embedded in another structure such as on or within an item of clothing, jewelry, or a watch. The direct motion monitoring system may also have components that are implanted within a subject.
A motion monitoring system, such as an accelerometer, may monitor and provide motion data corresponding to activity and movement associated with a patient's body. The motion data may be used as an alternative to, or in addition to, other data (such as ECG data or EEG data) to identify a seizure. The motion monitoring system may be attached to an external surface of the patient's body. The motion data may be associated with movement of the patient's chest or with a change or a rate of change of a patient's position (such as associated with the patient moving from a lying position to a sitting position). The motion data may be compared with threshold values to distinguish a first type of motion (e.g., seizure motion) from a second type of motion (e.g., non-seizure motion). For example, the first type of motion may be associated with first threshold values of posture, amplitude, period, bandwidth, subject position or axis, and/or change or rate of change of the subject position or axis. As another example, the second type of motion may be associated with second threshold values of posture, amplitude, period, bandwidth, subject position or axis, and/or change or rate of change of the subject axis. The motion data may be analyzed to distinguish between the first type of motion and the second type of motion by determining whether the motion data satisfies one or more of the first threshold values or one or more of the second threshold values.
In a particular embodiment, the sensor system 110 may use one or more other sensors alternatively, or in addition to, the accelerometer to detect the subject motion data. The one or more other sensors may be coupled to a chest, a back, a shoulder, a side, or a limb of the user 108. One example of an additional sensor is a gyroscope that may be used to detect the subject motion data and, in particular, to detect the rotation of the subject.
In a particular embodiment, the one or more other sensors may gather data regarding the user 108 at a distance from the user 108. For example, the sensor system 110 may include or be coupled to a visualization device that may include a video device such as a camera or a thermal imaging system and/or may include or be coupled to a motion detector, a depth sensor, or an infrared laser device. As can be appreciated, in some embodiments the sensor system 110 may provide a two-dimensional image with subject motion data obtained via the video device and may provide data regarding a third dimension with the motion detector, depth sensor, or infrared laser device. The sensor system 110 may be located in a same room as the user 108. The sensor system 110 may periodically capture images of the user 108. The subject motion data may be generated by the sensor system 110 based on the images obtained by a video device alone or in association with other devices such as the motion detector, depth sensor, or infrared laser device. For example, the sensor system 110 may identify a frontal plane of the user 108 presented in a first image of the user 108 and may define the subject axis in the first orientation 106 extending in a direction normal to the frontal plane. As another example, the sensor system 110 may determine a position of the user 108 in at least three axes (e.g., the x-axis, the y-axis, and the z-axis) based on an analysis of the images obtained by the sensor system 110 or the camera of the sensor system 110. The sensor system 110 may also determine a posture, a change of the posture, an amplitude, a period, and a bandwidth associated with movements of the user 108 in at least three axes (e.g., the x-axis, the y-axis, and the z-axis) based on the analysis of the images obtained with the sensor system 110.
The sensor system 110 may be configured to distinguish between various types of motion by analyzing the subject motion data, as further described with reference to
Referring to
The table 200 (or 200′) includes a first column associated with a first type of motion 202/202′ (e.g., seizure motion) and a second column associated with a second type of motion 204/204′ (e.g., non-seizure motion). The table 200 (200′) also includes a first row associated with amplitude range values 206 (206′), a second row associated with period range values 208 (208′), a third row associated with bandwidth range values 210 (210′), a fourth row associated with posture range and threshold values 212 (212′), and a fifth row associated with change of posture range and threshold values 214 (214′). The table 200 indicates ranges and threshold values that are indicative of the first type of motion 202 and of the second type of motion 204.
During operation, the sensor system 110 of
For example, a processor of the sensor system 110 may determine that the subject motion data corresponds to the first type of motion 202/202′ (e.g., seizure motion) in response to determining that one or more of the range values corresponding to the first type of motion 202/202′ are satisfied. As another example, the processor of the sensor system 110 may determine that the subject motion data corresponds to the second type of motion 204/204′ (e.g., non-seizure motion) in response to determining that one or more of the range values corresponding to the second type of motion 204/204′ are satisfied.
The processor may distinguish between the first type of motion 202 and the second type of motion 204 based on the amplitude indicated by the subject motion data and the amplitude range values 206. For example, a processor of the sensor system 110 may determine that subject motion data corresponds to the first type of motion 202 (e.g., seizure motion) in response to determining that the amplitude is within a first amplitude range 222 (e.g., 0.01 gravitational force (g) to 0.60 g) and may determine that the subject motion data corresponds to the second type of motion 204 (e.g., non-seizure motion) in response to determining that the amplitude is within a third amplitude range 226 (e.g., 0.04 g to 1.00 g). As another example, the processor may determine that the subject motion data corresponds to the first type of motion 202 in response to determining that the amplitude is within a second amplitude range 224 (e.g., 0.04 g to 0.48 g) and may determine that the subject motion data corresponds to the second type of motion 204 in response to determining that the amplitude is within a fourth amplitude range 228 (e.g., 0.48 g to 1.00 g). In a similar fashion, the processor may also distinguish between the first type of motion 202′ and the second type of motion 204′ based on the alternative ranges and thresholds provided in table 200′.
The processor may distinguish between the first type of motion 202 and the second type of motion 204 based on the period indicated by the subject motion data and the period threshold values 208. For example, the processor may determine that the subject motion data corresponds to the first type of motion 202 in response to determining that the period is within a first period range 230 (e.g., 100 milliseconds (ms) to 1000 ms) and may determine that the subject motion data corresponds to the second type of motion 204 in response to determining that the period is within a third period range 234 (e.g., 100 ms to 2000 ms). As another example, the processor may determine that the subject motion data corresponds to the first type of motion 202 in response to determining that the period is within a second period range 232 (e.g., 160 ms to 750 ms) and may determine that the subject motion data corresponds to the second type of motion 204 in response to determining that the period is within a fourth period range 236 (e.g., 100 ms to 1000 ms).
The processor may distinguish between the first type of motion 202 and the second type of motion 204 based on the bandwidth indicated by the subject motion data and the bandwidth threshold values 210. For example, the processor may determine that the subject motion data corresponds to the first type of motion 202 in response to determining that the bandwidth is within a first bandwidth range 240 (e.g., 0.05 to 0.60) and may determine that the subject motion data corresponds to the second type of motion 204 in response to determining that the bandwidth is within a third bandwidth range 244 (e.g., 0.00 to 0.80). As another example, the processor may determine that the subject motion data corresponds to the first type of motion 202 in response to determining that the bandwidth is within a second bandwidth range 242 (e.g., 0.10 to 0.50) and may determine that the subject motion data corresponds to the second type of motion 204 in response to determining that the bandwidth is within a fourth bandwidth range 246 (e.g., 0.10-0.80).
The processor may distinguish between the first type of motion 202 and the second type of motion 204 based on the posture indicated by the subject motion data and the posture threshold values 212 relative to a vertical axis 102 or some other reference system. For example, the processor may determine that the subject motion data corresponds to the first type of motion 202 in response to determining that the posture is within a first angle range 250 (e.g., less than or equal to 45 degrees) and may determine that the subject motion data corresponds to the second type of motion 204 in response to determining that the posture is greater than or equal to a third angle threshold 254 (e.g., greater than or equal to 60 degrees). As another example, the processor may determine that the subject motion data corresponds to the first type of motion 202 in response to determining that the posture is within a second angle range 252 (e.g., less than 60 degrees) and may determine that the subject motion data corresponds to the second type of motion 204 in response to determining that the posture is greater than or equal to a fourth angle threshold 256 (e.g., greater than 45 degrees).
The processor may distinguish between the first type of motion 202 and the second type of motion 204 based on a change of the posture indicated by the subject motion data and the change of the posture range and threshold values 214. For example, the processor may determine that the subject motion data corresponds to the first type of motion 202 in response to determining that the change of the posture over a time window is within a first change range 260 (e.g., less than 30 degrees) and may determine that the subject motion data corresponds to the second type of motion 204 in response to determining that the change of the posture over a time window is greater than a third change threshold 264 (e.g., greater than 15 degrees). As another example, the processor may determine that the subject motion data corresponds to the first type of motion 202 in response to determining that the change of the posture over a time window is within a second change range 262 (e.g., less than 20 degrees) and may determine that the subject motion data corresponds to the second type of motion 204 in response to determining that the change of the posture over a time window is greater than a fourth change threshold 266 (e.g., greater than 30 degrees).
The processor may generate an output based on identifying the first type of motion 202 or the second type of motion 204. For example, the output may indicate the type of motion identified (e.g., the first type of motion 202 or the second type of motion 204). The processor may transmit the output, the subject motion data, or both, to a base station system.
In a particular embodiment, the processor may use more than one of the threshold values of the table 200 to distinguish between the first type of motion 202 and the second type of motion 204. For example, the processor may sequentially analyze each threshold value. To illustrate, the processor may compare the amplitude to the amplitude threshold values 206 prior to comparing the period to the period threshold values 208. In this example, the processor may determine the first type of motion 202 when all the threshold values associated with the first type of motion 202 (e.g., indicated in the first column of the table 200) are satisfied. Alternatively, the processor may determine the second type of motion 204 when all the threshold values associated with the second type of motion 204 (e.g., indicated in the second column of table 200) are satisfied.
In a particular embodiment, the processor may analyze a subsequent threshold value in response to determining that an analysis of a prior threshold value is inconclusive. For example, the processor may determine that the amplitude (e.g., 0.05 g) is within a region where the first amplitude range 222 and the third amplitude range 226 overlap indicating that an analysis of the subject motion data based on the amplitude is inconclusive. In response to the determination, the processor may compare the subject motion data to the period threshold values 208 or to another threshold.
In a particular embodiment, the processor may refrain from comparing subsequent threshold values in response to determining that an analysis of a particular threshold value conclusively identifies the first type of motion 202 or the second type of motion 204. For example, the processor may determine that the amplitude is within the first amplitude range 222 and outside the third amplitude range 226 indicating that the amplitude conclusively identifies the first type of motion 202. In response to the determination, the processor may refrain from analyzing subsequent threshold values (e.g., the period threshold values, the bandwidth threshold values, the posture threshold values, the change of the posture threshold values, or a combination thereof).
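The sequential evaluation described in the preceding paragraphs — examining one parameter at a time, stopping early when a parameter conclusively identifies one type of motion, and moving to the next parameter when the overlapping ranges make the result inconclusive — may be sketched as follows. The range values restate preferred entries of the table 200; the parameter ordering, names, and fall-through behavior are assumptions rather than requirements of the disclosure.

```python
# Inclusive (low, high) ranges restating preferred entries of the table 200;
# one-sided thresholds are encoded with an arbitrary upper bound.
FIRST_TYPE = {"amplitude_g": (0.01, 0.60), "period_ms": (100.0, 1000.0),
              "bandwidth": (0.05, 0.60), "posture_deg": (0.0, 45.0),
              "posture_change_deg": (0.0, 30.0)}
SECOND_TYPE = {"amplitude_g": (0.04, 1.00), "period_ms": (100.0, 2000.0),
               "bandwidth": (0.00, 0.80), "posture_deg": (60.0, 180.0),
               "posture_change_deg": (15.0, float("inf"))}

PARAMETER_ORDER = ["amplitude_g", "period_ms", "bandwidth", "posture_deg", "posture_change_deg"]

def classify(observed: dict) -> str:
    """Sequentially compare each observed parameter to both columns of the table.

    A value falling inside exactly one column's range conclusively identifies that
    type of motion and the remaining parameters are skipped; a value inside both
    (overlapping) ranges is inconclusive and the next parameter is examined.
    """
    for name in PARAMETER_ORDER:
        value = observed[name]
        in_first = FIRST_TYPE[name][0] <= value <= FIRST_TYPE[name][1]
        in_second = SECOND_TYPE[name][0] <= value <= SECOND_TYPE[name][1]
        if in_first and not in_second:
            return "first type of motion (e.g., seizure motion)"
        if in_second and not in_first:
            return "second type of motion (e.g., non-seizure motion)"
        # Inside both ranges (or neither): inconclusive for this parameter.
    return "inconclusive"

# Example from the text: an amplitude of 0.05 g lies where the amplitude ranges
# overlap, so the decision falls through to the period and later parameters.
result = classify({"amplitude_g": 0.05, "period_ms": 1500.0, "bandwidth": 0.30,
                   "posture_deg": 70.0, "posture_change_deg": 25.0})
```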
Thus, one or more of the threshold values indicated by the table 200 may enable the sensor system 110 to distinguish between various types of motion based on detected subject motion data.
Referring to
The remote computing device 386 may be a computing device that is located at a location remote from the base station system 388. For example, the remote computing device 386 may be at a location associated with a health care provider, such as a hospital. The remote computing device 386 may communicate patient information to the base station system 388, may receive motion data (e.g., the subject motion data), may receive indications of motion types (e.g., an indication of a seizure onset or offset) from the base station system 388, or a combination thereof. The remote computing device 386 may monitor the patient based on the data received from the base station system 388.
The sensor system 320 may include a preprocessor 330, a processor 340, and a memory 350. The memory 350 may be coupled to the preprocessor 330, to the processor 340, or to both. The memory 350 may include instructions that are executable by a processor (e.g., the preprocessor 330, the processor 340, or both) to operate the sensor system 320. The instructions may further cause the processor to perform one or more of the methods described herein as being performed by a sensor system (e.g., the sensor system 110 of
The sensor system 320 may include a user input device 360. The user input device 360 may be coupled to the preprocessor 330. The sensor system 320 may include one or more interface connectors 324. The sensor system 320 may include an input interface 302, a power manager 304, a data transfer controller 306, a battery 314, a battery protector 316, a power treatment unit 318, or a combination thereof. The input interface 302 may include a micro-universal serial bus (USB) connector. The input interface 302 may be coupled to the data transfer controller 306. The data transfer controller 306 may be coupled to the processor 340. The input interface 302 may be coupled to the power manager 304.
The power manager 304 may be coupled to the battery 314 and may control distribution of power to the sensor system 320 by the battery 314. The power manager 304 may be a USB power manager. The battery 314 may be coupled to the battery protector 316. The battery 314 may provide power to a power treatment unit 318. The power treatment unit 318 may control distribution and treatment of the power to the sensor system 320. The power treatment unit 318 may be coupled to the memory 350, the processor 340, the preprocessor 330, and the data transfer controller 306. The power treatment unit 318 may include a buck/boost converter, a boost converter, or a combination thereof.
In a particular embodiment, the sensor system 320 may include a sense amplifier 332. An input of the sense amplifier 332 may be coupled to the one or more interface connectors 324. An output of the sense amplifier 332 may be coupled to the processor 340.
The sensor system 320 may include a transceiver 346 and an antenna 348 coupled to the transceiver 346. The transceiver 346 may be coupled to the processor 340. The sensor system 320 may include a ferroelectric random-access memory (FRAM) 344, an accelerometer 342, one or more non-ECG sensors 380, an output indicator 362, or a combination thereof.
The FRAM 344 may store data and instructions for the processor 340. The FRAM 344 may perform access operations faster than access operations performed by the memory 350. The FRAM 344 may operate in the event of a power loss in the sensor system 320. The processor 340 may include, or be coupled to, the FRAM 344.
The accelerometer 342 may be a 3D accelerometer. In a particular embodiment, the accelerometer 342 may correspond to the accelerometer described with reference to
The one or more other sensors 380 may be configured to sense other data. The other data may be stored in the memory 350. The other data may include heart beat data, electrical activity data generated by muscle activity or movement within the patient's body, etc.
The output indicator 362 may provide the patient (e.g., the user 108 of
The sensor system 320 may be entirely or at least partially enclosed by a housing 390. In a particular embodiment, the housing 390 may at least partially enclose the one or more interface connectors 324, the preprocessor 330, the processor 340, and the transceiver 346. The housing 390 may provide water-resistant protection for the one or more interface connectors 324, the preprocessor 330, the processor 340, and the transceiver 346.
The one or more interface connectors 324 may at least partially extend outside of the housing 390. The one or more interface connectors 324 may be operatively coupled to a connector interface of a mounting system (e.g., a patch). The mounting system may be configured to couple the housing 390 to a subject (e.g., the user 108 of
The processor 340 may analyze the output received from the accelerometer 342 to distinguish between various types of motion (e.g., the first type of motion 202 and the second type of motion 204). For example, the processor 340 may analyze the subject motion data to detect a seizure event. The processor 340 may generate a particular output in response to the subject motion data indicating a particular type of motion (e.g., the first type of motion 202 or the second type of motion 204). The processor 340 may store a log indicating the detected type of motion in the memory 350.
The processor 340 may be configured to maintain a log of system activity within the sensor system 320. The log of system activity may include communication activity of the sensor system 320. The communication activity may include activation and deactivation activity performed by the transceiver 346. The log of system activity may include memory activity including operation of the memory 350, the FRAM 344, or both. The memory activity may include memory read and write operations.
The transceiver 346 may be configured to communicate with one or more external devices, such as the base station system 388. The transceiver 346 may perform transmission via the antenna 348. The transceiver 346 may include a transmitter to transmit communication signals and a receiver to receive communication signals. The sensor system 320 may use the transceiver 346 to communicate with the external device via the communication connection 384. For example, the sensor system 320 may transmit motion data (e.g., the subject motion data), an output indicating a detected type of motion (e.g., the first type of motion 202 or the second type of motion 204), or both, via transmission from the transmitter to the base station system 388. The communication connection 384 may facilitate data communication according to one or more wireless mobile data communication standards, including code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal frequency division multiple access (OFDMA), single-carrier frequency division multiple access (SC-FDMA), global system for mobile communications (GSM), enhanced data rates for GSM evolution (EDGE), evolved EDGE, Universal Mobile Telecommunications System (UMTS), Worldwide Interoperability for Microwave Access (Wi-Max), general packet radio service (GPRS), 3rd generation partnership project (3GPP), 3GPP2, 4th generation (4G), long term evolution (LTE), 4G-LTE, high speed packet access (HSPA), HSPA+, Institute of Electrical and Electronics Engineers (IEEE) 802.11x, or a combination thereof.
The user input device 360 may enable the patient to provide input to the sensor system 320. The input may be used to control operation of the sensor system 320. For example, the user input device 360 may be configured to cause the processor 340 to process the subject motion data in response to user input via the user input device 360.
The system 300 may be operable to distinguish between various types of motion and to store information regarding a detected type of motion. The information may be communicated to a user (e.g., the user 108), to the base station system 388, to the remote computing device 386, or a combination thereof. The information may be used to log and monitor user activity. For example, the information may be used to monitor a frequency of seizures experienced by the user 108. The information may facilitate medical diagnostics and treatment of the user 108.
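A minimal sketch of one way the stored information could be summarized to monitor seizure frequency follows; the per-calendar-day grouping and the timestamps are illustrative assumptions.

```python
from collections import Counter
from datetime import datetime
from typing import Dict, List

def detections_per_day(detection_times: List[datetime]) -> Dict[str, int]:
    """Count logged detections per calendar day, one way the stored
    information could be summarized for a caregiver or physician."""
    return dict(Counter(t.date().isoformat() for t in detection_times))

# Usage sketch with hypothetical detection timestamps
times = [
    datetime(2013, 12, 5, 3, 14),
    datetime(2013, 12, 5, 22, 40),
    datetime(2013, 12, 6, 1, 5),
]
print(detections_per_day(times))   # {'2013-12-05': 2, '2013-12-06': 1}
```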
The method 400 includes obtaining, at a processor, subject motion data from an accelerometer, at 402. The accelerometer may be coupled to a subject to define a subject axis that extends away from the subject in a direction normal to a frontal plane of the subject. For example, a processor of the sensor system 110 of
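For context on the subject-axis geometry referenced at 402, the following sketch estimates the angle between a sensing axis and a vertical axis from an accelerometer sample. It assumes the device is momentarily quasi-static (so the measured acceleration is dominated by gravity) and that the indicated sensor axis is aligned with the subject axis normal to the frontal plane; both assumptions are illustrative and are not taken from the description.

```python
import math
from typing import Tuple

def subject_axis_angle_deg(accel_g: Tuple[float, float, float],
                           axis: Tuple[float, float, float] = (0.0, 0.0, 1.0)) -> float:
    """Estimate the angle between the subject axis and a vertical axis,
    assuming the measured acceleration is dominated by gravity."""
    gx, gy, gz = accel_g
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    if norm == 0.0:
        raise ValueError("zero acceleration vector")
    cos_angle = (gx * axis[0] + gy * axis[1] + gz * axis[2]) / norm
    # Report the acute angle to the vertical axis, independent of sign convention.
    cos_angle = min(1.0, abs(cos_angle))
    return math.degrees(math.acos(cos_angle))

# Example: gravity mostly along the subject axis (e.g., subject lying supine)
print(round(subject_axis_angle_deg((0.05, 0.10, 0.99)), 1))   # ~6.4 degrees
```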
The method 400 also includes analyzing, by the processor, the subject motion data to distinguish between a first type of motion and a second type of motion, at 404. For example, a processor of the sensor system 110 of
The first type of motion may be characterized by a portion of the motion data having a first amplitude of 0.01 g to 0.60 g, the portion of the motion data having a first period of 100 ms to 1000 ms, the portion of the motion data having a first bandwidth of 0.05 to 0.60, the subject axis being disposed at a first angle of 45 degrees or more relative to a vertical axis, a first position of the subject axis changing less than 30 degrees over a time window, or a combination thereof. The second type of motion may be characterized by the portion of the motion data having a second amplitude of 0.04 g to 1.00 g, the portion of the motion data having a second period of 100 ms to 2000 ms, the portion of the motion data having a second bandwidth of 0.00 to 0.80, the subject axis being disposed at a second angle of 60 degrees or less relative to the vertical axis, a second position of the subject axis changing greater than 15 degrees over a time window, or a combination thereof.
In a particular embodiment, the first type of motion may be characterized by a portion of the motion data having a first amplitude of 0.04 g to 0.48 g, the portion of the motion data having a first period of 160 ms to 750 ms, the portion of the motion data having a first bandwidth of 0.10 to 0.50, the subject axis being disposed at a first angle of greater than 60 degrees relative to a vertical axis, a first position of the subject axis changing by less than 20 degrees over a time window, or a combination thereof. The second type of motion may be characterized by the portion of the motion data having a second amplitude of 0.48 g to 1.00 g, the portion of the motion data having a second period of 100 ms to 1000 ms, the portion of the motion data having a second bandwidth of 0.10 to 0.80, the subject axis being disposed at a second angle of less than 45 degrees relative to the vertical axis, a second position of the subject axis changing by more than 30 degrees over a time window, or a combination thereof.
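By way of non-limiting illustration, the sketch below checks pre-computed features against the ranges of the particular embodiment above. Combining all criteria with a logical AND is a simplifying assumption, since the description allows any one criterion or a combination to characterize a motion type, and extraction of the features from the raw accelerometer signal is not shown here.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MotionFeatures:
    """Features of one window of subject motion data (extraction not shown)."""
    amplitude_g: float        # signal amplitude in g
    period_ms: float          # dominant period in milliseconds
    bandwidth: float          # bandwidth measure (dimensionless, per the ranges above)
    axis_angle_deg: float     # subject-axis angle relative to the vertical axis
    axis_change_deg: float    # change of the subject-axis position over the time window

def classify_motion(f: MotionFeatures) -> Optional[str]:
    """Compare features against the particular-embodiment ranges above,
    requiring all criteria of a type to match (a simplifying assumption)."""
    first_type = (0.04 <= f.amplitude_g <= 0.48 and
                  160 <= f.period_ms <= 750 and
                  0.10 <= f.bandwidth <= 0.50 and
                  f.axis_angle_deg > 60 and
                  f.axis_change_deg < 20)
    second_type = (0.48 <= f.amplitude_g <= 1.00 and
                   100 <= f.period_ms <= 1000 and
                   0.10 <= f.bandwidth <= 0.80 and
                   f.axis_angle_deg < 45 and
                   f.axis_change_deg > 30)
    if first_type:
        return "first_type_202"
    if second_type:
        return "second_type_204"
    return None   # neither characterization matched

# Usage sketch with hypothetical feature values
features = MotionFeatures(amplitude_g=0.30, period_ms=400, bandwidth=0.25,
                          axis_angle_deg=75, axis_change_deg=5)
print(classify_motion(features))   # -> 'first_type_202'
```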
The method 400 further includes generating an output from the processor in response to an identification of the first type of motion, at 406. For example, a processor of the sensor system 110 of
With reference to
As can be appreciated from the embodiment illustrated in
Although the description above contains many specificities, these specificities are utilized to illustrate some particular embodiments of the disclosure and should not be construed as limiting the scope of the disclosure. The scope of this disclosure should be determined by the claims and their legal equivalents. A method or device does not have to address each and every problem to be encompassed by the present disclosure. All structural, chemical and functional equivalents to the elements of the disclosure that are known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the present claims. A reference to an element in the singular is not intended to mean one and only one, unless explicitly so stated, but rather it should be construed to mean at least one. No claim element herein is to be construed under the provisions of 35 U.S.C. §112, sixth paragraph, unless the element is expressly recited using the phrase “means for.” Furthermore, no element, component or method step in the present disclosure is intended to be dedicated to the public, regardless of whether the element, component or method step is explicitly recited in the claims.
The disclosure is described above with reference to drawings. These drawings illustrate certain details of specific embodiments of the systems and methods and programs of the present disclosure. However, describing the disclosure with drawings should not be construed as imposing on the disclosure any limitations that may be present in the drawings. The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing its operations. The embodiments of the present disclosure may be implemented using an existing computer processor, a special purpose computer processor, or by a hardwired system.
As noted above, embodiments within the scope of the present disclosure include program products including machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media which can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can include RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. The disclosure may be utilized in a non-transitory medium. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, a special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Embodiments of the disclosure are described in the general context of method steps which may be implemented in one embodiment by a program product including machine-executable instructions, such as program code, for example, in the form of program modules executed by machines in networked environments. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Machine-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein. The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps.
Embodiments of the present disclosure may be practiced in a networked environment using logical connections to one or more remote computers having processors. Logical connections may include a local area network (LAN) and a wide area network (WAN) that are presented here by way of example and not limitation. Such networking environments are commonplace in office-wide or enterprise-wide computer networks, intranets and the Internet and may use a wide variety of different communication protocols. Those skilled in the art will appreciate that such network computing environments will typically encompass many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, servers, minicomputers, mainframe computers, and the like. For example, the network computing environment may include the sensor system 110 of
An exemplary system for implementing the overall system or portions of the disclosure might include a general purpose computing device in the form of a computer, including a processing unit, a system memory, and a system bus that couples various system components including the system memory to the processing unit. For example, the general purpose computing device may include the sensor system 110 of
It should be noted that although the flowcharts provided herein show a specific order of method steps, it is understood that the order of these steps may differ from what is depicted. Also, two or more steps may be performed concurrently or with partial concurrence. Such variation will depend on the software and hardware systems chosen and on designer choice. It is understood that all such variations are within the scope of the disclosure.
The foregoing description of embodiments of the disclosure has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the disclosure. The embodiments were chosen and described in order to explain the principles of the disclosure and its practical application to enable one skilled in the art to utilize the disclosure in various embodiments and with various modifications as are suited to the particular use contemplated.
The Abstract of the Disclosure is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, various features may be grouped together or described in a single embodiment for the purpose of streamlining the disclosure. This disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, the claimed subject matter may be directed to less than all of the features of any of the disclosed embodiments.
This application is a non-provisional application of U.S. Provisional Patent Application No. 61/912,502, filed Dec. 5, 2013, and U.S. Provisional Patent Application No. 61/913,207, filed Dec. 6, 2013. U.S. Provisional Patent Application Nos. 61/912,502 and 61/913,207 are hereby incorporated herein by reference in their entireties.
Number | Date | Country
---|---|---
61/912,502 | Dec. 2013 | US
61/913,207 | Dec. 2013 | US