The present technology is directed to medical devices and, more particularly, to systems and methods for detecting strokes.
Stroke is a serious medical condition that can cause permanent neurological damage, complications, and death. Stroke may be characterized as the rapidly developing loss of brain functions due to a disturbance in the blood vessels supplying blood to the brain. The loss of brain functions can be a result of ischemia (lack of blood supply) caused by thrombosis or embolism. During a stroke, the blood supply to an area of a brain may be decreased, which can lead to dysfunction of the brain tissue in that area.
Stroke is the number two cause of death worldwide and the number one cause of disability. Speed to treatment is the critical factor in stroke care, as approximately 1.9 million neurons are lost per minute on average during a stroke. Stroke diagnosis and the time between the event and therapy delivery are the primary barriers to improving therapy effectiveness. Stroke has three primary etiologies: (i) ischemic stroke (representing about 65% of all strokes), (ii) hemorrhagic stroke (representing about 10% of all strokes), and (iii) cryptogenic stroke (including transient ischemic attack (TIA), representing about 25% of all strokes). Strokes can be considered as having neurogenic and/or cardiogenic origins.
A variety of approaches exist for treating patients undergoing a stroke. For example, a clinician may administer anticoagulants, such as warfarin, or may undertake intravascular interventions such as thrombectomy procedures to treat ischemic stroke. As another example, a clinician may administer antihypertensive drugs, such as beta blockers (e.g., Labetalol) and ACE inhibitors (e.g., Enalapril), or may undertake intravascular interventions such as coil embolization to treat hemorrhagic stroke. Lastly, if stroke symptoms have resolved on their own with a negative neurological work-up, a clinician may order long-term cardiac monitoring (external or implantable) to determine potential cardiac origins of cryptogenic stroke. However, such treatments may be frequently underutilized and/or relatively ineffective due to the failure to timely identify whether a patient is undergoing or has recently undergone a stroke. This is a particular risk with more minor strokes that leave patients relatively functional upon cursory evaluation.
The present technology is illustrated, for example, according to various aspects described below. Various examples of aspects of the present technology are described as numbered clauses (1, 2, 3, etc.) for convenience. These are provided as examples and do not limit the present technology. It is noted that any of the dependent clauses may be combined in any combination, and placed into a respective independent clause. The other clauses can be presented in a similar manner.
Additional features and advantages of the present technology will be set forth in the description below, and in part will be apparent from the description, or may be learned by practice of the subject technology. The advantages of the present technology will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the present technology as claimed.
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale. Instead, emphasis is placed on illustrating clearly the principles of the present technology. For ease of reference, throughout this disclosure identical reference numbers may be used to identify identical or at least generally similar or analogous components or features.
It can be difficult to determine whether a patient is suffering from a stroke or has suffered from a stroke. Current diagnostic techniques typically involve evaluating a patient for visible symptoms, such as paralysis or numbness of the face, arm, or leg, as well as difficulty walking, speaking, or understanding. However, these techniques may result in undiagnosed strokes, particularly more minor strokes that leave patients relatively functional upon cursory evaluation. Even for relatively minor strokes, it is important to treat the patient as soon as possible because treatment outcomes for stroke patients are highly time-dependent. Accordingly, there is a need for improved methods for detecting strokes.
Embodiments of the present technology enable detection of strokes by obtaining patient physiological data using a sensor device and analyzing the physiological data to provide a stroke indication, as described in more detail below. For example, a monitoring device can be equipped with electrodes (e.g., electroencephalogram (EEG) electrodes) that can be used to sense and record a patient's brain electrical activity. The monitoring device can be implantable (e.g., subcutaneously) or configured to be disposed over a patient's skin.
Conventional EEG electrodes are typically positioned over a large portion of a user's scalp. While electrodes in this region are well positioned to detect electrical activity from the patient's brain, there are certain drawbacks. Sensors in this location interfere with patient movement and daily activities, making them impractical for prolonged monitoring. Additionally, implanting electrodes under the patient's scalp is difficult and may lead to significant patient discomfort. To address these and other shortcomings of conventional EEG sensors, embodiments of the present technology include a sensor device configured to record electrical signals at a region adjacent a rear portion of the patient's neck or the base of the patient's skull. In this position, implantation under the patient's skin is relatively simple, and a temporary application of a wearable sensor device (e.g., coupled to a bandage, garment, band, or adhesive member) does not unduly interfere with patient movement and activity.
However, the EEG signals detected via electrodes disposed at or adjacent the back of a patient's neck may be relatively noisy. For example, the electrical signals associated with brain activity may be intermixed with electrical signals associated with cardiac activity (e.g., ECG signals) and muscle activity (e.g., EMG signals) among other artifacts. Accordingly, in some embodiments, the sensor data may be filtered or otherwise manipulated to separate the brain activity data (e.g., EEG signals) from other electrical signals (e.g., ECG signals, EMG signals, etc.).
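As a rough illustration of this kind of separation, the following sketch band-splits a single raw channel using zero-phase Butterworth filters. This is a minimal sketch, assuming SciPy and illustrative cutoff frequencies; it is not the disclosed filtering scheme, and because EEG and ECG energy overlap at low frequencies, fixed band-pass filtering alone cannot fully isolate the brain activity (hence the adaptive artifact removal discussed with respect to the example methods below).

```python
# Illustrative sketch: band-splitting a raw neck-electrode signal into
# EEG-, ECG-, and EMG-dominant components. Cutoffs are assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 500.0  # Hz; matches the 500 Hz sampling rate used in the experiment below

def bandpass(signal, low_hz, high_hz, fs=FS, order=4):
    """Zero-phase Butterworth band-pass filter."""
    sos = butter(order, [low_hz, high_hz], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

def separate_bands(raw):
    """Split one raw channel into crude band-limited source estimates."""
    return {
        "eeg": bandpass(raw, 1.0, 40.0),    # cortical rhythms mostly fall below 40 Hz
        "ecg": bandpass(raw, 0.5, 15.0),    # QRS energy concentrates below ~15 Hz
        "emg": bandpass(raw, 20.0, 150.0),  # surface EMG dominates higher frequencies
    }

raw = np.random.randn(10 * int(FS))  # stand-in for 10 s of recorded electrode data
bands = separate_bands(raw)
```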
As described in more detail below, in some embodiments, analyzing the sensor data to make a stroke determination can include using a classification algorithm, which can itself be derived using machine learning techniques applied to databases of known stroke patient data. The detection algorithm(s) can be passive (involving measurement of a purely resting patient) or active (involving prompting a patient to perform potentially impaired functionality, such as moving particular muscle groups (e.g., raising an arm, moving a finger, moving facial muscles, etc.) and/or speaking while recording the electrical response).
Example Systems
The following discussion provides a brief, general description of a suitable environment in which the present technology may be implemented. Although not required, aspects of the technology are described in the general context of computer-executable instructions, such as routines executed by a general-purpose computer. Aspects of the technology can be embodied in a special purpose computer or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions explained in detail herein. Aspects of the technology can also be practiced in distributed computing environments where tasks or modules are performed by remote processing devices, which are linked through a communication network (e.g., a wireless communication network, a wired communication network, a cellular communication network, the Internet, a short-range radio network (e.g., via Bluetooth)). In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Computer-implemented instructions, data structures, screen displays, and other data under aspects of the technology may be stored or distributed on computer-readable storage media, including magnetically or optically readable computer disks, as microcode on semiconductor memory, nanotechnology memory, organic or optical memory, or other portable and/or non-transitory data storage media. In some embodiments, aspects of the technology may be distributed over the Internet or over other networks (e.g., a Bluetooth network) on a propagated signal on a propagation medium (e.g., an electromagnetic wave(s), a sound wave) over a period of time, or may be provided on any analog or digital network (packet switched, circuit switched, or other scheme).
The system 100 can be configured to sense physiological patient data and analyze that data to make a stroke determination. In an example, the system 100 includes a sensor device 110 that is configured to be implanted in a target site of the patient or disposed over the skin of the patient at a target site. The sensor device may be a relatively small device, and may be placed (e.g., inserted) under or over the skin at the back of the patient's neck or base of the skull. As described in more detail below, the sensor device 110 can detect one or more physiological parameters of a patient (e.g., electrical activity corresponding to brain activity in particular regions of the patient's brain, heart rhythm data, motion data, etc.). The sensor device 110 can be communicatively coupled to an external device 150, for example via a wireless connection. In some embodiments, the external device 150 can be a mobile device (e.g., a smartphone, tablet, smartwatch, etc.) or other computing device with which the patient can interact. In operation, the patient may receive output or instructions from the external device 150 that are based at least in part on data received at the external device 150 from the sensor device 110. For example, the external device 150 may provide an alert to the patient or another entity (e.g., a call center) based on a stroke indication provided by the sensor device 110. Additionally or alternatively, the external device 150 may output user prompts which can be synchronized with data collection via the sensor device 110. For example, the external device 150 may instruct the user to lift an arm, make a facial expression, etc., and the sensor device 110 may record physiological data while the user performs the requested actions. Moreover, the external device 150 may itself analyze the patient (e.g., the patient's activity or condition in response to such prompts), for example using a camera to detect facial drooping, using a microphone to detect slurred speech, or otherwise detecting any other indicia of stroke. In some embodiments, such indicia can be compared against pre-stroke inputs (e.g., a stored baseline facial image or voice-print with baseline speech recording).
The sensor device 110 and/or the external device 150 can also be communicatively coupled with one or more external computing devices 180 (e.g., over network 170). In some examples, the external computing devices 180 can take the form of servers, personal computers, tablet computers or other computing devices associated with one or more healthcare providers (e.g., hospitals, medical data analytic companies, device manufacturers, etc.). These external computing devices 180 can collect data recorded by the sensor device 110 and/or the external device 150. In some embodiments, such data can be anonymized and aggregated to perform large-scale analysis (e.g., using machine-learning techniques or other suitable data analysis techniques) to develop and improve stroke detection algorithms using data collected by a large number of sensor devices 110. Additionally, the external computing devices 180 may transmit data to the external device 150 and/or the sensor device 110. For example, an updated algorithm for making stroke determinations may be developed by the external computing devices 180 (e.g., using machine learning or other techniques) and then provided to the sensor device 110 and/or the external device 150 via the network (e.g., as an over-the-air update), and installed on the sensor device 110 and/or external device 150.
In some embodiments, the system 100 can also include additional implantable devices, such as an implantable cardiac monitor, an implantable pacemaker, an implantable cardiac defibrillator, a cardiac resynchronization therapy (CRT) device (e.g., a CRT-D defibrillator or CRT-P pacemaker), a neurostimulator, a deep-brain stimulation device, a nerve stimulator, a drug pump (e.g., an insulin pump), a glucose monitor, or other devices. Other devices that may support and enhance a personal ecosystem to reduce stroke risk include fitness monitors, nutrition devices, etc. Additionally or alternatively, a stroke detection device can be used in conjunction with other disease therapies with high risk of stroke as an adverse event (e.g., LVAD devices, TAVI/TAMR devices, bariatric/gastric surgery, etc.).
As noted previously, the sensor device 110 is configured to be coupled to a patient for recording physiological data relevant to a stroke determination. For example, the sensor device 110 can be implanted within the body of a patient, may be disposed directly over a patient's skin (e.g., held in place via an adhesive or fastener), or may be removably worn by the patient. The sensor device 110 includes sensing components 111, which can include a number of different sensors and/or types of sensors. For example, the sensing components 111 can include a plurality of electrodes 113, an accelerometer 115, and optionally other sensors 117. Examples of other sensors 117 include a blood pressure sensor, a pulse oximeter, an ECG sensor or other heart-recording device, an EMG sensor or other muscle-activity recording device, a temperature sensor, a skin galvanometer, a hygrometer, an altimeter, a gyroscope, a magnetometer, a proximity sensor, a Hall effect sensor, or any other suitable sensor for monitoring physiological characteristics of the patient. These particular sensing components 111 are exemplary, and in various embodiments the sensors employed can vary.
The electrodes 113 can be configured to detect electrical activity such as brain activity (e.g., EEG data), heart activity (e.g., ECG data), and/or muscle activity (e.g., EMG data). The electrodes 113 may be formed from any suitable conductive material or materials to enable the electrodes to perform electrical measurements on the patient. In some embodiments, the sensor device 110 can be configured to analyze data from the electrodes 113 to extract both brain activity data (e.g., EEG signals) and heart activity data (e.g., ECG signals). The brain activity data may be evaluated to provide a stroke determination or other assessment of brain condition, while the heart activity data may be evaluated to provide an assessment of heart condition or to detect certain cardiac events (e.g., heart rate variability, arrhythmias (e.g., tachyarrhythmias or bradycardia), ventricular or atrial fibrillation episodes, etc.).
In some embodiments, the sensor device 110 is configured to analyze data from the electrodes 113 to extract brain activity data and to discard or reduce any contribution from heart or muscle activity. In some embodiments, the electrodes 113 are configured to be disposed over the patient's skin. In such embodiments, the electrodes 113 can include protrusions (e.g., microneedles or other suitable structures) configured to at least partially penetrate the patient's skin so as to improve detection of subcutaneous electrical activity. In some embodiments, the sensor device 110 can be configured to be implanted within the body (e.g., subcutaneously), and as such the electrodes 113 can include a conductive surface exposed along at least a portion of the sensor device 110 so as to detect electrical activity within the body.
The sensor device 110 may be configured to calculate physiological characteristics relating to one or more electrical signals received from the electrodes 113. For example, the sensor device 110 may be configured to algorithmically determine the presence or absence of a stroke or other neurological condition from the electrical signal. In certain embodiments, the sensor device 110 may make a stroke determination for each electrode 113 (e.g., channel) or may make a stroke determination using electrical signals acquired from two or more selected electrodes 113.
In various embodiments, the number and configuration of electrodes 113 can vary. For example, the sensor device 110 can include at least 2, at least 3, at least 4, at least 5, or more electrodes 113 in an array. In some embodiments, the sensor device 110 includes fewer than 6, fewer than 5, fewer than 4, or fewer than 3 electrodes 113 in an array. As described in more detail below, although conventional EEG arrays include a large number of electrodes disposed over the top of a patient's head, some embodiments of the present technology include a relatively small number of electrodes (e.g., three electrodes) configured to be placed over the back of the patient's neck or base of the skull. In this position, electrical data collected via these electrodes 113 may correspond to brain activity in regions determined to be of interest for stroke determination (e.g., the P3, Pz, and/or P4 regions).
In some embodiments, the electrodes 113 may all reside within a single housing of the sensor device 110. In some embodiments, the electrodes 113 may extend away from a housing of the sensor device 110 and be connected via leads or other connective components. For example, the sensor device 110 can include a housing that encompasses certain components (e.g., the power source 119, communications link 121, processing circuitry 123, and/or memory 125), and the electrodes 113 (and/or other sensing components 111) can be coupled to the housing via electrical leads or other suitable connections. In such configurations, the electrodes 113 can be positioned at locations spaced apart from the housing of the sensor device 110. In some embodiments, the electrodes 113 can be disposed within discrete housings that are in turn coupled to a housing containing the other components of the sensor device 110. Such a configuration, in which multiple housings (or sub-housings) are coupled together via flexible or other connectors, may facilitate placement of the sensor device 110 at a desired location to improve patient comfort. Additionally, this may facilitate placement of electrodes 113 at desirable positions for detecting clinically useful brain activity data.
The accelerometer 115 can be configured to detect patient movement. In some embodiments, patient movement data collected via the accelerometer 115 can be used to make a fall determination. Fall detection can be particularly valuable when assessing potential stroke patients, as a large percentage of patients admitted for ischemic or hemorrhagic stroke have been found to have had a significant fall within 15 days of the stroke event. Accordingly, in some embodiments, the sensor device 110 can be configured to initiate monitoring of brain activity via the electrodes 113 upon fall detection using the accelerometer 115. In some embodiments, the sensing performed via the electrodes 113 can be modified in response to a fall determination, for example with an increased sampling rate or other modification. In addition to fall detection, the accelerometer 115 (or similar sensor) can be used to determine potential body trauma due to sudden acceleration and/or deceleration (e.g., a vehicular accident, sports collision, concussion, etc.). These events could be thrombogenic, potentially serving as a precursor to stroke.
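A minimal sketch of such accelerometer-triggered monitoring appears below, assuming a simple impact-magnitude threshold; the threshold value and the `sensor` interface methods are hypothetical, not taken from this disclosure.

```python
# Illustrative sketch (not the disclosed algorithm): a magnitude-threshold
# fall detector that switches the EEG front end into active monitoring.
import numpy as np

GRAVITY = 9.81                    # m/s^2
FALL_THRESHOLD = 2.5 * GRAVITY    # illustrative impact threshold, not a disclosed value

def detect_fall(accel_xyz):
    """Return True if any sample's acceleration magnitude exceeds the threshold.

    accel_xyz: (n_samples, 3) array of accelerometer readings in m/s^2.
    """
    magnitudes = np.linalg.norm(accel_xyz, axis=1)
    return bool(np.any(magnitudes > FALL_THRESHOLD))

def on_accelerometer_window(accel_xyz, sensor):
    """If a fall is suspected, begin (or intensify) EEG monitoring."""
    if detect_fall(accel_xyz):
        sensor.set_sampling_rate_hz(500)  # hypothetical call: raise the sampling rate
        sensor.start_eeg_monitoring()     # hypothetical call: enable the electrodes
```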
The sensor device 110 can also include a power source 119 (e.g., a battery or capacitors). In some embodiments, the power source 119 can be rechargeable, for example using inductive charging or other wireless charging techniques. Such rechargeability can facilitate long-term placement of the sensor device 110 on or within a patient.
A communications link 121 enables the sensor device 110 to transmit to and/or receive data from external devices (e.g., external device 150 or external computing devices 180). The communications link 121 can include a wired communication link and/or a wireless communication link (e.g., Bluetooth, Near-Field Communications, LTE, 5G, Wi-Fi, infrared and/or another wireless radio transmission network).
The processing circuitry 123 can include one or more CPUs, ASICs, digital signal processing circuitry, or any other suitable electrical components configured to process data from the sensing components 111 and control operation of the sensor device 110. In some embodiments, the processing circuitry 123 includes hardware particularly adapted for artificial intelligence and/or machine learning applications, for example, a tensor processing unit (TPU) or other such hardware. In certain embodiments, the processing circuitry of the sensor device 110 may include one or more input protection circuits to filter the electrical signals and may include amplifier/filter circuitry to remove DC and high frequency components, one or more analog-to-digital (A/D) converters, or any other suitable components.
The sensor device 110 can further include memory 125, which can take the form of one or more computer readable storage modules configured to store information (e.g., signal data, subject information or profiles, environmental data, data collected from one or more sensing components, media files) and/or executable instructions that can be executed by the processing circuitry 123. The memory 125 can include, for example, instructions for analyzing patient data to determine whether a patient is undergoing or has recently or previously undergone a stroke. In some embodiments, the memory 125 stores data (e.g., signal data acquired from the sensing components 111) used in the stroke detection techniques disclosed herein.
As noted above, in some embodiments, the sensor device 110 may also communicate with an external device 150. The external device 150 can be, for example, a smartwatch, smartphone, laptop, tablet, desktop PC, or any other suitable computing device and can include one or more features, applications, and/or other elements commonly found in such devices. For example, the external device 150 can include a display 151 and a communications link 153 (e.g., a wireless transceiver that may include one or more antennas for wirelessly communicating with, for example, other devices, websites, and the sensor device 110). Communication between the external device 150 and other devices can be performed via, e.g., a network 170 (which can include the Internet, public and private intranets, a local or extended Wi-Fi network, cell towers, the plain old telephone system (POTS), etc.), direct wireless communication, etc. The external device 150 can additionally include well-known input components 131 and output components 133, including, for example, a touch screen, a keypad, speakers, a camera, etc.
In operation, the patient may receive output or instructions from the external device 150 that are based at least in part on data received at the external device 150 from the sensor device 110. For example, the sensor device 110 may generate a stroke indication based on analysis of data collected via sensing components 111. The sensor device 110 may then instruct the external device 150 to output an alert to the patient (e.g., via display 151 and/or output 157) or another entity. In some embodiments, the alert can both be displayed to the user (e.g., via display 151 of the external device) and can also be transmitted to an appropriate emergency medical response service (e.g., a 9-1-1 call may be placed with location data from the external device 150 used to direct responders to locate the patient), and/or to other healthcare provider entities or individuals (e.g. a hospital, emergency room, or physician). In some embodiments, embedded circuitry that provides location data (e.g., a GPS unit) can be included within the sensor device 110.
Additionally or alternatively, the external device 150 may output user prompts which may be used in conjunction with physiological data collection via the sensor device 110. For example, the external device 150 may instruct the user to perform an action (e.g., lift an arm, make a facial expression, etc.), and the sensor device 110 may record physiological data while the user performs the requested actions. In some embodiments, the external device 150 may itself analyze physiological parameters of the patient, for example using a camera to detect facial drooping or other indicia of stroke. In some embodiments, such physiological data collected via the external device 150 can be combined with data collected via the sensing components 111 and analyzed together to make a stroke determination.
As noted previously, the external computing device(s) 180 can take the form of servers or other computing devices associated with healthcare providers or other entities. The external devices can include a communications link 181 (e.g., components to facilitate wired or wireless communication with other devices either directly or via the network 170), a memory 183, and processing circuitry 185. These external computing devices 180 can collect data recorded by the sensor device 110 and/or the external device 150. In some embodiments, such data can be anonymized and aggregated to perform large-scale analysis (e.g., using machine-learning techniques or other suitable data analysis techniques) to develop and improve stroke detection algorithms using data collected by a large number of sensor devices 110 associated with a large population of patients. Additionally, the external computing devices 180 may transmit data to the external device 150 and/or the sensor device 110. For example, an updated algorithm for making stroke determinations may be developed by the external computing devices 180 (e.g., using machine learning or other techniques) and then provided to the sensor device 110 and/or the external device 150 via the network 170, and installed on the recipient device 110/150.
Example Sensor Devices
In the example of
The configuration of the housing 201 can facilitate placement either over the user's skin in a bandage-like form or for subcutaneous implantation. As such, a relatively thin housing 201 can be advantageous. Additionally, the housing 201 can be flexible in some embodiments, so that the housing 201 can at least partially bend to correspond to the anatomy of the patient's neck (e.g., with left and right lateral portions 207 and 209 of the housing 201 bending anteriorly relative to the central portion 205 of the housing 201).
In some embodiments, the housing 201 can have a length L of between about 15-50 mm, between about 20-30 mm, or about 25 mm. The housing 201 can have a width W of between about 2.5-15 mm, between about 5-10 mm, or about 7.5 mm. In some embodiments, the housing 201 can have a thickness of less than about 10 mm, about 9 mm, about 8 mm, about 7 mm, about 6 mm, about 5 mm, about 4 mm, or about 3 mm. In some embodiments, the thickness of the housing 201 can be between about 2-8 mm, between about 3-5 mm, or about 4 mm. The housing 201 can have a volume of less than about 1.5 cc, about 1.4 cc, about 1.3 cc, about 1.2 cc, about 1.1 cc, about 1.0 cc, about 0.9 cc, about 0.8 cc, about 0.7 cc, about 0.6 cc, about 0.5 cc, or about 0.4 cc. In some embodiments, the housing 201 can have dimensions suitable for implantation through a trocar introducer or any other suitable implantation technique.
As illustrated, the electrodes 213 carried by the housing 201 are arranged so that all three electrodes 213 do not lie on a common axis. In such a configuration, the electrodes 213 can achieve a better signal vector as compared to electrodes that are all aligned along a single axis. This can be particularly useful in a sensor device 210 configured to be implanted at the neck while detecting electrical activity in the brain. In some embodiments, this electrode configuration also provides for improved cardiac ECG sensitivity by integrating 3 potential signal vectors.
In the example shown in
In operation, the electrodes 213 are used to sense electrical signals (e.g., EEG signals) which may be submuscular or subcutaneous. The sensed electrical signals may be stored in a memory of the sensor device 210, and signal data may be transmitted via a communications link to another device (e.g., external device 150 of
In the example shown in
In the example shown in
Proximal electrode 313a and distal electrode 313b are used to sense electrical signals (e.g., EEG signals) which may be submuscular or subcutaneous. Electrical signals may be stored in a memory of the sensor device 310, and signal data may be transmitted via integrated antenna 326 to another medical device, which may be another implantable device or an external device, such as external device 150 (
In the example shown in
In the example shown in
Example Methods
While conventional EEG electrodes are placed over the patient's scalp, the present technology advantageously enables recording of clinically useful brain activity data via electrodes positioned at the target region 401 at the rear of the patient's neck. This anatomical area is well suited both to implantation of a sensor device and to temporary placement of a sensor device over the patient's skin. In contrast, EEG electrodes positioned over the scalp are cumbersome, and implantation over the patient's skull is challenging and may introduce significant patient discomfort. As noted elsewhere herein, conventional EEG electrodes are typically positioned over the scalp to more readily achieve a suitable signal-to-noise ratio for detection of brain activity. However, by using certain digital signal processing and a special-purpose classifier algorithm, clinically useful brain activity data can be obtained using sensors disposed at the target region 401. Specifically, the electrodes can detect electrical activity that corresponds to brain activity in the P3, Pz, and/or P4 regions (see
While conventional approaches to stroke detection utilizing EEG have relied on data from a large number of EEG electrodes, the inventors have discovered that clinically useful stroke determinations may be made utilizing relatively few electrodes. In an experiment conducted by the inventors, data from a base set of 56 patients (26 stroke and 30 non-stroke) was used. EEG data was recorded between 1 and 22 hours post-event using a conventional EEG array with a sampling frequency of 500 Hz over a period of 3 minutes. The EEG data was detrended, then bandpass filtered (e.g., 6-40 Hz) to remove out-of-band noise, followed by re-referencing to Pz, wavelet denoising, and finally low-pass filtering below 25 Hz. With an EEG array of 16 contacts (with Pz serving as ground) and 14 power bins, a total of 224 features were extracted.
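For illustration, a hedged re-implementation of this preprocessing and feature-extraction chain might look like the following. The step ordering (detrending, 6-40 Hz band-pass filtering, re-referencing to Pz, wavelet denoising, low-pass filtering below 25 Hz) and the 16-channel by 14-bin layout yielding 224 features follow the description above; the specific filter designs, wavelet choice, thresholding rule, and equal-width bin edges are assumptions.

```python
# Sketch of the described preprocessing and feature extraction.
# Filter orders, the db4 wavelet, and the bin edges are assumptions.
import numpy as np
import pywt
from scipy.signal import butter, detrend, sosfiltfilt, welch

FS = 500.0  # Hz, per the described recordings

def wavelet_denoise(x, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients (universal threshold)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate from finest scale
    thresh = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

def preprocess(eeg, pz_index):
    """eeg: (n_channels, n_samples) array; returns the same shape."""
    x = detrend(eeg, axis=1)                             # remove slow drift
    sos = butter(4, [6.0, 40.0], btype="bandpass", fs=FS, output="sos")
    x = sosfiltfilt(sos, x, axis=1)                      # 6-40 Hz band-pass
    x = x - x[pz_index]                                  # re-reference every channel to Pz
    x = np.stack([wavelet_denoise(ch) for ch in x])      # wavelet denoising per channel
    sos_lp = butter(4, 25.0, btype="lowpass", fs=FS, output="sos")
    return sosfiltfilt(sos_lp, x, axis=1)                # final low-pass below 25 Hz

def band_power_features(eeg, n_bins=14, f_max=25.0):
    """14 power bins per channel: 16 channels x 14 bins = 224 features."""
    freqs, psd = welch(eeg, fs=FS, nperseg=1024, axis=1)
    edges = np.linspace(0.0, f_max, n_bins + 1)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].sum(axis=1)
             for lo, hi in zip(edges[:-1], edges[1:])]
    return np.column_stack(feats).ravel()                # flatten to one feature vector
```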
A gradient boosting algorithm was trained on the data set following feature extraction to generate a classifier algorithm. The classifier was tuned by paring down the features to only those related to the stroke/non-stroke condition. A sequential backward floating selection (SBFS) approach was employed, which iteratively removes individual features based on a classifier performance metric. The classifier was further tuned by adjusting the frequency bins. The result of this analysis was five features that effectively discriminate between stroke and non-stroke conditions: three frequency bins associated with the P3 electrode (5.5-7.5 Hz, 8-9.5 Hz, and 13.5-15 Hz) and two frequency bins associated with the P4 electrode (5.5-7.5 Hz and 13.5-15 Hz).
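A sketch of this training procedure is shown below, assuming scikit-learn's gradient boosting implementation and mlxtend's floating feature selector as stand-ins for whatever tooling was actually used; the random matrix is a placeholder for the real 56-patient feature set.

```python
# Hedged sketch: gradient boosting plus sequential backward floating
# selection (SBFS). Hyperparameters are assumptions, and the data here
# are placeholders, not the experimental recordings.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from mlxtend.feature_selection import SequentialFeatureSelector as SFS

rng = np.random.default_rng(0)
X = rng.standard_normal((56, 224))   # placeholder for the 56-patient x 224-feature matrix
y = np.array([1] * 26 + [0] * 30)    # 26 stroke, 30 non-stroke labels

clf = GradientBoostingClassifier(random_state=0)
sbfs = SFS(clf,
           k_features=5,      # the analysis converged on five discriminative features
           forward=False,     # backward elimination...
           floating=True,     # ...with conditional re-inclusion (SBFS)
           scoring="accuracy",
           cv=5)              # note: SBFS over 224 features is compute-intensive
sbfs = sbfs.fit(X, y)
selected = list(sbfs.k_feature_idx_)   # indices of the retained features

acc = cross_val_score(clf, X[:, selected], y, cv=5, scoring="accuracy").mean()
print(f"selected feature indices: {selected}; cross-validated accuracy: {acc:.2f}")
```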
The resulting classifier succeeded in making stroke/non-stroke determinations with an accuracy of approximately 85%.
The accuracy of such a classifier can be improved by training the algorithm on larger sets of data corresponding to stroke and non-stroke EEG readings. Additionally, other physiological parameters can be added to the classifier model (e.g., fall detection as determined using an accelerometer, particular heart rhythms, gender, age, medical history, etc.). Additionally, in some embodiments a classifier can be used to discriminate between ischemic and hemorrhagic strokes. Such discrimination can be particularly useful as the interventions may differ. For example, an ischemic stroke may be treated using thrombectomy, while a hemorrhagic stroke may be treated using surgery or another suitable technique.
As illustrated, the process 800 begins in block 802 with collecting EEG sensor data via electrodes disposed at or adjacent the back of the neck or base of the skull (e.g., the target region 401 shown in
The process 800 continues in block 804 with filtering the EEG sensor data to remove ECG artifacts. Conventionally, EEG data has been obtained via electrodes positioned over the scalp because the scalp is a relatively noise-free location for signal acquisition. Other anatomical locations, such as the back of the neck, have not been used, not because the EEG signal isn't present, but because of the noisier environment and band-overlap with other physiologic signals such as ECG. However, recent techniques for machine learning/adaptive neural network processing have enhanced the signal extraction capability (e.g., to filter out or reduce the contribution of ECG signals from the EEG signals). One such methodology is described in "ECG Artifact Removal of EEG signal using Adaptive Neural Network" as published in IEEE Xplore 27 May 2019, which is hereby incorporated by reference in its entirety. Similarly, electrical signals associated with muscle activity may also be filtered from the EEG sensor data to remove such artifacts.
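As a simplified stand-in for the cited adaptive-neural-network method, the sketch below implements a classic normalized least-mean-squares (NLMS) adaptive noise canceller: a reference ECG channel is adaptively filtered to estimate the cardiac contribution within the contaminated EEG, and the residual serves as the cleaned signal. This linear filter only illustrates the cancel-by-reference idea; it is not the cited algorithm.

```python
# NLMS adaptive noise canceller: a simpler, linear analogue of the cited
# adaptive-neural-network artifact removal. All parameters are illustrative.
import numpy as np

def nlms_cancel(contaminated_eeg, reference_ecg, n_taps=16, mu=0.5):
    """Return a cleaned EEG estimate by cancelling a reference ECG signal."""
    w = np.zeros(n_taps)                        # adaptive filter weights
    cleaned = np.zeros_like(contaminated_eeg)
    for i in range(n_taps, len(contaminated_eeg)):
        x = reference_ecg[i - n_taps:i][::-1]   # most recent reference samples first
        ecg_estimate = w @ x                    # filter's estimate of the ECG artifact
        error = contaminated_eeg[i] - ecg_estimate
        w += (mu / (x @ x + 1e-8)) * error * x  # normalized LMS weight update
        cleaned[i] = error                      # residual approximates artifact-free EEG
    return cleaned
```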
In block 806, a patient stroke indicator is provided. The patient stroke indicator can be, for example, a binary output of stroke condition/non-stroke condition, a probabilistic indication of stroke likelihood, or other output relating to the patient's condition and likelihood of having suffered a stroke. This stroke indicator can be calculated using a classifier model as described elsewhere herein. In addition to providing the patient stroke indicator, information or instructions can also be output to a patient or user. The information or instructions can be output via a display device (e.g., the display 151 of
In some embodiments, prior to, concurrently with, or after providing the stroke indicator in block 806, the process 800 can include triggering an automatic data transmission, for example of a stroke determination which can be output to the patient or another entity (e.g., a call center, emergency response personnel, etc.). A call center may contact the patient or a patient's designated contact to inquire as to the patient's status and/or to confirm a patient stroke. If the patient stroke is confirmed (or if the call center is unable to reach the patient), a 9-1-1 emergency call can be initiated, either manually by call center personnel or automatically.
In block 902, instructions are output to a patient to perform an action. For example, instructions may be output via external device 150 (
In block 904, EEG sensor data is collected while the patient performs the actions included in the instructions of block 902. In some embodiments, the EEG sensor data can be collected via electrodes disposed at or adjacent the back of the neck or base of the skull (e.g., the target region 401 shown in
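A minimal sketch of this prompt-synchronized collection (blocks 902 and 904) follows; the `external_device` and `sensor` interfaces, the list of actions, and the epoch length are all hypothetical placeholders.

```python
# Hypothetical sketch of prompt-synchronized active testing: the external
# device issues an instruction and the sensor records a time-locked epoch.
import time

ACTIONS = ["lift your left arm", "smile broadly", "repeat: 'the sky is blue'"]

def run_active_test(external_device, sensor, epoch_s=10.0):
    """Collect one EEG epoch per prompted action, time-locked to the prompt."""
    epochs = []
    for action in ACTIONS:
        external_device.show_prompt(action)          # block 902: instruct the patient
        t0 = time.time()
        eeg = sensor.record_eeg(duration_s=epoch_s)  # block 904: synchronized recording
        epochs.append({"action": action, "start": t0, "eeg": eeg})
    return epochs  # these labeled epochs feed the analysis of block 906
```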
In block 906, the sensor data is analyzed and, based on the analysis, the system can provide a patient stroke indicator. The analysis can include, for example, using a classifier algorithm as described elsewhere herein. The patient stroke indicator can be, for example, a binary output of stroke condition/non-stroke condition, a probabilistic indication of stroke likelihood, or other output relating to the patient's condition and likelihood of having suffered a stroke. In some embodiments, if a stroke is indicated, the system can output appropriate information or instructions via a display device (e.g., the display 116 of
The process 1000 continues in block 1004 with filtering the EEG sensor data to remove ECG artifacts as described elsewhere herein. In block 1006, a classification algorithm is applied. The classification algorithm can be, for example, an algorithm adapted from the use of artificial intelligence (e.g., machine learning, neural networks, etc.) as applied to patient stroke data, for example as described above with respect to
In addition to outputting the results, information or instructions can also be output to a patient or user. The information or instructions can be output via a display device (e.g., the display 151 of
In block 1120, the identified patterns can be integrated or otherwise combined and a stroke risk parameter can be calculated. The stroke risk can be based on the physiological data as well as other patient parameters (e.g., gender, age, history of stroke or heart conditions, etc.), and can include a probabilistic output of a patient's risk for stroke over a given time period. If, in block 1122, there is no risk of stroke identified (e.g., the stroke risk parameter falls below a pre-defined threshold), then no action is taken in block 1124. Optionally, a result of "no risk" or "low risk" can be output to the patient or other entity. If, in block 1122, a stroke risk is identified (e.g., the stroke risk parameter exceeds a pre-determined threshold), then an alert can be output in block 1126. Such an alert can be provided to the patient (e.g., via the external device 150), to a call center, a patient's medical team, or any other suitable entity.
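As an illustration of the logic of blocks 1120-1126, the sketch below combines a classifier score with patient covariates in a toy logistic model and raises an alert only when a pre-defined threshold is exceeded; the weights, covariates, and threshold are placeholders rather than values from this disclosure.

```python
# Toy risk-integration sketch: all coefficients and the threshold are
# illustrative placeholders, not fitted or disclosed values.
import math

RISK_THRESHOLD = 0.5  # pre-defined alert threshold (placeholder value)

def stroke_risk(eeg_score, fall_detected, age, prior_stroke):
    """Logistic combination of physiological and demographic inputs.

    eeg_score is assumed to be a classifier output in [0, 1].
    """
    z = (2.0 * eeg_score
         + 0.8 * float(fall_detected)
         + 0.02 * (age - 60)
         + 1.0 * float(prior_stroke)
         - 1.5)
    return 1.0 / (1.0 + math.exp(-z))          # block 1120: integrated risk parameter

def maybe_alert(risk, notify):
    """Blocks 1122/1126: alert only when the risk parameter exceeds the threshold."""
    if risk >= RISK_THRESHOLD:
        notify(f"Stroke risk {risk:.0%} exceeds threshold; alerting care team.")
```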
Conclusion
This disclosure is not intended to be exhaustive or to limit the present technology to the precise forms disclosed herein. Although specific embodiments are disclosed herein for illustrative purposes, various equivalent modifications are possible without deviating from the present technology, as those of ordinary skill in the relevant art will recognize. In some cases, well-known structures and functions have not been shown and/or described in detail to avoid unnecessarily obscuring the description of the embodiments of the present technology. Although steps of methods may be presented herein in a particular order, in alternative embodiments the steps may have another suitable order. Similarly, certain aspects of the present technology disclosed in the context of particular embodiments can be combined or eliminated in other embodiments. Furthermore, while advantages associated with certain embodiments may have been disclosed in the context of those embodiments, other embodiments can also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages or other advantages disclosed herein to fall within the scope of the present technology. Accordingly, this disclosure and associated technology can encompass other embodiments not expressly shown and/or described herein.
Unless otherwise indicated, all numerical values used in the specification and claims, are to be understood as being modified in all instances by the term “about.” Accordingly, unless indicated to the contrary, the numerical parameters set forth in the following specification and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by the present technology. At the very least, and not as an attempt to limit the application of the doctrine of equivalents to the scope of the claims, each numerical parameter should at least be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Additionally, all ranges disclosed herein are to be understood to encompass any and all subranges subsumed therein. For example, a range of “1 to 10” includes any and all subranges between (and including) the minimum value of 1 and the maximum value of 10, i.e., any and all subranges having a minimum value of equal to or greater than 1 and a maximum value of equal to or less than 10, e.g., 5.5 to 10.
Throughout this disclosure, the singular terms “a,” “an,” and “the” include plural referents unless the context clearly indicates otherwise. Similarly, unless the word “or” is expressly limited to mean only a single item exclusive from the other items in reference to a list of two or more items, then the use of “or” in such a list is to be interpreted as including (a) any single item in the list, (b) all of the items in the list, or (c) any combination of the items in the list. Additionally, the terms “comprising,” and the like are used throughout this disclosure to mean including at least the recited feature(s) such that any greater number of the same feature(s) and/or one or more additional types of features are not precluded. Directional terms, such as “upper,” “lower,” “front,” “back,” “vertical,” and “horizontal,” may be used herein to express and clarify the relationship between various elements. It should be understood that such terms do not denote absolute orientation. Reference herein to “one embodiment,” “an embodiment,” or similar formulations means that a particular feature, structure, operation, or characteristic described in connection with the embodiment can be included in at least one embodiment of the present technology. Thus, the appearances of such phrases or formulations herein are not necessarily all referring to the same embodiment. Furthermore, various particular features, structures, operations, or characteristics may be combined in any suitable manner in one or more embodiments. For example, a master-slave configuration could be possible leveraging the well-established pectoral implant location to derive cardiac ECG information and the back-of-head/neck implant location to derive neuro EEG information. These slave devices can be converged into a master device that could be an external smartwatch or smartphone to provide stroke detection capability.
This application claims the benefit of U.S. Provisional Application No. 62/977,503, filed Feb. 17, 2020, the entire content of which is incorporated herein by reference.
Other Publications

Routray et al., "ECG Artifact Removal of EEG signal using Adaptive Neural Network," 2018 IEEE 13th International Conference on Industrial and Information Systems (ICIIS), May 27, 2019, 4 pp.
Ponciano et al., "Experimental Study for Determining the Parameters Required for Detecting ECG and EEG Related Diseases During the Timed-Up and Go Test," Aug. 27, 2020, 21 pp.
Giri et al., "Ischemic Stroke Identification Based on EEG and EOG using 1D Convolutional Neural Network and Batch Normalization," ICACSIS 2016, IEEE, Oct. 15, 2016, 8 pp.
International Search Report and Written Opinion of International Application No. PCT/US2021/018394, dated Jun. 1, 2021, 15 pp.
Response to Office Action dated Sep. 9, 2022 from U.S. Appl. No. 17/176,504, filed Dec. 9, 2022, 11 pp.
Office Action from U.S. Appl. No. 17/176,504, dated Sep. 9, 2022, 15 pp.
Final Office Action from U.S. Appl. No. 17/176,504, dated Apr. 4, 2023, 9 pp.
Response to Final Office Action dated Apr. 4, 2023 from U.S. Appl. No. 17/176,504, filed May 26, 2023, 5 pp.
Notice of Allowance from U.S. Appl. No. 17/176,504, dated Jun. 22, 2023, 8 pp.
Hu et al., "Intelligent Sensor Networks: The Integration of Sensor Networks, Signal Processing and Machine Learning," CRC Press, Boca Raton, Mar. 20, 2013, 674 pp., https://doi.org/10.1201/b14300.
Huang et al., "Kernel Based Algorithms for Mining Huge Data Sets: Supervised, Semi-Supervised, and Unsupervised Learning," Studies in Computational Intelligence, vol. 17, Springer, The Netherlands, Jan. 2006, 266 pp., doi:10.1007/3-540-31689-2.
Notice of Allowance from U.S. Appl. No. 17/176,504, dated Sep. 29, 2023, 8 pp.
Communication pursuant to Article 94(3) EPC from counterpart European Application No. 21711121.0, dated Nov. 3, 2023, 5 pp.
Office Action from U.S. Appl. No. 17/176,504, dated Feb. 1, 2024, 13 pp.
Response to Office Action dated Feb. 1, 2024 from U.S. Appl. No. 17/176,504, filed Apr. 17, 2024, 11 pp.
Final Office Action from U.S. Appl. No. 17/176,504, dated Sep. 11, 2024, 16 pp.
First Office Action and Search Report, and translation thereof, from counterpart Chinese Application No. 202180014972.4, dated Jul. 31, 2024, 15 pp.
Notice of Allowance from U.S. Appl. No. 17/176,504, dated Dec. 4, 2024, 7 pp.
Response to Final Office Action dated Sep. 11, 2024 from U.S. Appl. No. 17/176,504, filed Nov. 11, 2024, 12 pp.
Publication Number: US 2021/0251578 A1, published Aug. 2021 (US).

Related U.S. Application Data: Provisional Application No. 62/977,503, filed Feb. 2020 (US).