Automatic therapy adjustments

Information

  • Patent Grant
    10,130,815
  • Date Filed
    Monday, February 11, 2013
  • Date Issued
    Tuesday, November 20, 2018
Abstract
A medical device detects a previously defined event, and controls delivery of therapy to a patient according to therapy information associated with the previously defined event. In exemplary embodiments, the medical device enters a learning mode in response to a command received from a user, e.g., the patient or a clinician. In such embodiments, the medical device defines the event, collects the therapy information, and associates the therapy information with the defined event while operating in the learning mode. In some embodiments, the medical device defines the event based on the output of a sensor that indicates a physiological parameter of the patient during the learning mode. The sensor may be an accelerometer, which generates an output that reflects motion and/or posture of the patient. The medical device may collect therapy information by recording therapy changes made by the user during the learning mode.
Description
TECHNICAL FIELD

The invention relates to medical devices, and more particularly, to medical devices used for chronic therapy provision.


BACKGROUND

A variety of types of medical devices are used for chronic, e.g., long-term, provision of therapy to patients. As examples, pulse generators are used for chronic provision of cardiac pacing and neurostimulation therapies, and pumps are used for chronic delivery of therapeutic agents, such as drugs. Typically, such devices provide therapy continuously or periodically according to parameters, e.g., a program, specified by a clinician.


In some cases, the patient is allowed to activate and/or modify the therapy. For example, the symptoms, e.g., the intensity of pain, of patients who receive spinal cord stimulation (SCS) therapy may vary over time based on the activity level or posture of the patient, the specific activity undertaken by the patient, or the like. For this reason, a patient who receives SCS therapy from an implantable medical device (IMD), e.g., an implantable pulse generator, is often given a patient programming device that communicates with his IMD via device telemetry, and allows the patient to activate and/or adjust the intensity of the delivered neurostimulation.


SUMMARY

In general, the invention is directed to techniques for providing automatic adjustments to a therapy. A medical device, such as an implanted medical device (IMD) for delivering a therapy or a programming device, automatically adjusts delivery of the therapy in response to detecting a previously defined event. By automatically adjusting therapy in response to detecting a previously defined event, the medical device can automatically provide appropriate therapy to address changes in the symptoms of a patient, and/or changes in the efficacy or side effects of the therapy associated with the event. The medical device may deliver neurostimulation therapy, and an event may be an activity and/or posture undertaken by the patient, such as running or sitting in a chair, which will likely impact the type or level of symptoms and/or the paresthesia experienced by the patient.


In exemplary embodiments, the medical device enters a learning mode in response to a command received from a user, e.g., the patient. In such embodiments, the medical device defines the event, collects the therapy information, and associates the therapy information with the defined event while operating in the learning mode. In some embodiments, the medical device defines the event based on an indication of the event received from the user. In other embodiments, the medical device defines the event based on the output of a sensor that indicates the activity, posture, or a physiological parameter of the patient during the learning mode. The sensor may be an accelerometer, which generates an output that reflects motion and/or posture of the patient. The medical device may collect therapy information by recording values of one or more therapy parameters, such as pulse amplitude, width and rate, and/or changes made to the parameters by the user during the learning mode.


When a patient undertakes certain activities and/or postures, the patient may experience an uncomfortable increase in the intensity of the neurostimulation delivered by a medical device. This phenomenon is referred to as a “jolt.” Some of the events detected by the medical device may correspond to a jolt. In response to detecting these events, the medical device may suspend delivery of neurostimulation therapy for a period of time, which may advantageously allow the medical device to avoid providing uncomfortable stimulation to the patient.


In one embodiment, the invention is directed to a method in which a command to enter a learning mode is received from a user. An event is defined, and therapy information is associated with the defined event, in response to the command. The defined event is subsequently detected, and therapy is provided to a patient via a medical device according to the therapy information in response to the detection.


In another embodiment, the invention is directed to a medical device that comprises a memory and a processor. The processor receives a command to enter a learning mode from a user, and defines an event and associates therapy information with the defined event within the memory in response to the command. The processor subsequently detects the event, and controls delivery of therapy to a patient according to the therapy information in response to the detection.


In another embodiment, the invention is directed to a computer-readable medium containing instructions. The instructions cause a programmable processor to receive a command from a user to enter a learning mode, and define an event and associate therapy information with the defined event in response to the command. The computer-readable medium further comprises instructions that cause a programmable processor to subsequently detect the defined event, and control delivery of therapy to a patient via a medical device according to the therapy information in response to the detection.


The invention may provide advantages. For example, by automatically adjusting therapy in response to a detected event, a medical device can provide therapy that better addresses changes in the symptoms of a patient and/or level of efficacy or side effects of the therapy associated with an activity undertaken by the patient. The medical device may automatically provide the appropriate therapy for frequently occurring events, e.g., activities that the patient frequently undertakes, allowing the patient to avoid having to manually adjust the therapy each time the event occurs. Manual adjustment of stimulation parameters can be tedious, requiring the patient to, for example, depress one or more keys of a keypad of a patient programmer multiple times during the event to maintain adequate symptom control. Instead, according to the invention, the patient may perform such adjustments a single time during a learning mode, and the medical device may automatically provide the adjustments during subsequent occurrences of the event.


The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the invention will be apparent from the description and drawings, and from the claims.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is a conceptual diagram illustrating an exemplary system that facilitates automatic discrete therapy adjustment according to the invention.



FIG. 2 is a block diagram illustrating an example medical device that provides therapy and automatically makes discrete adjustments to the therapy.



FIG. 3 is a block diagram illustrating an exemplary configuration of a memory of the medical device of FIG. 2.



FIG. 4 is a block diagram illustrating an example programming device that allows a user to communicate with the medical device of FIG. 2.



FIG. 5 is a flow diagram illustrating an exemplary operation of the medical device of FIG. 2 according to a learning mode.



FIG. 6 is a flow diagram illustrating another exemplary operation of the medical device of FIG. 2 according to a learning mode.



FIG. 7 is a flow diagram illustrating an exemplary operation of the medical device of FIG. 2 to provide discrete therapy adjustments according to the invention.



FIG. 8 is a timing diagram illustrating display of diagnostic information including learned events according to the invention.





DETAILED DESCRIPTION


FIG. 1 is a conceptual diagram illustrating an exemplary system 10 that facilitates automatic therapy adjustment according to the invention. In the illustrated example, system 10 includes an implantable medical device (IMD) 12, which is implanted within a patient 14, and delivers neurostimulation therapy to patient 14. In exemplary embodiments, IMD 12 takes the form of an implantable pulse generator, and delivers neurostimulation therapy to patient 14 in the form of electrical pulses.


IMD 12 delivers neurostimulation therapy to patient 14 via leads 16A and 16B (collectively “leads 16”). Leads 16 may, as shown in FIG. 1, be implanted proximate to the spinal cord 18 of patient 14, and IMD 12 may deliver spinal cord stimulation (SCS) therapy to patient 14 in order to, for example, reduce pain experienced by patient 14. However, the invention is not limited to the configuration of leads 16 shown in FIG. 1 or the delivery of SCS therapy. For example, one or more leads 16 may extend from IMD 12 to the brain (not shown) of patient 14, and IMD 12 may deliver deep brain stimulation (DBS) therapy to patient 14 to, for example, treat tremor or epilepsy. As further examples, one or more leads 16 may be implanted proximate to the pelvic nerves (not shown) or stomach (not shown), and IMD 12 may deliver neurostimulation therapy to treat incontinence or gastroparesis.


In exemplary embodiments, IMD 12 delivers therapy to patient 14 according to a program. A program includes one or more parameters that define an aspect of the therapy delivered by the medical device according to that program. For example, a program that controls delivery of neurostimulation by IMD 12 may define a voltage or current pulse amplitude, a pulse width, and a pulse rate for stimulation pulses delivered by IMD 12 according to that program. Further, each of leads 16 includes electrodes (not shown in FIG. 1), and the parameters for a program that controls delivery of neurostimulation therapy by IMD 12 may include information identifying which electrodes have been selected for delivery of pulses according to the program, and the polarities of the selected electrodes.
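
For illustration only, the parameter set described above can be pictured as a simple record. The field names, units, and electrode labels in the following sketch are assumptions of the editor, not a data format specified by the patent.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class StimulationProgram:
    """Hypothetical record of the program parameters described above."""
    amplitude_v: float          # voltage (or current) pulse amplitude
    pulse_width_us: int         # pulse width, microseconds
    pulse_rate_hz: float        # pulse rate, pulses per second
    electrode_polarity: Dict[str, str] = field(default_factory=dict)  # e.g. {"30A": "cathode", "30E": "anode"}

# Example: a program selecting two electrodes with opposite polarities.
program = StimulationProgram(
    amplitude_v=3.5,
    pulse_width_us=210,
    pulse_rate_hz=60.0,
    electrode_polarity={"30A": "cathode", "30E": "anode"},
)
print(program)
```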


In the illustrated example, system 10 also includes a programming device 20, which is a medical device, and may, as shown in FIG. 1, be a handheld computing device. Programming device 20 allows a user to interact with IMD 12. Programming device 20 may, for example, communicate via wireless communication with IMD 12 using RF telemetry techniques known in the art.


Programming device 20 may, as shown in FIG. 1, include a display 22 and a keypad 24 to allow the user to interact with programming device 20. In some embodiments, display 22 may be a touch screen display, and the user may interact with programming device 20 via display 22. The user may also interact with programming device 20 using peripheral pointing devices, such as a stylus or mouse. Keypad 24 may take the form of an alphanumeric keypad or a reduced set of keys associated with particular functions.


In exemplary embodiments, programming device 20 is a patient programmer used by patient 14 to control the delivery of neurostimulation therapy by IMD 12. Patient 14 may use programming device 20 to activate or deactivate neurostimulation therapy. Patient 14 may also use programming device 20 to adjust one or more program parameters, e.g., adjust the amplitude, width, or rate of delivered stimulation pulses. Where more than one program is available to IMD 12 for delivery of neurostimulation to patient 14, patient 14 may use programming device 20 to select from among the available programs. The programs available for selection by patient 14 may be stored in either of IMD 12 and programming device 20.


As will be described in greater detail below, one or both of IMD 12 and programming device 20 provide automatic adjustment of the therapy delivered by IMD 12 according to the invention. Specifically, one of IMD 12 and programming device 20 detects a previously defined event, and the delivery of therapy by IMD 12 is automatically adjusted according to therapy information stored in association with the defined event. In exemplary embodiments, the one of IMD 12 and programming device 20 may make automatic adjustments to the therapy over a period of time in response to detection of the previously defined event, e.g., provide a series of therapy adjustments defined by the therapy information associated with the event. By automatically adjusting therapy in response to a detected event, system 10 can provide therapy that better addresses changes in the symptoms of patient 14 associated with the event.


For ease of description, the provision of automatic therapy adjustment will be described hereinafter primarily with reference to embodiments in which IMD 12 provides automatic therapy adjustments. However, it is understood that both of IMD 12 and programming device 20 are medical devices capable of providing automatic therapy adjustments according to the invention.


In exemplary embodiments, IMD 12 provides a learning mode. IMD 12 may enter the learning mode in response to a command received from a user. For example, patient 14 may direct IMD 12 to enter the learning mode via keypad 24 of patient programmer 20.


When operating in the learning mode, IMD 12 defines events and associates therapy information with the events. In some embodiments, IMD 12 defines the event based on the indication of the event to IMD 12 by a user. In such embodiments, IMD 12 later detects the event by receiving the indication from the user, and automatically adjusts therapy according to information stored in association with that indication, e.g., with the event.


For example, patient 14 may indicate the occurrence of an event to IMD 12 via keypad 24 of patient programmer 20. In some embodiments, a particular key of keypad 24 is associated with the event. The event may correspond to an activity undertaken by patient 14, such as running, golfing, taking medication, sleeping, or a particular activity related to an occupation of patient 14. A first time patient 14 undertakes the activity, the activity, e.g., event, may be associated with a key of keypad 24. Subsequent times patient 14 undertakes the activity, patient 14 may press the key to cause IMD 12 to provide therapy adjustment according to therapy information associated with depression of the key.


In other embodiments, IMD 12 defines the event based on the output of a sensor (not shown in FIG. 1). IMD 12 may monitor the sensor output in response to the command to enter the learning mode received from the user, e.g., patient 14. After the event is defined, IMD 12 may monitor the output of the sensor, and, if the event is subsequently detected, provide automatic therapy adjustment according to information stored in association with the event. For example, IMD 12 may record the sensor output for a period during the learning mode to define the event, and, when no longer operating in the learning mode, apply digital signal and/or pattern recognition analysis techniques to the sensor output to automatically identify subsequent occurrences of the event based on comparison to the recorded exemplar.
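
One plausible way to realize the comparison described above is to correlate a window of the live sensor signal against the recorded exemplar. The sketch below is a minimal illustration under assumed names and an assumed threshold; the patent does not prescribe a particular signal-processing algorithm.

```python
import numpy as np

def matches_exemplar(window: np.ndarray, exemplar: np.ndarray, threshold: float = 0.8) -> bool:
    """Return True if the current sensor window resembles the recorded exemplar.

    Uses a normalized correlation coefficient; 'threshold' is an assumed tuning value.
    """
    if len(window) != len(exemplar):
        raise ValueError("window and exemplar must be the same length")
    w = (window - window.mean()) / (window.std() + 1e-9)
    e = (exemplar - exemplar.mean()) / (exemplar.std() + 1e-9)
    return float(np.dot(w, e) / len(w)) >= threshold

# Example: a noisy copy of the exemplar is recognized as the same event.
rng = np.random.default_rng(0)
exemplar = np.sin(np.linspace(0, 4 * np.pi, 200))        # recorded during the learning mode
live = exemplar + 0.1 * rng.standard_normal(200)         # later sensor output
print(matches_exemplar(live, exemplar))                  # True
```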


The output of the sensor may reflect motion, posture, and/or one or more physiological parameters of patient 14. Consequently, events defined by IMD 12 based on the sensor output may correspond to an activity undertaken by patient 14. For example, patient 14 may direct IMD 12 to enter the learning mode via patient programmer 20 when patient 14 is about to undertake an activity, such as running. IMD 12 may record the output of the sensor in response to the command, and, when no longer in the learning mode, use the recorded exemplar to detect when patient 14 is running so as to automatically provide an appropriate therapy adjustment according to therapy information stored in association with the exemplar.


IMD 12 may associate therapy information with the defined event while operating in the learning mode, and provide therapy, e.g., automatically adjust the therapy, according to the therapy information in response to subsequent detection of the defined event. The therapy information may be the values of one or more parameters, e.g., pulse amplitude, pulse width, or pulse rate, recorded by IMD 12 upon entering, or at some point after entering, the learning mode. The therapy information may be a change to a parameter made by a user while IMD 12 is operating in the learning mode. In exemplary embodiments, IMD 12 records a series of changes made to parameters by the user over a period of time while IMD 12 is operating in the learning mode.
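
The series of parameter changes made over a period of time could be captured as time-stamped adjustments. The following sketch is a hypothetical illustration; the class and field names are assumptions, not part of the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TherapyChange:
    offset_s: float      # seconds after the learning mode (or the event) began
    parameter: str       # e.g. "amplitude_v", "pulse_width_us", "pulse_rate_hz"
    new_value: float

class LearnedTherapy:
    """Hypothetical container for therapy information collected during the learning mode."""

    def __init__(self, initial_values: dict):
        self.initial_values = dict(initial_values)   # parameter values when learning began
        self.changes: List[TherapyChange] = []

    def record_change(self, offset_s: float, parameter: str, new_value: float) -> None:
        self.changes.append(TherapyChange(offset_s, parameter, new_value))

# Example: the patient raises the amplitude twice while running during the learning mode.
learned = LearnedTherapy({"amplitude_v": 3.0, "pulse_width_us": 200, "pulse_rate_hz": 50})
learned.record_change(30.0, "amplitude_v", 3.4)
learned.record_change(90.0, "amplitude_v", 3.8)
print(learned.changes)
```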


For example, patient 14 may direct IMD 12 to enter the learning mode so that IMD 12 will learn the appropriate adjustment or adjustments to make to the stimulation parameters while patient 14 is running. Patient 14 may indicate the occurrence of the event to IMD 12, e.g., may associate a key of keypad 24 with the activity of running, or may simply begin running and allow IMD 12 to record an exemplar of the sensor output while patient 14 is running. In any case, while patient 14 is running during the learning mode, patient 14 uses programming device 20, e.g., keypad 24, to change one or more stimulation parameters in an attempt to maintain adequate symptom control during the activity. IMD 12 may record the values of the parameters when patient 14 indicates satisfaction, or the one or more changes made by patient 14 over a period of time while running. IMD 12 stores the values or a recording of the changes over the time period in association with the event, and, when no longer operating in the learning mode, delivers therapy according to the therapy information upon subsequently detecting that patient 14 is running.


By associating therapy information with defined events, IMD 12 may automatically provide appropriate therapy to patient 14 for frequently occurring events, e.g., activities that patient 14 frequently undertakes. By providing therapy adjustments automatically, IMD 12 may allow patient 14 to avoid having to manually adjust the therapy each time the event occurs. Such manual adjustment of stimulation parameters can be tedious, requiring patient 14 to, for example, depress one or more keys of keypad 24 multiple times during the event to maintain adequate symptom control. Instead, according to the invention, patient 14 may perform such adjustments a single time during the learning mode, and IMD 12 may automatically provide the adjustments during subsequent occurrences of the event.



FIG. 2 is a block diagram illustrating IMD 12 in greater detail. IMD 12 may deliver neurostimulation therapy via electrodes 30A-D of lead 16A and electrodes 30E-H of lead 16B (collectively “electrodes 30”). Electrodes 30 may be ring electrodes. The configuration, type and number of electrodes 30 illustrated in FIG. 2 are merely exemplary.


Electrodes 30 are electrically coupled to a therapy delivery circuit 32 via leads 16. Therapy delivery circuit 32 may, for example, include an output pulse generator coupled to a power source such as a battery. Therapy delivery circuit 32 may deliver electrical pulses to patient 14 via at least some of electrodes 30 under the control of a processor 34.


Processor 34 may control therapy delivery circuit 32 to deliver neurostimulation therapy according to a selected program. Specifically, processor 34 may control circuit 32 to deliver electrical pulses with the amplitudes and widths, and at the rates specified by the program. Processor 34 may also control therapy delivery circuit 32 to deliver the pulses via a selected subset of electrodes 30 with selected polarities, as specified by the program.


Processor 34 may also provide a learning mode of IMD 12 as described above. Specifically, processor 34 may receive commands from a user to enter the learning mode, may define an event during the learning mode, and may associate therapy information with the defined event within memory 36, as described above. When processor 34 is no longer operating in the learning mode, processor 34 and/or monitor 42 may detect previously defined events, and control therapy delivery circuit 32 to deliver therapy via at least some of electrodes 30 as indicated by the associated therapy information. Specifically, processor 34 may control therapy delivery circuit 32 to deliver stimulation pulses with the amplitude, width, and rate indicated by the therapy information, and, in some embodiments, may control therapy delivery circuit 32 to adjust the amplitude, width, and/or rate over time as indicated by the therapy information.


IMD 12 also includes a telemetry circuit 38 that allows processor 34 to communicate with programming device 20. Processor 34 may receive program selections, commands to enter a learning mode, indications of events, and adjustments to therapy made by a user, e.g., patient 14, using programming device 20 via telemetry circuit 38. In some embodiments, as will be described in greater detail below, processor 34 communicates with a clinician programmer to provide diagnostic information stored in memory 36 to a clinician via telemetry circuit 38. Telemetry circuit 38 may correspond to any telemetry circuit known in the implantable medical device arts.


In exemplary embodiments, as described above, IMD 12 includes a sensor 40, and processor 34 defines events based on the output of sensor 40. Sensor 40 is a sensor that generates an output based on motion, posture, and/or one or more physiological parameters of patient 14. In exemplary embodiments, sensor 40 is an accelerometer, such as a piezoresistive and/or micro-electro-mechanical accelerometer.


In some embodiments, IMD 12 includes an activity/posture monitor 42 that processes the analog output of sensor 40 to provide digital activity and/or posture information to processor 34. For example, where sensor 40 comprises a piezoresistive accelerometer, monitor 42 may process the raw signal provided by sensor 40 to provide activity counts to processor 34. In some embodiments, IMD 12 includes multiple sensors oriented along various axes, or sensor 40 comprises a single multi-axis, e.g., three-axis, accelerometer. In such embodiments, monitor 42 may process the signals provided by the one or more sensors 40 to provide velocity of motion information for each direction to processor 34.
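
As a rough illustration of how a monitor such as monitor 42 might derive activity counts from a raw accelerometer signal, the sketch below counts threshold crossings per epoch. The threshold and epoch length are assumed tuning values, not taken from the patent.

```python
import numpy as np

def activity_counts(accel: np.ndarray, sample_rate_hz: float,
                    epoch_s: float = 2.0, threshold_g: float = 0.1) -> list:
    """Count samples whose magnitude exceeds a threshold in each epoch (illustrative only)."""
    samples_per_epoch = int(epoch_s * sample_rate_hz)
    counts = []
    for start in range(0, len(accel) - samples_per_epoch + 1, samples_per_epoch):
        epoch = accel[start:start + samples_per_epoch]
        counts.append(int(np.sum(np.abs(epoch) > threshold_g)))
    return counts

# Example: 10 s of a 1 Hz, 0.3 g oscillation sampled at 50 Hz.
t = np.arange(0, 10, 1 / 50)
signal = 0.3 * np.sin(2 * np.pi * 1.0 * t)
print(activity_counts(signal, sample_rate_hz=50))
```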


In exemplary embodiments, the one or more sensors 40 are housed within a housing (not shown) of IMD 12. However, the invention is not so limited. In some embodiments, one or more sensors 40 are coupled to monitor 42 housed within IMD 12 via additional leads 16 (not shown). Such sensors may be located anywhere within patient 14. In some embodiments, IMD 12 may include multiple accelerometer sensors 40 located at various positions within patient 14 or on the external surface of patient 14, and processor 34 may receive more detailed information about the posture of and activity undertaken by patient 14. For example, accelerometer sensors 40 may be located within the torso and at a position within a limb, e.g., a leg, of patient 14.


Sensors 40 may be coupled to a single monitor 42, or IMD 12 may include multiple monitors 42 coupled to one or more sensors 40. Further, the invention is not limited to embodiments of IMD 12 that include a monitor 42. Rather, sensors 40 may be coupled directly to processor 34, which may include an analog-to-digital converter, and perform the functions attributed to monitor 42. In some embodiments, sensors located external to patient 14 may communicate wirelessly with processor 34, either directly or via programming device 20. In some embodiments, one or more sensors 40 may be included as part of or coupled to programming device 20.


Moreover, the invention is not limited to embodiments where sensors 40 are accelerometers. In some embodiments, one or more sensors 40 may take the form of, for example, a thermistor, a pressure transducer, or electrodes to detect thoracic impedance or an electrogram. Such sensors 40 may be appropriately positioned within or on an external surface of patient 14 to measure a physiological parameter of patient 14, such as a skin temperature, an arterial or intracardiac pressure, a respiration rate, a heart rate, or a Q-T interval of patient 14. In such embodiments, one or more monitor circuits 42 may provide appropriate circuitry to process the signals generated by such sensors, and to provide values of the physiological parameter to processor 34.


Processor 34 may include a microprocessor, a controller, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), discrete logic circuitry, or the like. Memory 36 may include program instructions that, when executed by processor 34, cause IMD 12 to perform the functions ascribed to IMD 12 herein. Memory 36 may include any volatile, non-volatile, magnetic, optical, or electrical media, such as a random access memory (RAM), read-only memory (ROM), non-volatile RAM (NVRAM), electrically-erasable programmable ROM (EEPROM), flash memory, and the like.



FIG. 3 is a block diagram illustrating an exemplary configuration of memory 36 of IMD 12. In some embodiments, memory 36 stores the one or more programs 50 used by processor 34 (FIG. 2) to control delivery of stimulation by therapy delivery circuit 32 (FIG. 2). Processor 34 may receive the programs from a clinician via a clinician programming device and telemetry circuit 38 (FIG. 2), and store the programs in memory 36. In other embodiments, programs 50 are stored within a memory of programming device 20, and provided to processor 34 via telemetry circuit 38 as needed.


Memory 36 stores events 52 defined by processor 34 during operation in the learning mode, and learned therapies 54, i.e., the therapy information collected during operation in the learning mode. As described above, an event 52 may be information describing an event indication received from a user, e.g., patient 14 (FIG. 1), during the learning mode. For example, an event 52 may indicate a signal received via telemetry circuit 38 when patient 14 presses a key of keypad 24 (FIG. 1) that patient 14 has associated with an activity undertaken by patient 14.


In some embodiments, as described above, processor 34 defines events 52 based on the output of one or more sensors 40. Processor 34 may store one or more samples of the output of sensor 40 and/or monitor 42 collected while operating in the learning mode as an event, or one or more results of an analysis of such samples. For example, processor 34 may store information related to the detection of features within the one or more samples, such as peaks, zero-crossings, or the like, or the results of a Fourier or wavelet analysis of the one or more samples as a defined event 52.
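
The features mentioned here (peaks, zero-crossings, spectral content) could be reduced to a compact signature for storage as an event 52. The sketch below is an assumed illustration using a zero-crossing count and the dominant FFT frequency; it is not a specification of processor 34.

```python
import numpy as np

def event_signature(sample: np.ndarray, sample_rate_hz: float) -> dict:
    """Summarize a recorded sensor sample as a few features (illustrative only)."""
    signs = np.signbit(sample - sample.mean()).astype(int)
    zero_crossings = int(np.sum(np.abs(np.diff(signs)) > 0))
    spectrum = np.abs(np.fft.rfft(sample - sample.mean()))
    freqs = np.fft.rfftfreq(len(sample), d=1 / sample_rate_hz)
    dominant_hz = float(freqs[np.argmax(spectrum)])
    return {"zero_crossings": zero_crossings,
            "dominant_hz": dominant_hz,
            "peak_amplitude": float(np.max(np.abs(sample)))}

# Example: a 2 Hz motion sampled at 50 Hz for 4 s.
t = np.arange(0, 4, 1 / 50)
print(event_signature(np.sin(2 * np.pi * 2.0 * t), sample_rate_hz=50))
```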


As described above, learned therapies 54 comprise information describing values of stimulation parameters and/or information describing one or more changes to parameters made by a user while processor 34 is operating in the learning mode. In exemplary embodiments, a learned therapy 54 comprises information describing initial parameter values and changes to be made to some or all of the parameter values over a period of time. In such embodiments, the learned therapy may include time values associated with parameter values, so that processor 34 may direct changes to parameter values at appropriate times. Memory 36 maintains associations between events 52 and corresponding learned therapies 54.
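
The association maintained in memory 36 between events 52 and learned therapies 54 can be pictured as a keyed mapping. The sketch below is a hypothetical illustration of that association only; the keys, codes, and layout are assumptions, not a description of the device's actual memory organization.

```python
# Hypothetical in-memory association between defined events and learned therapies.
memory = {
    "events": {
        "running": {"type": "sensor_exemplar", "signature": {"dominant_hz": 2.4}},
        "key_3":   {"type": "user_indication", "telemetry_code": 0x03},
    },
    "learned_therapies": {
        "running": [  # (time offset in seconds, parameter, value)
            (0.0,  "amplitude_v", 3.0),
            (30.0, "amplitude_v", 3.4),
            (90.0, "amplitude_v", 3.8),
        ],
        "key_3": [(0.0, "pulse_rate_hz", 40.0)],
    },
}

def learned_therapy_for(event_id: str):
    """Look up the therapy information associated with a detected event."""
    return memory["learned_therapies"].get(event_id)

print(learned_therapy_for("running"))
```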


Processor 34 may also collect diagnostic information 56 and store diagnostic information 56 within memory 36 for future retrieval by a clinician. Diagnostic information 56 may, for example, include selected recordings of the output of sensor 40 and/or of therapy changes made by patient 14. In exemplary embodiments, diagnostic information 56 includes information identifying the time at which defined events occurred, either during operation in a learning mode or as subsequently detected by processor 34. Diagnostic information 56 may include other information or events indicated by patient 14 outside of learning mode using programming device 20, such as changes in symptoms, taking medication, or other activities undertaken by patient 14 for which patient 14 does not wish IMD 12 to learn a therapy. A clinician programming device (not shown in FIGS.) may present diagnostic information 56 to a clinician in a variety of forms, such as timing diagrams, or a graph resulting from statistical analysis of diagnostic information 56, e.g., a bar graph.



FIG. 4 is a block diagram further illustrating programming device 20. As indicated above, in exemplary embodiments programming device 20 takes the form of a patient programming device used by patient 14 to control delivery of therapy by IMD 12. Patient 14 may interact with a processor 60 via a user interface 62 in order to control delivery of neurostimulation therapy, direct IMD 12 to enter a learning mode, indicate events and make therapy changes, as described herein. User interface 62 may include display 22 and keypad 24, and may also include a touch screen or peripheral pointing devices as described above. Processor 60 may also provide a graphical user interface (GUI) to facilitate interaction with patient 14. Processor 60 may include a microprocessor, a controller, a DSP, an ASIC, an FPGA, discrete logic circuitry, or the like.


Programming device 20 also includes a telemetry circuit 64 that allows processor 60 to communicate with IMD 12. In exemplary embodiments, processor 60 communicates commands, indications, and therapy changes made by patient 14 via user interface 62 to IMD 12 via telemetry circuit 64. Telemetry circuit 64 may correspond to any telemetry circuit known in the implantable medical device arts.


Programming device 20 also includes a memory 66. In some embodiments, memory 66, rather than memory 36 of IMD 12, may store programs 50 that are available to be selected by patient 14 for delivery of neurostimulation therapy. Memory 66 may also include program instructions that, when executed by processor 60, cause programming device 20 to perform the functions ascribed to programming device 20 herein. Memory 66 may include any volatile, non-volatile, fixed, removable, magnetic, optical, or electrical media, such as a RAM, ROM, CD-ROM, hard disk, removable magnetic disk, memory cards or sticks, NVRAM, EEPROM, flash memory, and the like.



FIG. 5 is a flow diagram illustrating an exemplary operation of IMD 12 according to a learning mode. Specifically, FIG. 5 illustrates an exemplary mode of IMD 12 to learn a therapy for an event that is indicated by patient 14. Processor 34 enters the learning mode in response to receiving a command from patient 14 (70). Patient 14 may direct processor 34 to enter the learning mode by pressing a key of keypad 24 of programming device 20.


When operating in the learning mode, processor 34 defines an event 52 by receiving an indication from patient 14 (72). Patient 14 may indicate the event by, for example, pressing a key of keypad 24 that patient 14 will thereafter use to identify the event to processor 34. The event 52 may be an activity and/or posture to be undertaken by patient 14, and the key may be used by patient 14 in the future to indicate to processor 34 that patient 14 is about to undertake the activity. Processor 34 may store information identifying the signal received via telemetry circuit 38 when patient 14 presses the key as the event 52 within memory 36.


Processor 34 then records therapy information, e.g., a learned therapy 54, while operating in the learning mode (74). As described above, the learned therapy 54 may be stimulation parameter values and/or one or more changes made to stimulation parameters by patient 14 over a period of time during operation within the learning mode. Processor 34 may store therapy information as a learned therapy at any time after receiving the command to enter the learning mode, e.g., before or after receiving an indication of the event from patient 14. Processor 34 stores the learned therapy 54 within memory 36, and associates the learned therapy 54 with the defined event 52 within memory 36 (76).


In exemplary embodiments, patient 14 adjusts stimulation parameters over a period of time after directing IMD 12 to enter the learning mode, e.g., during the event. For example, patient 14 may direct IMD 12 to enter the learning mode, so that IMD 12 learns appropriate adjustments to therapy to provide while patient 14 is running, and may adjust stimulation parameters while running to maintain effective and comfortable neurostimulation therapy. IMD 12 may store the stimulation parameters and/or changes to the stimulation parameters and associate times with the parameters or changes, so that stimulation according to the parameters and changes to the stimulation may be provided at appropriate times during a subsequent occurrence of patient 14 running.


In other embodiments, rather than IMD 12 recording therapy information over time, patient 14 may use programming device 20 to enter a learned therapy 54 that includes time as a parameter. For example, patient 14 may create a learned therapy 54 for the “running” event that includes increases to pulse amplitude and width at a particular time after the event is detected by IMD 12, and/or after N minutes that the event continues to be detected by IMD 12.



FIG. 6 is a flow diagram illustrating another exemplary operation of IMD 12 according to a learning mode. Specifically, FIG. 6 illustrates an exemplary mode of IMD 12 to learn a therapy for an event that is defined by IMD 12 based on the output of a sensor 40. Processor 34 enters the learning mode in response to receiving a command from patient 14 (80).


While operating in the learning mode, processor 34 records at least one of the output of sensor 40 or the information provided by monitor circuit 42 based on the sensor output (82). Processor 34 may record the sensor output or information over any length of time, may record multiple samples, and may make the recording or recordings at any time after entering the learning mode. Processor 34 may store the recording(s), or the result of an analysis, e.g., a feature, Fourier, or wavelet analysis, of the recording(s) in memory 36 as an event 52. Processor 34 records therapy information as a learned therapy 54 during operation in the learning mode (84), and associates the learned therapy 54 with the defined event 52 (86), as described above with reference to FIG. 5.



FIG. 7 is a flow diagram illustrating an exemplary operation of IMD 12 to provide automatic therapy adjustments according to the invention. Processor 34 monitors signals received from programming device 20 via telemetry circuit 38, and the output of sensor 40 and/or monitor circuit 42, to detect previously defined events 52 (90). To monitor the sensor output, processor 34 compares the current sensor output to the event 52. For example, processor 34 can compare the current sensor output to the sample sensor output recorded during operation in the learning mode, or the result of a signal analysis of the current sensor output to the result of a signal analysis of the sample sensor output recorded during operation in the learning mode. Processor 34 may use any of a variety of known pattern matching techniques or algorithms, such as fuzzy logic or neural network techniques or algorithms, to subsequently detect the previously defined events 52.
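
As a complement to correlating raw waveforms, the detection step could compare a feature signature of the current sensor output against the stored one. The tolerance check below is an assumed, simplified stand-in for the fuzzy-logic or neural-network matchers mentioned above; the tolerance value is an assumption.

```python
def signatures_match(current: dict, stored: dict, tolerance: float = 0.2) -> bool:
    """Return True if each numeric feature of 'current' is within a relative
    tolerance of the stored event signature (illustrative only)."""
    for key, stored_value in stored.items():
        if stored_value == 0:
            if abs(current.get(key, 0.0)) > tolerance:
                return False
        elif abs(current.get(key, 0.0) - stored_value) / abs(stored_value) > tolerance:
            return False
    return True

# Example: a current window close to the stored "running" signature is detected.
stored_running = {"zero_crossings": 16, "dominant_hz": 2.0, "peak_amplitude": 1.0}
current_window = {"zero_crossings": 15, "dominant_hz": 2.1, "peak_amplitude": 0.9}
print(signatures_match(current_window, stored_running))   # True
```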


If processor 34 detects a previously defined event 52 (92), processor 34 controls therapy delivery circuit 32 to deliver therapy according to the learned therapy 54 associated with the detected event 52 in memory 36 (94). Processor 34 may control circuit 32 to deliver therapy according to parameter values of the learned therapy 54. Processor 34 may also control circuit 32 to change the parameter values over time according to the learned therapy 54.
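
Delivering therapy according to the learned therapy 54 over time amounts to replaying the stored, time-stamped values once the event is detected. The scheduler below is a simplified, assumed sketch; real firmware would drive therapy delivery circuit 32 rather than return a dictionary.

```python
from typing import List, Tuple

def apply_learned_therapy(changes: List[Tuple[float, str, float]],
                          elapsed_s: float, parameters: dict) -> dict:
    """Return the parameter values in effect 'elapsed_s' seconds after event detection.

    'changes' holds (time offset, parameter name, value) tuples recorded during
    the learning mode; each value takes effect once its offset has passed.
    """
    current = dict(parameters)
    for offset_s, name, value in sorted(changes):
        if offset_s <= elapsed_s:
            current[name] = value
    return current

# Example: 45 s into the detected event, the 0 s and 30 s changes have been applied.
changes = [(0.0, "amplitude_v", 3.0), (30.0, "amplitude_v", 3.4), (90.0, "amplitude_v", 3.8)]
print(apply_learned_therapy(changes, elapsed_s=45.0, parameters={"amplitude_v": 2.5}))
# {'amplitude_v': 3.4}
```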


If processor 34 detects that patient 14 has made changes to stimulation parameters during provision of therapy according to the learned therapy 54 (96), processor 34 may query patient 14 via programming device 20 as to whether the changes should be saved as a modification to the learned therapy 54 (98). If patient 14 wishes to save the changes, processor 34 modifies the learned therapy 54 according to the changes (100).


As described above, an event 52 may be an activity or posture undertaken by patient 14. For example, an event 52 may be patient 14 running, and the learned therapy 54 may include changes to stimulation parameters occurring at associated times during the “running” event such that effective and comfortable therapy is maintained. Other activities and postures that may affect the symptoms experienced by patient 14, or the effectiveness and side effects of the stimulation, may include golfing, gardening, driving a car, sitting in a chair, twisting, or bending over. In some cases, the duration of a particular activity or posture may affect the symptoms experienced by patient 14, or the effectiveness and side effects of the stimulation. In such cases, an event 52 may be defined as occurring after patient 14 maintains an activity or posture for a defined duration.
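
Defining an event to occur only after an activity or posture has been maintained for a set duration can be realized with a dwell timer, as in the assumed sketch below; the duration and class name are illustrative only.

```python
class DurationGatedEvent:
    """Signals an event only after the underlying condition has persisted
    for 'required_s' seconds (illustrative dwell-timer sketch)."""

    def __init__(self, required_s: float):
        self.required_s = required_s
        self.accumulated_s = 0.0

    def update(self, condition_present: bool, dt_s: float) -> bool:
        if condition_present:
            self.accumulated_s += dt_s
        else:
            self.accumulated_s = 0.0          # reset if the activity/posture is broken
        return self.accumulated_s >= self.required_s

# Example: "sitting" must be held for 60 s before the event fires.
sitting_event = DurationGatedEvent(required_s=60.0)
fired = False
for second in range(90):
    fired = sitting_event.update(condition_present=True, dt_s=1.0)
print(fired)   # True after 60 s of continuous sitting
```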


In some cases, an activity or posture undertaken by patient 14 results in an uncomfortable increase in the intensity of the stimulation delivered by IMD 12. This phenomenon is referred to as a “jolt.” Activities and postures that may lead to “jolts” include sitting in a seat, twisting, bending over, rapid posture changes, or other like postures or transitions between postures. Patient 14 may use the learning mode provided by IMD 12 as described herein to cause IMD 12 to define events 52 associated with the activities or postures that lead to “jolts,” and associate such “jolt” events with therapy information 54 that causes IMD 12 to suspend or reduce the intensity of stimulation upon subsequent detection of the “jolt” events. Consequently, embodiments of IMD 12 may advantageously provide efficacious therapy during certain defined events 52, and avoid providing uncomfortable therapy during other defined events 52.
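
A “jolt” event could thus be associated with therapy information that simply suspends or attenuates stimulation. The sketch below illustrates that idea with assumed parameter names and an assumed reduction fraction.

```python
def handle_jolt_event(current_amplitude_v: float,
                      suspend: bool = True,
                      reduction_fraction: float = 0.5) -> float:
    """Return the stimulation amplitude to use when a 'jolt' event is detected.

    If 'suspend' is True, stimulation is turned off (amplitude 0); otherwise the
    amplitude is reduced by an assumed fraction. Purely illustrative.
    """
    if suspend:
        return 0.0
    return current_amplitude_v * (1.0 - reduction_fraction)

print(handle_jolt_event(3.5))                 # 0.0  -> therapy suspended
print(handle_jolt_event(3.5, suspend=False))  # 1.75 -> intensity reduced
```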



FIG. 8 is a timing diagram illustrating display of diagnostic information 56 including learned events 52 according to the invention. As described above, in some embodiments processor 34 collects diagnostic information for review by a clinician that may include defined events, events indicated by patient 14 outside of the learning mode, the output of sensor 40 and/or monitor circuit 42, stimulation parameter values and/or changes made thereto over time, or the like. Diagnostic information 56 may be retrieved from IMD 12 by a clinician programmer and presented to a clinician in a variety of forms, such as the illustrated timing diagram, or various graphs, such as bar graphs, illustrating the result of a statistical analysis of diagnostic information 56. A clinician may use diagnostic data 56 to, for example, objectively assess patient activity, therapy effectiveness, patient compliance, or the like.


In the illustrated timing diagram, a curve 110 representing the activity level of patient 14, e.g., the output of one or both of sensor 40 and monitor 42, over time is displayed. Markers 112A-E are used to indicate the occurrence of events, which may be defined events 52. A second curve 114 illustrates the symptom, e.g., pain, intensity indicated by patient 14 over time. Curve 114 may be estimated based on intensity values 116A-F periodically entered by patient 14 using programming device 20.


Various embodiments of the invention have been described. However, one skilled in the art will appreciate that various modifications may be made to the described embodiments without departing from the scope of the invention. For example, the invention is not limited to medical devices that deliver neurostimulation therapy or to implantable medical devices. Rather, systems that facilitate automatic therapy adjustment according to the invention may include one or more implantable or external medical devices, of any type, that deliver therapy to a patient. For example, in some embodiments, an implantable or external pump that delivers a therapeutic agent to a patient can provide automatic therapy adjustment according to the invention.


In some embodiments, a medical device that does not itself deliver therapy, such as a programming device, provides automatic therapy adjustment according to the invention. In such embodiments, the programming device may receive a command to enter a learning mode, an indication of an event, and therapy changes from the patient via a keypad, for example. The programming device may include a memory to store defined events and associated therapy information. When the user, e.g., the patient, again indicates occurrence of the event to the programming device via the keypad, the programming device controls a therapy delivery device to deliver therapy according to therapy information associated with the defined event.


The invention is not limited to embodiments wherein a programming device is a patient programmer. For example, in some embodiments, a programming device may be a clinician programmer used by a clinician to, for example, create the programs that control the delivery of therapy by a therapy delivery device. The clinician may use the clinician programmer, during a programming session for example, to cause the clinician programmer or the therapy delivery device to learn therapies for defined events as described herein.


In other embodiments, a system that facilitates automatic therapy adjustment does not include a programming device at all. Where a system includes an external medical device that provides therapy to a patient, for example, a user may interact with a user interface provided by the medical device and a programming device may therefore be unnecessary. A user may also interact with an implanted medical device using a magnetic activator, or by tapping over the implanted medical device, which may be detected via an accelerometer, as is known in the art. These and other embodiments are within the scope of the following claims.

Claims
  • 1. A method, comprising: defining an event via at least one processor; monitoring therapy delivered by a medical device during occurrence of the defined event, wherein monitoring comprises receiving, from a user, during a learning mode of the medical device, and during the occurrence of the defined event, therapy information indicative of changes made by the user to the therapy delivered by the medical device during the occurrence of the defined event; associating the therapy information indicative of the monitored therapy with the defined event; subsequently detecting, via a sensor, the defined event; and providing subsequent therapy to a patient via the medical device according to the therapy information in response to the detection of the defined event.
  • 2. The method of claim 1, wherein subsequently detecting the defined event comprises: monitoring an output of the sensor; and comparing the sensor output to the defined event.
  • 3. The method of claim 2, wherein the sensor output reflects at least one of motion and posture of the patient.
  • 4. The method of claim 1, wherein providing subsequent therapy to the patient comprises automatically making changes to the provided therapy over time based on the therapy information.
  • 5. The method of claim 4, wherein the therapy information comprises information indicative of multiple changes to therapy parameters over time, and wherein providing subsequent therapy to the patient comprises automatically making changes to the provided therapy over time based on the therapy information.
  • 6. The method of claim 5, wherein the therapy information comprises time information indicative of how the therapy parameters are to be changed over time, and wherein providing subsequent therapy to the patient comprises automatically making changes to the provided therapy over time based on the time information.
  • 7. The method of claim 1, wherein defining the event comprises sensing a signal reflecting a physiological parameter of a patient and wherein receiving, from the user and during the occurrence of the defined event, therapy information indicative of changes to therapy delivered by the medical device comprises receiving, from the user, therapy information indicative of changes made by the user to therapy delivered by the medical device while sensing the signal reflecting the physiological parameter of the patient.
  • 8. The method of claim 1, wherein receiving, from the user, during the learning mode of the medical device, and during the occurrence of the defined event, therapy information indicative of changes made by the user to therapy delivered by the medical device comprises recording a change to the monitored therapy during the occurrence of the defined event and wherein associating therapy information indicative of the monitored therapy comprises associating therapy information indicative of the recorded change to the monitored therapy with the defined event.
  • 9. The method of claim 8, wherein receiving, from the user, during the learning mode of the medical device, and during the occurrence of the event, therapy information indicative of changes made by the user to therapy delivered by the medical device comprises recording a time of the change to the monitored therapy and wherein associating therapy information indicative of the monitored therapy comprises associating therapy information indicative of the recorded time of the change to the monitored therapy with the defined event.
  • 10. The method of claim 9, wherein providing subsequent therapy to the patient comprises automatically changing the provided therapy according to the therapy information indicative of the recorded change and at a time determined by the therapy information indicative of the recorded time of the change.
  • 11. The method of claim 1, wherein receiving, from the user, during the learning mode of the medical device, and during the occurrence of the event, therapy information indicative of changes made by the user to therapy delivered to the patient comprises receiving multiple changes to the therapy over time during the occurrence of the defined event and wherein associating therapy information indicative of the monitored therapy comprises associating therapy information indicative of the multiple changes to the therapy over time with the defined event.
  • 12. The method of claim 1, wherein the medical device comprises an implantable medical device, and wherein receiving, from the user, during the learning mode of the medical device, and during the occurrence of the event, therapy information indicative of changes made by the user to therapy delivered by the medical device during the occurrence of the defined event comprises receiving user input to control the monitored therapy via a programming device.
  • 13. The method of claim 1, wherein associating therapy information indicative of the monitored therapy with the defined event comprises associating therapy information indicative of multiple therapy parameter values with the defined event.
  • 14. The method of claim 1, wherein defining the event via at least the one processor comprises defining the event during an initial occurrence of the event, and wherein receiving, from the user, during the learning mode of the medical device, and during the occurrence of the defined event, therapy information indicative of the changes made by the user to the therapy delivered by the medical device comprises receiving, from the user, during the learning mode of the medical device, and during the occurrence of the initial occurrence of the event, therapy information indicative of the changes to the therapy during the initial occurrence of the event.
  • 15. The method of claim 1, further comprising, subsequent to defining the event, receiving input from the user to indicate that the event is about to occur.
  • 16. The method of claim 15, wherein the event is a patient activity, and wherein receiving the input from the user indicates the patient is about to undertake the activity.
  • 17. The method of claim 1, wherein providing subsequent therapy to a patient via the medical device according to the therapy information comprises receiving input from the user to change the therapy information.
  • 18. The method of claim 1, wherein defining the event comprises defining the event to occur after the patient maintains an activity or posture for a predefined duration.
  • 19. A medical device, comprising: a therapy delivery module configured to deliver therapy to a patient; a sensor configured to generate an output; and a processor configured to: define an event; monitor delivery of the therapy delivered during occurrence of the defined event, wherein to monitor, the processor is configured to receive, from a user, during a learning mode of the medical device, and during occurrence of the defined event, therapy information indicative of changes made by the user to therapy delivered during the occurrence of the defined event; associate the therapy information indicative of the monitored therapy with the defined event; detect, via the output of the sensor, the defined event; and control delivery of subsequent therapy to the patient according to the therapy information in response to the detection of the defined event via the sensor.
  • 20. The medical device of claim 19, wherein the processor is further configured to detect the defined event by monitoring an output of the sensor and comparing the sensor output to the defined event.
  • 21. The medical device of claim 20, wherein the sensor is configured to provide an output that reflects at least one of motion and posture of the patient.
  • 22. The medical device of claim 19, wherein the processor is configured to control delivery of subsequent therapy to the patient by automatically making changes to the delivered therapy over time based on the therapy information in response to detecting the defined event.
  • 23. The medical device of claim 22, wherein the processor is configured to associate the therapy information with the defined event that comprises information indicative of multiple changes to therapy parameters over time, and wherein the processor is configured to control delivery of subsequent therapy to the patient by automatically making changes to the delivered therapy over time based on the therapy information.
  • 24. The medical device of claim 23, wherein the processor is configured to associate the therapy information with the defined event that comprises time information indicative of how the therapy parameters are to be changed over time, and wherein the processor is configured to control delivery of subsequent therapy to the patient by automatically making changes to the delivered therapy over time based on the time information.
  • 25. The medical device of claim 19, wherein the sensor is configured to provide an output that reflects a physiological parameter of the patient, and wherein the processor is configured to receive, from the user, during the learning mode of the medical device, and during the occurrence of the defined event, therapy information indicative of changes made by the user to therapy delivered by the medical device by receiving, from the user, during the learning mode of the medical device, and during the occurrence of the defined event, therapy information indicative of changes made by the user to therapy delivered by the medical device while monitoring the output of the sensor that reflects the physiological parameter of the patient.
  • 26. The medical device of claim 19, wherein the processor is configured to receive, from the user, during the learning mode of the medical device, and during the occurrence of the event, therapy information by recording a change to the monitored therapy while sensing the signal and to associate therapy information indicative of the recorded change to the monitored therapy with the defined event.
  • 27. The medical device of claim 26, wherein the processor is configured to receive, from the user, during the learning mode of the medical device, and during the occurrence of the event, therapy information by recording a time of the change to the monitored therapy and to associate therapy information indicative of the recorded time of the change to the monitored therapy with the defined event.
  • 28. The medical device of claim 27, wherein the processor is configured to control delivery of subsequent therapy to the patient by automatically changing the delivered therapy according to the therapy information indicative of the recorded change and at a time determined by the therapy information indicative of the recorded time of the change.
  • 29. The medical device of claim 19, wherein the processor is configured to receive, from the user, during the learning mode of the medical device, and during the occurrence of the event, therapy information by receiving multiple changes to the therapy over time while sensing the signal and to associate therapy information indicative of the multiple changes to the therapy over time with the defined event.
  • 30. The medical device of claim 19, wherein the medical device is an implantable medical device, and further comprising a telemetry module configured to receive user input to control the delivery of subsequent therapy during occurrence of the defined event.
  • 31. The medical device of claim 19, wherein the processor is configured to associate therapy information indicative of multiple therapy parameter values with the defined event.
  • 32. The medical device of claim 19, wherein the processor is configured to receive input from the user subsequent to defining of the event to indicate that the event is about to occur.
  • 33. The medical device of claim 32, wherein the event is a patient activity, and wherein receiving the input from the user indicates the patient is about to undertake the activity.
  • 34. The medical device of claim 19, wherein the processor is configured to receive input from the user while controlling delivery of subsequent therapy to the patient, the input being provided to change the therapy information.
  • 35. The medical device of claim 19, wherein the processor is configured to define the event to occur after the patient maintains an activity or posture for a predefined duration.
  • 36. A medical system, comprising: a medical device, comprising: a therapy delivery module configured to deliver therapy to a patient; and a sensor configured to generate an output; and one or more processors configured to: define an event; monitor delivery of the therapy delivered by the medical device during occurrence of the defined event, wherein to monitor, the one or more processors are configured to receive, from a user, during a learning mode of the medical device, and during the occurrence of the defined event, therapy information indicative of changes made by the user to the therapy delivered during the occurrence of the defined event; associate the therapy information indicative of the monitored therapy with the defined event; detect, via the output of the sensor, the defined event; and control delivery of subsequent therapy to the patient according to the therapy information in response to the detection of the defined event via the sensor.
  • 37. The medical system of claim 36, wherein the one or more processors comprises a processor external to the patient.
  • 38. The medical system of claim 36, wherein the one or more processors comprises a processor implanted in the patient.
  • 39. The medical system of claim 36, further comprising a programmer configured to communicate telemetrically with at least one of the one or more processors.
  • 40. A non-transitory computer readable storage medium storing instructions to cause a programmable processor of a medical device to: define an event; monitor therapy delivered by a medical device during occurrence of the defined event, wherein the instructions that cause the programmable processor to monitor comprise instructions that cause the programmable processor to receive, from a user, during a learning mode of the medical device, and during the occurrence of the defined event, therapy information indicative of changes made by the user to therapy delivered by the medical device during the occurrence of the defined event; associate the therapy information indicative of the monitored therapy with the defined event; subsequently detect, via a sensor, the defined event; and provide subsequent therapy to a patient via the medical device according to the therapy information in response to the detection of the defined event.
  • 41. A system, comprising: means for defining an event; means for monitoring therapy delivered by a medical device during occurrence of the defined event, wherein the means for monitoring comprises means for receiving, from a user, during a learning mode of the medical device, and during the occurrence of the defined event, therapy information indicative of changes made by the user to therapy delivered by the medical device during the occurrence of the defined event; means for associating the therapy information indicative of the monitored therapy with the defined event; means for subsequently detecting the defined event; and means for providing subsequent therapy to a patient according to the therapy information in response to the detection of the defined event.
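
The claims above recite a two-phase behavior: a learning mode in which the device records the posture or activity sensed during an event together with the therapy adjustments the user makes, and a detection mode in which the stored therapy information is delivered once that event is detected, for example after the posture has been maintained for a predefined duration (claims 35 and 36). The following sketch illustrates one way such a controller could be organized. It is a simplified, hypothetical illustration only; the class and method names (TherapyParameters, AutoAdjustController, on_sensor_sample, and so on), the 0.5-second sampling period, and the posture-label sensor interface are assumptions made for this example and are not taken from the patented device or any particular product.

    from dataclasses import dataclass


    @dataclass
    class TherapyParameters:
        # Example neurostimulation parameters a user might adjust during an event.
        amplitude_ma: float = 2.0
        pulse_width_us: int = 210
        rate_hz: int = 60


    @dataclass
    class DefinedEvent:
        # An event learned during the learning mode: a posture label, how long it
        # must persist before therapy changes, and the associated therapy settings.
        posture_label: str
        hold_duration_s: float
        therapy: TherapyParameters


    class AutoAdjustController:
        # Associates user therapy changes with a sensed posture during learning,
        # then replays them once that posture persists for the hold duration.

        SAMPLE_PERIOD_S = 0.5  # assumed accelerometer/posture sampling period

        def __init__(self, therapy_module):
            self.therapy_module = therapy_module  # object with an apply(params) method
            self.events = {}                      # posture label -> DefinedEvent
            self._current_label = None
            self._held_for_s = 0.0

        def learn(self, posture_label, hold_duration_s, user_adjusted_params):
            # Learning mode: store the user's therapy adjustments keyed by the
            # posture sensed while the event occurred.
            self.events[posture_label] = DefinedEvent(
                posture_label, hold_duration_s, user_adjusted_params)

        def on_sensor_sample(self, posture_label):
            # Detection mode: called once per sensor sample. When a learned
            # posture has been maintained for its hold duration, deliver the
            # therapy information associated with that event.
            if posture_label == self._current_label:
                self._held_for_s += self.SAMPLE_PERIOD_S
            else:
                self._current_label = posture_label
                self._held_for_s = 0.0
            event = self.events.get(posture_label)
            if event is not None and self._held_for_s >= event.hold_duration_s:
                self.therapy_module.apply(event.therapy)
                return event
            return None


    # Hypothetical usage: learn an adjusted amplitude for "sitting", then feed
    # simulated sensor samples until the 30-second hold condition is met.
    class PrintTherapyModule:
        def apply(self, params):
            print("Delivering therapy:", params)


    controller = AutoAdjustController(PrintTherapyModule())
    controller.learn("sitting", hold_duration_s=30.0,
                     user_adjusted_params=TherapyParameters(amplitude_ma=3.5))
    for _ in range(61):  # about 30.5 seconds of consecutive "sitting" samples
        controller.on_sensor_sample("sitting")

In this sketch the consecutive-sample counter stands in for the predefined-duration condition of claim 35, while the learn/on_sensor_sample pair corresponds to the learning-mode association and subsequent sensor-based detection recited in claims 36, 40, and 41.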
Parent Case Info

This application is a continuation of, and claims priority to, U.S. application Ser. No. 10/691,917, filed Oct. 23, 2003, issued as U.S. Pat. No. 8,396,565 B2 on Mar. 12, 2013, which claims priority to U.S. Provisional Application Ser. No. 60/503,218, filed Sep. 15, 2003, the entire contents of both of which are incorporated herein by reference.

US Referenced Citations (389)
Number Name Date Kind
4297685 Brainard, II Oct 1981 A
4365633 Loughman Dec 1982 A
4543955 Schroeppel Oct 1985 A
4550736 Broughton et al. Nov 1985 A
4566456 Koning et al. Jan 1986 A
4771780 Sholder Sep 1988 A
4776345 Cohen et al. Oct 1988 A
4846180 Buffet Jul 1989 A
4846195 Alt Jul 1989 A
5031618 Mullett Jul 1991 A
5040534 Mann et al. Aug 1991 A
5040536 Riff Aug 1991 A
5058584 Bourgeois Oct 1991 A
5125412 Thornton Jun 1992 A
5154180 Blanchet et al. Oct 1992 A
5158078 Bennett et al. Oct 1992 A
5167229 Peckham et al. Dec 1992 A
5233984 Thompson Aug 1993 A
5275159 Griebel Jan 1994 A
5312446 Holschbach et al. May 1994 A
5335657 Terry, Jr. et al. Aug 1994 A
5337758 Moore et al. Aug 1994 A
5342409 Mullett Aug 1994 A
5354317 Alt Oct 1994 A
5425750 Moberg Jun 1995 A
5476483 Bornzin et al. Dec 1995 A
5487755 Snell et al. Jan 1996 A
5513645 Jacobson et al. May 1996 A
5514162 Bornzin et al. May 1996 A
5558640 Pfeiler et al. Sep 1996 A
5562707 Prochazka et al. Oct 1996 A
5593431 Sheldon Jan 1997 A
5622428 Bonnet Apr 1997 A
5628317 Starkebaum et al. May 1997 A
5643332 Stein Jul 1997 A
5645053 Remmers et al. Jul 1997 A
5674258 Henschel et al. Oct 1997 A
5711316 Elsberry et al. Jan 1998 A
5716377 Rise et al. Feb 1998 A
5720770 Nappholz et al. Feb 1998 A
5732696 Rapoport et al. Mar 1998 A
5741310 Wittkampf Apr 1998 A
5782884 Stotts et al. Jul 1998 A
5814093 Stein Sep 1998 A
5832932 Elsberry et al. Nov 1998 A
5833709 Rise et al. Nov 1998 A
5836989 Shelton Nov 1998 A
5851193 Arikka et al. Dec 1998 A
5865760 Lidman et al. Feb 1999 A
5885471 Ruben et al. Mar 1999 A
5893883 Torgerson et al. Apr 1999 A
5895371 Levitas et al. Apr 1999 A
5904708 Goedeke May 1999 A
5911738 Sikorski et al. Jun 1999 A
5913727 Ahdoot Jun 1999 A
5919149 Allum Jul 1999 A
5938690 Law et al. Aug 1999 A
5941906 Barreras, Sr. et al. Aug 1999 A
5944680 Christopherson et al. Aug 1999 A
5957957 Sheldon Sep 1999 A
6027456 Feler et al. Feb 2000 A
6044297 Sheldon et al. Mar 2000 A
6083475 Sikorski et al. Mar 2000 A
6045513 Stone et al. Apr 2000 A
6059576 Brann May 2000 A
6081750 Hoffberg et al. Jun 2000 A
6095991 Krausman et al. Aug 2000 A
6099479 Christopherson et al. Aug 2000 A
6102874 Stone et al. Aug 2000 A
6120467 Schallhorn Sep 2000 A
6128534 Park et al. Oct 2000 A
6134459 Roberts et al. Oct 2000 A
6157857 Dimpfel Dec 2000 A
6165143 Van Lummel Dec 2000 A
6216537 Henschel et al. Apr 2001 B1
6259948 Florio et al. Jul 2001 B1
6280409 Stone et al. Aug 2001 B1
6296606 Goldberg et al. Oct 2001 B1
6308098 Meyer Oct 2001 B1
6308099 Fox et al. Oct 2001 B1
6315740 Singh Nov 2001 B1
6327501 Levine et al. Dec 2001 B1
6341236 Osorio et al. Jan 2002 B1
6351672 Park et al. Feb 2002 B1
6368284 Bardy Apr 2002 B1
6381496 Meadows et al. Apr 2002 B1
6393325 Mann et al. May 2002 B1
6438408 Mulligan et al. Aug 2002 B1
6440090 Schallhorn Aug 2002 B1
6449508 Sheldon et al. Sep 2002 B1
6459934 Kadhiresan Oct 2002 B1
6466821 Pianca et al. Oct 2002 B1
6468234 Van der Loos et al. Oct 2002 B1
6507757 Swain et al. Jan 2003 B1
6514218 Yamamoto Feb 2003 B2
6516749 Salasidis Feb 2003 B1
6539249 Kadhiresan et al. Mar 2003 B1
6547755 Lippe et al. Apr 2003 B1
6572557 Tchou et al. Jun 2003 B2
6574507 Bonnet Jun 2003 B1
6605038 Teller et al. Aug 2003 B1
6609031 Law et al. Aug 2003 B1
6611783 Kelly, Jr. et al. Aug 2003 B2
6620151 Blischak et al. Sep 2003 B2
6625493 Kroll et al. Sep 2003 B2
6635048 Ullestad et al. Oct 2003 B1
6641542 Cho et al. Nov 2003 B2
6658292 Kroll et al. Dec 2003 B2
6659968 McClure Dec 2003 B1
6662047 Sorensen Dec 2003 B2
6665558 Kalgren et al. Dec 2003 B2
6668188 Sun et al. Dec 2003 B2
6687538 Hrdlicka et al. Feb 2004 B1
6731984 Cho et al. May 2004 B2
6740075 Lebel et al. May 2004 B2
6748276 Daignault, Jr. et al. Jun 2004 B1
6752766 Kowallik et al. Jun 2004 B2
6773404 Poezevera et al. Aug 2004 B2
6782315 Lu et al. Aug 2004 B2
6817979 Nihtilä Nov 2004 B2
6820025 Bachmann et al. Nov 2004 B2
6829507 Lidman et al. Dec 2004 B1
6832113 Belalcazar Dec 2004 B2
6834436 Townsend Dec 2004 B2
6853863 Carter et al. Feb 2005 B2
6878121 Krausman et al. Apr 2005 B2
6884596 Civelli et al. Apr 2005 B2
6890306 Poezevera May 2005 B2
6895341 Barrey et al. May 2005 B2
6922587 Weinberg Jul 2005 B2
6923784 Stein Aug 2005 B2
6928324 Park et al. Aug 2005 B2
6937899 Sheldon et al. Aug 2005 B2
6937900 Pianca et al. Aug 2005 B1
6945934 Bardy Sep 2005 B2
6964641 Cho et al. Nov 2005 B2
6975904 Sloman Dec 2005 B1
6997882 Parker et al. Feb 2006 B1
6999817 Park et al. Feb 2006 B2
7016730 Ternes Mar 2006 B2
7031772 Condie Apr 2006 B2
7043305 KenKnight et al. May 2006 B2
7054687 Andersen May 2006 B1
7066910 Bauhahn et al. Jun 2006 B2
7082333 Bauhahn Jul 2006 B1
7092759 Nehls et al. Aug 2006 B2
7095424 Satoh et al. Aug 2006 B2
7110820 Tcheng et al. Sep 2006 B2
7117036 Florio Oct 2006 B2
7123967 Weinberg Oct 2006 B2
7130681 Gebhardt et al. Oct 2006 B2
7130689 Turcott Oct 2006 B1
7141026 Aminian et al. Nov 2006 B2
7142921 Mattes et al. Nov 2006 B2
7149579 Koh et al. Dec 2006 B1
7149584 Koh et al. Dec 2006 B1
7151961 Whitehurst et al. Dec 2006 B1
7155279 Whitehurst et al. Dec 2006 B2
7160252 Cho et al. Jan 2007 B2
7162304 Bradley Jan 2007 B1
7167743 Heruth et al. Jan 2007 B2
7167751 Whitehurst et al. Jan 2007 B1
7181281 Kroll Feb 2007 B1
7189204 Ni et al. Mar 2007 B2
7207947 Koh et al. Apr 2007 B2
7210240 Townsend et al. May 2007 B2
7212862 Park et al. May 2007 B2
7214197 Prass May 2007 B2
7218964 Hill et al. May 2007 B2
7218968 Condie et al. May 2007 B2
7221979 Zhou et al. May 2007 B2
7231254 DiLorenzo Jun 2007 B2
7242983 Frei et al. Jul 2007 B2
7252640 Ni et al. Aug 2007 B2
7266412 Stypulkowski Sep 2007 B2
7308311 Sorensen et al. Dec 2007 B2
7313440 Miesel Dec 2007 B2
7317948 King et al. Jan 2008 B1
7330760 Heruth et al. Feb 2008 B2
7366569 Belalcazar Apr 2008 B2
7366572 Heruth et al. Apr 2008 B2
7387610 Stahmann Jun 2008 B2
7389147 Wahlstrand et al. Jun 2008 B2
7395113 Heruth Jul 2008 B2
7403820 DiLorenzo Jul 2008 B2
7406351 Wesselink Jul 2008 B2
7415308 Gerber et al. Aug 2008 B2
7447545 Heruth et al. Nov 2008 B2
7471290 Wang et al. Dec 2008 B2
7471980 Koshiol Dec 2008 B2
7489970 Lee et al. Feb 2009 B2
7491181 Heruth et al. Feb 2009 B2
7505815 Lee et al. Mar 2009 B2
7519431 Goetz et al. Apr 2009 B2
7542803 Heruth et al. Jun 2009 B2
7548786 Lee et al. Jun 2009 B2
7559901 Maile Jul 2009 B2
7572225 Stahmann Aug 2009 B2
7577479 Hartley et al. Aug 2009 B2
7580752 Gerber et al. Aug 2009 B2
7584808 Dolgin et al. Sep 2009 B2
7590453 Heruth Sep 2009 B2
7590455 Heruth et al. Sep 2009 B2
7590481 Lu et al. Sep 2009 B2
7591265 Lee et al. Sep 2009 B2
7603170 Hatlestad et al. Oct 2009 B2
7623919 Goetz et al. Nov 2009 B2
7634379 Noble Dec 2009 B2
7664546 Hartley et al. Feb 2010 B2
7672806 Tronconi Mar 2010 B2
7717848 Heruth et al. May 2010 B2
7769464 Gerber et al. Aug 2010 B2
7792583 Miesel et al. Sep 2010 B2
7853322 Bourget et al. Dec 2010 B2
7957797 Bourget et al. Jun 2011 B2
7957809 Bourget et al. Jun 2011 B2
8396565 Singhal et al. Mar 2013 B2
20020038137 Stein Mar 2002 A1
20020091308 Kipshidze et al. Jul 2002 A1
20020107553 Hill et al. Aug 2002 A1
20020115939 Mulligan et al. Aug 2002 A1
20020165586 Hill et al. Nov 2002 A1
20020169485 Pless Nov 2002 A1
20020170193 Townsend et al. Nov 2002 A1
20030004423 Lavie et al. Jan 2003 A1
20030036783 Bauhahn et al. Feb 2003 A1
20030045910 Sorensen et al. Mar 2003 A1
20030065370 Lebel et al. Apr 2003 A1
20030088185 Prass May 2003 A1
20030149457 Tcheng et al. Aug 2003 A1
20030171791 KenKnight et al. Sep 2003 A1
20030181960 Carter et al. Sep 2003 A1
20030204211 Condie et al. Oct 2003 A1
20040015103 Aminian et al. Jan 2004 A1
20040049132 Barron et al. Mar 2004 A1
20040088020 Condie et al. May 2004 A1
20040102814 Sorensen et al. May 2004 A1
20040133248 Frei et al. Jul 2004 A1
20040138716 Kon et al. Jul 2004 A1
20040147975 Popovic et al. Jul 2004 A1
20040199215 Lee et al. Oct 2004 A1
20040199216 Lee et al. Oct 2004 A1
20040199217 Lee et al. Oct 2004 A1
20040199218 Lee et al. Oct 2004 A1
20040215286 Stypulkowski Oct 2004 A1
20040220621 Zhou et al. Nov 2004 A1
20040225332 Gebhardt et al. Nov 2004 A1
20040257693 Ehrlich Dec 2004 A1
20050042589 Hatlestad et al. Feb 2005 A1
20050043767 Belalcazar Feb 2005 A1
20050043772 Stahmann Feb 2005 A1
20050060001 Singhal et al. Mar 2005 A1
20050061320 Lee et al. Mar 2005 A1
20050113710 Stahmann et al. May 2005 A1
20050113887 Bauhahn May 2005 A1
20050126026 Townsend et al. Jun 2005 A1
20050137627 Koshiol et al. Jun 2005 A1
20050145246 Hartley et al. Jul 2005 A1
20050172311 Hjelt et al. Aug 2005 A1
20050177192 Rezai et al. Aug 2005 A1
20050209511 Heruth et al. Sep 2005 A1
20050209512 Heruth et al. Sep 2005 A1
20050209513 Heruth et al. Sep 2005 A1
20050209643 Heruth et al. Sep 2005 A1
20050209644 Heruth et al. Sep 2005 A1
20050209645 Heruth et al. Sep 2005 A1
20050215847 Heruth et al. Sep 2005 A1
20050215947 Heruth et al. Sep 2005 A1
20050216064 Heruth et al. Sep 2005 A1
20050222522 Heruth et al. Oct 2005 A1
20050222638 Foley et al. Oct 2005 A1
20050228455 Kramer et al. Oct 2005 A1
20050234514 Heruth et al. Oct 2005 A1
20050234518 Heruth et al. Oct 2005 A1
20050240242 DiLorenzo Oct 2005 A1
20050245988 Miesel Nov 2005 A1
20050283210 Blischak et al. Dec 2005 A1
20060190049 Gerber et al. Aug 2006 A1
20060190050 Gerber et al. Aug 2006 A1
20060190051 Gerber et al. Aug 2006 A1
20060195051 Schnapp et al. Aug 2006 A1
20060206167 Flaherty et al. Sep 2006 A1
20060212080 Hartley et al. Sep 2006 A1
20060213267 Tronconi et al. Sep 2006 A1
20060235289 Wesselink et al. Oct 2006 A1
20060235472 Goetz et al. Oct 2006 A1
20060241513 Hatlestad et al. Oct 2006 A1
20060247732 Wesselink Nov 2006 A1
20060247739 Wahlstrand et al. Nov 2006 A1
20060259099 Goetz et al. Nov 2006 A1
20060262120 Rosenberg Nov 2006 A1
20060265025 Goetz et al. Nov 2006 A1
20060287686 Cullen et al. Dec 2006 A1
20070015976 Miesel et al. Jan 2007 A1
20070038265 Tcheng et al. Feb 2007 A1
20070050715 Behar Mar 2007 A1
20070073355 DiLorenzo et al. Mar 2007 A1
20070115277 Wang et al. May 2007 A1
20070118056 Wang et al. May 2007 A1
20070123758 Miesel et al. May 2007 A1
20070129622 Bourget et al. Jun 2007 A1
20070129641 Sweeney Jun 2007 A1
20070129769 Bourget et al. Jun 2007 A1
20070129774 Bourget et al. Jun 2007 A1
20070150026 Bourget et al. Jun 2007 A1
20070150029 Bourget et al. Jun 2007 A1
20070213789 Nolan et al. Sep 2007 A1
20070233201 Lovett et al. Oct 2007 A1
20070249968 Miesel et al. Oct 2007 A1
20070250121 Miesel et al. Oct 2007 A1
20070250134 Miesel et al. Oct 2007 A1
20070255118 Miesel et al. Nov 2007 A1
20070255154 Lu et al. Nov 2007 A1
20070265664 Gerber et al. Nov 2007 A1
20070265681 Gerber et al. Nov 2007 A1
20070276439 Miesel et al. Nov 2007 A1
20070293737 Heruth et al. Dec 2007 A1
20070293917 Thompson Dec 2007 A1
20080071150 Miesel et al. Mar 2008 A1
20080071324 Miesel et al. Mar 2008 A1
20080071326 Heruth et al. Mar 2008 A1
20080071327 Miesel et al. Mar 2008 A1
20080079444 Denison Apr 2008 A1
20080081958 Denison et al. Apr 2008 A1
20080114219 Zhang et al. May 2008 A1
20080164979 Otto Jul 2008 A1
20080177355 Miesel et al. Jul 2008 A1
20080188901 Sanghera et al. Aug 2008 A1
20080188909 Bradley Aug 2008 A1
20080194998 Holmstrom et al. Aug 2008 A1
20080204255 Flexer et al. Aug 2008 A1
20080269812 Gerber et al. Oct 2008 A1
20080269843 Gerber Oct 2008 A1
20080281376 Gerber et al. Nov 2008 A1
20080281379 Wesselink Nov 2008 A1
20080281381 Gerber et al. Nov 2008 A1
20080288200 Noble Nov 2008 A1
20080300449 Gerber et al. Dec 2008 A1
20080300470 Gerber et al. Dec 2008 A1
20090030263 Heruth et al. Jan 2009 A1
20090036951 Heruth et al. Feb 2009 A1
20090046056 Rosenberg et al. Feb 2009 A1
20090076343 Kristofer et al. Mar 2009 A1
20090082829 Panken et al. Mar 2009 A1
20090099627 Molnar et al. Apr 2009 A1
20090105785 Wei et al. Apr 2009 A1
20090118599 Heruth et al. May 2009 A1
20090228841 Hildreth Sep 2009 A1
20090233770 Vincent et al. Sep 2009 A1
20090259216 Drew et al. Oct 2009 A1
20090264789 Molnar et al. Oct 2009 A1
20090306740 Heruth et al. Dec 2009 A1
20100010380 Panken et al. Jan 2010 A1
20100010381 Skelton et al. Jan 2010 A1
20100010382 Panken et al. Jan 2010 A1
20100010383 Skelton et al. Jan 2010 A1
20100010384 Panken et al. Jan 2010 A1
20100010385 Skelton et al. Jan 2010 A1
20100010386 Skelton et al. Jan 2010 A1
20100010387 Skelton et al. Jan 2010 A1
20100010388 Panken et al. Jan 2010 A1
20100010389 Davis et al. Jan 2010 A1
20100010390 Skelton et al. Jan 2010 A1
20100010391 Skelton et al. Jan 2010 A1
20100010392 Skelton et al. Jan 2010 A1
20100010432 Skelton et al. Jan 2010 A1
20100010571 Skelton et al. Jan 2010 A1
20100010572 Skelton et al. Jan 2010 A1
20100010573 Skelton et al. Jan 2010 A1
20100010574 Skelton et al. Jan 2010 A1
20100010575 Skelton et al. Jan 2010 A1
20100010576 Skelton et al. Jan 2010 A1
20100010577 Skelton et al. Jan 2010 A1
20100010578 Skelton et al. Jan 2010 A1
20100010579 Skelton et al. Jan 2010 A1
20100010580 Skelton et al. Jan 2010 A1
20100010583 Panken et al. Jan 2010 A1
20100010584 Skelton et al. Jan 2010 A1
20100010585 Davis et al. Jan 2010 A1
20100010586 Skelton et al. Jan 2010 A1
20100010587 Skelton et al. Jan 2010 A1
20100010588 Skelton et al. Jan 2010 A1
20100010589 Skelton et al. Jan 2010 A1
20100030286 Goetz et al. Feb 2010 A1
20100121415 Skelton et al. May 2010 A1
20100174155 Heruth et al. Jul 2010 A1
20110082522 Bourget et al. Apr 2011 A1
20110238130 Bourget et al. Sep 2011 A1
20110238136 Bourget et al. Sep 2011 A1
Foreign Referenced Citations (44)
Number Date Country
19831109 Jan 2000 DE
10024103 Nov 2001 DE
0564803 Oct 1993 EP
0845240 Jun 1998 EP
0849715 Jun 1998 EP
0613390 Oct 2000 EP
1195139 Apr 2002 EP
1291036 Mar 2003 EP
1308182 May 2003 EP
1391846 Feb 2004 EP
1437159 Jul 2004 EP
1731088 Dec 2006 EP
1870128 Dec 2007 EP
1938862 Jul 2008 EP
2330912 May 1999 GB
2408342 May 2005 GB
2447647 Sep 2008 GB
9405371 Mar 1994 WO
9629007 Sep 1996 WO
9704705 Feb 1997 WO
9749455 Dec 1997 WO
9800197 Jan 1998 WO
9956820 Nov 1999 WO
0137930 May 2001 WO
0228282 Apr 2002 WO
0241771 May 2002 WO
0287433 Nov 2002 WO
0296512 Dec 2002 WO
02100267 Dec 2002 WO
0351356 Jun 2003 WO
0365891 Aug 2003 WO
0528029 Mar 2005 WO
0535050 Apr 2005 WO
0579487 Sep 2005 WO
0589646 Sep 2005 WO
0589647 Sep 2005 WO
0589860 Sep 2005 WO
05102499 Nov 2005 WO
05120348 Dec 2005 WO
0709088 Jan 2007 WO
0751196 May 2007 WO
0764682 Jun 2007 WO
0764936 Jun 2007 WO
0826970 Mar 2008 WO
Non-Patent Literature Citations (78)
Entry
Office Action from U.S. Appl. No. 12/966,827 dated Dec. 4, 2013, 10 pp.
Office Action from U.S. Appl. No. 13/154,303 dated Dec. 5, 2013, 7 pp.
“Analysis of heart rate dynamics by methods derived from non-linear mathematics: Clinical applicability and prognostic significance,” http://herkules.oulu.fi.isbn9514250133/html, 4 pp., 2004.
“Design Competition: Runners-Up for the Best Three Designs,” EPN, vol. 26, No. 1, 1 pg., 2002.
“IBM and Citizen Watch develop Linux-Based WatchPad,” http:/wwwlinuxdevices.com.news/NS6580187845.html, 5 pp., 2006.
“MiniMitter® Physiological and Behavioral Monitoring for Humans and Animals,” http://www.minimitter.com/Products/Actiwatch, 3 pp., 2006.
“Watch,” Wikipedia, 6 pp., http://en.wikipedia.org/wiki/Watch, 2006.
Aminian et al., “Physical Activity Monitoring Based on Accelerometry: Validation and Comparison with Video Observation,” Medical & Biological Engineering and Computing, vol. 37, No. 2, pp. 304-308, 1999.
Amzica, “Physiology of Sleep and Wakefulness as it Relates to the Physiology of Epilepsy,” Journal of Clinical Neurophysiology, American Clinical Neurophysiology Society, 19(6), pp. 488-503, 2002.
Ang et al., “Physical model of a MEMS accelerometer for low-g motion tracking applications,” 2004 IEEE International Conference on Robotics and Automation, vol. 2, pp. 1345-1351, 2004.
Buchser et al., “Improved Physical Activity in Patients Treated for Chronic Pain by Spinal Cord Stimulation,” Neuromodulation, vol. 8, Issue 1, pp. 40-48, Mar. 2005.
Crago et al., “An Elbow Extension Neuroprosthesis for Individuals with Tetraplegia,” IEEE Transactions on Rehabilitation Engineering, vol. 6, No. 1, pp. 1-6, Mar. 1998.
Dejnabadi et al., “Estimation and Visualization of Sagittal Kinematics of Lower Limbs Orientation Using Body-Fixed Sensors,” IEEE Transactions on Biomedical Engineering, vol. 53, No. 7, pp. 1385-1393, Jul. 2006.
Dinner, “Effect of Sleep on Epilepsy,” Journal of Clinical Neurophysiology, American Clinical Neurophysiology Society, 19(6), pp. 504-513, 2002.
Foerster et al., “Motion Pattern and Posture: Correctly Assessed by Calibrated Accelerometers,” Forschungsgrupe Psychophysiologie, Universität Freiburg, Germany, Mar. 2000, 28 pp.
Foldvary-Schaefer, “Sleep Complaints and Epilepsy: The Role of Seizures, Antiepileptic Drugs and Sleep Disorders,” Journal of Clinical Neurophysiology, American Clinical Neurophysiology Society, 19(6), pp. 514-521, 2002.
Fourcade et al., “Modeling Phase Transitions in Human Posture,” Studies in Perception and Action VII, Sheena Rogers & Judith Effken (eds), Lawrence Erlbaum Associated, Inc., pp. 99-103, 2003.
Giansanti et al., “The development and test of a device for the reconstruction of 3-D position and orientation by means of a kinematic sensor assembly with rate gyroscopes and accelerometers,” IEEE Transactions on Biomedical Engineering, v. 52, No. 7, pp. 1271-1277, Jul. 2005.
Goodrich et al., “The Prediction of Pain Using Measures of Sleep Quality,” Pain Digest, 8:23-25, 1998.
Heinz et al., “Using Wearable Sensors for Real-time Recognition Tasks in Games of Martial Arts—An Initial Experiment,” Institute for Computer Systems and Networks (CSN), UMIT—University of Health Systems, Medical Informatics and Technology Hall in Tyrol, Austria, 2006, 5 pp., http://eis.comp.lancs.ac.uk/fileadmin/relate/publication/2006-WearableSensors.pdf.
Hendelman et al., “Validity of Accelerometry for the Assessment of Moderate Intensity Physical Activity in the Field,” Medicine & Science in Sports & Exercise, pp. S442-S449, 2000.
Hinckley, K., Pierce, J., Sinclair, M., Horvitz, E., Sensing Techniques for Mobile Interaction, ACM UIST 2000 Symposium on User Interface Software & Technology, CHI Letters 2 (2), pp. 91-100.
Husak, “Model of Tilt Sensor Systems,” ICECS 2002, 9th IEEE International Conference on Electronics, Circuits and Systems, vol. 1, pp. 227-230, 2002.
Karantonis et al., “Implementation of a Real-Time Human Movement Classifier Using a Triaxial Accelerometer for Ambulatory Monitoring,” IEEE Transactions on Information Technology in Biomedicine, vol. 10, No. 1, pp. 156-167, Jan. 2006.
Kassam, “2005 EDP Topic “MK4”: Tremor Data-Logger for Parkinson's Disease Patients,” http://www.ee.ryerson.ca/˜courses/edp2005/MK4.html, 3 pp., 2005.
Kerr et al., “Analysis of the sit-stand-sit movement cycle in normal subjects,” Clinical Biomechanics, vol. 12, No. 4, pp. 236-245, 1977.
Kiani et al., “Computerized Analysis of Daily Life Motor Activity for Ambulatory Monitoring,” Technology and Health Care 5, pp. 307-318, 1997.
Kitchin et al., “Compensating for the 0 g Offset Drift of the ADXL50 Accelerometer,” Analog Devices Application Note AN-380, 2 pp.
Lau, “Strategies for Generating Prolonged Functional Standing Using Intramuscular Stimulation or Intraspinal Microstimulation,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 15 No. 2, pp. 273-285, Jun. 2007.
Leiper et al., “Sensory Feedback for Head Control in Cerebral Palsy,” Physical Therapy, vol. 61, No. 4, pp. 512-518, Apr. 1981.
Lorussi, “Wearable, Redundant Fabric-Based Sensor Arrays for Reconstruction of Body Segment Posture,” IEEE Sensors Journal, vol. 4, No. 6, pp. 808-817, Dec. 2004.
Mathie et al., “A Pilot Study of Long-Term Monitoring of Human Movements in the Home Using Accelerometer,” Journal of Telemedicine and Telecare 10:144-151, Jun. 2007.
Mathie et al., “Determining Activity Using a Triaxial Accelerometer,” Proceedings of the Second Joint EMBS/BMES Conference, Houston, TX, pp. 2481-2482, Oct. 23-26, 2002.
Mattmann et al., “Recognizing Upper Body Postures Using Textile Strain Sensors,” Proceedings Eleventh IEEE International Symposium on Wearable Computers, ISWC, pp. 29-36, 2007.
Mendez et al., “Interactions Between Sleep and Epilepsy,” Journal of Clinical Neurophysiology, American Clinical Neurophysiology Society, 18(2), pp. 106-127, 2001.
Paraschiv-Ionescu et al., “Ambulatory System for the Quantitative and Qualitative Analysis of Patients Treated with Spinal Cord Stimulation,” Gait and Posture, vol. 20, Issue 2, pp. 113-125, Oct. 2004.
Slyper et al., “Action Capture with Accelerometers,” Eurographics/ACM SIGGRAPH Symposium on Computer Animation, Carnegie Mellon University, 7 pp., 2008.
Smith et al., “How do sleep disturbance and chronic pain inter-relate? Insights from the longitudinal and cognitive-behavioral clinical trials literature,” Sleep Medicine Reviews, YSMRV 286, pp. 1-14, 2003.
Smith et al., “Presleep cognitions in Patients with Insomnia Secondary to Chronic Pain,” Journal of Behavioral Medicine, vol. 24, No. 1, pp. 93-114, 2001.
Emmanuel Munguia Tapia, “Activity Recognition from Accelerometer Data for Videogame Applications,” http://alumni.media.mit.edu/˜emunguia/html/videogames.htm, 7 pp., Dec. 2, 2003, printed Oct. 1, 2009.
Trolier-Mckinstry et al., “Thin Film Piezoelectrics for MEMS,” Journal of Electroceramics, v. 12, No. 1-2, pp. 7-17, Jan./Mar. 2004.
Tuck, “Implementing Auto-Zero Calibration Technique for Accelerometers,” Freescale Semiconductor Application Note AN3447, 5 pp., Mar. 2007.
Tuisku, “Motor Activity Measured by Actometry in Neuropsychiatric Disorders,” Department of Psychiatry, University of Helsinki, Helsinki, Finland, 115 pp., 2002.
Vega-Gonzalez, “Continuous Monitoring of Upper Limb Activity in a Free-Living Environment,” Arch Phys Med Rehabil, vol. 86, pp. 541-548, Mar. 2005.
Leung et al., “An Integrated Dual Sensor System Automatically Optimized by Target Rate Histogram,” Pacing and Clinical Electrophysiology, vol. 21, No. 8, 7 pp., Aug. 8, 1998.
Saoudi et al., “How Smart Should Pacemakers Be?,” American Journal of Cardiology, vol. 83, No. 5, 6 pp., Mar. 5, 1999.
Velten et al., “A New Three-Axis Accelerometer,” Sensor '99-9th Int'l Trade Fair and Conference for Sensors/Transducers & Systems, Nurnberg, Germany, May 18-20, 1999, Sensor '99 Proceedings II, 1999, A 5.2, 6 pp.
International Preliminary Report on Patentability for PCT Application PCT/US2004/002113, dated Sep. 19, 2005, 5 pp.
European Office Action dated Nov. 13, 2008 for Application No. 06844740.8, 2 pp.
European Office Action dated Nov. 13, 2008 for Application No. 06844725.9, 2 pp.
Canadian Office Action dated Mar. 12, 2012 for Canadian Application No. 2,538,356, 3 pp.
Prosecution History from U.S. Pat. No. 8,396,565 from Feb. 28, 2006 through Jan. 11, 2013, 263 pp.
Prosecution History from U.S. Pat. No. 7,853,322 from Jul. 23, 2008 through Sep. 15, 2010, 139 pp.
Prosecution History from U.S. Pat. No. 7,957,809 from Mar. 9, 2010 through Jan. 31, 2011, 37 pp.
Prosecution History from U.S. Pat. No. 7,957,797 from Mar. 9, 2010 through Jan. 31, 2011, 58 pp.
Prosecution History from U.S. Appl. No. 12/966,827, filed Jul. 10, 2012 through Nov. 15, 2013, 48 pp.
Prosecution History from U.S. Appl. No. 13/154,303, filed Aug. 16, 2013 through Nov. 15, 2013, 14 pp.
Prosecution History from U.S. Appl. No. 13/154,309, filed Nov. 23, 2012 through Nov. 1, 2013, 41 pp.
Notice of Allowance from U.S. Appl. No. 13/154,309, dated Jan. 8, 2014, 7 pp.
Response to Office Action dated Dec. 5, 2013 from U.S. Appl. No. 13/154,303, filed Mar. 5, 2014, 8 pp.
Office Action from U.S. Appl. No. 13/154,303, dated Jul. 7, 2014, 7 pp.
Notice of Allowance from U.S. Appl. No. 12/966,827, dated Jul. 18, 2014, 5 pp.
Response to Office Action dated Jul. 7, 2014, from U.S. Appl. No. 13/154,303, filed Oct. 7, 2014, 8 pp.
Final Office Action from U.S. Appl. No. 13/154,303, dated Dec. 11, 2014, 8 pp.
Response to Final Office Action dated Dec. 11, 2014, from U.S. Appl. No. 13/154,303, filed Feb. 11, 2015, 4 pp.
Examiner's Answer from U.S. Appl. No. 13/154,303, dated Nov. 9, 2015, 9 pp.
Response to Decision on Appeal dated Oct. 2, 2017, from U.S. Appl. No. 13/154,303, filed Dec. 4, 2017, 9 pp.
Decision on Appeal from U.S. Appl. No. 13/154,303, dated Oct. 2, 2017, 16 pp.
Written Opinion from International Application No. PCT/US2004/002113, dated Jun. 21, 2004, 5 pp.
Response to Examiner's second report from counterpart Australian Patent Application No. 2004279285, filed on Aug. 10, 2010, 2 pp.
Examiner's Report from counterpart Canadian Patent Application No. 2538356, dated Mar. 21, 2013, 3 pp.
Response to Examiner's Report dated Mar. 12, 2012, from counterpart Canadian Patent Application No. 2538356, filed on Sep. 11, 2012, 16 pp.
Response to Examiner's Report dated Mar. 21, 2013, from counterpart Canadian Patent Application No. 2538356, filed on Jul. 8, 2013, 10 pp.
Examiner's Report from counterpart Canadian Patent Application No. 2538356, dated Feb. 3, 2011, 2 pp.
Response to Examiner's Report dated Feb. 3, 2011, from counterpart Canadian Patent Application No. 2538356, filed on Jun. 21, 2011, 9 pp.
Office Action from U.S. Appl. No. 13/154,303, dated Jan. 12, 2018, 9 pp.
Response to Office Action dated Jan. 12, 2018, from U.S. Appl. No. 13/154,303, filed Apr. 12, 2018, 6 pp.
Office Action from U.S. Appl. No. 13/154,303, dated Aug. 16, 2018, 9 pp.
Related Publications (1)
Number Date Country
20130150921 A1 Jun 2013 US
Provisional Applications (1)
Number Date Country
60503218 Sep 2003 US
Continuations (1)
Number Date Country
Parent 10691917 Oct 2003 US
Child 13764054 US