Electroencephalography (“EEG”), which involves recording electrical activity along the scalp, is a valuable tool in detecting and monitoring a host of neurological disorders. For example, by comparing the recorded electrical activity (e.g., voltage fluctuations) in the brain of a subject to the physical manifestations (e.g., a seizure) observed in the subject, a practitioner may diagnose the particular neurological disorder (e.g., epilepsy). However, even when EEG and video data are readily available, the practitioner still faces the hurdle of identifying which physical manifestations correspond to which electrical activity. This process is further complicated by the fact that, in addition to observing a subject for the slightest physical manifestation (e.g., a movement of a finger, a blink of an eye, a twitch of a lip, etc.), the practitioner must simultaneously review numerous channels of incoming EEG data for millisecond fluctuations, any of which may affect an eventual diagnosis.
Accordingly, methods and systems are disclosed herein for a guidance application that allows a practitioner to easily correlate a fluctuation in any channel of a plurality of channels constituting received electroencephalography (“EEG”) data to a particular physical manifestation. For example, by improving the ability of a practitioner to correlate a millisecond fluctuation in a single channel of EEG data to even the slightest of physical manifestations, the likelihood that the practitioner can accurately diagnose a cause of an underlying neurological disorder is increased.
Specifically, the guidance application provides a series of user interfaces that allows a practitioner to simultaneously observe both incoming EEG data and physical manifestations of the subject. Furthermore, the guidance application automatically synchronizes incoming EEG data to the physical manifestations. For example, the guidance application allows a practitioner to automatically retrieve a portion of video data (e.g., of a physical manifestation of a subject) that corresponds to a selected portion of EEG data (e.g., of an electrical fluctuation of the brain activity of the subject). Moreover, the guidance application allows a practitioner to highlight and compare the selected portion to other selected portions in order to ease any review and analysis.
In some aspects, the guidance application may receive EEG data comprising a plurality of EEG channels, in which each EEG channel of the plurality of EEG channels comprises a plurality of EEG instances. For example, in order to monitor brain activity of a subject, the guidance application may receive EEG data from multiple electrodes attached to the scalp of the subject, in which each electrode corresponds to a particular channel. Furthermore, each channel may correspond to real-time voltage fluctuations in the brain activity of the subject. The guidance application may also receive video data of a subject to which the multiple electrodes are attached. For example, in addition to monitoring brain activity of the subject, the guidance application may monitor physical manifestations by generating a video recording of the subject.
The guidance application may correlate each EEG instance of the plurality of EEG instances to a respective portion of the video data in a database. For example, the guidance application may generate a time stamp for each EEG instance, each of which corresponds to a time stamp for a portion of the video data. The various time stamps may then be stored for later retrieval.
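By way of illustration only, this correlation step might be sketched as follows, assuming a fixed EEG sampling rate, a fixed video frame rate, and SQLite as the database; the table layout and function names are hypothetical and not part of the disclosure:

```python
# Minimal sketch (assumptions: fixed EEG sampling rate, fixed video frame
# rate, SQLite as the database; all names are illustrative only).
import sqlite3

def correlate_eeg_to_video(num_instances, eeg_rate_hz, video_fps,
                           db_path="correlation.db"):
    """Generate a time stamp for each EEG instance and store, alongside it,
    the index of the video frame captured at that same time stamp."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS correlation "
        "(sample_index INTEGER PRIMARY KEY, time_s REAL, video_frame INTEGER)"
    )
    rows = [(i, i / eeg_rate_hz, int(i / eeg_rate_hz * video_fps))
            for i in range(num_instances)]
    conn.executemany("INSERT OR REPLACE INTO correlation VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn
```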
The guidance application may then receive a first user input selecting a first EEG instance of the plurality of EEG instances. For example, the guidance application may receive a user input selecting a particular EEG instance (e.g., corresponding to a large voltage fluctuation) for which a practitioner wishes to examine a subject for a corresponding physical manifestation.
In response to receiving the first user input, the guidance application may cross-reference the first EEG instance with the database to determine a first portion of the video data that corresponds to the first EEG instance. For example, the guidance application may determine the time stamp corresponding to the EEG instance and match that time stamp with a time stamp corresponding to a portion of the video data.
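Continuing the illustrative sketch above, the cross-referencing step could reduce to a nearest-time-stamp lookup; the query and names are assumptions rather than the claimed implementation:

```python
# Hypothetical lookup: given the time stamp of the selected EEG instance,
# return the video frame stored for the nearest correlated time stamp.
def find_video_frame(conn, selected_time_s):
    row = conn.execute(
        "SELECT video_frame FROM correlation ORDER BY ABS(time_s - ?) LIMIT 1",
        (selected_time_s,),
    ).fetchone()
    return None if row is None else row[0]
```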
The guidance application may then generate for display, on a display device, the first portion. For example, in response to the request of the practitioner to view the physical manifestation of a large voltage fluctuation in the EEG data, the guidance application presents a portion of video data that corresponds to the large voltage fluctuation.
The guidance application may continue to generate for display the first portion until the end of the video data or until the guidance application receives a second user input (e.g., corresponding to a second EEG instance), at which point the first portion is replaced with a second portion of the video data (e.g., corresponding to the second EEG instance). Accordingly, the guidance application may easily facilitate the review of physical manifestations corresponding to any voltage fluctuation (or lack thereof) in order to diagnose a neurological disorder.
In some embodiments, the guidance application may generate for display a graphical representation of each EEG channel in a region of a display screen. The region may be divided into a plurality of sub-regions, in which each sub-region corresponds to a particular portion of the video data and/or EEG instances with a particular time stamp. Furthermore, the guidance application may be configured to allow a practitioner to select a particular EEG instance by hovering over the sub-region corresponding to the particular portion.
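One way such hover selection could be computed, purely as an illustration (the pixel geometry and names below are assumptions), is to map the cursor position to a sub-region index and then to the time stamp that sub-region represents:

```python
# Illustrative sketch: map a cursor position to a sub-region of the region
# displaying the EEG channels, then map that sub-region to a time stamp.
def sub_region_for_cursor(cursor_x, region_x, region_width, num_sub_regions):
    """Return the index of the sub-region under the cursor, or None if the
    cursor lies outside the region."""
    if not (region_x <= cursor_x < region_x + region_width):
        return None
    return int((cursor_x - region_x) * num_sub_regions / region_width)

def time_stamp_for_sub_region(index, start_s, end_s, num_sub_regions):
    """Return the time stamp (in seconds) at the start of the given sub-region."""
    return start_s + index * (end_s - start_s) / num_sub_regions
```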
In some aspects, the guidance application may receive electroencephalography data comprising a plurality of EEG channels received from a subject during a hyperventilation period and post-hyperventilation period, in which each EEG channel of the plurality of EEG channels comprises a plurality of EEG instances. For example, in some cases, a practitioner may need to induce hyperventilation (e.g., a condition in which the brain of a subject is stressed) in order to diagnose a neurological disorder. However, as such inductions pose safety concerns for the subject, the practitioner must actively monitor the incoming EEG data and the length of the hyperventilation of the subject.
To ensure patient safety, the guidance application may receive a first user input defining the hyperventilation period and a second user input defining the post-hyperventilation period. For example, the guidance application may receive an input from the practitioner indicating how long a subject should remain in a hyperventilation state as well as how long the subject needs to recover.
The guidance application may then correlate each EEG instance of the plurality of EEG instances received in incoming EEG data to a progression of the subject through the hyperventilation period and the post-hyperventilation period. For example, the guidance application may not only receive and record incoming EEG data but also manage the progression of the subject through the hyperventilation period and the post-hyperventilation period by generating for display, on a display screen, an on-screen graphic corresponding to the progression. For example, in order to allow the practitioner to analyze the incoming EEG data without losing track of the length of the hyperventilation of the subject, the guidance application may generate an on-screen graphic that intuitively informs the practitioner of the progress of the subject. For example, the on-screen graphic may include a first portion that corresponds to the hyperventilation period and a second portion that corresponds to the post-hyperventilation period.
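As a rough, non-limiting sketch, the progression could be tracked with a simple timer whose output drives the two portions of the on-screen graphic; the class and field names below are illustrative assumptions:

```python
# Illustrative sketch: track progression through the practitioner-defined
# hyperventilation and post-hyperventilation periods (given in seconds).
import time

class HyperventilationTimer:
    def __init__(self, hyperventilation_s, post_hyperventilation_s):
        self.hyperventilation_s = hyperventilation_s
        self.post_hyperventilation_s = post_hyperventilation_s
        self.start = time.monotonic()

    def progression(self):
        """Return (period_name, fraction_complete) for the current moment."""
        elapsed = time.monotonic() - self.start
        if elapsed < self.hyperventilation_s:
            return "hyperventilation", elapsed / self.hyperventilation_s
        post_elapsed = elapsed - self.hyperventilation_s
        if post_elapsed < self.post_hyperventilation_s:
            return "post-hyperventilation", post_elapsed / self.post_hyperventilation_s
        return "complete", 1.0
```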
In some embodiments, to increase the intuitiveness of the on-screen graphic and to reduce the reliance on textual elements (which may require a greater amount of the attention of the practitioner to monitor), the on-screen graphic may include one or more non-textual elements that indicate the progression of the subject through the hyperventilation period and the post-hyperventilation period. For example, the guidance application may determine whether the progression corresponds to a hyperventilation period or a post-hyperventilation period and modify the characteristics (e.g., size, shape, color, etc.) of the non-textual elements accordingly.
Finally, in some aspects, the guidance application may receive user inputs highlighting one or more portions of EEG data. For example, the user inputs may highlight portions of EEG data that include particular voltage fluctuations that a practitioner wishes to analyze in detail. In order to facilitate easy review of these portions, the guidance application may provide options for the practitioner to view truncated EEG data that features only the highlighted portions.
Furthermore, the guidance application may allow a practitioner to interactively adjust the EEG data that is truncated such that the practitioner can understand the relationship between highlighted EEG data and adjacent, unhighlighted EEG data. For example, the guidance application may provide options for incrementally increasing or decreasing the amount of EEG data that is truncated.
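A minimal sketch of such adjustable truncation follows, assuming (for illustration only) that the EEG data is held as a time-ordered list and that highlighted portions are given as index ranges:

```python
# Illustrative sketch: return only the highlighted portions, padded by an
# adjustable margin of adjacent, unhighlighted instances on each side.
def truncated_view(eeg_instances, highlighted_ranges, margin=0):
    """highlighted_ranges is a list of (start_index, end_index) pairs,
    end-exclusive; margin models incrementally including adjacent data."""
    view = []
    for start, end in highlighted_ranges:
        lo = max(0, start - margin)
        hi = min(len(eeg_instances), end + margin)
        view.append(eeg_instances[lo:hi])
    return view
```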
It should be noted that the embodiments described above may be applied to, or used in accordance with, other embodiments throughout this disclosure.
The above and other objects and advantages of the disclosure will be apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which like reference characters refer to like parts throughout, and in which:
Methods and systems are disclosed herein for a guidance application that allows a practitioner to easily correlate a fluctuation in any channel of a plurality of channels constituting received electroencephalography (“EEG”) data to a particular physical manifestation. For example, by improving the ability of a practitioner to correlate a millisecond fluctuation in a single channel of EEG data to even the slightest of physical manifestations, the likelihood that the practitioner can accurately diagnose a cause of an underlying neurological disorder is increased.
As referred to herein, a “guidance application” is an application that facilitates the receipt, navigation, and/or viewing of EEG data. The guidance application and/or any instructions for performing any of the embodiments discussed herein may be encoded on computer readable media. Computer readable media includes any media capable of storing data. The computer readable media may be transitory, including, but not limited to, propagating electrical or electromagnetic signals, or may be non-transitory including, but not limited to, volatile and nonvolatile computer memory or storage devices such as a hard disk, floppy disk, USB drive, DVD, CD, media cards, register memory, processor caches, Random Access Memory (“RAM”), etc.
As referred to herein, “EEG data” is data received via electroencephalography. For example, EEG data may correspond to real-time voltage fluctuations in the brain activity of the subject resulting from ionic current flows within the neurons of the brain. The guidance application may receive EEG data via a plurality of EEG channels. Each EEG channel may correspond to EEG data received from a specific EEG electrode. Furthermore, EEG data may include a plurality of EEG instances. As referred to herein, an “EEG instance” is EEG data that is received at a particular point in time. For example, EEG data may be received in a continuous or real-time manner as a practitioner performs an EEG test on a subject. While the test is being performed, the guidance application may receive a continuous stream of EEG data, which includes a series of EEG instances.
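For illustration, the terms defined above might map onto a simple in-memory structure such as the following (a hypothetical representation, not the claimed one), in which EEG data holds one time-ordered stream of EEG instances per channel:

```python
# Hypothetical in-memory representation of the terms defined above.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EegInstance:
    time_s: float      # time stamp at which this sample was received
    voltage_uv: float  # measured voltage fluctuation, in microvolts

@dataclass
class EegData:
    # One time-ordered stream of EEG instances per EEG channel (electrode).
    channels: Dict[str, List[EegInstance]] = field(default_factory=dict)

    def append(self, channel: str, instance: EegInstance) -> None:
        """Append a newly received instance to the stream for one electrode."""
        self.channels.setdefault(channel, []).append(instance)
```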
The guidance application may also monitor (e.g., via video recording) physical manifestations of a subject. As referred to herein, a “physical manifestation” is any event or action performed by a subject that may be observed visually (e.g., a movement of a finger, a blink of an eye, a twitch of a lip, an uneven breathing pattern, an increased rate of perspiration, etc.). For example, in addition to monitoring brain activity of the subject, the guidance application may monitor physical manifestations by generating a video recording of the subject.
In some embodiments, the guidance application may receive electroencephalography data comprising a plurality of EEG channels received from a subject during a hyperventilation period and post-hyperventilation period, in which each EEG channel of the plurality of EEG channels comprises a plurality of EEG instances. As referred to herein, a “hyperventilation period” is a period of time associated with the hyperventilation (or the goal of inducing a hyperventilation) of a subject. As referred to herein, a “post-hyperventilation period” is a period of time that is not associated with the hyperventilation (or the goal of inducing a hyperventilation) of a subject.
To ensure patient safety, the guidance application may receive a first user input defining the hyperventilation period and a second user input defining the post-hyperventilation period. The guidance application may then generate for display, on a display screen, an on-screen graphic corresponding to the progression of a subject through the hyperventilation period and the post-hyperventilation period.
As referred to herein, an “on-screen graphic” is any human-consumable data that is visually displayed on a display screen. For example, an on-screen graphic may include any visual such as a drawing, a graph, a chart, an image, etc. The on-screen graphic may include textual elements (e.g., numbers, letters, etc.) and/or non-textual elements (e.g., symbols, shapes, images, etc.). The elements of the on-screen graphic may include various characteristics (e.g., related to size, shape, brightness, hue, opaqueness, and/or any other feature or quality that may visually distinguish one element from another). In some embodiments, the guidance application may vary the characteristics of textual and non-textual elements in order to distinguish particular data or otherwise gain the attention of a practitioner. For example, a non-textual element associated with a hyperventilation period in an on-screen graphic may have one or more different characteristics than a non-textual element associated with a post-hyperventilation period in the on-screen graphic. Such variances are key to intuitively alerting a practitioner to changes in data or conditions.
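By way of example only (the specific colors and sizes below are assumptions, not requirements), such characteristic variation might be expressed as a small mapping from the current period to display characteristics:

```python
# Illustrative only: choose characteristics of a non-textual element based
# on whether the progression is in the hyperventilation or
# post-hyperventilation period.
def element_characteristics(period):
    """Return display characteristics for the progression indicator."""
    if period == "hyperventilation":
        return {"color": "red", "size": "large"}      # draw attention during the stressed state
    if period == "post-hyperventilation":
        return {"color": "green", "size": "medium"}   # calmer styling during recovery
    return {"color": "gray", "size": "small"}         # idle / complete
```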
For example, the scalp or cranium surface of user 100 includes first portion 102, second portion 104, third portion 106, and fourth portion 108. In some embodiments, each of first portion 102, second portion 104, third portion 106, and fourth portion 108 may correspond to a different region of brain 110. During EEG, a position on the scalp of a user (e.g., user 100) may correspond to a region of a brain (e.g., brain 110) of the user. For example, in some embodiments, first portion 102 may correspond to frontal lobe 112, second portion 104 may correspond to parietal lobe 114, third portion 106 may correspond to occipital lobe 116, and fourth portion 108 may correspond to temporal lobe 118. The voltage fluctuations associated with a specific region of the brain may provide evidence of specific neurological disorders. In addition, voltage fluctuations associated with a specific region of the brain may often correlate to specific physical manifestations.
During an EEG test, a practitioner may distribute a plurality of electrodes about the scalp of a user in order to detect voltage fluctuations resulting from ionic current flows within the neurons of the various regions of the brain as shown in
Each of the positions identified in
The positions labeled and identified in
Control circuitry 404 may be based on any suitable processing circuitry such as processing circuitry 406. As referred to herein, processing circuitry should be understood to mean circuitry based on one or more microprocessors, microcontrollers, digital signal processors, programmable logic devices, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), etc., and may include a multi-core processor (e.g., dual-core, quad-core, hexa-core, or any suitable number of cores) or supercomputer. In some embodiments, processing circuitry may be distributed across multiple separate processors or processing units, for example, multiples of the same type of processing units (e.g., two Intel Core i7 processors) or multiple different processors (e.g., an Intel Core i5 processor and an Intel Core i7 processor). In some embodiments, control circuitry 404 executes instructions for a guidance application stored in memory (i.e., storage 408). Specifically, control circuitry 404 may be instructed by the guidance application to perform the functions discussed above and below. For example, the guidance application may provide instructions to control circuitry 404 to monitor and record EEG data and/or execute user inputs. In some implementations, any action performed by control circuitry 404 may be based on instructions received from the guidance application.
In client-server based embodiments, control circuitry 404 may include communications circuitry suitable for communicating with a guidance application server or other networks or servers. The instructions for carrying out the above mentioned functionality may be stored on the guidance application server. Communications circuitry may include a cable modem, an integrated services digital network (ISDN) modem, a digital subscriber line (DSL) modem, a telephone modem, Ethernet card, or a wireless modem for communications with other equipment, or any other suitable communications circuitry. Such communications may involve the Internet or any other suitable communications networks or paths. In addition, communications circuitry may include circuitry that enables peer-to-peer communication of user equipment devices, or communication of user equipment devices in locations remote from each other (described in more detail below).
Memory may be an electronic storage device provided as storage 408 that is part of control circuitry 404. As referred to herein, the phrase “electronic storage device” or “storage device” should be understood to mean any device for storing electronic data, computer software, or firmware, such as random-access memory, read-only memory, hard drives, optical drives, digital video disc (DVD) recorders, compact disc (CD) recorders, BLU-RAY disc (BD) recorders, BLU-RAY 3D disc recorders, digital video recorders (DVR, sometimes called a personal video recorder, or PVR), solid state devices, quantum storage devices, gaming consoles, gaming media, or any other suitable fixed or removable storage devices, and/or any combination of the same. Storage 408 may be used to store the various types of content and guidance data described herein. Nonvolatile memory may also be used (e.g., to launch a boot-up routine and other instructions). Cloud-based storage may be used to supplement storage 408 or instead of storage 408.
Control circuitry 404 may include video generating circuitry and tuning circuitry, such as one or more analog tuners, one or more MPEG-2 decoders or other digital decoding circuitry, high-definition tuners, or any other suitable tuning or video circuits or combinations of such circuits. Encoding circuitry (e.g., for converting over-the-air, analog, or digital signals to MPEG signals for storage) may also be provided. Control circuitry 404 may also include scaler circuitry for upconverting and downconverting content into the preferred output format of the user equipment 400. Circuitry 404 may also include digital-to-analog converter circuitry and analog-to-digital converter circuitry for converting between digital and analog signals. The tuning and encoding circuitry may be used by the user equipment device to receive and to display, to play, or to record content.
A user may send instructions to control circuitry 404 using user input interface 410. User input interface 410 may be any suitable user interface, such as a remote control, mouse, trackball, keypad, keyboard, touch screen, touchpad, stylus input, joystick, voice recognition interface, or other user input interfaces. Display 412 may be provided as a stand-alone device or integrated with other elements of user equipment device 400. For example, display 412 may be a touchscreen or touch-sensitive display. In such circumstances, user input interface 410 may be integrated with or combined with display 412. Display 412 may be one or more of a monitor, a television, a liquid crystal display (LCD) for a mobile device, amorphous silicon display, low temperature poly silicon display, electronic ink display, electrophoretic display, active matrix display, electro-wetting display, electrofluidic display, cathode ray tube display, light-emitting diode display, electroluminescent display, plasma display panel, high-performance addressing display, thin-film transistor display, organic light-emitting diode display, surface-conduction electron-emitter display (SED), laser television, carbon nanotubes, quantum dot display, interferometric modulator display, or any other suitable equipment for displaying visual images. In some embodiments, display 412 may be HDTV-capable. In some embodiments, display 412 may be a 3D display, and the guidance application and any suitable content may be displayed in 3D. A video card or graphics card may generate the output to the display 412. The video card may offer various functions such as accelerated rendering of 3D scenes and 2D graphics, MPEG-2/MPEG-4 decoding, TV output, or the ability to connect multiple monitors. The video card may be any processing circuitry described above in relation to control circuitry 404. The video card may be integrated with the control circuitry 404. Speakers 414 may be provided as integrated with other elements of user equipment device 400 or may be stand-alone units. The audio component of videos and other content displayed on display 412 may be played through speakers 414. In some embodiments, the audio may be distributed to a receiver (not shown), which processes and outputs the audio via speakers 414.
The guidance application may be implemented using any suitable architecture. For example, it may be a stand-alone application wholly-implemented on user equipment device 400. In such an approach, instructions of the application are stored locally (e.g., in storage 408), and data for use by the application is downloaded on a periodic basis (e.g., from an out-of-band feed, from an Internet resource, or using another suitable approach). Control circuitry 404 may retrieve instructions of the application from storage 408 and process the instructions to generate any of the displays discussed herein. Based on the processed instructions, control circuitry 404 may determine what action to perform when input is received from user input interface 410. For example, movement of a cursor on a display up/down may be indicated by the processed instructions when input interface 410 indicates that an up/down button was selected.
In some embodiments, the guidance application is a client-server based application. Data for use by a thick or thin client implemented on user equipment device 400 is retrieved on-demand by issuing requests to a server remote to the user equipment device 400. In one example of a client-server based guidance application, control circuitry 404 runs a web browser that interprets web pages provided by a remote server. For example, the remote server may store the instructions for the application in a storage device. The remote server may process the stored instructions using circuitry (e.g., control circuitry 404) and generate the displays discussed above and below. The client device may receive the displays generated by the remote server and may display the content of the displays locally on equipment device 400. This way, the processing of the instructions is performed remotely by the server while the resulting displays are provided locally on equipment device 400. Equipment device 400 may receive inputs from the user via input interface 410 and transmit those inputs to the remote server for processing and generating the corresponding displays. For example, equipment device 400 may transmit a communication to the remote server indicating that an up/down button was selected via user input interface 410. The remote server may process instructions in accordance with that input and generate a display of the application corresponding to the input (e.g., a display that moves a cursor up/down). The generated display is then transmitted to equipment device 400 for presentation to the user.
In some embodiments, the guidance application is downloaded and interpreted or otherwise run by an interpreter or virtual machine (run by control circuitry 404). In some embodiments, the guidance application may be encoded in the ETV Binary Interchange Format (EBIF), received by control circuitry 404 as part of a suitable feed, and interpreted by a user agent running on control circuitry 404. For example, the guidance application may be an EBIF application. In some embodiments, the guidance application may be defined by a series of JAVA-based files that are received and run by a local virtual machine or other suitable middleware executed by control circuitry 404. In some of such embodiments (e.g., those employing MPEG-2 or other digital media encoding schemes), the guidance application may be, for example, encoded and transmitted in an MPEG-2 object carousel with the MPEG audio and video packets of a program.
In addition to first region 500,
For example, in
For example, after on-screen cursor 802 is positioned at a location in a first region (e.g., corresponding to a particular sub-region as discussed above), the guidance application may (e.g., via control circuitry 404 (
The guidance application may (e.g., via control circuitry 404 (
Furthermore, in some embodiments, the guidance application may cross-reference a coordinate and/or a time stamp associated with the position of an on-screen cursor (or EEG data that corresponds to the position of the on-screen cursor) with a database that indicates a portion of video data, an on-screen graphic, etc. associated with different coordinates and/or a time stamps. For example, the guidance application may (e.g., via control circuitry 404 (
For example, in order to intuitively present how much time remains in a hyperventilation or post-hyperventilation period, the guidance application (e.g., via control circuitry 404 (
In some embodiments, the guidance application may receive (e.g., via control circuitry 404 (
In
In some embodiments, the guidance application may trigger the transition from a hyperventilation period to a post-hyperventilation period automatically. For example, in response to determining that a selected amount of time for the hyperventilation period has expired, the guidance application may automatically transition to the post-hyperventilation period. Alternatively, the guidance application may trigger the transition from a hyperventilation period to a post-hyperventilation period in response to a user input (e.g., received via user input interface 410 (
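A hedged sketch of this trigger logic, assuming the elapsed time and the selected hyperventilation length are available in seconds and that a user input is represented by a boolean override:

```python
# Illustrative only: advance to the post-hyperventilation period when the
# selected hyperventilation time expires, or earlier upon a user input.
def current_period(elapsed_s, hyperventilation_s, user_override=False):
    if user_override or elapsed_s >= hyperventilation_s:
        return "post-hyperventilation"
    return "hyperventilation"
```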
In some embodiments, the guidance application may highlight particular portions of EEG data. For example, a portion of EEG data may include voltage fluctuations, physical manifestations, etc. that are of interest to a practitioner. Accordingly, the guidance application may allow a user to highlight those portions.
The guidance application may generate for display truncation bar 2302 on region 2300, which indicates the order between two selected portions. The guidance application further allows unselected EEG data to be intuitively explored while its relation to the selected EEG data is preserved. For example, as shown in
At step 2502, the guidance application receives (e.g., via I/O path 402 (
At step 2504, the guidance application receives (e.g., via I/O path 402 (
At step 2506, the guidance application correlates (e.g., via control circuitry 404 (
At step 2508, the guidance application receives a first user input selecting a first EEG instance of the plurality of EEG instances. For example, the guidance application may receive a user input (e.g., via user input interface 410 (
At step 2510, in response to receiving the first user input, the guidance application cross-references (e.g., via control circuitry 404 (
At step 2512, the guidance application generates for display, on a display device, the first portion. For example, in response to the request of the practitioner to view the physical manifestation of a large voltage fluctuation in the EEG data, the guidance application presents (e.g., via control circuitry 404 (
In some embodiments, the guidance application may continue to generate for display (e.g., via control circuitry 404 (
In some embodiments, the guidance application may generate for display (e.g., via control circuitry 404 (
It is contemplated that the steps or descriptions of
At step 2602, the guidance application receives (e.g., via I/O path 402 (
At step 2604, the guidance application receives a first user input (e.g., via user input interface 410 (
At step 2608, the guidance application correlates (e.g., via control circuitry 404 (
At step 2610, the guidance application generates for display, on a display screen, an on-screen graphic corresponding to the progression. For example, in order to allow the practitioner to analyze the incoming EEG without losing track of the length of the hyperventilation of the user, the guidance application may generate an on-screen graphic that intuitively informs the practitioner of the progress of the subject. For example, the on-screen graphic may include a first portion that corresponds to the hyperventilation period and a second portion that corresponds to the post-hyperventilation period.
In some embodiments, to increase the intuitiveness of the on-screen graphic and to reduce the reliance on textual elements (which may require a greater amount of the attention of the practitioner to monitor), the on-screen graphic may include one or more non-textual elements that indicate the progression of the subject through the hyperventilation period and the post-hyperventilation period. For example, the guidance application may determine whether the progression corresponds to a hyperventilation period or a post-hyperventilation period and modify the characteristics (e.g., size, shape, color, etc.) of the non-textual elements accordingly.
It is contemplated that the steps or descriptions of
At step 2702, the guidance application detects (e.g., via control circuitry 404 (
At step 2704, the guidance application determines (e.g., via control circuitry 404 (
At step 2706, the guidance application identifies (e.g., via control circuitry 404 (
At step 2708, the guidance application determines (e.g., via control circuitry 404 (
If the guidance application identifies a time stamp for the sub-region, the guidance application proceeds to step 2714 and compares the time stamp for the sub-region to time stamps corresponding to video data. For example, the guidance application may cross-reference the time stamp for the sub-region with a database (e.g., located at storage 408 (
At step 2716, the guidance application generates for display (e.g., on display 412 (
It is contemplated that the steps or descriptions of
As described with respect to
The above-described embodiments of the present disclosure are presented for purposes of illustration and not of limitation, and the present disclosure is limited only by the claims that follow. Furthermore, it should be noted that the features and limitations described in any one embodiment may be applied to any other embodiment herein, and flowcharts or examples relating to one embodiment may be combined with any other embodiment in a suitable manner, done in different orders, or done in parallel. In addition, the systems and methods described herein may be performed in real time. It should also be noted, the systems and/or methods described above may be applied to, or used in accordance with, other systems and/or methods.
This application claims the benefit of U.S. Provisional Application No. 62/139,151, filed Mar. 27, 2015, which is hereby incorporated by reference herein in its entirety.