This disclosure is directed to an external monitor connection for a clinician programmer and the display of one or more simulated cursors.
Neurostimulation devices deliver therapy in the form of electrical stimulation pulses to treat symptoms and conditions, such as chronic pain, Parkinson's disease, or epilepsy, for example. Implantable neurostimulation devices, for example, deliver neurostimulation therapy via leads that include electrodes located proximate to the muscles and nerves of a patient.
Clinician programmers are used to control and program the neurostimulation devices with stimulation sequences to treat symptoms and conditions. These clinician programmers and devices of their type are relatively small to allow for easy transportation and storage. The portability has its price, however. It is difficult for more than one person to view the relatively small screen of a handheld programmer, and people must crowd around the device to see what is happening on the screen.
Further, even though the clinician programmer is portable, there are some areas where its use may be restricted. For instance, a clinician programmer may be covered under the drapes while a sales representative is talking to the patient. The clinician programmer thus may not be visible to the physician. As another example, the clinician programmer may not be a sterile device and cannot be taken into the sterile field in an operating room. Since the clinician programmer must remain outside of the sterile field, the physician is unable to read the screen while performing the procedure. Accordingly, the physician must verbally interact with and rely on someone (an external operator) who acts as his or her eyes and hands, controlling the programmer outside of the sterile field. The situation could also be reversed, where the physician is doing the programming and the staff is observing his or her actions while, for example, talking to the patient at the head end of the surgery table. In either case, requiring an extra person adds time to the procedure, because the programming device state and the adjustments to be made must be communicated verbally between the physician and the external operator. The verbal interchange may also result in miscommunication, which adds further time to complete the procedure and could lead to more severe consequences.
In addition, due to the small size of the clinician programmer, it may be difficult for users and observers to precisely identify an area of the screen (on the clinician programmer) where the user is performing data entry or other programming actions. In other words, users and observers may not know what part of the screen is being touched while the user is interacting with the user interface. Also, if the device is coupled to an external monitor, observers will not know what part of the clinician programmer screen is being touched by the user, especially when the external monitor and the observers are located in a separate room from the clinician programmer and the user.
The present disclosure is directed to devices, systems, and methods that address one or more deficiencies in the prior art.
This disclosure is directed to a clinician programmer in communication with an external monitor that helps to alleviate the problems set out above. When demonstrating the device, for example, the screen can be displayed on a large monitor for group viewing. In addition, when used for a procedure within an operating room, the programmer can be kept outside the sterile field, but its user interface can be made available for viewing by the physician and others through a projector or large screen monitor. In many cases, the external screen may be the only screen that the physician can see, because the clinician programmer is under the cover or tucked away.
In one exemplary aspect, the present disclosure is directed to a clinician programming system operable to control an implantable medical device. The clinician programming system includes a clinician programmer with a housing. The clinician programmer includes: a processor and memory having executable instructions enabling programming of an implantable pulse generator; a user interface configured to receive inputs by a clinician instructing operation of an implantable pulse generator; a first display configured to display information indicative of the inputs by the clinician or display information indicative of status of an implantable pulse generator, the first display having a first display size; an implant communication interface configured to transmit information from the clinician programmer to an implantable pulse generator and configured to receive information from an implantable pulse generator; and a secondary display communication interface configured to transmit content shown on the first display. The clinician programming system also includes a secondary unit separate from the housing of the clinician programmer, the secondary unit having a secondary display of a second display size, the secondary display being configured to communicate with the clinician programmer via the secondary display communication interface and configured to display information received via the secondary display communication interface. The secondary display may display information either mirrored or extended from the clinician programmer.
In one exemplary aspect, the present disclosure is directed to a clinician programmer. The clinician programmer includes a first display configured to display information to a user relating to an implantable device; a user input mechanism configured to receive inputs from the user controlling content shown on the first display; a secondary unit communication interface selectively attachable to a secondary unit, the secondary unit communication interface configured to transmit information to a secondary unit having a secondary display and configured to receive information for processing from the secondary unit; and a controller configured to receive a user input from the user input mechanism, the user input selecting between a first mode that sends a display signal to the secondary display causing content shown on the secondary display to mirror or extend content shown on the first display, and a second mode that sends a display signal to the secondary display causing content shown on the secondary display to differ from content shown on the first display.
In one exemplary aspect, the present disclosure is directed to a clinician programmer. The clinician programmer includes a programming software module configured to generate a treatment program executable on an implantable medical device as a result of a user input; an implant communication interface configured to send the treatment program to an implantable medical device to operate the implantable device and configured to receive information from the implantable device; a primary display configured to display information relating to the treatment program; a secondary display unit communication interface configured to transmit information to a secondary display unit and configured to receive information from a secondary display unit; a microphone in communication with the secondary display unit communication interface and configured to capture audio from the user for transmission from the secondary display unit communication interface; and a speaker in communication with the secondary display unit communication interface and configured to receive signals representing audio captured at the secondary display unit.
In one exemplary aspect, the present disclosure is directed to a surgical arrangement used when programming an implantable medical device. The surgical arrangement includes: a non-sterile (or wrapped sterile) clinician programmer having a memory storing instructional information for programming an implantable pulse generator and having a first display screen configured to display the instructional information to a user, the clinician programmer having a secondary display unit interface comprising a transmitter and a receiver configured to send display signals and configured to receive instructional signals; a secondary unit comprising a second display screen sized larger than the clinician programmer first display screen, the second display screen being disposed for viewing from a sterile room and connectable with the clinician programmer, the secondary unit being configured to receive the instructional information from the clinician programmer and display the instructional information on the second display screen under the instruction of the clinician programmer; and an implantable pulse generator in communication with one of the secondary unit and the clinician programmer, the implantable pulse generator having a memory and processor configured to activate electrodes based on information received from said one of the secondary unit and the clinician programmer, the implantable pulse generator being configured to electrically receive said information displayed relating to an electrode of the implantable pulse generator.
In one exemplary aspect, the present disclosure is directed to a method for performing trial stimulation during neurostimulator implant surgery. The method includes: providing a clinician programmer having a first display screen with a first display screen size; providing an external secondary unit having a second display screen with a second display screen size, the second display screen being visible to medical personnel, for example operating room staff working within the sterile field; providing at least one stimulation lead operable to provide electrical stimulation to target tissue within a patient; connecting the clinician programmer to the external secondary unit; operating the clinician programmer to control the stimulation provided through the stimulation lead and to display information related to the stimulation on the external secondary unit; and displaying information relating to the stimulation lead on either one of the first and the second display screens, or both.
In one exemplary aspect, the present disclosure is directed to a method for programming an implantable device. The method includes: receiving an input at a user interface on a tablet-style clinician programmer; generating a first display signal on the clinician programmer that updates content on a first display based on the received user input, the first display having a first size; generating a second display signal for transmission to a secondary unit having a second display separate from the clinician programmer, the second display having a second size, wherein generating the second display signal includes enhancing the content of the second display signal to provide a clear image on the second size display; and transmitting the second display signal from the clinician programmer to the second display.
Yet another aspect of the present disclosure involves a clinician programmer. The clinician programmer includes: a touch-sensitive screen configured to display visual content to a user; one or more sensors configured to detect an engagement from the user with respect to the screen; a transceiver configured to conduct telecommunications with an external monitor that is multiple times larger than the clinician programmer; a memory storage component configured to store programming instructions; and a computer processor configured to execute the programming instructions to perform the following tasks: detecting, via the one or more sensors, the engagement from the user with respect to the screen; determining one or more locations on the screen corresponding to the user engagement; and sending signals, via the transceiver, to the external monitor to display one or more cursors on the external monitor, the one or more cursors graphically representing the one or more locations on the screen of the clinician programmer corresponding to the user engagement, respectively.
Another aspect of the present disclosure involves a medical system. The medical system includes: a monitor; and a clinician programmer located remotely from the monitor, the clinician programmer being configured to program parameters of an electrical stimulation therapy for a patient, the clinician programmer including: a touch-sensitive screen configured to display visual content to a user; one or more sensors configured to detect an engagement from the user with respect to the screen; a transceiver configured to conduct telecommunications with the monitor; a memory storage component configured to store programming instructions; and a computer processor configured to execute the programming instructions to perform the following tasks: detecting, via the one or more sensors, the engagement from the user with respect to the screen; determining one or more locations on the screen corresponding to the user engagement; and sending signals, via the transceiver, to the monitor to display one or more cursors on the monitor, the one or more cursors graphically representing the one or more locations on the screen of the clinician programmer corresponding to the user engagement, respectively.
Yet another aspect of the present disclosure involves a method of visualizing a user interaction with a clinician programmer. The method includes: detecting a user engagement with respect to a screen of the clinician programmer via one or more sensors associated with the screen of the clinician programmer; determining one or more locations on the screen of the clinician programmer corresponding to the user engagement; and displaying, via an external monitor communicatively coupled to the clinician programmer, one or more cursors that graphically represent the one or more locations on the screen of the clinician programmer corresponding to the user engagement, respectively.
Yet one more aspect of the present disclosure involves an apparatus for visualizing a user interaction. The apparatus includes: means for detecting a user engagement with respect to a screen of a clinician programmer, the clinician programmer being configured to program a pulse generator so that the pulse generator delivers an electrical stimulation therapy to a patient; means for determining one or more locations on the screen of the clinician programmer corresponding to the user engagement; and means for displaying one or more cursors that graphically represent the one or more locations on the screen of the clinician programmer corresponding to the user engagement, respectively, wherein the means for the displaying is remotely located from, and communicatively coupled to, the clinician programmer.
Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying figures.
The following disclosure provides many different embodiments, or examples, for implementing different features of various embodiments. Specific examples of components and arrangements are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
The devices, systems, and methods described herein introduce an improved way for controlling and programming an implanted medical device. They use a clinician programmer (“CP”) with a first display electrically coupled (for example, coupled in a wired or wireless manner) to a second display disposed for viewing by individuals other than the clinician performing the programming. In one example, the CP may be outside a sterile field, but may be in communication with a display viewable to others who are within the sterile field. This may be particularly helpful to surgeons who perform surgeries and must issue instructions to a programming clinician to direct or control an implant in a certain manner. The surgeon may look at the second display and see the settings and programming as it occurs on the CP, instead of relying merely on verbal feedback from the programming clinician. This may streamline the surgery and, since the surgeon can now see the clinician programming screen, may help assure the surgeon that his instructions are being carried out as requested. This may reduce verbal communication errors between staff during programming. Accordingly, this may provide a level of redundancy and risk management not achieved when programming is performed with only a CP outside the sterile field. In another example, the CP sends information shown on its display to the second, larger display for viewing by additional people. This may be particularly helpful during training processes, when the clinician may be instructing trainees or other clinicians in treatment techniques, for example.
Referring now to
Neural tissue (not illustrated for the sake of simplicity) branches off from the spinal cord through spaces between the adjacent vertebrae. The neural tissue, along with the cord itself, can be individually and selectively stimulated in accordance with various aspects of the present disclosure. For example, referring to
The electrodes 110 deliver current drawn from the IPG 102, thereby generating an electric field near the neural tissue. The electric field stimulates the neural tissue to accomplish its intended functions. For example, the neural stimulation may alleviate pain in an embodiment. In other embodiments, a stimulator as described above may be placed in different locations throughout the body and may be programmed to address a variety of problems, including, for example but without limitation, prevention or reduction of epileptic seizures, bladder control, weight control, or regulation of heartbeats.
It is understood that the IPG 102, the lead 108, and the electrodes 110 may be implanted completely inside the body, or may have only one or more components implanted within the body while other components remain outside the body. When they are implanted inside the body, the implant location may be adjusted (e.g., anywhere along the spine 10) to deliver the intended therapeutic effects of spinal cord electrical stimulation in a desired region of the spine. The IPG 102 in this system is a fully implantable, battery-powered neurostimulation device for providing electrical stimulation to a body region of a patient. In some embodiments, an external pulse generator (EPG) is used. The EPG is functionally identical to the IPG, but the connection is made through percutaneous wires (although communication may still be wireless). In the example shown in
The CP 104 is typically maintained in a health care provider's possession and can be used to program the IPG 102 as a part of the implantation treatment and later during office visits. For example only, the CP 104 can define the available stimulation programs for the IPG 102 by enabling and disabling particular stimulation programs, can define the actual stimulation programs by creating defined relationships between pulses, and can perform other functions.
The CP 104 is, in one embodiment, a tablet-style device with a touch screen and radios for communicating with active implantable medical devices, such as neurostimulators like the IPG 102. As can be seen in
As shown in
The communication interface 166 enables the CP 104 to communicate with other components of the clinician programming system 150. In the embodiment shown, the communication interface 166 includes a secondary display communication interface 176 and a peripheral interface 178. These, along with other elements of the CP 104, are described in detail with reference to
The CP 104 includes memory 162, which can be internal to the processor 160 (e.g., memory 305), external to the processor 160 (e.g., memory 310), or a combination of both. The memory 162 stores sets of instructional information with stimulation control parameters that are available to be selected for delivery through the communication interface 166 to the IPG 102 for electrical stimulation therapy or to the secondary display unit 152 for display to a plurality of individuals in the surgical area or elsewhere. Exemplary memory includes a read-only memory (“ROM”), a random access memory (“RAM”), an electrically erasable programmable read-only memory (“EEPROM”), a flash memory, a hard disk, or another suitable magnetic, optical, physical, or electronic memory device. The processor 160 executes software that is capable of being stored in the RAM (e.g., during execution), the ROM (e.g., on a generally permanent basis), or another non-transitory computer readable medium such as another memory or a disc. The CP 104 also includes input/output (“I/O”) systems that include routines for transferring information between components within the processor 160 and other components of the CP 104 or external to the CP 104.
Software included in the implementation of the CP 104 is stored in the memory 305 of the processor 160, RAM 310, ROM 315, or external to the CP 104. The software includes, for example, firmware, one or more applications, program data, one or more program modules, and other executable instructions. The processor 160 is configured to retrieve from memory and execute, among other things, instructions related to the control processes and methods described below for the CP 104. For example, the processor 160 is configured to execute instructions retrieved from the memory 162 for establishing a protocol to control the IPG 102. Some embodiments include software modules configured to provide instructions for accomplishing particular tasks handled by the CP 104. For example, the CP 104 includes a programming software module configured to generate a treatment or stimulation program based on input received from a user of the CP 104. A secondary display software module controls the signals and communication sent from the CP 104 to the secondary display unit 152. Additional exemplary software will be described in further detail below.
Since the secondary display screen 182 is larger than the primary display screen 164 as described below, the secondary display software module may be configured to enhance the resolution or otherwise format or modify the display signal in a way that creates a clearer image of the content on the secondary display. This may ensure that a relatively clear image is shown on a secondary screen having a larger screen size than that of the CP primary display screen 164, while not requiring as high resolution for the smaller primary display screen 164.
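As a non-limiting illustration of reformatting a display signal for a larger secondary screen, the following Python sketch upscales a frame rendered at the CP's native resolution to the secondary display's native resolution before transmission. The function names, frame representation, and nearest-neighbor method are assumptions for illustration only and do not describe the actual secondary display software module.

```python
# Minimal sketch (hypothetical): scale a frame rendered at the CP's native
# resolution up to the secondary display's larger resolution so the mirrored
# or extended image remains legible on the bigger screen. A production module
# would more likely rely on hardware scaling or a higher-quality filter.

def scale_frame(frame, out_width, out_height):
    """Nearest-neighbor upscale of a frame given as a list of pixel rows."""
    in_height = len(frame)
    in_width = len(frame[0])
    scaled = []
    for y in range(out_height):
        src_y = y * in_height // out_height   # map output row to source row
        row = frame[src_y]
        scaled.append([row[x * in_width // out_width] for x in range(out_width)])
    return scaled


if __name__ == "__main__":
    # A 4x3 "frame" of grayscale pixels upscaled to 8x6 for a larger display.
    cp_frame = [[10, 20, 30, 40],
                [50, 60, 70, 80],
                [90, 100, 110, 120]]
    secondary_frame = scale_frame(cp_frame, out_width=8, out_height=6)
    print(len(secondary_frame), "rows x", len(secondary_frame[0]), "cols")
```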
One memory shown in
The peripheral interface 178 is configured, depending on the embodiment, to receive data or signals from the IPG 102, the PFT 210, and the surgeon input device 212. Accordingly, it may include an implant communication interface, a PFT communication interface, and a surgeon input device communication interface. The implant communication interface includes structure and components enabling the CP 104 to send or receive information to and from the IPG 102. For example, it may comprise a radio transceiver that enables one-way or two-way communication with the IPG 102. The interface 178 may include components for wireless or wired communication and may be configured with any of the components discussed above with reference to the secondary display communication interface 176. In one example, the implant communication interface 178 comprises a medical implant communication service (MICS) RF transceiver used to communicate with the IPG 102 to convey desired changes and to receive status updates from and relating to the IPG 102, such as battery status and any error information. In this example, the MICS RF transceiver utilizes a loop antenna for the communications with the IPG 102. Other antennas, such as, for example, dipole, chip antennas, or other antennas known in the art also may be used. The CP 104 may also include a programming interface used during manufacturing to load an operating system and program the CP 104.
The PFT communication interface and the surgeon input device communication interface may include structure and components enabling the CP to send and receive information to and from the PFT and the surgeon input device. These interfaces may be similar to that of the implant communication interface, or alternatively, may be otherwise enabled. Depending on the embodiment, the PFT communication interface and the surgeon input device communication interface may include one or more ports for wired communication, such as universal serial bus (USB) connectivity 355 (including a Type A port and a Micro-B port), a related port for supporting Joint Test Action Group (JTAG) connectivity, or another plug-in style port, or may be wireless using, for example, the Wi-Fi portion 325 and Bluetooth portion 330 that respectively include a Wi-Fi communication interface, a Bluetooth communication interface, an antenna switch, and a related antenna, all of which allow wireless communication following the Wi-Fi Alliance standard and Bluetooth Special Interest Group standard. Of course, other wireless local area network (WLAN) standards and wireless personal area network (WPAN) standards can be used with the CP 104. Any other interface enabling communication between the CP 104 and the PFT 210 or surgeon input device 212 may be used. In some embodiments, the interface 178 is the same as the interface 176.
The secondary display communication interface 176 includes multiple bi-directional radio communication capabilities. Specific wireless portions included with the CP 104 are a Wi-Fi bi-directional radio communication portion 325 and a Bluetooth bi-directional radio communication portion 330. The Wi-Fi portion 325 and Bluetooth portion 330 include a Wi-Fi communication interface, a Bluetooth communication interface, an antenna switch, and a related antenna, all of which allow wireless communication following the Wi-Fi Alliance standard and Bluetooth Special Interest Group standard. Of course, other wireless local area network (WLAN) standards and wireless personal area network (WPAN) standards can be used with the CP 104.
The CP 104 includes multiple communication portions for wired communication. Exemplary circuitry and ports for receiving a wired connector include a port for supporting universal serial bus (USB) connectivity 355, including a Type A port and a Micro-B port, a portion and related port for supporting Joint Test Action Group (JTAG) connectivity 360, and a portion and related port for supporting universal asynchronous receiver/transmitter (UART) connectivity 365. Of course, other wired communication standards and connectivity can be used with or in place of the types shown in
The secondary display communication interface 176 includes the structure and components enabling the CP 104 to send or receive information to and from the secondary display unit 152. In one embodiment, the secondary display communication interface 176 is integral with the CP 104 and is available along an edge, such as a lower edge, under a protective cover of the CP 104. In one embodiment, the secondary display communication interface 176 includes an HDMI port formed in a housing of the CP 104. The interface 176 may allow connection to the external secondary display unit 152 via a micro High-Definition Multimedia Interface (HDMI) 370, which provides a compact audio/video interface for transmitting uncompressed digital data to the external display unit 152. The use of the HDMI connection 370 allows the CP 104 to transmit video (and audio) communication to the external display unit 152. This may be beneficial in situations where others (e.g., the surgeon) may want to view the information being viewed by a CP user. The surgeon typically has no visual access to the CP 104 in the operating room. The HDMI connection 370 allows the surgeon to view information from the CP 104, thereby allowing greater communication between the clinician and the surgeon. For a specific example, the HDMI connection 370 can broadcast a high definition television signal that allows the surgeon to view the same information that is shown on the LCD (discussed below) of the CP 104. In addition, as HDMI signals are compatible with DVI, the CP 104 can also be connected, with the proper cabling, to other external display devices that only support DVI input. In some embodiments, audio and video can be played independently. In other embodiments, audio and video can be played in a synchronized manner.
In another embodiment, the secondary display communication interface 176 includes a wireless transmitter and receiver configured to wirelessly communicate with the secondary display unit 152. In one example, it includes structure and encoding for Wi-Fi communication. In another example, it includes structure and encoding for Bluetooth communication. Additional wireless protocols are contemplated. In some examples, the secondary display communication interface 176 is a networking port on the CP that enables the CP 104 to communicate with the secondary display unit 152 over a WAN, LAN, or other network, including the Internet.
The CP 104 includes three hard buttons: a “home” button 335 for returning the CP to a home screen for the device, a “quick off” button 340 for quickly deactivating stimulation, and a “reset” button 345 for rebooting the CP 104. The CP 104 also includes an “ON/OFF” switch 350, which is part of the power generation and management block (discussed below). In some embodiments, the “reset” button 345 may be eliminated, and the “ON/OFF” switch 350 can be used to remove all power when held long enough.
In
The CP 104 includes a camera 380 allowing the device to take pictures or video. The resulting image files can be used to document a procedure or an aspect of the procedure. For example, the camera 380 can be used to take pictures of barcodes associated with the IPG 102 or the leads 108, or to document an aspect of the procedure, such as the positioning of the leads. Similarly, it is envisioned that the CP 104 can communicate with a fluoroscope or similar device to provide further documentation of the procedure. Other devices can be coupled to the CP 104 to provide further information, such as scanners or RFID detection. Similarly, the CP 104 includes an audio portion 385 having an audio codec circuit, audio power amplifier, and related speaker for providing audio communication to the user, such as the clinician or the surgeon.
The CP 104 further includes a power generation and management block 390. The power block 390 has a power source (e.g., a lithium-ion battery) and a power supply for providing multiple power voltages to the processor, LCD touch screen, and peripherals.
In one embodiment, the CP 104 is a handheld computing tablet with touch screen capabilities. The tablet is a portable personal computer with a touch screen, which is typically the primary input device. However, an external keyboard or mouse can be attached to the CP 104. The tablet allows for mobile functionality beyond that of even typical laptop personal computers.
In operation, the IPG 102 (which may also be an EPG) through the use of the implanted medical electrical leads 108, and specifically the electrodes 110 (
Returning now to the block diagram in
Because the CP 104 is a portable device and includes a relatively small primary display screen 168, it is not easily viewed by multiple people at a time. However, the display screen 182 of the secondary display unit 152 is sized larger than the display screen 168 of the CP 104 and enables multiple people to simultaneously view the screen, allowing surgeons to see the IPG status or the actions taken by the CP. In one example, the display screen 182 is more than twice the size of the primary display screen 168 of the CP 104. One exemplary display screen 182 is sized with a diagonal measurement greater than about 22 inches. Another exemplary display screen 182 is sized with a diagonal measurement greater than about 30 inches. Other sizes, both larger and smaller, are contemplated. The display screen, in some examples, is a large monitor whose image is controlled entirely from the clinician programmer 104. In one example, it is a smart monitor configured to convert information received from the clinician programmer into a three-dimensional image and to display information in a three-dimensional manner.
The speaker 184 and microphone 186 are linked respectively with the microphone 174 and the speaker 172 on the CP 104. As such, communication is enabled between individuals proximate the secondary display unit 152 and the individual proximate to and operating the CP 104. As indicated above, this may be useful when the CP 104 and the secondary display unit 152 are in different locations. In one embodiment, the microphones and the speakers on the CP 104 and the secondary display unit 152 communicate over the same secondary display communication interface 176 and CP interface 190. In other embodiments, they have separate communication channels, wired or wireless, for transmitting and receiving information.
The communication interface 188 in the secondary display unit 152 includes a CP interface 190 and a peripheral interface 192. The CP interface 190 is configured, depending on the embodiment, to receive data or signals from the secondary display communication interface 176 on the CP 104. For example, the CP interface 190 may connect to the secondary display communication interface 176 on the CP 104 via HDMI using a Type D Micro to Type A HDMI connector. Alternatively, or in addition, the HDMI signals may be compatible with DVI. Thus, the CP 104 can also be connected, with the proper cabling, to other external display devices that only support DVI input.
In one example, the secondary display unit is configured to display the same information as the primary display screen 168 on the CP 104. This mirroring of the screens 168, 182 enables a surgeon in an operating room and the clinician who is operating the CP to see the same information, albeit in different locations.
The peripheral interface 192 is configured, depending on the embodiment, to receive data or signals from the IPG 102, the PFT 210, and the surgeon input device 212. In one example, the peripheral interface 192 comprises a MICS RF transceiver used to communicate with the IPG 102, the PFT 210, and the surgeon input device 212. As described above, the MICS RF transceiver may utilize a loop antenna for its communications, with other antennas, such as, for example, dipole, chip antennas, or others known in the art also considered. For example, a communications link can be established between the IPG 102 and the secondary display unit 152, and communications can then occur over a MICS transceiver using a standard frequency for a medical device transmission.
The IPG 102 includes all the treatment information required for treating the patient condition with the electrodes 110, and also includes a communication interface 192. In one embodiment, the communication interface 192 is configured to communicate with one or both of the CP 104 and the secondary display unit 152 and convey information including IPG status, treatment status, program operation, and other information to the CP 104 and/or the secondary display unit 152. The secondary display unit 152, under the control of the CP 104, may communicate with the IPG communication interface 192 and may convey new treatment programs, including electrode management routines, such as activating particular electrodes in a particular order with a particular intensity, by varying amplitude, pulse width, and frequency. Once these treatment programs are received, the IPG 102 may execute or respond to the received information as directed by the clinician programmer 104 through the secondary display unit 152. In one example, the IPG 102 also communicates directly with the peripheral communication interface 178 of the CP 104. This embodiment may work well when the IPG 102 is in the proximity of the CP 104 and able to receive information via wireless or wired transmission.
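As a non-limiting illustration of the kind of information such a treatment program may carry, the following Python sketch models a program as a set of per-electrode settings (polarity, amplitude, pulse width, frequency). The field names and units are assumptions for illustration only and do not represent the actual telemetry format of the IPG 102.

```python
# Hypothetical sketch of a treatment program as it might be composed on the CP
# and relayed to the IPG; field names and units are illustrative only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ElectrodeSetting:
    electrode_id: int          # which contact on the lead
    polarity: str              # "anode" or "cathode"
    amplitude_ma: float        # stimulation amplitude in milliamps
    pulse_width_us: int        # pulse width in microseconds
    frequency_hz: int          # pulse repetition rate in hertz


@dataclass
class TreatmentProgram:
    program_id: int
    settings: List[ElectrodeSetting] = field(default_factory=list)

    def activate(self, electrode_id, polarity, amplitude_ma, pulse_width_us, frequency_hz):
        """Record one electrode's activation parameters in the program."""
        self.settings.append(ElectrodeSetting(
            electrode_id, polarity, amplitude_ma, pulse_width_us, frequency_hz))


# Example: activate two contacts with modest settings before sending the program.
program = TreatmentProgram(program_id=1)
program.activate(electrode_id=2, polarity="cathode", amplitude_ma=2.5,
                 pulse_width_us=210, frequency_hz=40)
program.activate(electrode_id=3, polarity="anode", amplitude_ma=2.5,
                 pulse_width_us=210, frequency_hz=40)
print(program)
```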
The PFT 210 is sized to be held by a patient and can be used to provide feedback during programming of the IPG 102. In one example, the PFT 210 may be used to provide feedback to the CP 104 while a clinician operating the CP 104, under instruction from the surgeon, develops the protocol for the IPG 102. In one example, the PFT 210 is an ergonomic handheld device having a sensor (also referred to as input), a controller, and a communications output. The sensor can include a discrete switch and/or a continuously variable input, such as through the use of a thermocouple, strain gauge, pressure sensor, piezoelectric device, accelerometer, displacement mechanism, or other variable sensing mechanism. It is envisioned that the use of a continuously variable input can provide magnitude information, thereby providing improved feedback information from the patient.
The PFT 210 includes a communication interface 214 that communicates information to the communication interface 188 on the secondary display unit 152, which relays the information to the CP 104. For example, the communication interface 214 and the peripheral interface 192 may establish a communication link. Communications can then occur over Bluetooth or other wireless formats. The CP 104 may then, if appropriate, adjust the display imagery on one or both of the primary display screen 168 and the display screen 182 of the secondary display unit 152 to reflect the patient feedback.
The PFT 210 is used to help the surgeon program the IPG 102 based on patient feedback. For example, in use, the CP 104 activates one or more of the electrodes (on leads that are connected to the IPG 102) in various patterns. When the patient feels a sensation as a result of a stimulus, such as a stimulus for paresthesia, he or she activates a sensor on the PFT 210. The activation of the sensor indicates to the clinician programming system 150 that the patient felt the stimulus and can also convey the degree of sensation that is felt, depending on the type of sensor that is employed. Given that there may be a delay between the time the patient feels a sensation and the time he or she activates the sensor, the system 150 then re-stimulates the most recently-activated combinations of electrodes, and the patient again uses the PFT 210 to indicate when (and to what degree) a sensation is felt in order to determine the combination of electrodes to which the patient was reacting. Further description of methods for use of the PFT 210 is disclosed in U.S. patent application Ser. No. 13/118,781, filed on May 31, 2011, titled “Device to Provide Feedback For Neurostimulation Programming”, the contents of which are incorporated herein by reference in their entirety.
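As a non-limiting illustration of the attribution step described above, the following Python sketch re-presents the most recently activated electrode combinations one at a time after a PFT response to resolve which combination the patient was reacting to. The function names, callback interfaces, and history depth are assumptions for illustration only.

```python
# Hypothetical sketch: after the patient signals a sensation on the PFT,
# replay the last few tried electrode combinations individually and keep the
# one the patient confirms. Callbacks are placeholders, not the actual system.
from collections import deque


def find_felt_combination(combinations, stimulate, patient_confirms, history_size=3):
    """Sweep candidate electrode combinations; on a PFT response, replay the
    last few candidates one at a time to attribute the sensation."""
    recent = deque(maxlen=history_size)
    for combo in combinations:
        stimulate(combo)
        recent.append(combo)
        if patient_confirms():              # PFT press may lag the stimulus
            for candidate in reversed(recent):
                stimulate(candidate)        # re-stimulate one combination
                if patient_confirms():
                    return candidate
    return None


# Example wiring with placeholder callbacks (real callbacks would drive the
# IPG and read the PFT sensor):
# result = find_felt_combination([[1, 2], [3, 4], [5, 6]],
#                                stimulate=lambda combo: None,
#                                patient_confirms=lambda: False)
```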
In some embodiments, the patient may use the clinician programmer or another portable device (for example an electronic tablet or an electronic programmer) to draw a pain map and/or a stimulation map. For example, referring to
Referring back to
In one aspect, this disclosure is directed to a method for displaying information on a CP to operating room staff working within a sterile field. One example of this will be described with reference to
The secondary display unit 152 is, in this example, hung on a wall of the surgical room in a fixed location visible to the surgeon and his operating staff. In other examples, it may be visible to the surgeon and his operating staff through a window or other barrier. As can be seen, the secondary display unit 152 is much larger than the CP 104 and is configured to be easily viewable by several people at the same time. In one example, the secondary display screen 182 is at least double the size of the primary display screen 168 of the CP 104. The secondary display unit 152 includes the speaker 184, the microphone 186, and the communication interface 188. Verbal instructions from the surgeon to the clinician are captured at the microphone 186 and emitted from the speaker 172 on the CP 104. Likewise, verbal responses from the programmer can be captured by the microphone 174 on the CP 104 and heard through the speaker 184. In some embodiments, the contents of the screen of the CP 104 may also be broadcast via a suitable communications network.
In the example shown, the secondary display unit 152 is hung via a fixation structure 258. Here, the fixation structure 258 is a fixation bracket that extends from a rigid structure, such as the wall, and supports the secondary display unit 152 in a fixed position. As can be seen, the secondary display unit 152 is tipped at an angle to promote simple and convenient viewing from the sterile field in the surgical room. In this example, it is spaced from the surgical area and may not be a sterile device itself. It is operated without tactile feedback or input directly on the secondary display unit 152, and may be considered a hands-free device. It is controlled via the CP 104, but also may relay information collected from the IPG 102, the PFT 210, and the surgeon input device 212. Information is also relayed from the secondary display unit 152 back to the CP display.
The communication interface 188 on the secondary display unit 152, and particularly the peripheral interface 192, communicates with one or more of the IPG 102, the PFT 210, and the surgeon input device 212. As indicated above, in one embodiment, the peripheral interface 192 provides two-way communication to each of these devices. In other examples, the peripheral interface 192 receives one-way communication from one or more of these devices. The communication interface 188 may be attached onto or otherwise carried on the display screen 182 or on or within its housing.
In the illustrated embodiment, a cable 200 extends from the secondary display communication interface 176 of the CP 104 to an outlet 256 shown on the wall structure 254. In this example, the secondary display communication interface 176 of the CP 104 is a micro-HDMI connector, and the cable 200 may be an HDMI cable extending between the secondary display communication interface 176 of the CP 104 and outlet 256.
The outlet 256 connects, via a cable within the wall, to a corresponding outlet in the surgical room 250, from which a cable extends to the secondary display unit 152. Over this wired line, the CP 104 communicates display information for presentation on the secondary display unit 152. That is, the CP 104 is able to drive an HDMI signal over the secondary display communication interface 176 of the CP 104 to copy the content from the primary display screen 168 on the CP 104 to the external display screen 182 on the secondary display unit 152. This can also be done via wireless communication. In other words, the wire 200 may not be needed in some embodiments, as the communication between the secondary display unit 152 and external devices is done wirelessly. In that case, the secondary display unit 152 may only need a power source or a battery.
In one example, the displayed content mirrors the information displayed on the CP 104. Accordingly, by viewing the secondary display unit 152, the surgeon sees the same information as the clinician operating the CP 104. In another example, the displayed content differs from or includes additional information beyond that displayed on the CP 104. Accordingly, with this extended display feature, the surgeon, by viewing the secondary display unit 152, sees different, but relevant, information than the clinician operating the CP 104. In one example, the information shown on the CP 104 and/or the secondary display unit 152 includes IPG status information, charge information, program information, and status and settings for one or more electrodes, including frequency, pulse width, and amplitude information. It may also include additional information, such as a visualization of the patient's organs, x-ray information, or the status of the patient's overall condition, including vital signs such as blood pressure, temperature, respiratory rate, heart rate, and/or other vitals.
In one example, the secondary display unit 152 is connected to a Digital Video Interface (DVI) connector on the CP 104 and the content of the primary display screen 168 was cloned, copied, or replicated onto the display screen 182 of the secondary display unit 152 via the DVI interface. In one example, the system 150 achieves 30 frames per second (FPS) when outputting the primary display screen content to an 800×600 display screen 182. This is representative of the CP 104 because the data stream format output by the CP 104 over the HDMI interface may be nearly identical to that output from the DVI interface on the EVK.
In one example, the CP 104 requires the user to enable the secondary display screen 182 by selecting a display mode for the external display monitor 152. In one embodiment, the hardware for the secondary display communication interface 176 is arranged to detect when the external secondary display unit 152 is plugged in or otherwise connected. Using this functionality, when the CP 104 detects that the secondary display unit 152 is connected to the secondary display communication interface 176 of the CP 104, the driver in the CP 104 automatically switches to extending the display onto the external monitor. Likewise, when the external secondary display unit 152 is detached, the CP 104 may disable the mirroring or extended display function.
Software modules on CP 104 provide instructions for accomplishing particular tasks handled by the CP 104. In the embodiment described above, the software includes a secondary display unit control module that controls the image generated on the secondary display screen 182. In one example, the module enables a user to select between two operating modes. For example, a first mode or mirroring mode may control the secondary display screen 182 to show content mimicking that shown on the primary display screen 164. In this mode, the CP 104 may generate display signals for transmission to the secondary display unit 152 so that the primary and secondary display screens show the same content. A second mode or extended display mode of the secondary display unit may control the secondary display screen 182 to show content different than that shown on the primary display screen 164. In this mode, the CP 104 may generate display signals for transmission to the secondary display unit 152 so that the primary and secondary display screens show different content, although some content may still overlap.
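As a non-limiting illustration of the two operating modes described above, together with the automatic enabling of the secondary output when a secondary unit is detected, the following Python sketch may be considered. The class and method names are assumptions for illustration only and do not describe the actual CP software.

```python
# Hypothetical sketch of the secondary display unit control module's two
# operating modes (mirroring vs. extended display) and hotplug handling.

MIRROR, EXTEND = "mirror", "extend"


class SecondaryDisplayController:
    def __init__(self):
        self.connected = False
        self.mode = MIRROR          # default mode when a secondary unit attaches

    def on_hotplug(self, connected):
        """Called when the secondary display unit is attached or detached."""
        self.connected = connected

    def set_mode(self, mode):
        if mode not in (MIRROR, EXTEND):
            raise ValueError("unknown display mode: %s" % mode)
        self.mode = mode

    def build_secondary_frame(self, primary_frame, extended_frame):
        """Return the content to transmit to the secondary display, or None
        when no secondary unit is connected."""
        if not self.connected:
            return None
        return primary_frame if self.mode == MIRROR else extended_frame


ctrl = SecondaryDisplayController()
ctrl.on_hotplug(True)               # secondary unit plugged in
print(ctrl.build_secondary_frame("primary content", "extended content"))
ctrl.set_mode(EXTEND)
print(ctrl.build_secondary_frame("primary content", "extended content"))
```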
The ability to output the primary display screen display to an external, and notably larger, secondary display unit 152, such as a large screen monitor or projector, provides a decided advantage when it comes to communicating instructions to and receiving verbal responses from a clinician outside the sterile field, such as in another room. The external secondary display unit 152 makes it much easier for the surgeon and others to view what is happening.
In this embodiment, the CP 104 is a tablet controller fitted with a micro-HDMI connector, and the cable 200 is an HDMI cable extending between the secondary display communication interface 176 of the CP 104 and the communication interface 188 of the secondary display unit 152. The CP 104 is able to drive an HDMI signal over the secondary display communication interface 176 of the CP 104 to copy the display from the primary display screen 168 on the CP 104 to the external display screen 182 on the secondary display unit 152.
In this example, the display surfaces of the primary display screen 168 on the CP 104 and the external display screen 182 on the secondary display unit 152 are connected with DirectDraw. Using the DirectDraw API, a screen copy is made of the primary display screen 168 on the CP 104. A copy of the primary display screen 168 is then drawn to the secondary display screen's surface. In one example, the content displayed on the secondary display screen 182 mirrors that of the primary display screen 168. The software routine adds the ability for the screen to be copied automatically at a user-defined rate and for the copying to be enabled or disabled as necessary.
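As a non-limiting, framework-agnostic illustration of the copy routine described above, the following Python sketch copies the primary screen to the secondary surface at a user-defined rate and can be started or stopped as needed. The capture and draw callbacks are placeholders; the embodiment above performs the actual surface copy with the DirectDraw API.

```python
# Framework-agnostic sketch of the periodic screen-copy routine: capture the
# primary display at a user-defined rate and draw it onto the secondary
# display surface. Callbacks are placeholders for the real capture/blit calls.
import threading


class ScreenMirror:
    def __init__(self, capture_primary_screen, draw_to_secondary, rate_hz=30.0):
        self._capture = capture_primary_screen
        self._draw = draw_to_secondary
        self._interval = 1.0 / rate_hz      # user-defined copy rate
        self._stop = threading.Event()
        self._thread = None

    def start(self):
        self._stop.clear()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def stop(self):
        self._stop.set()
        if self._thread is not None:
            self._thread.join()

    def _run(self):
        while not self._stop.wait(self._interval):
            frame = self._capture()         # screen copy of the primary display
            self._draw(frame)               # draw onto the secondary surface


# Example (placeholders):
# mirror = ScreenMirror(capture_primary_screen=lambda: "frame",
#                       draw_to_secondary=print, rate_hz=30.0)
# mirror.start(); ...; mirror.stop()
```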
As discussed above, the CP 104 may require the user to enable the secondary display screen 182 by selecting a display mode for the external display monitor 152, or may be arranged to detect when the external secondary display unit 152 is plugged in or otherwise connected. The ability to output the primary display screen display to an external, and notably larger, secondary display unit 152, such as a large screen monitor or projector, provides a decided advantage when it comes to training personnel because it is much easier for multiple people to view what is happening.
In one embodiment, the surgeon input device 212 is not a handheld device, but is a motion detector associated with the secondary display unit 152. In this embodiment, the secondary display unit 152 may include a light source and a camera or sensor to generate a depth map or other imagery of the surgeon or other health care provider in the operating room 250. By detecting surgeon movement, the surgeon input device 212 may receive inputs for controlling either the CP or features of the secondary display unit 152. For example, a surgeon may be able to select a particular electrode or an array of electrodes using the motion detector and increase or decrease the amplitude and frequency of pulses from the electrode or electrode array to create a treatment program that may be loaded onto the IPG 102.
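As a non-limiting illustration of mapping detected surgeon gestures to a limited set of commands, reflecting the lesser level of control described above, the following Python sketch may be considered. The gesture names and the restricted command vocabulary are assumptions for illustration only.

```python
# Hypothetical mapping from recognized surgeon gestures to a limited command
# set (select an electrode, nudge amplitude or frequency). Gesture names and
# the allowed command vocabulary are illustrative assumptions.
ALLOWED_SURGEON_COMMANDS = {"select_electrode", "amplitude_up", "amplitude_down",
                            "frequency_up", "frequency_down"}

GESTURE_TO_COMMAND = {
    "point": "select_electrode",
    "swipe_up": "amplitude_up",
    "swipe_down": "amplitude_down",
    "swipe_right": "frequency_up",
    "swipe_left": "frequency_down",
}


def handle_gesture(gesture, send_to_cp):
    """Translate a recognized gesture into a command the CP will accept;
    unrecognized or disallowed gestures are ignored."""
    command = GESTURE_TO_COMMAND.get(gesture)
    if command in ALLOWED_SURGEON_COMMANDS:
        send_to_cp(command)
        return command
    return None


print(handle_gesture("swipe_up", send_to_cp=print))
```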
According to various aspects of the present disclosure, one or more cursors are displayed simultaneously both on the screen of the CP 104 and on the secondary display screen 182 of the secondary display unit 152. For the sake of illustration, a cursor 400 is illustrated both on the screen of the CP 104 and on the secondary display screen 182. The cursor 400 corresponds to an area of the screen of the CP 104 engaged by the user (for example, either through a finger or a stylus-like device). In some embodiments, the user engagement may be an actual touching of the area of the screen. In other embodiments, the user engagement may be a detected proximity to the area of the screen. The cursor 400 shown on the secondary display screen 182 tracks or mirrors the cursor 400 shown on the screen of the CP 104. Thus, as the user moves his finger or stylus to interactively engage with the screen of the CP 104, the audience can “see” exactly what the user is doing by watching the movement of the cursor 400 on the secondary display screen 182. It is understood that although one cursor 400 is illustrated herein (one on each display screen), a plurality of cursors may be displayed to correspond with multi-touch related user interactions in other embodiments. The cursor 400 will be discussed in more detail below with reference to
In use, a clinician may take the cable 200 and plug it into the secondary display communication interface 176 of the CP 104. With this connection made, the CP 104 can send display information to the secondary display unit 152 for display on the display screen 182. In one embodiment, over the same cable 200, feedback information from the secondary display unit 152 can be transmitted to the CP 104. This feedback information may be from the secondary display unit 152, or relayed through the secondary display unit 152 to the CP 104 from the PFT 210, the IPG 102, or the surgeon input device 212. In addition, the audio feed between the speakers and microphones on the CP 104 and the secondary display unit 152 may also be carried over the cable 200. In other embodiments, the system 150 includes a separate feedback line and/or a separate audio feed line. In yet another embodiment, the communication occurs wirelessly over a direct connection between the CP 104 and the secondary display unit 152, such as, for example, through infra-red transmission, or over an indirect connection, such as through a wireless network.
The IPG 102 may then be implanted using methods and techniques known in the art. The surgeon may give instructions to the programmer of the CP 104 to activate or deactivate particular electrodes to develop a treatment program that provides a suitable level of relief to the patient. Since the surgeon can see the secondary display unit 152, he knows whether his instructions are being properly carried out without additional questions or explanation from the programmer of the CP 104. This reduction in reliance on verbal instructions may increase the efficiency of the surgical procedure. Further, during the procedure, the surgeon may intervene or request additional views of displayed information using the surgeon input device 212. This allows the surgeon to have a level of control over the CP 104, although that level of control may be a lesser level than the level of control of the programmer of the CP 104. In one example, the surgeon input device may allow the surgeon to select an electrode and modify its frequency or amplitude of applied stimulation. Although the clinician programmer is not sterile, the surgeon input device may be a sterile device, and in one embodiment, is a single-use device that is discarded after use.
During the programming process, information from the PFT 210 may be transmitted to the secondary display unit 152. The secondary display unit 152 may then relay the received information to the CP 104 for consideration or processing. Based on the patient feedback, the CP 104 may be controlled to update the images on the screens 164, 182 or provide additional information for programming the implant. Software on the CP 104 may control the images shown on the secondary display screen 182, as described above.
When a stimulation program is set, it may be transmitted to the IPG 102 either directly from the CP 104 or it may be transmitted to the secondary display unit for relay to the IPG 102. The IPG may then store the treatment program for operation.
Based on the discussions above, it can be seen that the clinician programmer and devices of its type are commonly used in a setting where a group of people is observing the actions of the user. Typically, the user (e.g., a clinician operating the clinician programmer 104 in
In these types of scenarios, users and observers do not know what part of the clinician programmer screen is being touched while the user is interacting with the user interface. Also, in situations where the clinician programmer is connected to an external monitor, such as the situation shown in
According to various aspects of the present disclosure, one or more simulated cursors are overlaid on top of user interface elements on the touch screen of the clinician programmer to indicate user interface interaction.
Referring now to
The location of the simulated cursors 610-611 on the screen 600 corresponds to the location of the user's fingers or stylus. In the illustrated embodiment, more than one finger is detected, and therefore two simulated cursors 610-611 are displayed. Other numbers of fingers and/or styluses will result in a corresponding number of simulated cursors as well. In this manner, the clinician programmer of the present disclosure supports multi-touch functionalities. Multi-touch may refer to the capability of a touch-sensitive interface (e.g., the screen 600 of the clinician programmer) to detect the presence of multiple points of contact with the interface. For example, the touch-sensitive interface may be able to translate the detection of the engagement (or movement) of one finger with the interface as one command, but may translate the engagement (or movement) of two or three fingers with the interface as a different type of command. Styluses or other objects may also be used instead of, or in conjunction with, fingers in other embodiments.
The construction of the simulated cursors 610-611 is such that most of the area they cover is transparent to show other user interface elements that exist in the same area. For example, the simulated cursor 610 is disposed over a dialog box 612, but the content of the dialog box 612 is not substantially obstructed by the display of the simulated cursor 610. If the content of the dialog box 612 contained text, for example, the user may still be able to read such text even though portions of the text may be overlapping with the simulated cursor 610. Upon a detection that the user has disengaged the screen 600, for example by releasing the fingers or styluses, the user interface of the clinician programmer will go back to the default mode where no simulated cursors are being shown.
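As a non-limiting illustration combining the behaviors described above, the following Python sketch maintains one simulated cursor per detected touch point, removes a cursor when its touch is released, and scales each cursor position from the CP screen to the external monitor so the mirrored cursor lands in the corresponding location. The screen resolutions, class, and method names are assumptions for illustration only.

```python
# Hypothetical sketch of simulated-cursor tracking: one cursor per detected
# touch, removed on release, with positions scaled from the CP screen to the
# external monitor. Resolutions below are assumptions, not device values.

CP_RES = (1280, 800)        # CP touch screen (assumed)
MONITOR_RES = (1920, 1080)  # external monitor (assumed)


def to_monitor_coords(x, y, cp_res=CP_RES, monitor_res=MONITOR_RES):
    """Scale a CP-screen touch location to the external monitor."""
    return (x * monitor_res[0] / cp_res[0], y * monitor_res[1] / cp_res[1])


class SimulatedCursors:
    def __init__(self):
        self.cursors = {}   # touch id -> (x, y) on the CP screen

    def on_touch(self, touch_id, x, y):
        self.cursors[touch_id] = (x, y)            # add or move a cursor

    def on_release(self, touch_id):
        self.cursors.pop(touch_id, None)           # cursor disappears

    def monitor_cursors(self):
        """Cursor positions to draw on the external monitor (rendered as
        mostly transparent overlays so underlying UI elements stay visible)."""
        return [to_monitor_coords(x, y) for x, y in self.cursors.values()]


cursors = SimulatedCursors()
cursors.on_touch(0, 640, 400)       # first finger
cursors.on_touch(1, 200, 100)       # second finger (multi-touch)
print(cursors.monitor_cursors())
cursors.on_release(1)
print(cursors.monitor_cursors())
```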
As discussed above, the clinician programmer can be electrically coupled to an external monitor such as the secondary display unit 152 of
In more detail, sometimes it may be physically impossible for the observers to see what part of the clinician programmer screen the user is touching. For example, the clinician programmer may be located outside the view of the observers, such as in the situation illustrated in
Even if the observers are in the same room as the user and the clinician programmer, it is difficult for a group of observers to all get within close proximity to the user and see how the user is using the clinician programmer (e.g., entering programming parameters). First, it may not be feasible for a group of observers to be all crowded near the user, since space around the user may be constricted, and the crowding around the user may interfere with the user programming. Second, even if the observers all manage to get within viewing distance of the clinician programmer, the relatively small size of the clinician programmer means that the observers may not see (at least not clearly) exactly where on the clinician programmer screen the user is touching.
According to the present disclosure, the simulated cursors 610-611 shown in
In addition to showing the position of the user's fingers or styluses on the screen 600 or external monitor, the simulated cursors 610-611 can also portray the action of selecting or tapping on the touch screen 600 by changing the color of one or more of the simulated cursors 610-611 or highlighting one or more of the simulated cursors 610-611 for a limited amount of time. For example, as shown in
In the embodiment illustrated, the simulated cursor 613 is highlighted with an orange color, while the other two simulated cursors 614-615 are not colored (or assume the same color as the background of the touch screen 600). Of course, the simulated cursor 613 may be visually differentiated from the simulated cursors 614-615 via other approaches or techniques in other embodiments. For example, instead of representing the simulated cursors 613 and 614-615 with different colors, they may be visually differentiated by different sizes or shapes, different associated texts, or different animations (for example, the simulated cursor 613 may be flashing/flickering while the other simulated cursors 614-615 remain static) in various alternative embodiments. In any case, the visual differentiation among the simulated cursors allows the user or observers to distinguish the simulated indicators that are merely tracking the detected fingers from the indicator associated with the one finger that invoked an action on the CP.
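The following sketch shows one hypothetical way such a temporary differentiation could be tracked, with the cursor of the action-invoking finger highlighted for a short interval while the remaining cursors keep a default style. The class name, function name, and the 0.5-second duration are assumptions for illustration, not details taken from the disclosure.

```python
import time
from dataclasses import dataclass
from typing import Optional

HIGHLIGHT_DURATION_S = 0.5  # assumed value; the disclosure says only "a limited amount of time"

@dataclass
class SimulatedCursor:
    contact_id: int
    x: float
    y: float
    last_action_time: Optional[float] = None  # when this contact last invoked an action, if ever

    def style(self, now: float) -> str:
        """Highlighted briefly after invoking an action; otherwise an unobtrusive default style."""
        if self.last_action_time is not None and now - self.last_action_time < HIGHLIGHT_DURATION_S:
            return "highlight"  # e.g., an orange fill, a different shape, or a flashing animation
        return "default"        # e.g., same color as the screen background

def register_action(cursor: SimulatedCursor, now: float) -> None:
    """Record that the finger tracked by this cursor tapped or selected something."""
    cursor.last_action_time = now

if __name__ == "__main__":
    cursors = [SimulatedCursor(0, 100, 100), SimulatedCursor(1, 300, 250), SimulatedCursor(2, 500, 400)]
    now = time.monotonic()
    register_action(cursors[0], now)             # only the first finger invoked an action
    print([c.style(now) for c in cursors])       # ['highlight', 'default', 'default']
    print([c.style(now + 1.0) for c in cursors]) # highlight has expired: ['default', 'default', 'default']
```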
Although an IPG is used here as an example, it is understood that the various aspects of the present disclosure apply to an external pulse generator (EPG) as well. An EPG is intended to be worn externally to the patient's body. The EPG connects to one end (referred to as a connection end) of one or more percutaneous, or skin-penetrating, leads. The other end (referred to as a stimulating end) of the percutaneous lead is implanted within the body and incorporates multiple electrode surfaces analogous in function and use to those of an implanted lead.
The external charger 640 of the medical device system 620 provides electrical power to the IPG 670. The electrical power may be delivered through a charging coil 690. In some embodiments, the charging coil can also be an internal component of the external charger 640. The IPG 670 may also incorporate power-storage components such as a battery or capacitor so that it may be powered independently of the external charger 640 for a period of time, for example from a day to a month, depending on the power requirements of the therapeutic electrical stimulation delivered by the IPG.
The patient programmer 650 and the clinician programmer 660 may be portable handheld devices that can be used to configure the IPG 670 so that the IPG 670 can operate in a certain way. The patient programmer 650 is used by the patient in whom the IPG 670 is implanted. The patient may adjust the parameters of the stimulation, such as by selecting a program, changing its amplitude, frequency, and other parameters, and by turning stimulation on and off. The clinician programmer 660 is used by medical personnel to configure the other system components and to adjust stimulation parameters that the patient is not permitted to control, such as by setting up stimulation programs among which the patient may choose, selecting the active set of electrode surfaces in a given program, and setting upper and lower limits for the patient's adjustments of amplitude, frequency, and other parameters.
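As an illustrative sketch of the limit-setting relationship described above (not the disclosed implementation), a patient-side adjustment could be constrained to clinician-defined bounds as follows. The PatientAdjustmentLimits class, the clamp helper, and the numeric values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class PatientAdjustmentLimits:
    """Bounds the clinician programmer could impose on patient-side adjustments."""
    min_amplitude_ma: float
    max_amplitude_ma: float
    min_frequency_hz: float
    max_frequency_hz: float

def clamp(requested: float, lower: float, upper: float) -> float:
    """Constrain a patient-requested value to the clinician-defined range."""
    return max(lower, min(upper, requested))

if __name__ == "__main__":
    limits = PatientAdjustmentLimits(min_amplitude_ma=0.5, max_amplitude_ma=6.0,
                                     min_frequency_hz=10.0, max_frequency_hz=120.0)
    # The patient requests 8.0 mA, but the clinician capped amplitude at 6.0 mA.
    print(clamp(8.0, limits.min_amplitude_ma, limits.max_amplitude_ma))  # 6.0
    # A frequency request within the allowed range passes through unchanged.
    print(clamp(50.0, limits.min_frequency_hz, limits.max_frequency_hz))  # 50.0
```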
In the embodiments discussed below, the clinician programmer 660 is used as an example of the electronic programmer. However, it is understood that the electronic programmer may also be the patient programmer 650 or other touch screen programming devices (such as smart-phones or tablet computers) in other embodiments.
The IPG provides stimuli to electrodes of an implanted medical electrical lead (not illustrated herein). As shown in
The IPG also includes a power supply portion 740. The power supply portion includes a rechargeable battery 745, fuse 750, power ASIC 755, recharge coil 760, rectifier 763, and data modulation circuit 765. The rechargeable battery 745 provides a power source for the power supply portion 740. The recharge coil 760 receives a wireless signal from the PPC. The wireless signal carries energy that is converted and conditioned into a power signal by the rectifier 763. The power signal is provided to the rechargeable battery 745 via the power ASIC 755. The power ASIC 755 manages the power for the IPG and provides one or more voltages to the other electrical and electronic circuits of the IPG. The data modulation circuit 765 controls the charging process.
The IPG also includes a magnetic sensor 780. The magnetic sensor 780 provides a “hard” switch upon sensing a magnet for a defined period. The signal from the magnetic sensor 780 can provide an override for the IPG if a fault occurs and the IPG is not responding to other controllers.
The IPG is shown in
The IPG includes memory, which can be internal to the control device (such as memory 790), external to the control device (such as serial memory 795), or a combination of both. Exemplary memory includes a read-only memory (“ROM”), a random access memory (“RAM”), an electrically erasable programmable read-only memory (“EEPROM”), a flash memory, a hard disk, or another suitable magnetic, optical, physical, or electronic memory device. The programmable portion 785 executes software that is capable of being stored in the RAM (e.g., during execution), the ROM (e.g., on a generally permanent basis), or another non-transitory computer readable medium such as another memory or a disc.
Software included in the implementation of the IPG is stored in the memory 790. The software includes, for example, firmware, one or more applications, program data, one or more program modules, and other executable instructions. The programmable portion 785 is configured to retrieve from memory and execute, among other things, instructions related to the control processes and methods described below for the IPG. For example, the programmable portion 785 is configured to execute instructions retrieved from the memory 790 for sweeping the electrodes in response to a signal from the CP.
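As a loose, illustrative sketch only (the disclosure does not specify the firmware logic), an electrode sweep triggered by a CP signal could be structured as a simple iteration over the available electrodes. The sweep_electrodes function and the deliver_test_pulse callback are hypothetical stand-ins.

```python
from typing import Callable, Iterable, List

def sweep_electrodes(electrode_ids: Iterable[int],
                     deliver_test_pulse: Callable[[int], bool]) -> List[int]:
    """Visit each electrode in turn and record which ones respond to a brief test stimulus.

    `deliver_test_pulse` stands in for the firmware routine that would drive the
    stimulation hardware; here it simply reports whether a response was observed.
    """
    responsive = []
    for electrode in electrode_ids:
        if deliver_test_pulse(electrode):
            responsive.append(electrode)
    return responsive

if __name__ == "__main__":
    # Pretend electrodes 2 and 5 produce a measurable response.
    print(sweep_electrodes(range(8), lambda e: e in (2, 5)))  # [2, 5]
```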
Referring now to
The medical infrastructure 800 also includes a plurality of electronic programmers 820. For the sake of illustration, one of these electronic programmers, 820A, is illustrated and discussed in more detail below. Nevertheless, it is understood that each of the electronic programmers 820 may be implemented similarly to the electronic programmer 820A.
In some embodiments, the electronic programmer 820A may be a clinician programmer, for example the clinician programmer discussed above with reference to
The electronic programmer 820A contains a communications component 830 that is configured to conduct electronic communications with external devices. For example, the communications component 830 may include a transceiver. The transceiver contains various electronic circuitry components configured to conduct telecommunications with one or more external devices. The electronic circuitry components allow the transceiver to conduct telecommunications using one or more wired or wireless telecommunications protocols, including communications protocols such as IEEE 802.11 (Wi-Fi), IEEE 802.15 (Bluetooth), GSM, CDMA, LTE, WIMAX, DLNA, HDMI, Medical Implant Communication Service (MICS), etc. In some embodiments, the transceiver includes antennas, filters, switches, various kinds of amplifiers such as low-noise amplifiers or power amplifiers, digital-to-analog converters (DACs), analog-to-digital converters (ADCs), mixers, multiplexers and demultiplexers, oscillators, and/or phase-locked loops (PLLs). Some of these electronic circuitry components may be integrated into a single discrete device or an integrated circuit (IC) chip.
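For illustration only, a multi-protocol communications component could be modeled as a dispatcher that routes outgoing data over whichever supported protocol is selected. The CommunicationsComponent class and its send method are hypothetical and do not reflect the actual transceiver firmware.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class CommunicationsComponent:
    """Hypothetical wrapper around a multi-protocol transceiver."""
    transports: Dict[str, Callable[[bytes], None]]  # protocol name -> send routine

    def send(self, protocol: str, payload: bytes) -> None:
        """Dispatch a payload over one of the supported wired or wireless protocols."""
        if protocol not in self.transports:
            raise ValueError(f"unsupported protocol: {protocol}")
        self.transports[protocol](payload)

if __name__ == "__main__":
    comms = CommunicationsComponent(transports={
        "Wi-Fi": lambda data: print(f"802.11: sent {len(data)} bytes"),
        "Bluetooth": lambda data: print(f"802.15: sent {len(data)} bytes"),
        "MICS": lambda data: print(f"MICS: sent {len(data)} bytes"),
    })
    comms.send("MICS", b"\x01\x02\x03")  # e.g., telemetry destined for an implanted device
```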
The electronic programmer 820A contains a touchscreen component 840. The touchscreen component 840 may display a touch-sensitive graphical user interface that is responsive to gesture-based user interactions. The touch-sensitive graphical user interface may detect a touch or a movement of a user's finger(s) on the touchscreen and interpret these user actions accordingly to perform appropriate tasks. The graphical user interface may also utilize a virtual keyboard to receive user input. In some embodiments, the touch-sensitive screen may be a capacitive touchscreen. In other embodiments, the touch-sensitive screen may be a resistive touchscreen.
It is understood that the electronic programmer 820A may optionally include additional user input/output components that work in conjunction with the touchscreen component 840 to carry out communications with a user. For example, these additional user input/output components may include physical and/or virtual buttons (such as power and volume buttons) on or off the touch-sensitive screen, physical and/or virtual keyboards, mouse, track balls, speakers, microphones, light-sensors, light-emitting diodes (LEDs), communications ports (such as USB or HDMI ports), joy-sticks, etc.
The electronic programmer 820A contains an imaging component 850. The imaging component 850 is configured to capture an image of a target device via a scan. For example, the imaging component 850 may be a camera in some embodiments. The camera may be integrated into the electronic programmer 820A. The camera can be used to take a picture of a medical device, or scan a visual code of the medical device, for example its barcode or Quick Response (QR) code.
The electronic programmer contains a memory storage component 860. The memory storage component 860 may include system memory (e.g., RAM), static storage (e.g., ROM), a disk drive (e.g., magnetic or optical), or any other suitable types of computer readable storage media. For example, some common types of computer readable media may include floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer is adapted to read. The computer readable medium may include, but is not limited to, non-volatile media and volatile media. The computer readable medium is tangible, concrete, and non-transitory. Logic (for example in the form of computer software code or computer instructions) may be encoded in such computer readable medium. In some embodiments, the memory storage component 860 (or a portion thereof) may be configured as a local database capable of storing electronic records of medical devices and/or their associated patients.
The electronic programmer contains a processor component 870. The processor component 870 may include a central processing unit (CPU), a graphics processing unit (GPU), a micro-controller, a digital signal processor (DSP), or another suitable electronic processor capable of handling and executing instructions. In various embodiments, the processor component 870 may be implemented using various digital circuit blocks (including logic gates such as AND, OR, NAND, NOR, XOR gates, etc.) along with certain software code. In some embodiments, the processor component 870 may execute one or more sequences of computer instructions contained in the memory storage component 860 to perform certain tasks.
It is understood that hard-wired circuitry may be used in place of (or in combination with) software instructions to implement various aspects of the present disclosure. Where applicable, various embodiments provided by the present disclosure may be implemented using hardware, software, or combinations of hardware and software. Also, where applicable, the various hardware components and/or software components set forth herein may be combined into composite components comprising software, hardware, and/or both without departing from the spirit of the present disclosure. Where applicable, the various hardware components and/or software components set forth herein may be separated into sub-components comprising software, hardware, or both without departing from the scope of the present disclosure. In addition, where applicable, it is contemplated that software components may be implemented as hardware components and vice-versa.
It is also understood that the electronic programmer 820A is not necessarily limited to the components 830-870 discussed above, but it may further include additional components that are used to carry out the programming tasks. These additional components are not discussed herein for reasons of simplicity. It is also understood that the medical infrastructure 800 may include a plurality of electronic programmers similar to the electronic programmer 820A discussed herein, but they are not illustrated in
The medical infrastructure 800 also includes an institutional computer system 890. The institutional computer system 890 is coupled to the electronic programmer 820A. In some embodiments, the institutional computer system 890 is a computer system of a healthcare institution, for example a hospital. The institutional computer system 890 may include one or more computer servers and/or client terminals that may each include the necessary computer hardware and software for conducting electronic communications and performing programmed tasks. In various embodiments, the institutional computer system 890 may include communications devices (e.g., transceivers), user input/output devices, memory storage devices, and computer processor devices that may share similar properties with the various components 830-870 of the electronic programmer 820A discussed above. For example, the institutional computer system 890 may include computer servers that are capable of electronically communicating with the electronic programmer 820A through the MICS protocol or another suitable networking protocol.
The medical infrastructure 800 includes a database 900. In various embodiments, the database 900 is a remote database—that is, located remotely to the institutional computer system 890 and/or the electronic programmer 820A. The database 900 is electronically or communicatively (for example through the Internet) coupled to the institutional computer system 890 and/or the electronic programmer. In some embodiments, the database 900, the institutional computer system 890, and the electronic programmer 820A are parts of a cloud-based architecture. In that regard, the database 900 may include cloud-based resources such as mass storage computer servers with adequate memory resources to handle requests from a variety of clients. The institutional computer system 890 and the electronic programmer 820A (or their respective users) may both be considered clients of the database 900. In certain embodiments, the functionality between the cloud-based resources and its clients may be divided up in any appropriate manner. For example, the electronic programmer 820A may perform basic input/output interactions with a user, but a majority of the processing and caching may be performed by the cloud-based resources in the database 900. However, other divisions of responsibility are also possible in various embodiments.
According to the various aspects of the present disclosure, electronic data may be uploaded from the electronic programmer 820A to the database 900. The data in the database 900 may thereafter be downloaded by any of the other electronic programmers 820B-820N communicatively coupled to it, assuming the users of these programmers have the appropriate login permissions.
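As a minimal sketch of this upload/download flow (not the actual cloud implementation), the remote database could store uploaded records and gate downloads on per-user login permissions. The RemoteDatabase class and its method names are hypothetical.

```python
from typing import Dict, List, Set

class RemoteDatabase:
    """Minimal sketch of a cloud database that stores uploads and gates downloads on permissions."""

    def __init__(self) -> None:
        self._records: List[Dict] = []
        self._authorized_users: Set[str] = set()

    def grant_access(self, user: str) -> None:
        """Give a user the login permission needed to download stored data."""
        self._authorized_users.add(user)

    def upload(self, record: Dict) -> None:
        """Called by an uploading programmer (e.g., 820A) to push data to the cloud."""
        self._records.append(record)

    def download(self, user: str) -> List[Dict]:
        """Other programmers (e.g., 820B-820N) may pull the data if their user is authorized."""
        if user not in self._authorized_users:
            raise PermissionError(f"{user} lacks the required login permissions")
        return list(self._records)

if __name__ == "__main__":
    db = RemoteDatabase()
    db.grant_access("clinician_b")
    db.upload({"program_id": 7, "amplitude_ma": 3.5})  # uploaded from programmer 820A
    print(db.download("clinician_b"))                  # downloaded by another programmer
```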
The database 900 may also include a manufacturer's database in some embodiments. It may be configured to manage an electronic medical device inventory, monitor manufacturing of medical devices, control shipping of medical devices, and communicate with existing or potential buyers (such as a healthcare institution). For example, communication with the buyer may include buying and usage history of medical devices and creation of purchase orders. A message can be automatically generated when a client (for example a hospital) is projected to run out of equipment, based on the medical device usage trend analysis done by the database. According to various aspects of the present disclosure, the database 900 is able to provide these functionalities at least in part via communication with the electronic programmer 820A and in response to the data sent by the electronic programmer 820A. These functionalities of the database 900 and its communications with the electronic programmer 820A will be discussed in greater detail later.
The medical infrastructure 800 further includes a manufacturer computer system 910. The manufacturer computer system 910 is also electronically or communicatively (for example through the Internet) coupled to the database 900. Hence, the manufacturer computer system 910 may also be considered a part of the cloud architecture. The computer system 910 is a computer system of a medical device manufacturer, for example a manufacturer of the medical devices 810 and/or the electronic programmer 820A.
In various embodiments, the manufacturer computer system 910 may include one or more computer servers and/or client terminals that each includes the necessary computer hardware and software for conducting electronic communications and performing programmed tasks. In various embodiments, the manufacturer computer system 910 may include communications devices (e.g., transceivers), user input/output devices, memory storage devices, and computer processor devices that may share similar properties with the various components 830-870 of the electronic programmer 820A discussed above. Since both the manufacturer computer system 910 and the electronic programmer 820A are coupled to the database 900, the manufacturer computer system 910 and the electronic programmer 820A can conduct electronic communication with each other.
The method 950 proceeds to step 975, in which the simulated cursors are updated at all locations of the sensed fingers. The method 950 then proceeds to a decision step 980 to determine whether an action is detected on one finger. If the answer from the decision step 980 is no, then the method 950 loops back to the step 975. If the answer from the decision step 980 is yes, then the method 950 proceeds to a step 985 to temporarily highlight the simulated cursor for the finger that caused the action. The method 950 continues to a decision step 990 to determine whether the finger has moved away from the area of the screen corresponding to the simulated cursor. If the answer from the decision step 990 is no, then the method 950 loops back to the step 975. If the answer from the decision step 990 is yes, then the method 950 proceeds to a step 995 to hide the simulated cursor for the finger that moved away.
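For illustration only, the loop through steps 975-995 could be tracked with a small state object that updates cursor positions each pass, temporarily highlights the cursor of an action-invoking finger, and drops cursors for fingers that are no longer sensed. The CursorTracker class and the 0.5-second highlight window are assumptions, not details from the disclosure.

```python
import time
from typing import Dict, Optional, Tuple

HIGHLIGHT_SECONDS = 0.5  # assumed; the method only says the highlight is temporary

class CursorTracker:
    """Illustrative state machine loosely following steps 975-995 of method 950."""

    def __init__(self) -> None:
        self.cursors: Dict[int, Tuple[float, float]] = {}       # contact_id -> cursor position
        self.highlighted_at: Dict[int, float] = {}               # contact_id -> time of last action

    def update(self, sensed: Dict[int, Tuple[float, float]],
               action_contact: Optional[int] = None) -> None:
        # Step 975: update simulated cursors at all locations of the sensed fingers.
        self.cursors = dict(sensed)
        # Step 995: hide cursors for fingers that have moved away (no longer sensed).
        self.highlighted_at = {cid: t for cid, t in self.highlighted_at.items() if cid in sensed}
        # Steps 980/985: if an action is detected on one finger, highlight its cursor temporarily.
        if action_contact is not None and action_contact in self.cursors:
            self.highlighted_at[action_contact] = time.monotonic()

    def is_highlighted(self, contact_id: int) -> bool:
        t = self.highlighted_at.get(contact_id)
        return t is not None and time.monotonic() - t < HIGHLIGHT_SECONDS

if __name__ == "__main__":
    tracker = CursorTracker()
    tracker.update({0: (100, 100), 1: (300, 200)}, action_contact=0)
    print(tracker.is_highlighted(0), tracker.is_highlighted(1))  # True False
    tracker.update({1: (305, 210)})   # finger 0 moved away, so its cursor is hidden
    print(0 in tracker.cursors)        # False
```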
It is understood that though the method 950 refers to fingers as an example, other input devices such as styluses may also trigger the display of the simulated cursors in a similar manner. It is also understood that the method 950 may include additional steps that are performed before, during, or after the steps 955-995 discussed herein, but these steps are not specifically discussed for reasons of simplicity. In addition, it is understood that multi-touch functionalities may be implemented for each of the steps 955-995 in some embodiments. Multi-touch may refer to the capability of a touch sensitive interface (such as a touchscreen or a touchpad) to detect the presence of multiple points of contact with the interface. For example, the touch sensitive interface may be able to translate the detection of the engagement (or movement) of one finger with the interface as one command, but may translate the engagement (or movement) of two or three fingers with the interface as a different type of command.
The method 1000 includes a step 1005 of detecting a user engagement with respect to a screen of the clinician programmer via one or more sensors associated with the screen of the clinician programmer. In some embodiments, the step 1005 of detecting includes detecting a physical contact of the screen of the clinician programmer via a touch sensor of the clinician programmer. In some embodiments, a physical contact from a finger or a stylus is detected. In some embodiments, the step 1005 of detecting includes detecting a proximity of a finger or a stylus via a proximity sensor of the clinician programmer. In some embodiments, the engagement from the user invokes an action on the clinician programmer.
The method 1000 includes a step 1010 of determining one or more locations on the screen of the clinician programmer corresponding to the user engagement.
The method 1000 includes a step 1015 of displaying, via an external monitor communicatively coupled to the clinician programmer, one or more cursors that graphically represent the one or more locations on the screen of the clinician programmer corresponding to the user engagement, respectively. In some embodiments, the step 1015 of displaying comprises graphically differentiating a selected one of the cursors from the rest of the cursors, wherein the selected one of the cursors corresponds to the invoked action. In some embodiments, the step 1015 of displaying is performed in a manner such that a majority of an area covered by the cursors is transparent. In some embodiments, the external monitor is multiple times larger in size than the clinician programmer. In some embodiments, the external monitor and the clinician programmer are located in separate rooms.
The method 1000 includes a step 1020 of mirroring the displaying of the one or more cursors on the screen of the clinician programmer.
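As an illustrative sketch of steps 1010-1020 (not the disclosed implementation), engagement locations could be turned into mostly transparent cursor draw commands, with the cursor tied to the invoked action emphasized, and the same frame rendered on both the external monitor and the programmer's own screen. The CursorDrawCommand class, the build_cursor_frame and render helpers, and the 0.3 opacity value are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class CursorDrawCommand:
    """One cursor to render; an opacity below 0.5 keeps the majority of its area see-through."""
    x: float
    y: float
    emphasized: bool      # True for the cursor tied to the invoked action
    opacity: float = 0.3  # assumed value; the method only requires majority transparency

def build_cursor_frame(locations: List[Tuple[float, float]],
                       action_index: Optional[int] = None) -> List[CursorDrawCommand]:
    """Steps 1010-1015 in miniature: turn engagement locations into cursor draw commands."""
    return [CursorDrawCommand(x, y, emphasized=(i == action_index))
            for i, (x, y) in enumerate(locations)]

def render(frame: List[CursorDrawCommand], target: str) -> None:
    """Stand-in for the actual drawing routine on a given display."""
    for cmd in frame:
        marker = "[action]" if cmd.emphasized else ""
        print(f"{target}: cursor at ({cmd.x}, {cmd.y}) {marker} opacity={cmd.opacity}")

if __name__ == "__main__":
    frame = build_cursor_frame([(120, 340), (410, 88)], action_index=1)
    render(frame, "external monitor")      # step 1015: display on the coupled external monitor
    render(frame, "clinician programmer")  # step 1020: mirror on the programmer's own screen
```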
It is also understood that the method 1000 may include additional steps that are performed before, during, or after the steps 1005-1020 discussed herein, but these steps are not specifically discussed for reasons of simplicity.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
The present application is a continuation application of U.S. patent application Ser. No. 14/926,152, filed on Oct. 29, 2015, now U.S. Pat. No. 9,314,640, issued on Apr. 19, 2016, which is a divisional application of U.S. patent application Ser. No. 14/011,156, filed on Aug. 27, 2013, now U.S. Pat. No. 9,180,302, issued Nov. 10, 2015, which is a continuation-in-part (CIP) application of U.S. patent application Ser. No. 13/600,875, filed on Aug. 31, 2012, now U.S. Pat. No. 8,903,496, issued Dec. 2, 2014, which claims priority to provisional U.S. Patent Application No. 61/695,394, filed on Aug. 31, 2012, entitled “Touch Screen Finger Position Indicator for a Spinal Cord Stimulation Programming Device,” the disclosures of each of which are hereby incorporated by reference in their respective entireties.