When hearing loss progresses beyond the point where hearing aids are effective, acoustic hearing can be augmented with electronic hearing via a cochlear implant. These life-changing devices supplement acoustic hearing with electronic hearing by directly stimulating the auditory nerve.
Measurable features collected during implantation can predict surgical outcomes. Such features include the cochlear implant placement, insertion force and structural damage. Currently, these features are sensed only qualitatively by the surgeon, who adjusts the insertion based on the sensed features. Expert surgeons have optimized their insertion technique to reduce trauma and preserve hearing based on subtle cues as the electrode array is implanted. The surgeon's only feedback on insertion force, however, is the resistance perceived while manually threading the electrode array into the cochlea, which is bounded by the sensitivity of human perception and is highly dependent on the surgeon's experience and dexterity.
Force-measurement systems in robotic platforms have been used to monitor external insertion force during surgery, but such systems measure only the cumulative force and are unable to localize the causes of increased insertion force. Additionally, while prior work has attempted to utilize MEMS technology to replace the cochlear implant electrode array, sensing capabilities using dedicated sensors integrated into the electrode array are unknown in the art. There have also been attempts to enhance the surgeon's ability to actuate the electrode array precisely, using robotically guided insertion, magnetic guidance systems, or built-in actuators. However, these techniques do not incorporate in situ feedback from the cochlear implant electrode array itself.
Described herein as a first aspect of the invention is a novel design of an instrumented cochlear implant, wherein the electrode array portion of the implant is provided with one or more sensors to detect various features of the electrode array during insertion and to provide feedback to the surgeon during implantation. The implant uses a sensing array to collect intraoperative information on the state of the electrode array during insertion. For example, if configured with an array of strain sensors, flexing of the electrode array can be detected at any point along its length. This allows for reconstruction of the pose of the electrode array during insertion and detection of contact, or possible contact, with the inner walls of the cochlea.
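By way of a non-limiting illustration, the pose reconstruction described above can be sketched as a planar curvature integration, assuming each strain element reports pure bending strain at a known arc-length station along the array. The segment length, neutral-axis offset, and function names below are hypothetical and are not taken from the disclosure:

```python
import numpy as np

def reconstruct_pose(strains, segment_len=1.0e-3, half_thickness=0.15e-3):
    """Integrate per-segment bending strain into a planar (2D) pose.

    strains        : bending strain reported by each sensing element.
    segment_len    : arc length covered by each element, in meters (assumed).
    half_thickness : sensor offset from the neutral axis, in meters (assumed).
    Returns arrays (x, y) of segment-boundary coordinates, base at the origin.
    """
    curvature = np.asarray(strains, dtype=float) / half_thickness  # kappa = eps / c
    # Heading angle accumulates curvature; position accumulates unit tangents.
    theta = np.concatenate(([0.0], np.cumsum(curvature * segment_len)))
    x = np.concatenate(([0.0], np.cumsum(segment_len * np.cos(theta[:-1]))))
    y = np.concatenate(([0.0], np.cumsum(segment_len * np.sin(theta[:-1]))))
    return x, y
```

With all strains zero the reconstruction is a straight line; uniform strain produces a circular arc, which is the qualitative signature of the array curling inside the cochlea.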
Disclosed herein as a second aspect of the invention is a system for interpreting the signals received from the sensors and providing intraoperative feedback to the surgeon. The system is capable of determining the deflection, and therefore the overall pose, of a cochlear implant electrode array from noisy sensor data of a sensing array having a plurality of strain sensing elements. Additionally, the system is capable of providing surgical planning capabilities based on a surgical simulator model. In various embodiments, the system may use analytical models, machine learning models or a combination thereof to interpret the raw sensor data and to determine the pose of the electrode array. In addition, the system is capable of predicting a positive or negative surgical outcome and of suggesting steps that can be taken by the user to improve the probability of a positive surgical outcome.
By way of example, a specific exemplary embodiment of the disclosed system and method will now be described, with reference to the accompanying drawings, in which:
Disclosed herein is an instrumented electrode array of a cochlear implant wherein the electrode array is configured with one or more sensing elements formed into a microfabricated thin-film sensing array. The sensing elements are preferably microfabricated as thin-film sensors and integrated with the electrode array. Various types of sensors may be deployed as part of the sensing array, including, for example, strain sensors (i.e., resistive, capacitive, or crack-based), force/pressure sensors (capacitive or electrochemical diaphragm), temperature sensors, proximity sensors, optical sensors (e.g., spectrometry, reflectometry, imaging, or coherence tomography based on integrated optical fibers or waveguides), chemical detection, etc. The sensing array may also integrate microfluidic capabilities to enable sensing, or to aid surgery via drug delivery or by relieving fluid pressure in the scala tympani.
In various embodiments, one or more different types of sensing elements may be deployed as part of the thin-film sensing array to provide multiple sensing modalities within a single sensing array. Additionally, like sensing elements may be oriented differently on the thin-film. For example, a plurality of strain sensing elements may be oriented in different directions on the thin-film such as to be capable of detecting elongation or compression along multiple axes.
The invention is described herein in the context of a microfabricated interdigitated electrode array used as a strain sensor; however, as would be realized by one of skill in the art, any type of sensor previously mentioned or known in the art is contemplated to be within the scope of the invention.
In one embodiment, the microfabricated thin-film sensor 304 is designed to be disconnected from the readout system after implantation by severing cable 208 as shown in
In one embodiment, the microfabricated thin-film sensing array 204 may be attached to a cochlear implant electrode array 202 after manufacturing via an assembly process. For example, sensor array 204 may be joined to the electrode array 202 using a silicone adhesive. In an alternative embodiment, sensing array 204 may be integrated into the manufacturing process of electrode array 202 by including it, for example, in an injection molding process used to produce electrode array 202. The dimensions of the thin-film sensing array 204 may be varied to match the dimensions of various cochlear implant electrode arrays 202 from different manufacturers.
The construction of the thin-film sensing array 204 is not limited to a single material platform. In one embodiment, the sensing elements 206 use platinum traces embedded in a Parylene C insulation to form an interdigitated electrode array strain sensor. These materials may be substituted with other common biocompatible materials, such as aluminum or gold for the traces and other polymer insulators (e.g., parylenes, siloxanes, polyimide, SU-8, etc.). Similarly, an optical waveguide may be implemented with a Parylene C core and silicone cladding (e.g., Parylene photonics), but may also be composed of other materials (e.g., SU-8, Ormocers, etc.).
The microfabricated thin-film sensing array 204 may be as previously described and may utilize one or more optical, electrical, electrochemical or microfluidic systems. One exemplary embodiment of the thin-film sensing array 204 is a metal strain gauge based on an interdigitated electrode array capacitive strain sensor. A second exemplary embodiment of the sensing array 204 is an integrated photonic waveguide to perform fiber optical coherence tomography intraoperatively.
In one embodiment, one or more interdigitated electrode array (IDE) capacitive strain sensors may be utilized as sensing elements 206 on the thin-film sensing array 204. The sensing elements 206 and the overall thin-film sensing array 204 may be fabricated as described in Provisional Patent Application No. 63/324,839, to which this application claims priority. The contents of this application are incorporated herein in their entirety.
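As a rough first-order sketch of how an IDE capacitive sensing element responds to strain, one can treat each pair of adjacent fingers as a parallel-plate capacitor whose gap is widened by lateral strain. The geometry values and function names below are hypothetical and do not reflect the fabricated sensor of the referenced provisional application:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def ide_capacitance(n_fingers, finger_len, thickness, gap, eps_r=3.0):
    """Parallel-plate approximation: (n_fingers - 1) gaps, each a plate
    pair of area finger_len * thickness separated by `gap` (all in meters)."""
    return (n_fingers - 1) * EPS0 * eps_r * finger_len * thickness / gap

def strain_response(strain, gap0=5e-6, **geom):
    """Fractional capacitance change when lateral strain widens the gap."""
    c0 = ide_capacitance(gap=gap0, **geom)
    c = ide_capacitance(gap=gap0 * (1.0 + strain), **geom)
    return (c - c0) / c0  # ~= -strain for small strain
```

In this approximation the fractional capacitance change is 1/(1+ε) − 1, i.e., to first order the negative of the applied strain, which is what makes the element usable as a strain gauge.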
The readout system 410 is composed of several discrete components, preferably integrated on a printed circuit board. Readout system 410 may include any required input/output interfaces, an amplifier and digitizer circuits that may be required to operate the thin-film sensor array 204, including, but not limited to: resistive, capacitive, or impedance measurement circuits, voltage or current sources for electrical sensors, or laser diodes, spectrometers, optical filters, and power meters for optical systems. The readout system 410 also contains a microcontroller to process and store the data, as well as power control (voltage regulators or battery circuitry) and wired or wireless communication circuitry.
The user (surgeon) interface 412 provides feedback to the surgeon and displays the information acquired by the readout system 410 to the surgeon. The feedback and display may consist of audible cues and/or a visual display of metrics (e.g., wrapping factor or tip force), or a more complex visualization (e.g., a visualization of a 3D pose of the cochlear implant electrode array 202, or the strain or force distribution along the array). User interface 412 may consist of a device with a screen or speakers, or an augmented-reality display.
In various embodiments, readout system 410 may be configured to derive force and position information from data from the plurality of sensing elements 206.
In one embodiment, an analytical model may be used to derive force and position vectors based on the data from sensing elements 206.
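One such analytical model, offered here only as an illustrative sketch, treats the electrode array as an Euler–Bernoulli cantilever carrying a point load at its tip: bending strain then falls off linearly toward the tip, so a line fit over the distributed strain readings recovers the tip force. The material and geometry constants are hypothetical placeholders:

```python
import numpy as np

def tip_force_from_strain(strains, positions, E=2.8e9, I=2.0e-16, c=0.15e-3):
    """Estimate a point force at the tip of a cantilevered array.

    For a tip load F on an Euler-Bernoulli cantilever, bending strain at
    arc position x is eps(x) = F * (L - x) * c / (E * I), i.e., linear in x
    with slope -F * c / (E * I). A least-squares line fit over the sensor
    stations therefore recovers F = -slope * E * I / c.
    """
    slope = np.polyfit(positions, strains, 1)[0]
    return -slope * E * I / c
```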
In an alternate embodiment, a machine learning-based model 700 may be used to determine the position vector estimate 612 and the normal force vector estimate 614. Machine learning-based model 700 may use machine learning model 702 which is trained on ground truth position and force information derived from simulated cochlear implant surgeries or training sessions performed on cochlear models. Machine learning model 702 may be of any known architecture.
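As a minimal stand-in for machine learning model 702 (whose architecture the disclosure leaves open), the sketch below fits a ridge regression from sensor frames to pose/force targets. The synthetic training data merely emulates simulator-derived ground truth; channel counts and variable names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for simulator-generated ground truth: 16 strain
# channels map to 3 pose/force targets through an unknown linear relation.
W_true = rng.normal(size=(16, 3))
X_train = rng.normal(size=(500, 16))
Y_train = X_train @ W_true + 0.01 * rng.normal(size=(500, 3))

# Ridge regression: W = (X^T X + lam * I)^-1 X^T Y
lam = 1e-3
W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(16), X_train.T @ Y_train)

def predict(sensor_frame):
    """Map one frame of raw sensor readings to pose/force estimates."""
    return np.asarray(sensor_frame) @ W
```

A deployed model would likely be nonlinear (the disclosure permits any known architecture); the linear fit is only the simplest instance of learning the sensor-to-state mapping from ground-truth examples.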
The position and normal force vectors produced by analytic model 600 or machine learning-based model 700 may be used to create an anatomically accurate pose estimation of electrode array 202 as it is inserted into the cochlea and this pose estimation can be visualized to the surgeon via user interface 412. In addition, alarms may be raised to the user when the normal force vector indicates that certain forces have exceeded predetermined thresholds or when the position vector indicates a position deviation of electrode array 202 and may also be delivered via user interface 412.
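The alarm behavior described above might be sketched as simple threshold checks over the estimated force and position vectors; the force and deviation thresholds below are hypothetical placeholders, not clinically validated values:

```python
import numpy as np

def check_alarms(forces, positions, planned, f_max=0.03, pos_tol=0.5e-3):
    """Collect alarm messages when any sensed normal force exceeds f_max
    (newtons) or the estimated position deviates from the planned
    trajectory by more than pos_tol (meters)."""
    alarms = []
    for i, f in enumerate(forces):
        if abs(f) > f_max:
            alarms.append(f"force threshold exceeded at element {i}: {f:.3f} N")
    deviation = np.linalg.norm(np.asarray(positions) - np.asarray(planned), axis=1)
    if np.any(deviation > pos_tol):
        alarms.append("position deviation beyond tolerance")
    return alarms
```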
Readout system 410 may also include a surgical route planning component wherein electrode states are discretized by insertion depth. A state tree is shown in
For surgical planning, insertion of the electrode array 202 into a 3D model of a cochlea can be simulated using the Simulation Open Framework Architecture (SOFA). This simulation approach can be tailored to individual patients using 3D geometry of the scala tympani reconstructed from CT scans. In one embodiment, the insertion is discretized by insertion depth in intervals of 100 micrometers, although other discretization intervals may be used depending on the available computational resources. At each step, a series of possible actions may be simulated to form a branching decision tree of surgical states. These actions are discretized in steps of insertion speed (e.g., [0.5, 1, 2 mm/s]) and insertion angle (e.g., Δθ=[−15, −10, −5, 0, 5, 10, 15 degrees]). Finer discretization intervals may be chosen if additional computational resources are available.
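The branching described above can be sketched by enumerating, at each 100-micrometer step, the cross product of the disclosed speed and angle discretizations; the state representation below is a simplified assumption, and a real planner would attach simulator outputs to each child:

```python
from itertools import product

SPEEDS = [0.5, 1.0, 2.0]                     # mm/s, per the disclosed discretization
ANGLE_DELTAS = [-15, -10, -5, 0, 5, 10, 15]  # degrees

def expand(state):
    """Enumerate the child states one 100-micrometer step deeper.
    `state` is a (depth_um, angle_deg) pair; a simulator would attach force
    and pose metrics to each child -- here only the branching is built."""
    depth, angle = state
    return [{"depth_um": depth + 100,
             "angle_deg": angle + da,
             "action": (speed, da)}
            for speed, da in product(SPEEDS, ANGLE_DELTAS)]
```

Each state thus branches into 3 × 7 = 21 children, which is why coarse discretization (or pruning) matters for keeping the tree tractable.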
End states of the decision tree of surgical states (full insertion) are evaluated by placement metrics such as wrapping factor. Intermediate states are evaluated by the optimal placement that can be reached from each state. Intermediate state rankings are penalized by the peak insertion force at that state to avoid insertion trajectories that result in large transient forces which may result in surgical trauma.
In one embodiment, efficient surgical planning can be achieved via a search algorithm to identify a path to an optimal end state while avoiding intermediate states causing high insertion force. Given an arbitrary initial state (i.e., if the surgical trajectory has deviated from the optimal path), a new search can be performed in the decision tree to identify the optimal recovery path. At each step of insertion, a minimally traumatic optimal placement path is defined in prescriptive steps of Δθ insertion angle change and insertion speed. In various alternative embodiments, the surgical route planning component may be implemented using a trained machine learning model wherein the ground truth is indicated by surgical actions which resulted in positive clinical outcomes. The machine learning model may recommend an optimal surgical path and may also recommend remedial or next actions to be taken in the event that the position or force vectors 612, 614, or the higher-order features 804 previously discussed indicate pending trauma.
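One way to sketch such a search, under the assumption of a fixed-depth tree with simulator-supplied callbacks, is an exhaustive depth-first search that maximizes the end-state placement score minus a penalty on the peak force encountered en route. The callback names and penalty weight are hypothetical:

```python
PEAK_PENALTY = 1.0  # hypothetical weight trading placement quality vs. force

def best_path(state, step, max_depth, children_of, end_score, step_force,
              peak=0.0):
    """Exhaustive depth-first search over a fixed-depth decision tree.

    children_of(state) -> list of (action, next_state) pairs
    end_score(state)   -> placement metric (e.g., wrapping factor) at full insertion
    step_force(state)  -> simulated insertion force at this state

    Returns (score, action list) for the trajectory that maximizes the
    end-state score minus a penalty on the peak force seen along the way.
    """
    peak = max(peak, step_force(state))
    if step == max_depth:
        return end_score(state) - PEAK_PENALTY * peak, []
    best_score, best_actions = float("-inf"), []
    for action, nxt in children_of(state):
        score, actions = best_path(nxt, step + 1, max_depth, children_of,
                                   end_score, step_force, peak)
        if score > best_score:
            best_score, best_actions = score, [action] + actions
    return best_score, best_actions
```

Starting the search from an arbitrary intermediate state gives the recovery-path behavior described above: the same routine simply re-plans from wherever the trajectory currently stands.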
The invention is contemplated to include both the instrumented cochlear implant and the system for analyzing the data and providing intraoperative feedback to the surgeon, including recommending actions to raise the probability of a positive clinical outcome. As would be realized by one of skill in the art, many variations on the system and the device disclosed herein are possible and are contemplated to be within the scope of the invention, which is defined by the claims which follow.
This application claims the benefit of U.S. Provisional Patent Application No. 63/324,174, filed on Mar. 28, 2022, 63/324,839, filed Mar. 29, 2022 and 63/324,871, filed Mar. 29, 2022, the contents of which are incorporated herein in their entireties.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US23/16519 | 3/28/2023 | WO |
Number | Date | Country
---|---|---
63324174 | Mar 2022 | US
63324839 | Mar 2022 | US
63324871 | Mar 2022 | US