The present disclosure relates to the field of computer-assisted surgery, and more specifically, to anatomical feature tracking and positioning in computer-assisted surgery (CAS) systems.
Computer-assisted surgery (CAS) makes use of references fixed to the patient using pins inserted into the bones of the limbs or the pelvis. These pins, inserted into the bones before or during the surgery, come in different diameters and can cause pain after the surgery. They add a step to the surgery, required solely by the navigation system. Moreover, inserting the pins into the bone may weaken the bone, which can then be fractured more easily. In some cases involving osteoporotic bones, the lack of bone density may also cause anchoring instability. Infections may also occur, as with any surgical entry point.
Furthermore, the length of the pins is sometimes obtrusive to the surgeon, who may cut them to a length better adapted to his or her movements during the surgery. The cut pin is also perceived as a nuisance: its end may be sharp and hazardous to the personnel working around the surgery table.
Consequently, wearable trackers have been developed, so as to minimize the invasive nature of computer-assisted tracking devices. However, the wearable trackers may occasionally lack precision. There thus remains room for improvement.
In accordance with a first aspect of the present disclosure, there is provided an ultrasound tracking system for tracking a position and orientation of an anatomical feature in computer-assisted surgery, the ultrasound tracking system comprising: an ultrasound imaging system having a phased-array ultrasound probe unit being adapted for emitting ultrasound signals successively towards different portions of said anatomical feature, measuring echo signals returning from said portions of said anatomical feature and generating respective imaged echo datasets; a coordinate tracking system tracking coordinates of said phased-array ultrasound probe unit during said measuring, and generating corresponding coordinate datasets; and a controller being communicatively coupled to said ultrasound imaging system and said coordinate tracking system, said controller having a processor and a memory having stored thereon instructions that when executed by said processor perform the steps of: registering said imaged echo datasets in a common coordinate system based on said coordinate datasets; and tracking said position and orientation of said anatomical feature based on said registering.
In accordance with a second aspect of the present disclosure, there is provided a method for tracking a position and orientation of an anatomical feature in computer-assisted surgery, the method comprising: emitting phased-array ultrasound signals towards different portions of said anatomical feature, measuring echo signals returning from said portions of said anatomical feature and generating respective imaged echo datasets while tracking coordinates of an ultrasound probe unit emitting said phased-array ultrasound signals, and generating corresponding coordinate datasets; and a controller performing the steps of: registering said imaged echo datasets in a common coordinate system based on said coordinate datasets; and tracking said position and orientation of said anatomical feature based on said registering.
In accordance with a third aspect of the present disclosure, there is provided a wearable element for use in computer-assisted surgery involving ultrasound tracking of an anatomical feature of a patient, the wearable element comprising: a garment to be worn by the patient; and an ultrasound imaging interface covering at least a portion of the garment, the ultrasound imaging interface being made of a solid acoustically transmissive material and having one or more surgery openings defined therein allowing access to the anatomical feature. In some embodiments, one or more ultrasound probe units can be embedded into the ultrasound imaging interface. In some embodiments, the garment can be provided in the form of a compression shirt, a compression sleeve and the like.
In accordance with a fourth aspect of the present disclosure, there is provided an ultrasound tracking device for use with a position sensing system to register position and orientation in computer-assisted surgery, the ultrasound tracking device comprising: a wearable holder adapted to be secured to an anatomic feature; at least two ultrasonic probe units supported by the wearable holder and adapted to emit signals to image part of the anatomic feature; at least one reference tracker supported by the wearable holder; and a mechanical member projecting from a remainder of the ultrasound tracking device and increasing an axial footprint of the ultrasound tracking device. In some embodiments, the at least two ultrasonic probe units are axially spaced-apart from one another along an anatomical axis of the anatomical feature.
In accordance with a fifth aspect of the present disclosure, there is provided a set of ultrasound tracking devices for use with a position sensing system to register position and orientation in computer-assisted surgery, each of the ultrasound tracking devices comprising: a wearable holder adapted to be secured to an anatomic feature, and at least two ultrasonic probe units supported by the wearable holder and adapted to emit signals to image part of the anatomic feature; at least one reference tracker supported by one of the wearable holders; and a linkage between the set of ultrasound tracking devices, the linkage having a rotational joint and a sensor for determining an angular value variation in the rotational joint. In some embodiments, the ultrasound tracking devices are axially spaced-apart from one another along an anatomical axis of the anatomical feature.
In accordance with a sixth aspect of the present disclosure, there is provided an ultrasound tracking system for tracking a position of an ultrasound tracking device with respect to an extremity of an anatomical feature in computer-assisted surgery, the ultrasound tracking system comprising: at least an ultrasound probe unit fixedly mounted relative to the anatomical feature, the ultrasound probe unit being adapted for emitting an ultrasound signal within said anatomical feature, at least a portion of the ultrasound signal being guided away from the ultrasound probe unit and along an anatomical axis of the anatomical feature towards the extremity thereof, the ultrasound probe unit detecting at least a reflected portion of the ultrasound signal being guided from the extremity of the anatomical feature and back towards the ultrasound probe unit; a controller being communicatively coupled to said ultrasound probe unit, said controller having a processor and a memory having stored thereon instructions that when executed by said processor perform the steps of: determining an axial position of the ultrasound probe unit relative to the extremity of the anatomical feature based on an ultrasound speed value indicative of a speed at which the portion of the ultrasound signal travels along the anatomical feature and on a time duration elapsed between the emitting and the detecting. In some embodiments, the ultrasound speed value is measured in situ based on measurements performed by at least two ultrasound probe units axially spaced-apart from one another along the anatomical axis.
In this specification, the term “reference marker” is intended to mean an active or passive marker, such as an emitter, a transmitter or a reflector.
Further features and advantages of the present disclosure will become apparent from the following detailed description, taken in combination with the appended drawings, in which:
Many further features and combinations thereof concerning the present improvements will become apparent to those skilled in the art following a reading of the instant disclosure.
Referring to the drawings and more particularly to
As the CAS tracking system 1 operates, the controller 2 may receive, generate and transfer at least some of the datasets D associated with the objects O, which may be of various types of information including spatial (e.g., position, orientation), surfacic and volumetric. The controller 2 may also be used to derive information from such datasets D, which may include modifying and/or combining some such datasets D. The controller 2 may be capable of parsing or otherwise registering any such dataset D so as to interpret it in terms of an object-agnostic coordinate system, which may be referred to as a reference coordinate system R, a frame of reference, a referential system, etc. It should be noted that the controller 2 may relate some of the datasets D to one another. For example, a first dataset Da and a second dataset Db may respectively represent the position and orientation of a first object Oa and of a second object Ob according to corresponding first Ra and second Rb coordinate systems, or in a common global coordinate system R. The controller 2 may be configured to parse the second dataset Db so as to interpret it as if it were defined according to another coordinate system, for example the first coordinate system Ra or the reference coordinate system R. In some embodiments, the first dataset Da may be interpreted in a first coordinate system Ra and may be modified by registering it into another coordinate system such as the reference coordinate system R, for instance. In other words, the first and second datasets may be registered to one another in a common coordinate system, i.e., the reference coordinate system R.
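By way of a non-limiting illustration, registering a dataset expressed in one coordinate system (e.g., Rb) into the reference coordinate system R may be sketched as a rigid homogeneous transform; the rotation, translation and point values below are hypothetical, and the function names are illustrative rather than part of the disclosed controller 2.

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def register(points, T_source_to_ref):
    """Re-express an N x 3 point dataset in the reference coordinate system R."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return (T_source_to_ref @ homogeneous.T).T[:, :3]

# Dataset Db expressed in coordinate system Rb: two points on an object Ob.
Db = np.array([[10.0, 0.0, 0.0],
               [10.0, 5.0, 0.0]])

# Hypothetical pose of Rb in R: a 90-degree rotation about Z plus a translation.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
T_Rb_to_R = make_transform(Rz, np.array([1.0, 2.0, 3.0]))

Db_in_R = register(Db, T_Rb_to_R)
```

Relating two datasets to one another then amounts to applying each dataset's own transform before comparing them in the common frame R.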
The CAS tracking system 1 of
In terms of input, the CAS tracking system 1 may have access to some of datasets D in the form of digital models for the various objects O, such as anatomical models including, but not limited to, arm model(s), bone model(s), artery model(s), vein model(s), nerve model(s) and model(s) of any other anatomical feature(s) of interest. Such anatomical models may consist of datasets containing information relating to surfacic and/or volumetric characteristics of corresponding anatomical features, such as a bone. The anatomical models may be patient-specific, and may have been obtained pre-operatively or peri-operatively, using various imaging modalities. For example, the anatomical models may have been generated from magnetic resonance imagery, radiography in its various forms, or ultrasound imaging, to name a few examples. As the case may be, a dataset Dc corresponding to a bone model may be defined according to a coordinate system Rc consistent with imaging conventions, e.g., with the X, Y and Z axes corresponding to the anterior, left lateral and cranial directions upon the patient lying supine. In some embodiments, the bone models may be obtained from a bone atlas or other suitable source based on factors such as gender, race, age, etc., for example as described in U.S. Pat. Nos. 8,884,618 and 10,130,478, the contents of both of which are incorporated herein by reference. In some such embodiments, the bone models may be digitally fitted to patient-specific data, such as, for instance, partial yet strategically selected data points, such as for joint surfaces of a bone, while a bone shaft may be taken from a bone atlas, for example. Such processing of the models may be carried out remotely or locally (e.g., via the controller 2). Likewise, storage of the bone models and/or models of any other anatomical features may be implemented remotely or locally (i.e., via a computer-readable memory).
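As a non-limiting, deliberately simplified illustration of digitally fitting an atlas model to patient-specific data points, a least-squares uniform scale factor may be computed; actual fitting would typically involve full rigid or deformable registration, and the point values and function name below are hypothetical.

```python
import numpy as np

def fit_scale(atlas_points, patient_points):
    """Least-squares uniform scale s minimizing ||s * atlas - patient||^2,
    a toy stand-in for fitting an atlas bone model to patient-specific
    data points (e.g., strategically selected joint-surface points)."""
    a = atlas_points.ravel()
    p = patient_points.ravel()
    return float(a @ p) / float(a @ a)

# Hypothetical atlas points and corresponding patient-specific points (mm).
atlas = np.array([[1.0, 0.0, 0.0],
                  [0.0, 2.0, 0.0]])
patient = np.array([[1.1, 0.0, 0.0],
                    [0.0, 2.2, 0.0]])

s = fit_scale(atlas, patient)
```

The scaled atlas shaft could then be combined with the measured joint-surface points to form a patient-specific bone model.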
Moreover, the CAS tracking system 1 has an ultrasound imaging system 6 configured to generate some of the datasets D. The ultrasound imaging system 6 includes at least one ultrasound probe unit 6a provided for producing signals indicative of characteristics pertaining to the objects O. The resulting signals may be communicated from an ultrasound imaging device 6b to the controller 2 to be processed into corresponding datasets D. Alternatively, the signals may be processed by device-specific processing units, such that the corresponding datasets D may be received by the controller 2 instead of the signals. The ultrasound imaging system 6 may be said to be modular as it can include a plurality of ultrasound probe unit(s) 6a and/or ultrasound imaging device(s) 6b.
Further, the CAS tracking system 1 may be configured such that the outputting of at least some of the navigation data from the controller 2 is timed with inputting of at least some of the datasets D into the controller 2. The CAS tracking system 1 may thus be said to provide the navigation data in real time or near real time.
In accordance with some embodiments, a coordinate tracking system 8 is provided as part of the CAS tracking system 1. The coordinate tracking system 8 may include one or more coordinate tracking devices 8a including, for instance, a camera that tracks marker reference(s) 8b. The coordinate tracking system 8 can use either active or passive spatial references as markers of position and/or orientation. For example, as is known in the art, the coordinate tracking system 8 may optically see and recognize retroreflective devices as reference markers, so as to track objects, for example tools and limbs, in six degrees of freedom (DOFs), namely in position and orientation along the X, Y and Z axes. Thus, the orientation and position of the limb in space can be determined using the information obtained from the spatial references, resulting in a corresponding dataset (for example, the dataset Db) that is defined according to a corresponding coordinate system (for example the coordinate system Rb), which may in some cases be inherent to a reference marker or to the ultrasound probe unit 6a used therewith. The coordinate tracking system 8 may also include a dedicated computing device used to condition, digitize and/or otherwise process the signal produced by the camera. The coordinate tracking device 8a may be a 3D depth camera as a possibility (e.g., a Kinect™), that may not require passive spatial references as markers of position and/or orientation. Other 3D cameras can be used in other embodiments. For instance, the coordinate tracking device 8a may include conventional two-dimensional camera(s) (operated in mono- or stereo-vision configuration) operated with a shape recognition module which identifies, locates and processes two-dimensional identifiers (e.g., QR codes) as imaged in the two-dimensional images generated by the two-dimensional camera(s). 
In these embodiments, the shape recognition module can evaluate the distortion of the two-dimensional identifiers in the two-dimensional images (e.g., a square identifier becoming trapezoidal when bent) to retrieve three-dimensional model(s) of the two-dimensional identifiers and/or of the underlying anatomical feature.
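As a non-limiting illustration of evaluating the distortion of a two-dimensional identifier, the tilt of a planar marker may be estimated from its foreshortened height; this assumes an orthographic-projection approximation rather than the full perspective homography a shape recognition module would typically use, and the dimensions below are hypothetical.

```python
import math

def tilt_from_foreshortening(true_height_mm, imaged_height_mm):
    """Estimate the tilt angle (radians) of a planar marker from its
    foreshortened height, under an orthographic-projection approximation:
    imaged_height = true_height * cos(tilt)."""
    ratio = imaged_height_mm / true_height_mm
    ratio = max(-1.0, min(1.0, ratio))  # guard against measurement noise
    return math.acos(ratio)

# A 20 mm square identifier imaged with a 10 mm height suggests ~60 degrees of tilt.
angle = tilt_from_foreshortening(20.0, 10.0)
```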
In some embodiments, the ultrasound imaging system 6 is used to produce a signal indicative of at least one spatial and/or dimensional characteristic relating to biological tissue. According to conventional ultrasound-based detection principles, as typically employed in conventional ultrasound probe units, an ultrasound emitter may be used to cast a sound wave and, upon an object being located within range of the ultrasound emitter, an echo of the sound wave is cast back to be sensed by an ultrasound sensor. In some embodiments, the ultrasound emitter and the ultrasound sensor may be separate from one another. However, in some other embodiments, the ultrasound emitter and the ultrasound sensor may be combined with one another in an ultrasound transducer performing both the ultrasound emission and the sensing functions. The echo may materialize upon the sound wave, travelling through a first medium such as skin, reaching a second medium of greater density compared to that of the first medium, such as a bone. As the speeds at which the sound waves may travel through various media depend on the respective physical properties of such media, characteristics of the echo (e.g., time elapsed between emission of the sound wave and the sensing of the echo, intensity of the echo relative to that of the sound wave, etc.) may be used to derive certain characteristics of the media through which the echo has travelled. In some embodiments, the functions of both the ultrasound emitter and the ultrasound sensor are performed by one or more ultrasound transducers. In some embodiments, the ultrasound transducer may have one or more piezoelectric crystals emitting ultrasound signals based on corresponding electrical signals, and/or generating electrical signals based on received ultrasound signals.
Any suitable type of ultrasound transducers can be used including, but not limited to, piezoelectric polymer-based ultrasound transducers such as poly(vinylidene fluoride)-based ultrasound transducers, capacitive ultrasound transducers, microelectromechanical systems (MEMS) based ultrasound transducers and the like.
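By way of a non-limiting illustration, converting an echo's round-trip time of flight into a one-way depth follows directly from the principles above; the default speed of 1540 m/s is a commonly used average for soft tissue, and the numerical values are hypothetical.

```python
def echo_depth_mm(time_of_flight_s, speed_m_per_s=1540.0):
    """Convert a round-trip echo time into a one-way depth in millimetres.
    The wave travels to the reflecting interface and back, hence the
    division by two; 1540 m/s is a typical average speed in soft tissue."""
    return (speed_m_per_s * time_of_flight_s / 2.0) * 1000.0

# A 26 microsecond round trip at 1540 m/s corresponds to about 20 mm of tissue.
depth = echo_depth_mm(26e-6)
```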
Per the present disclosure, in the exemplary case of orthopedic surgery for instance, the ultrasound imaging system 6 may be configured to produce a signal indicative of a detailed spatial relationship between the ultrasound probe unit 6a and a limb (which may be one being tracked by the coordinate tracking system 8), and also between constituents of the limb such as soft tissue (e.g., skin, flesh, muscle, ligament) and bone. Resulting datasets may include measurements of a distance between contours associated with the limb, such as an epithelial contour associated with skin and a periosteal contour associated with the bone. The resulting datasets may also include measurements of thicknesses, surfaces, volumes, medium density and the like. Advantageously, updated signal production via the ultrasound imaging system 6 and ad hoc, quasi-real-time processing may produce datasets which take into account movement and/or deformation of one or more of the constituents of the limb. The ultrasound imaging system 6 may also include a dedicated computing device configured for conditioning and/or digitizing the signal.
In some implementations, the ultrasound imaging system 6 may be suitable for producing a signal indicative of surfacic, volumetric and even mechanical properties of the objects O to be tracked by the CAS tracking system 1. This may be achieved, for instance, by way of a multi-planar ultrasound system capable of operating simultaneously along multiple notional planes that are spaced and/or angled relative to one another, coupled to a suitably configured controller 2. Further, it is contemplated that other types of imaging systems, such as an optical coherence tomography (OCT) system, may be used in combination with the ultrasound imaging system 6. The type of additional imaging system may be selected, and combined with other type(s) as the case may be, to attain certain performance requirements in terms of effective range, effective depth, signal-to-noise ratio, signal acquisition frequency, contrast resolution and scale, spatial resolution, etc., among other possibilities. In some embodiments, partially exposed bone structures may be captured and/or referenced by the additional imaging system at any time before, during or after the surgery. Specifications of such imaging systems may thus be adapted, to some degree, based on requirements derived from typical characteristics of the objects O to be tracked.
As will be described in view of the above, a precise tracking of bone may be achieved using the CAS tracking system 1, regardless of certain materials, such as soft tissue, being overlaid thereon.
The CAS tracking system 1, and specifically the coordinate tracking system 8, may be well suited to track ultrasound tracking devices 10 shown in
According to an embodiment, an ultrasound tracking device 10 is of the type that attaches to a limb of a patient, and is used to track an axis of the limb. For this purpose, the ultrasound tracking device 10 has a wearable holder 12, ultrasound probe units 14, and may have another trackable reference 16.
The wearable holder 12 is of the type that is mounted about the outer-skin surface S (a.k.a., exposed skin, epidermis, external soft tissue, etc.) of an anatomical feature, such as but not limited to a thigh with femur F or a shank with tibia T of a patient. The wearable holder 12 and the CAS tracking system using it, as will be described herein and as an example the ultrasound imaging system 6 of
F and tibia T in
While the bone may be described herein as “underlying” the outer-skin surface S, it is to be understood that this does not exclude the possibility that certain portions of the bone may be at least partially exposed during surgery (e.g., by incisions, etc.), nor does this require or imply that the entirety of the bone must necessarily be unexposed and subcutaneous at all times, for instance in the case of laparoscopic procedures.
The ultrasound tracking device 10 including the wearable holder 12 is configured to be grossly secured to the anatomical feature against which it is mounted in such a way that there is a tolerable movement between the holder 12 and the anatomical feature. Algorithms can detect and compensate for movement using ultrasound processing combined with the optical tracking system. The position and the orientation of the holder 12 may also be trackable through space by the CAS tracking system, whereby a tracking of the anatomical feature can be derived from a tracking of the ultrasound tracking device 10. The ultrasound tracking device 10 is therefore a non-invasive tool to be used to track the position and the orientation, and thus the movement, of the bone through space before, during or after the computer-assisted surgery, for instance relative to a global referential system.
The wearable holder 12 of the ultrasound tracking device 10 can take different forms to accomplish such functionality. In the depicted embodiment, the wearable holder 12 is in the form of a belt, ring, vest or strap that is mounted to an anatomical feature (e.g., a leg and the underlying femur or tibia, etc.) of the patient to be in fixed relative relationship with the bone. In an alternate embodiment, the wearable holder 12 is in the form of a tight-fitting sleeve that is mounted to an anatomical feature of the patient to be in fixed relative relationship with the bone. Similarly, the wearable holder 12 is mountable about other limbs, appendages, or other anatomical features of the patient having a bone to be tracked. The wearable holder 12 may essentially be a pressurized band around the limb to enhance contact. It is also considered to use a gel conforming pad to couple the holder 12 to the skin, as a possibility. Traditional coupling gel can also be used. In some embodiments, coupling gel of typical formulations as well as biocompatible gel (e.g., in vivo biocompatible or in vivo bioexcretable) can be used. The gel conforming pad may include acoustically transmissive material which can help the transmission of the ultrasound signals and returning echo signals thereacross. In another embodiment, the wearable holder 12 is in the form of a boot, a glove or a corset (in the thoraco-lumbar region). The wearable holder 12 may be annular and arranged to be at an axial location corresponding to a slice of the bone, to which the bone axis is normal, in a first scenario. However, it is also considered to have the holder 12 angled, in such a way that the bone axis is not normal to a plane passing through the holder 12. In doing so, the ultrasound tracking device 10 may produce a greater axial coverage of bone surface than for the first scenario.
Other embodiments can also include independently placed sensors that are disposed in non-planar but relevant scanning positions to obtain usable datasets.
Ultrasound probe units 14 are secured to the wearable holder 12. In an embodiment, the ultrasound probe units 14 include one or more transducers that emit an ultrasound wave and measure the time it takes for the wave to echo off of a hard surface (such as bone) and return to the face(s) of the transducer(s). In order to self-calibrate for the patient's individual speed of sound, some transducers are positioned very accurately relative to others; as one emits waves, the others listen, and the speed of sound can be computed from the known relative geometric positioning. Using the known speed of the ultrasound wave travelling through bodily media, the time measurement is translated into a distance measurement between the ultrasound probe unit 14 and the bone located below the outer-skin surface S. The transducers in the probe units 14 may be single-element or multi-element transducers, or a combination of both. For example, the probe units 14 may have multiple elements arranged in a phased array, i.e., phased-array ultrasound probe units 14, having the capacity of performing multi-element wave generation for sound wave direction control and signal reconstruction. In some embodiments, the phased-array ultrasound probe unit 14 has a single ultrasound transducer operating in a phased-array arrangement. When sensors are not rigidly linked to others, the relative position can be found with self-location algorithms. Therefore, the probe units 14 used in the manner shown in
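The self-calibration described above may be sketched, in a non-limiting way, as follows: one transducer emits while another at a precisely known spacing listens, yielding an in-situ speed of sound that is then reused to translate an echo time into a probe-to-bone distance. The spacing and timing values are hypothetical, and the function names are illustrative only.

```python
def calibrate_speed(spacing_m, transit_time_s):
    """In-situ speed of sound from two transducers at a precisely known
    spacing: one emits, the other listens, and speed = distance / time."""
    return spacing_m / transit_time_s

def bone_distance_mm(round_trip_s, speed_m_per_s):
    """One-way probe-to-bone distance (mm) from a round-trip echo time."""
    return speed_m_per_s * round_trip_s / 2.0 * 1000.0

# Hypothetical values: 30 mm spacing traversed in 20 microseconds -> 1500 m/s.
v = calibrate_speed(0.030, 20e-6)
# A 40 microsecond round-trip echo at that speed -> 30 mm to the bone surface.
d = bone_distance_mm(40e-6, v)
```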
A set of two or more ultrasound probe units 14 may be needed to determine the anatomical axis, as illustrated in
In this specific embodiment, the wearable holder 12 has an open cuff body 17 being wrappable about the patient's thigh T for the ultrasound imaging of the femur F, for instance. The wearable holder 12 also has an elongated member 19 extending axially from a circumference of the open cuff body 17, in a manner parallel to the anatomical axis A, for example. As shown, both the open cuff body 17 and the elongated member 19 have respective ultrasound probe units 14 that are embedded therein. The ultrasound probe units 14 are radially inwardly oriented. In this embodiment, the open cuff body 17 has two axially spaced-apart arrays of circumferentially distributed ultrasound probe units 14, hence providing axial coverage. However, in some other embodiments, only one or more than two of these arrays can be provided. Similarly, more than one array of axially distributed ultrasound probe units 14 can be embedded in the elongated member 19 in some other embodiments. As can be appreciated, each of the ultrasound probe units 14 can be used either as an ultrasound transmitter or as an ultrasound receiver, or both, depending on the embodiment. In some embodiments, the wearable holder 12 can have an adjustment mechanism 21 allowing the distance between the ultrasound probe units 14 of the elongated member 19 and the ultrasound probe units 14 of the open cuff body 17 to be adjusted, e.g., increased or decreased. The adjustment mechanism 21 can be telescopic in some embodiments, and it can be encoded to monitor the distance between the open cuff body 17 and the elongated member 19.
As best shown in
In some embodiments, two axially spaced-apart ultrasound probe units 14 can be used to determine the ultrasound speed of either one of the guided ultrasound signal portions 23 and 25. In some embodiments, the ultrasound speed V can be determined based on the equation V=D1/T1, where D1 denotes an axial distance separating the two axially spaced-apart ultrasound probe units 14 and T1 denotes a first time duration elapsed between the detection of the guided ultrasound signal portion by a first one of the ultrasound probe units 14 and the detection of the guided ultrasound signal portion by the second one of the ultrasound probe units 14. In some other embodiments, the ultrasound speed V is not necessarily a measured ultrasound speed but rather a reference ultrasound speed retrieved from an accessible computer memory where it is stored. In either case, the ultrasound speed V of the guided ultrasound signal portion can be useful to determine the axial position of the ultrasound tracking device 10 with respect to the femur F. To do so, one may monitor a second time duration T2 elapsed between the generation of the ultrasound signal portion 23 within the patient's thigh T by a given ultrasound probe unit, and the detection of the reflected ultrasound signal portion 25 by the same ultrasound probe unit, or by another ultrasound probe unit sharing the same axial position along the femur F. By correlating the second time duration T2 to the ultrasound speed V discussed above, a propagation distance D2 can be determined. This propagation distance D2 can be indicative of the axial position of the ultrasound tracking device 10 along the patient's thigh T during the surgery. More specifically, the propagation distance D2 travelled by the ultrasound signal guided along the femur F may be given by D2=V×T2.
In these embodiments, the axial position Pa of the ultrasound tracking device 10 with respect to the extremity E of the femur F would be half that propagation distance D2, i.e., Pa=D2/2. Accordingly, in these embodiments, the tracking of the ultrasound tracking device 10 using the trackable reference 16 may be omitted, as the position of the ultrasound tracking device 10 can be otherwise tracked using ultrasound signals guided to and from the femur F, or any other bone under surgery. In some embodiments, additional computations and/or referencing may be required to distinguish which bone extremity reflects first. In these embodiments, determining which bone extremity reflects first or second can help to correctly position the bone or other anatomical feature in the reference coordinate system.
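The chain of equations above, V=D1/T1, D2=V×T2 and Pa=D2/2, may be sketched in a non-limiting way as follows; the probe spacing and timing values are hypothetical (the 4000 m/s result is merely illustrative of guided-wave speeds in bone being higher than in soft tissue).

```python
def axial_position_mm(d1_m, t1_s, t2_s):
    """Axial position Pa of the tracking device relative to the bone extremity:
    V = D1 / T1   (in-situ guided-wave speed from two probes D1 apart),
    D2 = V * T2   (round-trip distance to and from the extremity),
    Pa = D2 / 2   (one-way axial position), returned in millimetres."""
    v = d1_m / t1_s
    d2 = v * t2_s
    return d2 / 2.0 * 1000.0

# Hypothetical values: probes 40 mm apart, 10 microsecond inter-probe transit,
# 100 microsecond round trip to the extremity -> V = 4000 m/s, Pa = 200 mm.
pa = axial_position_mm(0.040, 10e-6, 100e-6)
```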
Referring to
Referring to
Referring to
In the embodiment of
A reverse arrangement may be possible for
Referring to
Still further in the embodiment of
In the embodiment of
In the embodiment of
For the embodiments of
Referring to
In
In the embodiment of
Although the position and orientation of the ultrasound probe units 70 are fixed in the embodiment described with reference to
Referring now to
As best shown in
In the illustrated embodiment, the ultrasound imaging system 802 has at least two spaced-apart ultrasound probe units 820A and 820B each proximate a respective one of the portions 810a and 810b of the anatomical feature 810 of interest. The ultrasound probe units 820A and 820B are shown spaced from the limb, but can be in contact with the soft tissue, such as in the embodiments of
The imaged echo datasets D1 and D2 can be expressed in any suitable format. Examples of such imaged echo datasets D1 and D2 are plotted at
Referring back to
Examples of some coordinate tracking systems 804 are described below.
It is noted that, in view of the above, coordinates of the ultrasound imaging system 802 are tracked during its operation. Accordingly, the controller 806 is configured for registering the imaged echo datasets D to one another in a common coordinate system X, Y, Z based on the coordinate datasets generated by the coordinate tracking system 804. Such registered imaged echo datasets are schematically illustrated in
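As a non-limiting illustration of this registration step, echo point sets from two tracked probe units may be mapped into the common coordinate system X, Y, Z using the probe poses reported by the coordinate tracking system; the poses and bone-surface points below are hypothetical.

```python
import numpy as np

def to_common_frame(points_probe, T_probe_to_common):
    """Map an N x 3 echo point set from a probe frame into the common
    X, Y, Z frame using a 4x4 homogeneous pose transform."""
    h = np.hstack([points_probe, np.ones((len(points_probe), 1))])
    return (T_probe_to_common @ h.T).T[:, :3]

# Hypothetical poses reported by the coordinate tracking system for two probes:
T_A = np.eye(4)                       # first probe at the common-frame origin
T_B = np.eye(4)
T_B[:3, 3] = [0.0, 0.0, 150.0]        # second probe 150 mm further along Z

D1 = np.array([[0.0, 25.0, 0.0]])     # bone-surface point seen by the first probe
D2 = np.array([[0.0, 24.0, 0.0]])     # bone-surface point seen by the second probe

# Registered imaged echo datasets, now expressed in the same frame.
registered = np.vstack([to_common_frame(D1, T_A), to_common_frame(D2, T_B)])
```

Once in the common frame, the registered points can be matched against a bone model to derive the anatomical feature's position and orientation.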
The controller 806 can be provided as a combination of hardware and software components. The hardware components can be implemented in the form of a computing device 900, an example of which is described with reference to
Referring to
The processor 902 can be, for example, a general-purpose microprocessor or microcontroller, a digital signal processing (DSP) processor, an integrated circuit, a field-programmable gate array (FPGA), a reconfigurable processor, a programmable read-only memory (PROM), a system on a chip, an embedded controller, or any combination thereof.
The memory 904 can include a suitable combination of any type of computer-readable memory that is located either internally or externally such as, for example, random-access memory (RAM), read-only memory (ROM), compact disc read-only memory (CDROM), electro-optical memory, magneto-optical memory, erasable programmable read-only memory (EPROM), and electrically-erasable programmable read-only memory (EEPROM), Ferroelectric RAM (FRAM) or the like.
Each I/O interface 906 enables the computing device 900 to interconnect with one or more input devices, such as the ultrasound imaging system or its ultrasound probe unit(s), the coordinate tracking system, keyboard(s), mouse(s), or with one or more output devices such as display screen(s), computer memory(ies), network(s), and the like.
Each I/O interface 906 enables the controller 806 to communicate with other components, to exchange data with other components, to access and connect to network resources, to serve applications, and to perform other computing applications by connecting to a network (or multiple networks) capable of carrying data including the Internet, Ethernet, plain old telephone service (POTS) line, public switched telephone network (PSTN), integrated services digital network (ISDN), digital subscriber line (DSL), coaxial cable, fiber optics, satellite, mobile, wireless (e.g., Wi-Fi, WiMAX, Zigbee), SS7 signaling network, fixed line, local area network, wide area network, and others, including any combination of these.
It is intended that the instructions 908 may be executed to receive the imaged echo dataset(s), to receive the coordinate dataset(s), to register the received imaged echo datasets in the common coordinate system X, Y, Z, and to track the position and orientation of the anatomical feature. In some embodiments, a software application to execute the instructions 908 is stored on the memory 904 and accessible by the processor 902 of the computing device 900. In some embodiments, the software application can be locally or remotely stored.
The computing device 900 and the instructions to be executed described above are meant to be examples only. Other suitable embodiments of the controller 806 can also be provided, as it will be apparent to the skilled reader.
At step 1002, ultrasound signals are successively emitted towards different portions of the anatomical feature of interest. The ultrasound signals may be emitted by a single phased-array ultrasound probe unit in some embodiments, while they may be emitted by different phased-array ultrasound probe units in some other embodiments. The emission of the ultrasound signals may therefore be simultaneous or sequential depending on the embodiment.
At step 1004, echo signals returning from the portions of the anatomical feature are received and measured. The returning echo signals may be measured by a single ultrasound probe unit in some embodiments, while they may be measured by different ultrasound probe units in some other embodiments. Accordingly, the measurement of the returning echo signals may be simultaneous or sequential depending on the embodiment.
At step 1006, imaged echo datasets associated with each measured echo signal are generated. In a variant, the imaged echo datasets are generated by one or more ultrasound imaging device(s) of the ultrasound imaging system which is communicatively coupled to the ultrasound probe unit(s). As such, the signal measured by the ultrasound probe unit(s) may be converted into the imaged echo datasets by the ultrasound imaging device(s) upon reception or subsequently, depending on the embodiment. The imaged echo datasets may be stored on a non-transitory computer-readable memory so as to be accessible by a processor of the controller at a subsequent step of the method 1000. Each ultrasound probe unit may have its corresponding ultrasound imaging device in some embodiments. In some other embodiments, a single ultrasound imaging device may be communicatively coupled to the ultrasound probe units. The imaged echo datasets may have corresponding probe unit stamps identifying their source ultrasound probe unit. The imaged echo datasets may have corresponding time stamps identifying at what moment in time the echo signals they represent have been measured.
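A stamped imaged echo dataset of the kind described above may be sketched, for illustration only (the class and field names are hypothetical, not part of the disclosure), as a simple record carrying the image data, a probe unit stamp, and a time stamp:

```python
from dataclasses import dataclass
import time

@dataclass
class ImagedEchoDataset:
    """One imaged echo dataset, stamped with its source ultrasound
    probe unit and the moment the echo signal was measured."""
    image: list            # e.g., pixel intensities of one imaged frame
    probe_unit_id: str     # probe unit stamp, e.g., "820A"
    timestamp: float       # time stamp, in seconds

# A frame measured by probe unit 820A, stamped at acquisition time:
frame = ImagedEchoDataset(image=[[0, 1], [2, 3]],
                          probe_unit_id="820A",
                          timestamp=time.time())
```

Stamping each dataset at generation time is what later allows the controller to pair it with the coordinate dataset measured at the same moment, as discussed for steps 1010 and 1012.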
As discussed above, the steps 1002, 1004 and 1006 may be performed simultaneously or sequentially, depending on whether the ultrasound imaging system used has one or more ultrasound probe units and one or more ultrasound imaging devices.
At step 1008, optionally, coordinates of the ultrasound imaging system are tracked during at least the measurement step 1004. The coordinates of the ultrasound imaging system may be tracked only during any one of the steps 1002, 1004 and 1006 or during all of the steps 1002, 1004 and 1006 depending on the embodiment. In some embodiments, the coordinates of the ultrasound imaging system are tracked in a continuous manner during a computer-assisted surgery, concurrently with the steps 1002 and/or 1004. Steps 1002, 1004, 1006 may occur in real-time or quasi-real-time. However, in some other embodiments, it may be preferred to track the ultrasound imaging system only at specific moments in time. In the latter embodiments, the ultrasound imaging system and the coordinate tracking device may be synchronized to one another. As such, the coordinate tracking device may be triggered shortly before, during, and/or after the operation of the ultrasound imaging system.
In step 1010, coordinate datasets indicative of the tracked coordinates of the ultrasound imaging system are generated. The coordinate datasets may have corresponding probe unit stamps identifying which one of the ultrasound probe units is tracked. The coordinate datasets may have corresponding time stamps identifying at what moment in time the ultrasound probe units have been tracked. As such, it can be possible to find a correspondence between the imaged echo datasets and the coordinate datasets. Each coordinate dataset may include information relating to the geometric relationship between the portion of the ultrasound probe unit that is being tracked and the plane along which the ultrasound imaging is performed. For instance, in embodiments where a reference marker is mounted on a top of an ultrasound probe unit, the geometric relationship can be based on the length of the ultrasound probe unit, the length at which the reference marker extends from the top of the ultrasound probe unit, the shape of the ultrasound probe unit, and the like.
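The correspondence between imaged echo datasets and coordinate datasets via their probe unit stamps and time stamps may be sketched as follows. This is a minimal illustration under assumed names (each dataset is represented here as a dictionary with `probe_unit_id` and `timestamp` keys; the tolerance value is arbitrary):

```python
def match_datasets(echo_sets, coord_sets, max_dt=0.05):
    """Pair each imaged echo dataset with the coordinate dataset from
    the same probe unit whose time stamp is closest, within max_dt
    seconds; unmatched echo datasets are left out."""
    pairs = []
    for echo in echo_sets:
        # Probe unit stamps restrict candidates to the tracked unit.
        candidates = [c for c in coord_sets
                      if c["probe_unit_id"] == echo["probe_unit_id"]]
        if not candidates:
            continue
        # Time stamps pick the pose measured nearest the echo.
        best = min(candidates,
                   key=lambda c: abs(c["timestamp"] - echo["timestamp"]))
        if abs(best["timestamp"] - echo["timestamp"]) <= max_dt:
            pairs.append((echo, best))
    return pairs

echoes = [{"probe_unit_id": "820A", "timestamp": 1.00},
          {"probe_unit_id": "820B", "timestamp": 1.01}]
coords = [{"probe_unit_id": "820A", "timestamp": 1.02},
          {"probe_unit_id": "820B", "timestamp": 1.00}]
pairs = match_datasets(echoes, coords)
```

Each resulting pair provides an imaged echo dataset together with the probe pose under which it was measured, which is precisely what the registration step 1012 consumes.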
At step 1012, the imaged echo datasets are registered to one another in a common coordinate system based on the coordinate datasets. In some embodiments, the registering may involve the use of the probe unit stamps and/or the time stamps so as to ensure that the corresponding imaged echo datasets be registered in the common coordinate system using the corresponding coordinate dataset. If the anatomical features are fixed in the coordinate system, and the ultrasound units are also fixed, the steps 1008 and 1010 may be done punctually, at intervals, etc.
At step 1014, the position and orientation of the anatomical feature are tracked based on the registering step 1012. The tracking of the anatomical feature can include a step of calculating an axis of the tracked anatomical feature based on its position and orientation.
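One possible way to calculate such an axis from the registered data, offered as an illustrative sketch only (the disclosure does not prescribe a particular fitting method), is a least-squares line fit through the registered points of the tracked feature, with the axis taken as the principal direction obtained by singular value decomposition:

```python
import numpy as np

def anatomical_axis(points):
    """Estimate an axis of the tracked anatomical feature as the
    principal direction of its registered points (least-squares line
    fit via SVD). Returns (point on axis, unit direction)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]                      # dominant direction of spread
    return centroid, direction / np.linalg.norm(direction)

# Points spread along Z yield an axis direction close to (0, 0, 1):
c, d = anatomical_axis([[0.0, 0.0, 0.0], [0.1, 0.0, 10.0],
                        [-0.1, 0.0, 20.0], [0.0, 0.0, 30.0]])
```

The sign of the direction vector is arbitrary in this sketch; an actual system would orient it consistently, e.g., from proximal to distal.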
In some embodiments, the step 1012 of registering includes a step of generating an anatomical feature model representative of the anatomical feature based at least on the imaged echo datasets and corresponding coordinate datasets. The anatomical feature model can therefore be generated on the go in the coordinate system X, Y and Z. As such, the anatomical feature model can be incrementally improved as more imaged echo datasets and corresponding coordinate datasets are received over time.
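The incremental character of such a model may be sketched as follows; this is an illustrative accumulation scheme (the class name, the voxel representation, and the resolution value are assumptions, not part of the disclosure):

```python
class IncrementalFeatureModel:
    """Anatomical feature model built on the go: each newly registered
    imaged echo dataset contributes points in the common coordinate
    system, gradually extending the model's coverage."""

    def __init__(self, voxel_mm=1.0):
        self.voxel_mm = voxel_mm
        self.voxels = set()

    def add_registered_dataset(self, world_points):
        # Quantize registered points to voxels; revisited regions do
        # not inflate the model, only new regions extend it.
        for x, y, z in world_points:
            self.voxels.add((round(x / self.voxel_mm),
                             round(y / self.voxel_mm),
                             round(z / self.voxel_mm)))

    @property
    def coverage(self):
        return len(self.voxels)

model = IncrementalFeatureModel(voxel_mm=1.0)
model.add_registered_dataset([(0.0, 0.0, 0.0), (0.0, 0.0, 1.2)])
model.add_registered_dataset([(0.0, 0.0, 1.2), (0.0, 0.0, 2.6)])  # overlap ignored
```

In this sketch, the second dataset overlaps the first at one location, so only its new point increases the model's coverage, mirroring the incremental improvement described above.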
In some embodiments, the step of generating the anatomical feature model can include a step of accessing a reference model base. For instance, a reference model base may be selected in a database comprising different reference model bases such as a tibia model base, a femur model base, a spine model base including, e.g., a sacral, lumbar, thoracic or cervical model base(s), a shoulder joint model base, a humerus model base, a scapula model base, a forearm model base, a pelvis model base, an elbow joint model base and the like. In embodiments where the femur is under surgery, for instance, a reference model base associated with the femur may be selected. As such, the femur reference model base may be positioned and oriented in the coordinate system based on the imaged echo datasets and corresponding coordinate datasets. In some other embodiments, the step of accessing a model base can include a step of fetching a patient-specific model which can be based on pre-operative or peri-operative images of the anatomical feature of a given patient obtained using various imaging modalities. For example, the patient-specific anatomical models may have been generated from magnetic resonance imagery, or from radiography in its various forms, as possibilities. In these embodiments, the patient-specific model may be positioned and oriented in the coordinate system based on the imaged echo datasets. Generic models may be used as well, such as those from a bone atlas.
Referring now to
In some embodiments, the ultrasound probe unit 1120 may be moved in any given pattern to map desired portions of the anatomical feature 1110, which is a spine in this specific embodiment. In some embodiments, the pattern may be arbitrary as long as all the desired portions of the anatomical feature 1110 have satisfactorily been probed.
In this specific embodiment, the imaged echo datasets can be registered to one another in the common coordinate system so as to construct an anatomical feature model in a gradual manner as the arbitrary scan is being performed by an operator, for instance. In some other embodiments, a generic or patient-specific spine base model may be retrieved to build thereon to increase resolution.
As can be expected, the position and orientation of the surgical tool holder 1142 are also tracked in this embodiment. Accordingly, once the position and orientation of the anatomical feature 1110 are suitably tracked by the ultrasound tracking system, the surgical tool holder 1142 holding a surgical tool may be moved as desired in the common coordinate system to perform at least some surgical steps, in some embodiments.
An acoustically transmissive material such as ultrasound gel 1148 may be applied onto the outer-skin surface S of the anatomical feature 1110. In these embodiments, the transmission of the ultrasound signal from the ultrasound probe unit 1120 to the outer-skin surface S, or of the echo signal from the outer-skin surface S back towards the ultrasound probe unit 1120, may be enhanced.
In some embodiments, the ultrasound probe unit 1120 and the surgical tool holder 1142 can be mounted to a frame 1150 in accordance with a known geometric relationship, which can ease the registering of the imaged echo datasets and the coordinate datasets in the common coordinate system X, Y, Z. If the patient and the frame 1150 are fixed in the coordinate system, the ultrasound probe unit 1120 and the surgical tool holder 1142 may not need to be tracked optically. In some embodiments, intraoperative imaging may be eliminated from the procedure as the ultrasound tracking system 1100 may be used to compute (or recompute) an X-ray like image from the imaged datasets generated by the ultrasound probe unit 1120 in real-time or quasi-real-time.
Referring now to
The mechanical coordinate tracking system 1204 has a frame 1250 to which the ultrasound probe unit 1220 is movably mounted via the carriage 1230 and which is fixedly mounted relative to the anatomical feature 1210. For instance, the frame 1250 may be directly fixed to the patient in some embodiments. The frame 1250 may be fixedly mounted to a table on which the patient lies in some other embodiments. The frame 1250 may be mounted to any other suitable structure as circumstances may dictate. The mechanical coordinate tracking system 1204 also has position sensors 1252 associated with the carriage 1230 and/or the frame 1250, to measure the movement of the ultrasound probe unit 1220 with respect to the frame 1250, especially the X-Y position. The position sensors 1252 may be range finders, encoders, linear actuators, etc., such that the X,Y coordinates of the ultrasound probe unit 1220 and the surgical tool and holder 1242 may be known relative to the frame 1250, and thus in the coordinate system. As observed, an inertial sensor unit 1252′ (e.g., iASSIST® sensor) may be present to determine the orientation of the surgical tool and holder 1242. In the illustrated embodiment, the surgical tool and holder 1242 is also mounted to the bracket 1225, so as to move concurrently with the ultrasound probe unit 1220, with the tool in the holder 1242 being in the field of imaging of the ultrasound probe unit 1220. For example, the surgical tool and holder 1242 may include a drill guide that can be tracked in position via the frame 1250 and position sensors 1252, and in orientation (phi, theta, rho) via the inertial sensor unit 1252′. Rotary encoders or like sensors can be an alternative to the inertial sensor unit 1252′. As observed, the position and orientation of the surgical tool and holder 1242 may be adjusted relative to the bracket 1225, by way of adjustable support 1254.
For instance, a drill guide of the surgical tool and holder 1242 may be moved axially, and its orientation can be adjusted via dials 1254′, as an option. In some embodiments, the rotation about the Y-axis happens between the adjustable support 1254 and the holder 1242, by way of a rotating mechanism (e.g., a rack and pinion assembly) controlled by the pair of aligned dials 1254′. By doing so, the holder 1242 may be tilted as desired. The rotation about the X-axis can happen between the bracket 1225 and the adjustable support 1254 through the single knob 1254′.
In some embodiments, the ultrasound probe unit 1220 may be moved in any given pattern to map desired portions of the anatomical feature 1210. In some embodiments, the pattern may be a raster scan pattern 1252″ in which the ultrasound probe unit 1220 is moved along a first direction in the X-axis via movement of the carriage 1230 relative to rails of the frame 1250, then incrementally along the Y-axis via the bracket 1225 and its cylindrical joint, then along a second direction opposite the first direction in the X-axis, and so forth, until all the desired portions of the anatomical feature 1210 have been satisfactorily probed. As can be expected, the position sensors 1252, which may be of the encoder type, can measure the movement of the ultrasound probe unit 1220 with respect to the frame 1250. As such, the echo signal datasets generated by the ultrasound probe unit 1220 can be registered to one another in the common coordinate system X, Y, Z based on the coordinate datasets measured by the position sensors 1252. The orientation of the drill guide of the surgical tool and holder 1242, or of any other tool, can be tracked in orientation, such as by the inertial sensor unit 1252′.
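The raster scan pattern 1252″ described above may be sketched, as a purely illustrative example of the carriage motion (the function name and step values are assumptions), as a boustrophedon sequence of X-Y positions:

```python
def raster_scan_positions(x_max, y_max, x_step, y_step):
    """Carriage positions for a raster scan: sweep along a first
    direction in the X-axis, increment along the Y-axis, sweep back
    along the opposite direction, and so forth, until the whole
    region of interest has been probed."""
    positions = []
    y, forward = 0.0, True
    while y <= y_max:
        xs = [i * x_step for i in range(int(x_max / x_step) + 1)]
        if not forward:
            xs = xs[::-1]                 # return sweep in -X direction
        positions.extend((x, y) for x in xs)
        forward = not forward
        y += y_step
    return positions

# A 2 x 1 region probed at unit steps: forward row, then return row.
path = raster_scan_positions(2, 1, 1, 1)
```

At each such position, the encoder-type position sensors 1252 report the X,Y coordinates used to register the echo signal datasets, as described above.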
In some embodiments, the acoustically transmissive material may not be provided in the form of gel. Indeed, the acoustically transmissive material can be provided in the form of one or more solid pieces of material. In some embodiments, the acoustically transmissive material may be provided in the form of a wearable element 1356, such as a vest or corset, to be worn by the patient during the computer-assisted surgery. As depicted in this embodiment, the wearable element 1356 has one or more surgery openings 1364 allowing access for the surgical tool and holder 1342 to interact with the anatomical feature 1310, or its surroundings, without risks of acoustically transmissive gel reaching the inner body during the computer-assisted surgery. An example of such a wearable element 1356 is shown in
In some embodiments, for instance with reference to
In some embodiments, wearable element(s) can be provided to help shoulder or hip surgeries as well. An example of such a wearable element 1456 used in shoulder surgeries is shown in
In this embodiment, trackable references 1416 fixed to a reference bed can help track the positioning of the patient relative to reference bed 1429. In some embodiments, trackable references 1416 are positioned on movable ultrasound probe units 1414 to track the positioning of the shoulder with respect to the reference bed 1429 as well. In some embodiments, the ultrasound probe units 1414 are collectively used to monitor a rotation of the humerus about its anatomical axis by monitoring another feature of the patient's arm such as a vein or an artery, using one or more other ultrasound tracking devices 1410, for instance. Locating the scapula to orient/link the glenoid in the surgical plane may also be envisaged in some embodiments. Although the reference bed 1429 is featured under the patient's belly, it can also be used with the patient lying on his back on the reference bed 1429 for some other types of surgeries.
In another aspect of the disclosure, there is described a wearable element for use in computer-assisted surgery involving ultrasound tracking of an anatomical feature of a patient, the wearable element comprising: a garment to be worn by the patient; and an ultrasound imaging interface covering at least a portion of the garment, the ultrasound imaging interface being made of a solid acoustically transmissive material and having one or more surgery openings defined therein allowing access to the anatomical feature.
In some embodiments, the garment is an upper-body garment such as a shirt, a vest, a corset, a sleeve, a belt and the like. As discussed above, ultrasound probe units may be embedded in the ultrasound imaging interface. In some embodiments, the garment can include adhesive pad(s) such as electrocauterization grounding pads. In some embodiments, the ultrasound imaging interface covers at least a portion of a back portion of the upper-body garment. In some embodiments, the ultrasound imaging interface extends towards each lateral side of the upper-body garment and reaches at least a coronal plane of the upper-body garment. In some embodiments, the ultrasound imaging interface extends towards a lumbar portion of the upper-body garment and reaches at least a transverse plane of the upper-body garment. In some embodiments, the surgery opening(s) extend(s) along a spine orientation of the upper-body garment. In some embodiments, the ultrasound imaging interface has an ultrasound imaging strip following a spine orientation of the upper-body garment, and at least a surgery opening extends alongside and parallel to the ultrasound imaging strip. The ultrasound imaging strip allows imaging of the spine of the patient with an ultrasound probe unit perpendicular to the ultrasound imaging strip, while the surgery opening(s) extending alongside the ultrasound imaging strip allow surgical tool(s) to reach the spine at an oblique angle therethrough. In some embodiments, the garment is made of a compression material tightly fitting the patient.
In another aspect of the disclosure, there is described an ultrasound tracking device for use with a position sensing system to register position and orientation in computer-assisted surgery, the ultrasound tracking device comprising: a wearable holder adapted to be secured to an anatomic feature; at least two ultrasonic probe units supported by the wearable holder and adapted to emit signals to image part of the anatomic feature; at least one reference tracker supported by the wearable holder; and a mechanical member projecting from a remainder of the ultrasound tracking device and increasing an axial footprint of the ultrasound tracking device.
In another aspect of the disclosure, there is described a set of ultrasound tracking devices for use with a position sensing system to register position and orientation in computer-assisted surgery, each of the ultrasound tracking devices comprising: a wearable holder adapted to be secured to an anatomic feature, and at least two ultrasonic probe units supported by the wearable holder and adapted to emit signals to image part of the anatomic feature; at least one reference tracker supported by one of the wearable holders; and a linkage between the set of ultrasound tracking devices, the linkage having a rotational joint and a sensor for determining an angular value variation in the rotational joint.
In another aspect of the disclosure, there is described an ultrasound tracking system for tracking a position of the ultrasound tracking device with respect to an extremity of an anatomical feature in computer-assisted surgery, the ultrasound tracking system comprising: at least an ultrasound probe unit fixedly mounted relative to the anatomical feature, the ultrasound probe unit being adapted for emitting an ultrasound signal within said anatomical feature, at least a portion of the ultrasound signal being guided away from the ultrasound probe unit and along an anatomical axis of the anatomical feature towards the extremity thereof, the ultrasound probe unit detecting at least a reflected portion of the ultrasound signal being guided from the extremity of the anatomical feature and back towards the ultrasound probe unit; a controller being communicatively coupled to said ultrasound probe unit, said controller having a processor and a memory having stored thereon instructions that when executed by said processor perform the steps of: determining an axial position of the ultrasound probe unit relative to the extremity of the anatomical feature based on an ultrasound speed value indicative of a speed at which the portion of the ultrasound signal travels along the anatomical feature and on a time duration indicative of a time duration elapsed between the emitting and the detecting. In some embodiments, the ultrasound speed value is measured in situ based on measurements performed by at least two ultrasound probe units axially spaced-apart from one another along the anatomical axis.
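The time-of-flight computation of this aspect may be sketched as follows. Since the guided signal travels to the extremity and back, the axial distance is half the speed times the round-trip duration; and with two probe units spaced a known distance apart along the anatomical axis, the speed can be calibrated in situ from the difference in their round-trip times (the extra round-trip path of the farther unit being twice the spacing). The function names and the numeric values below are illustrative only:

```python
def axial_position(speed_m_s, round_trip_s):
    """Distance from the probe unit to the extremity of the anatomical
    feature, from the round-trip time of the guided signal (out and
    back, hence the division by two)."""
    return speed_m_s * round_trip_s / 2.0

def in_situ_speed(spacing_m, round_trip_far_s, round_trip_near_s):
    """Guided-signal speed calibrated in situ from two probe units
    spaced spacing_m apart along the anatomical axis; the farther
    unit's round trip is longer by twice the spacing."""
    return 2.0 * spacing_m / (round_trip_far_s - round_trip_near_s)

# With units 3 cm apart reading 120 us and 100 us round trips, the
# in-situ speed evaluates to 3000 m/s, and a 100 us round trip then
# places that probe unit about 0.15 m from the extremity:
v = in_situ_speed(0.03, 0.00012, 0.0001)
d = axial_position(v, 0.0001)
```

The in-situ calibration removes the need to assume a tabulated speed for the bone, which can vary with density and, e.g., osteoporotic condition.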
While illustrated in the block diagrams as groups of discrete components communicating with each other via distinct data signal connections, it will be understood by those skilled in the art that the preferred embodiments are provided by a combination of hardware and software components, with some components being implemented by a given function or operation of a hardware or software system, and many of the data paths illustrated being implemented by data communication within a computer application or operating system. The structure illustrated is thus provided for efficiency of teaching the present preferred embodiment. The embodiments of the invention described above are intended to be exemplary only. For instance, as knee or lumbar surgeries are described above, they are meant to be exemplary only. The methods and systems described herein can also be applicable in pelvis surgery, shoulder blade surgery, and any other bone or articulation surgery. Moreover, in some embodiments, the ultrasound methods and systems can be used to find tools, screws and other surgery equipment within the body of the patient during surgery. The scope of the invention is therefore intended to be limited solely by the scope of the appended claims.
Number | Date | Country
---|---|---
62991707 | Mar 2020 | US