The present invention relates to the field of medical imaging.
Medical images are used in state-of-the-art medical procedures to provide computer-aided guidance and increase accuracy and precision during an intervention. For example, for a brachial plexus block, the operator typically holds an ultrasound device in one hand and a syringe in the other, so that they can see the needle of the syringe in the ultrasound image. Current state-of-the-art procedures usually involve displaying the ultrasound image on a screen that is located next to the patient, which results in the operator effectively looking away from the position at which they are inserting the syringe. Therefore, this kind of operation is counter-intuitive and requires a lot of training to perform safely.
In recent years, optical head mounted displays (“OHMD”) have become available, which a bearer can wear on his head and which comprise a display that is arranged in the field of view of the bearer. Such an OHMD is adapted so that the bearer can see at least a part of his real environment as well as computer-generated images shown in the field of view of the bearer. This allows the bearer to visually experience an augmented reality, part of which is real and part of which is computer-generated. The computer-generated images can be displayed to simulate two- and three-dimensional objects that are shown as holograms overlaid onto the real world.
The purpose of the invention is to provide an improved method and system for utilizing medical imaging.
The present invention concerns creating an augmented reality (“AR”), i.e. a perception of a real-world environment combined with computer-generated information, by displaying medical images in the field of view of a user.
An AR system (“ARS”) is proposed that is adapted for displaying a medical image and comprises
wherein the ARS is adapted to display the transformed image on the OHMD in a position, orientation and scale that corresponds to the perspective of a position and orientation of the OHMD (“POOHMD”). The ARS can comprise hardware means and software means.
In other words, the ARS is adapted so that
This can allow for creating the AR impression that a part of reality at which the image was taken is overlaid with the transformed image.
The image is transformed so that the bearer of the OHMD, i.e. from the perspective of the POOHMD, perceives the transformed image as being in the position and orientation of the image (“POimage”). To do this, the image, which is initially created according to a coordinate system of the image (“CSimage”), is transformed to a coordinate system (“CSOHMD”) that is chosen to match the view of the bearer.
Overlaying parts of reality with a medical image that is adjusted to the viewer's perspective can support the work of an operator by allowing him to look in the direction of the area at which he performs his tasks, thereby allowing him to work in a more intuitive manner. Using the proposed ARS can thus allow for greatly reducing the time and costs for training of medical staff. In contrast to so-called virtual reality settings, where the real world is completely obscured to the bearer of a head mounted display device, the OHMD of the proposed ARS allows the bearer to still see the real world, which is preferred for medical procedures.
Currently used systems comprise a monitor and a supporting stand on which the monitor is placed. Therefore, currently used systems are often bulky and do not allow for a disinfection in a convenient and speedy manner. Thus, these systems are often not used in operating rooms, which are often tightly packed and require sterility. Since the proposed ARS can be realized in a significantly smaller and/or space efficient manner than the currently used systems, it can allow for using medical imaging in scenarios in which the currently used systems cannot conveniently be used, e.g. for cross-checking the state of an ongoing intervention in a tightly packed operating room. The possible decrease in size can furthermore allow the proposed ARS to be realized in an easily transportable manner, thereby facilitating its use in rural areas, mobile or field hospitals, and/or in clinics or hospitals lacking state-of-the-art infrastructure.
The proposed ARS and the proposed method can be used to support diagnostic and therapeutic operations. They can for example be used in the context of
For example, if during a resection an operator intends to remove parts of a tissue, structure, or organ of the patient's body that they can distinguish using a medical imaging method, they can use the proposed ARS during the intervention to determine which parts they shall remove and/or to determine which parts they have already removed.
Methods for taking the medical image by the MID can for example include
In some embodiments, the medical imaging method allows for taking two-dimensional (“2D”) and/or three-dimensional (“3D”) images. For example, X-ray technology typically allows for 2D-images, and multiple such X-ray images allow for computing a 3D computed tomographic image (“CAT scan”). Another example of a 3D-imaging method is 3D-ultrasound imaging.
In some embodiments, the MID comprises an ultrasound probe. The ultrasound probe can e.g. be comprised in a handheld device.
In some embodiments, the MID comprises an X-ray emitter and an X-ray plate, which e.g. can be comprised in a C-arm.
For some medical imaging methods, the image does not have an inherent position, orientation and/or scale. Therefore, the position, orientation and/or scale of the image should be understood to be a choice, preferably a choice that approximates the intuitive understanding of the image, e.g. matching the anatomy of the respective body part. For example, an X-ray image is a two-dimensional projection of a three-dimensional segment. It can allow for allocating a scale (e.g. the size in which it was taken) and an orientation (e.g. orthogonal to the X-ray plate), but not for allocating an unambiguous position, as the X-ray image represents a superimposition of all the planes that the X-rays have passed through. Therefore, the ARS can be adapted to make a choice, e.g. displaying the X-ray image as being at the position of the X-ray plate or displaying the X-ray image as being somewhere between the X-ray emitter and the X-ray plate, e.g. so that the image is displayed as being halfway in the body part that has been scanned.
In some embodiments, the ARS comprises adjusting means that is adapted to instruct the ARS to adjust the position, orientation and/or scale in which the transformed image is displayed. The adjusting means can allow the bearer to instruct the ARS to not display the transformed image corresponding to the currently displayed perspective of a position and orientation of the image, but somewhere else and/or somehow else, e.g. scaled to a different size.
In some embodiments, the adjusting means are adapted so that the user can adjust the choice made by the ARS with respect to a position, orientation and/or scale of the image in cases where at least one thereof cannot be allocated in an unambiguous way. In a variant, the ARS is configured to be able to remember adjustments to the choices.
In some embodiments, the adjusting means is adapted to instruct the ARS to display
An adjusted view can support the operator during an intervention, e.g. when the actual size of the image generated by the MID is too small for the operator to see all the details necessary for the success of the medical intervention.
In some embodiments, the medical imaging method allows for taking still images (i.e. single images) and/or for taking a series of images. Preferably, the ARS is adapted to take a series of images quasi continuously, e.g. at a rate of 10 images or more per second, to create the illusion of a continuously changing image, such as an image stream/video. In some embodiments, the ARS is adapted to take a series of 3D images, such as a 3D image stream/video, e.g. using 3D ultrasound. Still images can be preferred if taking an image is expensive or bears health risks, such as taking an X-ray.
In some embodiments, the ARS is adapted to display the transformed image in quasi-real-time, e.g. in less than 100 milliseconds, preferably in less than 60 or 40 milliseconds, after the image was taken and/or the POOHMD has changed. For example, where images are taken quasi continuously so as to form an image stream/video, the POOHMD-adjusted transformation of each individual image is displayed with a delay of less than 100 milliseconds after the respective image was taken. In another example, the image is a still image, e.g. an X-ray image, which is adjusted to a new POOHMD with a delay of less than 100 milliseconds. A delay of less than 100 milliseconds can create an illusion of real-time or live observation, which can allow for a more intuitive usage of the ARS.
In some embodiments, the ARS is configured so that the transformed image of an image taken at a distance of 2 m or less from the OHMD is displayed with a (perceived) precision of 5 mm or less, preferably 3 mm or less or 1 mm or less; in other words, so that a pixel of the transformed image is not displayed as being further away from the position in reality that it is intended to represent than a further position that in reality is less than 5 mm, preferably 3 mm or less or 1 mm or less, away from said position.
In some embodiments, the MID comprises, preferably is, a freely movable device, such as a handheld device, a semi-movable device, such as a C-arm, or a static device. For example, an ultrasound device can comprise a handheld device that comprises the ultrasound sensor, so that the position and orientation of the image can easily be altered by moving the handheld device. In another example, an X-ray device can comprise a C-arm, whose mobility is limited.
In some embodiments, the display of the OHMD comprises two portions, each of which is adapted to be arranged in front of one of the eyes of the bearer. The OHMD can be adapted for stereo displaying, i.e. creating the illusion of a 3D image to its bearer.
In some embodiments, the OHMD is adapted for a near focus of the bearer's view, e.g. such that it is suitable for medical procedures carried out at arm's length; preferably, the OHMD is adapted for focusing at a distance of 20 cm to 100 cm.
Data measured by a TS can allow determining the position and orientation of an object, which in turn can allow for determining the transformation from a coordinate system of said object to a coordinate system of a different object or vice versa. Said object can e.g. be the MID or the OHMD.
By measuring the data of the TS over time, the ARS can track the position and orientation of an object and thus can allow for adjusting the transformation according to the current position and orientation of an object, preferably in quasi-real-time. The ARS, e.g. the TS, can be adapted to track the position and orientation of two or more objects, e.g. the MID and the OHMD.
The measuring means of the TS can e.g. comprise optical means (i.e. using electromagnetic radiation in the visible-light spectrum), infrared means (i.e. using electromagnetic radiation in the infrared spectrum), magnetic means (i.e. using magnetic fields), and/or image and shape recognition means (i.e. using image recognition and shape recognition). The TS can comprise an inertial measurement unit (“IMU”) that is adapted to measure data concerning a spatial acceleration and a rotation rate.
In some embodiments, the ARS comprises two or more TSs, which preferably are part of a redundant system. The redundant system can e.g. allow for increasing the precision of measurements or of calculations based on the measurements (e.g. by taking averages and/or applying a Kalman filter), for increasing the reliability of the ARS (e.g. by performing independent measurements) and/or for collecting calibration data. Preferably, at least two of the two or more TSs comprise different sensor technology (e.g. a first TS using infrared recognition of a marker and a second TS using shape recognition) and/or different sensor configurations (e.g. a first TS being fixedly attached to the OHMD and a second TS being not fixedly attached to the OHMD). The ARS can be adapted to use a Kalman filter to calculate the current positions and orientations of at least some of the tracked objects by fusing data measured by two or more TSs. Preferably, the ARS can be adapted to use a Kalman filter to calculate or estimate the current positions and orientations of at least some of the tracked objects by taking into account earlier calculated positions and orientations of the tracked objects, e.g. in cases where one or more TSs of the two or more TSs fail to supply reliable data.
Preferably, a coordinate system of an object, e.g. the image, the MID, the TS or the OHMD, is a coordinate system that is associated with said object in the sense that the position, and preferably the orientation, of said object is fixed with respect to said coordinate system. Typically, there are multiple choices for such a coordinate system. Preferably, a coordinate system is chosen wherein the origin, and possibly the direction of the axes, are adjusted to the object in question, e.g. where the origin is placed at the location of a sensor of said object. Preferably, the coordinate system is a Cartesian coordinate system, which in three dimensions has three axes that are perpendicular to one another. Other possible choices include polar, cylindrical or spherical coordinate systems.
In some embodiments, the transformation comprises steps represented by matrices, e.g. expressing a translation, a scaling and/or a rotation. The matrices can e.g. be 4×4 matrices, preferably whereby a 3×3 block represents a scaling and a rotation and a 3×1 block (vector) represents a translation. The remaining entries can e.g. be chosen to guarantee that the matrix is invertible. The composition of steps of the transformation represented by matrices can be represented by the multiplication of said matrices. Using matrices and their multiplication can allow for a quick calculation of the transformed image, which in turn can allow for a quasi-real-time adjustment of the image displayed on the OHMD to the current POOHMD and, if a sequence of images is taken, to the current image.
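The use of 4×4 homogeneous matrices described above can, purely by way of illustration, be sketched as follows (a minimal Python/NumPy sketch; all names and values are illustrative and not part of the claimed subject matter):

```python
import numpy as np

def make_transform(rotation, scale, translation):
    """Build a 4x4 homogeneous matrix from a 3x3 rotation matrix,
    a uniform scale factor and a 3-vector translation."""
    m = np.eye(4)
    m[:3, :3] = rotation * scale   # 3x3 block: rotation and scaling
    m[:3, 3] = translation         # 3x1 block (vector): translation
    return m                       # bottom row (0, 0, 0, 1) keeps the matrix invertible

# Example: scale by 2, no rotation, shift 1 unit along x.
T = make_transform(np.eye(3), 2.0, np.array([1.0, 0.0, 0.0]))

# A point in homogeneous coordinates (x, y, z, 1).
p = np.array([1.0, 1.0, 1.0, 1.0])

# Composition of transformation steps corresponds to matrix multiplication;
# applying the composed matrix to each point yields the transformed image.
p_transformed = T @ p   # → (3.0, 2.0, 2.0, 1.0)
```

Because the composition of any number of steps reduces to a single matrix product, the transformed image can be recomputed quickly for each new POOHMD.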
The transformation can further comprise perspective transformations that allow for displaying an image on the OHMD in a perspective manner. The transformation can comprise stereo transformations that allow for stereo-displaying.
In some embodiments, the first coordinate system is the CSimage or a CSMID and the second coordinate system is a CSMID or the CSOHMD. For example, the processing unit can be configured to transform the image from the CSimage to a coordinate system of the MID that took the image (“CSMID”) and then transform the image from this CSMID to the CSOHMD.
In some embodiments, the POimage is fixed relative to the position and orientation of the MID (“POMID”) and a transformation from the CSimage to the CSMID can be a constant transformation. For example, the transformation of the CSimage to the CSMID is trivial if the image is directly taken in the CSMID (so that the CSimage is identical to the CSMID).
In some embodiments, the POimage is not fixed relative to the POMID, i.e. the MID can, without itself being moved, take images at different positions or orientations. For example, a sensor of the MID can be movable or the MID can comprise adequate software means. In this case it can be possible to determine the transformation of the CSimage to the CSMID using data of the MID (e.g. data of the control means of the MID) in combination with calibration data.
In some embodiments, the ARS, preferably the TS, is adapted to measure data concerning the POOHMD relative to the POimage or the POMID, respectively. The ARS can be adapted to track the POOHMD relative to the POimage or the POMID, respectively. Preferably, the ARS is adapted to transform the image from the CSimage to the CSOHMD using data concerning the POOHMD relative to the POimage or the POMID, respectively. In some cases, e.g. where a still image is displayed, the POimage is constant and the ARS can be adapted to measure the POOHMD relative to this constant position and orientation, i.e. relative to a world reference.
In a preferred embodiment, the ARS is adapted for displaying a medical image in quasi-real-time and comprises
wherein the ARS is adapted to display the transformed image in quasi-real-time on the OHMD in a position, orientation and scale that corresponds to the perspective of the POOHMD.
Optionally, it can be the case that
The respective perspective of a POOHMD to an individual image of the series of images is the perspective of the POOHMD (at least quasi) at the time the transformation of said image is displayed. In practice, there is a short delay, preferably less than 100 milliseconds. This can allow for quasi-real-time adjusting of the series of images according to the current position and orientation of the OHMD, which preferably is at least quasi-continuously tracked using the first TS or a different TS. Preferably, the MID is at least quasi-continuously tracked using the first TS. The methods described herein for transforming individual images can of course be used iteratively for transforming a series of images.
In some embodiments, the ARS is adapted to
wherein t1, t2, t3 and t4 are quasi identical, e.g. within 100 milliseconds of each other. Preferably, the time t4 is less than 50 milliseconds after the time t3. The ARS can be adapted to iterate this process and to display a series of transformed images to the bearer of the OHMD, displaying each transformed image directly, e.g. within 100 milliseconds, after the image was taken. This allows for creating the impression of live imaging for the bearer.
In some embodiments, the processing unit is configured
wherein CSTS denotes a chosen coordinate system of the TS. The transformation from CSimage to CSMID can be a constant transformation if the position and orientation of the sensor of the MID is fixed relative to the POMID. The transformation from CSTS to CSOHMD can be a constant transformation if the POOHMD is fixed relative to the TS.
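The chain of transformations from the CSimage via the CSMID and the CSTS to the CSOHMD can, purely by way of illustration, be sketched as follows (Python/NumPy; the identity placeholders are illustrative assumptions, in practice these matrices would be derived from calibration data and tracking measurements):

```python
import numpy as np

# Illustrative 4x4 homogeneous transforms for each step of the chain.
T_image_to_mid = np.eye(4)   # can be constant if the sensor is fixed in the MID
T_mid_to_ts    = np.eye(4)   # derived from TS measurements of the POMID
T_ts_to_ohmd   = np.eye(4)   # can be constant if the TS is fixed to the OHMD

# Composing the chain yields the overall transformation CSimage -> CSOHMD.
T_image_to_ohmd = T_ts_to_ohmd @ T_mid_to_ts @ T_image_to_mid

def image_point_in_ohmd(p_image):
    """Map a homogeneous image-space point (x, y, z, 1) into OHMD coordinates."""
    return T_image_to_ohmd @ p_image
```

With identity placeholders, a point maps onto itself; replacing the middle factor per tracking update adjusts the displayed image to the current POMID and POOHMD.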
In some embodiments, the POOHMD is fixed relative to the TS. In other words, a position and orientation relative to which the TS is adapted to measure is fixed relative to the POOHMD. This means that a chosen CSTS can be transformed to the CSOHMD using a constant transformation.
For example, the TS can comprise a radiation (e.g. optical or infrared) emitter and a radiation receiver, whereof at least the receiver is fixedly attached to, preferably integrated into, the OHMD. Fixedly attached means that any change to the position or orientation of the one object will inevitably lead to the same change to the position (i.e. same translation) and orientation (i.e. same rotation) of the other object to which it is fixedly attached. However, it may of course still be the case that the two objects can again be separated, e.g. where they are fixedly attached by removable screws. A transformation of a first coordinate system of a first object that is fixedly attached to a second object to a second coordinate system of the second object can be a constant transformation. In the example at hand, the transformation from the CSTS to the CSOHMD can be a constant transformation and thus independent of the POOHMD.
In some embodiments, the POOHMD is not fixed relative to the TS. For example, the TS can be a static system, which e.g. is attached to a wall or mounted on a stand that is placed in a room. In this case, the calculation of the transformation from the CSimage or the CSMID to the CSOHMD can be performed using data concerning the POOHMD, wherein said data are preferably measured using the TS.
In some embodiments, the POOHMD is not fixed relative to the TS. Preferably, the TS is adapted to measure data concerning the POMID and the POOHMD. The processing unit can be configured to transform the image from a CSMID to the CSOHMD by using the measured data concerning the POMID and by using the measured data concerning the POOHMD, e.g. by
In some embodiments, the ARS comprises a first TS and a second TS, wherein each is adapted to measure data concerning a position and orientation of at least one object of the ARS, e.g. the MID and/or the OHMD.
In some embodiments, the TS comprises a first TS and a second TS and the ARS is adapted to calculate the position and orientation of objects with increased precision, e.g. by fusing the data measured by the various TSs. For example, the processing unit can be adapted to calculate the position and orientation of an object (e.g. the POMID or the POOHMD) by taking weighted averages of the position and orientation of said object as calculated using data measured by the first TS and as calculated using data measured by the second TS and/or by using a Kalman filter to fuse the respective data.
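The weighted averaging of two position estimates can, purely by way of illustration, be sketched as an inverse-variance weighting (a minimal Python sketch restricted to positions; orientation fusion, e.g. via quaternions or a Kalman filter, is omitted; all values are illustrative):

```python
import numpy as np

def fuse_positions(pos_a, var_a, pos_b, var_b):
    """Inverse-variance weighted average of two position estimates of the
    same object, e.g. as measured by a first TS and by a second TS."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * np.asarray(pos_a) + w_b * np.asarray(pos_b)) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)   # the fused estimate is more precise than either input
    return fused, fused_var

# Example: the second TS is twice as precise (half the variance).
pos, var = fuse_positions([0.0, 0.0, 1.0], 4.0, [0.0, 0.0, 1.3], 2.0)
```

The fused estimate lies closer to the more precise measurement, which illustrates how redundant TSs can increase precision.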
In some embodiments, the ARS is adapted to collect and/or use calibration data. Calibration data can allow for correcting systematic deviations, which e.g. can be due to production tolerances. Preferably, calibration data are determined using data measured by a first TS and using data measured by a second TS.
In some embodiments, calibration data are used for determining transformations, e.g. the transformation from the CSimage or a CSMID to the CSOHMD can be determined using data concerning the POMID, and possibly data concerning the POOHMD, as well as calibration data. For example, the transformation from CSimage to CSMID can be calculated using calibration data, e.g. wherein the POimage is fixed relative to the POMID and the transformation from CSimage to CSMID is constant. In cases where the POimage is not fixed relative to the POMID, e.g. where the image sensor of the MID is movable relative to the rest of the MID, the transformation from CSimage to CSMID may not be constant but can e.g. be calculated using calibration data in connection with information concerning the movement of a motion unit of the sensor.
In some embodiments, the ARS, preferably the processing unit of the ARS, is configured to transform the image taken by the MID from the CSimage to the CSOHMD
Preferably, the calibration data are pre-determined and stored in the ARS.
In some embodiments, the calibration data are determined using a simultaneous measurement by the first TS and by the second TS of the same environment. The calibration data can then e.g. be determined by solving equation systems concerning the respectively measured point clouds. In an example, the calibration data are determined using a simultaneous measurement concerning a position and orientation of the same object. Said same object could e.g. be a dummy object, which is specifically used for calibration purposes.
In some embodiments, the TS comprises a first TS and a second TS and the ARS is adapted to conduct plausibility checks, e.g. verifying data measured by the first TS using data measured by the second TS or vice versa. Preferably, the ARS can be configured to stop displaying the medical image and/or to issue a warning if the information of the data measured by the second TS significantly deviates from the information of the data measured by the first TS, e.g. in cases where data concerning the POMID as measured by the first TS are deemed inconsistent with data concerning the POMID as measured by the second TS.
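Such a plausibility check can, purely by way of illustration, be sketched as a comparison of the positions measured by the two TSs against a deviation threshold (a minimal Python sketch; the threshold value and names are illustrative assumptions):

```python
import numpy as np

# Hypothetical threshold; a real value would depend on the sensor accuracy.
MAX_POSITION_DEVIATION_MM = 5.0

def plausibility_check(pos_first_ts, pos_second_ts):
    """Return True if the position of an object (e.g. the POMID) as measured
    by the first TS is consistent with the measurement of the second TS.
    On failure, the ARS would stop displaying the image and/or issue a warning."""
    deviation = np.linalg.norm(np.asarray(pos_first_ts) - np.asarray(pos_second_ts))
    return deviation <= MAX_POSITION_DEVIATION_MM
```

Orientation could be checked analogously, e.g. by bounding the angle between the two measured orientations.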
In some embodiments, the TS comprises a first TS and a second TS and the ARS is adapted to measure data concerning a position and orientation of a first object using data measured using the first TS and to measure data concerning a position and orientation of a second object using the second TS.
In some embodiments, the processing unit is configured to transform the image taken by the MID from the CSimage to the CSOHMD according to multiple ways, e.g. a first way and a second way. The processing unit can be adapted to always calculate the transformation in multiple ways or to only do so on specific occasions.
Two different ways of transforming an image differ in that the transformation of the first way and the transformation of the second way differ in at least one aspect of how the transformation is conducted. For example, the two ways can differ in that they use
Two different ways of transforming can thus differ in their respective susceptibility to errors.
In an example, two ways of transforming use different data in that the first way uses data measured using a first TS and the second way uses data measured using a second TS.
In another example, two ways of transforming use different data in that the second way uses data and the first way uses a subset of that data. This can e.g. allow the calculation via the first way to be fast and the calculation via the second way to be slower but more reliable.
In yet another example, the two ways of transforming use different algorithms, wherein the calculation using the algorithm of the first way is fast and the calculation using the algorithm of the second way is slower but more reliable.
In a further example, two ways of transforming use different technical principles based on which the measurement whose data is used for the transformation is performed in that the first way uses a first kind of measurement, e.g. using optical and/or infrared means, and the second way uses a second kind of measurement, e.g. using magnetic means.
In yet another example, two ways of transforming use different technical principles based on which the processing is performed, in that the first way uses optical data for two-dimensional image recognition, e.g. of a marker in the form of an image pattern, and the second way uses (optionally the same) optical data for three-dimensional shape recognition of an object. According to one example, a marker is attached to the MID and an optical TS is used to collect visual data; based on this visual data, for a first way of transforming, data concerning the POMID are calculated by recognizing the marker from the visual data; and for a second way of transforming, data concerning the POMID are calculated by recognizing the shape of the MID itself from the visual data.

In some embodiments, the ARS comprises a first TS and a second TS and is adapted to transform the image taken by the MID from the CSimage to the CSOHMD according to a first way and according to a second way,
In some embodiments, the TS comprises a first TS and a second TS and the ARS is adapted to calculate the transformed image with increased precision, e.g. by fusing the data measured by the various TSs. For example, the processing unit can be adapted to calculate the transformed image by taking weighted averages of the transformed image as calculated according to a first way using data measured by the first TS and of the transformed image as calculated according to a second way using data measured by the second TS.
In some embodiments, at least one TS is fixedly attached to the MID. Preferably a first TS is not fixedly attached to the MID and a second TS is fixedly attached to the MID.
In some embodiments, the ARS, e.g. the second TS, comprises an inertial measurement unit (“IMU”) that is adapted to measure data concerning a spatial acceleration and a rotation rate of an object, e.g. the MID or the OHMD. The IMU can be adapted to intrinsically measure data concerning a spatial acceleration and a rotation rate of an object, e.g. by being fixedly attached to said object, i.e. such that any change of a position and orientation of said object will inevitably lead to the same change of a position (i.e. same translation) and orientation (i.e. same rotation) of the IMU.
In some embodiments, the ARS, e.g. the IMU, comprises an accelerometer that is adapted to measure data concerning a spatial acceleration of an object. The accelerometer can e.g. comprise piezo-electric, piezo-resistive and/or capacitive components. The accelerometer can e.g. comprise a pendulous integrating gyroscopic accelerometer.
In some embodiments, the ARS, e.g. the IMU, comprises a gyroscope that is adapted to measure data concerning a rotation rate of an object.
Using data concerning a spatial acceleration and rotation rate of an object, it is possible to determine a relative movement and rotation, i.e. a variation of the position and orientation of said object, and thus to estimate the position and orientation of said object at a time t, e.g. by using
Thus, data measured by the IMU can be used for estimating the position and orientation of an object of the ARS, e.g. the MID and/or the OHMD. Preferably, the processing unit is configured to use a Kalman filter for calculating said estimates. For example, the processing unit can be configured to verify and/or correct data measured by a TS (e.g. the first TS) using the data of the IMU (which e.g. can be comprised in the second TS).
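The dead-reckoning estimation from IMU data can, purely by way of illustration, be sketched as a step-wise integration of the measured spatial acceleration and rotation rate (a minimal Python sketch; gravity compensation, sensor bias and a proper attitude representation are omitted for brevity, and all values are illustrative):

```python
import numpy as np

def imu_step(position, velocity, orientation, accel, gyro, dt):
    """One dead-reckoning step: integrate the measured spatial acceleration
    and rotation rate over a time step dt to update the pose estimate."""
    velocity = velocity + accel * dt
    position = position + velocity * dt
    orientation = orientation + gyro * dt   # small-angle (Euler angle) approximation
    return position, velocity, orientation

# Example: constant 1 m/s^2 acceleration along x, no rotation, 10 ms steps.
pos = np.zeros(3)
vel = np.zeros(3)
ori = np.zeros(3)
for _ in range(100):   # one second of integration
    pos, vel, ori = imu_step(pos, vel, ori, np.array([1.0, 0.0, 0.0]), np.zeros(3), 0.01)
```

Because integration errors accumulate, such an estimate is typically corrected against an absolute TS measurement, e.g. by a Kalman filter as mentioned above.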
In some embodiments, the processing unit is adapted to calculate the transformation using a position and orientation of an object, e.g. the MID and/or the OHMD, and is further adapted to calculate the position and orientation of said object using data measured by an IMU that is fixedly attached to said object.
In some embodiments, the second TS comprises an IMU that is fixedly attached to the MID, e.g. to a detector thereof, and that is adapted to measure data concerning a spatial acceleration and a rotation rate of the MID. Preferably, the processing unit is adapted to calculate the relative movement and rotation of the MID using data measured by the IMU. The processing unit can be configured
Similarly, the processing unit can be adapted to calculate the transformation using an estimated position and orientation of another object (e.g. of the OHMD) if an IMU is fixedly attached to said other object.
In some embodiments, the processing unit is configured to transform the image taken by the MID from the CSimage to the CSOHMD
Preferably, the first way comprises using data measured by the first TS at a current time t and the second way comprises using data measured by the first TS at an earlier time t0 (t>t0) and data measured by the second TS since the earlier time t0. In an example, the current time t is less than 5 seconds later than the earlier time t0.
In some embodiments, the TS comprises a first TS and a second TS and the ARS is adapted to calculate the transformation using the data measured by the first TS in a first mode and using the data measured by the second TS in a second mode.
In some embodiments, the ARS comprises a first TS and a second TS, wherein the second TS preferably comprises an IMU. Preferably, the ARS is configured to normally calculate a transformation using data measured by the first TS (first mode) and, upon the occurrence of a triggering event (e.g. if measurements by the first TS are deemed unreliable), to calculate said transformation using data measured by the second TS (second mode). Preferably, the first TS and the second TS comprise different sensor technology and/or different sensor configurations, whereby the probability that both TSs are unreliable at the same time can be reduced.
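The switching between the first mode and the second mode can, purely by way of illustration, be sketched as follows (a minimal Python sketch; the function and parameter names are illustrative assumptions):

```python
def choose_pose(first_ts_pose, first_ts_reliable, imu_estimate):
    """Select the pose source for the transformation: the first TS in the
    normal first mode, falling back to the IMU-based second TS upon a
    triggering event (e.g. when the first TS is deemed unreliable)."""
    if first_ts_reliable:       # normal operation: first mode
        return first_ts_pose
    return imu_estimate         # triggering event: second mode

# Usage: fall back when e.g. the infrared line of sight is obstructed.
pose = choose_pose((0.0, 0.0, 1.0), False, (0.0, 0.0, 1.01))
```

In a fuller implementation, the reliability flag would be supplied by the monitoring means described below.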
In some embodiments, the ARS is configured
Preferably, the triggering event comprises that measurements on which the first way is based are deemed unreliable, e.g. with regard to accuracy or latency. The triggering event can e.g. occur if a plausibility test on measured data and/or on a calculated position and orientation has failed or if a calculation routine is unexpectedly terminated.
According to an example, the ARS is configured to normally operate in a first mode in which the transformation uses data of an image recognition of a marker, preferably of an image pattern, of the MID. However, if the marker cannot be detected sufficiently clearly, the ARS is configured to switch to a second mode in which the transformation uses data of a shape recognition of the MID itself. The two modes can use data measured using the same TS or using different TSs.
In some embodiments, the first TS is infrared-based, the second TS is based on the measurements of an IMU and the processing unit is configured to normally calculate the transformation of the CSimage to the CSOHMD using a calculation of the POMID using data measured by the infrared-based TS (first mode). The processing unit is further configured to—in case the measurement of the infrared-based TS is deemed unreliable, e.g. when the infrared radiation is obstructed and thus the POMID can no longer be calculated—calculate the transformation of the CSimage to the CSOHMD using an estimate of the POMID using data measured by the IMU-based TS (second mode), namely by estimating the current POMID using
In some embodiments, the ARS comprises monitoring means for determining if a measurement of the TS shall be deemed unreliable. The monitoring means can comprise hardware means and/or software means. Preferably, the monitoring means comprises a plausibility check, e.g. on the data measured by a, preferably at least the first, TS. The monitoring means can be adapted to indicate the occurrence of a triggering event.
In some embodiments, the monitoring means comprises a second TS, preferably a second TS using a different sensor technology and/or a different sensor configuration. For example, the first TS can use a sensor system based on visible and/or infrared radiation, while the second TS can use a sensor system based on an IMU. In another example, the first TS is fixed relative to the OHMD and the second TS is not fixed relative to the OHMD (or vice versa).
In some embodiments, the ARS comprises a first TS whose measurements are used for a first mode, a second TS whose measurements are used for a second mode and a third TS whose measurements are used for a decision whether the first mode or the second mode shall be used. Preferably, at least the second TS comprises an IMU.
In some embodiments, the processing unit is configured to transform the image taken by the MID from a first coordinate system, e.g. the CSimage, to a second coordinate system, e.g. the CSOHMD in a first way and in a second way. Preferably, the ARS is configured to switch from a first mode in which the transformation is calculated in the first way to a second mode in which the transformation is calculated in the second way. In an example, the first mode is normally used and the second mode is used if the measurements on which the first way is based are deemed unreliable. Preferably, the second way comprises the usage of data measured by an IMU.
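Purely for illustration, the first-mode/second-mode switching described above could be organized as in the following sketch. The sensor interfaces (`measure()`, `reset_deltas()`, `apply_deltas()`) and the fake sensor classes are assumptions made for this example, not part of the disclosure; poses are reduced to simple translation tuples.

```python
import time


class FakeOpticalTS:
    """Stand-in for an optical (e.g. infrared-based) tracking sensor."""
    def __init__(self):
        self.pose = (0.0, 0.0, 0.0)   # set to None to simulate an obstruction

    def measure(self):
        return self.pose


class FakeIMU:
    """Stand-in for an IMU that accumulates relative motion since a reset."""
    def __init__(self):
        self.delta = (0.0, 0.0, 0.0)

    def reset_deltas(self):
        self.delta = (0.0, 0.0, 0.0)

    def apply_deltas(self, pose):
        return tuple(p + d for p, d in zip(pose, self.delta))


class PoseTracker:
    def __init__(self, optical, imu, max_age_s=5.0):
        self.optical, self.imu, self.max_age_s = optical, imu, max_age_s
        self.last_pose = None         # pose measured the first way at time t0
        self.last_time = None

    def pose(self):
        measured = self.optical.measure()
        if measured is not None:                      # first mode (first way)
            self.last_pose, self.last_time = measured, time.monotonic()
            self.imu.reset_deltas()
            return measured
        # Second mode: last reliable pose at t0 plus the IMU motion since t0,
        # used only while t - t0 stays small (here: under max_age_s seconds).
        if (self.last_pose is not None
                and time.monotonic() - self.last_time < self.max_age_s):
            return self.imu.apply_deltas(self.last_pose)
        return None                                   # no reliable estimate
```

The triggering event here is simply the optical measurement becoming unavailable; a real system would additionally apply plausibility checks as described above.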
In some embodiments, the ARS is adapted to track and display multiple images. For example, where X-ray images were taken of different parts of a patient's body, the ARS can be adapted to keep track of the respective images and display the transformation of the suitable image or images depending on the current POOHMD. In order to keep track of the various images and possibly the various MIDs, the ARS can use a common reference system.
In some embodiments, the ARS comprises a position and orientation marker (“POM”). POMs are used to support the tracking of objects of the ARS, e.g. the MID or the OHMD, using the TSs. Preferably, at least one TS, e.g. at least the first TS, of the ARS is adapted to measure data concerning a position and orientation of the POM (“POPOM”).
In some embodiments, the POM is fixedly attached to an object of the ARS, e.g. the MID or the OHMD. Fixed attachment means that any change to the position and orientation of said object will inevitably lead to the same change to the position (i.e. same translation) and orientation (i.e. same rotation) of the POM. Thus, the position and the orientation of the object can be calculated using data concerning the POPOM, and possibly using calibration data, e.g. to eliminate production tolerances of the attachment of the POM to said object. Preferably, multiple POMs are fixedly attached to same object, e.g. to different sides of said object.
In some embodiments, the ARS comprises multiple POMs. At least one TS, e.g. the first TS, of the ARS can be adapted to measure data concerning a position and orientation of two or more POMs, e.g. a first POM fixedly attached to the MID and a second POM fixedly attached to the OHMD. In some examples, the first TS is adapted to measure data concerning a first POM and the second TS is adapted to measure data concerning a second POM.
In some embodiments, the POM is a magnetic marker, i.e. a marker that can be recognized, wherein the POPOM preferably can be measured, using magnetic fields. The magnetic marker can comprise a coil, preferably three orthogonal coils.
In some embodiments, the POM is a visual and/or infrared marker, i.e. a marker that can be recognized, wherein the POPOM preferably can be measured, using visible and/or infrared electromagnetic radiation. Preferably, such a POM can be adapted to reflect visible and/or infrared electromagnetic radiation.
In some embodiments, the visual and/or infrared POM comprises a vierbein, i.e. four spheres whose positions are fixed relative to one another. Preferably, the four spheres reflect visible and/or infrared electromagnetic radiation. Because the relative positioning of the four spheres is known, the position and orientation of the vierbein can be determined from data measured from almost any angle.
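Since the sphere geometry is known, the pose of the vierbein can be recovered from the measured sphere centres by rigid point-set registration, e.g. with the Kabsch/SVD algorithm. The sketch below uses invented sphere coordinates and shows one possible approach, not the disclosed implementation.

```python
import numpy as np


def kabsch(reference, measured):
    """Find rotation R and translation t with measured_i ~= R @ reference_i + t."""
    ref_c = reference.mean(axis=0)
    mea_c = measured.mean(axis=0)
    H = (reference - ref_c).T @ (measured - mea_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])            # guard against a reflection solution
    R = Vt.T @ D @ U.T
    t = mea_c - R @ ref_c
    return R, t


# Known sphere geometry of the vierbein (coordinates invented, in metres).
reference = np.array([[0.00, 0.00, 0.00],
                      [0.05, 0.00, 0.00],
                      [0.00, 0.05, 0.00],
                      [0.00, 0.00, 0.05]])

# Simulated measurement: vierbein rotated 90 degrees about z and shifted.
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
measured = reference @ Rz.T + np.array([0.10, 0.20, 0.00])

R, t = kabsch(reference, measured)
```

Because the four spheres are not coplanar, the registration has a unique solution from essentially any viewing angle, which matches the property noted above.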
In some embodiments, the visual and/or infrared POM comprises an image pattern, preferably an image pattern comprising a plurality of vertices and edges, such as a QR code. The image pattern can be two-dimensional or three-dimensional. An image pattern can allow for determining a position and orientation of itself, which in turn can allow for determining a position and orientation of an object to which the image pattern is fixedly attached.
In some embodiments, two or more POMs, e.g. image patterns, are fixedly attached to the same object. Preferably, two or more POMs are attached to said object in such a manner that they face in two or more different directions of the object. In an example, four or more POMs are attached to an object, preferably in cases where the TS comprises sensors facing in two or more directions. In some embodiments, six or more POMs are attached to an object, e.g. one or more to each side of a cuboid object. In an example, the use of multiple two-dimensional image patterns can allow for recognizing the position and orientation of a three-dimensional object.
In some embodiments, the ARS comprises a first TS, a second TS and a POM, wherein both the first TS and the second TS are adapted to measure data concerning the position and orientation of said POM. Thereby the POPOM can be measured using two different TSs, e.g. for use in a redundant system and/or for calibration of the ARS.
In some embodiments, a POM is fixedly attached to the MID and the TS is adapted to measure the POPOM. Because of said fixation, tracking the POPOM can allow the processing unit to calculate, possibly as well using calibration data, the POMID. In other words, the TS can measure data concerning the POMID by measuring the position and orientation of a POM fixedly attached to the MID. Preferably, the processing unit is adapted to transform the image taken by the MID from CSimage to CSMID as well as to transform the so transformed image from CSMID to CSOHMD using the thereby calculated POMID.
In some embodiments, a first POM is fixedly attached to the MID, a second POM is fixedly attached to the OHMD and the TS is adapted to measure a position and orientation of the first POM (“PO1.POM”) and a position and orientation of the second POM (“PO2.POM”). The processing unit is adapted to calculate, possibly as well using calibration data, the POMID from the PO1.POM, and is further adapted to calculate, possibly as well using calibration data, the POOHMD from the PO2.POM. Preferably, the processing unit is adapted to transform the image taken by the MID from CSimage to CSMID using the calculated POMID and to transform the so transformed image from CSMID to CSOHMD using the calculated POMID and POOHMD.
Using POMs makes it easy to modify existing MIDs so that they can be used in the proposed ARS, namely by fixedly attaching one or more POMs to such a MID. This is particularly valuable since MIDs are often expensive and have a long lifespan.
In some embodiments, the TS is adapted to measure data concerning a position and orientation of an object marker-lessly, i.e. without recognizing a specialized marker. For example, the TS can be configured for using shape recognition to recognize the position and orientation of an object, e.g. by using a scan and/or a CAD model of the real-world object as a virtual reference object. In an example, the MID is not rotationally symmetric and the TS is adapted to track the MID using shape recognition.
In some embodiments, the ARS comprises a camera, preferably a camera that is fixed relative to the OHMD. Such a camera can allow the ARS to record the view of the bearer, e.g. for documentation. The ARS can be adapted to record the non-augmented view of the bearer and/or the augmented view of the bearer.
In some embodiments, the ARS is configured to save a still image of a displayed transformed image. Preferably, the ARS is configured to save a still image of the current view of the bearer, i.e. a still image of the real world (e.g. taken by a camera of the OHMD) onto which the currently displayed transformed image is overlaid. Such still images can e.g. be used for documentation. The ARS can comprise a saving means that is adapted to instruct the ARS to save a still image of a currently displayed transformed image.
In some embodiments, the ARS comprises an interruption means that is adapted to instruct the ARS to not display a transformed image on the OHMD. Said interruption means can e.g. be used when the bearer intends to see a non-augmented view.
In some embodiments, the ARS comprises a virtual button that is displayed on the OHMD and which e.g. can be overlaid onto the MID. For example, an intended action, e.g. saving an image, can be triggered if the operator gazes at the virtual button for a prescribed amount of time. The ARS can be configured to display a circular progress indicator that gives the operator feedback on the imminence of the trigger during the gazing. Preferably, the ARS, e.g. the OHMD, comprises means for measuring data concerning the position of an eye, preferably of both eyes, of the operator, which can allow determining if the operator is gazing at a certain position, e.g. at a position where the virtual button is displayed. In another example, the ARS is configured to determine a gazing using the POOHMD. The ARS can be configured so that at least one function of the ARS can be triggered, preferably controlled, using the virtual button.
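A minimal sketch of the dwell-to-trigger behaviour described above, assuming a fixed dwell time and a per-sample gaze test (both assumptions for illustration); the returned progress value could drive the circular progress indicator.

```python
class GazeButton:
    """Dwell-to-trigger virtual button; the dwell time and the interface are
    illustrative assumptions, not taken from the disclosure."""

    def __init__(self, dwell_s=1.5):
        self.dwell_s = dwell_s
        self.gaze_start = None

    def update(self, gazing_at_button, now):
        """Feed one gaze sample; returns (progress in 0..1, triggered)."""
        if not gazing_at_button:
            self.gaze_start = None        # dwell resets when the gaze leaves
            return 0.0, False
        if self.gaze_start is None:
            self.gaze_start = now         # gaze entered the button region
        progress = min((now - self.gaze_start) / self.dwell_s, 1.0)
        # `progress` can drive the circular progress indicator; at 1.0 the
        # associated function (e.g. saving a still image) is triggered.
        return progress, progress >= 1.0
```

Here `now` is an externally supplied timestamp (e.g. from a monotonic clock), which keeps the sketch deterministic and testable.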
In some embodiments, the ARS comprises means for voice control, i.e. it is configured to receive and recognize voice commands. The ARS can be configured so that at least one function of the ARS can be triggered, preferably controlled, using voice commands.
In some embodiments, the ARS is configured so that at least one function of the ARS can be triggered, preferably controlled, using non-manual interaction, e.g. by using voice control and/or a virtual button. Controlling at least parts of the functions of the ARS using non-manual interactions can allow performing interventions using less personnel, thereby possibly saving costs and space in the operating room.
In some embodiments, the ARS, e.g. the OHMD, comprises a physical button that is configured to trigger, preferably control, at least one function of the ARS. Preferably, the button is arranged at a location that is easily accessible to the bearer, e.g. on a side of the OHMD or on a small hand-held device, e.g. a remote control.
In some embodiments, the ARS comprises a medical or surgical tool, such as a syringe, needle or a pointer, and the ARS is configured to display the surgical tool in a highlighted manner on the OHMD. Preferably, the ARS is configured to highlight the surgical tool in the displayed image.
In some embodiments, the medical and/or surgical tool comprises tool-recognition means that allows the ARS, preferably the TS and/or the processing unit (e.g. using the data comprised in the medical image), to recognize said tool, which can e.g. allow the ARS to highlight said tool in the displayed image. In an example, the surgical tool comprises a material that is easily recognizable on the respective medical image, e.g. a metal. The tool-recognition means can comprise a POM.
Furthermore, methods that are represented by the embodiments of the ARS, or parts thereof, disclosed herein are proposed.
Furthermore, a method for creating an augmented reality by displaying an image taken by the MID, preferably in quasi-real-time, on the OHMD is proposed, comprising the steps of:
The method can be performed in the order as written or in any other technically appropriate order. For example, the step of taking the image can be performed before, while, and/or after the step of measuring data concerning the at least one object is performed. The method can be iterated to—at least quasi continuously—adjust the displayed image to the current POOHMD, preferably wherein the step of taking an image is not iterated in case the image to be displayed is a still image.
The said at least one object can comprise the OHMD. By tracking the OHMD (i.e. measuring the POOHMD over time), the image (or images) can continuously be transformed to fit to the current perspective of the OHMD.
The said at least one object can comprise the MID. By tracking the MID (i.e. measuring the POMID over time), images continuously taken by the MID can be transformed and displayed.
According to some variants, the method allows for displaying a series of images taken by the MID and comprises
The CSOHMD resp. the perspective of a POOHMD applicable to an individual image of the series of images is the CSOHMD resp. the perspective of the POOHMD (at least quasi) at the time the transformation of said image is displayed. In practice, there is a short delay, preferably less than 100 milliseconds. This can allow for quasi-real-time adjusting of the series of images according to the movement of the OHMD. The data used for the transformation of an individual image concerns the position and orientation of the MID—at least quasi—at the time said individual image was taken. The methods described herein for transforming individual images can of course be used iteratively for transforming a series of images from resp. to the respective coordinate systems.
According to some variants,
wherein t1, t2, t3 and t4 are quasi identical, e.g. within 100 milliseconds of each other. Preferably, the time t4 is less than 50 milliseconds after the time t3. This process can be iterated, which allows displaying a series of transformed images to the bearer of the OHMD, displaying each transformed image in quasi-real-time, e.g. within 100 milliseconds, after the image was taken. This allows for creating the illusion of a live imaging to the bearer.
In some variants, the step of measuring data comprises measuring data concerning the POOHMD relative to the POMID. Preferably, the variant comprises calculating the POOHMD relative to the POMID, for example by calculating the POOHMD and the POMID relative to a chosen reference system, e.g. a world reference. The reference system can be chosen according to a POOHMD at a chosen time, e.g. at the time the OHMD is first started during a session, i.e. when the OHMD is initialized.
In some variants, the step of measuring data is performed by one or more TSs. At least one of the one or more TSs can be fixed relative to the OHMD.
In some variants, the method comprises measuring data concerning a spatial acceleration and a rotation rate of at least one object, preferably the OHMD and/or the MID. Preferably, the step of transforming the image taken by the MID from the CSimage to the CSOHMD is performed using an estimate of the position and orientation of the at least one object, whereby said estimate is calculated using:
The variation of the position and orientation of the at least one object over time can be calculated from its spatial acceleration and rotation rate. If the position and orientation of said object at a first time and its variation between the first time and a later second time are known, the position and orientation of said object at the second time can be calculated. Measurements of spatial acceleration and rotation rate can be subject to significant tolerances, and thus the results of such a method of calculation are to be regarded as estimates. A Kalman filter that takes into account an earlier state of the system and the measured variation can be employed to increase the precision of such estimates.
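This calculation can be sketched as follows, under the assumption of synthetic sensor values: orientation is propagated with the Rodrigues formula and position with a simple Euler integration, which also illustrates why the result is only an estimate whose tolerances accumulate over time.

```python
import numpy as np


def skew(w):
    """Matrix form of the cross product: skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])


def step(R, p, v, rot_rate, accel, dt):
    """Propagate orientation R (3x3), position p and velocity v over one
    sampling interval from a rotation rate and a spatial acceleration."""
    angle = np.linalg.norm(rot_rate) * dt
    if angle > 1e-12:
        K = skew(rot_rate / np.linalg.norm(rot_rate))
        dR = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
        R = R @ dR                        # Rodrigues formula for the increment
    v = v + accel * dt
    p = p + v * dt                        # simple Euler step; tolerances add up
    return R, p, v


# Synthetic stream: constant rotation about z, constant acceleration along x.
R, p, v = np.eye(3), np.zeros(3), np.zeros(3)
for _ in range(100):
    R, p, v = step(R, p, v,
                   rot_rate=np.array([0.0, 0.0, np.pi / 2]),  # rad/s
                   accel=np.array([1.0, 0.0, 0.0]),           # m/s^2
                   dt=0.01)
```

In a real system the per-step measurement noise would be folded into the propagated estimate by a Kalman filter, as noted above.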
Spatial acceleration and rotation rate can be measured using an IMU that is comprised in the object to be tracked. Such an IMU is considered a highly reliable system in the sense that its measurements are only rarely interrupted, which encourages its use in a back-up system.
In some variants, the method comprises switching from
to
upon the occurrence of a triggering event. An example of such a triggering event is that a measurement used in the first way is deemed unreliable, e.g. with respect to accuracy and/or latency. Choosing the second way of transforming the image can allow for continued use of the method in cases where the measurements on which the first way is based are corrupted. Preferably, the first way offers high precision, while the second way offers high reliability with respect to the availability of the measured data.
In some variants, the second way comprises estimating the position and orientation of the at least one object using
Preferably, the first way does not comprise an estimation of this kind.
Furthermore, embodiments of the ARS, or parts thereof, adapted to perform the methods disclosed herein are proposed.
Preferred embodiments of the invention are described in the following with reference to the drawings, which are for the purpose of illustrating the present preferred embodiments of the invention and not for the purpose of limiting the same. In the drawings,
In the following, the invention is primarily exemplified with a view towards ultrasound and X-ray imaging. However, of course the invention can be used with other medical imaging technologies.
To improve the applicability, safety and/or reliability of medical imaging especially during interventions, an augmented reality system (“ARS”) is proposed that comprises
The ARS is configured to display the transformed image in the field of view of the bearer of the OHMD 1, thereby creating an augmented reality (“AR”) image. The displayed image is a transformation of the original image such that when displayed to the bearer of the OHMD 1, the displayed image appears to be at the position and orientation of the image (“POimage”) as seen from the perspective of the position and orientation of the OHMD 1 (“POOHMD”). In other words, the ARS is adapted to display the transformed image on the OHMD 1 in a position, orientation and scale that corresponds to the perspective of the current POOHMD. Preferably, the ARS is adapted to display a sequence of transformed images in chronological order such that the displayed image adapts to the currently taken image and the current perspective of the POOHMD in quasi-real-time.
An example of the AR image as seen by the bearer of the OHMD is shown in
In the example of
The transformation comprises transforming the image from a first coordinate system to a second coordinate system using data measured by the TS 3, for example from a coordinate system of the image (“CSimage”) to the coordinate system of the OHMD 1 (“CSOHMD”). Preferably, the transformation comprises at least transforming the image from the CSimage to a coordinate system of the MID 2 (“CSMID”) and from the CSMID to the CSOHMD.
The MID 2 and the OHMD 1 are each fixedly attached to a position and orientation marker 5 (“POM”), which in this example are both realized as infrared-reflective vierbeins, i.e. devices comprising four spheres 55 at predefined positions relative to one another, as can be seen in
In some embodiments, the ARS is adapted to perform the method for creating an augmented reality according to the methods shown in
Of course, step 101 and step 102 can be performed simultaneously or in any order. Preferably, the data measured in step 102 allow for a determination of the POimage (resp. the POMID) relative to the POOHMD. The measurements of step 102 can be made by one or multiple TSs 3, 3′. Step 103 can comprise using a Kalman filter. Preferably the steps 101 to 104 are performed in quasi-real-time, e.g. within 100 milliseconds.
The ARS can further be adapted to iterate this process. For example, if a still image shall be displayed, the steps 102-104 are iterated as shown in
In another example, if a sequence, preferably an image stream/video, of images shall be displayed, the steps 101-104 are iterated as shown in
As indicated by the coordinate systems in
This can allow transforming the image from the perspective in which the image was originally taken (i.e. the perspective according to the POimage) to the perspective of the OHMD 1 (i.e. the perspective according to the POOHMD). Preferably, each of the steps is expressed as a matrix, e.g. a 4×4 matrix, and the composition of the steps is expressed as a multiplication of the respective matrices.
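A minimal sketch of this composition with translation-only 4×4 homogeneous matrices; all poses are invented for illustration and a real system would of course include rotations.

```python
import numpy as np


def transform(translation):
    """4x4 homogeneous matrix with identity rotation (kept simple on purpose)."""
    T = np.eye(4)
    T[:3, 3] = translation
    return T


# Hypothetical poses, each step expressed as a matrix.
T_mid_image = transform([0.0, 0.0, 0.1])    # CSimage -> CSMID
T_world_mid = transform([0.5, 0.0, 0.0])    # CSMID -> world reference
T_world_ohmd = transform([0.0, 0.2, 0.0])   # CSOHMD -> world reference

# Composition of the steps as one matrix product: CSimage -> CSOHMD.
T_ohmd_image = np.linalg.inv(T_world_ohmd) @ T_world_mid @ T_mid_image

# A point at the image origin, expressed in the OHMD's coordinate system.
p = T_ohmd_image @ np.array([0.0, 0.0, 0.0, 1.0])
```

The same product can be applied to every pixel position of the image (or, in practice, to the corners of the image quad handed to the renderer).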
In the example shown in
While
In the example of
An ARS comprising two or more TSs 3, 3′ can be adapted to calculate the position and orientation of one or more objects with increased precision. For example, the processor unit 5 can be adapted to calculate the POMID using data measured by the first TS 3 as well as data measured by the second TS 3′, thereby fusing the data measured by the two TSs 3, 3′. Preferably, the processor unit 5 can be adapted to calculate, e.g. as part of step 103, the POMID using a Kalman filter that fuses data measured by the first TS 3 and data measured by the second TS 3′.
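In its measurement-update core, such a fusion reduces to inverse-variance weighting of the two measurements; the scalar sketch below uses invented numbers and is a simplification of the multidimensional Kalman filter mentioned above.

```python
def fuse(mean_a, var_a, mean_b, var_b):
    """Inverse-variance weighted fusion of two scalar measurements; this is
    the measurement-update core of a (one-dimensional) Kalman filter."""
    w = var_b / (var_a + var_b)        # the less noisy input gets more weight
    mean = w * mean_a + (1.0 - w) * mean_b
    var = (var_a * var_b) / (var_a + var_b)
    return mean, var


# Invented numbers: a precise measurement of one coordinate of the POMID by
# the first TS (small variance) fused with a less precise estimate derived
# from the second TS.
pos, var = fuse(0.100, 1e-6, 0.104, 1e-4)
```

The fused variance is smaller than either input variance, which is the sense in which fusing the data of two TSs increases precision.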
An ARS comprising two or more TSs 3, 3′ can be adapted to calculate the transformed image with increased precision. For example, the processor unit 5 can be adapted to calculate the transformed image using data measured by the first TS 3 as well as to calculate the transformed image using data measured by the second TS 3′ (e.g. according to the method shown in
An ARS comprising two or more TSs 3, 3′ can be adapted to conduct plausibility checks. For example, the ARS can be adapted to transform the image using data measured by the first TS 3 and is further adapted to use, e.g. as part of step 102 or step 103, data measured by the second TS 3′ to check if the data measured by the first TS 3 are plausible. Of course, the role of the first TS 3 and the second TS 3′ in this context is interchangeable.
An ARS comprising two or more TSs 3, 3′ can be adapted to determine calibration data. For example, the ARS can be adapted to determine calibration data with respect to the exact attachment of POM 5 to the MID 2 or the OHMD 1 and determine the POMID resp. the POOHMD using data concerning the position and orientation of the respective POM 5 and the respective calibration data.
As indicated in
An arrangement for calibrating the POimage resp. the POMID relative to a POM 5 that is fixedly attached to the MID 2 is shown in
An ARS comprising two or more TSs 3, 3′ can be adapted to calculate the transformation of the image using the data measured by a first TS 3 in a first mode and to calculate the transformation of the image using the data measured by a second TS 3′ in a second mode. For example, the ARS can be adapted to usually calculate the transformation of the image in a way using data measured by the first TS 3, but to instead calculate the transformation in a way using data measured by the second TS 3′ during periods where the first TS 3 does not provide sufficiently reliable data, e.g. where the POMID or POOHMD cannot be determined using the data measured by the first TS 3. The ARS can e.g. be adapted to calculate the transformation using measurements of both the first TS 3 and the second TS 3′ in the first mode, e.g. to increase the precision of the calculation of the transformation and/or of a position and orientation of an object, and to calculate the transformation using measurements of either the first TS 3 or the second TS 3′ in the second mode if the measurements of the other are deemed unreliable.
The ARS can be adapted to perform the method as shown in
Again, step 101 and step 102 can of course be performed simultaneously or in any order. The method can further comprise iterating the process, thereby possibly restarting at step 101 to display a sequence of images, or at step 102 to display a still image. The triggering event can occur in case step 103 malfunctions, such as displayed in
For example, in
The data measured by the IMU 35 can allow for calculating the relative movement and rotation of the MID 2. In other words, data measured by the IMU 35 itself do not necessarily allow for the determination of the absolute POMID, but rather of the relative movement of the MID 2 by a certain length along a certain direction and of the relative rotation of the MID 2 by a certain angle about a certain axis.
The ARS can be adapted to transform the image according to the method as shown in
Step 102′ and step 103′ can be comprised in or replace step 102, step 103 and step 203, respectively, in the methods of
The methods of
A TS 3 comprising an infrared emitter 31 and an infrared detector 32 is integrated in the shown OHMD 1. Preferably, the TS 3 comprises two or more infrared detectors 32, which can improve the area and/or the precision of the tracking. In addition to the TS 3, the shown OHMD 1 comprises a camera 18, which can be used as a second TS 3′ (e.g. as a monitoring means for monitoring the reliability of the data measured by the first TS 3) and/or to record the perspective of the bearer (e.g. for documentation).
Pupil sensors 14 allow tracking the position of the pupils of the eyes of the bearer, which can be used to adjust the displayed image to the bearer's physiognomy. This can e.g. entail the adjustment of the stereo-displaying in accordance with the bearer's interpupillary distance, and/or the scaling of the image in accordance with the distance of the bearer's pupils from the mirrors.
The pupil sensors 14 can also allow for implementing a gaze control, e.g. in the form of a gaze button. An example of a possible design of a gaze button 16 as displayed to the bearer of the OHMD 1 is shown in
Number | Date | Country | Kind
---|---|---|---
18200983.7 | Oct 2018 | EP | regional
This application is the United States national phase of International Application No. PCT/EP2019/078119 filed Oct. 16, 2019, and claims priority to European Patent Application No. 18200983.7 filed Oct. 17, 2018, the disclosures of which are hereby incorporated by reference in their entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/EP2019/078119 | 10/16/2019 | WO | 00