This application claims priority under 35 U.S.C. §119 to Chinese Patent Application No. 2015107385601, filed on Nov. 03, 2015. The content of the priority application is hereby incorporated by reference in its entirety.
The present disclosure relates to rendering an expanded lumen image in the field of medical imaging technologies.
With the emergence of medical imaging devices, such as Computed Tomography (CT), Magnetic Resonance Imaging (MRI), Positron Emission Tomography (PET) and the like, and the development of computer graphics, digital image processing technologies, visualization technologies, virtual reality technologies and the like, virtual endoscopy, as a non-intrusive examination manner, is increasingly widely used. By using medical imaging devices, such as CT, MRI, PET and so on, to obtain a human body's two-dimensional slice data, and by using digital image processing technologies, scientific visualization technologies, virtual reality technologies and the like to process the obtained data, virtual endoscopy may generate three-dimensional images of a human body's lumens, such as colons, blood vessels, tracheas and oesophaguses, and simulate a traditional medical endoscope to perform an endoscopic examination on a patient.
NEUSOFT MEDICAL SYSTEMS CO., LTD. (NMS), founded in 1998 with its world headquarters in China, is a leading supplier of medical equipment, medical IT solutions, and healthcare services. NMS supplies medical equipment with a wide portfolio, including CT, Magnetic Resonance Imaging (MRI), digital X-ray machine, ultrasound, Positron Emission Tomography (PET), Linear Accelerator (LINAC), and biochemistry analyser. Currently, NMS's products are exported to over 60 countries and regions around the globe, serving more than 5,000 renowned customers. NMS's latest successful developments, such as 128 Multi-Slice CT Scanner System, Superconducting MRI, LINAC, and PET products, have led China to become a global high-end medical equipment producer. As an integrated supplier with extensive experience in large medical equipment, NMS has been committed to the study of avoiding potential secondary harm caused by excessive X-ray irradiation to the subject during the CT scanning process.
The present disclosure is directed to rendering an expanded lumen image. In general, one innovative aspect of the subject matter described in this specification can be embodied in methods that include the actions of obtaining a plurality of viewpoints by sampling a centre line of a lumen based on a three-dimensional lumen image; establishing a spherical projection plane for each of the viewpoints, wherein a point on the spherical projection plane corresponds to a point on an inner wall of the lumen; determining a relationship between a two-dimensional projection plane and the inner wall of the lumen according to a corresponding relationship between the spherical projection plane and the two-dimensional projection plane; and obtaining a two-dimensional expanded image of the three-dimensional lumen image by performing image rendering on the two-dimensional projection plane according to the determined relationship between the two-dimensional projection plane and the inner wall of the lumen.
Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods. For a system of one or more computers to be configured to perform particular operations or actions means that the system has installed on it software, firmware, hardware, or a combination of them that in operation cause the system to perform the operations or actions. For one or more computer programs to be configured to perform particular operations or actions means that the one or more programs include instructions that, when executed by data processing apparatus, cause the apparatus to perform the operations or actions.
The foregoing and other embodiments can each optionally include one or more of the following features, alone or in combination. For instance, establishing a spherical projection plane for each of the viewpoints can include: determining a gazing direction and a local coordinate system of the viewpoint, wherein a Z-axis direction of the local coordinate system is the gazing direction of the viewpoint; and establishing the spherical projection plane by taking the viewpoint as a centre of sphere and R as a radius under the local coordinate system, wherein R is a preset radius value less than or equal to a radius of the inner wall of the lumen.
In some implementations, determining a gazing direction and a local coordinate system of a viewpoint comprises: determining a gazing direction of a current viewpoint according to positions of viewpoints adjacent to the current viewpoint on the centre line; and determining an X-axis direction, a Y-axis direction and a Z-axis direction of a local coordinate system of the current viewpoint according to a connecting line between a previous viewpoint before the current viewpoint and the current viewpoint and a local coordinate system of the previous viewpoint.
The point on the spherical projection plane corresponding to the point on the inner wall of the lumen can include: a first point on the spherical projection plane corresponding to a second point on the inner wall of the lumen directed by a three-dimensional vector of the first point, wherein the three-dimensional vector of the first point is a vector constituted by the first point and a centre of sphere of the spherical projection plane.
The corresponding relationship between the spherical projection plane and the two-dimensional projection plane can include: a corresponding relationship between a three-dimensional vector of the first point on the spherical projection plane and a two-dimensional coordinate of a third point on the two-dimensional projection plane. Determining a relationship between a two-dimensional projection plane and the inner wall of the lumen can include: dividing the spherical projection plane into a plurality of three-dimensional regions; determining an image rendering region of the two-dimensional projection plane, the image rendering region comprising a plurality of two-dimensional regions respectively corresponding to the plurality of three-dimensional regions of the spherical projection plane; and determining a three-dimensional vector on one of the three-dimensional regions corresponding to a point on one of the two-dimensional regions.
In some cases, the plurality of three-dimensional regions includes: a three-dimensional region R1, which is a spherical surface positioned between an X-axis positive direction and a Z-axis negative direction, a three-dimensional region R2, which is a spherical surface positioned in an X-axis negative direction, and a three-dimensional region R3, which is a spherical surface excluding the three-dimensional regions R1 and R2. And the image rendering region can include: a two-dimensional region R1′ corresponding to the three-dimensional region R1, which is a semicircle with radius equal to a scaled lumen radius r and positioned on a left side of the image rendering region, the scaled lumen radius r being obtained by multiplying a radius of the inner wall of the lumen with a preset scaling ratio, a two-dimensional region R2′ corresponding to the three-dimensional region R2, which is a rectangle with width equal to πr and height equal to 2r and positioned in a middle part of the image rendering region, and a two-dimensional region R3′ corresponding to the three-dimensional region R3, which is a semicircle with radius equal to the scaled lumen radius r and positioned on a right side of the image rendering region.
Obtaining a two-dimensional expanded image of the three-dimensional lumen image can include: obtaining color information of the second point on the inner wall of the lumen directed by a three-dimensional vector corresponding to the third point on the two-dimensional projection plane; and obtaining a two-dimensional expanded image of the three-dimensional lumen image by performing image rendering on the two-dimensional projection plane based on the obtained color information.
The details of one or more embodiments of the subject matter described in the present disclosure are set forth in the accompanying drawings and description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.
Features of the present disclosure are illustrated by way of example and not by way of limitation in the following figures, in which like numerals indicate like elements, in which:
With the development of medical imaging devices such as CT, MRI and PET and of medical imaging science, virtual endoscopy, as a non-intrusive examination manner, is increasingly widely used. Virtual endoscopy may simulate an endoscope to display intracavitary morphology of cavity organs, such as colons, tracheas, blood vessels and oesophaguses, stereoscopically from a view angle within the cavities by utilizing a computer three-dimensional reconstruction function, and simulate an endoscopic examination process by using lumen navigation technologies or roaming technologies.
In the following, the present disclosure presents a method of rendering an expanded lumen image, which enables a lumen to be displayed intuitively and effectively. It is to be noted that by using the method of rendering an expanded lumen image presented by the present disclosure, a lumen image in a three-dimensional medical image may be expanded into a two-dimensional image for rendering. In practical applications, the three-dimensional medical image may include a three-dimensional CT image, a three-dimensional PET image, a three-dimensional MRI image and so on, which are not limited in examples of the present disclosure.
At block 101, a centre line of a lumen may be extracted from a three-dimensional lumen image, and viewpoints may be obtained by sampling the centre line.
In an example of the present disclosure, the centre line of a lumen may be manually extracted from a three-dimensional lumen image.
In another example of the present disclosure, the centre line of a lumen may be automatically extracted from a three-dimensional lumen image. Specifically, the centre line of a lumen may be extracted from a three-dimensional lumen image by using a distance transform iteration method or a skeletonization method. Of course, other methods in related technologies may also be used to extract the centre line of a lumen, which are not limited in the present disclosure.
After the centre line of the three-dimensional lumen image is obtained, viewpoints may be obtained by sampling the centre line. Each sampling point can be taken as a viewpoint.
In an example of the present disclosure, the centre line may be uniformly sampled. In practical application, the quantity of viewpoints may be adjusted by setting up a sampling interval. For example, a smaller sampling interval may be set up when more viewpoints are needed; whereas a larger sampling interval may be set up when fewer viewpoints are needed.
In another example of the present disclosure, the centre line may be non-uniformly sampled.
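The uniform sampling described above may be sketched as follows, assuming the centre line is given as an ordered polyline of three-dimensional points; the function name and the arc-length interpolation scheme are illustrative, not part of the disclosure:

```python
import numpy as np

def sample_viewpoints(centre_line, interval):
    """Uniformly sample viewpoints along a polyline centre line.

    centre_line: (N, 3) array of ordered points on the extracted centre line.
    interval:    arc-length spacing between consecutive viewpoints; a smaller
                 interval yields more viewpoints, a larger one yields fewer.
    """
    centre_line = np.asarray(centre_line, dtype=float)
    # Cumulative arc length along the polyline.
    seg = np.linalg.norm(np.diff(centre_line, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    # Target arc-length positions, one per viewpoint.
    targets = np.arange(0.0, s[-1], interval)
    # Linearly interpolate each coordinate against arc length.
    return np.stack(
        [np.interp(targets, s, centre_line[:, k]) for k in range(3)], axis=1)

viewpoints = sample_viewpoints([[0, 0, 0], [0, 0, 10], [0, 5, 10]], interval=1.0)
```

Non-uniform sampling could be obtained the same way by replacing `targets` with any monotonic list of arc-length positions.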
At block 102, a spherical projection plane may be established for each of the viewpoints within the lumen. A point on the spherical projection plane may correspond to a point on an inner wall of the lumen of the three-dimensional lumen image.
It is to be noted that a spherical projection plane of each of the viewpoints within the lumen may be positioned inside the lumen, as shown in
In an example of the present disclosure, as shown in
At block 102a, a gazing direction and a local coordinate system of a viewpoint may be determined. A Z-axis direction of the local coordinate system is the gazing direction of the viewpoint.
In this example, a gazing direction of a current viewpoint may be determined according to positions of viewpoints adjacent to the current viewpoint on the centre line, e.g., positions of viewpoints immediately before and after the current viewpoint. Specifically, the gazing direction of the current viewpoint may be determined by performing a positional differential operation on these adjacent positions.
In this example, when a local coordinate system of the current viewpoint is established, an X-axis direction, a Y-axis direction and a Z-axis direction of the local coordinate system of the current viewpoint may be determined according to a connecting line between a previous viewpoint (e.g., the viewpoint immediately before the current viewpoint) and the current viewpoint, as well as the local coordinate system of the previous viewpoint.
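A minimal sketch of blocks 102a's two determinations is given below. The gaze (Z axis) comes from a positional difference of the neighbouring viewpoints, and the X axis is carried over from the previous viewpoint's frame so the frames twist as little as possible; the function name and the seeding of the first frame are assumptions for illustration:

```python
import numpy as np

def local_frames(viewpoints):
    """Build a gazing direction (the Z axis) and a local coordinate system
    for every viewpoint on the centre line."""
    v = np.asarray(viewpoints, dtype=float)
    n = len(v)
    frames = []
    x_prev = np.array([1.0, 0.0, 0.0])  # arbitrary seed for the first X axis
    for i in range(n):
        # Positional difference of the viewpoints immediately before and after.
        a, b = v[max(i - 1, 0)], v[min(i + 1, n - 1)]
        z = (b - a) / np.linalg.norm(b - a)   # Z axis = gazing direction
        if abs(np.dot(x_prev, z)) > 0.9:      # avoid a degenerate projection
            x_prev = np.array([0.0, 1.0, 0.0])
        x = x_prev - np.dot(x_prev, z) * z    # project the previous X off the gaze
        x /= np.linalg.norm(x)
        y = np.cross(z, x)                    # Y completes a right-handed frame
        frames.append((x, y, z))
        x_prev = x
    return frames
```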
At block 102b, a spherical projection plane may be established under the local coordinate system by taking the viewpoint as a centre of sphere and R as a radius. R is a preset radius value less than or equal to a radius R0 of the inner wall of the lumen, that is, R<=R0.
It is to be noted that, in a three-dimensional lumen image generated according to virtual endoscopy, a human eye may make a roaming observation within the virtual internal cavity along a certain path to simulate a traditional endoscope examination process. In this example, the centre line of the lumen may be taken as a roaming path for the human eye to make a roaming observation, and a viewpoint on the centre line of the lumen may be taken as a roaming viewpoint of the human eye.
The spherical projection plane as shown in
It is to be noted that R in this example may be an empirical value for those skilled in the art, which of course may also be set up according to an actual situation, and is not limited in this example.
Referring back to
In an example of the present disclosure, a point on the spherical projection plane corresponding to a point on the inner wall of the lumen may include: a three-dimensional vector of a first point on the spherical projection plane corresponding to a second point on the inner wall of the lumen directed by the three-dimensional vector on the three-dimensional lumen image. Correspondingly, a corresponding relationship between the first point on the spherical projection plane and a third point on the two-dimensional projection plane may include: a corresponding relationship between the three-dimensional vector of the first point on the spherical projection plane and a two-dimensional coordinate of the third point on the two-dimensional projection plane. The three-dimensional vector of the first point on the spherical projection plane may be a vector constituted by the first point and the centre of sphere, as illustrated in
Specifically, in this example, a method of establishing the corresponding relationship between the three-dimensional vector of the first point on the spherical projection plane and the two-dimensional coordinate of the third point on the two-dimensional projection plane may include the following three steps.
At the first step, a plurality of three-dimensional regions, e.g., three three-dimensional regions, may be obtained by performing a regional division on the spherical projection plane. For example, a spherical surface positioned between the X-axis positive direction and the Z-axis negative direction on the spherical projection plane may be taken as a three-dimensional region R1, a spherical surface positioned in the X-axis negative direction may be taken as a three-dimensional region R2, and the residual spherical surface, i.e., a spherical surface positioned between the X-axis positive direction and the Z-axis positive direction on the spherical projection plane, may be taken as a three-dimensional region R3. The three three-dimensional regions R1, R2 and R3 may be obtained by performing the foregoing regional division on the spherical projection plane as shown in
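As an illustrative sketch of this regional division (the function name and the handling of boundary points are assumptions):

```python
def classify_region(vec):
    """Assign a unit vector on the spherical projection plane, expressed in
    the viewpoint's local coordinate system, to one of the three regions:
    R1 between the X-axis positive and Z-axis negative directions, R2 in the
    X-axis negative direction, and R3 the residual surface.
    Boundary points (x == 0 or z == 0) are assigned arbitrarily here."""
    x, _, z = vec
    if x < 0:
        return "R2"  # hemisphere in the X-axis negative direction
    return "R1" if z < 0 else "R3"
```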
At the second step, an image rendering region of the two-dimensional projection plane may be determined, wherein the image rendering region may be composed of a plurality of two-dimensional regions, e.g., three two-dimensional regions R1′, R2′ and R3′, as shown in
It is to be noted that in this example, based on one viewpoint within the lumen, a section of expanded lumen image positioned around the viewpoint instead of the whole expanded lumen image may be rendered.
At the third step, a three-dimensional vector (LR3, MR3, NR3) of a point on the three-dimensional region R3 of the spherical projection plane corresponding to a point P (x, y) on the two-dimensional region R3′ of the two-dimensional projection plane may be determined. (xc, yc) is a coordinate of a centre of circle S2 of the two-dimensional region R3′, and (X1, Y1, Z1), (X2, Y2, Z2) and (X3, Y3, Z3) respectively are unit vectors of the X axis, the Y axis and the Z axis, wherein
LR3 = XiR3*X1 + YiR3*X2 + ZiR3*X3, MR3 = XiR3*Y1 + YiR3*Y2 + ZiR3*Y3, NR3 = XiR3*Z1 + YiR3*Z2 + ZiR3*Z3,
XiR3 = iR3/√(iR3² + jR3² + kR3²), YiR3 = jR3/√(iR3² + jR3² + kR3²), ZiR3 = kR3/√(iR3² + jR3² + kR3²),
iR3 = (x − xc)*r*sin(√((x − xc)² + (y − yc)²)/r), jR3 = (y − yc)*r*sin(√((x − xc)² + (y − yc)²)/r), kR3 = r*cos(√((x − xc)² + (y − yc)²)/r),
(iR3, jR3, kR3) is a spherical point coordinate corresponding to the point P (x, y) on the two-dimensional region R3′.
A three-dimensional vector (LR1, MR1, NR1) of a point on the three-dimensional region R1 of the spherical projection plane corresponding to a point P (x, y) on the two-dimensional region R1′ of the two-dimensional projection plane may be determined. (xd, yd) is a coordinate of a centre of circle S1 of the two-dimensional region R1′, wherein
LR1 = XiR1*X1 + YiR1*X2 + ZiR1*X3, MR1 = XiR1*Y1 + YiR1*Y2 + ZiR1*Y3, NR1 = XiR1*Z1 + YiR1*Z2 + ZiR1*Z3,
XiR1 = iR1/√(iR1² + jR1² + kR1²), YiR1 = jR1/√(iR1² + jR1² + kR1²), ZiR1 = kR1/√(iR1² + jR1² + kR1²),
iR1 = (x − xd)*r*sin(√((x − xd)² + (y − yd)²)/r), jR1 = (y − yd)*r*sin(√((x − xd)² + (y − yd)²)/r), kR1 = r*cos(√((x − xd)² + (y − yd)²)/r),
(iR1, jR1, kR1) is a spherical point coordinate corresponding to the point P (x, y) on the two-dimensional region R1′.
A three-dimensional vector (LR2, MR2, NR2) of a point on the three-dimensional region R2 of the spherical projection plane corresponding to a point P (x, y) on the two-dimensional region R2′ of the two-dimensional projection plane may be determined, wherein
LR2 = XiR2*X1 + YiR2*X2 + ZiR2*X3, MR2 = XiR2*Y1 + YiR2*Y2 + ZiR2*Y3, NR2 = XiR2*Z1 + YiR2*Z2 + ZiR2*Z3,
XiR2 = iR2/√(iR2² + jR2² + kR2²), YiR2 = jR2/√(iR2² + jR2² + kR2²), ZiR2 = kR2/√(iR2² + jR2² + kR2²),
iR2 = −cos(beta)*sin(alpha), jR2 = −sin(beta), kR2 = −cos(beta)*cos(alpha), alpha = (x − xd)*π/(xc − xd), beta = −(y − yd)*π/(2*r),
(iR2, jR2, kR2) is a spherical point coordinate corresponding to the point P (x, y) on the two-dimensional region R2′.
It is to be noted that, to implement a two-dimensional expanded image, a reasonable scaled lumen radius r and coordinates of points S1 and S2 may first be determined according to given width and height of the three-dimensional lumen image, e.g., according to a radius R0 of the inner wall of the lumen. The scaled lumen radius r can be determined by multiplying the radius R0 with a scaling ratio k (e.g., k<=1). Then, a radius R of the spherical projection plane is determined based on the scaled lumen radius r. The radius R may be greater than or equal to the scaled lumen radius r, i.e., R>=r. In some cases, the radius R is also determined based on the radius R0 of the inner wall of the lumen, e.g., R<=R0. In a particular example, r<=R<=R0. Subsequently, coordinates of points within the two-dimensional regions R1′, R2′ and R3′ of the two-dimensional projection plane may be determined based on coordinates of corresponding points within the three-dimensional regions R1, R2 and R3 of the spherical projection plane. In some cases, points beyond the two-dimensional regions may not be calculated.
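The per-region formulas of the third step may be sketched as follows; the function name and argument layout are illustrative, `axes` holds the unit vectors of the local X, Y and Z axes, and the formulas are transcribed from the text rather than independently derived:

```python
import math

def projection_vector(x, y, region, r, xc, yc, xd, yd, axes):
    """Three-dimensional vector (L, M, N) for a point P(x, y) on region
    "R1", "R2" or "R3", following the per-region formulas in the text.
    (xc, yc) and (xd, yd) are the centres of circles S2 and S1; axes is
    ((X1, Y1, Z1), (X2, Y2, Z2), (X3, Y3, Z3)), the unit vectors of the
    local X, Y and Z axes; r is the scaled lumen radius."""
    if region == "R2":
        # Middle rectangle: longitude alpha across the width, latitude beta down the height.
        alpha = (x - xd) * math.pi / (xc - xd)
        beta = -(y - yd) * math.pi / (2 * r)
        i = -math.cos(beta) * math.sin(alpha)
        j = -math.sin(beta)
        k = -math.cos(beta) * math.cos(alpha)
    else:
        # Semicircles R1' and R3' use the same formulas around their own centres.
        cx, cy = (xd, yd) if region == "R1" else (xc, yc)
        d = math.hypot(x - cx, y - cy)
        i = (x - cx) * r * math.sin(d / r)
        j = (y - cy) * r * math.sin(d / r)
        k = r * math.cos(d / r)
    # Normalize the spherical point coordinate (i, j, k) ...
    norm = math.sqrt(i * i + j * j + k * k)
    xi, yi, zi = i / norm, j / norm, k / norm
    # ... and rotate it into world coordinates with the local axes.
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = axes
    return (xi * x1 + yi * x2 + zi * x3,
            xi * y1 + yi * y2 + zi * y3,
            xi * z1 + yi * z2 + zi * z3)
```

For instance, with identity axes the centre of R3′ maps to the gazing direction (0, 0, 1), and the middle of R2′ maps to the X-axis negative direction, consistent with the regional division above.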
Referring back to
To render an expanded lumen image for one viewpoint, a target color of each point within the image rendering region on the two-dimensional projection plane may be obtained according to the corresponding relationship among the viewpoint, a point on the two-dimensional projection plane and a point on the inner wall of the lumen of the three-dimensional lumen image, as well as ray casting technologies. The target color may be the color of the point on the inner wall of the lumen directed by the direction of the three-dimensional projection vector corresponding to each point within the image rendering region. Afterward, a two-dimensional expanded image of the three-dimensional lumen image may finally be obtained by performing image rendering on the two-dimensional projection plane based on the obtained target color, by using image rendering technologies such as three-dimensional volume reconstruction, maximum intensity projection or three-dimensional surface reconstruction. For example,
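A hedged sketch of this rendering step is given below, assuming a scalar volume addressed in voxel coordinates and using maximum intensity projection (one of the techniques mentioned above) as the per-ray rule; the function and parameter names are illustrative:

```python
import numpy as np

def render_expanded_image(volume, viewpoint, vectors, step=0.5, max_dist=50.0):
    """Maximum-intensity sketch of rendering the two-dimensional projection
    plane: each pixel's three-dimensional projection vector is cast from the
    viewpoint into the volume and the strongest sample along the ray is kept.
    `vectors` has shape (H, W, 3), one unit vector per pixel of the region."""
    H, W, _ = vectors.shape
    image = np.zeros((H, W))
    ts = np.arange(0.0, max_dist, step)  # sample distances along each ray
    for u in range(H):
        for v in range(W):
            ray = viewpoint + ts[:, None] * vectors[u, v]
            idx = np.round(ray).astype(int)  # nearest-neighbour sampling
            # Keep only the samples that fall inside the volume.
            ok = np.all((idx >= 0) & (idx < np.array(volume.shape)), axis=1)
            if ok.any():
                image[u, v] = volume[idx[ok, 0], idx[ok, 1], idx[ok, 2]].max()
    return image
```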
As can be seen from the foregoing example, the three-dimensional lumen image may be expanded into a two-dimensional plane image for rendering. A rendered two-dimensional expanded lumen image may include 360-degree information around a virtual roaming position, may relatively intuitively display information on the lumen along the two directions before and after the virtual roaming position, and has the advantages of continuous image information and small deformation, thereby providing convenience for a doctor to make an observation and diagnosis.
Corresponding to the example of the foregoing method of rendering an expanded lumen image, the present disclosure further provides an example of a device for rendering an expanded lumen image.
Rendering an expanded lumen image provided by the present disclosure may be executed by a control device as shown in
In different examples, the storage medium 620 may be a Random Access Memory (RAM), a volatile memory, a nonvolatile memory, a flash memory, a memory drive (such as a hard disk drive), a solid state drive, any type of memory disks (such as an optical disk or a DVD and so on), or a similar storage medium, or a combination thereof.
Further, the storage medium 620 may be used to store machine readable instructions corresponding to a control logic for rendering an expanded lumen image. Functionally divided, as shown in
a viewpoint sampling module 710 may be used to extract a centre line of a lumen from a three-dimensional lumen image, and sample the centre line to obtain viewpoints;
a spherical projection plane establishing module 720 may be used to establish a spherical projection plane for each of the viewpoints within the lumen, wherein a point on the spherical projection plane may correspond to a point on an inner wall of the lumen, and a radius of the spherical projection plane may be less than or equal to that of the inner wall of the lumen;
a corresponding relationship determining module 730 may be used to determine a corresponding relationship between a point on a two-dimensional projection plane and a point on the inner wall of the lumen according to a corresponding relationship between a point on the established spherical projection plane and a point on the two-dimensional projection plane; and an expanded image rendering module 740 may be used to perform image rendering on the two-dimensional projection plane according to the corresponding relationship between a point on the two-dimensional projection plane and a point on the inner wall of the lumen to obtain a two-dimensional expanded image of the three-dimensional lumen image.
Further, as shown in
a local coordinate system establishing sub-module 721 may be used to determine a gazing direction and a local coordinate system of a viewpoint, wherein a Z-axis direction of the local coordinate system is the gazing direction of the viewpoint; and
a spherical projection plane establishing sub-module 722 may be used to establish a spherical projection plane by taking the viewpoint as a centre of sphere and R as a radius under the local coordinate system, wherein R is a preset radius value less than or equal to a radius of the inner wall of the lumen.
Further, as shown in
a gazing direction determining module 7211 may be used to determine a gazing direction of a current viewpoint according to positions of viewpoints adjacent to the current viewpoint, e.g., immediately before and after the current viewpoint, on the centre line; and
a coordinate axis direction determining module 7212 may be used to determine an X-axis direction, a Y-axis direction and a Z-axis direction of a local coordinate system of the current viewpoint according to a connecting line between a previous viewpoint (e.g., the viewpoint immediately before the current viewpoint) and the current viewpoint, as well as a local coordinate system of the previous viewpoint.
In another example of the present disclosure, a corresponding relationship between a point on the spherical projection plane and a point on the two-dimensional projection plane may include: a corresponding relationship between a three-dimensional vector of the point on the spherical projection plane and a two-dimensional coordinate of the point on the two-dimensional projection plane, wherein the three-dimensional vector of the point on the spherical projection plane may be a vector constituted by the point on the spherical projection plane and the centre of sphere.
A point on the spherical projection plane corresponding to a point on the inner wall of the lumen may include: the three-dimensional vector of the point on the spherical projection plane corresponding to the point on the inner wall of the lumen directed by the direction of the vector.
In another example of the present disclosure, by reading machine readable instructions in the storage medium 620 corresponding to the control logic for rendering an expanded lumen image, the machine readable instructions further cause the processor 610 to:
take a spherical surface positioned between a positive direction of an X axis and a negative direction of a Z axis on the spherical projection plane as a three-dimensional region R1, a spherical surface in the negative direction of the X axis as a three-dimensional region R2, and the residual spherical surface as a three-dimensional region R3;
determine an image rendering region of a two-dimensional projection plane, which includes two-dimensional regions R1′, R2′ and R3′, wherein the two-dimensional regions R1′, R2′ and R3′ may respectively correspond to the three-dimensional regions R1, R2 and R3, the two-dimensional regions R1′ and R3′ are semicircles whose radius is equal to a scaled lumen radius r, the two-dimensional region R2′ is a rectangle whose width is πr and height is 2r and which is positioned in a middle part of the image rendering region, and the two-dimensional regions R1′ and R3′ are respectively positioned on left and right sides of the two-dimensional region R2′; the scaled lumen radius r may be obtained by multiplying the radius R0 of the inner wall of the lumen with a preset scaling ratio;
determine a three-dimensional vector (LR3, MR3, NR3) of a point on the three-dimensional region R3 corresponding to a point P (x, y) on the two-dimensional region R3′, wherein (xc, yc) is a coordinate of a centre of circle of the two-dimensional region R3′, (X1, Y1, Z1), (X2, Y2, Z2) and (X3, Y3, Z3) respectively are unit vectors of the X axis, the Y axis and the Z axis,
LR3 = XiR3*X1 + YiR3*X2 + ZiR3*X3, MR3 = XiR3*Y1 + YiR3*Y2 + ZiR3*Y3, NR3 = XiR3*Z1 + YiR3*Z2 + ZiR3*Z3,
XiR3 = iR3/√(iR3² + jR3² + kR3²), YiR3 = jR3/√(iR3² + jR3² + kR3²), ZiR3 = kR3/√(iR3² + jR3² + kR3²),
iR3 = (x − xc)*r*sin(√((x − xc)² + (y − yc)²)/r), jR3 = (y − yc)*r*sin(√((x − xc)² + (y − yc)²)/r), kR3 = r*cos(√((x − xc)² + (y − yc)²)/r),
(iR3, jR3, kR3) is a spherical point coordinate corresponding to the point P (x, y) on the two-dimensional region R3′;
determine a three-dimensional vector (LR1, MR1, NR1) of a point on the three-dimensional region R1 corresponding to a point P (x, y) on the two-dimensional region R1′, wherein (xd, yd) is a coordinate of the centre of circle of the two-dimensional region R1′,
LR1 = XiR1*X1 + YiR1*X2 + ZiR1*X3, MR1 = XiR1*Y1 + YiR1*Y2 + ZiR1*Y3, NR1 = XiR1*Z1 + YiR1*Z2 + ZiR1*Z3,
XiR1 = iR1/√(iR1² + jR1² + kR1²), YiR1 = jR1/√(iR1² + jR1² + kR1²), ZiR1 = kR1/√(iR1² + jR1² + kR1²),
iR1 = (x − xd)*r*sin(√((x − xd)² + (y − yd)²)/r), jR1 = (y − yd)*r*sin(√((x − xd)² + (y − yd)²)/r), kR1 = r*cos(√((x − xd)² + (y − yd)²)/r),
(iR1, jR1, kR1) is a spherical point coordinate corresponding to the point P (x, y) on the two-dimensional region R1′;
determine a three-dimensional vector (LR2, MR2, NR2) of a point on the three-dimensional region R2 corresponding to a point P (x, y) on the two-dimensional region R2′, wherein
LR2 = XiR2*X1 + YiR2*X2 + ZiR2*X3, MR2 = XiR2*Y1 + YiR2*Y2 + ZiR2*Y3, NR2 = XiR2*Z1 + YiR2*Z2 + ZiR2*Z3,
XiR2 = iR2/√(iR2² + jR2² + kR2²), YiR2 = jR2/√(iR2² + jR2² + kR2²), ZiR2 = kR2/√(iR2² + jR2² + kR2²),
iR2 = −cos(beta)*sin(alpha), jR2 = −sin(beta), kR2 = −cos(beta)*cos(alpha), alpha = (x − xd)*π/(xc − xd), beta = −(y − yd)*π/(2*r),
(iR2, jR2, kR2) is a spherical point coordinate corresponding to the point P (x, y) on the two-dimensional region R2′.
According to one example, the expanded image rendering module 740 may first obtain color information of a second point on the inner wall of the lumen directed by a three-dimensional vector corresponding to a third point on the two-dimensional projection plane according to the corresponding relationship between the two-dimensional projection plane and the inner wall of the lumen, and then obtain a two-dimensional expanded image of the three-dimensional lumen image by performing image rendering on the two-dimensional projection plane based on the obtained color information by using three-dimensional volume reconstruction, maximum intensity projection or three-dimensional surface reconstruction.
Taking software implementation as an example, the following will further describe how the device for rendering an expanded lumen image executes the control logic. In this example, the control logic in the present disclosure may be interpreted as computer executable instructions stored in the machine readable storage medium 620. When the processor 610 on the device for rendering an expanded lumen image of the present disclosure executes the control logic, by invoking the machine readable instructions stored in the machine readable storage medium 620, the machine readable instructions may further cause the processor 610 to:
obtain a plurality of viewpoints by sampling a centre line of a lumen based on a three-dimensional lumen image;
establish a spherical projection plane for each of the viewpoints, wherein a point on the spherical projection plane corresponds to a point on an inner wall of the lumen;
determine a corresponding relationship between a two-dimensional projection plane and the inner wall of the lumen according to a corresponding relationship between the spherical projection plane and the two-dimensional projection plane; and
obtain a two-dimensional expanded image of the three-dimensional lumen image by performing image rendering on the two-dimensional projection plane according to the corresponding relationship between the two-dimensional projection plane and the inner wall of the lumen.
According to one example, when establishing a spherical projection plane, the machine readable instructions may cause the processor 610 to:
determine a gazing direction and a local coordinate system of a viewpoint, wherein a Z-axis direction of the local coordinate system is the gazing direction of the viewpoint; and
establish a spherical projection plane by taking the viewpoint as a centre of sphere and R as a radius under the local coordinate system, wherein R is a preset radius value less than or equal to a radius r of the inner wall of the lumen.
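A point on such a spherical projection plane can be constructed as below. The row-vector convention for the local axes and the spherical-angle parameterisation are assumptions for illustration; the disclosure only requires that the sphere be centred at the viewpoint with radius R under the local coordinate system:

```python
import numpy as np

def spherical_projection_point(viewpoint, local_axes, theta, phi, R):
    """Return a point on the spherical projection plane: the sphere of
    radius R centred at `viewpoint`.  `local_axes` is a 3x3 matrix whose
    rows are the X, Y and Z axes of the local coordinate system, the
    Z axis being the gazing direction."""
    # unit direction expressed in the local coordinate system
    d_local = np.array([np.sin(theta) * np.cos(phi),
                        np.sin(theta) * np.sin(phi),
                        np.cos(theta)])
    # transform to world coordinates and scale by the preset radius R
    return np.asarray(viewpoint) + R * (np.asarray(local_axes).T @ d_local)
```

With `theta = 0` the point lies at distance R along the gazing direction, consistent with the Z axis of the local coordinate system being the gazing direction of the viewpoint.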
Further, when determining the gazing direction and the local coordinate system of a viewpoint, the machine readable instructions may further cause the processor 610 to:
determine a gazing direction of a current viewpoint according to positions of viewpoints immediately before and after the current viewpoint on the centre line; and
determine an X-axis direction, a Y-axis direction and a Z-axis direction of a local coordinate system of the current viewpoint according to a connecting line between a previous viewpoint immediately before the current viewpoint and the current viewpoint and a local coordinate system of the previous viewpoint.
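The two determinations above can be sketched together: the gazing direction is taken from the neighbouring viewpoints, and the new frame is obtained by carrying the previous viewpoint's X axis into the plane orthogonal to the new gazing direction. This projection-based frame propagation is one common way to keep consecutive frames consistent and is offered only as an illustrative sketch, not as the disclosure's exact construction:

```python
import numpy as np

def propagate_frame(prev_point, prev_axes, next_point):
    """Build the local frame of the current viewpoint.  The gazing
    direction (Z axis) is estimated from the viewpoints immediately
    before and after the current one; the X and Y axes are derived from
    the previous viewpoint's frame (`prev_axes`, rows = X, Y, Z)."""
    z = np.asarray(next_point, float) - np.asarray(prev_point, float)
    z /= np.linalg.norm(z)                 # new Z axis = gazing direction
    prev_x = np.asarray(prev_axes)[0]
    # project the previous X axis into the plane orthogonal to z
    x = prev_x - np.dot(prev_x, z) * z
    x /= np.linalg.norm(x)
    y = np.cross(z, x)                     # complete a right-handed frame
    return np.stack([x, y, z])
```

Carrying the frame forward in this way avoids abrupt rotations of the expanded image as the viewpoint advances along a curved centre line.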
According to one example, a point on the spherical projection plane corresponding to a point on the inner wall of the lumen may mean that a first point on the spherical projection plane corresponds to a second point on the inner wall of the lumen directed by a three-dimensional vector of the first point, wherein the three-dimensional vector of the first point is a vector from the centre of sphere to the first point.
Further, a corresponding relationship between the spherical projection plane and the two-dimensional projection plane may include: a corresponding relationship between a three-dimensional vector of a first point on the spherical projection plane and a two-dimensional coordinate of a third point on the two-dimensional projection plane.
In such a case, when determining a corresponding relationship between a two-dimensional projection plane and the inner wall of the lumen according to a corresponding relationship between the spherical projection plane and the two-dimensional projection plane, the machine readable instructions may further cause the processor 610 to:
divide the spherical projection plane into three three-dimensional regions, including:
a three-dimensional region R1, which is a spherical surface positioned between an X-axis positive direction and a Z-axis negative direction,
a three-dimensional region R2, which is a spherical surface positioned in an X-axis negative direction, and
a three-dimensional region R3, which is a spherical surface excluding the three-dimensional regions R1 and R2;
determine an image rendering region of the two-dimensional projection plane, wherein the image rendering region may include:
a two-dimensional region R1′ corresponding to the three-dimensional region R1, which is a semicircle with a radius equal to a scaled lumen radius r and positioned on a left side of the image rendering region, wherein the scaled lumen radius r may be obtained by multiplying a radius of the inner wall of the lumen by a preset scaling ratio,
a two-dimensional region R2′ corresponding to the three-dimensional region R2, which is a rectangle with a width equal to πr and a height equal to 2r and positioned in a middle part of the image rendering region, and
a two-dimensional region R3′ corresponding to the three-dimensional region R3, which is a semicircle with radius equal to the scaled lumen radius r and positioned on a right side of the image rendering region; and
determine a three-dimensional vector on one of the three-dimensional regions R1, R2 and R3 corresponding to a point on each of the two-dimensional regions R1′, R2′ and R3′.
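The disclosure does not give closed-form coordinates for the image rendering region, so the following sketch only classifies a two-dimensional point into R1′, R2′ or R3′ under an assumed convention: the origin sits at the centre of the rectangle's left edge, and the rectangle width is taken as n·r with n defaulting to π. Both conventions are assumptions for illustration:

```python
import numpy as np

def classify_region(u, v, r, n=np.pi):
    """Classify a point (u, v) of the image rendering region into R1'
    (left semicircle of radius r), R2' (middle rectangle of width n*r
    and height 2r) or R3' (right semicircle of radius r).  Returns None
    for points outside the rendering region."""
    w = n * r
    if abs(v) > r:
        return None                        # above or below the region
    if 0.0 <= u <= w:
        return "R2'"                       # middle rectangle
    if u < 0.0 and u * u + v * v <= r * r:
        return "R1'"                       # left semicircle
    if u > w and (u - w) ** 2 + v * v <= r * r:
        return "R3'"                       # right semicircle
    return None
```

Once a pixel's region is known, the corresponding three-dimensional vector would be looked up on R1, R2 or R3 accordingly.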
Further, when obtaining a two-dimensional expanded image of the three-dimensional lumen image by performing image rendering on the two-dimensional projection plane according to the corresponding relationship between the two-dimensional projection plane and the inner wall of the lumen, the machine readable instructions may further cause the processor 610 to:
obtain color information of a second point on the inner wall of the lumen directed by a three-dimensional vector corresponding to a third point on the two-dimensional projection plane; and
obtain a two-dimensional expanded image of the three-dimensional lumen image by performing image rendering on the two-dimensional projection plane based on the obtained color information.
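The two blocks above can be sketched as a per-pixel loop: each pixel is mapped to its three-dimensional vector, a ray is marched from the viewpoint along that vector until the inner wall is reached, and the wall's colour is written to the pixel. `pixel_to_vector` and `wall_color` are hypothetical callbacks standing in for the correspondence and the volume data, and the marching step is an assumed parameter:

```python
import numpy as np

def render_expanded_image(shape, pixel_to_vector, viewpoint, wall_color,
                          max_dist=50.0, step=0.5):
    """Render the two-dimensional expanded image: for every pixel, look
    up its corresponding three-dimensional vector, march along that ray
    from the viewpoint, and write the colour of the first wall sample."""
    h, w = shape
    image = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            d = pixel_to_vector(i, j)      # unit vector for this pixel
            t = step
            while t <= max_dist:
                c = wall_color(np.asarray(viewpoint) + t * d)
                if c is not None:          # hit the inner wall
                    image[i, j] = c
                    break
                t += step
    return image
```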
For simplicity and illustrative purposes, the present disclosure is described by referring mainly to an example thereof. In the above description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be readily apparent, however, that the present disclosure may be practiced without limitation to these specific details. In other instances, some methods and structures may not have been described in detail so as not to unnecessarily obscure the present disclosure. As used herein, the terms “a” and “an” are intended to denote at least one of a particular element, the term “includes” means includes but not limited to, the term “including” means including but not limited to, and the term “based on” means based at least in part on.
Specific implementations of the functions and roles of the modules in the above device may be found in the implementations of the corresponding blocks in the above methods, and are not unnecessarily elaborated herein.
The above are only preferred examples of the present disclosure and are not intended to limit the present disclosure. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present disclosure shall fall within the scope of protection of the present disclosure.
The methods, processes and units described herein may be implemented by hardware (including hardware logic circuitry), software or firmware, or a combination thereof. The term ‘processor’ is to be interpreted broadly to include a processing unit, ASIC, logic unit, or programmable gate array, etc. The processes, methods and functional units may all be performed by one or more processors; reference in this disclosure or the claims to a ‘processor’ should thus be interpreted to mean ‘one or more processors’.
Further, the processes, methods and functional units described in this disclosure may be implemented in the form of a computer software product. The computer software product is stored in a storage medium and comprises a plurality of instructions for causing a processor to implement the methods recited in the examples of the present disclosure.
The figures are only illustrations of an example, wherein the units or procedures shown in the figures are not necessarily essential for implementing the present disclosure. Those skilled in the art will understand that the units in the device in the examples can be arranged in the device as described, or can alternatively be located in one or more devices different from those in the examples. The units in the examples described can be combined into one module or further divided into a plurality of sub-units.
Although the flowcharts described show a specific order of execution, the order of execution may differ from that which is depicted. For example, the order of execution of two or more blocks may be changed relative to the order shown. Also, two or more blocks shown in succession may be executed concurrently or with partial concurrence. All such variations are within the scope of the present disclosure.
Throughout the present disclosure, the word “comprise”, or variations such as “comprises” or “comprising”, will be understood to imply the inclusion of a stated element, integer or step, or group of elements, integers or steps, but not the exclusion of any other element, integer or step, or group of elements, integers or steps.
It will be appreciated by persons skilled in the art that numerous variations and/or modifications may be made to the above-described embodiments, without departing from the broad general scope of the present disclosure. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive. Accordingly, other embodiments are within the scope of the following claims.
Number | Date | Country | Kind |
---|---|---|---|
2015107385601 | Nov 2015 | CN | national |