Embodiments of the invention relate generally to a three-dimensional prostate pathological image generating system and a method thereof and, more specifically, to a method and system capable of generating a three-dimensional prostate pathology image by mapping a diagnosis result through a biopsy to a three-dimensional prostate image.
A transrectal ultrasound guided prostate biopsy is most commonly used as a biopsy for diagnosis of prostate cancer.
The transrectal ultrasound guided prostate biopsy is a method of collecting specimens by inserting an ultrasound probe into the rectum to see the shape of the prostate and inserting a specimen collecting needle to collect specimens from the area where the onset of cancer is predicted.
However, the transrectal ultrasound guided prostate biopsy has a disadvantage in that it is not possible to accurately specify from which location in the prostate the specimen was collected, so that the vicinity of the collection site must be resected relatively extensively.
In addition, recently, when performing a biopsy of the prostate, attempts have been made to increase the accuracy of the biopsy by using multiparametric magnetic resonance imaging (mpMRI). However, even in this case, only the approximate site of the cancer may be identified, and it is difficult to specify from which site the specimen was collected and to map it onto a three-dimensional image.
Meanwhile, an examination method for relatively accurately determining the location of the collected specimen has emerged.
A transperineal template prostate mapping biopsy is an examination method in which the location of the needle collecting the specimen may be specified using a tool called a template.
Such a transperineal template prostate mapping biopsy may relatively accurately specify the location where the needle is injected by means of an instrument called a template, but it is limited to specifying only the location where the needle is injected.
Therefore, there is an urgent need for a method and system that may help in treatment by visualizing, on a three-dimensional prostate image, the location of the needle, the location of the collected specimen within the prostate, and the location of prostate cancer expression found as a result of examination of the specimen.
The above information disclosed in this Background section is only for understanding of the background of the inventive concepts, and, therefore, it may contain information that does not constitute prior art.
A method of generating a three-dimensional prostate pathological image and a system therefor according to exemplary embodiments may be helpful in treatment by relatively accurately visualizing the location of a specimen collected through biopsy on a three-dimensional image of the prostate and the location of prostate cancer expression as a result of examination of the specimen.
Additional features of the inventive concepts will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of the inventive concepts.
A method of generating a three-dimensional prostate pathological image according to an embodiment includes specifying, by a three-dimensional prostate pathological image generating system, a three-dimensional prostate image, obtaining, by the three-dimensional prostate pathological image generating system, a digital diagnosis result for each of at least one specimen corresponding to predetermined template coordinates obtained through transperineal template prostate biopsy (TTPB) through a diagnosis system, and displaying, by the three-dimensional prostate pathological image generating system, an onset site of prostate cancer present in the at least one specimen on the three-dimensional prostate image based on template coordinates for each specimen and the digital diagnosis result.
The displaying, by the three-dimensional prostate pathological image generating system, of the onset site of prostate cancer present in the at least one specimen on the three-dimensional prostate image based on the template coordinates for each specimen and the digital diagnosis result may include, specifying, by the three-dimensional prostate pathological image generating system, a specimen location for each specimen on the three-dimensional prostate image based on the template coordinates for each specimen, and specifying, for each specimen, a relative onset location within the specimen of the prostate cancer onset site present in the specimen based on the digital diagnosis result.
The method of generating the three-dimensional prostate pathological image may further include displaying the specimen location for each specimen on the three-dimensional prostate image.
The method of generating the three-dimensional prostate pathological image may further include specifying an estimated onset site based on a plurality of onset sites respectively included in a plurality of different specimens, and displaying the estimated onset site on the three-dimensional prostate image.
The specifying of the estimated onset site based on the plurality of onset sites respectively included in the plurality of different specimens may include, when it is determined that a first individual onset site and a second individual onset site corresponding to a first specimen and a second specimen, respectively, among the plurality of different specimens satisfy a predetermined combining criterion, specifying a tissue between the first individual onset site and the second individual onset site as the estimated onset site.
The combining criterion may be a case where it is determined that the first individual onset site and the second individual onset site corresponding to the first specimen and the second specimen, respectively, satisfy an adjacency criterion according to a predetermined criterion or the first individual onset site and the second individual onset site satisfy a predetermined continuity criterion.
The specifying, for each specimen, of the relative onset location within the specimen of the prostate cancer onset site present in the specimen based on the digital diagnosis result may include, specifying the relative onset location within the specimen based on a specimen pathological image of the specimen displayed in the digital diagnosis result and a location of the prostate cancer onset site within the specimen displayed on the specimen pathological image.
The specifying, by the three-dimensional prostate pathological image generating system, of the specimen location for each specimen on the three-dimensional prostate image based on the template coordinates for each specimen may include, based on a location of a needle on an ultrasound image obtained during the TTPB, specifying a location of the needle on the three-dimensional prostate image, and specifying the specimen location for each specimen based on the location of the needle on the three-dimensional prostate image and a location of a predetermined specimen collecting part on the needle.
The method may be implemented by a computer program stored in a non-transitory computer readable recording medium.
A system for generating a three-dimensional prostate pathological image according to another embodiment includes a processor, and a memory in which a program executed by the processor is recorded, wherein the processor is configured to drive the program to specify a three-dimensional prostate image, obtain a digital diagnosis result for each of at least one specimen corresponding to predetermined template coordinates obtained through transperineal template prostate biopsy (TTPB) through a diagnosis system, and display an onset site of prostate cancer present in the at least one specimen on the three-dimensional prostate image based on template coordinates for each specimen and the digital diagnosis result.
The processor may be configured to drive the program to specify a specimen location for each specimen on the three-dimensional prostate image based on the template coordinates for each specimen, and specify a relative onset location within the specimen of the prostate cancer onset site present in the specimen for each specimen based on the digital diagnosis result.
The processor may be configured to drive the program to specify an estimated onset site based on a plurality of onset sites respectively included in a plurality of different specimens, and display the estimated onset site on the three-dimensional prostate image.
The processor may be configured to drive the program to, when it is determined that a first individual onset site and a second individual onset site corresponding to a first specimen and a second specimen, respectively, among the plurality of different specimens satisfy a predetermined combining criterion, specify a tissue between the first individual onset site and the second individual onset site as the estimated onset site.
The combining criterion may be a case where it is determined that the first individual onset site and the second individual onset site corresponding to the first specimen and the second specimen, respectively, satisfy an adjacency criterion according to a predetermined criterion or the first individual onset site and the second individual onset site satisfy a predetermined continuity criterion.
The processor may be configured to drive the program to specify the relative onset location within the specimen based on a specimen pathological image of the specimen displayed in the digital diagnosis result and a location of the prostate cancer onset site within the specimen displayed on the specimen pathological image.
The processor may be configured to drive the program to, based on a location of a needle on an ultrasound image obtained during the TTPB, specify a location of the needle on the three-dimensional prostate image, and specify the specimen location for each specimen based on the location of the needle on the three-dimensional prostate image and a location of a predetermined specimen collecting part on the needle.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the invention, and together with the description serve to explain the inventive concepts.
In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of various exemplary embodiments or implementations of the invention. As used herein “embodiments” and “implementations” are interchangeable words that are non-limiting examples of devices or methods employing one or more of the inventive concepts disclosed herein. It is apparent, however, that various exemplary embodiments may be practiced without these specific details or with one or more equivalent arrangements. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring various exemplary embodiments. Further, various exemplary embodiments may be different, but do not have to be exclusive. For example, specific shapes, configurations, and characteristics of an exemplary embodiment may be used or implemented in another exemplary embodiment without departing from the inventive concepts.
Unless otherwise specified, the illustrated exemplary embodiments are to be understood as providing exemplary features of varying detail of some ways in which the inventive concepts may be implemented in practice. Therefore, unless otherwise specified, the features, components, modules, layers, films, panels, regions, and/or aspects, etc. (hereinafter individually or collectively referred to as “elements”), of the various embodiments may be otherwise combined, separated, interchanged, and/or rearranged without departing from the inventive concepts.
The use of cross-hatching and/or shading in the accompanying drawings is generally provided to clarify boundaries between adjacent elements. As such, neither the presence nor the absence of cross-hatching or shading conveys or indicates any preference or requirement for particular materials, material properties, dimensions, proportions, commonalities between illustrated elements, and/or any other characteristic, attribute, property, etc., of the elements, unless specified. Further, in the accompanying drawings, the size and relative sizes of elements may be exaggerated for clarity and/or descriptive purposes. When an exemplary embodiment may be implemented differently, a specific process order may be performed differently from the described order. For example, two consecutively described processes may be performed substantially at the same time or performed in an order opposite to the described order. Also, like reference numerals denote like elements.
When an element, such as a layer, is referred to as being “on,” “connected to,” or “coupled to” another element or layer, it may be directly on, connected to, or coupled to the other element or layer or intervening elements or layers may be present. When, however, an element or layer is referred to as being “directly on,” “directly connected to,” or “directly coupled to” another element or layer, there are no intervening elements or layers present. To this end, the term “connected” may refer to physical, electrical, and/or fluid connection, with or without intervening elements. Further, the D1-axis, the D2-axis, and the D3-axis are not limited to three axes of a rectangular coordinate system, such as the x-, y-, and z-axes, and may be interpreted in a broader sense. For example, the D1-axis, the D2-axis, and the D3-axis may be perpendicular to one another, or may represent different directions that are not perpendicular to one another. For the purposes of this disclosure, “at least one of X, Y, and Z” and “at least one selected from the group consisting of X, Y, and Z” may be construed as X only, Y only, Z only, or any combination of two or more of X, Y, and Z, such as, for instance, XYZ, XYY, YZ, and ZZ. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
Although the terms “first,” “second,” etc. may be used herein to describe various types of elements, these elements should not be limited by these terms. These terms are used to distinguish one element from another element. Thus, a first element discussed below could be termed a second element without departing from the teachings of the disclosure.
Spatially relative terms, such as “beneath,” “below,” “under,” “lower,” “above,” “upper,” “over,” “higher,” “side” (e.g., as in “sidewall”), and the like, may be used herein for descriptive purposes, and, thereby, to describe one element's relationship to another element(s) as illustrated in the drawings. Spatially relative terms are intended to encompass different orientations of an apparatus in use, operation, and/or manufacture in addition to the orientation depicted in the drawings. For example, if the apparatus in the drawings is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. Furthermore, the apparatus may be otherwise oriented (e.g., rotated 90 degrees or at other orientations), and, as such, the spatially relative descriptors used herein interpreted accordingly.
The terminology used herein is for the purpose of describing particular embodiments and is not intended to be limiting. As used herein, the singular forms, “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Moreover, the terms “comprises,” “comprising,” “includes,” and/or “including,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, components, and/or groups thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It is also noted that, as used herein, the terms “substantially,” “about,” and other similar terms, are used as terms of approximation and not as terms of degree, and, as such, are utilized to account for inherent deviations in measured, calculated, and/or provided values that would be recognized by one of ordinary skill in the art.
As customary in the field, some exemplary embodiments are described and illustrated in the accompanying drawings in terms of functional blocks, units, and/or modules. Those skilled in the art will appreciate that these blocks, units, and/or modules are physically implemented by electronic (or optical) circuits, such as logic circuits, discrete components, microprocessors, hard-wired circuits, memory elements, wiring connections, and the like, which may be formed using semiconductor-based fabrication techniques or other manufacturing technologies. In the case of the blocks, units, and/or modules being implemented by microprocessors or other similar hardware, they may be programmed and controlled using software (e.g., microcode) to perform various functions discussed herein and may optionally be driven by firmware and/or software. It is also contemplated that each block, unit, and/or module may be implemented by dedicated hardware, or as a combination of dedicated hardware to perform some functions and a processor (e.g., one or more programmed microprocessors and associated circuitry) to perform other functions. Also, each block, unit, and/or module of some exemplary embodiments may be physically separated into two or more interacting and discrete blocks, units, and/or modules without departing from the scope of the inventive concepts. Further, the blocks, units, and/or modules of some exemplary embodiments may be physically combined into more complex blocks, units, and/or modules without departing from the scope of the inventive concepts.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. Terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and should not be interpreted in an idealized or overly formal sense, unless expressly so defined herein.
In addition,
Then, while the injection location of the needle is fixed by the template, one section of the prostate is checked through an ultrasound probe inserted into the rectum, the needle is injected into a site suspected of having prostate cancer, and a specimen of a predetermined size is collected.
In addition, referring to
As shown in
As used herein, the identifier of each injection port is defined as a template coordinate, and the left-right and top-bottom locations in the front view of the prostate at which the needle is injected, i.e., the locations in the x-y plane, may be specified by these template coordinates.
The number of injection ports formed in the template 10 and the intervals at which they are arranged may vary according to exemplary embodiments.
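For illustration only, and not as part of the claimed subject matter, the following is a minimal sketch of how a template coordinate (e.g., a letter-number identifier such as "C4") might be converted to an x-y location in the front view of the template; the grid labeling, the 5 mm port spacing, and the origin at port "A1" are assumptions rather than features of the disclosure.

```python
# Hypothetical mapping of a template coordinate (e.g., "C4") to an x-y location
# in the front view of the template 10. The grid labeling, 5 mm spacing, and
# origin at port "A1" are assumptions; real templates may differ.

TEMPLATE_SPACING_MM = 5.0  # assumed distance between adjacent injection ports


def template_coord_to_xy(coord: str, spacing_mm: float = TEMPLATE_SPACING_MM):
    """Convert a template coordinate like 'C4' to an (x, y) position in mm.

    The column letter is assumed to index x and the row number to index y,
    both counted from an assumed origin at port 'A1'.
    """
    col_letter, row_digits = coord[0].upper(), coord[1:]
    x = (ord(col_letter) - ord("A")) * spacing_mm
    y = (int(row_digits) - 1) * spacing_mm
    return x, y


# Example: under these assumptions, port "C4" maps to (10.0, 15.0).
print(template_coord_to_xy("C4"))
```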
The examiner may select template coordinates suitable for a site from which it is determined that a specimen needs to be collected, using an ultrasound image obtained through an ultrasound probe, and inject the specimen collecting needle 20 into the injection port at the selected template coordinates.
Through this, the location of the needle 20 from which the specimen was collected may be specified relatively accurately compared to the conventional transrectal prostate biopsy method.
However, in such transperineal template prostate biopsy method, although the x, y coordinates are specified in the front view (e.g., the left diagram in
Therefore, according to inventive concepts of the present disclosure, when cancer is expressed at a specific site in a specimen collected through the above-mentioned transperineal template prostate biopsy, it is possible to provide a method and system capable of visually and easily confirming examination results by relatively accurately mapping the expression location of the cancer on a three-dimensional prostate image, which may also be usefully utilized for treatment.
In a method of generating a three-dimensional prostate pathological image according to inventive concepts of the present disclosure, when one or more specimens are collected from the prostate via transperineal template prostate biopsy for examination, it is possible not only to confirm whether or not cancer is expressed in each specimen, but also to specify the site in the specimen in which cancer is expressed and display it on a three-dimensional prostate image.
In particular, when a specimen is collected over a certain length in the longitudinal direction of the needle and cancer is expressed only at a certain site of the specimen, only that site of the specimen is displayed on the three-dimensional prostate image, thereby enabling relatively accurate visual confirmation of the examination results and treatment, and helping to prevent treatment of excessive sites.
Moreover, according to inventive concepts of the present disclosure, the examined site may be easily confirmed on the prostate image by displaying the location of the collected specimen as well as the cancer expression site on the three-dimensional prostate image.
In addition, since the location of cancer expression may be determined relatively accurately even within a specimen, when a plurality of specimens are collected, it is possible, based on the location of cancer expression in each of the plurality of specimens, to reasonably predict and visualize a location where cancer is highly likely to be expressed even though a specimen has not actually been collected there.
A three-dimensional prostate pathological image generating system according to embodiments will be described with reference to
Referring to
The three-dimensional prostate pathological image generating system (hereinafter referred to as ‘generating system’, 100) may implement the inventive concepts of the present disclosure while communicating with a predetermined diagnosis system 200.
In addition, the generating system 100 may receive the examination result of the specimen from the diagnosis system 200, and based on the examination result, may map and display the location of the specimen and the expression location of cancer on the three-dimensional prostate image as described above.
The diagnosis system 200 may be a system that receives a pathological image (e.g., a tissue image) of a specimen and determines in which site prostate cancer is expressed.
The diagnosis system 200 may be a system that determines the expression site of cancer on the entire digital image of the specimen, i.e., on a slide image or on patch images obtained by dividing the slide image into a plurality of patches, through deep learning-based image analysis, and that displays the location of the expression site on the image or outputs the location of the expression site of the cancer.
An example of such a diagnosis system 200 has been disclosed in Korean patent applications (Application No. 10-2016-0168176, System and Method for Medical Diagnosis Using Neural Network), (Application No. 10-2018-0064332, System and Method for Two Phase Diagnosis Using Neural Network) filed by the applicant, and in addition, various diagnosis methods are known that receive a digital image of a tissue and specify a cancer onset site through a deep learning-based diagnosis model.
Although a detailed description of the diagnosis system 200 is omitted to clarify the gist of the present disclosure, an average expert in the art of the present disclosure will be able to easily infer that various methods of receiving a digital image of a specimen to be examined and specifying the onset site of prostate cancer on the digital image may be implemented in the diagnosis system 200.
The diagnosis system 200 may receive, from the examiner, an input of the injection port from which the specimen was collected, i.e., the template coordinates, and may transfer the template coordinates and the digital diagnosis result of the specimen corresponding to those template coordinates to the generating system 100.
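For illustration only, the following sketch shows one hypothetical way a whole-slide image could be divided into patches and passed to a deep learning-based classifier, as referenced above; the patch size, decision threshold, and `patch_model` callable are assumptions, and the actual diagnosis models are those of the referenced disclosures.

```python
import numpy as np


def split_into_patches(slide: np.ndarray, patch_size: int = 256):
    """Divide a whole-slide image array of shape (H, W, 3) into patches.

    Returns a list of ((row, col), patch) pairs; the patch size is an
    assumption, and overlap/padding strategies are omitted for brevity.
    """
    patches = []
    h, w = slide.shape[:2]
    for r in range(0, h - patch_size + 1, patch_size):
        for c in range(0, w - patch_size + 1, patch_size):
            patches.append(((r, c), slide[r:r + patch_size, c:c + patch_size]))
    return patches


def diagnose_slide(slide: np.ndarray, patch_model):
    """Collect the patch origins that an (assumed) classifier marks as cancerous.

    `patch_model` stands in for any deep learning model returning a cancer
    probability per patch; it is not specified by the present disclosure.
    """
    onset_patches = []
    for (r, c), patch in split_into_patches(slide):
        if patch_model(patch) >= 0.5:  # assumed decision threshold
            onset_patches.append((r, c))
    return onset_patches
```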
In addition, the diagnosis system 200 may be provided with a tool for generating a digital image of the specimen by photographing or scanning the collected specimen when the specimen is collected, or may receive a digital image through the tool.
The digital diagnosis result of the specimen may be a pathological image itself in which an onset site of prostate cancer is marked on a digital image corresponding to the specimen. In this case, the generating system 100 may analyze the pathological image to specify the location (e.g., coordinate values, etc.) of the cancer onset site. Alternatively, according to an embodiment, the diagnosis system 200 may specify at least one coordinate value that may indicate the prostate cancer expression site on the digital image of the specimen and transmit information to the generating system 100. These coordinate values may be used as relative locations indicating the onset site of prostate cancer in the corresponding specimen.
According to an embodiment, information (e.g., Gleason score) representing the progress level of prostate cancer by site of expression may be transmitted from the diagnosis system 200 to the generating system 100, and the generating system 100 may visualize the corresponding site differentially according to the progress level of prostate cancer as well as the location of the cancer expression site on the three-dimensional prostate image.
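For illustration only, the following is a minimal sketch, under assumptions, of the kind of per-specimen payload the diagnosis system 200 might transfer to the generating system 100, together with a simple color mapping for differential visualization by progress level; all field names, the Gleason values, and the colors are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class OnsetRegion:
    # Pixel coordinates of one cancer onset site on the specimen's digital
    # pathological image, with an optional progress level (e.g., Gleason score).
    pixel_coords: List[Tuple[int, int]]
    gleason_score: Optional[int] = None


@dataclass
class DigitalDiagnosisResult:
    # Hypothetical per-specimen payload: the template coordinate of the
    # injection port, an optional reference to the marked pathological image,
    # and zero or more onset regions found by the diagnosis system.
    template_coord: str                        # e.g., "C4"
    pathology_image_path: Optional[str] = None
    onset_regions: List[OnsetRegion] = field(default_factory=list)


# Hypothetical color mapping for differential visualization by progress level.
GLEASON_COLORS = {6: "yellow", 7: "orange", 8: "red", 9: "darkred", 10: "darkred"}
```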
Then, the generating system 100 may receive template coordinates for each specimen and a digital diagnosis result corresponding to the specimen, and may specify a location of the corresponding specimen on the three-dimensional prostate image based on the received template coordinates for each specimen.
As for the location of the specimen on the three-dimensional prostate image, the x and y locations in the front view of the template 10 may be specified based on the template coordinates for each specimen. In addition, the location in the depth direction may be estimated based on the length of the needle and the location of the collecting part formed on the needle to collect the specimen.
Depending on embodiments, the ultrasound image captured in a state in which the needle is injected may be analyzed in order to determine the location in the depth direction more accurately. For example, the generating system 100 may additionally receive an ultrasound image in a state in which the needle is injected, and may specify the location of the outer edge of the prostate and the end of the needle in the ultrasound image through image analysis.
Then, by converting the location of the needle analyzed in the ultrasound image into a location on the three-dimensional prostate image, the location of the injected needle in the depth direction may be determined relatively accurately, and when the location of the needle is specified, the location of the specimen collected on the three-dimensional prostate image may be finally specified based on the location of the specimen collecting part formed at a predetermined location in the needle.
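For illustration only, a minimal sketch of how the specimen's start and end points might be placed on the three-dimensional prostate image once the needle location is known, assuming the needle tip has already been converted into the 3D image coordinate system and the specimen collecting part spans a known offset range measured back from the tip; the function names and example values are hypothetical.

```python
import numpy as np


def specimen_segment_3d(needle_tip: np.ndarray,
                        needle_direction: np.ndarray,
                        collect_start_mm: float,
                        collect_end_mm: float):
    """Estimate the specimen's start and end points on the 3D prostate image.

    Assumptions: `needle_tip` is the needle end already expressed in the 3D
    image coordinate system, `needle_direction` points from the template toward
    the tip, and the specimen collecting part spans collect_start_mm to
    collect_end_mm measured back from the tip along the needle axis.
    """
    d = needle_direction / np.linalg.norm(needle_direction)
    start = needle_tip - d * collect_start_mm
    end = needle_tip - d * collect_end_mm
    return start, end


# Example with assumed values: tip at (10, 15, 42) mm, needle along +z,
# collecting part spanning 5 mm to 20 mm behind the tip.
tip = np.array([10.0, 15.0, 42.0])
print(specimen_segment_3d(tip, np.array([0.0, 0.0, 1.0]), 5.0, 20.0))
```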
In addition, the generating system 100 may confirm information about the relative location of the prostate cancer onset site and/or the progress level of cancer based on the digital diagnosis result of the diagnosis system 200.
Then, based on the specified location of the specimen and the relative onset location of the cancer onset site (onset site) within the specimen, the cancer onset site in the corresponding specimen may finally be specified and displayed on the three-dimensional prostate image.
In addition, by performing this process on all specimens collected, the diagnosis results diagnosed through transperineal template prostate biopsy may be visualized on the three-dimensional prostate image.
In addition, according to inventive concepts of the present disclosure, when a plurality of specimens are collected and each specimen has an onset site of cancer, it is possible, based on the location of each specimen and the location of the onset site of cancer in each specimen, to estimate whether or not cancer has developed in a prostate site from which a specimen has not actually been collected. In other words, an estimated onset site may be calculated and provided.
As shown in
The average expert in the technical field of the present disclosure may easily infer that the processor 110 may perform data processing necessary to implement inventive concepts of the present disclosure by driving the above program.
The memory 120 may be a device in which a program for implementing inventive concepts of the present disclosure is stored/installed. According to an embodiment, the memory 120 may be divided into a plurality of different physical devices, and a part of the memory 120 may exist inside the processor 110 according to an embodiment. The memory 120 may be implemented as a hard disk, a GPU, a solid state disk (SSD), an optical disk, a random access memory (RAM), and/or other various types of storage media, depending on the embodiment, and may be implemented to be detachable if necessary.
The generating system 100 may be implemented as an independent physical device for implementing a method of generating a three-dimensional prostate pathological image according to inventive concepts of the present disclosure while communicating with the diagnosis system 200, but is not limited thereto. In some embodiments, the generating system 100 may be implemented with any data processing device (e.g., computer, mobile terminal, etc.) capable of processing data to execute the program.
According to an embodiment, the generating system 100 may be mounted on the diagnosis system 200 and be physically implemented integrally with the diagnosis system 200.
In addition, the average expert in the technical field of the present disclosure may easily infer that the generating system 100 may be provided with the processor 110, the memory 120, various peripheral devices (e.g., input/output devices, display devices, audio devices, etc., 140, 141) provided in the generating system 100, and a communication interface (e.g., a communication bus 130, etc.) to connect these devices.
Meanwhile, the generating system 100 according to embodiments may be implemented by organically combining the program (or software) stored in the memory 120 and the processor 110, and the average expert in the technical field of the present disclosure may easily infer that the function and/or operation performed by the generating system 100 in the present specification hereinafter may be executed by the processor 110.
The three-dimensional prostate image may be received from the outside by the generating system 100 or may be generated by the generating system 100 itself.
Techniques for generating a three-dimensional prostate image are widely known. For example, a three-dimensional image of the subject's prostate may be generated through techniques, such as three-dimensional CT and/or multiparametric magnetic resonance imaging (mpMRI).
The generating system 100 may display a schematic, i.e., an estimated, cancer expression site on the generated three-dimensional image using information acquired through a predetermined diagnosis method (e.g., ultrasound scan, mpMRI, etc.).
The examiner may perform the transperineal template prostate biopsy as described above at the expected site of expression.
A specimen collected through the transperineal template prostate biopsy may be diagnosed through the diagnosis system 200.
The diagnosis system 200 may determine the onset site of prostate cancer in the specimen for each specimen and transmit the digital diagnosis result to the generating system 100. In this case, template coordinates of each specimen may be input to the diagnosis system 200.
The generating system 100 may specify and visualize the location of the specimen and/or the onset site of cancer on the three-dimensional prostate image based on the received diagnosis result for each specimen.
Then, the specimen collecting part formed at a predetermined location (e.g., 21 to 22) of the needle 20 may collect a specimen over a certain length in the longitudinal direction of the needle.
The collected specimen may be converted into a digital pathological image by the diagnosis system 200 or through a separate external device.
The diagnosis system 200, receiving the converted digital pathological image and template coordinates, may generate a digital diagnosis result.
An example of the generated digital diagnosis result may be as shown in
At this time, the areas visualized in yellow and orange are visualized in different colors to distinguish different progress levels of prostate cancer.
The digital diagnosis result may be the image itself in which the onset site of prostate cancer is visualized in the digital pathological image of the specimen, or may be a coordinate value for each pixel corresponding to the onset site of prostate cancer on the digital pathological image and a progress level (e.g., Gleason score) corresponding to each coordinate value.
In any case, the digital diagnosis result may include information capable of specifying which location on the digital pathological image of the specimen is the onset site of prostate cancer.
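For illustration only, assuming the digital diagnosis result is provided as a binary mask of the onset site aligned with the specimen's long axis, the following sketch derives the relative extent of the onset site along the specimen; the mask orientation and helper name are assumptions.

```python
import numpy as np


def relative_onset_extent(onset_mask: np.ndarray):
    """Compute the relative start/end of the onset site along the specimen.

    Assumptions: `onset_mask` is a binary (H, W) mask of the onset site on the
    specimen's digital pathological image, and the specimen's long axis runs
    along the image rows (axis 0). Returns fractions in [0, 1] measured from
    the top of the specimen image, or None if no onset pixels are present.
    """
    rows = np.flatnonzero(onset_mask.any(axis=1))
    if rows.size == 0:
        return None
    height = onset_mask.shape[0]
    return rows[0] / height, (rows[-1] + 1) / height
```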
Then, as shown in
The relative onset location in the specimen may be a start location (Ct) and an end location (Cb) that may specify the location of onset of prostate cancer in the specimen as shown in
In any case, when the location of the specimen is specified on the three-dimensional prostate image, any information that may specify the location of the onset site in the specimen on the three-dimensional prostate image may be the relative onset location.
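For illustration only, a minimal sketch of how such a relative onset location could be mapped onto the three-dimensional prostate image, assuming the specimen is treated as a straight segment between known 3D start and end points and the relative start/end values correspond to the start location (Ct) and end location (Cb) described above.

```python
import numpy as np


def onset_segment_on_3d_image(specimen_start: np.ndarray,
                              specimen_end: np.ndarray,
                              ct_fraction: float,
                              cb_fraction: float):
    """Place the in-specimen onset extent onto the 3D prostate image.

    Assumptions: the specimen is treated as a straight segment between
    `specimen_start` and `specimen_end` in 3D image coordinates, and
    `ct_fraction`/`cb_fraction` are the relative start/end of the onset site
    along that segment (0 at the specimen start, 1 at the specimen end).
    """
    axis = specimen_end - specimen_start
    onset_start = specimen_start + ct_fraction * axis  # corresponds to Ct
    onset_end = specimen_start + cb_fraction * axis    # corresponds to Cb
    return onset_start, onset_end
```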
In addition, as described above, the location of the specimen may be specified according to the template coordinates, the location of the needle 20 in the depth direction, and the location of the specimen collecting part in the needle.
In addition, while the location of the needle 20 in the depth direction may be roughly estimated based on how far the needle has been inserted with respect to the template 10, as described above, the location of the needle 20 may also be determined by the generating system 100 through an ultrasound image obtained when the specimen is collected.
For example, as described above, the location of the needle 20 in the depth direction on the three-dimensional image may be determined more precisely by analyzing the distance difference between the outer edge of the prostate and the tip of the needle 20 in the ultrasound image.
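For illustration only, the following sketch shows one hypothetical way the depth of the needle tip could be estimated from such a distance difference, assuming the prostate outer edge and needle tip have been located in the ultrasound frame by image analysis (not shown) and the millimeter-per-pixel scale and registered edge depth are known; all parameter names and example values are assumptions.

```python
def needle_depth_from_ultrasound(prostate_edge_px: float,
                                 needle_tip_px: float,
                                 mm_per_pixel: float,
                                 prostate_edge_depth_mm: float) -> float:
    """Estimate the needle tip's depth in 3D-image coordinates.

    Assumptions: `prostate_edge_px` and `needle_tip_px` are positions along the
    insertion direction in the ultrasound frame (found by image analysis, not
    shown here), `mm_per_pixel` is the ultrasound scale, and
    `prostate_edge_depth_mm` is the depth of the same prostate outer edge on
    the registered 3D prostate image.
    """
    offset_mm = (needle_tip_px - prostate_edge_px) * mm_per_pixel
    return prostate_edge_depth_mm + offset_mm


# Example with assumed values: tip 120 px beyond the edge at 0.2 mm/px,
# edge located at a depth of 30 mm on the 3D image -> 54.0 mm.
print(needle_depth_from_ultrasound(400, 520, 0.2, 30.0))
```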
Then, as shown in
The generating system 100 may simply visualize only the onset site of prostate cancer on the three-dimensional prostate image, but since the location of the collected specimen 30 may also be important information for diagnosis and treatment, the location of the specimen 30 may be displayed on the three-dimensional prostate image as well.
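For illustration only, the sketches above could be tied together by a loop of the following form; the record structure, the `resolve_specimen_segment` and `extract_fractions` callables, and the `draw_segment` rendering routine are all placeholders rather than features of the disclosure, and `onset_segment_on_3d_image` refers to the earlier sketch.

```python
def map_specimens_to_3d(results, resolve_specimen_segment,
                        extract_fractions, draw_segment):
    """Place every collected specimen and its onset sites on the 3D image.

    Assumptions: `results` are per-specimen records such as the hypothetical
    DigitalDiagnosisResult above; `resolve_specimen_segment` combines the
    template coordinates, ultrasound-derived depth, and collecting-part offsets
    into the specimen's 3D start/end points; `extract_fractions` returns the
    relative (ct, cb) extent of one onset region (e.g., via the mask-based
    sketch); `draw_segment` is whatever routine renders a segment on the
    three-dimensional prostate image.
    """
    for result in results:
        spec_start, spec_end = resolve_specimen_segment(result)
        draw_segment(spec_start, spec_end, kind="specimen")
        for region in result.onset_regions:
            ct, cb = extract_fractions(region)
            onset_start, onset_end = onset_segment_on_3d_image(
                spec_start, spec_end, ct, cb)
            draw_segment(onset_start, onset_end, kind="onset",
                         gleason=region.gleason_score)
```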
In addition, while
Referring to
In addition, the generating system 100 may specify the onset site of prostate cancer for each specimen and visualize it.
According to inventive concepts of the present disclosure, the generating system 100 may simply visualize only the onset sites 41, 42, 43, and 44 of each of the plurality of specimens on the three-dimensional prostate image.
However, inventive concepts are not limited thereto; based on the plurality of onset sites 41, 42, 43, and 44, estimated onset sites (e.g., 50, 51, and 52) may be specified, and these estimated onset sites (e.g., 50, 51, and 52) may be further visualized.
This is because prostate cancer may occur even at sites that are not sampled through transperineal template prostate biopsy, and when there are onset sites 41, 42, 43, and 44 that satisfy predetermined reference conditions, there is a high possibility that prostate cancer has occurred at a predetermined site, i.e., an estimated onset site, that has not been sampled, so it may be very useful to visualize these estimated onset sites as well.
In other words, the generating system 100 may visualize the estimated onset sites separately from the onset sites found by actual diagnosis, or may display combined onset sites including the estimated onset sites 50, 51, and 52 and the actual onset sites 41, 42, 43, and 44 on the three-dimensional prostate image.
The estimated onset site may be a site that satisfies a predetermined reference condition, i.e., a combining criterion.
For example, when it is determined that the first individual onset site (e.g., 41) and the second individual onset site (e.g., 42) corresponding to the first specimen and the second specimen, respectively, among the plurality of different specimens satisfy the predetermined combining criterion, the generating system 100 may specify the tissue (e.g., 50) between the first individual onset site (e.g., 41) and the second individual onset site (e.g., 42) as the estimated onset site.
Alternatively, when it is determined that the first individual onset site (e.g., 42) and the second individual onset site (e.g., 43) satisfy the predetermined combining criterion, the tissue (e.g., 51) between the first individual onset site (e.g., 42) and the second individual onset site (e.g., 43) may be specified as the estimated onset site.
The combining criterion may include a condition that the template coordinates corresponding to each of the first specimen and the second specimen satisfy an adjacency criterion determined by a predetermined criterion.
In other words, when the locations of individual onset sites are adjacent to each other at a certain level or more, there is a high probability of onset in the tissue between the individual onset sites, so such a condition may be included in the above combining criterion.
Whether the locations of individual onset sites are adjacent may be simply determined based on the template coordinates of the specimens corresponding to the individual onset sites, or may be determined based on the distance between actual individual onset sites in three dimensions.
The distance between individual onset sites in three dimensions may be the distance between center points of the individual onset sites, or may be the distance between the nearest outlines after specifying the outline of each of the individual onset sites. The distance between individual onset sites may be defined in various ways, and the degree of adjacency to satisfy the combining criterion may be determined in advance or determined through repeated experiments.
In addition, since it may not be considered that the tissue between the individual onset sites is diseased simply because they are adjacent to each other, whether or not a predetermined continuity criterion between the individual onset sites is satisfied may be further included in the combining criterion.
The continuity criterion may be, for example, that the progression of the disease is the same or differs only within a predetermined range, such as the Gleason scores of two individual onset sites satisfying the adjacency criterion being equal or differing by only about 1. Alternatively, it may be that the individual onset sites fit a predetermined pattern shape. Such pattern shapes may be learned in advance from a large amount of medical data to build patterns of onset-site shapes, and it may be determined that the continuity criterion is satisfied when at least one such pattern shape can include both individual onset sites.
The continuity criterion may be defined in a variety of ways.
In general, when there are two separate onset sites that satisfy the adjacency criterion, resection or treatment is also performed in the area between them, so the continuity criterion may not need to be defined too strictly.
In addition, the combining criterion may be defined to be satisfied if only one of the adjacency criterion or the continuity criterion is satisfied, or may be defined to be satisfied only when both criteria are satisfied.
After all, according to inventive concepts of the present disclosure, visualization is not limited to onset sites confirmed through actual diagnosis on a three-dimensional prostate image; it may also be very useful for diagnosis and treatment to estimate and visualize sites with a high possibility of onset based on the confirmed onset sites.
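For illustration only, the following sketch expresses one hypothetical form of the combining criterion, using the center-to-center distance option for adjacency, a Gleason-score difference of at most 1 for continuity, and a configurable AND/OR combination; the threshold values are assumptions that, as noted above, would be determined in advance or through repeated experiments, and the segment between two centers is only a crude stand-in for the tissue between the individual onset sites.

```python
import numpy as np

ADJACENCY_THRESHOLD_MM = 10.0  # assumed; to be tuned in advance or by experiment


def satisfies_adjacency(center_a: np.ndarray, center_b: np.ndarray,
                        threshold_mm: float = ADJACENCY_THRESHOLD_MM) -> bool:
    # Adjacency judged here by center-to-center distance; template-coordinate
    # adjacency or nearest-outline distance are alternatives noted above.
    return float(np.linalg.norm(center_a - center_b)) <= threshold_mm


def satisfies_continuity(gleason_a: int, gleason_b: int, max_diff: int = 1) -> bool:
    # Continuity judged here by progress level (Gleason scores equal or
    # differing by about 1); a learned pattern-shape check is another option.
    return abs(gleason_a - gleason_b) <= max_diff


def satisfies_combining(center_a, center_b, gleason_a, gleason_b,
                        require_both: bool = True) -> bool:
    # The combining criterion may require both sub-criteria or only one of them.
    adj = satisfies_adjacency(center_a, center_b)
    cont = satisfies_continuity(gleason_a, gleason_b)
    return (adj and cont) if require_both else (adj or cont)


def estimated_onset_between(center_a: np.ndarray, center_b: np.ndarray):
    # Crude stand-in for "the tissue between the two individual onset sites":
    # the straight segment connecting the two centers.
    return center_a, center_b
```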
While
In addition, for convenience of explanation, in the present specification, it has been described that visualization is performed based on whether or not prostate cancer has occurred, but as described above, the onset site may be classified and visualized according to the progress level (e.g., Gleason score) of prostate cancer.
The method of generating a three-dimensional prostate pathological image according to an embodiment of the present disclosure may be implemented as computer readable code on a non-transitory computer readable recording medium. A non-transitory computer-readable recording medium includes all types of recording devices in which data that may be read by a computer system is stored. Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, hard disks, floppy disks, and optical data storage devices. In addition, the computer-readable recording medium may be distributed over computer systems connected through a network, so that computer-readable code may be stored and executed in a distributed manner. In addition, functional programs, codes, and code segments for implementing the present disclosure may be easily inferred by programmers in the art to which the present disclosure belongs.
According to embodiments, inventive concepts may be used for a method of generating a three-dimensional prostate pathological image and a system thereof. In particular, by relatively accurately visualizing, on a three-dimensional image of the prostate, the location of a specimen collected through a biopsy and the location of prostate cancer expression found as a result of examination of the specimen, inventive concepts according to the present disclosure may be usefully applied to diagnosis, treatment, and prognosis.
Although certain exemplary embodiments and implementations have been described herein, other embodiments and modifications will be apparent from this description. Accordingly, the inventive concepts are not limited to such embodiments, but rather to the broader scope of the appended claims and various obvious modifications and equivalent arrangements as would be apparent to a person of ordinary skill in the art.
This application is a National Stage Entry of International Application No. PCT/KR2022/001884, filed on Feb. 8, 2022, and claims priority from and the benefit of Korean Patent Application No. 10-2021-0017521, filed on Feb. 8, 2021, each of which is incorporated by reference for all purposes as if fully set forth herein.