This invention relates to pathology systems and methods that employ automated processes, and more particularly to tissue slide analysis.
Measurement of tissue size, determination of the grossing approach, and tissue inking are all currently performed manually by highly reimbursed, specialized histotechnicians, physician assistants, or pathology residents. Analysis of tissue quality is performed by the pathologist, but it is not reflected in the pathology report and, because of its subjective nature, results in significant variation.
Following the resection of solid tumors, the tissue is removed and (a) measured, (b) processed, (c) grossed, (d) inked, (e) sectioned, (f) stained, (g) read on slides by a pathologist, and (h) documented in a pathology report. By way of background, a typical pathology report 100 is shown in
It is generally desirable to use machine learning and artificial intelligence to automate the routine but very necessary parts of the surgical pathology report while significantly improving efficiency of the surgical pathology lab thereby reducing turnaround time and decreasing cost while increasing quality.
Tissue size is a critical piece of information recorded in the pathology report. Tissue size dictates how the piece of tissue is grossed in terms of the number of tissue blocks and cassettes that are needed to adequately assess the tissue. The number of cassettes, and the subsequent slides per cassette, influence the amount of total tissue that is analyzed by the pathologist and the amount of time that is required to produce and read the slides. This is a balance. Currently, the amount of tissue analyzed (compared to the total volume of tissue removed) is not well reflected in the pathology report. If insufficient tissue is analyzed, serious adverse consequences can occur, for instance a false-negative margin resulting in tumor recurrence in the future. Therefore, maximizing the amount of tissue that can be placed into a cassette/onto a slide in an efficient manner decreases the number of required cassettes while increasing the proportion of tissue analyzed. This results in an overall increase in efficiency and a more regimented, reproducible process. Note that in the report 100 of
Tissue ink is used for at least two purposes, namely (a) to orient the tissue, and (b) to define whether all of the tissue is contained within the tissue section, i.e., no tissue is missing and ink spans the complete border of the tissue. Orientation of the tissue is required when evaluating specimen margins either in real time (from the operating room (OR), or in Mohs surgery) or post-operatively when tissue is being analyzed, in the event that there is tumor at the margin, which would require additional treatment. See the diagram 200 of an exemplary procedure in
The ability to make an accurate histologic diagnosis or to fully identify tissue margins for the presence or absence of a tumor relies on high-quality tissue sections (a very commonly overlooked factor, even for experienced pathologists). Holes or defects (e.g. hole/nick 210 in
This invention overcomes disadvantages of the prior art by providing a system and method that effectively automates the early/preliminary and vital part of the overall pathology procedure that is typically repetitive and labor-intensive, and that fundamentally determines which tissue is actually provided on the diagnostic slides used by downstream pathologists and other practitioners in the treatment of a condition, such as, but not limited to, skin cancer. These tools, including ink recognition, can provide additional information to the pathologist when analyzing digital pathology images. Significantly, this system and method aids in supporting a vital practice guideline, which is to avoid a diagnosis without sufficient information. The system organizes automated input and output data as above to generate a pathology report that can be efficiently and clearly interpreted.
In an illustrative embodiment, a system and method for automation of a pathology process is provided, which includes a processor having trained artificial intelligence (AI) modules operating in association therewith, adapted to receive image data from camera images of whole tissue acquired by a camera assembly and whole slide images (WSIs) of inked and segmented tissue samples. A mask produces image results for tissue with holes and tissue free of holes, and a filter provides filtered image results to the AI modules, which thereby detect tumors and macroarchitecture features. A quality assessment process produces quality score outputs for tumors and macroarchitecture features. A report generator then provides reports with one or more parameters to a user via an interface. More particularly, the report generator automatically creates a pathology report consisting of a written description and a pictorial diagram relative to the images of the tissue. Illustratively, a tissue determination process can determine a size and a description of tissue automatically based upon characteristics in the images acquired by the camera assembly. Additionally, a tissue grossing and inking process can automatically generate a grossing and inking scheme based upon a user input of the desired percentage of tissue margins analyzed and the tissue size. The AI modules can include a tumor inflammation convolutional neural network (CNN), a macroarchitecture hole CNN, a tumor detection graphical neural network (GNN) and a macroarchitecture detection GNN. Additionally, the filters can include at least one of a Sobel filter and a gradient-based filter. An ink detection and orientation process can operate on the filtered images of the tissue samples and deliver results thereof to the report generator. The report generator can receive results from a quality assessment process acting on results from the tumor detection GNN and the macroarchitecture detection GNN. Illustratively, the macroarchitecture can be defined by at least one of holes, fat, edges, dermis and epidermis information. A visual generation and 3D stitching process can provide results to the report generator, and can receive information from a cell nuclei detection process. Outputs of the aforementioned CNNs/GNNs are thereby used to automatically generate a pathology report.
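The following is a minimal structural sketch (in Python) of how such a pipeline could be composed; every class, function name, and threshold here is a hypothetical placeholder standing in for the trained CNN/GNN modules, mask, filter, quality assessment and report generator described above, not an actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, List
import numpy as np


@dataclass
class QualityScores:
    tumor_score: float              # quality/confidence score for tumor regions
    macroarchitecture_score: float  # score for holes, fat, edges, dermis, epidermis


def run_pathology_pipeline(camera_image: np.ndarray,
                           wsi_tiles: List[np.ndarray],
                           tumor_cnn: Callable, hole_cnn: Callable,
                           tumor_gnn: Callable, macro_gnn: Callable) -> dict:
    """Hypothetical end-to-end flow mirroring the modules described above."""
    # 1. Mask each whole-slide tile into tissue-with-holes vs. hole-free results
    #    (toy brightness threshold in place of the real masking step).
    tissue_masks = [tile.mean(axis=-1) < 220 for tile in wsi_tiles]

    # 2. Apply a gradient-based filter before feeding the AI modules.
    filtered = [np.hypot(*np.gradient(tile.mean(axis=-1))) for tile in wsi_tiles]

    # 3. CNN/GNN modules (passed in as callables) detect tumor and
    #    macroarchitecture features on the filtered tiles.
    tumor_preds = [tumor_gnn(tumor_cnn(t)) for t in filtered]
    macro_preds = [macro_gnn(hole_cnn(t)) for t in filtered]

    # 4. Quality assessment reduces the predictions to per-specimen scores.
    scores = QualityScores(
        tumor_score=float(np.mean([np.max(p) for p in tumor_preds])),
        macroarchitecture_score=float(np.mean([np.mean(m) for m in macro_preds])),
    )

    # 5. The report generator packages the written description, diagrams and
    #    scores for presentation to the user via the interface.
    return {"camera_image_shape": camera_image.shape,
            "sections_analyzed": len(wsi_tiles),
            "tissue_masks": tissue_masks,
            "quality": scores}
```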
The invention description below refers to the accompanying drawings, of which:
With further reference to the generalized computing environment described in
With reference to the flow diagram 400 of
The system and method includes a process that determines the ink color scheme for a piece of tissue 510 that has been surgically removed. The system and method thereby defines the grossing and inking scheme (procedure 200) based on the size and shape of the tissue and the percentage of margin desired to be analyzed. The tissue is then inked 512 accordingly (See the process 500 of
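By way of a hedged illustration only, a grossing and inking scheme of this kind could be derived as in the following Python sketch; the section spacing, cassette capacity, margin names and ink colors are illustrative assumptions and not taken from the disclosure.

```python
import math

# Illustrative ink palette; actual color conventions vary by laboratory.
INK_COLORS = ["black", "blue", "green", "red", "yellow", "orange"]


def grossing_and_inking_scheme(length_mm: float, width_mm: float,
                               desired_margin_pct: float,
                               section_spacing_mm: float = 3.0,
                               sections_per_cassette: int = 2) -> dict:
    """Sketch: derive the number of breadloaf sections, cassettes and per-margin
    ink colors from tissue size and the desired percentage of margin analyzed."""
    # Take more sections as the requested margin coverage increases.
    full_coverage_sections = math.ceil(length_mm / section_spacing_mm)
    n_sections = max(1, math.ceil(full_coverage_sections * desired_margin_pct / 100.0))

    # Assign one color per anatomic margin so orientation is preserved on the slide.
    margins = ["superior", "inferior", "medial", "lateral", "deep"]
    ink_map = {m: INK_COLORS[i % len(INK_COLORS)] for i, m in enumerate(margins)}

    return {"tissue_size_mm": (length_mm, width_mm),
            "sections": n_sections,
            "cassettes": math.ceil(n_sections / sections_per_cassette),
            "ink_map": ink_map}


# Example: a 22 x 15 mm specimen with 100% of the margins to be analyzed.
print(grossing_and_inking_scheme(22.0, 15.0, 100.0))
```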
Slide quality is assessed by the system and method by identifying tissue artifacts, tears and holes which may cause the surgical pathologist to unknowingly miss diagnostically important regions of tissue, weighted by their prognostic significance and contextualized by information presented in adjacent sections (3D). First, tissue is detected using a combination of advanced filtration techniques/morphometric operations, which also serves to identify and flag potential holes and tears in the tissue. Since these low-intensity areas of the image can be confused with regions of fat, wispy dermis, edema, cystic cavities or lumina, the computing procedure separately applies a set of deep learning algorithms to distinguish these holes from other possible conflating regions by informing the prediction with contextual cues in the surrounding tissue macroarchitecture.
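A minimal sketch of this filtration/morphometric stage is given below using scikit-image; the thresholds and structuring-element sizes are illustrative assumptions, and the deep-learning disambiguation step is only indicated by a comment.

```python
import numpy as np
from scipy import ndimage
from skimage import color, filters, morphology


def detect_tissue_and_candidate_holes(rgb_tile: np.ndarray):
    """Return a tissue mask and a mask of candidate holes/tears inside it."""
    gray = color.rgb2gray(rgb_tile)

    # 1. Separate tissue (dark) from background (bright) with an Otsu threshold.
    tissue = gray < filters.threshold_otsu(gray)

    # 2. Morphometric cleanup: drop small specks and close small gaps.
    tissue = morphology.remove_small_objects(tissue, min_size=500)
    tissue = morphology.binary_closing(tissue, morphology.disk(3))

    # 3. Candidate holes/tears are low-stain regions fully enclosed by tissue.
    filled = ndimage.binary_fill_holes(tissue)
    candidate_holes = filled & ~tissue
    candidate_holes = morphology.remove_small_objects(candidate_holes, min_size=100)

    # 4. These candidates are then passed to the deep-learning stage, which uses
    #    surrounding macroarchitecture to distinguish true artifacts from fat,
    #    wispy dermis, edema, cystic cavities or lumina (not implemented here).
    return tissue, candidate_holes
```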
With reference to the color imagery diagram 700 of
With predictions produced by the system and method, persistent homology and Mapper subroutines from topological data analysis are used to both (a) process the shape of the tissue for overall metrics of missing tissue and (b) develop/generate a score for each section reflecting how much the tissue artifacts intermingle with prognostically important regions of tissue (See box 730 in
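As one hedged sketch of how a Mapper-style decomposition could produce such an intermingling score, the following example uses the open-source KeplerMapper package on a toy per-tile feature table; the lens, cover parameters, thresholds and scoring rule are assumptions rather than the disclosed algorithm.

```python
import numpy as np
import kmapper as km
from sklearn.cluster import DBSCAN

# Toy input: one row per WSI tile -> (x, y, hole_fraction, tumor_probability).
# In practice these values come from the filtration step and the CNN/GNN outputs.
tiles = np.random.rand(500, 4)

mapper = km.KeplerMapper(verbose=0)
lens = mapper.fit_transform(tiles, projection=[0, 1])  # spatial (x, y) lens
graph = mapper.map(lens, tiles,
                   cover=km.Cover(n_cubes=10, perc_overlap=0.3),
                   clusterer=DBSCAN(eps=0.2, min_samples=5))


def intermingling_score(graph, tiles, hole_thr=0.5, tumor_thr=0.5):
    """Fraction of overlapping Mapper ROIs in which hole-rich tiles co-occur
    with prognostically important (tumor-rich) tiles."""
    mixed = 0
    for members in graph["nodes"].values():
        node = tiles[members]
        if (node[:, 2] > hole_thr).any() and (node[:, 3] > tumor_thr).any():
            mixed += 1
    return mixed / max(len(graph["nodes"]), 1)


print("intermingling score:", intermingling_score(graph, tiles))
```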
The system and method can employ Topological Data Analysis (TDA) so as to reveal key relationships in the data (in this case, morphological and spatial information encoded in each of the WSI subimages) by collapsing irrelevant structures and predicting how tissue is expected to be distributed versus what is observed. The Mapper process/algorithm, a smart TDA clustering technique, decomposes the WSI into overlapping Regions of Interest (ROI) that are representative of different tissue subcompartments, and forms weighted connections between ROIs to portray shared information content and important functional relationships. The system and method applies these processes/algorithms to calculate the intermingling of holes with the surrounding tissue architecture, i.e., where artifacts are located in the slide with respect to important architectural components (importance assignment). The system and method further uses area calculations to quantitate the total amount of holes (the amount of poor quality), and compares TDA measurements across sections to assess the likelihood of a tumor being located where the hole is in the current section (importance assignment; e.g., if a hole is in the dermis of the present section but an adjacent section contains tumor in the dermis). The quality of a particular tissue may be assessed given the following mathematical relationship:
Where the importance of each region and the weight given to adjacent sections at a particular distance are determined via expert knowledge. Note that these approaches can be applied to any tissue type, assuming the CNN/GNN is trained on expert ground truth and/or on specialized staining patterns.
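One illustrative way to write a score of this kind, assuming the weighted-sum structure just described (region importance, hole area, and distance-weighted adjacent sections; this particular form is a sketch and not the exact disclosed relationship), is:

```latex
Q \;=\; 1 \;-\;
\frac{\sum_{s \in \mathcal{S}} w(d_s) \sum_{r \in \mathcal{R}} I_r \, A^{\mathrm{hole}}_{r,s}}
     {\sum_{s \in \mathcal{S}} w(d_s) \sum_{r \in \mathcal{R}} I_r \, A^{\mathrm{tissue}}_{r,s}}
```

where S is the set consisting of the current and adjacent sections, d_s is the distance of section s from the current section, w(d_s) is the expert-assigned weight at that distance, R is the set of architectural regions (e.g. epidermis, dermis, fat), I_r is the expert-assigned importance of region r, and A^hole and A^tissue are, respectively, the hole area and total tissue area of region r in section s. Q approaches 1 for artifact-free tissue.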
Reference is made to
It should be clear that the above-described system and method provides an effective and useful tool for automating the overall pathology process for a range of WSI samples. It provides the practitioner with objectively scored results that assist in making further determinations relative to diagnosis of conditions, such as cancer. The techniques herein can be refined continuously by further training of applicable AI algorithms and can be applied to an increasing range of medical conditions and parts of the body. Results can be delivered in a manner that provides revenue to the system operators, and can be delivered to any region or device via existing communications networks, including highly remote areas that may lack sophisticated equipment or facilities.
The foregoing has been a detailed description of illustrative embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope of this invention. Features of each of the various embodiments described above may be combined with features of other described embodiments as appropriate in order to provide a multiplicity of feature combinations in associated new embodiments. Furthermore, while the foregoing describes a number of separate embodiments of the apparatus and method of the present invention, what has been described herein is merely illustrative of the application of the principles of the present invention. For example, the use of various colors and color-codings in images is exemplary of a wide range of possible indicia for distinguishing between characteristics in a display or physical structure. Grayscale shading and/or use of non-visible wavelengths, or other characteristics, can be used to distinguish such items, in a manner clear to those of skill. Also, as used herein, various directional and orientational terms (and grammatical variations thereof) such as “vertical”, “horizontal”, “up”, “down”, “bottom”, “top”, “side”, “front”, “rear”, “left”, “right”, “forward”, “rearward”, and the like, are used only as relative conventions and not as absolute orientations with respect to a fixed coordinate system, such as the acting direction of gravity. Additionally, where the term “substantially” or “approximately” is employed with respect to a given measurement, value or characteristic, it refers to a quantity that is within a normal operating range to achieve desired results, but that includes some variability due to inherent inaccuracy and error within the allowed tolerances (e.g. 1-2%) of the system. Note also, as used herein the terms “process” and/or “processor” should be taken broadly to include a variety of electronic hardware and/or software based functions and components. Moreover, a depicted process or processor can be combined with other processes and/or processors or divided into various sub-processes or processors. Such sub-processes and/or sub-processors can be variously combined according to embodiments herein. Accordingly, this description is meant to be taken only by way of example, and not to otherwise limit the scope of this invention.
This application claims the benefit of co-pending U.S. Provisional Application Ser. No. 63/176,333, entitled SYSTEM AND METHOD FOR AUTOMATION OF SURGICAL PATHOLOGY PROCESSES USING ARTIFICIAL INTELLIGENCE, filed Apr. 18, 2021, the teachings of which are expressly incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3975762 | van den Bosch | Aug 1976 | A |
5075214 | Connor | Dec 1991 | A |
5218529 | Meyer | Jun 1993 | A |
5544650 | Boon | Aug 1996 | A |
5784162 | Cabib | Jul 1998 | A |
5976885 | Cohenford | Nov 1999 | A |
5991028 | Cabib | Nov 1999 | A |
6146897 | Cohenford | Nov 2000 | A |
6463438 | Veltri | Oct 2002 | B1 |
6690817 | Cabib | Feb 2004 | B1 |
7693334 | Ogura | Apr 2010 | B2 |
8351676 | Dai | Jan 2013 | B2 |
9518982 | Sood | Dec 2016 | B2 |
9786050 | Bhargava | Oct 2017 | B2 |
9983195 | King | May 2018 | B2 |
10013760 | Bhargava | Jul 2018 | B2 |
10013781 | Gammage | Jul 2018 | B1 |
10521901 | Ikemoto | Dec 2019 | B2 |
10935773 | Johnson | Mar 2021 | B2 |
11504103 | Johnson | Nov 2022 | B2 |
11508045 | Amthor | Nov 2022 | B1 |
11526984 | Barnes | Dec 2022 | B2 |
11614610 | Johnson | Mar 2023 | B2 |
11620751 | Sarkar | Apr 2023 | B2 |
11621058 | Gurcan | Apr 2023 | B2 |
11631171 | Leng | Apr 2023 | B2 |
11633146 | Leng | Apr 2023 | B2 |
11672425 | Pyun | Jun 2023 | B2 |
11675178 | Wirch | Jun 2023 | B2 |
11681418 | Wirch | Jun 2023 | B2 |
11684264 | Bryant-Greenwood | Jun 2023 | B2 |
11751903 | Knowlton | Sep 2023 | B2 |
11751904 | Knowlton | Sep 2023 | B2 |
11759231 | Knowlton | Sep 2023 | B2 |
11776124 | Behrooz | Oct 2023 | B1 |
11776681 | Godrich | Oct 2023 | B2 |
11783603 | Stumpe | Oct 2023 | B2 |
20010018659 | Koritzinsky | Aug 2001 | A1 |
20030026762 | Malmros | Feb 2003 | A1 |
20060210153 | Sara | Sep 2006 | A1 |
20070127022 | Cohen | Jun 2007 | A1 |
20070135999 | Kolatt | Jun 2007 | A1 |
20080015448 | Keely | Jan 2008 | A1 |
20080166035 | Qian | Jul 2008 | A1 |
20080273199 | Maier | Nov 2008 | A1 |
20080319324 | Maier | Dec 2008 | A1 |
20090002702 | Maier | Jan 2009 | A1 |
20090024375 | Kremer | Jan 2009 | A1 |
20090319291 | Noordvyk | Dec 2009 | A1 |
20110080581 | Bhargava | Apr 2011 | A1 |
20110182490 | Hoyt | Jul 2011 | A1 |
20110286654 | Krishnan | Nov 2011 | A1 |
20120034647 | Herzog | Feb 2012 | A1 |
20120052063 | Bhargava | Mar 2012 | A1 |
20120092663 | Kull | Apr 2012 | A1 |
20120143082 | Notingher | Jun 2012 | A1 |
20120200694 | Garsha | Aug 2012 | A1 |
20120212733 | Kodali | Aug 2012 | A1 |
20120226644 | Jin | Sep 2012 | A1 |
20120290607 | Bhargava | Nov 2012 | A1 |
20130022250 | Nygaard | Jan 2013 | A1 |
20140235487 | McDevitt | Aug 2014 | A1 |
20140270457 | Bhargava | Sep 2014 | A1 |
20140336261 | Chin | Nov 2014 | A1 |
20150268226 | Bhargava | Sep 2015 | A1 |
20150374306 | Gelbman | Dec 2015 | A1 |
20160042511 | Chukka | Feb 2016 | A1 |
20160272934 | Chander | Sep 2016 | A1 |
20160335478 | Bredno | Nov 2016 | A1 |
20170160171 | Tsujikawa | Jun 2017 | A1 |
20170169567 | Chefd'hotel | Jun 2017 | A1 |
20170322124 | Barnes | Nov 2017 | A1 |
20170358082 | Bhargava | Dec 2017 | A1 |
20170372471 | Eurèn | Dec 2017 | A1 |
20180232883 | Sethi | Aug 2018 | A1 |
20190188446 | Wu | Jun 2019 | A1 |
20200302603 | Barnes | Sep 2020 | A1 |
20200372235 | Peng | Nov 2020 | A1 |
20200372635 | Veidman | Nov 2020 | A1 |
20200394825 | Stumpe | Dec 2020 | A1 |
20210103797 | Jang | Apr 2021 | A1 |
20210295507 | Nie | Sep 2021 | A1 |
20220146418 | Bauer | May 2022 | A1 |
20230249175 | Linnes | Aug 2023 | A1 |
20230279512 | Masters | Sep 2023 | A1 |
20230394716 | de Haan | Dec 2023 | A1 |
Number | Date | Country |
---|---|---|
2020180755 | Sep 2020 | WO |
Entry |
---|
Akbar, 2019, Scientific Reports, pp. 1-9. |
Arunachalam, 2019, PLOS ONE, pp. 1-19. |
Corvo, 2017 IEEE Workshop on Visual Analytics in Healthcare, pp. 77-83. |
McCann, IEEE Signal Processing Magazine, 2015, pp. 78-87. |
Rivenson, 2020, BMEF, pp. 1-11. |
Taieb, 2019, ArXiv, pp. 1-58. |
Grala, 2009, pp. 587-592. |
U.S. Appl. No. 16/679,133, entitled System and Method for Analyzing Cytological Tissue Preparations, Louis J. Vaickus, filed Nov. 8, 2018. |
Number | Date | Country | |
---|---|---|---|
20220375604 A1 | Nov 2022 | US |
Number | Date | Country | |
---|---|---|---|
63176333 | Apr 2021 | US |