Aspects of this disclosure are generally related to human-machine interfaces, and more particularly to cursors.
The typical arrow-shaped cursor presented by a computer operating system is zero-dimensional. A zero-dimensional cursor designates the location of a single point in a space such as a two-dimensional window presented on a monitor. Mouse buttons can be used in combination with movement of the cursor to select objects in the two-dimensional space, but at any given instant of time a zero-dimensional cursor position designates only a single point in space.
The current standard for diagnostic radiologists reviewing computed tomography (CT) or magnetic resonance imaging (MRI) studies is a slice-by-slice method. A conventional keyboard, monitor, and mouse with a zero-dimensional cursor are used for manipulating the images. The use of mouse buttons and cursor movement for manipulating the images can become burdensome. For example, many images are included in radiology studies that are performed for the follow-up of cancer to determine the response to treatment. The ability to recognize and analyze differences between images can be important. As an example, the recent Investigation of Serial Studies to Predict Your Therapeutic Response with Imaging and Molecular Analysis (I-SPY) trial tracked changes in tumors over multiple magnetic resonance imaging (MRI) scans during the administration of neoadjuvant chemotherapy (NACT). It has been noted that the phenotypic appearance (e.g., shape, margins) of a tumor correlated with the pathologic response to NACT. A more efficient and accurate interface for manipulating and presenting medical images would therefore have utility.
Known techniques for 3D viewing of medical images are described in U.S. Pat. No. 9,349,183, Method and Apparatus for Three Dimensional Viewing of Images, issued to Douglas; U.S. Pat. No. 8,384,771, Method and Apparatus for Three Dimensional Viewing of Images, issued to Douglas; Douglas, D. B., Petricoin, E. F., Liotta, L., Wilson, E. D3D augmented reality imaging system: proof of concept in mammography. Med Devices (Auckl), 2016; 9:277-83; Douglas, D. B., Boone, J. M., Petricoin, E., Liotta, L., Wilson, E. Augmented Reality Imaging System: 3D Viewing of a Breast Cancer. J Nat Sci. 2016; 2(9); and Douglas, D. B., Wilke, C. A., Gibson, J. D., Boone, J. M., Wintermark, M. Augmented Reality: Advances in Diagnostic Imaging. Multimodal Technologies and Interaction, 2017; 1(4):29. In D3D imaging, the radiologist wears an augmented reality (AR), mixed reality (MR) or virtual reality (VR) headset and uses a joystick or gaming controller. Advantages include improved depth perception and an improved human-machine interface. Still, several challenges are faced with this approach. First, an area of interest (e.g., a tumor) may be in close proximity to structures that are similar in composition/density, so isolating the area of interest for better examination may be difficult. Second, many soft tissues in the body are mobile and deformable, so it can be difficult to achieve the best orientation to properly compare the tumor at multiple time points, and efficiently aligning the orientation to do so may be difficult. Third, certain portions of a tumor can respond to treatment and decrease in size while other portions of the same tumor increase in size. The pattern of tumor shrinkage has important prognostic implications. Furthermore, composition and complex morphologic features including spiculations (spikes extending from the surface), irregular margins and enhancement also have important implications. Consequently, there is a need for a system that facilitates recognition of subtle, yet important, changes in size, shape and margins. Fourth, a patient with metastatic cancer has several areas of interest in different areas of the body, and it is difficult and time consuming to find each area of interest at every time point to determine interval change. Consequently, there is a need for a system that enables the observer to do this efficiently.
All examples, aspects and features mentioned in this document can be combined in any technically possible way.
In accordance with an aspect of the invention a method comprises: generating a three-dimensional cursor that has a non-zero volume; responsive to a first input, moving the three-dimensional cursor within a three-dimensional image; responsive to a second input, selecting a volume of the three-dimensional image designated by the three-dimensional cursor; and responsive to a third input, presenting a modified version of the selected volume of the three-dimensional image. In some implementations presenting the modified version of the selected volume of the three-dimensional image comprises removing an un-selected volume of the three-dimensional image from view. In some implementations presenting the modified version of the selected volume of the three-dimensional image comprises changing transparency of presented tissues within the selected volume. In some implementations presenting the modified version of the selected volume of the three-dimensional image comprises filtering a selected tissue to remove the selected tissue from view. Some implementations comprise presenting the three-dimensional cursor with measurement markings on at least one edge, surface or side. In some implementations presenting the modified version of the selected volume of the three-dimensional image comprises presenting inputted location indicators. In some implementations presenting the modified version of the selected volume of the three-dimensional image comprises presenting inputted annotations. Some implementations comprise changing a size dimension of the three-dimensional cursor responsive to a fourth input. Some implementations comprise changing a geometric shape of the three-dimensional cursor responsive to a fifth input. Some implementations comprise automatically generating a statistical representation of the selected volume of the three-dimensional image. In some implementations presenting the modified version of the selected volume of the three-dimensional image comprises presenting at least one tissue type with false color. In some implementations presenting the modified version of the selected volume of the three-dimensional image comprises presenting volumetric changes over time with false color. Some implementations comprise presenting multiple computed tomography images associated with the selected volume using reference lines. Some implementations comprise presenting multiple axial computed tomography images associated with the selected volume using reference lines. Some implementations comprise presenting a maximum intensity projection (MIP) image of a positron emission tomography (PET) scan with the three-dimensional cursor overlaid thereon to indicate orientation and location of the selected volume. Some implementations comprise presenting a radiology report enhanced with information obtained using the three-dimensional cursor. Some implementations comprise automatically calculating and presenting a quantitative analysis and a qualitative analysis associated with multiple time points. In some implementations presenting the modified version of the selected volume of the three-dimensional image comprises presenting inputted registration markers. Some implementations comprise automatically calculating volumetric change based on the registration markers. Some implementations comprise automatically re-orienting the selected volume of the three-dimensional image based on the registration markers.
Some implementations comprise using multiple volumes selected with the three-dimensional cursor to designate a pre-operative planning pathway for guiding surgical intervention. Some implementations comprise presenting the selected volume with an augmented reality, virtual reality or mixed reality headset.
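By way of illustration only, the following is a minimal sketch of the core method recited above, assuming the three-dimensional image is stored as a NumPy voxel array; the names `Cursor3D`, `move`, `select` and `present_selected_only` are hypothetical and are not taken from the disclosure.

```python
import numpy as np

class Cursor3D:
    """Minimal volumetric (non-zero volume) cursor: an axis-aligned cuboid
    defined by a corner position and edge lengths, in voxel units."""

    def __init__(self, position, size):
        self.position = np.array(position, dtype=int)  # (z, y, x) corner
        self.size = np.array(size, dtype=int)          # (dz, dy, dx) edges

    def move(self, delta):
        """First input: translate the cursor within the 3D image."""
        self.position += np.array(delta, dtype=int)

    def select(self, volume):
        """Second input: return the sub-volume designated by the cursor."""
        z, y, x = self.position
        dz, dy, dx = self.size
        return volume[z:z + dz, y:y + dy, x:x + dx]

def present_selected_only(volume, cursor, background=0):
    """Third input: a modified presentation in which the un-selected
    volume is removed from view (replaced by a background value)."""
    out = np.full_like(volume, background)
    z, y, x = cursor.position
    dz, dy, dx = cursor.size
    out[z:z + dz, y:y + dy, x:x + dx] = cursor.select(volume)
    return out

# Usage: move the cursor onto a region, select it, render the selection only.
ct = np.random.randint(-1000, 3000, size=(64, 64, 64))  # stand-in CT volume
cur = Cursor3D(position=(10, 10, 10), size=(16, 16, 16))
cur.move((4, 0, 2))
isolated = present_selected_only(ct, cur)
```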
In accordance with an aspect of the invention an apparatus comprises: a computing device; and a human-machine interface comprising a three-dimensional cursor that has a non-zero volume; the human-machine interface moving the three-dimensional cursor within a three-dimensional image responsive to a first input; the human-machine interface selecting a volume of the three-dimensional image designated by the three-dimensional cursor responsive to a second input; and the human-machine interface presenting a modified version of the selected volume of the three-dimensional image responsive to a third input. In some implementations, the human-machine interface removes an un-selected volume of the three-dimensional image from view. In some implementations, the human-machine interface changes transparency of presented tissues within the selected volume. In some implementations, the human-machine interface filters a selected tissue to remove the selected tissue from view. In some implementations, the human-machine interface presents the three-dimensional cursor with measurement markings on at least one edge, surface or side. In some implementations, the human-machine interface receives and implements inputted location indicators. In some implementations, the human-machine interface receives and implements inputted annotations. In some implementations, the human-machine interface changes a size dimension of the three-dimensional cursor responsive to a fourth input. In some implementations, the human-machine interface changes a geometric shape of the three-dimensional cursor responsive to a fifth input. In some implementations, the human-machine interface automatically generates and presents a statistical representation of the selected volume of the three-dimensional image. In some implementations, the human-machine interface presents at least one tissue type with false color. In some implementations, the human-machine interface presents volumetric changes over time with false color. In some implementations, the human-machine interface presents multiple computed tomography images associated with the selected volume using reference lines. In some implementations, the human-machine interface presents multiple axial computed tomography images associated with the selected volume using reference lines. In some implementations, the human-machine interface presents a maximum intensity projection (MIP) image of a positron emission tomography (PET) scan with the three-dimensional cursor overlaid thereon to indicate orientation and location of the selected volume. In some implementations, the human-machine interface presents a radiology report enhanced with information obtained using the three-dimensional cursor. In some implementations, the human-machine interface automatically calculates and presents a quantitative analysis and a qualitative analysis associated with multiple time points. In some implementations, the human-machine interface presents inputted registration markers. In some implementations, the human-machine interface automatically calculates volumetric change after appropriate registration using the registration markers. In some implementations, the human-machine interface automatically re-orients the selected volume of the three-dimensional image based on the registration markers. In some implementations, the human-machine interface presents multiple volumes selected with the three-dimensional cursor to designate a pre-operative planning pathway for guiding surgical intervention. 
In some implementations, the human-machine interface presents the selected volume with an augmented reality, virtual reality or mixed reality headset.
Some aspects, features and implementations described herein may include machines such as computers, electronic components, radiological components, optical components, and processes such as computer-implemented steps. It will be apparent to those of ordinary skill in the art that the computer-implemented steps may be stored as computer-executable instructions on a non-transitory computer-readable medium. Furthermore, it will be understood by those of ordinary skill in the art that the computer-executable instructions may be executed on a variety of tangible processor devices. For ease of exposition, not every step, device or component that may be part of a computer or data storage system is described herein. Those of ordinary skill in the art will recognize such steps, devices and components in view of the teachings of the present disclosure and the knowledge generally available to those of ordinary skill in the art. The corresponding machines and processes are therefore enabled and within the scope of the disclosure.
Transparency modification and tissue filtering facilitate presentation of certain tissue types of concern, both within the cursor and outside of the cursor. Currently, the medical professional must see through any tissue that lies within the cursor but external to the tissue type of concern, as viewed from the medical professional's viewing point, which degrades the visibility of the tissue of concern. The illustrated improvements enable the medical professional to change the transparency of any tissue within the cursor-defined volume but external to the tissue type of concern. Alternatively, tissue types not of concern are subtracted from the volume contained within the interactive 3D cursor, leaving only the tissue of concern in the presented image. Multiple interactive 3D cursors can be used in combination to obtain varying patterns of tissue subtraction. This helps to overcome the limitation of degraded visibility caused by tissue that lies within the cursor but external to the tissue type of concern. One hypothetical way such transparency modification and subtraction could be computed is sketched below.
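The following minimal sketch shows one way transparency modification and tissue subtraction could operate on a cursor-selected CT sub-volume. The Hounsfield-unit ranges and the names `tissue_mask`, `transparency_map` and `subtract_tissue` are assumptions for illustration, not values or functions from the disclosure.

```python
import numpy as np

# Illustrative Hounsfield-unit ranges; the thresholds are assumptions.
TISSUE_HU = {"fat": (-120, -60), "soft_tissue": (20, 80), "bone": (300, 3000)}

def tissue_mask(sub_volume, tissue):
    """Boolean mask of voxels whose intensity falls in the tissue's HU range."""
    lo, hi = TISSUE_HU[tissue]
    return (sub_volume >= lo) & (sub_volume <= hi)

def transparency_map(sub_volume, tissue_of_concern, alpha_other=0.1):
    """Per-voxel opacity: the tissue of concern stays opaque, everything else
    inside the cursor-defined volume is rendered nearly transparent."""
    return np.where(tissue_mask(sub_volume, tissue_of_concern), 1.0, alpha_other)

def subtract_tissue(sub_volume, tissue_not_of_concern, background=-1000):
    """Tissue subtraction: remove a tissue type from view entirely,
    leaving only the remaining tissues in the presented image."""
    return np.where(tissue_mask(sub_volume, tissue_not_of_concern),
                    background, sub_volume)

# Usage on a cursor-selected sub-volume `sel` (e.g., obtained via the 3D cursor):
#   alpha = transparency_map(sel, "soft_tissue")
#   no_bone = subtract_tissue(sel, "bone")
```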
The dimensional measurement markings can serve as a reference for the radiologist's activities, including visual assessment, orientation, and comparisons with prior scans or measurements. Advantages may include mitigating the current lack of metrics available to the medical professional for understanding the size of the cursor and/or of the tissue elements contained within the cursor. This implementation places measurement metrics on each edge and side of the cursor to help enable the medical professional to rapidly understand the size of the subtended volume within the cursor. In the case where the cursor encapsulates a volume of concern such as a tumor, the three-dimensional size could be recorded in the medical professional's report. This can aid the visual assessment of each portion of the tumor and the detection of small changes in the size of findings, including lobulations of a mass's margin and spiculations. One hypothetical computation of such markings is sketched below.
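The following is an illustrative sketch of how measurement markings might be derived from a scan's voxel spacing, assuming an axis-aligned cuboid cursor; the names `edge_measurements` and `tick_positions` are hypothetical.

```python
import numpy as np

def edge_measurements(cursor_size_voxels, voxel_spacing_mm):
    """Physical edge lengths of the cuboid cursor, for display as
    measurement markings along each edge (millimetres)."""
    return np.array(cursor_size_voxels) * np.array(voxel_spacing_mm)

def tick_positions(edge_length_mm, tick_every_mm=5.0):
    """Positions of graduation marks along one edge of the cursor."""
    return np.arange(0.0, edge_length_mm + 1e-9, tick_every_mm)

# Usage: a 16x16x16-voxel cursor on a scan with 1.0 x 0.7 x 0.7 mm voxels.
dims_mm = edge_measurements((16, 16, 16), (1.0, 0.7, 0.7))
for axis, length in zip("zyx", dims_mm):
    print(axis, f"{length:.1f} mm, ticks at", tick_positions(length))
```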
The ability to change the size, shape, and individual dimensions of the 3D cursor enables the cursor to be customized based on the particular volume of interest to the medical professional. A fixed-shape, fixed-size cursor might be too large or too small, e.g., so as to include a significant amount of tissue not of interest. For example, in examining the lungs, placement of a cube-shaped cursor could cause ribs to be included in the image. Changing the shape of the 3D cursor would help to overcome this limitation. Customization could be accomplished by a wide variety of techniques, possibly including but not limited to selecting an edge, side or vertex of the original 3D cursor with a second type of cursor 412, and then "clicking and dragging" the selected edge, side, or vertex in the desired direction to expand or reduce the volume of the original 3D cursor. The interface may also enable selection and change between multiple 3D geometric shapes, e.g., changing from cuboid to spherical. Scrolling on the conventional slices while simultaneously drawing shapes can also be performed to generate the prescribed 3D cursor volume. The interactive 3D cursor thus provides an efficient interface for tissue subtraction to provide enhanced visualization of the tumor. One possible click-and-drag resize interaction is sketched below.
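A minimal sketch of one such click-and-drag resize interaction for a cuboid cursor follows; the face-naming scheme and the names `ResizableCursor3D` and `drag_face` are illustrative assumptions.

```python
import numpy as np

class ResizableCursor3D:
    """Cuboid cursor whose faces can be dragged to resize it."""

    def __init__(self, position, size):
        self.position = np.array(position, dtype=float)  # (z, y, x) corner
        self.size = np.array(size, dtype=float)          # (dz, dy, dx)

    def drag_face(self, axis, which, delta):
        """Drag one face outward (delta > 0) or inward (delta < 0).
        axis: 0=z, 1=y, 2=x; which: 'min' or 'max' face of that axis."""
        new_size = max(1.0, self.size[axis] + delta)
        if which == "min":
            # Moving the near face also shifts the cursor's corner.
            self.position[axis] -= new_size - self.size[axis]
        self.size[axis] = new_size

# Usage: widen the cursor 5 voxels along x by dragging its far face,
# e.g. to exclude ribs when examining the lungs.
cur = ResizableCursor3D((10, 10, 10), (16, 16, 16))
cur.drag_face(axis=2, which="max", delta=5)
```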
There are multiple potential advantages of the interactive 3D cursor. For example, the time spent classifying multiple lesions is reduced: the radiologist does not have to sort through many prior imaging studies to find each lesion, so the interactive 3D cursor saves time. Error in tracking multiple lesions is also reduced, i.e., there is less likelihood of mistakes when identifying specific lesions that are near one another when comparing multiple scans. One possibility is to analyze the images obtained using the 3D cursor and to use multiple uniquely tagged (e.g., numbered) cursors for any suspicious regions, as sketched below. The medical professional could then switch to slices for confirmation.
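The following minimal sketch illustrates uniquely tagged cursors used to track several lesions across time points; the class and field names (`TaggedCursor`, `volume_cc`) and the data layout are assumptions for illustration only.

```python
import numpy as np

class TaggedCursor:
    """A 3D cursor carrying a unique tag so the same lesion can be
    located at every time point of a study."""

    def __init__(self, tag, position, size):
        self.tag = tag                   # unique label, e.g. "lesion-1"
        self.position = tuple(position)  # (z, y, x) corner, voxels
        self.size = tuple(size)          # (dz, dy, dx), voxels

def volume_cc(cursor, voxel_spacing_mm):
    """Approximate cursor volume in cubic centimetres."""
    mm3 = np.prod(np.array(cursor.size) * np.array(voxel_spacing_mm))
    return mm3 / 1000.0

# One list of tagged cursors per scan date lets the same tag be found at
# every time point, so interval change can be reported per lesion.
study = {
    "2017-01-05": [TaggedCursor("lesion-1", (10, 20, 30), (12, 12, 12))],
    "2017-04-05": [TaggedCursor("lesion-1", (11, 21, 29), (9, 9, 9))],
}
for date, cursors in sorted(study.items()):
    for c in cursors:
        print(date, c.tag, f"{volume_cc(c, (1.0, 0.7, 0.7)):.2f} cc")
```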
Several features, aspects, embodiments and implementations have been described. Nevertheless, it will be understood that a wide variety of modifications and combinations may be made without departing from the scope of the inventive concepts described herein. Accordingly, those modifications and combinations are within the scope of the following claims.
This application is a Continuation-in-Part of U.S. patent application Ser. No. 14/877,442, filed Oct. 7, 2015, now U.S. Pat. No. 9,980,691, which is a Continuation-in-Part of U.S. patent application Ser. No. 12/176,569, filed Jul. 21, 2008, now U.S. Pat. No. 9,349,183, which is a Continuation-in-Part of U.S. patent application Ser. No. 11/941,578, filed Nov. 16, 2007, now U.S. Pat. No. 8,384,771, which claims the benefit of and priority under 35 U.S.C. § 119(e) to U.S. Patent Application No. 60/877,931, filed Dec. 28, 2006, each of which is incorporated herein by reference in its entirety.
| Number | Name | Date | Kind |
|---|---|---|---|
| 4472737 | Iwasaki | Sep 1984 | A |
| 4808979 | DeHoff et al. | Feb 1989 | A |
| 4952024 | Gale | Apr 1990 | A |
| 4987527 | Hamada et al. | Jan 1991 | A |
| 5049987 | Hoppenstein | Sep 1991 | A |
| 5200819 | Nudelman et al. | Apr 1993 | A |
| 5233458 | Moffitt et al. | Aug 1993 | A |
| 5278884 | Eberhard et al. | Jan 1994 | A |
| 5293529 | Yoshimura et al. | Mar 1994 | A |
| 5371778 | Yanof | Dec 1994 | A |
| 5402191 | Dean et al. | May 1995 | A |
| 5510832 | Garcia | Apr 1996 | A |
| 5621867 | Murata et al. | Apr 1997 | A |
| 5627582 | Muramoto et al. | May 1997 | A |
| 5659625 | Marquardt | Aug 1997 | A |
| 5682172 | Travers et al. | Oct 1997 | A |
| 5682437 | Okino et al. | Oct 1997 | A |
| 5696521 | Robinson et al. | Dec 1997 | A |
| 5708359 | Gregory et al. | Jan 1998 | A |
| 5841830 | Barni et al. | Nov 1998 | A |
| 5850352 | Moezzi et al. | Dec 1998 | A |
| 5867588 | Marquardt | Feb 1999 | A |
| 6002518 | Faris | Dec 1999 | A |
| 6034716 | Whiting et al. | Mar 2000 | A |
| 6052100 | Soltan et al. | Apr 2000 | A |
| 6057827 | Matthews | May 2000 | A |
| 6066095 | Morsy et al. | May 2000 | A |
| 6084937 | Tam et al. | Jul 2000 | A |
| 6100862 | Sullivan | Aug 2000 | A |
| 6108005 | Starks et al. | Aug 2000 | A |
| 6115449 | Jang et al. | Sep 2000 | A |
| 6124977 | Takahashi | Sep 2000 | A |
| 6130930 | Tam | Oct 2000 | A |
| 6225979 | Taima et al. | May 2001 | B1 |
| 6272366 | Vining | Aug 2001 | B1 |
| 6275561 | Danielsson | Aug 2001 | B1 |
| 6276799 | Van Saarloos et al. | Aug 2001 | B1 |
| 6342378 | Zhang et al. | Jan 2002 | B1 |
| 6342878 | Chevassus et al. | Jan 2002 | B1 |
| 6442417 | Shahidi et al. | Aug 2002 | B1 |
| 6449309 | Tabata | Sep 2002 | B1 |
| 6466185 | Sullivan et al. | Oct 2002 | B2 |
| 6476607 | Dannels et al. | Nov 2002 | B1 |
| 6487432 | Slack | Nov 2002 | B2 |
| 6490335 | Wang et al. | Dec 2002 | B1 |
| 6532008 | Guralnick | Mar 2003 | B1 |
| 6545650 | Yamada et al. | Apr 2003 | B1 |
| 6549803 | Raghavan et al. | Apr 2003 | B1 |
| 6580448 | Stuffier | Jun 2003 | B1 |
| 6676259 | Trillo | Jan 2004 | B1 |
| 6692441 | Poland et al. | Feb 2004 | B1 |
| 6711231 | Knoplioch et al. | Mar 2004 | B2 |
| 6734847 | Baldeweg et al. | May 2004 | B1 |
| 6792071 | Dewaele | Sep 2004 | B2 |
| 6862364 | Berestov | Mar 2005 | B1 |
| 7020236 | Shechter | Mar 2006 | B2 |
| 7058156 | Bruder et al. | Jun 2006 | B2 |
| RE39342 | Starks et al. | Oct 2006 | E |
| 7193626 | Otani et al. | Mar 2007 | B2 |
| 7298372 | Pfister et al. | Nov 2007 | B2 |
| 7324085 | Balakrishnan et al. | Jan 2008 | B2 |
| 7524053 | Lipton | Apr 2009 | B2 |
| 7604597 | Murashita et al. | Oct 2009 | B2 |
| 7643025 | Lange | Jan 2010 | B2 |
| 7647593 | Matsumoto | Jan 2010 | B2 |
| 7654826 | Faulkner et al. | Feb 2010 | B2 |
| 7773074 | Arenson et al. | Aug 2010 | B2 |
| 7786990 | Wegenkittl et al. | Aug 2010 | B2 |
| 7796790 | McNutt et al. | Sep 2010 | B2 |
| 7808449 | Neidrich et al. | Oct 2010 | B2 |
| 7822265 | Berretty | Oct 2010 | B2 |
| 7840047 | Böing et al. | Nov 2010 | B2 |
| 8078000 | Böhm et al. | Dec 2011 | B2 |
| 8175683 | Roose | May 2012 | B2 |
| 8228327 | Hendrickson et al. | Jul 2012 | B2 |
| 8233103 | MacNaughton et al. | Jul 2012 | B2 |
| 8363096 | Aguirre | Jan 2013 | B1 |
| 8384771 | Douglas | Feb 2013 | B1 |
| 8398541 | DiMaio et al. | Mar 2013 | B2 |
| 8542326 | MacNaughton et al. | Sep 2013 | B2 |
| 8567954 | Koehler et al. | Oct 2013 | B2 |
| D692941 | Klinar et al. | Nov 2013 | S |
| 8866883 | Rohaly et al. | Oct 2014 | B2 |
| 9077982 | Rha et al. | Jul 2015 | B2 |
| 9094676 | Schutten et al. | Jul 2015 | B1 |
| 9338445 | Atkins et al. | May 2016 | B2 |
| 9349183 | Douglas et al. | May 2016 | B1 |
| 9473766 | Douglas et al. | Oct 2016 | B2 |
| 9980691 | Douglas et al. | May 2018 | B2 |
| 20020068863 | Slack | Jun 2002 | A1 |
| 20020101658 | Hoppenstein | Aug 2002 | A1 |
| 20020112237 | Kelts | Aug 2002 | A1 |
| 20030026474 | Yano | Feb 2003 | A1 |
| 20030107644 | Choi | Jun 2003 | A1 |
| 20030194119 | Manjeshwar et al. | Oct 2003 | A1 |
| 20030204364 | Goodwin et al. | Oct 2003 | A1 |
| 20030218720 | Morita et al. | Nov 2003 | A1 |
| 20040054248 | Kimchy et al. | Mar 2004 | A1 |
| 20040070584 | Pyo et al. | Apr 2004 | A1 |
| 20040204644 | Tsougarakis et al. | Oct 2004 | A1 |
| 20040223636 | Edic et al. | Nov 2004 | A1 |
| 20040246269 | Serra et al. | Dec 2004 | A1 |
| 20040254454 | Kockro | Dec 2004 | A1 |
| 20050017938 | O'Donnell et al. | Jan 2005 | A1 |
| 20050030621 | Takahashi et al. | Feb 2005 | A1 |
| 20050062684 | Geng | Mar 2005 | A1 |
| 20050065423 | Owen | Mar 2005 | A1 |
| 20050065424 | Shah et al. | Mar 2005 | A1 |
| 20050096530 | Daw et al. | May 2005 | A1 |
| 20050110791 | Krishnamoorthy et al. | May 2005 | A1 |
| 20050148848 | Guang et al. | Jul 2005 | A1 |
| 20050151152 | Miller et al. | Jul 2005 | A1 |
| 20050152591 | Kiraly et al. | Jul 2005 | A1 |
| 20050244050 | Nomura et al. | Nov 2005 | A1 |
| 20050278408 | Matsumoto | Dec 2005 | A1 |
| 20060013472 | Kagitani | Jan 2006 | A1 |
| 20060026533 | Napoli et al. | Feb 2006 | A1 |
| 20060033992 | Solomon | Feb 2006 | A1 |
| 20060056680 | Stutsman et al. | Mar 2006 | A1 |
| 20060058605 | Deischinger et al. | Mar 2006 | A1 |
| 20060077204 | Pfister et al. | Apr 2006 | A1 |
| 20060079755 | Stazzone et al. | Apr 2006 | A1 |
| 20060120583 | Dewaele | Jun 2006 | A1 |
| 20060171028 | Oikawa et al. | Aug 2006 | A1 |
| 20060177133 | Kee | Aug 2006 | A1 |
| 20060210111 | Cleveland et al. | Sep 2006 | A1 |
| 20060210147 | Sakaguchi | Sep 2006 | A1 |
| 20060227103 | Koo et al. | Oct 2006 | A1 |
| 20060238441 | Benjamin et al. | Oct 2006 | A1 |
| 20060239523 | Stewart et al. | Oct 2006 | A1 |
| 20060268104 | Cowan et al. | Nov 2006 | A1 |
| 20070035830 | Matveev et al. | Feb 2007 | A1 |
| 20070058249 | Hirose et al. | Mar 2007 | A1 |
| 20070085902 | Walker et al. | Apr 2007 | A1 |
| 20070103459 | Stoval, III et al. | May 2007 | A1 |
| 20070115204 | Budz et al. | May 2007 | A1 |
| 20070116357 | Dewaele | May 2007 | A1 |
| 20070118408 | Mahesh et al. | May 2007 | A1 |
| 20070146325 | Poston et al. | Jun 2007 | A1 |
| 20070147671 | Di Vincenzo et al. | Jun 2007 | A1 |
| 20070165927 | Muradyan et al. | Jul 2007 | A1 |
| 20070167801 | Webler et al. | Jul 2007 | A1 |
| 20070188520 | Finley et al. | Aug 2007 | A1 |
| 20070279435 | Ng et al. | Dec 2007 | A1 |
| 20080025584 | Kunz | Jan 2008 | A1 |
| 20080033240 | Hoffman et al. | Feb 2008 | A1 |
| 20080037843 | Fu et al. | Feb 2008 | A1 |
| 20080055305 | Blank et al. | Mar 2008 | A1 |
| 20080062173 | Tashiro | Mar 2008 | A1 |
| 20080100612 | Dastmalchi et al. | May 2008 | A1 |
| 20080117233 | Mather et al. | May 2008 | A1 |
| 20080267527 | Berretty | Oct 2008 | A1 |
| 20080281182 | Rabben et al. | Nov 2008 | A1 |
| 20090016491 | Li | Jan 2009 | A1 |
| 20090034684 | Bernard et al. | Feb 2009 | A1 |
| 20090051685 | Takagi et al. | Feb 2009 | A1 |
| 20090080765 | Bernard et al. | Mar 2009 | A1 |
| 20090219283 | Hendrickson et al. | Sep 2009 | A1 |
| 20090231697 | Marcus et al. | Sep 2009 | A1 |
| 20090232275 | Spartiotis et al. | Sep 2009 | A1 |
| 20090304232 | Tsukizawa | Dec 2009 | A1 |
| 20090324052 | Nowinski | Dec 2009 | A1 |
| 20100194861 | Hoppenstein | Aug 2010 | A1 |
| 20100201785 | Lantin | Aug 2010 | A1 |
| 20100246911 | Rabben et al. | Sep 2010 | A1 |
| 20110026808 | Kim et al. | Feb 2011 | A1 |
| 20110107270 | Wang et al. | May 2011 | A1 |
| 20110109620 | Hong et al. | May 2011 | A1 |
| 20110194728 | Kutcka et al. | Aug 2011 | A1 |
| 20110273543 | Ushio et al. | Nov 2011 | A1 |
| 20120008734 | Thomson et al. | Jan 2012 | A1 |
| 20120008735 | Maurer et al. | Jan 2012 | A1 |
| 20120013711 | Tamir et al. | Jan 2012 | A1 |
| 20120019636 | Gefen et al. | Jan 2012 | A1 |
| 20120038631 | Mayhew et al. | Feb 2012 | A1 |
| 20120056998 | Kang et al. | Mar 2012 | A1 |
| 20120113235 | Shintani | May 2012 | A1 |
| 20120120207 | Shimazaki et al. | May 2012 | A1 |
| 20120127284 | Bar-Zeev et al. | May 2012 | A1 |
| 20120162219 | Kobayashi et al. | Jun 2012 | A1 |
| 20120190439 | Nourbakhsh | Jul 2012 | A1 |
| 20120190967 | Nahm | Jul 2012 | A1 |
| 20120206665 | Sakai et al. | Aug 2012 | A1 |
| 20120215218 | Lipani | Aug 2012 | A1 |
| 20120224755 | Wu | Sep 2012 | A1 |
| 20120229595 | Miller | Sep 2012 | A1 |
| 20120242569 | Hamagishi | Sep 2012 | A1 |
| 20120269424 | Ebata et al. | Oct 2012 | A1 |
| 20120287361 | Sugihara | Nov 2012 | A1 |
| 20130003020 | Koehler et al. | Jan 2013 | A1 |
| 20130057830 | Tsai et al. | Mar 2013 | A1 |
| 20130141552 | Kwon | Jun 2013 | A1 |
| 20130182085 | Ziarati | Jul 2013 | A1 |
| 20130242063 | Matsumoto | Sep 2013 | A1 |
| 20130245375 | DiMaio | Sep 2013 | A1 |
| 20130278727 | Tamir et al. | Oct 2013 | A1 |
| 20140065663 | Vasquez et al. | Mar 2014 | A1 |
| 20140176685 | Oikawa et al. | Jun 2014 | A1 |
| 20140253698 | Evans et al. | Sep 2014 | A1 |
| 20140347726 | Yang et al. | Nov 2014 | A1 |
| 20160026266 | Douglas | Jan 2016 | A1 |
| 20160302895 | Rohaly et al. | Oct 2016 | A1 |
| Number | Date | Country |
|---|---|---|
| 1885233 | Dec 2006 | CN |
| 0571827 | Dec 1993 | EP |
| 1791087 | May 2007 | EP |
| 1843296 | Oct 2007 | EP |
| H11-232010 | Aug 1999 | JP |
| 2002-330958 | Nov 2002 | JP |
| 2008-220406 | Sep 2008 | JP |
| 2009-000167 | Jan 2009 | JP |
| 2009-018048 | Jan 2009 | JP |
| WO 9923586 | May 1999 | WO |
| WO 03100542 | Dec 2003 | WO |
| WO 2007063442 | Jun 2007 | WO |
| Entry |
|---|
| U.S. Appl. No. 60/842,377, filed Sep. 6, 2006, Nowinski. |
| U.S. Appl. No. 60/854,872, filed Oct. 27, 2006, Dastmalchi et al. |
| Bakalash, Reuven et al. “Medicube: A 3D Medical Imaging Architecture” Computer &Graphics vol. 13, No. 2, pp. 151-157; 1989. |
| Douglas, David B. et al. “Augmented Reality Imaging System: 3D Viewing of a Breast Cancer” J Nat Sci, 2016;2(9). |
| Douglas, David B. et al. "Augmented Reality: Advances in Diagnostic Imaging" Multimodal Technologies and Interaction, 2017;1(4):29. |
| Douglas, David B. et al. “D3D Augmented Reality Imaging System: Proof of Concept in Mammography” Med Devices (Auckl), 2016; 9:277-83. |
| Engel, K., et al. “Combining Local and Remote Visualization Techniques for Interactive Volume Rendering in Medical Applications” Proceedings Visualization 2000. VIS 2000 (Cat. No. 00CH37145), Salt Lake City, UT, USA, 2000, pp. 449-452. |
| Erickson, Bradley J. "A Desktop Computer-Based Workstation for Display and Analysis of 3- and 4-Dimensional Biomedical Images" Computer Methods and Programs in Biomedicine, 30; pp. 97-110; 1989. |
| Goodsitt, Mitchel M. et al. "Stereomammography: Evaluation of Depth Perception using a Virtual 3D Cursor" Med. Phys. 27 (6), Jun. 2000. |
| Haker, Steven et al. “Nondistorting Flattening Maps and the 3-D Visualization of Colon CT Images” IEEE Transactions of Medical Imaging; vol. 19, No. 7; Jul. 2000; 665-670. |
| Hinckley, Ken “Haptic Issues for Virtual Manipulation” A Dissertation Presented to the Faculty of the School of Engineering and Applied Science at the University of Virginia; Dec. 1996. |
| Hinckley, Ken, et al. “New Applications for the Touchscreen in 2D and 3D Medical Imaging Workstations” Proc. SPIE Medical Imaging '95; Image Display, SPIE vol. 2431, pp. 110-118. |
| Hui, Y.W. et al. "3D Cursors for Volume Rendering Applications" IEEE Conference on Nuclear Science Symposium and Medical Imaging, Orlando, FL, USA, 1992, pp. 1243-1245 vol. 2. |
| Hong, Lichan et al. “Reconstruction and Visualization of 3D Models of Colonic Surface” IEEE Transactions on Nuclear Science, vol. 44, No. 3, Jun. 1997. |
| IBM “Fast Inspection of Contents of a Volume of 3D Data” IBM Technical Disclosure Bulletin; Feb. 1, 1994; vol. 37, Issue 2A. |
| Interrante, Victoria et al. “Strategies for Effectively Visualizing 3D Flow with Volume LIC” IEEE Visualization Conference; 1997; pp. 1-4. |
| Kapur, Ajay et al. “Combination of Digital Mammography with Semi-Automated 3D Breast Ultrasound” NIH Public Access; Author Manuscript; Technol Cancer Res Treat, 3(4); 325-334; Aug. 2004. |
| Kaufman, A., et al. "Real-Time Volume Rendering" International Journal of Imaging Systems and Technology, special issue on 3D Imaging; 2000. |
| Klein, G.J. et al. "A 3D Navigational Environment for Specifying Positron Emission Tomography Volumes-of-Interest" 1995 IEEE Nuclear Science Symposium and Medical Imaging Conference Record, San Francisco, CA, USA, 1995, pp. 1452-1455 vol. 3. |
| Kok, Arlan J.F. et al. “A Multimodal Virtual Reality Interface for 3D Interaction with VTK” Knowledge and Information Systems; 2007. |
| Kreeger, Kevin et al. “Interactive Volume Segmentation with the PAVLOV Architecture” Proceedings 1999 IEEE Parallel Visualization and Graphics Symposium (Cat. No. 99EX381), San Francisco, CA, USA, 1999, pp. 61-119. |
| Li, Yanhong et al. “Tinkerbell—A Tool for Interactive Segmentation of 3D Data” Journal of Structural Biology 120, 266-275; 1997. |
| Löbbert, Sebastian et al. “Visualisation of Two-Dimensional Volumes” 2004. |
| Loh, Yong Chong et al. “Surgical Planning System with Real-Time Volume Rendering” Proceedings International Workshop on Medical Imaging and Augmented Reality, Shatin, Hong Kong, China, 2001, pp. 259-261. |
| Martin, RW et al. “Stereographic Viewing of 3D Ultrasound Images: A Novelty or a Tool?” 1995 IEEE Ultrasonics Symposium; IEEE Press 1431-1434. |
| Peterson, Christine M. et al. “Volvulus of the Gastrointestinal Tract: Appearances at Multi-Modality Imaging” Radiographics; vol. 29, No. 5; Sep.-Oct. 2009; pp. 1281-1293. |
| Piekarski, Wayne “Interactive 3D Modelling in Outdoor Augmented Reality Worlds” Wearable Computer Lab, School of Computer and Information Science; The University of South Australia; Feb. 2004. |
| Robb, R.A., et al. “A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images” Biodynamics Research Unit, IEEE, 1986. |
| Robb, R.A. et al. “Interactive Display and Analysis of 3-D Medical Images” IEEE Transactions on Medical Imaging, vol. 8, No. 3, Sep. 1989. |
| Skoglund, T. et al. “3D Reconstruction of Biological Objects from Sequential Image Planes—Applied on Cerebral Cortex from CAT” Computerized Medical Imaging and Graphics; vol. 17, No. 3, pp. 165-174; 1993. |
| Steinicke, Frank et al. “Towards Applicable 3D User Interfaces for Everyday Working Environments” Conference Paper; Sep. 2007. |
| Subramanian, Sriram “Tangible Interfaces for Volume Navigation” CIP-Data Library Technische University Eindhoven; 2004. |
| Vanacken, Lode et al. “Exploring the Effects of Environment Density and Target Visibility on Object Selection in 3D Virtual Environments” IEEE Symposium on 3D User Interfaces 2007, Mar. 10-11. |
| Ware, Colin et al. “Selection Using a One-Eyed Cursor in a Fish Tank VR Environment” Faculty of Computer Science, University of New Brunswick; Apr. 20, 2000. |
| Wither, Jason et al. “Pictorial Depth Cues for Outdoor Augmented Reality” Ninth IEEE International Symposium on Wearable Computers (ISWC'05), Osaka, 2005, pp. 92-99. |
| Wong, Terence Z. et al. “Stereoscopically Guided Characterization of Three-Dimensional Dynamic MR Images of the Breast” Radiology, 1996; 198:288-291. |
| Yushkevich, Paul A. et al. "User-Guided 3D Active Contour Segmentation of Anatomical Structures: Significantly Improved Efficiency and Reliability" NeuroImage 31; 1116-1128; 2006. |
| Zhai, Shumin et al. “The Partial Occlusion Effect: Utilizing Semi-Transparency in 3D Human Computer Interaction” ACM Transactions on Computer-Human Interaction, 3(3), 254-284; 1996. |
| Office Action for U.S. Appl. No. 11/941,578, dated Sep. 29, 2011. |
| Office Action for U.S. Appl. No. 11/941,578, dated Feb. 22, 2012. |
| Notice of Allowance for U.S. Appl. No. 11/941,578, dated Dec. 21, 2012. |
| Office Action for U.S. Appl. No. 12/176,569, dated Apr. 4, 2012. |
| Office Action for U.S. Appl. No. 12/176,569, dated Oct. 26, 2012. |
| Office Action for U.S. Appl. No. 12/176,569, dated Jul. 15, 2014. |
| Office Action for U.S. Appl. No. 12/176,569, dated Feb. 5, 2015. |
| Notice of Allowance for U.S. Appl. No. 12/176,569, dated May 29, 2015. |
| Office Action for U.S. Appl. No. 14/313,398 dated Sep. 25, 2015. |
| Office Action for U.S. Appl. No. 14/313,398 dated May 12, 2016. |
| Notice of Allowance for U.S. Appl. No. 14/313,398 dated Jul. 15, 2016. |
| Office Action for U.S. Appl. No. 14/877,442 dated Jul. 14, 2017. |
| Office Action for U.S. Appl. No. 14/877,442 dated Dec. 5, 2017. |
| Notice of Allowance for U.S. Appl. No. 14/877,442 dated Apr. 5, 2018. |
| Number | Date | Country | |
|---|---|---|---|
| 20190227641 A1 | Jul 2019 | US | |
| 20200264709 A9 | Aug 2020 | US |
| Number | Date | Country | |
|---|---|---|---|
| 60877931 | Dec 2006 | US |
| Number | Date | Country | |
|---|---|---|---|
| Parent | 14877442 | Oct 2015 | US |
| Child | 15878463 | US | |
| Parent | 12176569 | Jul 2008 | US |
| Child | 14877442 | US | |
| Parent | 11941578 | Nov 2007 | US |
| Child | 12176569 | US |