Method and apparatus for performing stereoscopic zooming on a head display unit

Information

  • Patent Grant
  • Patent Number
    11,228,753
  • Date Filed
    Sunday, September 26, 2021
  • Date Issued
    Tuesday, January 18, 2022
Abstract
Pointers are added to a 3D volumetric dataset to help the user visualize the direction of blood flow. A 3D volume containing at least one blood vessel is created. Next, the direction of the blood flow is determined. Next, at least one pointer is placed into the 3D volume, aligned with the direction of blood flow, such that the 3D volume is modified. Next, the modified 3D volume is displayed on a head display unit, such as an augmented reality or virtual reality display. Next, at least one pointer is advanced to a new position for additional modification of the 3D imaging volume.
Description
TECHNICAL FIELD

Aspects of this disclosure are generally related to radiological imaging, and more particularly to blood vessel appearance using extended reality headsets.


BACKGROUND

One of the challenges that physicians face when viewing a volume with an augmented reality, virtual reality or mixed reality headset is visualization of blood flow.


SUMMARY

All examples, aspects and features mentioned in this document can be combined in any technically possible way.


In accordance with some implementations a method of denoting blood flow within a 3D volume on a head display unit (HDU), comprises: generating a 3D volumetric dataset containing at least one blood vessel; generating at least one pointer; determining the direction of blood flow; modifying the 3D volumetric dataset by placing the at least one pointer in proximity to the at least one blood vessel in a direction aligned with a direction of blood flow; displaying, in said HDU, a left eye image based on said modified 3D volumetric dataset and a right eye image based on said modified 3D volumetric dataset, wherein said left eye image and said right eye image are alternate three-dimensional images; and displaying, in said HDU, the at least one pointer advancing in the direction of blood flow. In some implementations placing the at least one pointer in proximity to the at least one blood vessel comprises placing a 2D arrow. In some implementations placing the at least one pointer in proximity to the at least one blood vessel comprises placing a 3D arrow. Some implementations comprise displaying, in said HDU, the pointer with changing color. Some implementations comprise displaying, in said HDU, the pointer advancing in the direction of blood flow faster in arteries than veins.
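
By way of illustration only (this is not part of the patented disclosure, and all names are hypothetical), a flow pointer of the kind summarized above can be modeled as a small record that carries its position, flow-aligned direction, appearance, and advance rate:

# Illustrative sketch only: a minimal pointer record in voxel space.
# Field and class names are hypothetical, not taken from the disclosure.
from dataclasses import dataclass

import numpy as np


@dataclass
class FlowPointer:
    position: np.ndarray          # (x, y, z) location within the 3D volume
    direction: np.ndarray         # unit vector aligned with local blood flow
    color: tuple = (255, 0, 0)    # e.g., red could denote an artery
    speed: float = 1.0            # voxels advanced per displayed frame

    def advance(self) -> None:
        # Move the pointer along the direction of blood flow at its own rate.
        self.position = self.position + self.speed * self.direction


p = FlowPointer(position=np.array([10.0, 20.0, 5.0]),
                direction=np.array([0.0, 1.0, 0.0]))
p.advance()   # the pointer now sits one voxel further along the vessel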


BRIEF DESCRIPTION OF FIGURES


FIG. 1 illustrates the method for using pointers to denote blood flow direction within a 3D volumetric dataset and viewing with a head display unit.



FIG. 2 illustrates advancing pointers to depict the direction of the blood flow.



FIG. 3 illustrates placement of a 2D pointer into the 3D volume.



FIG. 4 illustrates placement of a 3D pointer into the 3D volume.



FIG. 5 illustrates placement of a 3D pointer into the 3D volume wherein the appearance of the 3D pointer can be modified.



FIG. 6 illustrates variable pointer rates of movement.







DETAILED DESCRIPTION OF FIGURES

Some aspects, features and implementations described herein may include machines such as computers, electronic components, radiological components, optical components, and processes such as computer-implemented steps. It will be apparent to those of ordinary skill in the art that the computer-implemented steps may be stored as computer-executable instructions on a non-transitory computer-readable medium. Furthermore, it will be understood by those of ordinary skill in the art that the computer-executable instructions may be executed on a variety of tangible processor devices. For ease of exposition, not every step, device or component that may be part of a computer or data storage system is described herein. Those of ordinary skill in the art will recognize such steps, devices and components in view of the teachings of the present disclosure and the knowledge generally available to those of ordinary skill in the art. The corresponding machines and processes are therefore enabled and within the scope of the disclosure.



FIG. 1 illustrates an implementation of a method for using pointers to denote blood flow direction within a 3D volumetric dataset and viewing with a head display unit. In the first step 100, a 3D volumetric dataset containing at least one blood vessel is generated. In the second step 102, at least one pointer is generated. In the third step 104, the direction of blood flow is determined. In the fourth step 106, at least one pointer is placed in proximity to a blood vessel in a direction aligned with the direction of blood flow, such that the 3D volumetric dataset is modified. In the fifth step 108, an image for the left eye, based on the modified 3D imaging volume, the viewpoint for the left eye, and the volume of interest, is displayed in the left eye display of the HDU. In the sixth step 110, an image for the right eye, based on the modified 3D imaging volume, the viewpoint for the right eye, and the volume of interest, is displayed in the right eye display of the HDU, wherein the image for the left eye and the image for the right eye produce an alternate three-dimensional image to the user. In the seventh step 112, at least one pointer is advanced in the direction of blood flow such that the imaging volume is further modified. Some portions of this process can be repeated such that multiple modified 3D imaging volumes are created and displayed on the HDU in sequence. This provides a visualization of moving arrows and helps the imager better understand blood flow.
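
A minimal sketch of this seven-step loop follows, assuming a NumPy voxel array and reducing each per-eye image to a simple shifted projection purely for illustration (the patent does not specify a particular renderer, and the helper names are hypothetical):

import numpy as np

def place_pointer(volume, position, marker_value=2.0):
    # Step 106: modify the 3D volumetric dataset by writing a pointer marker voxel.
    x, y, z = np.round(position).astype(int)
    if all(0 <= c < s for c, s in zip((x, y, z), volume.shape)):
        volume[x, y, z] = marker_value
    return volume

def render_eye_image(volume, eye_offset):
    # Placeholder per-eye image: a shifted maximum-intensity projection
    # standing in for a true stereoscopic render.
    return np.roll(volume, eye_offset, axis=0).max(axis=2)

volume = np.zeros((64, 64, 64))                  # step 100: 3D dataset containing a vessel
volume[30:34, 10:54, 30:34] = 1.0
flow_direction = np.array([0.0, 1.0, 0.0])       # step 104: direction of blood flow
pointer_position = np.array([32.0, 12.0, 32.0])  # step 102: pointer start position

for _ in range(5):                               # repeat to create multiple modified volumes
    frame = place_pointer(volume.copy(), pointer_position)   # step 106
    left_image = render_eye_image(frame, eye_offset=-1)      # step 108: left eye image
    right_image = render_eye_image(frame, eye_offset=+1)     # step 110: right eye image
    pointer_position = pointer_position + flow_direction     # step 112: advance pointer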



FIG. 2 illustrates advancing pointers (or arrows) to depict the direction of the blood flow. In the human body, blood in most arteries is directed away from the heart and blood in most veins is directed towards the heart. However, in some situations (e.g., subclavian steal with retrograde flow in the vertebral artery), this rule does not apply. It can be difficult for even an experienced imager to readily determine which structures are arteries and which structures are veins. Additionally, even if an imager is able to identify a structure as an artery, it can be difficult to determine its orientation without carefully tracing it back to its origin. Through advances in computer processing, these vessels and the direction of blood flow therein can be determined, but an effective visual representation method is required. In this method, pointers can be advanced along an artery to indicate the direction of blood flow; similarly, pointers can be advanced along a vein. The color of the pointers can be changed to designate to the user whether the vessel is an artery or a vein. Further, the rate of advance of the pointers can also be varied, such as to match the natural blood flow rate for a realistic understanding of the hemodynamics of the patient. The pointers could be located in close proximity to the blood vessels or within them (such as at the center of the blood vessel lumen). As a blood vessel curves through the 3D volume space, the path of the pointers would also curve to match that of the normal blood flow. In FIG. 2A, the pointers 204 are shown within the blood vessel lumen 202 in an initial position with respect to the blood vessel wall 200 and the position of the remainder of structures within the imaging volume, which are not shown. This represents the appearance of the imaging volume at an initial time point. In FIG. 2B, the pointers 204 are shown within the blood vessel lumen 202 in a second, slightly advanced position with respect to the blood vessel wall 200 and the position of the remainder of structures within the imaging volume, which are not shown. This represents the appearance of the imaging volume at a subsequent time point. In FIG. 2C, the pointers 204 are shown within the blood vessel lumen 202 in a third, even further advanced position with respect to the blood vessel wall 200 and the position of the remainder of structures within the imaging volume, which are not shown. This represents the appearance of the imaging volume at an additional subsequent time point. The volume displayed to the user on an extended reality headset (i.e., an augmented reality, mixed reality, or virtual reality headset) would therefore be dynamic and change over time. Even if the user were looking at a particular structure without moving his or her head, some items within the 3D volume would appear to be moving.
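
To make the curved-path behavior concrete, the following sketch (illustrative only; the centerline, spacing, and step size are assumptions) advances several pointers along a sampled vessel centerline, so the pointer path curves with the vessel from one time point to the next:

import numpy as np

# Hypothetical curved vessel centerline: a quarter-circle arc sampled in voxel space.
t = np.linspace(0.0, np.pi / 2.0, 100)
centerline = np.stack([40.0 * np.cos(t),
                       40.0 * np.sin(t),
                       np.full_like(t, 32.0)], axis=1)

pointer_indices = np.array([0, 20, 40, 60, 80])   # pointers 204 spread along the lumen

def advance(indices, step=2):
    # Move every pointer further along the centerline; wrap so the animation loops.
    return (indices + step) % len(centerline)

for time_point in range(3):                       # roughly FIG. 2A, 2B, and 2C
    positions = centerline[pointer_indices]       # pointer locations at this time point
    pointer_indices = advance(pointer_indices)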



FIG. 3 illustrates placement of a 2D pointer into the 3D volume. In FIG. 3A, a 2D pointer 300 is placed into the blood vessel 200 within the 3D imaging volume. Note that this image illustrates a side view wherein the user's left and right eye view points and left and right eye viewing angles show the side of the 2D pointer 300 and the side of the blood vessel 200 within the 3D volume. In FIG. 3B, the 2D pointer 300 is placed into the blood vessel 200 within the 3D volume. Note that this image illustrates a top-down view wherein the user's left and right eye view points and left and right eye viewing angles show the 2D pointer 300 and the top of the blood vessel 200 within the 3D volume. Note that since the 2D pointer is a planar slice, it nearly disappears when viewed from a near-top position. From a true top position, a planar 2D arrow would disappear completely unless it was reoriented. Non-planar slices could also be used; these would be visible from any viewing angle and could be beneficial for showing the direction of blood flow along a curved vessel.
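
The near-disappearance of a planar arrow can be demonstrated with a short sketch (array sizes and values are assumptions): a one-voxel-thick arrow written into a single axial slice is fully visible when projected along the plane normal but collapses to a thin line when projected edge-on:

import numpy as np

volume = np.zeros((64, 64, 64))

# Planar 2D arrow confined to the single slice z = 32: a shaft plus a tapering head.
z = 32
volume[31:34, 10:30, z] = 1.0                       # shaft
for i, half_width in enumerate(range(6, 0, -2)):    # head narrows toward the tip
    volume[32 - half_width:32 + half_width + 1, 30 + i, z] = 1.0

face_on_view = volume.max(axis=2)   # looking along the plane normal: arrow fully visible
edge_on_view = volume.max(axis=0)   # looking along the plane: only a 1-voxel line remains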



FIG. 4 illustrates placement of a 3D pointer into the 3D volume. In FIG. 4A, a 3D pointer 400 is placed into the blood vessel 200 within the 3D imaging volume. Note that this image illustrates a side view wherein the user's left and right eye view points and left and right eye viewing angles show the side of the 3D pointer 400 and the side of the blood vessel 200 within the 3D volume. In FIG. 4B, the 3D pointer 400 is placed into the blood vessel 200 within the 3D volume. Note that this image illustrates a top-down view wherein the user's left and right eye view points and left and right eye viewing angles show the 3D pointer 400 and the top of the blood vessel 200 within the 3D volume. Note that since the pointer is 3D, it is clearly visualized when viewed from a near-top position. Such a pointer could be constructed by arranging a series of 2D non-planar slices to form a cone abutting a cylinder (also made of a combination of planar and non-planar slices), yielding the 3D pointer 400. Inserting this pointer into the 3D volume modifies the volume.
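
A minimal voxel construction of such a 3D arrow (a cylinder shaft abutting a cone tip along the y axis; all dimensions are arbitrary assumptions) might look like the following; because the marker has thickness in every direction, it remains visible even in a top-down projection:

import numpy as np

def make_3d_arrow(shape=(64, 64, 64), axis_center=(32, 32), shaft=(10, 30),
                  tip_length=8, shaft_radius=2.0, cone_radius=5.0):
    # Build a boolean mask for a 3D arrow pointing along the +y direction.
    x, y, z = np.indices(shape)
    r = np.sqrt((x - axis_center[0]) ** 2 + (z - axis_center[1]) ** 2)
    cylinder = (r <= shaft_radius) & (y >= shaft[0]) & (y < shaft[1])
    # Cone tip: the allowed radius shrinks linearly to zero at the point of the arrow.
    cone_r = cone_radius * (1.0 - (y - shaft[1]) / tip_length)
    cone = (y >= shaft[1]) & (y < shaft[1] + tip_length) & (r <= cone_r)
    return cylinder | cone

arrow_mask = make_3d_arrow()
volume = np.zeros(arrow_mask.shape)
volume[arrow_mask] = 1.0     # inserting the 3D pointer 400 modifies the volume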



FIG. 5 illustrates placement of a 3D pointer into the 3D volume wherein the appearance of the 3D pointer can be modified. In FIG. 5A, a 3D pointer 500 is placed into the blood vessel 200 within the 3D volume. Note that the appearance of the 3D pointer 500 is black. In FIG. 5B, a 3D pointer 502 is placed into the blood vessel 200 within the 3D volume. Note that the appearance of the 3D pointer 502 is gray. In FIG. 5C, a 3D pointer 504 is placed into the blood vessel 200 within the 3D volume. Note that the appearance of the 3D pointer 504 is red. The appearance of the pointer can vary: it can be 2D or 3D, and it can take on a wide range of colors, shapes, and textures.
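
One illustrative way to vary the pointer appearance (an assumption for this sketch, not a technique specified in the disclosure) is to keep the pointer voxels as a separate mask and assign them a color channel at display time:

import numpy as np

def colorize(volume_gray, pointer_mask, rgb):
    # Produce an RGB rendering in which pointer voxels take the chosen color.
    rgb_volume = np.repeat(volume_gray[..., np.newaxis], 3, axis=-1)
    rgb_volume[pointer_mask] = np.asarray(rgb, dtype=rgb_volume.dtype)
    return rgb_volume

gray = np.random.rand(32, 32, 32)                 # stand-in grayscale imaging volume
mask = np.zeros_like(gray, dtype=bool)
mask[14:18, 10:22, 14:18] = True                  # voxels occupied by the pointer

black_pointer = colorize(gray, mask, (0.0, 0.0, 0.0))   # FIG. 5A: black pointer 500
gray_pointer = colorize(gray, mask, (0.5, 0.5, 0.5))    # FIG. 5B: gray pointer 502
red_pointer = colorize(gray, mask, (1.0, 0.0, 0.0))     # FIG. 5C: red pointer 504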



FIG. 6 illustrates variable pointer rates of movement. In FIG. 6A, the black 3D pointer 500 is located within the proximal portion of an artery 600 at time point x. In FIG. 6B, the black 3D pointer 500 has moved and is located distally, towards the end of the artery 600, at time point x+n. In FIG. 6C, the pointer 500 is located within the distal portion of a vein 602 at time point x. In FIG. 6D, the pointer 500 is located within the mid portion of the vein 602 at time point x+n. Note that the 3D pointer 500 is moving faster in the artery 600 as compared to the vein 602.
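
The faster advance in arteries than in veins can be sketched as follows; the speed values are arbitrary placeholders rather than physiologic measurements:

import numpy as np

ARTERY_SPEED = 3.0   # voxels per time step (placeholder value)
VEIN_SPEED = 1.0     # slower advance for venous flow (placeholder value)

def advance_pointers(positions, directions, vessel_types, dt=1.0):
    # Each pointer advances at a rate chosen by the type of vessel it sits in.
    speeds = np.where(vessel_types == "artery", ARTERY_SPEED, VEIN_SPEED)
    return positions + directions * speeds[:, np.newaxis] * dt

positions = np.array([[10.0, 10.0, 32.0],     # pointer in the artery 600
                      [50.0, 50.0, 32.0]])    # pointer in the vein 602
directions = np.array([[0.0, 1.0, 0.0],
                       [0.0, -1.0, 0.0]])
vessel_types = np.array(["artery", "vein"])

# From time point x to x + n the arterial pointer covers three times the distance.
positions = advance_pointers(positions, directions, vessel_types)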


Several features, aspects, embodiments, and implementations have been described. Nevertheless, it will be understood that a wide variety of modifications and combinations may be made without departing from the scope of the inventive concepts described herein. Accordingly, those modifications and combinations are within the scope of the following claims.

Claims
  • 1. A method to display three-dimensional images comprising: receiving a volume of interest from a volumetric dataset;receiving an initial viewing angle of said volume of interest;receiving a first viewpoint for a left eye;receiving a second viewpoint for a right eye, wherein said first viewpoint and said second viewpoint are different viewpoints;displaying, in a head display unit (HDU), a first image for said left eye based on said initial viewing angle, said first viewpoint for said left eye and said volume of interest;displaying, in said HDU, a second image for said right eye based on said initial viewing angle, said second viewpoint for said right eye, and said volume of interest, and wherein said first image for said left eye and said second image for said right eye display a three-dimensional image in said HDU;receiving a third viewpoint for said left eye, wherein a distance from said third viewpoint to said volume of interest is smaller than a distance from said first viewpoint to said volume of interest;receiving a fourth viewpoint for said right eye, wherein a distance from said fourth viewpoint to said volume of interest is smaller than a distance from said second viewpoint to said volume of interest;wherein said third viewpoint and said fourth viewpoint are different viewpoints;displaying, in said HDU, a third zoomed in image for said left eye based on said initial viewing angle, said third viewpoint for said left eye, and said volume of interest; anddisplaying, in said HDU, a fourth zoomed in image for said right eye based on said initial viewing angle, said fourth viewpoint for said right eye, and said volume of interest, and wherein said third image for said left eye and said fourth image for said right eye display an alternate three-dimensional image in said HDU.
  • 2. The method of claim 1, further comprising wherein said third zoomed in image and said fourth zoomed in image are filtered.
  • 3. The method of claim 1, further comprising wherein said third zoomed in image and said fourth zoomed in image are colored.
  • 4. The method of claim 1, further comprising wherein a first convergence point is used for said first image and said second image,wherein a second convergence point is used for said zoomed in third image and said zoomed in fourth image, andwherein said first convergence point and said second convergence point are different.
  • 5. The method of claim 1, further comprising wherein said first viewpoint for said left eye, said second viewpoint for said right eye, said third viewpoint for said left eye, and said fourth viewpoint for said right eye are stored.
  • 6. The method of claim 1, further comprising wherein said volumetric dataset is generated by an imaging device.
  • 7. The method of claim 1, further comprising receiving a subsequent viewing angle of said volume of interest.
  • 8. A non-transitory computer readable information storage medium having one or more processors executing computer readable instructions stored thereon for generating three-dimensional images, the instructions comprising: using a volumetric dataset;using a volume of interest from said volumetric dataset;using an initial viewing angle of said volume of interest;using a first zoomed state, wherein said first zoomed state comprises: using a first viewpoint for a left eye;using a second viewpoint for a right eye wherein said first viewpoint and said second viewpoint are different viewpoints;generating a first image for said left eye for display in a head display unit (HDU) based on said initial viewing angle, said first viewpoint for said left eye and said volume of interest;generating a second image for said right eye for display in said HDU based on said initial viewing angle, said second viewpoint for said right eye, and said volume of interest, and wherein said first image for said left eye and said second image for said right eye display a three-dimensional image in said HDU;using a second zoomed state, wherein said second zoomed state comprises: using a third viewpoint for a left eye wherein said third viewpoint is closer to said volume of interest than said first viewpoint;using a fourth viewpoint for a right eye wherein said fourth viewpoint is closer to said volume of interest than said second viewpoint; wherein said third viewpoint and said fourth viewpoint are different viewpoints;generating a third zoomed in image for said left eye for display in said HDU, based on said initial viewing angle, said third viewpoint for said left eye, and said volume of interest; andgenerating a fourth zoomed in image for said right eye for display in said HDU based on said initial viewing angle, said fourth viewpoint for said right eye, and said volume of interest, and wherein said third zoomed in image for said left eye and said fourth zoomed in image for said right eye comprise a zoomed in three-dimensional image in said HDU.
  • 9. The medium of claim 8, further comprising wherein said third zoomed in image and said fourth zoomed in image are filtered.
  • 10. The medium of claim 8, further comprising wherein said third zoomed in image and said fourth zoomed in image are colored.
  • 11. The medium of claim 8, further comprising wherein a convergence point is used for said first image and said second image, andwherein said convergence point is used for said third zoomed in image and said fourth zoomed in image.
  • 12. The medium of claim 8, further comprising wherein said first viewpoint for said left eye, said second viewpoint for said right eye, said third viewpoint for said left eye, and said fourth viewpoint for said right eye are stored.
  • 13. The medium of claim 8, further comprising wherein said volumetric dataset is generated by an imaging device.
  • 14. The medium of claim 8, further comprising receiving a subsequent viewing angle of said volume of interest.
  • 15. A system comprising: a memory;a processor;a communications interface;an interconnection coupling the memory, the processor and the communications interface; andwherein the memory is encoded with an application for displaying three-dimensional images in a head display unit, that when performed on the processor, provides a process for processing information, the process causing the system to perform the operations of:using a volumetric dataset;using a volume of interest from said volumetric dataset;using an initial viewing angle of said volume of interest;using a first viewpoint for a left eye;using a second viewpoint for a right eye, wherein said first viewpoint and said second viewpoint are different viewpoints;displaying, in a head display unit (HDU), a first image for said left eye based on said initial viewing angle, said first viewpoint for said left eye and said volume of interest;displaying, in said HDU, a second image for said right eye based on said initial viewing angle, said second viewpoint for said right eye, and said volume of interest, and wherein said first image for said left eye and said second image for said right eye display a three-dimensional image in said HDU;moving said first viewpoint for said left eye closer to said volume of interest;moving said second viewpoint for said right eye closer to said volume of interest;displaying, in said HDU, a third image for said left eye based on said initial viewing angle, said moved first viewpoint for said left eye, and said volume of interest; anddisplaying, in said HDU, a fourth image for said right eye based on said initial viewing angle, said moved second viewpoint for said right eye, and said volume of interest, and wherein said third image for said left eye and said fourth image for said right eye display an alternate three-dimensional image in said HDU.
  • 16. The system of claim 15, further comprising wherein said third image and said fourth image are filtered.
  • 17. The system of claim 15, further comprising wherein said third image and said fourth image are colored.
  • 18. The system of claim 15, further comprising wherein a first convergence point is used for said first image and said second image,wherein a second convergence point is used for said third image and said fourth image, andwherein said first convergence point and said second convergence point are different.
  • 19. The system of claim 15, further comprising wherein said first viewpoint for said left eye, said second viewpoint for said right eye, said moved first viewpoint for said left eye, and said moved second viewpoint for said right eye are stored.
  • 20. The system of claim 15, further comprising wherein said volumetric dataset is generated by an imaging device.
  • 21. The system of claim 15, further comprising receiving a subsequent viewing angle of said volume of interest.
  • 22. A method comprising: configuring a head display unit (HDU) to display stereoscopic images of a volume of interest wherein: at a first time point, said HDU displays a first left eye image on a left eye display and a first right eye image on a right eye display; wherein said left eye display of said HDU is configured to be positioned over a left eye of a user,wherein said first left eye image is generated based on a first left eye viewpoint, a viewing angle, and said volume of interest,wherein said first right eye display of said HDU is configured to be positioned over a right eye of said user,wherein said first right eye image is generated based on a first right eye viewpoint, said viewing angle, and said volume of interest,wherein said first right eye viewpoint is different from said first left eye viewpoint; andat a subsequent time point, said HDU displays a second left eye image on said left eye display and a second right eye image on said right eye display; wherein said second left eye image is generated based on a second left eye viewpoint, said viewing angle, and said volume of interest,wherein a distance from said second left eye viewpoint to said volume of interest is different than a distance from said first left eye viewpoint to said volume of interest,wherein said second right eye image is generated based on a second right eye viewpoint, said viewing angle, and said volume of interest,wherein a distance from said second right eye viewpoint to said volume of interest is different than a distance from said first right eye viewpoint to said volume of interest, andwherein said second right eye viewpoint is different from said second left eye viewpoint.
  • 23. The method of claim 22, further comprising wherein said second left eye image and said second right eye image are filtered.
  • 24. The method of claim 22, further comprising wherein said second left eye image and said second right eye image are colored.
  • 25. The method of claim 22, further comprising wherein a first convergence point is used for said first left eye image and said first right eye image,wherein a second convergence point is used for said second left eye image and said second right eye image, andwherein said first convergence point and said second convergence point are different.
  • 26. The method of claim 22, further comprising wherein said first left eye viewpoint, said first right eye viewpoint, said second left eye viewpoint, and said second right eye viewpoint are stored.
  • 27. The method of claim 22, further comprising wherein said volumetric dataset is generated by an imaging device.
  • 28. The method of claim 22, further comprising receiving a subsequent viewing angle of said volume of interest.
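Although the claims above are directed to stereoscopic zooming rather than to the flow pointers of the description, the zoom-by-moving-viewpoints geometry they recite can be illustrated with a short sketch (the interpupillary distance, zoom factor, and helper names are assumptions; rendering itself is omitted):

import numpy as np

IPD = 6.3   # assumed interpupillary distance, in the same units as the volume

def eye_viewpoints(head_position, volume_center, ipd=IPD):
    # Left and right eye viewpoints straddle the head position, perpendicular
    # to the line of sight toward the volume of interest.
    forward = volume_center - head_position
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, np.array([0.0, 0.0, 1.0]))
    right = right / np.linalg.norm(right)
    return head_position - right * ipd / 2.0, head_position + right * ipd / 2.0

def zoom_in(head_position, volume_center, factor=0.5):
    # Zoom by moving the head (and hence both eye viewpoints) closer to the
    # volume of interest while keeping the viewing angle unchanged.
    return volume_center + (head_position - volume_center) * factor

volume_center = np.array([0.0, 0.0, 0.0])
head = np.array([0.0, -100.0, 0.0])

left_1, right_1 = eye_viewpoints(head, volume_center)     # first and second viewpoints
head = zoom_in(head, volume_center)                       # move closer to zoom in
left_2, right_2 = eye_viewpoints(head, volume_center)     # third and fourth viewpoints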
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a Continuation of U.S. patent application Ser. No. 16/506,073, filed Jul. 9, 2019, which is a Continuation of U.S. patent application Ser. No. 15/878,463, filed Jan. 24, 2018, now U.S. Pat. No. 10,795,457, which is a Continuation-in-Part of U.S. patent application Ser. No. 14/877,442, filed Oct. 7, 2015, now U.S. Pat. No. 9,980,691, which is a Continuation-in-Part of U.S. patent application Ser. No. 12/176,569, filed Jul. 21, 2008, now U.S. Pat. No. 9,349,183, which is a Continuation-in-Part of U.S. patent application Ser. No. 11/941,578, filed Nov. 16, 2007, now U.S. Pat. No. 8,384,771, which claims the benefit of and priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application Ser. No. 60/877,931, filed Dec. 28, 2006, each of which is incorporated herein by reference in its entirety.

US Referenced Citations (386)
Number Name Date Kind
4472737 Iwasaki Sep 1984 A
4808979 DeHoff et al. Feb 1989 A
4870600 Hiraoka Sep 1989 A
4871233 Sheiman Oct 1989 A
4896210 Brokenshire et al. Jun 1990 A
4952024 Gale Aug 1990 A
4987527 Hamada et al. Jan 1991 A
5049987 Hoppenstein Sep 1991 A
5113285 Franklin et al. May 1992 A
5162897 Jitsukata et al. Nov 1992 A
5200819 Nudelman et al. Apr 1993 A
5233458 Moffitt et al. Aug 1993 A
5278884 Eberhard et al. Jan 1994 A
5293529 Yoshimura et al. Mar 1994 A
5371778 Yanof et al. Dec 1994 A
5402191 Dean et al. Mar 1995 A
5488952 Schoolman Feb 1996 A
5493595 Schoolman Feb 1996 A
5510832 Garcia Apr 1996 A
5524187 Feiner et al. Jun 1996 A
5535747 Katakura Jul 1996 A
5541641 Shimada Jul 1996 A
5564810 Larson Oct 1996 A
5566280 Fukui et al. Oct 1996 A
5621867 Murata et al. Apr 1997 A
5627582 Muramoto et al. May 1997 A
5644324 Maguire, Jr. Jul 1997 A
5659625 Marquardt Aug 1997 A
5682172 Travers et al. Oct 1997 A
5682437 Okino et al. Oct 1997 A
5696521 Robinson et al. Dec 1997 A
5708359 Gregory et al. Jan 1998 A
5714997 Anderson Feb 1998 A
5734416 Ito et al. Mar 1998 A
5745163 Nakamura et al. Apr 1998 A
5822117 Kleinberger et al. Oct 1998 A
5841830 Barni et al. Nov 1998 A
5850352 Moezzi et al. Dec 1998 A
5852646 Klotz et al. Dec 1998 A
5867588 Marquardt Feb 1999 A
5880883 Sudo Mar 1999 A
5978143 Spruck Nov 1999 A
5986662 Argiro et al. Nov 1999 A
5993004 Moseley et al. Nov 1999 A
5999165 Matsumoto Dec 1999 A
6002518 Faris Dec 1999 A
6034716 Whiting et al. Mar 2000 A
6052100 Soltan et al. Apr 2000 A
6057827 Matthews May 2000 A
6066095 Morsy et al. May 2000 A
6084937 Tam et al. Jul 2000 A
6100862 Sullivan Aug 2000 A
6108005 Starks Aug 2000 A
6115449 Jang et al. Sep 2000 A
6124977 Takahashi Sep 2000 A
6130930 Tam Oct 2000 A
6191808 Katayama et al. Feb 2001 B1
6201566 Harada et al. Mar 2001 B1
6211884 Knittel et al. Apr 2001 B1
6211927 Yamazaki et al. Apr 2001 B1
6220709 Heger Apr 2001 B1
6225979 Taima et al. May 2001 B1
6252707 Kleinberger et al. Jun 2001 B1
6272366 Vining Aug 2001 B1
6275561 Danielsson Aug 2001 B1
6276799 Van Saarloos et al. Aug 2001 B1
6297799 Knittel et al. Oct 2001 B1
6342378 Zhang et al. Jan 2002 B1
6342878 Chevassus et al. Jan 2002 B1
6346940 Fukunaga Feb 2002 B1
6377230 Yamazaki et al. Apr 2002 B1
6407737 Zhao et al. Jun 2002 B1
6429861 Hossack et al. Aug 2002 B1
6429884 Budz et al. Aug 2002 B1
6442417 Shahidi et al. Aug 2002 B1
6449005 Faris Sep 2002 B1
6449090 Omar et al. Sep 2002 B1
6449309 Tabata Sep 2002 B1
6466185 Sullivan et al. Oct 2002 B2
6476607 Dannels et al. Nov 2002 B1
6487432 Slack Nov 2002 B2
6490335 Wang et al. Dec 2002 B1
6507359 Muramoto et al. Jan 2003 B1
6532008 Guralnick Mar 2003 B1
6545650 Yamada et al. Apr 2003 B1
6549803 Raghavan et al. Apr 2003 B1
6570629 Hirakata et al. May 2003 B1
6580448 Stuttler Jun 2003 B1
6606091 Liang et al. Aug 2003 B2
6608628 Ross et al. Aug 2003 B1
6676259 Trifilo Jan 2004 B1
6692441 Poland et al. Feb 2004 B1
6711231 Knoplioch et al. Mar 2004 B2
6734847 Baldeweg et al. May 2004 B1
6762794 Ogino Jul 2004 B1
6792071 Dewaele Sep 2004 B2
6798412 Cowperthwaite Sep 2004 B2
6847336 Lemelson Jan 2005 B1
6862364 Berestov Mar 2005 B1
6885886 Bauch et al. Apr 2005 B2
6947039 Gerritsen et al. Sep 2005 B2
7002619 Dean et al. Feb 2006 B1
7020236 Shechter Mar 2006 B2
7058156 Bruder et al. Jun 2006 B2
7113186 Kim et al. Sep 2006 B2
RE39342 Starks et al. Oct 2006 E
7127091 Op De Beek et al. Oct 2006 B2
7187420 Yamazaki et al. Mar 2007 B2
7190825 Yoon et al. Mar 2007 B2
7193626 Otani et al. Mar 2007 B2
7193773 Haisch et al. Mar 2007 B2
7242402 Betting et al. Jul 2007 B1
7298372 Pfister et al. Nov 2007 B2
7301510 Hewitt et al. Nov 2007 B2
7321682 Tooyama et al. Jan 2008 B2
7324085 Balakrishnan et al. Jan 2008 B2
7466336 Regan et al. Dec 2008 B2
7479933 Weissman Jan 2009 B2
7524053 Lipton Apr 2009 B2
7604597 Murashita et al. Oct 2009 B2
7605776 Satoh et al. Oct 2009 B2
7643025 Lange Jan 2010 B2
7647593 Matsumoto Jan 2010 B2
7654826 Faulkner et al. Feb 2010 B2
7715608 Vaz et al. May 2010 B2
7773074 Arenson et al. Aug 2010 B2
7786990 Wegenkittl et al. Aug 2010 B2
7796790 McNutt et al. Sep 2010 B2
7808449 Neidrich et al. Oct 2010 B2
7822265 Berretty Oct 2010 B2
7832869 Maximus et al. Nov 2010 B2
7840047 Böing et al. Nov 2010 B2
7907167 Vesely et al. Mar 2011 B2
7957061 Connor Jun 2011 B1
8049773 Ishikawa et al. Nov 2011 B2
8078000 Böhm et al. Dec 2011 B2
8159526 Sato et al. Apr 2012 B2
8160341 Peng et al. Apr 2012 B2
8165365 Bernard et al. Apr 2012 B2
8175683 Roose May 2012 B2
8199168 Virtue Jun 2012 B2
8228327 Hendrickson et al. Jul 2012 B2
8233103 MacNaughton et al. Jul 2012 B2
8248458 Schowengerdt et al. Aug 2012 B2
8289380 Kim et al. Oct 2012 B2
8363096 Aguirre Jan 2013 B1
8384771 Douglas Feb 2013 B1
8398541 DiMaio et al. Mar 2013 B2
8480234 Richards Jul 2013 B2
8508583 Goto Aug 2013 B2
8520024 Guthrie et al. Aug 2013 B2
8542326 MacNaughton et al. Sep 2013 B2
8547422 Surman Oct 2013 B2
8565505 Bergmans et al. Oct 2013 B2
8567954 Koehler et al. Oct 2013 B2
D692941 Klinar et al. Nov 2013 S
8712137 Wollenweber Apr 2014 B2
8745536 Davidson Jun 2014 B1
8750450 Ulrici et al. Jun 2014 B2
8803946 Tomita Aug 2014 B2
8866883 Rohaly et al. Oct 2014 B2
8885027 Yamaguchi et al. Nov 2014 B2
8955978 Yanai Feb 2015 B2
8964008 Bathiche Feb 2015 B2
8998417 Yanai Apr 2015 B2
9036882 Masumoto et al. May 2015 B2
9077982 Rha et al. Jul 2015 B2
9083963 Karins-Naske et al. Jul 2015 B2
9094676 Schutten et al. Jul 2015 B1
9116666 Salter et al. Aug 2015 B2
9131913 Sehnert et al. Sep 2015 B2
9142059 Mallet et al. Sep 2015 B1
9338445 Atkins May 2016 B2
9349183 Douglas May 2016 B1
9473766 Douglas et al. Oct 2016 B2
9677741 Hsu et al. Jun 2017 B2
9691175 Rane Jun 2017 B2
9736463 Gharib et al. Aug 2017 B2
9769442 Shirai et al. Sep 2017 B2
9980691 Douglas et al. May 2018 B2
9986176 Moghadam May 2018 B2
10019812 Bendall Jul 2018 B2
10042511 Roe et al. Aug 2018 B2
10088686 Robbins et al. Oct 2018 B2
10136124 MacKenzie et al. Nov 2018 B2
10297089 Buelow et al. May 2019 B2
10373309 Thiele et al. Aug 2019 B2
10417808 Noshi et al. Sep 2019 B2
10492749 Boone et al. Dec 2019 B2
10545251 Gesbert et al. Jan 2020 B2
10795457 Douglas et al. Oct 2020 B2
20010045979 Matsumoto et al. Nov 2001 A1
20020068863 Slack Jun 2002 A1
20020101658 Hoppenstein Aug 2002 A1
20020105602 Pan Aug 2002 A1
20020112237 Kelts Aug 2002 A1
20020113868 Park Aug 2002 A1
20020183607 Bauch et al. Dec 2002 A1
20030020809 Gibbon et al. Jan 2003 A1
20030026474 Yano Feb 2003 A1
20030107644 Choi Jun 2003 A1
20030194119 Manjeshwar et al. Oct 2003 A1
20030204364 Goodwin et al. Oct 2003 A1
20030218720 Morita et al. Nov 2003 A1
20040054248 Kimchy et al. Mar 2004 A1
20040070584 Pyo et al. Apr 2004 A1
20040082846 Johnson et al. Apr 2004 A1
20040096799 Hughes et al. May 2004 A1
20040174605 Olsson Sep 2004 A1
20040204644 Tsougarakis et al. Oct 2004 A1
20040208358 Tooyama et al. Oct 2004 A1
20040223636 Edic et al. Nov 2004 A1
20040238732 State et al. Dec 2004 A1
20040246269 Serra et al. Dec 2004 A1
20040254454 Kockro Dec 2004 A1
20050017938 O'Donnell et al. Jan 2005 A1
20050030621 Takahashi et al. Feb 2005 A1
20050055118 Nikolskiy et al. Mar 2005 A1
20050062684 Geng Mar 2005 A1
20050065423 Owen Mar 2005 A1
20050065424 Shah et al. Mar 2005 A1
20050096530 Daw et al. May 2005 A1
20050110791 Krishnamoorthy et al. May 2005 A1
20050148848 Guang et al. Jul 2005 A1
20050151152 Miller et al. Jul 2005 A1
20050151730 Lobregt Jul 2005 A1
20050152591 Kiraly et al. Jul 2005 A1
20050168461 Acosta Aug 2005 A1
20050208449 Abolfathi et al. Sep 2005 A1
20050244050 Nomura et al. Nov 2005 A1
20050278408 Matsumoto Dec 2005 A1
20050283063 Besson et al. Dec 2005 A1
20050285844 Morita et al. Dec 2005 A1
20060013472 Kagitani Jan 2006 A1
20060026533 Napoli et al. Feb 2006 A1
20060033992 Solomon Feb 2006 A1
20060056680 Stutsman et al. Mar 2006 A1
20060056726 Fujiwara et al. Mar 2006 A1
20060058605 Deischinger et al. Mar 2006 A1
20060077204 Pfister et al. Apr 2006 A1
20060079755 Stazzone et al. Apr 2006 A1
20060109753 Fergason May 2006 A1
20060120583 Dewaele Jun 2006 A1
20060171028 Oikawa et al. Aug 2006 A1
20060173338 Ma et al. Aug 2006 A1
20060177133 Kee Aug 2006 A1
20060181482 Iaquinto Aug 2006 A1
20060210111 Cleveland et al. Sep 2006 A1
20060210147 Sakaguchi Sep 2006 A1
20060227103 Koo et al. Oct 2006 A1
20060232665 Schowengerdt et al. Oct 2006 A1
20060238441 Benjamin et al. Oct 2006 A1
20060239523 Stewart et al. Oct 2006 A1
20060241458 Hayashi Oct 2006 A1
20060268104 Cowan et al. Nov 2006 A1
20060279569 Acosta et al. Dec 2006 A1
20060286501 Chishti et al. Dec 2006 A1
20070021738 Hasser et al. Jan 2007 A1
20070035830 Matveev et al. Feb 2007 A1
20070040854 Lievin et al. Feb 2007 A1
20070053562 Reinhardt et al. Mar 2007 A1
20070058249 Hirose et al. Mar 2007 A1
20070085902 Walker et al. Apr 2007 A1
20070103459 Stoval, III et al. May 2007 A1
20070115204 Budz et al. May 2007 A1
20070116357 Dewaele May 2007 A1
20070118408 Mahesh et al. May 2007 A1
20070146325 Poston et al. Jun 2007 A1
20070147671 Di Vincenzo et al. Jun 2007 A1
20070165927 Muradyan et al. Jul 2007 A1
20070167801 Webler et al. Jul 2007 A1
20070188520 Finley et al. Aug 2007 A1
20070206155 Lipton Sep 2007 A1
20070237369 Brunner et al. Oct 2007 A1
20070263915 Mashiach Nov 2007 A1
20070274585 Zhang et al. Nov 2007 A1
20070279435 Ng et al. Dec 2007 A1
20070279436 Ng et al. Dec 2007 A1
20070285774 Merrirt et al. Dec 2007 A1
20080025584 Kunz et al. Jan 2008 A1
20080033240 Hoffman et al. Feb 2008 A1
20080037843 Fu et al. Feb 2008 A1
20080044069 DuGal Feb 2008 A1
20080055305 Blank et al. Mar 2008 A1
20080055310 Mitchell et al. Mar 2008 A1
20080062173 Tashiro Mar 2008 A1
20080088621 Grimaud et al. Apr 2008 A1
20080094398 Ng et al. Apr 2008 A1
20080100612 Dastmalchi et al. May 2008 A1
20080117233 Mather et al. May 2008 A1
20080154952 Waldinger et al. Jun 2008 A1
20080267499 Deischinger et al. Oct 2008 A1
20080267527 Berretty Oct 2008 A1
20080281182 Rabben et al. Nov 2008 A1
20080291268 Beretty Nov 2008 A1
20080297434 Abileah Dec 2008 A1
20090016491 Li Jan 2009 A1
20090034684 Bernard et al. Feb 2009 A1
20090040227 Vrba Feb 2009 A1
20090051685 Takagi et al. Feb 2009 A1
20090080765 Bernard et al. Mar 2009 A1
20090119609 Matsumoto May 2009 A1
20090147073 Getty Jun 2009 A1
20090217209 Chen et al. Aug 2009 A1
20090219283 Hendrickson et al. Sep 2009 A1
20090219383 Passmore Sep 2009 A1
20090231697 Marcus et al. Sep 2009 A1
20090232275 Spartiotis et al. Sep 2009 A1
20090237492 Kikinis et al. Sep 2009 A1
20090244267 Yuan et al. Oct 2009 A1
20090278917 Dobbins et al. Nov 2009 A1
20090282429 Olsson et al. Nov 2009 A1
20090304232 Tsukizawa Dec 2009 A1
20090324052 Nowinski Dec 2009 A1
20100045783 State et al. Feb 2010 A1
20100081912 McKenna Apr 2010 A1
20100085423 Lange Aug 2010 A1
20100194861 Hoppenstein Aug 2010 A1
20100201785 Lantin Aug 2010 A1
20100231705 Yahav et al. Sep 2010 A1
20100246911 Rabben et al. Sep 2010 A1
20110026808 Kim et al. Feb 2011 A1
20110043644 Munger et al. Feb 2011 A1
20110063576 Redmann et al. Mar 2011 A1
20110107270 Wang et al. May 2011 A1
20110109620 Hong et al. May 2011 A1
20110141246 Schwartz et al. Jun 2011 A1
20110194728 Kutcka et al. Aug 2011 A1
20110196237 Pelissier Aug 2011 A1
20110228051 Dedoglu et al. Sep 2011 A1
20110254845 Oikawa et al. Oct 2011 A1
20110273543 Ushio et al. Nov 2011 A1
20110279450 Seong et al. Nov 2011 A1
20120008734 Thomson et al. Jan 2012 A1
20120008735 Maurer et al. Jan 2012 A1
20120013711 Tamir et al. Jan 2012 A1
20120019636 Gefen et al. Jan 2012 A1
20120038631 Mayhew et al. Feb 2012 A1
20120056998 Kang et al. Mar 2012 A1
20120071755 Zheng et al. Mar 2012 A1
20120075293 Kuwabara et al. Mar 2012 A1
20120113235 Shintani May 2012 A1
20120120202 Yoon et al. May 2012 A1
20120120207 Shimazaki et al. May 2012 A1
20120127284 Bar-Zeev et al. May 2012 A1
20120162219 Kobayashi et al. Jun 2012 A1
20120190439 Nourbakhsh Jul 2012 A1
20120190967 Nahm Jul 2012 A1
20120206665 Sakai et al. Aug 2012 A1
20120209106 Liang et al. Aug 2012 A1
20120215218 Lipani Aug 2012 A1
20120224755 Wu Sep 2012 A1
20120229595 Miller Sep 2012 A1
20120242569 Hamagishi Sep 2012 A1
20120269424 Ebata et al. Oct 2012 A1
20120287361 Sugihara Nov 2012 A1
20120306849 Steen Dec 2012 A1
20130002646 Lin et al. Jan 2013 A1
20130003020 Koehler et al. Jan 2013 A1
20130057830 Tsai et al. Mar 2013 A1
20130070984 Shirasaka et al. Mar 2013 A1
20130141552 Kwon Jun 2013 A1
20130176566 Mitchell et al. Jul 2013 A1
20130182085 Ziarati Jul 2013 A1
20130242063 Matsumoto Sep 2013 A1
20130245375 DiMaio et al. Sep 2013 A1
20130251242 Suzuki et al. Sep 2013 A1
20130278727 Tamir et al. Oct 2013 A1
20130335417 McQueston et al. Dec 2013 A1
20140051988 Lautenschlager Feb 2014 A1
20140063376 Tsang et al. Mar 2014 A1
20140065663 Vasquez et al. Mar 2014 A1
20140176685 Oikawa et al. Jun 2014 A1
20140210965 Goodman et al. Jul 2014 A1
20140253698 Evans et al. Sep 2014 A1
20140253699 Schafer et al. Sep 2014 A1
20140307067 Douglas Oct 2014 A1
20140340400 Takeguchi et al. Nov 2014 A1
20140347726 Yang et al. Nov 2014 A1
20150077713 Drumm Mar 2015 A1
20150379351 Dibenedetto Dec 2015 A1
20160038248 Bharadwaj et al. Feb 2016 A1
20160287201 Bergtholdt et al. Oct 2016 A1
20160302895 Rohaly et al. Oct 2016 A1
20180116728 Lang May 2018 A1
20180168740 Ryan Jun 2018 A1
Foreign Referenced Citations (65)
Number Date Country
1885233 Dec 2006 CN
102968791 Mar 2013 CN
19534750 Mar 1997 DE
102011080588 Feb 2013 DE
0571827 Dec 1993 EP
0592652 Sep 1997 EP
0918242 May 1999 EP
1056049 Nov 2000 EP
1056049 Nov 2000 EP
0970589 Aug 2004 EP
1683485 Jul 2006 EP
1791087 May 2007 EP
1843296 Oct 2007 EP
2838598 Oct 2004 FR
H 09-205660 Aug 1997 JP
H 11-232010 Aug 1999 JP
2000-333950 Dec 2000 JP
2001-504603 Apr 2001 JP
2002-330958 Nov 2002 JP
2005-130309 May 2005 JP
2005-521960 Jul 2005 JP
2006-113088 Apr 2006 JP
3816599 Jun 2006 JP
2008-220406 Sep 2008 JP
2009-000167 Jan 2009 JP
2009-018048 Jan 2009 JP
2009-022476 Feb 2009 JP
2009-515404 Apr 2009 JP
4319165 Jun 2009 JP
4519898 May 2010 JP
2012-105796 Jun 2012 JP
2012-142846 Jul 2012 JP
2013-538360 Oct 2013 JP
2014-222459 Nov 2014 JP
2015-036084 Feb 2015 JP
10-2004-0076846 Sep 2004 KR
10-2006-0085596 Jul 2006 KR
10-0659327 Dec 2006 KR
10-2007-0082138 Aug 2007 KR
10-2011-0125416 Nov 2011 KR
10-1083808 Nov 2011 KR
10-2012-0051065 May 2012 KR
10-1162053 Jul 2012 KR
10-2014-0048994 Apr 2014 KR
WO 9500872 Jan 1995 WO
WO 9700482 Jan 1997 WO
WO 9746029 Dec 1997 WO
WO 9923586 May 1999 WO
WO 01005161 Jan 2001 WO
WO 03010977 Feb 2003 WO
WO 03083781 Oct 2003 WO
WO 03100542 Dec 2003 WO
WO 2005062629 Jul 2005 WO
WO 2006038744 Apr 2006 WO
WO 2007052216 May 2007 WO
WO 2007059477 May 2007 WO
WO 2007063442 Jun 2007 WO
WO 2009076303 Jun 2009 WO
WO 2011031315 Mar 2011 WO
WO 2011160200 Dec 2011 WO
WO 2012030091 Mar 2012 WO
WO 2012101395 Aug 2012 WO
WO 2012144453 Oct 2012 WO
WO 2013011035 Jan 2013 WO
WO 2015069049 May 2015 WO
Non-Patent Literature Citations (157)
Entry
Inter Partes Review—Microsoft Corporation v. D3D Technologies, Inc; United States Patent and Trademark Office—Before the Patent Trial and Appeal Board, Case No. IPR2021-00647.
Inter Partes Review—Microsoft Corporation v. D3D Technologies, Inc; United States Patent and Trademark Office—Before the Patent Trial and Appeal Board, Case No. IPR2021-00648.
U.S. Appl. No. 60/735,458, filed Nov. 11, 2005, Murphy et al.
U.S. Appl. No. 60/764,508, filed Feb. 2, 2006, Murphy et al.
Petition for Inter Partes Review of U.S. Pat. No. 8,384,771, including Exhibits 1001-1012 and 1020-1022; Case No. IPR2021 -00647, filed Mar. 23, 2021 (808 pages).
Documents filed with Microsoft Corporation v. D3D Technologies, Inc.; United States Patent and Trademark Office—Before the Patent Trial and Appeal Board, Case No. IPR2021-00647; Filed on Mar. 26, 2021 (5 pages).
Petition for Inter Partes Review of U.S. Pat. No. 9,349,183, including Exhibits 1001-1007,1009; 1010; 1013; 1014, and 1020-1022; Case No. IPR2021-00648, filed Mar. 23, 2021 (1,020 pages).
Documents filed with Microsoft Corporation v. D3D Technologies, Inc.; United States Patent and Trademark Office—Before the Patent Trial and Appeal Board, Case No. IPR2021-00648; Filed on Mar. 26, 2021 (5 pages).
Notice of Allowance for U.S. Appl. No. 17/021,548 dated Jan. 13, 2021.
Notice of Allowance for U.S. Appl. No. 17/095,411 dated Feb. 2, 2021.
Documents filed with U.S. District Court Proceedings for D3D Technologies, Inc. v. Microsoft Corporation; U.S. District Court, Middle District of Florida Orlando Division; Civil Action No. 6:2Q-cv-01699-GAP-DCI; Includes publicly available documents filed from Nov. 9, 2020-Jan. 4, 2021; Docket Nos. 22-41; (1,536 pages).
Documents filed with U.S. District Court Proceedings for D3D Technologies, Inc. v. Microsoft Corporation; U.S. District Court, Middle District of Florida Orlando Division; Civil Action No. 6:20-cv-01699-GAP-DCI; Includes publicly available documents filed from Jan. 6, 2021-Feb. 3, 2021; Docket Nos. 42-46; (96 pages).
U.S. Appl. No. 60/673,257, filed Apr. 20, 2005, Bar-Zohar et al.
U.S. Appl. No. 60/835,852, filed Aug. 4, 2006, Anderson et al.
Azuma, Ronald T. “A Survey of Augmented Reality” in Presence: Teleoperators and Virtual Environments 6, 4 (Aug. 1997) pp. 355-385.
By the Editors of Electronic Gaming Monthly “1993 Video Game Preview Guide” 1993.
Cakmakci, Ozan et al. “Head-Worn Displays: A Review” Journal of Display Technology, vol. 2, No. 3, Sep. 2006.
Calhoun, Paul S. et al. “Three-Dimensional Volume Rendering of Spiral CT Data: Theory and Method” Radio Graphics; vol. 19, No. 3; May-Jun. 1999.
CBR Staff Writer “Sense8 Launches World Up, Virtual Reality Tool” CBR; https://www.cbronline.com; Sep. 8, 1995.
Cochrane, Nathan “VFX-1 Virtual Reality Helmet by Forte” Game Bytes Magazine; 1994.
D'Orazio, Dante et al. “Valve's VR Headset is Called the Vive and it's Made by HTC” The Verge: https://www.theverge/com/2015/3/1/8127445/htc-vive-valve-vr-headset; Mar. 1, 2015.
Digest of Papers “First International Symposium on Wearable Computers” IEEE Computer Society Technical Committee on Fault Tolerant Computing; Cambridge, MA; Oct. 13-14, 1997 (5 pages).
Digest of Papers “Second International Symposium on Wearable Computers” IEEE Computer Society Technical Committee on Fault Tolerant Computing; Pittsburgh, PA; Oct. 19-20, 1998 (6 pages).
Doneus, Michael et al. “Anaglyph Images—Still A Good Way to Look at 3D-Objects?” Oct. 1999.
Edirisinghe, E.A et al. “Stereo Imaging, An Emerging Technology” Jan. 2000.
Fisher, Scott S. “Portfolio of Work: Environmental Media Project” Graduate School of Media and Governance, Keio University, Tokyo, Japan 1999-Current.
Fisher, Scott S. “Portfolio of Work: Menagerie” Telepresence Research, Inc. San Francisco, CA 1993.
Fisher, Scott S. “Portfolio of Work: NASA VIEWlab” NASA Ames Research Center, Mountain View CA 1985-90.
Fisher, Scott S. “Portfolio of Work: Stereoscopic Workstation” Architecture Machine Group, MIT, Cambridge, MA 1981.
Fisher, Scott S. “Portfolio of Work: Telepresence Mobile Robot” Telepresence Research, Inc., San Francisco, CA 1991.
Fisher, Scott S. “Portfolio of Work: Viewpoint Dependent Imaging” Architecture Machine Group, MIT, Cambridge, MA 1981.
Fisher, Scott S. Portfolio of Work: Virtual Brewery Adventure: Telepresence Research, Inc., San Francisco, CA 1994.
Fisher, Scott S. “Portfolio of Work: Virtual Explorer” University of California, San Diego, CA 1998.
Fisher, Scott S. et al. “Virtual Interface Environment Workstations” Proceedings of the Human Factors Society—32nd Annual Meeting—1988.
Fisher, Scott S. “Portfolio of Work: VRML Projects” Telepresence Research, Inc., San Francisco, CA 1996.
Fuhrmann, A.L. et al. “Distributed Software-Based Volume Visualization in a Virtual Environment” The Eurographics Association and Blackwell Publishing; vol. 0, No. 0, pp. 1-11; 1981.
Galton, N. “Fast Inspection of Contents of a Volume of 3D Data” IBM Technical Disclosure Bulletin; ip.com; Feb. 1, 1994 (3 pages).
He, Changming “Volume Visualization in Projection-Based Virtual Environments: Interaction and Exploration Tools Design and Evaluation” Griffith University; 2011.
Heuser, John E. “Membrane Traffic in Anaglyph Stereo” Munksgaard International Publishers; Traffic 2000, vol. 1, 35-37.
IEEE 1998 Virtual Reality Annual International Symposium IEEE Computer Society; Atlanta, GA; Mar. 14-18, 1998 (8 pages).
Interrante, Victoria et al. “Strategies for Effectively Visualizing 3D Flow with Volume LIC” IEEE Visualization Conference; 1997; pp. 1-5.
Kaluszka, Aaron “3DS North American Price, Date, Colors Set” NintendoWorld Report: Jan. 19, 2011.
Kancherla, Anantha R. et al. “A Novel Virtual Reality Tool For Teaching Dynamic 3D Anatomy” Conference Paper; Jan. 1995.
Kato, Hirokazu et al. “Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System” IWAR '99: Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality; Oct. 1999.
Kniss, Joe et al., “Interactive Texture-Based Volume Rendering for Large Data Sets” IEEE Computer Graphics and Applications; Jul./Aug. 2001.
Krapichler, Christian et al. “VR Interaction Techniques for Medical Imaging Applications” Computer Methods and Programs in Biomedicine 56; pp. 65-74; 1998.
Kratz, Andrea et al. “GPU-Based High-Quality Volume Rendering for Virtual Environments” Oct. 2006.
Kratz, Andrea “Integration of Hardware Volume Renderer into a Virtual Reality Application” Universitat Koblenz Landau; Oct. 2005.
Kress, Bernard et al. “Speckle Reduction Technique for Laser Based Automotive Head Up Display (HUD) Projectors” Proceedings vol. 8026, Photonic Applications for Aerospace, Transportation, and Harsh Environment II; 80260P (2011) https://doi.org/10.1117/12.886536; May 26, 2011.
Lima, Luis Alberto et al. “Virtual Seismic interpretation” IEEE XI SIBGRAPI Proceedings, Oct. 1998.
Lorensen, William E. et al. “Marching Cubes: A High Resolution 3D Surface Construction Algorithm” SIGGRAPH '87: Proceedings of the 14th annual conference on Computer graphics and interactive techniques Aug. 1987.
Marescaux, Jacques et al. “Augmented-Reality-Assisted Laparoscopic Adrenalectomy” Journal of American Medical Association; vol. 292, No. 18; Nov. 10, 2004.
McAllister, David F. “Display Technology: Stereo & 3D Display Technologies” Mar. 2003.
McKenna, Michael et al. “Three Dimensional Visual Display Systems for Virtual Environments” The Massachusetts Institute of Technology; Presence, vol. 1, No. 4, Fall 1992.
Mellott “Cybermaxx Virtual Reality Helmet” Mellott's VR; https://www.mellottsvrpage.com/index/php/cybermaxx-virtual-reality-helmet/; Jan. 26, 2021.
Moeller, D.P.F. “Mathematical and Computational Modeling and Simulation: Fundamentals and Case Studies” Springer-Verlag Berlin Heidelber; 2004.
NASA “The Virtual Interface Environment Workstation (VIEW)” Partnership with VPL Research, Inc.; https://www.nasa.gov/ames/spinoff/new_continent_of_ideas/; 1990.
Osorio, Angel et al. “A New PC Based on Software to Take and Validate Clinical Decisions for Colorectal Cancer using Metric 3D images Segmentations” https://dx.doi.org/10.1594/ecr2010/C-1071; 10.1594/ecr2010/C-1071; 2010.
PlayStation “Announcing the Price and Release Date for PlayStation VR” Available at https://www.youtube.com/watch?v=wZ57CI3Nq6o; Mar. 15, 2016.
Popescu, Voicu et al. “Three-Dimensional Display Rendering Acceleration Using Occlusion Camera Reference Images” Journal of Display Technology, vol. 2, No. 3, Sep. 2006.
Radeva, Nadezhda et al. “Generalized Temporal Focus+Context Framework for Improved Medical Data Exploration” Society for imaging informatics in Medicine; Jan. 8, 2014.
Rosenberg, Adam “Hands-On with Oculus Rift, John Carmack's Virtual Reality Goggles” G4 Media, LLC; Jun. 14, 2012.
Schmalstieg, Dieter et al. “The Studierstube Augmented Reality Project” https://arbook.icg.tugraz.at/schmalstieg/Schmalstieg_045.pdf; 2005.
ScienceDaily “FDA Approves New Robotic Surgery Device” ScienceDaily; Food and Drug Administration; Jul. 17, 2000.
Soler, L., et al. “Virtual Reality and Augmented Reality in Digestive Surgery” Proceedings of the Third IEEE and ACM International Symposium on Mixed and Augmented Reality; 2004.
Soler, Luc et al. “Virtual Reality, Augmented Reality, and Robotics Applied to Digestive Operative Procedures: From in Vivo Animal Preclinical Studies to Clinical use” Proceedings of SPIE; 2006.
SONY “Sony Global—Product & Technology Milestones—Projector” https://www.sony.net/SonyInfo/CorporateInfo/History/sonyhistory-n.html; printed Feb. 23, 2021.
SONY “Projector Head Mounted Display” Sony Global—Product & Technology Milestones—Projector Head Mounted Display; https://www.sony.net/SonyInfo/CorporateInfo/History/sonyhistory-n.html; Jan. 26, 2021.
Storey, Neil et al. “Interactive Stereoscopic Computer Graphic Display Systems” Proc. Interact '84; pp. 163-168; Sep. 4-7, 1984.
Sutherland, Ivan E. “A Head-Mounted Three Dimensional Display” Fall Join Computer Conference, 1968.
The Computer Chronicles “Virtual Reality” available at https://www.youtube.com/watch?v=wfHMSqQKg6s; 1992.
Tresens, Marc Antonijuan et al. “Hybrid-Reality: A Collaborative Environment for Biomedial Data Exploration Exploiting 2-D and 3-D Correspondence” Studies in Health Technology and Informatics; Feb. 2004.
Ultrasound Visualization Research “UNC Ultrasound/Medical Augmented Reality Research: Augmented Reality Technology” https://www.cs.unc.edu/Research/US/; Jun. 15, 2000.
Vidal, F. P. et al., “Principles and Applications of Medical Virtual Environments” Eurographics 2004.
V-Rtifacts “Retrospective Photo Review of Forte VFX1 Virtual Reality System” https://wwvrtifacts.com/retrospective-photo-review-of-forte-vlx1-virtual-reality-system/; Jan. 26, 2021.
V-Rtifacts “Teardown—Virtual Research V6: Head Mounted Displays, How-To; Teardowns; Tutorials, Stereoscopic 3D, VR Companies” https://vrtifacts.com/teardown-virtual-research-v6/; printed Jan. 26, 2021.
WIKIPEDIA “MechWarrior 2: 31st Century Combat” https://en.wikipedia.org/wiki/MechWarrior_2:_31st_Century_Combat; Jan. 26, 2021.
WIKIPEDIA “Virtual Boy” https://en.wikipedia.org/wiki/Virtual_Boy; Jan. 11, 2021.
Wikipedia “VPL Research” https://en.wikipedia.org/wiki/VPL_Research; Jan. 22, 2021.
Notice of Allowance for U.S. Appl. No. 17/122,549 dated Mar. 3, 2021.
Defendant Microsoft Corporation's Preliminary Noninfringement Contentions for D3D Technologies, Inc. v. Microsoft Corporation; U.S. District Court, Middle District of Florida Orlando Division; Civil Action No. 6:20-cv-01699-GAP-DCI; Filed Feb. 4, 2021 (1,114 Pages).
U.S. Appl. No. 11/941,578, filed Nov. 16, 2007 U.S. Pat. No. 8,384,771.
U.S. Appl. No. 12/176,569, filed Jul. 21, 2008 U.S. Pat. No. 9,349,183.
U.S. Appl. No. 14/313,398, filed Jun. 24, 2014 U.S. Pat. No. 9,473,766.
U.S. Appl. No. 14/877,442, filed Oct. 7, 2015 U.S. Pat. No. 9,980,691.
U.S. Appl. No. 15/878,463, filed Jan. 24, 2018 U.S. Pat. No. 10,795,457.
D3D Technologies, Inc. v. Microsoft Corporation; U.S. District Court, Middle District of Florida Orlando Division; Civil Action No. 6:20-cv-01699-GAP-DCI.
U.S. Appl. No. 17/021,548, filed Sep. 15, 2020, Douglas et al.
U.S. Appl. No. 17/095,411, filed Nov. 11, 2020, Douglas et al.
U.S. Appl. No. 17/122,549, filed Dec. 15, 2020, Douglas et al.
U.S. Appl. No. 60/842,377, filed Sep. 6, 2006, Nowinski.
U.S. Appl. No. 60/854,872, filed Oct. 27, 2006, Dastmalchi et al.
Bakalash, Reuven et al. “Medicube: A 3D Medical Imaging Architecture” Computer &Graphics vol. 13, No. 2, pp. 151-157; 1989.
Douglas, David B. et al. “Augmented Reality Imaging System: 3D Viewing of a Breast Cancer” J Nat Sci, 2016;2(9).
Douglas, David B. et al. “Augmented Reality: Advances in Diagnostic Imaging: Multimodal Technologies and Interaction” 2017;1(4):29.
Douglas, David B. et al. “D3D Augmented Reality Imaging System: Proof of Concept in Mammography” Med Devices (Auckl), 2016; 9:277-83.
Engel, K., et al. “Combining Local and Remote Visualization Techniques for Interactive Volume Rendering in Medical Applications” Proceedings Visualization 2000. VIS 2000 (Cat. No. 00CH37145), Salt Lake City, UT, USA, 2000, pp. 449-452.
Erickson, Bradley J. “A Desktop Computer-Based Workstation for Display and Analysis of 3-and 4-Dimensional Biomedical Images” Computer Methods and Programs in Biomedicine, 30; pp. 97-110; 1989.
Goodsitt, Mitchel M. et al. “Stereomammography: Evaluation of Depth Perception using a Virtual 3D Cursor” Med. Phys. 27 (6), Jun. 2000.
Haker, Steven et al. “Nondistorting Flattening Maps and the 3-D Visualization of Colon CT Images” IEEE Transactions of Medical Imaging; vol. 19, No. 7; Jul. 2000; 665-670.
Hinckley, Ken “Haptic Issues for Virtual Manipulation” A Dissertation Presented to the Faculty of the School of Engineering and Applied Science at the University of Virginia; Dec. 1996.
Hinckley, Ken, et al. “New Applications for the Touchscreen in 2D and 3D Medical Imaging Workstations” Proc. SPIE Medical Imaging '95: Image Display, SPIE vol. 2431, pp. 110-118.
Hui, Y.W. et al. “3D Cursors for Volume Rendering Applications” IEEE Conference on Nuclear Science Symposium and Medical Imaging, Orlando, FL, USA, 1992, pp. 1243-1245 vol. 2.
Hong, Lichan et al. “Reconstruction and Visualization of 3D Models of Colonic Surface” IEEE Transactions on Nuclear Science, vol. 44, No. 3, Jun. 1997.
IBM “Fast Inspection of Contents of a Volume of 3D Data” IBM Technical Disclosure Bulletin; Feb. 1, 1994; vol. 37, Issue 2A.
Interrante, Victoria et al. “Strategies for Effectively Visualizing 3D Flow with Volume LIC” IEEE Visualization Conference; 1997; pp. 1-4.
Kapur, Ajay et al. “Combination of Digital Mammography with Semi-Automated 3D Breast Ultrasound” NIH Public Access; Author Manuscript; Technol Cancer Res Treat, 3(4); 325-334; Aug. 2004.
Kaufman, A., et al. “Real-Time Volume Rendering” International Journal of Imaging Systems and Technology, special issue on 3D Imaging; 2000.
Klein, GJ et al. “A 3D Navigational Environment for Specifying Positron Emission Tomography Volumes-of-Interest” 1995 IEEE Nuclear Science Symposium and Medical Imaging Conference Record, San Francisco, CA, USA, 1995, pp. 1452-1455 vol. 3.
Kok, Arjan J.F. et al. “A Multimodal Virtual Reality Interface for 3D Interaction with VTK” Knowledge and Information Systems; 2007.
Kreeger, Kevin et al. “Interactive Volume Segmentation with the PAVLOV Architecture” Proceedings 1999 IEEE Parallel Visualization and Graphics Symposium (Cat. No. 99EX381), San Francisco, CA, USA, 1999, pp. 61-119.
Li, Yanhong et al. “Tinkerbell—A Tool for Interactive Segmentation of 3D Data” Journal of Structural Biology 120, 266-275; 1997.
Löbbert, Sebastian et al. “Visualisation of Two-Dimensional Volumes” 2004.
Loh, Yong Chong et al. “Surgical Planning System with Real-Time Volume Rendering” Proceedings International Workshop on Medical Imaging and Augmented Reality, Shatin, Hong Kong, China, 2001, pp. 259-261.
Martin, RW et al. “Stereographic Viewing of 3D Ultrasound Images: A Novelty or a Tool?” 1995 IEEE Ultrasonics Symposium; IEEE Press 1431-1434.
Peterson, Christine M. et al. “Volvulus of the Gastrointestinal Tract: Appearances at MultiModality Imaging” Radiographics; vol. 29, No. 5; Sep.-Oct. 2009; pp. 1281-1293.
Piekarski, Wayne “Interactive 3D Modelling in Outdoor Augmented Reality Worlds” Wearable Computer Lab, School of Computer and information Science; The University of South Australia; Feb. 2004.
Robb, R.A., et al. “A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images” Biodynamics Research Unit, IEEE, 1986.
Robb, R.A et al. “Interactive Display and Analysis of 3-D Medical Images” IEEE Transactions on Medical Imaging, vol. 8, No. 3, Sep. 1989.
Skoglund, T. et al. “3D Reconstruction of Biological Objects from Sequential Image Planes—Applied on Cerebral Cortex from CAT” Computerized Medical Imaging and Graphics; vol. 17, No. 3, pp. 165-174; 1993.
Steinicke, Frank et al. “Towards Applicable 3D User Interfaces for Everyday Working Environments” Conference Paper; Sep. 2007.
Subramanian, Sriram “Tangible Interfaces for Volume Navigation” CIP-Data Library Technische University Eindhoven; 2004.
Vanacken, Lode et al. “Exploring the Effects of Environment Density and Target Visibility on Object Selection in 3D Virtual Environments” IEEE Symposium on 3D User Interfaces, Mar. 10-11, 2007.
Ware, Colin et al., “Selection Using a One-Eyed Cursor in a Fish Tank VR Environment” Faculty of Computer Science, University of new Brunswick; Apr. 20, 2000.
Wither, Jason et al. “Pictorial Depth Cues for Outdoor Augmented Reality” Ninth IEEE International Symposium on Wearable Computers (ISWC'05), Osaka, 2005, pp. 92-99.
Wong, Terence Z. et al. “Stereoscopically Guided Characterization of Three-Dimensional Dynamic MR Images of the Breast” Radiology, 1996; 198:288-291.
Yushkevich, Paul A. et al. “User-Guided 3D Active Contour Segmentation of Anatomical Structures: Significantly Improved Efficiency and Reliability” NeuroImage 31; 1116-1128; 2006.
Zhai, Shumin et al. “The Partial Occlusion Effect: Utilizing Semi-Transparency in 3D Human Computer Interaction” ACM Transactions on Computer-Human Interaction, 3(3), 254-284; 1996.
Office Action for U.S. Appl. No. 11/941,578, dated Sep. 29, 2011.
Office Action for U.S. Appl. No. 11/941,578, dated Feb. 22, 2012.
Notice of Allowance for U.S. Appl. No. 11/941,578, dated Dec. 21, 2012.
Office Action for U.S. Appl. No. 12/176,569, dated Apr. 4, 2012.
Office Action for U.S. Appl. No. 12/176,569, dated Oct. 26, 2012.
Office Action for U.S. Appl. No. 12/176,569, dated Jul. 15, 2014.
Office Action for U.S. Appl. No. 12/176,569, dated Feb. 5, 2015.
Notice of Allowance for U.S. Appl. No. 12/176,569, dated May 29, 2015.
Office Action for U.S. Appl. No. 14/313,398 dated Sep. 25, 2015.
Office Action for U.S. Appl. No. 14/313,398 dated May 12, 2016.
Notice of Allowance for U.S. Appl. No. 14/313,398 dated Jul. 15, 2016.
Office Action for U.S. Appl. No. 14/877,442 dated Jul. 14, 2017.
Office Action for U.S. Appl. No. 14/877,442 dated Dec. 5, 2017.
Notice of Allowance for U.S. Appl. No. 14/877,442 dated Apr. 5, 2018.
Office Action for U.S. Appl. No. 15/878,463 dated Jun. 13, 2019.
Office Action for U.S. Appl. No. 15/878,463 dated Sep. 24, 2019.
Office Action for U.S. Appl. No. 15/878,463 dated Feb. 24, 2020.
Notice of Allowance for U.S. Appl. No. 15/878,463 dated Aug. 10, 2020.
Documents filed with U.S. District Court Proceedings for D3D Technologies, Inc. v. Microsoft Corporation; U.S. District Court, Middle District of Florida Orlando Division; Civil Action No. 6:20-cv-01699-GAP-DCI; Includes publicly available documents filed from Sep. 16, 2020-Oct. 6, 2020; Docket Nos. 1-21; (991 pages).
Moreira, Dilvan A et al. “3D Markup of Radiological Images in ePAD, a Web-Based Image Annotation Tool” 2015 IEEE 28th International Symposium on Computer-Based Medical Systems; 2015.
Documents filed with U.S. District Court. Proceedings for D3D Technologies, Inc. v. Microsoft Corporation; U.S. District Court, Middle District of Florida Orlando Division; Civil Action No. 6:20-cv-01699-GAP-DCI; Includes publicly available documents filed on Apr. 19, 2021; Docket No. 70; (76 pages).
Defendant Microsoft Corporation's Supplemental Invalidity Contentions for D3D Technologies. Inc. v. Microsoft Corporation; U.S. District Court, Middle District of Florida Orlando Division; Civil Action No. 6:20-cv-01699-GAP-DCI; Filed Apr. 19, 2021 (143 Pages).
Documents filed with Microsoft Corporation v. D3D Technologies, Inc.; United States Patent and Trademark Office—Before the Patent Trial and Appeal Board, Case No. IPR2021-00647; Filed on Apr. 15, 2021 (8 pages).
Documents filed with Microsoft Corporation v. D3D Technologies, Inc.; United States Patent and Trademark Office—Before the Patent Trial and Appeal Board, Case No. IPR2021-00648; Filed on Apr. 15, 2021 (8 pages).
Documents filed with Microsoft Corporation v. D3D Technologies, Inc.; United States Patent and Trademark Office—Before the Patent Trial and Appeal Board, Case No. IPR2021-00703; Filed on Apr. 15, 2021 (14 pages).
Inter Partes Review—Microsoft Corporation v. D3D Technologies, Inc.; United States Patent and Trademark Office—Before the Patent Trial and Appeal Board, Case No. IPR2021-00703.
Foley et al. “The Systems Programming Series: Computer Graphics: Principles and Practice Second Edition” Addison-Wesley Publishing Company; 1990.
Documents filed with U.S. District Court Proceedings for D3D Technologies, Inc. v. Microsoft Corporation; U.S. District Court, Middle District of Florida Orlando Division; Civil Action No. 6:20-cv-01699-GAP-DCI; Includes publicly available documents filed from Feb. 4, 2021-Apr. 6, 2021; Docket Nos. 47-69; (1,242 pages).
Petition for Inter Partes Review of U.S. Pat. No. 9,473,766, including Exhibits 1001-1024; Case No. IPR2021-00703, filed Apr. 7, 2021 (1,441 pages).
Provisional Applications (1)
Number Date Country
60877931 Dec 2006 US
Continuation in Parts (5)
Number Date Country
Parent 16506073 Jul 2019 US
Child 17410403 US
Parent 15878463 Jan 2018 US
Child 16506073 US
Parent 14877442 Oct 2015 US
Child 15878463 US
Parent 12176569 Jul 2008 US
Child 14877442 US
Parent 11941578 Nov 2007 US
Child 12176569 US