Claims
- 1. A method for local 3-dimensional (3D) reconstruction from 2-dimensional (2D) ultrasound images, comprising:
deriving a 2D image of an object; defining a target region within said 2D image; deriving further 2D images having respective poses and including said target region; and reconstructing a 3D image representation for said target region from said 2D images and said respective poses.
- 2. A method for local 3D reconstruction as recited in claim 1, wherein said step of defining a target region comprises a step of searching said image along its centerline for identifying a potential target region.
- 3. A method for local 3D reconstruction as recited in claim 2, wherein said step of searching said image comprises a step of utilizing a search algorithm for searching said image along its centerline for identifying a potential target region.
- 4. A method for local 3D reconstruction as recited in claim 3, wherein said step of utilizing a search algorithm comprises a step of de-noising said image around its centerline for identifying a potential target region.
- 5. A method for local 3D reconstruction as recited in claim 4, wherein said step of de-noising comprises a step of median filtering for identifying a potential target region.
- 6. A method for local 3D reconstruction as recited in claim 4, wherein said step of de-noising comprises a step of median filtering for identifying a potential target region.
- 7. A method for local 3D reconstruction as recited in claim 4, wherein said step of searching said image comprises a step of utilizing a Hough transform for verifying a potential target region.
- 8. A method for local 3-dimensional (3D) reconstruction from 2-dimensional (2D) ultrasound images, comprising:
deriving a 2D image of an object; defining a target region within said 2D image; defining a volume scan period; during said volume scan period, deriving further 2D images of said target region and storing respective pose information for said further 2D images; and reconstructing a 3D image representation for said target region by utilizing said 2D images and said respective pose information.
- 9. A method for local 3D reconstruction as recited in claim 8, wherein said step of defining a target region comprises semi-automatic steps.
- 10. A method for local 3-dimensional (3D) reconstruction from 2-dimensional (2D) ultrasound images, comprising:
deriving a 2D image of an object; defining a target region within said 2D image, said target region being less than the whole of said 2D image; defining the start of a volume scan; deriving further 2D images of said target region during a period between said start and end of said volume scan, said further 2D images having respective poses; storing pose information for said respective poses; defining the end of said volume scan; and reconstructing a 3D image representation for said target region by utilizing said 2D images and said respective pose information.
- 11. A method for local 3-dimensional (3D) reconstruction from 2-dimensional (2D) ultrasound Doppler images, comprising:
deriving a 2D Doppler image of an object; detecting flow regions exhibiting predetermined flow characteristics; defining a target region within said 2D image in correspondence with said flow regions; defining a volume scan period; during said volume scan period, deriving further 2D images of said target region and storing respective pose information for said further 2D images; and reconstructing a 3D image representation for said target region by utilizing said 2D images and said respective pose information.
- 12. A method for local 3-dimensional (3D) reconstruction from 2-dimensional (2D) ultrasound Doppler images, comprising:
deriving a 2D Doppler image of an object; automatically detecting flow regions exhibiting predetermined flow characteristics; automatically defining a target region within said 2D image in correspondence with said flow regions; defining a volume scan period; during said volume scan period, deriving further 2D images of said target region and storing respective pose information for said further 2D images; and reconstructing a 3D image representation for said target region by utilizing said 2D images and said respective pose information.
- 13. A method for local 3D reconstruction as recited in claim 12, wherein said step of automatically defining a target region comprises a step of automatically searching said image along its centerline for identifying a potential target region.
- 14. A method for local 3D reconstruction as recited in claim 13, wherein said step of automatically searching said image comprises a step of utilizing a search algorithm for searching said image along its centerline for identifying a potential target region.
- 15. A method for local 3D reconstruction as recited in claim 14, wherein said step of utilizing a search algorithm comprises a step of de-noising said image around its centerline for identifying a potential target region.
- 16. A method for local 3D reconstruction as recited in claim 15, wherein said step of de-noising comprises a step of median filtering for identifying a potential target region.
- 17. A method for local 3D reconstruction as recited in claim 15, wherein said step of de-noising comprises a step of median filtering for identifying a potential target region.
- 18. A method for local 3D reconstruction as recited in claim 13, wherein said step of searching said image comprises a step of utilizing a Hough transform for verifying a potential target region.
- 19. Apparatus for local 3-dimensional (3D) reconstruction from 2-dimensional (2D) ultrasound images, comprising:
means for deriving a 2D image of an object; means for determining and storing respective pose information for a 2D image derived by said means for deriving a 2D image; means for defining the start and end of a volume scan, said means for defining being coupled to said means for deriving a 2D image; means for defining a target region within said 2D image, said target region being less than the whole of said 2D image, wherein said means for deriving a 2D image is coupled to said means for defining a target region, and derives further 2D images of said target region between said start and said end of said volume scan, said 2D images having respective poses; and means for reconstructing a 3D image representation for said target region by utilizing said 2D images and said respective pose information.
- 20. Apparatus for local 3-dimensional (3D) reconstruction as recited in claim 19, wherein said means for defining a target region comprises a pointer.
- 21. Apparatus for local 3-dimensional (3D) reconstruction as recited in claim 19, wherein said means for defining a target region comprises an image line.
- 22. Apparatus for local 3-dimensional (3D) reconstruction from 2-dimensional (2D) ultrasound images, comprising:
means for deriving a 2D image of an object; means for defining a target region within said 2D image; means for defining a volume scan period; means for storing respective pose information for further 2D images of said target region derived during said volume scan period by said means for deriving a 2D image; and means for reconstructing a 3D image representation for said target region by utilizing said 2D images and said respective pose information.
- 23. Apparatus for local 3-dimensional (3D) reconstruction as recited in claim 22, wherein said means for defining a target region comprises processor means for searching said image along its centerline for identifying a potential target region.
- 24. Apparatus for local 3-dimensional (3D) reconstruction as recited in claim 23, wherein said processor means for searching utilizes a search algorithm for searching said image along its centerline for identifying a potential target region.
- 25. Apparatus for local 3-dimensional (3D) reconstruction as recited in claim 23, wherein said processor means for searching utilizes a search algorithm for de-noising said image around its centerline for identifying a potential target region.
- 26. Apparatus for local 3-dimensional (3D) reconstruction as recited in claim 23, wherein said processor means for searching utilizes a search algorithm for de-noising said image around its centerline, by using a median filter, for identifying a potential target region.
- 27. Apparatus for local 3-dimensional (3D) reconstruction as recited in claim 23, wherein said processor means for searching utilizes a Hough transform for verifying a potential target region.
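By way of illustration only, the centerline search and de-noising of claims 2 through 5 might be sketched as below. The function names, the band width, and the darkness threshold are illustrative assumptions, not claim limitations; the sketch treats the target as a hypoechoic (dark) run of rows on the image centerline.

```python
import numpy as np

def median_filter_1d(profile, k=5):
    # Sliding-median de-noising of a 1-D intensity profile (claims 4-5).
    pad = k // 2
    padded = np.pad(profile, pad, mode="edge")
    return np.array([np.median(padded[i:i + k]) for i in range(len(profile))])

def search_centerline(image, band=2, dark_thresh=50.0):
    # Average a narrow band of columns around the vertical centerline,
    # de-noise the profile, and report the dark (hypoechoic) run of rows
    # as a potential target region (claims 2-3).
    cx = image.shape[1] // 2
    profile = image[:, cx - band:cx + band + 1].mean(axis=1)
    profile = median_filter_1d(profile, k=5)
    dark = np.where(profile < dark_thresh)[0]
    if dark.size == 0:
        return None
    return int(dark[0]), int(dark[-1])

# Synthetic B-mode-like image: bright tissue, dark lesion on the centerline.
img = np.full((64, 64), 200.0)
yy, xx = np.ogrid[:64, :64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2] = 10.0
span = search_centerline(img)   # (top row, bottom row) of the candidate
```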
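The Hough-transform verification of claims 7 and 18 could, under the assumption of a roughly circular target boundary, be sketched as a fixed-radius circle Hough transform; the radius, edge map, and accumulator resolution here are illustrative choices.

```python
import numpy as np

def hough_circle(edges, radius, n_theta=64):
    # Fixed-radius Hough transform: every edge pixel votes for all
    # candidate circle centres lying `radius` away from it; a strong
    # accumulator peak verifies a circular target boundary.
    acc = np.zeros(edges.shape)
    thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    for y, x in zip(*np.nonzero(edges)):
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < acc.shape[0]) & (cx >= 0) & (cx < acc.shape[1])
        np.add.at(acc, (cy[ok], cx[ok]), 1.0)
    return acc

# Edge map of a circle of radius 10 centred at (32, 32).
yy, xx = np.ogrid[:64, :64]
dist = np.hypot(yy - 32, xx - 32)
edges = np.abs(dist - 10.0) < 0.5
peak = np.unravel_index(np.argmax(hough_circle(edges, 10.0)), (64, 64))
```

A candidate region would be accepted when the accumulator peak near its centre exceeds some vote threshold.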
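For the Doppler variants of claims 11 and 12, automatic detection of flow regions exhibiting a predetermined flow characteristic might be sketched as a simple velocity-magnitude threshold; the threshold value and the bounding-box definition of the target region are assumptions made for illustration.

```python
import numpy as np

def detect_flow_region(velocity, v_min=5.0):
    # Flag pixels whose Doppler velocity magnitude meets the predetermined
    # characteristic (here |v| >= v_min) and return the bounding box of the
    # flagged pixels as an automatically defined target region.
    mask = np.abs(velocity) >= v_min
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return int(ys.min()), int(ys.max()), int(xs.min()), int(xs.max())

# Synthetic colour-Doppler velocity map: a vessel patch with 8 cm/s flow.
vel = np.zeros((64, 64))
vel[10:21, 5:16] = 8.0
box = detect_flow_region(vel)   # (row_min, row_max, col_min, col_max)
```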
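Finally, the reconstruction step common to claims 1, 8, and 10 - combining the further 2D images with their stored poses into a 3D representation - can be sketched as nearest-neighbour voxel compounding. The 4x4 pose matrices, unit voxel spacing, and averaging rule are illustrative assumptions about one possible implementation.

```python
import numpy as np

def compound_volume(slices, poses, shape):
    # Map each pixel of every tracked 2-D slice into a shared voxel grid
    # through the slice's pose (a 4x4 image-to-volume transform, unit
    # spacing assumed); voxels hit by several slices average their values.
    vol = np.zeros(shape)
    cnt = np.zeros(shape)
    for img, pose in zip(slices, poses):
        h, w = img.shape
        vv, uu = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
        pix = np.stack([uu.ravel(), vv.ravel(),
                        np.zeros(h * w), np.ones(h * w)]).astype(float)
        ijk = np.round(pose @ pix).astype(int)[:3]
        ok = np.all((ijk >= 0) & (ijk < np.array(shape)[:, None]), axis=0)
        np.add.at(vol, tuple(ijk[:, ok]), img.ravel()[ok])
        np.add.at(cnt, tuple(ijk[:, ok]), 1.0)
    return np.divide(vol, cnt, out=np.zeros_like(vol), where=cnt > 0)

# Two parallel slices, one voxel apart along z.
pose0 = np.eye(4)
pose1 = np.eye(4)
pose1[2, 3] = 1.0
vol = compound_volume([np.full((4, 4), 5.0), np.full((4, 4), 7.0)],
                      [pose0, pose1], (4, 4, 2))
```

Because only the target region (a sub-image) is scanned and compounded, the voxel grid stays small, which is the practical point of the "local" reconstruction recited in the claims.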
Parent Case Info
[0001] Reference is hereby made to the following U.S. Provisional patent applications whereof the benefit is hereby claimed and whereof the disclosures are hereby incorporated by reference:
[0002] U.S. Provisional patent application No. 60/312,872, entitled MARKING 3D LOCATIONS FROM ULTRASOUND IMAGES and filed Aug. 16, 2001 in the names of Frank Sauer, Ali Khamene, Benedicte Bascle;
[0003] U.S. Provisional patent application No. 60/312,876, entitled LOCAL 3D RECONSTRUCTION FROM ULTRASOUND IMAGES and filed Aug. 16, 2001 in the names of Frank Sauer, Ali Khamene, Benedicte Bascle;
[0004] U.S. Provisional patent application No. 60/312,871, entitled SPATIOTEMPORAL FREEZING OF ULTRASOUND IMAGES IN AUGMENTED REALITY VISUALIZATION and filed Aug. 16, 2001 in the names of Frank Sauer, Ali Khamene, Benedicte Bascle;
[0005] U.S. Provisional patent application No. 60/312,875, entitled USER INTERFACE FOR AUGMENTED AND VIRTUAL REALITY SYSTEMS and filed Aug. 16, 2001 in the names of Frank Sauer, Lars Schimmang, Ali Khamene; and
[0006] U.S. Provisional patent application No. 60/312,873, entitled VIDEO-ASSISTANCE FOR ULTRASOUND GUIDED NEEDLE BIOPSY and filed Aug. 16, 2001 in the names of Frank Sauer and Ali Khamene.
[0007] Reference is hereby made to the following copending U.S. patent applications being filed on even date herewith:
[0008] U.S. patent application entitled MARKING 3D LOCATIONS FROM ULTRASOUND IMAGES and filed in the names of Frank Sauer, Ali Khamene, Benedicte Bascle;
[0009] U.S. patent application entitled SPATIOTEMPORAL FREEZING OF ULTRASOUND IMAGES IN AUGMENTED REALITY VISUALIZATION and filed in the names of Frank Sauer, Ali Khamene, Benedicte Bascle;
[0010] U.S. patent application entitled USER INTERFACE FOR AUGMENTED AND VIRTUAL REALITY SYSTEMS and filed in the names of Frank Sauer, Lars Schimmang, Ali Khamene; and
[0011] U.S. patent application entitled VIDEO-ASSISTANCE FOR ULTRASOUND GUIDED NEEDLE BIOPSY and filed in the names of Frank Sauer and Ali Khamene.
Provisional Applications (5)

| Number | Date | Country |
| --- | --- | --- |
| 60312872 | Aug 2001 | US |
| 60312876 | Aug 2001 | US |
| 60312871 | Aug 2001 | US |
| 60312875 | Aug 2001 | US |
| 60312873 | Aug 2001 | US |