Claims
- 1. A three-dimensional digital scanner comprising:
a multiple view detector responsive to a broad spectrum of light, said detector being operative to develop a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative positions and orientations with respect to said three dimensional object, said plurality of images depicting a plurality of surface portions of said three dimensional object to be scanned; a digital processor including a computational unit, said digital processor being coupled to said detector; a calibration ring comprising calibration objects integrated with an object support for inclusion in said plurality of images; said digital processor being operative to use imagery of said calibration objects to determine camera geometry relative to a reference frame; said object support being an apparatus designed to provide rotational view images of said object placed on said object support, and said rotational view images being defined as images taken by relative rotation between the object and the detector; at least one source of white light for illuminating said three dimensional object for purposes of color image acquisition; at least one source of active light to be projected onto the object; said digital processor being operative to use active range finding techniques to derive 3D spatial data describing the surface of said three dimensional object; said digital processor being operative to analyze said calibration objects to perform a coordinate transformation between reference coordinates and image space coordinates, said coordinate transformation being used to correlate color image data, viewing angle data and 3D spatial data; such that a digital representation of said object that includes both shape and surface coloration may be developed from said data.
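As an illustrative aid (not part of the claim language), the coordinate transformation between reference coordinates and image space coordinates recited above can be sketched with a minimal pinhole-camera model; the rotation `R`, translation `t`, and focal length `f` below are hypothetical parameters, not values taken from the claim:

```python
import numpy as np

def project_to_image(point_ref, R, t, f):
    """Transform a 3D point from reference coordinates to image
    coordinates using a simple pinhole-camera model (illustrative)."""
    p_cam = R @ point_ref + t          # reference frame -> camera frame
    u = f * p_cam[0] / p_cam[2]        # perspective divide
    v = f * p_cam[1] / p_cam[2]
    return np.array([u, v]), p_cam[2]  # image coordinates plus depth

# Identity pose and focal length 2: the point (1, 2, 4) maps to (0.5, 1.0).
uv, depth = project_to_image(np.array([1.0, 2.0, 4.0]),
                             np.eye(3), np.zeros(3), 2.0)
```

The returned depth value is what a visibility computation (as in the later claims) would compare between surface elements that project to the same pixel.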
- 2. The apparatus of claim 1 wherein said digital processor is operative to partition said 3D spatial data into a plurality of 3D surface elements.
- 3. The apparatus of claim 1 further comprising at least one frontalness setting for selecting data for an archive of viewing angles and corresponding color image data to be processed for color mapping.
- 4. The apparatus of claim 1 wherein said digital processor is operative to perform a visibility computation to create an archive of viewing angle data, nonoccluded color image data, and 3D spatial data corresponding to said 3D surface element.
- 5. The apparatus of claim 2 wherein said digital processor is operative to identify individual 3D surface elements.
- 6. The apparatus of claim 5 wherein said digital processor is operative to develop a viewing history that stores viewing angles, 3D spatial data, and multiple color images of a surface patch corresponding to an identified 3D surface element.
- 7. The apparatus of claim 6 further comprising at least one frontalness setting for selecting data for an archive of viewing angles and corresponding color image data to be processed for color mapping.
- 8. The apparatus of claim 6 wherein said digital processor is operative to perform a visibility computation to create an archive of viewing angle data, nonoccluded color image data, and 3D spatial data corresponding to said 3D surface element.
- 9. A three-dimensional digital scanner comprising:
a multiple view detector responsive to a broad spectrum of light, said detector being operative to develop a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative positions and orientations with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; a digital processor including a computational unit, said digital processor being coupled to said detector; said digital processor being operative to measure and record, for each of said plurality of images, position information including the relative angular position of the detector with respect to the object; said digital processor being operative to use said position information to develop with said computational unit 3D coordinate positions and related image information of said plurality of surface portions of said object; a calibration ring comprising calibration objects integrated with an object support for inclusion in said plurality of images; said digital processor being operative to use imagery of said calibration objects to determine camera geometry relative to a reference frame; said object support being an apparatus designed to provide rotational view images of said object placed on the object support, and said rotational view images being defined as images taken by relative rotation between the object and the detector; said digital processor being operative to use active range finding techniques to derive surface geometry data describing the surface of said object; said digital processor being operative to partition said surface geometry data into a plurality of 3D surface elements; said digital processor being operative to identify individual 3D surface elements; said digital processor being operative to develop a viewing history that stores viewing angles and 3D spatial data related to multiple views of a surface patch corresponding to an identified 3D surface
element; said digital processor being operative to have at least one frontalness setting for selecting data for an archive of viewing angles and corresponding color image data from said viewing history to be processed for color mapping purposes related to a view dependent representation of the surface image corresponding to said 3D surface element; said digital processor being operative to perform a visibility computation to create an archive of visible cameras related to the acquired surface imagery corresponding to said 3D surface elements, said archive of visible cameras including reference to 3D spatial data describing said 3D surface elements; said digital processor being operative to cross correlate said archive of visible cameras and said archive of viewing angles and corresponding image data to produce data which is incorporated into a functional expression such that a view dependent representation of said object that includes both shape and surface coloration may be developed.
- 10. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising:
developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; providing a calibration ring comprised of patterned 3D calibration objects to be included in said plurality of images; providing incident light to illuminate said calibration ring and said object during the taking of said images; determining from said images of said calibration ring the viewing geometry and lighting geometry relative to a 3D surface element corresponding to a surface patch of a scanned object, said lighting geometry being determined by said incident light casting a shadow of said 3D calibration objects; the pattern of each 3D calibration object providing feature points at precisely known locations with respect to the 3D surface geometry of said calibration objects and including identifiable features to enhance the identification of said feature points over a range of viewing angles; deriving data by analyzing the images of said calibration objects and performing with said data a transformation between reference coordinates and image space coordinates; using said transformation, registering view dependent imagery of said surface patch with said 3D surface element; using said calibration ring to develop a viewing history, where said viewing history includes the viewing geometry, the corresponding angle of incident light, and said view dependent imagery relative to said 3D surface element.
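The shadow-based determination of lighting geometry recited in claim 10 can be illustrated with a simple sketch (not part of the claim; the vertical-post geometry is an assumed special case): a calibration post of known height casting a measurable shadow on the support plane fixes the elevation angle of the incident light.

```python
import math

def light_elevation_from_shadow(post_height, shadow_length):
    """Estimate the elevation angle (degrees) of incident light from
    the shadow cast by a vertical calibration post of known height."""
    return math.degrees(math.atan2(post_height, shadow_length))

# A 10 mm post casting a 10 mm shadow implies light at 45 degrees elevation.
angle = light_elevation_from_shadow(10.0, 10.0)
```

The azimuth of the light would follow analogously from the shadow's direction on the support plane.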
- 11. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising:
developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; providing a calibration ring comprised of patterned 3D calibration objects to be included in said plurality of images; the pattern of said calibration objects and/or of an adjacent object including identifiable features that will enhance the identification of said feature points over a range of viewing angles;
using a patterned 3D calibration object to determine the viewing geometry and angle of incident light relative to a 3D surface element corresponding to a surface patch of a scanned object; using said 3D calibration object to determine lighting geometry from the shadow it casts under said incident light; providing, from the pattern of said patterned 3D calibration object, features and points at precisely known locations with respect to the 3D surface geometry of said calibration object; using data derived by analyzing said calibration objects, performing a transformation between reference coordinates and image space coordinates; said transformation being used to register view dependent imagery of said surface patch with said 3D surface element; using said calibration ring to develop a viewing history, where said history includes the viewing geometry, the corresponding angle of incident light, and said view dependent imagery relative to said 3D surface element.
- 12. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising, by use of said detector and said digital processor:
developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; providing a calibration ring comprised of patterned 3D calibration objects to be included in said plurality of images; said digital processor being operative to use imagery of said calibration objects to determine camera geometry data including viewing direction of said detector relative to a reference frame; acquiring said object imagery under a plurality of lighting scenarios, said lighting scenarios including at least white light used for acquiring color image data and active projection lighting used to provide imagery to be analyzed by an active range finding procedure; using said digital processor to develop 3D data and related color image data of said plurality of surface portions of said object; using active range finding techniques to derive 3D spatial data located in reference coordinate space; processing said 3D spatial data into a 3D surface representation comprised of identifiable 3D surface elements of said surface portions; performing a coordinate transformation to transform said 3D spatial data and corresponding identifiable 3D surface elements from reference coordinate space to image coordinate space; performing a visibility computation that compares image space depth coordinates between transformed 3D surface elements, from a given viewing direction of said detector, to determine whether a 3D surface element is non-occluded with respect to all other 3D surface elements from said viewing direction; using the color image data corresponding to said non-occluded 3D surface element to develop a 3D shape and color representation of said object.
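The visibility computation recited in claim 12, which compares image space depth coordinates to find non-occluded 3D surface elements, is essentially a Z-buffer test. A minimal sketch follows (illustrative only; the pixel grid and the per-element depth/pixel lists are assumed inputs, not part of the claim):

```python
import numpy as np

def nonoccluded_elements(depths, pixels, shape):
    """Z-buffer style visibility test: an element is non-occluded if no
    other element projecting to the same pixel has a smaller depth."""
    zbuf = np.full(shape, np.inf)   # nearest depth seen per pixel
    winner = np.full(shape, -1)     # index of element owning each pixel
    for i, ((r, c), z) in enumerate(zip(pixels, depths)):
        if z < zbuf[r, c]:          # nearer element wins the pixel
            zbuf[r, c] = z
            winner[r, c] = i
    return {int(winner[r, c]) for (r, c) in pixels if winner[r, c] >= 0}

# Elements 0 and 1 project to pixel (0, 0); element 0 is nearer, so it
# occludes element 1, while element 2 is alone at pixel (1, 1).
visible = nonoccluded_elements([1.0, 2.0, 3.0],
                               [(0, 0), (0, 0), (1, 1)], (2, 2))
```

Only the color image data of the elements in `visible` would then be used for color mapping.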
- 13. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising, by use of said detector and said digital processor:
developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of positions and orientations, including viewing angles, relative to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; providing a calibration ring comprised of calibration objects to be included in said plurality of images, wherein the geometry of the calibration features and the geometric location of the calibration features with respect to the geometry of the calibration objects are known or determined; said digital processor being operative to use imagery of said calibration objects to determine camera geometry data including angles of said detector relative to a reference frame; acquiring said plurality of images of said object under a plurality of lighting scenarios, said lighting scenarios including at least white light used for acquiring color of said images and active projection lighting used to provide imagery to be analyzed by an active range finding procedure; said images being located in image coordinate space; determining camera geometry data corresponding to different said positions and orientations of said detector to capture the color image data; recording said color image data with corresponding camera geometry data to a point on a colormap image; whereby a three dimensional representation of said object to be scanned can be developed by said digital processor.
- 14. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising, by use of said detector and said digital processor:
providing a calibration ring integrated with an object support for inclusion in said plurality of images, said calibration ring comprising calibration objects, said calibration objects being of known geometry and having calibration features of known geometry, the geometric location of the calibration features with respect to the geometry of the calibration objects being known, and the geometric configuration of the calibration objects relative to each other being known; using non-calibration imagery to enhance the identification of said calibration features; said digital processor being operative to use imagery of said calibration objects to determine detector location and orientation relative to a reference; acquiring said plurality of images of said object under a plurality of lighting scenarios including using white light for acquiring color image data and using active projection lighting to provide imagery to be analyzed by an active range finding procedure to determine the surface geometry of object portions; said object support being an apparatus designed to provide rotational view images of said object placed on said object support, and said rotational view images being defined as images taken by relative rotation between the object and the detector; developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; whereby a three dimensional image of said object to be scanned can be developed by said digital processor that includes both shape and surface image.
- 15. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising:
providing a calibration ring comprising calibration objects integrated with an object support for inclusion in said plurality of images; said digital processor being operative to use imagery of said calibration objects to determine detector location and orientation relative to a reference frame; acquiring said plurality of images of said object under a plurality of lighting scenarios including using white light for acquiring color image data and using active projection lighting to provide imagery to be analyzed by an active range finding procedure to determine the surface geometry of said object portions; said object support being an apparatus designed to provide rotational view images of said object placed on said object support, and said rotational view images being defined as images taken by relative rotation between the object and the detector; developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; whereby a three dimensional image of said object to be scanned can be developed by said digital processor that includes both shape and surface image.
- 16. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising:
providing a calibration ring comprising 3D calibration objects integrated with an object support for inclusion in said plurality of images; said digital processor being operative to use imagery of said 3D calibration objects to determine detector location and orientation relative to a reference frame; determining the angle of incident lighting by analyzing shadows cast by said 3D calibration objects relative to a reference frame;
acquiring said plurality of images of said object under a plurality of lighting scenarios including using white light for acquiring color image data and using active projection lighting to provide imagery to be analyzed by an active range finding procedure to determine the surface geometry of said object portions; said object support being an apparatus designed to provide rotational view images of said object placed on said object support, and said rotational view images being defined as images taken by relative rotation between the object and the detector; developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; whereby a three dimensional image of said object to be scanned can be developed by said digital processor that includes both shape and surface image.
- 17. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising:
developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; said digital processor being operative to perform a coordinate transformation to transform 3D spatial data and corresponding identifiable 3D surface elements from reference coordinate space to image coordinate space; performing a visibility computation that compares image space depth coordinates between transformed 3D surface elements, from a given viewing angle of said detector, to determine whether a 3D surface element is non-occluded with respect to all other 3D surface elements from said viewing angle; using color image data corresponding to said non-occluded 3D surface elements to develop a 3D shape and color representation of said object; recording pairings of said object's 3D surface elements with the viewing angles at which they are visible; whereby a three dimensional image of said object to be scanned can be developed by said digital processor that includes both shape and surface image.
- 18. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising:
developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; said digital processor being operative to perform a coordinate transformation to transform 3D spatial data and corresponding identifiable 3D surface elements from reference coordinate space to image coordinate space; performing a visibility computation that compares image space depth coordinates between transformed 3D surface elements, from a given viewing angle of said detector, to determine whether a 3D surface element is non-occluded with respect to all other 3D surface elements from said viewing angle; whereby a three dimensional image of said object to be scanned can be developed by said digital processor that includes both shape and surface image.
- 19. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising:
developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; said digital processor being operative to perform a coordinate transformation to transform 3D spatial data and corresponding identifiable 3D surface elements from reference coordinate space to image coordinate space; performing a visibility computation that compares image space depth coordinates between transformed 3D surface elements, from a given viewing angle of said detector, to determine whether a 3D surface element is non-occluded with respect to all other 3D surface elements from said viewing angle; recording pairings of said object's non-occluded 3D surface elements with the image viewing angles at which they are visible; using said visibility computation to develop a representation of the surface image, said representation being view dependent; whereby a three dimensional image of said object to be scanned can be developed by said digital processor that includes both shape and surface image.
- 20. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising:
developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; said digital processor being operative to perform a coordinate transformation to transform 3D spatial data and corresponding identifiable 3D surface elements from reference coordinate space to image coordinate space; performing a visibility computation that compares image space depth coordinates between transformed 3D surface elements, from a given viewing angle of said detector, to determine whether a 3D surface element is non-occluded with respect to all other 3D surface elements from said viewing angle; recording pairings of said object's non-occluded 3D surface elements with the image viewing angles at which they are visible; using said visibility computation to develop a representation of the surface image, said representation being view dependent; whereby a three dimensional image of said object to be scanned can be developed by said digital processor that includes both shape and surface image.
- 21. A method for scanning a three dimensional object using a multiple view detector responsive to a broad spectrum of light and a digital processor having a computation unit, said digital processor being coupled to said multiple view detector, comprising:
computing 3D coordinate positions and related image information of said plurality of surface portions of said object from said plurality of images such that a three dimensional image of said object to be scanned can be developed that includes both shape and surface image information; said digital processor being operative to perform a coordinate transformation to transform said 3D spatial data and corresponding identifiable 3D surface elements from a reference coordinate space to an image coordinate space; performing a visibility computation that compares image space depth coordinates between transformed 3D surface elements, from a given viewing angle of said detector, to determine whether a 3D surface element is non-occluded with respect to all other 3D surface elements from said viewing angle; generating a viewing history to contain data related to non-occluded 3D surface elements and corresponding surface patches; cross-correlating viewing history data for processing purposes; dividing said object into numerous surface elements, each with an individual viewing history, to provide a structure by which said cross-correlating will produce independent measurements of coloration and geometry for each 3D surface element on a global basis; developing a plurality of images of a three dimensional object to be scanned, said plurality of images being taken from a plurality of relative angles with respect to said object, said plurality of images depicting a plurality of surface portions of said object to be scanned; whereby a three dimensional image of said object to be scanned can be developed by said digital processor that includes both shape and surface image.
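The per-element viewing history and cross-correlation recited in claim 21 can be illustrated with a toy data structure (the field layout and the averaging step are assumptions for illustration, not the claimed method):

```python
# Minimal viewing-history structure: for each 3D surface element, record
# every non-occluded view's angle and color sample, then combine the
# samples per element into one independent color estimate.
from collections import defaultdict

history = defaultdict(list)  # element id -> list of (view angle, rgb)

def record_view(element_id, view_angle_deg, rgb):
    history[element_id].append((view_angle_deg, rgb))

def mean_color(element_id):
    """Combine one element's views into a single color estimate."""
    samples = [rgb for _, rgb in history[element_id]]
    n = len(samples)
    return tuple(sum(channel) / n for channel in zip(*samples))

record_view(7, 0.0, (200, 100, 50))
record_view(7, 30.0, (210, 110, 60))
color = mean_color(7)  # (205.0, 105.0, 55.0)
```

Because each element keeps its own history, the combination step yields independent measurements per 3D surface element, as the claim describes.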
- 22. The method of claim 17 further comprising recording pairings of said object's non-occluded 3D surface elements with the image viewing angles at which they are visible.
- 23. The method of claim 17 further comprising storing said pairings as a list.
- 24. The method of claim 17 further comprising storing said pairings as an array.
- 25. The method of claim 17 further comprising storing said pairings as a matrix.
- 26. The method of claim 17 further comprising using said visibility computation to reconstruct the surface coloration of said object.
- 27. The method of claim 17 further comprising using said visibility computation to reconstruct at least one surface property of said object.
- 28. The method of claim 17 further comprising selecting visible 3D surface elements using the angle of the surface normal to the viewing angle in choosing the visibility pairs likely to have the best-quality image data for a particular 3D surface element.
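The selection criterion of claim 28, favoring views where the surface normal points toward the camera, can be sketched as a cosine threshold (illustrative only; the threshold value 0.7 and the dictionary field names are arbitrary choices, not part of the claim):

```python
import math

def frontalness(normal, view_dir):
    """Cosine of the angle between the surface normal and the direction
    toward the camera; 1.0 means the patch faces the camera head-on."""
    dot = sum(a * b for a, b in zip(normal, view_dir))
    nn = math.sqrt(sum(a * a for a in normal))
    nv = math.sqrt(sum(b * b for b in view_dir))
    return dot / (nn * nv)

def select_views(views, threshold=0.7):
    """Keep only views whose frontalness exceeds the setting."""
    return [v for v in views
            if frontalness(v["normal"], v["to_camera"]) >= threshold]

views = [
    {"normal": (0, 0, 1), "to_camera": (0, 0, 1)},  # head-on: cos = 1.0
    {"normal": (0, 0, 1), "to_camera": (1, 0, 0)},  # grazing: cos = 0.0
]
chosen = select_views(views)  # only the head-on view survives
```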
- 29. The method of claim 17 further comprising creating a view dependent representational format.
- 29. The method of claim 17 further comprising analyzing the immediate neighborhood of a Z-buffer 3D surface element to decide whether a point is close to being obscured, and so its visible color would be contaminated by the surface coloration of another surface element.
- 30. The method of claim 21 further comprising using multiple view dependent measurements of color images of each surface patch related to a corresponding 3D surface element to develop a view dependent representation of said color images corresponding to said surface patch and said 3D surface element.
- 31. The method of claim 21 further comprising using multiple view dependent measurements of surface properties for each surface patch related to a corresponding 3D surface element to develop a view dependent representation of said surface properties corresponding to said surface patch and said 3D surface element.
- 32. The method of claim 21 further comprising determining lighting geometry relative to a 3D surface element corresponding to a surface patch of a scanned object.
- 33. The method of claim 21 further comprising using a coordinate transformation to register view dependent imagery of said surface patch with said 3D surface element.
- 34. The method of claim 21 further comprising including camera geometry data in said viewing history.
- 35. The method of claim 21 further comprising using a table to store the viewing history data.
- 36. The method of claim 21 further comprising cross-correlating said data for processing purposes.
- 37. The method of claim 21 further comprising integrating color and reflectance data to derive a representation of the surface image of a surface patch corresponding to a 3D surface element.
- 38. The method of claim 21 further comprising using said viewing history to store lighting geometry data.
- 39. The method of claim 21 further comprising using said viewing history to store data describing the angle of incident lighting.
- 40. The method of claim 21 further comprising using said viewing history to store data describing viewing geometry.
- 41. The method of claim 21 further comprising using said viewing history to store data describing surface reflectance estimates.
- 42. The method of claim 21 further comprising using said viewing history to store data describing lighting intensity.
- 43. The method of claim 21 further comprising using said viewing history to store data describing the brightness of ingoing light.
- 44. The method of claim 21 further comprising using said viewing history to store data describing the brightness of outgoing light.
- 45. The method of claim 21 further comprising using said viewing history to store data describing change in hue.
- 46. The method of claim 21 further comprising using said viewing history to store data describing change in chrominance.
- 47. The method of claim 21 further comprising using said viewing history to store data describing change in saturation.
- 48. The method of claim 21 further comprising using said viewing history to store data describing change in intensity.
- 49. The method of claim 21 further comprising using said viewing history to store data describing change in brightness.
- 50. The method of claim 21 further comprising using said viewing history to store data describing the change in hue relative to lighting and viewing geometries.
- 51. The method of claim 21 further comprising using said viewing history to store data describing the change in chrominance relative to lighting and viewing geometries.
- 52. The method of claim 21 further comprising using said viewing history to store data describing the change in saturation relative to lighting and viewing geometries.
- 53. The method of claim 21 further comprising using said viewing history to store data describing the change in intensity relative to lighting and viewing geometries.
- 54. The method of claim 21 further comprising using said viewing history to store data describing the change in brightness relative to lighting and viewing geometries.
- 55. The method of claim 21 further comprising using said viewing history to store data describing the change in hue relative to lighting geometry.
- 56. The method of claim 21 further comprising using said viewing history to store data describing the change in chrominance relative to lighting geometry.
- 57. The method of claim 21 further comprising using said viewing history to store data describing the change in saturation relative to lighting geometry.
- 58. The method of claim 21 further comprising using said viewing history to store data describing the change in intensity relative to lighting geometry.
- 59. The method of claim 21 further comprising using said viewing history to store data describing the change in brightness relative to lighting geometry.
- 60. The method of claim 21 further comprising using said viewing history to store data describing the change in hue relative to viewing geometry.
- 61. The method of claim 21 further comprising using said viewing history to store data describing the change in chrominance relative to viewing geometry.
- 62. The method of claim 21 further comprising using said viewing history to store data describing the change in saturation relative to viewing geometry.
- 63. The method of claim 21 further comprising using said viewing history to store data describing the change in intensity relative to viewing geometry.
- 64. The method of claim 21 further comprising using said viewing history to store data describing the change in brightness relative to viewing geometry.
- 65. The method of claim 21 further comprising correlating the viewing geometry with the lighting geometry to produce a surface reflectance estimate.
- 66. The method of claim 21 further comprising calculating the surface reflectance estimate using a known value representing the lighting intensity.
- 67. The method of claim 21 further comprising acquiring multiple views of a surface patch corresponding to a 3D surface element under multiple lighting conditions.
- 68. The method of claim 21 further comprising correlating the angle of incident lighting with the surface normal corresponding to a 3D surface element to produce the lighting geometry.
- 69. The method of claim 21 further comprising correlating the camera geometry with the surface normal corresponding to a 3D surface element to produce the viewing geometry.
- 70. The method of claim 21 further comprising estimating the surface normal of a surface patch corresponding to a 3D surface element.
- 71. The method of claim 21 further comprising producing a view dependent three-dimensional representation of said object that includes both shape and surface coloration information.
- 72. The method of claim 21 further comprising producing a view dependent three-dimensional representation of said object that includes both shape and surface coloration and surface property information.
- 73. The method of claim 21 further comprising producing a lighting dependent three-dimensional representation of said object that includes both shape and surface coloration information.
- 74. The method of claim 21 further comprising producing a lighting dependent three-dimensional representation of said object that includes both shape and surface coloration and surface property information.
- 75. The method of claim 21 further comprising producing a lighting dependent and viewing dependent three-dimensional representation of said object that includes both shape and surface coloration information.
- 76. The method of claim 21 further comprising producing a lighting dependent and viewing dependent three-dimensional representation of said object that includes both shape and surface coloration and surface property information.
- 77. The method of claim 21 further comprising producing a mathematical functional expression representing a view dependent representation of said object.
- 78. The method of claim 21 further comprising producing a mathematical functional expression representing a lighting dependent representation of said object.
- 79. The method of claim 21 further comprising producing a mathematical functional expression representing a lighting dependent and view dependent representation of said object.
- 80. The method of claim 21 further comprising producing a frequency based mathematical functional expression representing said object.
- 81. The method of claim 21 further comprising producing a frequency based mathematical functional expression representing said view dependent representation of said object.
- 82. The method of claim 21 further comprising producing a frequency based mathematical functional expression representing said lighting dependent representation of said object.
- 83. The method of claim 21 further comprising producing a frequency based mathematical functional expression representing said view dependent and lighting dependent representation of said object.
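The geometry correlations recited in claims 65 and 68-70 can be illustrated with a minimal sketch. This is not the claimed implementation; it assumes a simple Lambertian (diffuse) reflectance model and illustrative function names. The lighting geometry is taken as the cosine of the angle between the incident light direction and the surface normal of a 3D surface element, and the reflectance (albedo) estimate follows from the observed intensity and a known lighting intensity, as in claim 66.

```python
import math

def normalize(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    """Dot product of two 3-vectors."""
    return sum(x * y for x, y in zip(a, b))

def lambertian_reflectance(normal, light_dir, light_intensity, observed_intensity):
    """Estimate diffuse reflectance (albedo) for one 3D surface element.

    Hypothetical sketch: the lighting geometry is cos(theta_i), the cosine of
    the angle between the incident light direction and the surface normal.
    Under a Lambertian model, observed = albedo * light_intensity * cos(theta_i).
    """
    n = normalize(normal)
    l = normalize(light_dir)
    cos_incidence = max(dot(n, l), 0.0)  # lighting geometry
    if cos_incidence == 0.0 or light_intensity == 0.0:
        return 0.0  # surface not illuminated; reflectance is unobservable
    return observed_intensity / (light_intensity * cos_incidence)
```

For example, a surface element facing the light head-on (normal and light direction both along +z) that returns half the incident intensity yields an albedo estimate of 0.5. Acquiring this estimate under multiple lighting conditions, as in claim 67, would allow the per-element estimates to be averaged or fit to a richer reflectance model.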
REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation-in-part of pending U.S. application Ser. No. 09/945,133, filed on Aug. 31, 2001, which is a continuation of U.S. application Ser. No. 09/236,727, filed on Jan. 25, 1999, now U.S. Pat. No. 6,288,385, issued on Sep. 11, 2001, which is a continuation of U.S. application Ser. No. 08/738,437, filed on Oct. 25, 1996, now U.S. Pat. No. 5,864,640, issued on Jan. 26, 1999, all of which are incorporated herein by reference.
[0002] Appendix A entitled SIGGRAPH 2001 Course: Acquisition and Visualization of Surface Light Fields is attached hereto and made a part hereof by reference.
Continuations (2)

| Relation | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 09236727 | Jan 1999 | US |
| Child | 09945133 | Aug 2001 | US |
| Parent | 08738437 | Oct 1996 | US |
| Child | 09236727 | Jan 1999 | US |
Continuation in Parts (1)

| Relation | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 09945133 | Aug 2001 | US |
| Child | 10216088 | Aug 2002 | US |