The present invention relates generally to capturing three-dimensional data of the shape of an object, and more particularly to the use of cameras (or imagers) and light projectors to capture images of an object and store that data for processing into a three-dimensional model of the object's shape. In a preferred embodiment the invention is useful in fitting amputees with properly fitted prosthetic devices.
It is known to sense the shape of body parts or limbs three-dimensionally by means of active laser scanners. See, for example, PCT Publication WO92/08175. It is also known to use photogrammetric systems employing passive stereo cameras and thin, tight-fitting envelopes of material carrying a high-contrast pattern worn over the body part being sensed. See, for example, U.S. Pat. No. 5,911,126.
The present invention employs light projectors preferably mounted on or in a structure which is preferably in the shape of a ring, and cameras preferably mounted on or in the ring. The structure is placed over the object to be sensed and light planes are projected onto the object. The cameras and projectors are preferably electronically controlled to operate in a predetermined sequence to capture images of the light planes projected onto the object.
The structure is preferably connected to a computer and display. A processor analyzes the captured data to create a three-dimensional model of the object. To create the 3D model, the invention preferably takes two-dimensional data on the edges of the light planes on the object, as seen by the cameras positioned at various angles to the object, and converts that data into the 3D model.
In one preferred embodiment the 3D data model may be used in the manufacture of properly fitted prosthetic devices for an amputee. A prosthetist may operate the mobile apparatus of the present invention wherever the amputee may happen to be located. Data collected from the use of the apparatus of the present invention may be stored in a computer used by the prosthetist. Later, the prosthetist may download the stored data for an amputee for use in the manufacture of a properly fitted prosthetic device for the amputee.
Properly fitting an amputee with a prosthetic device is important to the comfort and health of the amputee as well as to the usefulness of the prosthetic device. An improperly fitted prosthetic device can cause discomfort to the amputee, and can cause sores or blisters to develop on or near the amputee's stump. By mapping the various contours of an amputee's stump in a three-dimensional data model that can later be downloaded, a manufacturer of prosthetic devices can produce a better fitting prosthetic device for the amputee that recognizes and accommodates substantially the same contours of the amputee's stump.
The present invention may be used directly on an amputee's stump or in association with a covering or liner worn over the stump. There are very few limits on the use of the present invention; its use is limited essentially only by the size of the structure relative to the object being sensed. The object needs to be able to fit within the structure about which the cameras and projectors are arranged.
With reference to the figures:
In one embodiment of the present invention the object 16 is a stump of an amputee. A prosthetist may operate the ring structure to gather data to form a 3D model of the amputee's stump. The data collected may be used in the manufacture of a prosthetic device for the amputee. The stump may be covered in advance of the image gathering by a gel liner common in the prosthetics industry, or by other covering means.
The present invention is preferably comprised of cameras, projectors, and a ring structure. There are preferably four cameras built into the structure. On a target object placed at the center of the ring, each camera is preferably able to ‘see’ approximately 10″ up, 10″ down, and 7″ from side to side. The cameras should be focused for proper exposure and synchronized for operation.

There are preferably four projectors built into the structure. Each projector is preferably able to project 21 planes of light in toward the center of the ring. The planes of light preferably fan out in an evenly spaced manner. For a model approximately 4″ in diameter, the projected planes of light will preferably illuminate a 20″ range (10″ up and 10″ down) of the model. Due to the high current necessary to flash a projector, a large capacitor is preferably mounted on the projector card and fully charged before initiating a flash.

The inside (working) surface of the ring is preferably approximately 25″ in diameter. The ring preferably houses, as noted above, four cameras and four projectors evenly spaced in an alternating sequence around the ring. Each camera and projector is preferably connected, in a daisy-chain arrangement, to a USB/processor board 24 inside the structure, which ultimately controls the cameras and projectors based upon commands sent over a USB channel from the user's PC.
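The fan-plane geometry above can be sketched numerically. The dimensions (25″ ring, 4″ model, 21 planes, 20″ illuminated span) are the patent's "preferably" values; placing the projector at the ring's inner wall and the model at its center, and the resulting throw distance and angles, are illustrative assumptions.

```python
import math

# Dimensions from the specification's preferred embodiment (inches).
RING_DIAMETER_IN = 25.0      # inside (working) diameter of the ring
MODEL_DIAMETER_IN = 4.0      # example model diameter
ILLUMINATED_SPAN_IN = 20.0   # 10" up plus 10" down on the model
NUM_PLANES = 21

# Assumed throw: from a projector on the ring wall to the near surface
# of a model centered in the ring.
throw_in = RING_DIAMETER_IN / 2 - MODEL_DIAMETER_IN / 2   # 10.5"

# Fan half-angle needed to cover +/-10" at that throw distance.
half_angle = math.atan((ILLUMINATED_SPAN_IN / 2) / throw_in)

# Evenly spaced plane angles across the fan, in radians relative to
# the projector's optical axis (the middle plane is on-axis).
plane_angles = [
    -half_angle + i * (2 * half_angle) / (NUM_PLANES - 1)
    for i in range(NUM_PLANES)
]
```

Under these assumptions the fan spans roughly 87° in total, with the 21 planes spaced a little over 4° apart.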
The projector and camera arrangement and usage are preferably as follows: The four projectors may be arranged in global orientation and designated North, South, East, and West. The four cameras may be arranged between the projectors and designated Northeast, Southeast, Northwest, and Southwest. While the operator holds the structure normally, preferably using handgrips on the structure, the South projector is close to the operator's stomach, the East projector is close to the operator's right hand, the West is close to the operator's left hand, and the North is on the opposite side of the ring. The Northeast camera is between the North and East projectors, and so on.
Image capture is accomplished using the cameras. The cameras are preferably capable of taking a single full-frame (640×480) monochrome capture and sending it to the operator's PC. A capture can be issued for an individual camera or for all cameras simultaneously. The image data may be encrypted on the camera card itself and decoded on the user's PC.
For 3D capture and projector calibration purposes, the cameras are preferably capable of taking a ‘triple capture’—three consecutive frames of video coordinated with the projectors in the following manner: the North and South projectors will be on during the first frame, no projectors will be on during the second frame, and the East and West projectors will be on during the third frame. Due to potential memory limitations on the camera cards, the cameras preferably only store every third row of pixels for each frame. As with the full-frame capture, a triple-capture may be issued for an individual camera or for all cameras simultaneously and the data may be encrypted for protection during data transfer.
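The triple-capture sequence above can be summarized in a short sketch. The frame-to-projector pairing and the every-third-row subsampling come from the text; the data structures and the `subsample_rows` helper are illustrative assumptions, not the device's actual firmware interface.

```python
# The three coordinated frames of a 'triple capture'. The projector
# pairing per frame is as described in the specification.
TRIPLE_CAPTURE_SEQUENCE = [
    {"frame": 0, "projectors_on": ("north", "south")},  # N/S-lit frame
    {"frame": 1, "projectors_on": ()},                  # ambient frame
    {"frame": 2, "projectors_on": ("east", "west")},    # E/W-lit frame
]

def subsample_rows(frame_rows, stride=3):
    """Keep every third row, as the camera cards preferably do so that
    three frames fit in limited onboard memory."""
    return frame_rows[::stride]

rows = list(range(480))          # a full 640x480 frame has 480 rows
stored = subsample_rows(rows)    # 160 rows are retained per frame
```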
Once a triple capture is acquired, the software may process it in many ways.
Post-Processed—The ‘East Delta’ frame
Post-Processed—The ‘North Delta’ frame
Post-Processed—The ‘Both’ frame
Post-Processed—The ‘Intersection’ frame
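The four post-processed frames named above can be sketched as per-pixel arithmetic on the triple capture. The exact formulas are assumptions: computing a "delta" as lit-minus-ambient is the usual way to isolate projected light from background illumination, and "Both"/"Intersection" are taken here as per-pixel max and min of the two deltas.

```python
def delta(lit, ambient):
    """Subtract the ambient frame from a lit frame, clamping at zero,
    so only the projected light planes remain."""
    return [max(p - a, 0) for p, a in zip(lit, ambient)]

def both(north_delta, east_delta):
    """'Both' frame: light visible in either delta (per-pixel max)."""
    return [max(n, e) for n, e in zip(north_delta, east_delta)]

def intersection(north_delta, east_delta):
    """'Intersection' frame: light present in both deltas (per-pixel min)."""
    return [min(n, e) for n, e in zip(north_delta, east_delta)]

# Toy 4-pixel frames standing in for one camera's triple capture.
north_lit = [90, 40, 10, 10]
ambient   = [10, 10, 10, 10]
east_lit  = [10, 80, 10, 60]

north_delta = delta(north_lit, ambient)   # [80, 30, 0, 0]
east_delta  = delta(east_lit, ambient)    # [0, 70, 0, 50]
```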
Lens calibration preferably occurs once, during manufacturing, and stores the characteristics of the individual lens in the EEPROM memory onboard the camera. Field calibration may be performed by the user at any time. It corrects for any shift or rotation of the camera card, or a slight deformation of the ring. This calibration uses a picture of the blue lights on the far side of the ring, as they are at a known physical position in space. A combination of the lens and field calibrations allows the program to pair a camera pixel with an accurate three-dimensional ray exiting the camera lens. Projector calibration may also be performed by the user at any time. It determines the location of the planes of light being projected into the ring. This calibration correlates two delta frames from different cameras to determine the position of a calibration target held in front of the projector, normalizes the planes, and stores the result onboard the projector card. The projector calibration allows the program to know the precise mathematical formula of each three-dimensional plane of light projected onto a model.
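The calibrations described above combine in a simple way: the lens and field calibrations give a 3D ray per camera pixel, the projector calibration gives each light plane's equation, and intersecting a ray with a plane yields a 3D surface point. The function below is a minimal sketch of that ray-plane intersection; the camera position, pixel ray, and plane values are illustrative numbers, not calibration output.

```python
def intersect_ray_plane(origin, direction, plane_normal, plane_d):
    """Intersect the ray origin + t*direction with the plane n.x = d,
    returning the 3D point, or None if the ray parallels the plane."""
    denom = sum(n * v for n, v in zip(plane_normal, direction))
    if abs(denom) < 1e-12:
        return None
    t = (plane_d - sum(n * o for n, o in zip(plane_normal, origin))) / denom
    return tuple(o + t * v for o, v in zip(origin, direction))

# Hypothetical example: a camera at the ring wall looking inward, and
# one horizontal projected light plane at z = 0.5 (inches).
point = intersect_ray_plane(
    origin=(12.5, 0.0, 0.0),       # camera position on a 25"-diameter ring
    direction=(-1.0, 0.0, 0.1),    # ray through one camera pixel
    plane_normal=(0.0, 0.0, 1.0),  # horizontal light plane
    plane_d=0.5,
)
```

Repeating this for every pixel that lies on a detected light-plane edge, across all four cameras, produces the cloud of 3D points from which the model is built.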
From a software perspective, capturing a 3D shape using the present invention preferably begins by connecting to the structure, loading the lens, field, and projector calibrations, setting up the cameras, and taking a triple capture on all four cameras simultaneously. At that point the software has 12 pictures of the model (ambient, north-lit, and east-lit from each of the four cameras), along with many other possibilities for post-processed images.
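The capture session above can be outlined in code. The sequence of steps comes from the text; the `Ring` class, its method names, and the placeholder frame values are hypothetical, standing in for the actual USB protocol.

```python
class Ring:
    """Hypothetical handle for the ring structure over its USB channel."""

    def __init__(self):
        self.calibrations = {}
        self.captures = {}

    def load_calibration(self, kind):
        # In the real device these are read from camera and projector
        # EEPROMs; here a placeholder string marks each as loaded.
        self.calibrations[kind] = f"{kind}-calibration"

    def triple_capture_all(self, cameras):
        # Each camera yields three coordinated frames per triple capture.
        for cam in cameras:
            self.captures[cam] = {"north_lit": ..., "ambient": ..., "east_lit": ...}

ring = Ring()
for kind in ("lens", "field", "projector"):
    ring.load_calibration(kind)
ring.triple_capture_all(["NE", "SE", "NW", "SW"])

# 4 cameras x 3 frames = 12 pictures of the model.
num_pictures = sum(len(frames) for frames in ring.captures.values())
```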
Once the captures are taken, there are six steps to create a model in the preferred embodiment:
The description herein of the preferred embodiment of the invention is for exemplary purposes and is not intended in any way to limit the scope of the allowed claims. The allowed claims are to be given their broadest interpretation under the law.
This application claims the benefit of U.S. Provisional Application No. 60/611,364, filed on Sep. 18, 2004, which is expressly incorporated herein.
Number | Name | Date | Kind |
---|---|---|---|
3985444 | Takashima et al. | Oct 1976 | A |
4653104 | Tamura | Mar 1987 | A |
4745290 | Frankel et al. | May 1988 | A |
4773029 | Claesson et al. | Sep 1988 | A |
4819660 | Smith | Apr 1989 | A |
4821200 | Oberg | Apr 1989 | A |
4895434 | Stern et al. | Jan 1990 | A |
4929843 | Chmielewski et al. | May 1990 | A |
4969106 | Vogel et al. | Nov 1990 | A |
4982438 | Usami et al. | Jan 1991 | A |
5040005 | Davidson et al. | Aug 1991 | A |
5127420 | Horvath | Jul 1992 | A |
5307151 | Hof et al. | Apr 1994 | A |
5339154 | Gassler et al. | Aug 1994 | A |
5360446 | Kennedy | Nov 1994 | A |
5432703 | Clynch et al. | Jul 1995 | A |
5448472 | Mushabac | Sep 1995 | A |
5477459 | Clegg et al. | Dec 1995 | A |
5528517 | Løken | Jun 1996 | A |
5539649 | Walsh et al. | Jul 1996 | A |
5741215 | D'Urso | Apr 1998 | A |
5742068 | Dybdahl et al. | Apr 1998 | A |
5753931 | Borchers et al. | May 1998 | A |
RE35816 | Schultz | Jun 1998 | E |
5781652 | Pratt | Jul 1998 | A |
5886775 | Houser et al. | Mar 1999 | A |
5911126 | Massen | Jun 1999 | A |
5917640 | Staver | Jun 1999 | A |
5969823 | Wurz et al. | Oct 1999 | A |
6028672 | Geng | Feb 2000 | A |
6075605 | Futamura et al. | Jun 2000 | A |
6081739 | Lemchen | Jun 2000 | A |
6144386 | Pratt | Nov 2000 | A |
6177999 | Wurz et al. | Jan 2001 | B1 |
6236743 | Pratt | May 2001 | B1 |
6256099 | Kaufman et al. | Jul 2001 | B1 |
6287119 | van Nifterick et al. | Sep 2001 | B1 |
6326994 | Yoshimatsu | Dec 2001 | B1 |
6369899 | Hamada | Apr 2002 | B1 |
6383148 | Pusch et al. | May 2002 | B1 |
6421629 | Ishiyama | Jul 2002 | B1 |
6424422 | Kamon et al. | Jul 2002 | B1 |
6480678 | Matsushima | Nov 2002 | B1 |
6490541 | Ariga et al. | Dec 2002 | B1 |
6493095 | Song et al. | Dec 2002 | B1 |
6512844 | Bouguet et al. | Jan 2003 | B2 |
6542249 | Kofman et al. | Apr 2003 | B1 |
6542250 | Michaelis et al. | Apr 2003 | B1 |
6549289 | Ellis | Apr 2003 | B1 |
6564086 | Marchitto et al. | May 2003 | B2 |
6590573 | Geshwind | Jul 2003 | B1 |
6674893 | Abe et al. | Jan 2004 | B1 |
6829377 | Milioto | Dec 2004 | B2 |
7006952 | Matsumoto et al. | Feb 2006 | B1 |
20020176608 | Rose | Nov 2002 | A1 |
20030122954 | Kassatly | Jul 2003 | A1 |
20030137510 | Massen | Jul 2003 | A1 |
20030142863 | Massen | Jul 2003 | A1 |
20040032595 | Massen | Feb 2004 | A1 |
20040032649 | Kondo et al. | Feb 2004 | A1 |
20070276224 | Lang et al. | Nov 2007 | A1 |
Number | Date | Country |
---|---|---|
363 580 | Aug 1981 | AT |
2 250 679 | Apr 1973 | DE |
4232606 | Mar 1994 | DE |
44 17 872 | Nov 1995 | DE |
595 04 229 D | Dec 1998 | DE |
2 257 250 | Jan 1993 | GB |
WO8304114 | Nov 1983 | WO |
WO9010194 | Sep 1990 | WO |
WO9208175 | May 1992 | WO |
WO9531934 | Nov 1995 | WO |
Number | Date | Country | |
---|---|---|---|
20060062449 A1 | Mar 2006 | US |
Number | Date | Country | |
---|---|---|---|
60611364 | Sep 2004 | US |