Method for calibrating a vehicular camera system

Abstract
A method of calibrating a vehicular multi-camera system includes equipping a vehicle with a plurality of cameras wherein each camera of the plurality of cameras captures image data, equipping the vehicle with an image processor, inputting image data from each of the plurality of cameras to the image processor, the image processor processing input image data in order to calibrate the vehicular multi-camera system, and wherein calibration of the vehicular multi-camera system is achieved independently of a model of the real world.
Description
FIELD OF THE INVENTION

The present invention relates to a method for automatically calibrating a virtual camera and to a virtual camera apparatus which is set up to carry out the method according to the invention. In particular, the present invention relates to a virtual camera on a motor vehicle for producing a view of the surroundings of the motor vehicle from a bird's eye perspective.


BACKGROUND OF THE INVENTION

What is known as a virtual camera refers to a set comprising a real recording camera and a typically electronic image data processing device, which together produce an output signal carrying a coded image or a coded image sequence, wherein the perspective of the coded images does not match the perspective of the recording camera. On account of the information lost when a real recording camera maps a real three-dimensional object into a two-dimensional image data model, the virtual camera is able to reproduce non-moving objects correctly particularly when they are approximately flat.


Virtual cameras have been proposed as driver assistance devices in motor vehicles, particularly in what are known as top view systems or omnidirectional cameras. These typically comprise a plurality of real recording cameras which are arranged in or on a vehicle and which are used to produce a chronological sequence of image data records. The image data in the image data records are subjected to different transformations in a typically electronic image data processing device and are mixed to form a chronological sequence of overall image data. This makes it possible to obtain, by way of example, a view of the surroundings of the vehicle from a perspective above the vehicle roof. This chronological sequence of overall image data can be continuously displayed to the driver of the motor vehicle on a display apparatus in order to simplify shunting or parking manoeuvres.


It is evident that, in a virtual camera with a plurality of real recording cameras, the quality of the overall image delivered depends distinctly on exact knowledge of the positions and directions of the real recording cameras. The more accurately these data are known, the easier it is to determine the transformations that yield the best possible image quality where adjacent recording areas meet. Against this background, there have been a series of proposals involving either the automatic determination of the positions and directions of the recording cameras or the correction of errors in these variables relative to initially stored values.


An omnidirectional camera with automatic calibration is disclosed in the published patent application DE 10 2007 043 905 A1. Said document proposes identifying an object element in the image data for the purpose of the calibration; the identification of mapped areas of the outer vehicle skin is described as particularly advantageous. The proposed approach is intended to allow compensation for, inter alia, changes in the position of the real recording cameras on the vehicle on account of vibrations, ageing and thermal or mechanical stresses.


However, the known method relies on the processing of information from a model of the real world, particularly information regarding the shape of the outer vehicle skin.


SUMMARY OF THE INVENTION

Against this background, the present invention is based on the object of providing a method for the automatic calibration of a virtual camera which operates independently of a model of the real world.


This object is achieved by the present invention by means of a method having the features specified in claim 1.


Advantageous refinements and developments of the method according to the invention are specified in the subclaims.





BRIEF DESCRIPTION OF THE DRAWINGS

A preferred manner of carrying out the method according to the invention and a virtual camera apparatus which is set up to do so are described below, reference being made to the appended drawings, in which:



FIG. 1 shows a schematic illustration of a typical arrangement of real recording cameras on a motor vehicle and of the position of the virtual perspective; and



FIG. 2 shows a schematic illustration of the functional units and data streams in a preferred apparatus for carrying out a method according to the invention; and



FIG. 3 shows a schematic synopsis of an algorithm for implementing a method according to the invention.





DETAILED DESCRIPTION OF THE INVENTION

As FIG. 1 shows, an expedient apparatus for carrying out a preferred method according to the invention for a driver assistance system on a motor vehicle first of all comprises a plurality of real recording camera devices 1, 2, 3. A fourth recording camera device 4 is located, in an area not visible in the figure, on the opposite side of the vehicle from the recording camera device 2. In one expedient implementation, the real recording camera devices 1, 2, 3, 4 take the form, in a manner known per se, of a CCD or CMOS array with a wide-angle or fisheye lens and are arranged at different points on the motor vehicle, behind the glazing and in suitable pockets on the exterior mirrors. In the fitting positions which are possible in practice, the real perspectives P1 . . . 4 of the real recording camera devices 1, 2, 3, 4 are naturally offset and/or tilted with respect to the virtual perspective Pv, that is to say the viewpoint of the virtual viewer. The greater the difference between the real perspective P1 . . . 4 of a real recording camera device 1, 2, 3, 4 and the virtual perspective Pv, the worse the quality of the image data with the transformed perspective under realistic conditions; this will be discussed later. Conversely, some fields of vision would not be visible to a viewer with a perspective corresponding to the virtual perspective Pv, because they are situated behind portions of the outer vehicle skin. In this respect, the selection of the camera positions is a trade-off between these advantages and disadvantages.


As further components, the apparatus shown in FIG. 2 comprises an image data processing device 5 and a display device 6. In this case, the display device 6 is typically arranged in the area of the instrument panel or the centre console. In a typical implementation, the image data processing device 5 is a digital signal processor or a sufficiently powerful general-purpose microprocessor with adequate main memory and non-volatile program memory.


In line with the schematic illustration of a preferred algorithmic implementation of the method according to the invention which is shown in FIG. 3, the real recording camera devices 1, 2, 3, 4 deliver a chronological sequence of N×4 raw subimage data items DR1 . . . N, 1 . . . 4 to the image data processing device 5 in the operating period under consideration. Said image data processing device combines the data streams to form a chronological sequence of subimage data records DR1 . . . N. Each subimage data record DRi in the sequence contains the subimage data DRi, 1 . . . 4 from the real recording camera devices 1, 2, 3, 4. In the image data processing device 5, a respective transformation T1 . . . 4 is applied to the subimage data DRi, 1 . . . 4 contained in a subimage data record DRi as soon as it is present. These transformations T1 . . . 4 are respectively determined such that the relevant image 7, 8, 9, 10 of a predetermined planar mapping area Aj of the real recording camera device j with the perspective Pj is transformed into the image from the virtual camera with the perspective Pv. For an ideal recording camera device, this transformation would be linear and could be composed of a planar perspective stretching of the image together with a rotation and a displacement. In comparison with such an ideal recording camera device, however, the real recording camera devices 1, 2, 3, 4 deliver maps with nonlinear distortions. The primary cause of these nonlinear distortions is inadequacies in the real mapping lenses; this applies particularly distinctly to the proposed lenses with a strong wide-angle or fisheye characteristic. From this point of view, positions for the real recording camera devices 1, 2, 3, 4 which are further away from the road are naturally preferable. However, the nonlinear distortions can usually be reduced to a level satisfactory for observation of the environment by applying what are known as inverse nonlinear transformations, which are known per se. Consequently, a person skilled in the art will, according to the respective situation, select for the described transformation T1 . . . 4 a chain comprising a nonlinear transformation for the purpose of equalization and an ideal perspective mapping. The application of the transformations T1 . . . 4 to the subimage data DRi, 1 . . . 4 results in transformed subimage data DTi, 1 . . . 4. In the present case, it is assumed that the transformations T1 . . . 4 transform the subimage data DRi, 1 . . . 4 directly into the coordinate system of the overall image 11. Accordingly, the overall image data DGi can be produced by simply combining the data from the transformed subimage data DTi, 1 . . . 4.
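By way of illustration only, the chain just described (an inverse nonlinear equalization followed by a planar perspective mapping into the coordinate system of the overall image 11) can be sketched as follows. The sketch assumes OpenCV's standard undistortion and homography warping; the camera matrices, distortion coefficients, homographies and overall image size are hypothetical placeholders rather than values taken from this description.

```python
import cv2
import numpy as np

OVERALL_SIZE = (800, 800)  # (width, height) of the overall image, hypothetical

def transform_subimage(raw, camera_matrix, dist_coeffs, homography):
    """One transformation T_j: inverse nonlinear equalization of the lens
    distortion, then a planar perspective mapping into the virtual view Pv."""
    equalized = cv2.undistort(raw, camera_matrix, dist_coeffs)
    return cv2.warpPerspective(equalized, homography, OVERALL_SIZE)

def compose_overall(raw_subimages, camera_matrices, dist_coeffs, homographies):
    """Combine the transformed subimages DT_i,1..4 into overall image data DG_i."""
    overall = np.zeros((OVERALL_SIZE[1], OVERALL_SIZE[0], 3), dtype=np.uint8)
    for raw, K, d, H in zip(raw_subimages, camera_matrices, dist_coeffs,
                            homographies):
        warped = transform_subimage(raw, K, d, H)
        mask = warped.any(axis=2)       # pixels actually covered by this camera
        overall[mask] = warped[mask]    # later subimages overwrite earlier ones
    return overall
```

In this sketch, later subimages simply overwrite earlier ones in the areas of overlap; the expedient stipulation of subimage boundaries discussed below would take the place of this naive rule.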


If, in the ideal case described, the positions and recording directions of the recording camera devices 1, 2, 3, 4 never changed, the information about the orientation and position of the subimages in the overall image would need to be set only once for the relevant vehicle geometry, as parameters for the transformations T1 . . . 4, prior to startup. In the course of such initial calibration, the parameters can be determined either by calculation or by calibrating the apparatus on a test bench. Even with optimum initial calibration, ambiguities can arise in the area of the overlaps 12 in the subimages 7, 8, 9, 10 on account of the described nonlinear distortions in the real recording camera devices 1, 2, 3, 4. Through suitable selection of the transformations and/or expedient stipulation of the subimage boundaries, however, it is possible to attain entirely satisfactory results in practice. In this regard, an appropriately adjusted image data processing device 5 produces as an output signal a sequence of overall image data DG1 . . . N which are displayed in their chronological order on a display device 6 and give the driver an impression of the immediate surroundings of the motor vehicle.


Effects of ageing, overloads, accidents and the like may change the position and/or orientation of the recording camera devices 1, 2, 3, 4. If, after such a change, the subimages continue to be assembled into an overall image in the originally stipulated manner, the quality of the overall image deteriorates. To counteract this drawback, the image data processing device 5 recurrently performs a calibration that optimizes a prescribed quality criterion Q for the overall image. In this case, the quality criterion Q is a scalar value which is dependent on the data from the subimages and on the stored information relating to the positions and directions of the real recording camera devices 1, 2, 3, 4. Expediently, the quality criterion Q is stipulated such that it reflects the quality of the overall image as subjectively perceived by an average viewer. In one advantageous refinement, the quality criterion Q also takes into account the correlations of the subimages 7, 8, 9, 10 in the areas of overlap 12. The quality criterion Q is optimized for a firmly prescribed subimage data record DRi by varying the parameters of the transformations. The parameters varied to the optimum then replace the originally stored values for the further operation of the apparatus.
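A minimal sketch of this optimization is given below, under the assumption that Q is modelled as the negative mean squared difference between the transformed subimages in the areas of overlap 12 and that a generic derivative-free optimizer varies the transformation parameters. Both choices are illustrative, since the description only requires Q to be a scalar dependent on the subimage data and the stored camera positions and directions; the helper warp_fn and the parameter layout are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def quality_Q(params, subimages, overlaps, warp_fn):
    """Scalar quality criterion: agreement of the transformed subimages in
    the areas of overlap 12 (higher is better)."""
    per_camera = np.split(np.asarray(params, dtype=float), 4)
    warped = [warp_fn(img, p) for img, p in zip(subimages, per_camera)]
    q = 0.0
    for (a, b), mask in overlaps:   # pairs of subimage indices plus overlap mask
        diff = warped[a][mask].astype(float) - warped[b][mask].astype(float)
        q -= float((diff ** 2).mean())
    return q

def calibrate(stored_params, subimages, overlaps, warp_fn):
    """Vary the transformation parameters so that Q is maximized; the optimum
    replaces the originally stored values for further operation."""
    result = minimize(lambda p: -quality_Q(p, subimages, overlaps, warp_fn),
                      np.asarray(stored_params, dtype=float),
                      method="Nelder-Mead")
    return result.x
```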


This calibration is performed recurrently over time whenever a selection criterion C flags a subimage data record DRk for this purpose. In the present case, the selection criterion C is defined such that calibration on the flagged subimage data record DRk can be expected to provide the best possible result. Intuitively, a good result can be assumed if applying the quality criterion Q to the overall images 11 which follow the calibration yields the best possible result overall. Since the quality criterion Q in the present case relates only to an individual image data record, the quality of a sequence naturally requires an appropriate definition; to this end, it is possible to use generally known statistical functions, such as the mean value. Expediently, the selection criterion C itself processes the subimage data DRk, 1 . . . 4 in order to assess the suitability of the flagged subimage data record DRk for the calibration. This reflects the insight that, in practice, not all subimage data records DRi are equally well suited to the calibration. By way of example, calibration should obviously be omitted in the event of a lack of contrast, underexposure, defocusing or motion blur. Equally disadvantageous are image data records with periodic structures, which can be recognized by means of a frequency analysis of the subimage data DRk, 1 . . . 4, for example. In addition, the subimage data DRk, 1 . . . 4 can be examined to determine whether they contain maps of three-dimensional objects above the road level; typical objects of this kind are high kerbstones, crash barriers and marker posts. A subimage data record DRi with maps of such objects in the area of the overlaps 12 should not be used for the calibration.
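The following sketch illustrates the kind of suitability checks the selection criterion C might apply to a single subimage: contrast, exposure, a simple Laplacian-based sharpness measure against defocusing and motion blur, and a frequency analysis against periodic structures. All thresholds are hypothetical tuning values, not taken from this description.

```python
import numpy as np

def suitable_for_calibration(gray):
    """gray: 2-D uint8 array holding one subimage of the record DRk.
    Returns False if the record should not be flagged for calibration."""
    g = gray.astype(float)
    if g.std() < 10.0:                      # lack of contrast
        return False
    if g.mean() < 30.0:                     # underexposure
        return False
    # Defocusing / motion blur: low variance of a discrete Laplacian response.
    lap = np.diff(g, 2, axis=0)[:, 1:-1] + np.diff(g, 2, axis=1)[1:-1, :]
    if lap.var() < 50.0:
        return False
    # Periodic structures: a single dominant component in the 2-D spectrum.
    spectrum = np.abs(np.fft.fft2(g - g.mean()))
    if spectrum.max() > 200.0 * spectrum.mean():
        return False
    return True
```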


In addition, the selection criterion C flags a subimage data record DRk for calibration only if the motor vehicle was in a particular state at the time at which said subimage data record was recorded and the driving situation was within prescribed limits at that moment. To this end, the image data processing device 5 also derives, collects and assesses vehicle state variables and driving state variables from detection devices on the motor vehicle. In this context, preferred vehicle state variables are the operating period and mileage of the vehicle, the number of vehicle starts, and the operating period and mileage since the vehicle was last started. By including these variables in the selection criterion C, it is particularly possible to take account of thermal changes, mechanical settling, ageing effects and wear. Preferred driving state variables are the speed of travel, the acceleration, the steering angle, the angle of inclination and the loading of the vehicle. If available, it is also possible to include data about the tyre pressure and the setting of the suspension. When these data are included, it is possible to take account of dynamic differences in the vehicle's position relative to the road surface when deciding about calibration.
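A minimal sketch of gating the calibration on driving state variables is given below; the description names the variables but prescribes no values, so the limits and the selected subset of variables are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DrivingState:
    speed_kmh: float            # speed of travel
    acceleration: float         # m/s^2
    steering_angle_deg: float
    inclination_deg: float

def state_permits_calibration(state: DrivingState) -> bool:
    """Flag a record only while the driving situation is within limits."""
    return (state.speed_kmh <= 30.0
            and abs(state.acceleration) <= 1.0
            and abs(state.steering_angle_deg) <= 5.0
            and abs(state.inclination_deg) <= 3.0)
```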


Further preferred variables for inclusion in the selection criterion C could be the GPS position of the vehicle, the exterior light conditions and signals from proximity sensors for the near field of the vehicle. By including such variables, it is possible to base the decision about calibration on considerations concerning whether and to what extent the current vehicle surroundings favour or impede calibration.


On the basis of a subimage data record DRk flagged by the selection criterion C, the calibration can preferably be performed by calculating a correlation between the transformed subimage data DTk, 1 . . . 4 in a manner which is known per se. In this case, the areas of overlap 12 are identified, and the position and orientation of the image sections coded in the subimage data DTk, 1 . . . 4 relative to one another are determined. The quality criterion Q is then optimized for the flagged subimage data record DRk by varying the parameters of the transformations T1 . . . 4.
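One standard way of determining the relative displacement of two subimages in an area of overlap 12 by correlation is phase correlation via the Fourier transform; it is sketched below as an illustrative assumption, since the description only calls for a correlation calculated in a manner known per se.

```python
import numpy as np

def overlap_displacement(patch_a, patch_b):
    """Return the (dy, dx) shift that best aligns patch_b with patch_a,
    both taken from the same area of overlap."""
    A = np.fft.fft2(patch_a.astype(float))
    B = np.fft.fft2(patch_b.astype(float))
    cross = A * np.conj(B)
    cross /= np.abs(cross) + 1e-12          # normalized cross-power spectrum
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped indices to signed displacements.
    h, w = corr.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx
```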


In one refinement of the method described above, it is also possible to include the history of the calibrations performed in the past in the selection criterion. For example, this history could be used to determine the time for the next calibration. It is also possible to determine the parameters of the transformations not exclusively on the basis of the result of the last calibration, but rather to perform an averaging over the history. Yet another option is to anticipate the adaptation of the parameters, without a fresh calibration, by extrapolating from the historical data already collected from past calibrations.
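The refinement just described might be sketched as follows, with exponential smoothing standing in for the historical averaging and a per-parameter linear fit standing in for the extrapolation; both concrete choices are illustrative assumptions.

```python
import numpy as np

class CalibrationHistory:
    def __init__(self, alpha=0.3):
        self.alpha = alpha      # weight of the newest calibration, hypothetical
        self.times = []         # e.g. vehicle operating hours at calibration
        self.params = []        # parameter vectors from past calibrations
        self.smoothed = None

    def add(self, t, p):
        """Record a calibration result and update the historical average."""
        p = np.asarray(p, dtype=float)
        self.times.append(t)
        self.params.append(p)
        self.smoothed = (p if self.smoothed is None
                         else self.alpha * p + (1 - self.alpha) * self.smoothed)

    def extrapolate(self, t):
        """Anticipate the parameters at time t without a fresh calibration by
        fitting a straight line through the recorded history."""
        if len(self.params) < 2:
            return self.smoothed
        ts = np.asarray(self.times, dtype=float)
        P = np.vstack(self.params)
        coeffs = np.polyfit(ts, P, 1)       # per-parameter linear trend
        return coeffs[0] * t + coeffs[1]
```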

Claims
  • 1. A method of calibrating a vehicular multi-camera system, said method comprising: equipping a vehicle with a plurality of cameras wherein each camera of said plurality of cameras captures image data; wherein said plurality of cameras comprises a first camera mounted at a forward portion of the vehicle and having a field of view that includes a ground surface forward of the vehicle, a second camera mounted at a rearward portion of the vehicle and having a field of view that includes a ground surface rearward of the vehicle, a third camera mounted at a sideward portion of the vehicle and having a field of view that includes a ground surface sideward of the vehicle at one side of the vehicle, and a fourth camera mounted at another sideward portion of the vehicle and having a field of view that includes a ground surface sideward of the vehicle at the other side of the vehicle; equipping said vehicle with an image processor; inputting image data from each of said plurality of cameras to said image processor; said image processor processing input image data in order to calibrate said vehicular multi-camera system; wherein calibration of said vehicular multi-camera system is achieved independently of a model of the real world; producing a chronological sequence of image data by recurrently applying a selection criterion to select from a sequence of subimage data records recorded by said first, second, third and fourth cameras; determining parameters of transformations of said subimage data records via an optimization method that uses a quality criterion for an overall image; and assembling a sequence of transformed subimage data records to form a sequence of overall image data.
  • 2. The method of claim 1, wherein equipping a vehicle with a plurality of cameras comprises equipping a vehicle with a plurality of CMOS cameras.
  • 3. The method of claim 1, wherein calibration of said vehicular multi-camera system is achieved independently of the shape of the outer vehicle skin.
  • 4. The method of claim 1, comprising an initial calibration before producing a chronological sequence of image data.
  • 5. The method of claim 1, wherein said quality criterion comprises a scalar value.
  • 6. The method of claim 5, wherein said scalar value is dependent on said subimage data.
  • 7. The method of claim 6, wherein said scalar value is dependent on stored information relating to at least one of (i) a position of at least one of said first, second, third and fourth cameras and (ii) a direction of said field of view of at least one of said first, second, third and fourth cameras.
  • 8. The method of claim 6, wherein said scalar value is dependent on stored information relating to (i) a position of at least one of said first, second, third and fourth cameras and (ii) a direction of said field of view of at least one of said first, second, third and fourth cameras.
  • 9. The method of claim 6, wherein said scalar value is dependent on stored information relating to at least one of (i) respective positions of said first, second, third and fourth cameras and (ii) respective directions of said fields of view of said first, second, third and fourth cameras.
  • 10. The method of claim 1, wherein said quality criterion reflects a quality of said overall image data.
  • 11. The method of claim 1, wherein said calibration is recurrently performed over time in response to said selection criterion selecting a particular subimage data record.
  • 12. The method of claim 11, wherein said selection criterion selects a particular subimage data record when a particular state of the vehicle at the time at which said particular subimage data record was recorded is within a prescribed limit.
  • 13. A method of calibrating a vehicular multi-camera system, said method comprising: equipping a vehicle with a plurality of CMOS cameras wherein each camera of said plurality of CMOS cameras captures image data; wherein said plurality of CMOS cameras comprises a first camera mounted at a forward portion of the vehicle and having a field of view that includes a ground surface forward of the vehicle, a second camera mounted at a rearward portion of the vehicle and having a field of view that includes a ground surface rearward of the vehicle, a third camera mounted at a sideward portion of the vehicle and having a field of view that includes a ground surface sideward of the vehicle at one side of the vehicle, and a fourth camera mounted at another sideward portion of the vehicle and having a field of view that includes a ground surface sideward of the vehicle at the other side of the vehicle; equipping said vehicle with an image processor; inputting image data from each of said plurality of cameras to said image processor; said image processor processing input image data in order to calibrate said vehicular multi-camera system; wherein calibration of said vehicular multi-camera system is achieved independently of a model of the real world, and wherein calibration of said vehicular multi-camera system is achieved independently of the shape of the outer vehicle skin; producing a chronological sequence of image data by recurrently applying a selection criterion to select from a sequence of subimage data records recorded by said first, second, third and fourth cameras; determining parameters of transformations of said subimage data records via an optimization method that uses a quality criterion for an overall image; and assembling a sequence of transformed subimage data records to form a sequence of overall image data.
  • 14. The method of claim 13, comprising an initial calibration before producing a chronological sequence of image data.
  • 15. The method of claim 13, wherein said quality criterion comprises a scalar value, and wherein said scalar value is dependent on stored information relating to at least one of (i) a position of at least one of said first, second, third and fourth cameras and (ii) a direction of said field of view of at least one of said first, second, third and fourth cameras.
  • 16. The method of claim 13, wherein said calibration is recurrently performed over time in response to said selection criterion selecting a particular subimage data record, and wherein said selection criterion selects a particular subimage data record when a particular state of the vehicle at the time at which said particular subimage data record was recorded is within a prescribed limit.
  • 17. A method of calibrating a vehicular multi-camera system, said method comprising: equipping a vehicle with a plurality of CMOS cameras wherein each camera of said plurality of CMOS cameras captures image data; wherein said plurality of CMOS cameras comprises a first camera mounted at a forward portion of the vehicle and having a field of view that includes a ground surface forward of the vehicle, a second camera mounted at a rearward portion of the vehicle and having a field of view that includes a ground surface rearward of the vehicle, a third camera mounted at a sideward portion of the vehicle and having a field of view that includes a ground surface sideward of the vehicle at one side of the vehicle, and a fourth camera mounted at another sideward portion of the vehicle and having a field of view that includes a ground surface sideward of the vehicle at the other side of the vehicle; equipping said vehicle with an image processor; inputting image data from each of said plurality of cameras to said image processor; said image processor processing input image data in order to calibrate said vehicular multi-camera system; wherein calibration of said vehicular multi-camera system is achieved independently of a model of the real world; producing a chronological sequence of image data by recurrently applying a selection criterion to select from a sequence of subimage data records recorded by said first, second, third and fourth cameras; determining parameters of transformations of said subimage data records via an optimization method that uses a quality criterion for an overall image; assembling a sequence of transformed subimage data records to form a sequence of overall image data; and initially calibrating said vehicular multi-camera system before producing a chronological sequence of image data.
  • 18. The method of claim 17, comprising an initial calibration before producing a chronological sequence of image data.
  • 19. The method of claim 17, wherein said quality criterion comprises a scalar value, and wherein said scalar value is dependent on stored information relating to at least one of (i) a position of at least one of said first, second, third and fourth cameras and (ii) a direction of said field of view of at least one of said first, second, third and fourth cameras.
  • 20. The method of claim 17, wherein said calibration is recurrently performed over time in response to said selection criterion selecting a particular subimage data record, and wherein said selection criterion selects a particular subimage data record when a particular state of the vehicle at the time at which said particular subimage data record was recorded is within a prescribed limit.
Priority Claims (1)
Number Date Country Kind
10 2008 053 047 Oct 2008 DE national
Parent Case Info

This application is a continuation of U.S. patent application Ser. No. 12/604,432, filed Oct. 23, 2009, now U.S. Pat. No. 8,169,480, which claims the benefit of German Application No. 102008053047.6, filed Oct. 24, 2008.

Related Publications (1)
Number Date Country
20120224064 A1 Sep 2012 US
Continuations (1)
Number Date Country
Parent 12604432 Oct 2009 US
Child 13460896 US