Image acquisition system

Information

  • Patent Grant
  • Patent Number
    8,090,246
  • Date Filed
    Friday, August 8, 2008
  • Date Issued
    Tuesday, January 3, 2012
Abstract
A system having a sensor and a variable focus lens for iris image standoff acquisition. The sensor may capture a sequence of images of a person at a high frame rate in order to get an eye or an iris in a window within the images. Even if the eye moves around in the image, the window may stay on the eye. During this capture, the focus of the lens may be changed, with the best focus situated somewhere between the end focus positions of the lens. The sensor may be an infrared (IR) sensor, and an IR illuminator or flash may provide light for the capture of images. An intensity variance indicator may be incorporated to select an in-focus image of the sequence. Processing of the images may occur subsequent to the capture of the images, thus not hindering the frame rate of the system.
Description
BACKGROUND

The invention pertains to biometrics and particularly to acquisition of biometric images.


SUMMARY

The invention is an image standoff acquisition system for capturing images of an eye or eyes of a non-cooperating subject. The invention may overcome a need for exact focusing by capturing a rapid sequence of frames while sweeping through the focus range of the optics of an acquisition system or camera. The focus sweep may be effected by moving the lens, the subject, the camera, or the image sensor in the camera, or by a combination of two or more of these. After the sequence is captured, a frame of the sequence having an iris most in focus may be selected.





BRIEF DESCRIPTION OF THE DRAWING


FIG. 1 is a block diagram of an image acquisition system;



FIG. 2 is a more detailed diagram of the image acquisition system;



FIG. 3 is a diagram of an example image on a camera sensor;



FIG. 4 is a diagram of a window on an eye of an image of a subject;



FIG. 5 is a diagram showing focus distances between an acquisition camera and a subject;



FIG. 5a is a graph showing the nominal focus distance versus time;



FIG. 6 is a graph showing focusing distance versus time, and trigger signal voltage for image acquisition;



FIG. 7 is a diagram illustrating lateral and radial movement of a subject relative to an acquisition camera;



FIG. 8 is a graph of capture/shutter trigger signal versus time for exposures and frames of images;



FIG. 9 is a graph of intensity variance versus focus distance for captured images;



FIG. 10 is a diagram simulating some of the captured images of an eye of a subject at various focus distances shown in the graph of FIG. 9; and



FIG. 11 is a diagram of an iris image cropped from a focused image in FIG. 10.





DESCRIPTION

Iris patterns may contain many very small details that need to be correctly recorded for analysis and identification. Capturing those details may require an optical system with a very large spatial frequency bandwidth. Such a system may have very sensitive focusing in that its depth of field can be as small as a fraction of a millimeter. Given the depth profile of a human face, a high quality system cannot necessarily be focused on the face in its entirety. The system may need to find within a frame the eye to be imaged and then focus on it. The focusing should be very precise. It is this requirement that makes many current systems user unfriendly, because in order to lower their engineering complexity, and thus cost, the systems shift the burden onto the subject in the hope that the subject's cooperation will eventually result in a well focused iris image. To obtain a well focused iris image, the subject may be commanded to move back and forth, left to right, and so on, until the subject eventually positions an eye in the system's sweet spot. Experience shows that this may require much patience and willingness to cooperate on the part of the subject. Alternatively, handheld devices like those used in the military need to be moved by the user to get the eye into a crosshair and achieve focus. If the user is not well trained or works under stress, capturing a good image may again become a time consuming challenge. The amount of iris detail that needs to be captured may depend on the intended security level, in that the higher the security level, the more precise the solution required to capture adequate images.


In an iris image acquisition system, the optics alone are not necessarily the costliest part. Cost may be primarily and progressively driven by the degree of the subject's lack of cooperation, and of the user's lack of skill, that the system can tolerate while still working reliably. The present system may address these issues of cooperation and skill in a way that requires neither optical autofocusing nor precise range finding.


Until recently, cameras offered relatively small image sizes and low frame rates. The present system may build upon recent advances such as large image sizes and high frame rates. Other technologies may be incorporated as well.


The system may have a camera that takes a fast sequence of frames while the focus lens position is incremented or varied so that each frame is taken with a somewhat different focus adjustment. The focus lens position may execute a full sweep of its adjustability range, much as if one turned the focus ring on a classical camera lens objective from end to end, i.e., from a focus at infinity to a focus at the nearest working distance of the objective, while shooting pictures in rapid succession during the turn of the focus ring.


There may be a stop-and-go approach, in which, at each iteration, the system first resets the lens focus and then takes a shot. There may also be a continuous approach, in which, for instance, four or so shots are taken while the focus lens is moving, without stopping during image acquisition. For this “continuous focus lens sweep” to work well, the image exposure time (TE) should be shorter than the time (TF) it takes the lens to move out of its depth of field.
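
As a rough illustration of this timing constraint, the following sketch (not part of the patent) checks that a chosen exposure time fits within the time the lens spends inside one depth of field during a continuous sweep; the numeric values are the illustrative ones used later in this description.

    # Minimal sketch (illustrative, not from the patent): check the
    # continuous-sweep timing constraint TE < TF, where TF is the time the
    # lens dwells inside one depth of field while sweeping.
    def sweep_timing_ok(depth_of_field_m, focus_velocity_m_per_s,
                        exposure_time_s):
        """Return True if the exposure fits inside one depth-of-field dwell."""
        t_f = depth_of_field_m / focus_velocity_m_per_s
        return exposure_time_s < t_f

    # Example with values used later in this description: 10 mm depth of
    # field, 1 m/s focus velocity, 2 ms exposure.
    print(sweep_timing_ok(0.010, 1.0, 0.002))  # True: 2 ms < 10 ms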


Once the sequence has been captured, each frame may eventually be checked to note whether the subject of the camera is at least approximately in focus, by measuring its contrast or, in a more detailed way, its spatial frequency content in a number of small patches selected in frame post-processing. Patch selection may follow a predefined pattern or may be random. A first check of the frames may result in discarding most of the images in the sequence as being very much out of focus. The selection may be done after the sequence has been taken, though it could also be done in real time; the latter processing, or autofocusing, could be more costly.
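
A minimal sketch of such a first-pass patch check follows; the variance-over-patches measure and the patch size and count are assumptions for illustration, since the description does not prescribe a specific algorithm.

    import numpy as np

    def patch_focus_score(frame, n_patches=16, patch=32, rng=None):
        """Mean intensity variance over randomly selected square patches."""
        rng = rng or np.random.default_rng(0)
        h, w = frame.shape  # a grayscale frame is assumed
        scores = []
        for _ in range(n_patches):
            y = int(rng.integers(0, h - patch))
            x = int(rng.integers(0, w - patch))
            scores.append(frame[y:y + patch, x:x + patch].astype(float).var())
        return float(np.mean(scores))

    # First pass: discard frames whose score falls below a chosen threshold.
    # keep = [f for f in frames if patch_focus_score(f) > threshold]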


A significant aspect is that the frames which survive the first pass may be analyzed further to locate one or two eyes of a person in them, with the spatial frequency content then measured over the eyes only. Eye finding may be performed by an algorithm. The eye in the image or images may be localized to, and followed by, a window. Such a window is described herein relative to FIGS. 3, 4 and 10. The frames having the highest spatial frequency content over the eyes in them may then contain the best eye and/or iris images in the sequence, and be either in or nearly in focus.
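
One way to realize this second pass, sketched below, is to score only the eye window in each surviving frame with a high-spatial-frequency measure and keep the sharpest; the Laplacian-variance measure and the window format here are assumptions for illustration.

    import numpy as np
    from scipy.ndimage import laplace

    def eye_window_sharpness(frame, box):
        """Spatial-frequency score over an eye window box = (y, x, h, w)."""
        y, x, h, w = box
        win = frame[y:y + h, x:x + w].astype(float)
        return float(laplace(win).var())

    def best_frame_index(frames, eye_boxes):
        """Index of the frame with the sharpest eye window."""
        scores = [eye_window_sharpness(f, b)
                  for f, b in zip(frames, eye_boxes)]
        return int(np.argmax(scores))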


The rate at which the frames are to be taken may define a minimum degree of cooperation required of the subject. If the subject is not moving, holding its head more or less steady, and is reasonably close to the camera, then frame rates on the order of hundreds of frames per second may suffice, without a need for very high illumination irradiance levels, which may be harmful to the subject's eyes. A near-IR flash may be used, whose intensity may be increased for farther distances or shorter exposure times.


The frame sequence may be processed either off-line or in real time, depending on the investment one is willing to make in the necessary computational hardware for the present system. Another approach may be to cull out the unlikely candidate frames in real time, store only those with promise, and analyze them off-line after the entire focus sweep sequence is completed. Off-line processing of the frames may be done within the system in a second or so.


The system may have preset focusing prior to image capturing, whether it be manual focusing done by manipulating the mutual position between the subject and the device, or focusing based on a distance or focus mechanism with the focus then set back about 100 mm, for instance (or another distance), behind the subject, or set to a focus sweet spot for image acquisition, such as an iris being coincident with the camera optics object plane.
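
A sketch of such a preset, assuming the 100 mm bracket used in the examples herein, might compute the sweep endpoints from a rough range estimate as follows; the 1.5 m standoff is an illustrative assumption.

    def sweep_endpoints(estimated_distance_m, margin_m=0.100):
        """Bracket the estimate: start behind it, end ahead of it."""
        return (estimated_distance_m - margin_m,
                estimated_distance_m + margin_m)

    start, end = sweep_endpoints(1.50)  # e.g., a 1.5 m standoff estimate
    print(start, end)                   # 1.4 1.6, i.e., dF(t1) and dF(tN)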


The system may have a focus lens suitably instrumented with a motor drive. During frame capture, the focus lens position may sweep the entire focus range in either continuous motion or discrete steps. In the latter case, frame grabbing may be synchronized so that a frame is taken when the lens stops at each consecutive step of a series of steps. Alternatively, the camera may have a fixed focus and the camera may be moved instead, the subject may be asked to step forward or backward, or the sensor in the camera may be moved.
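
The stop-and-go case might be coordinated as in the sketch below, where move_lens_to() and grab_frame() are hypothetical hardware hooks standing in for the motor drive and the frame grabber.

    def stop_and_go_sweep(positions, move_lens_to, grab_frame):
        """Capture one frame at each discrete focus position."""
        frames = []
        for d in positions:
            move_lens_to(d)              # motor drive steps the focus lens
            frames.append(grab_frame())  # grab synchronized with the stop
        return frames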


Measuring the quality of focus may rely on measuring image intensity variance over a patch, or may rely on approaches based on spectral analysis. The measuring may be done off-line. For example, the way the variance value changes from one image to the next may indicate whether the images examined are moving toward or away from best focus. The system is not limited to a particular approach to focus quality measurement or to certain algorithms for face and eye finding.
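
For example, the direction of the variance change between consecutive frames can be read off as in this small sketch (an illustration, not a prescribed algorithm):

    def focus_trend(variances):
        """Yield +1 while variance rises (toward focus), -1 while it falls."""
        for prev, cur in zip(variances, variances[1:]):
            yield 1 if cur > prev else -1

    print(list(focus_trend([3.0, 5.1, 8.2, 6.4, 2.9])))  # [1, 1, -1, -1]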



FIG. 1 is a diagram of, for example, an iris image standoff acquisition system 10. The system may include an optics assembly 11 having an adjustable focus for capturing eye 28 details of a subject 20. Alternatively, an adjustable focus may be achieved by moving the camera sensor to and from the subject, moving the camera to and from the subject, or having the subject move to and from the camera. The assembly 11 may be optically coupled to a camera 16. There may be an illuminator 48 in case added light is needed, or a flash for obtaining a fast image capture of the subject. The focus of assembly 11 may be adjusted with a signal from a computer/processor 30 (referred to herein as a computer). Shutter control, and imagery capture and receipt, may be managed by the computer 30. Computer 30 may be a PC or other equivalent device. It may contain software appropriate for effecting standoff image acquisition of an iris. Computer 30 may be connected to a database 32, a network, an enforcement agency data center, an immigration processing center, and/or so on, the latter items of which are shown as “other” 38 in FIG. 1.



FIG. 2 is a diagram of an illustrative example of the iris image standoff acquisition system 10. Focus on the subject may be achieved in several ways. One way is to have the subject 20 move across the focusing distance 34 of the camera 16, as indicated by motion arrow 73. Another way is to move the camera 16 across the focusing distance 34, as indicated by motion arrow 71. Another way is to move the sensor 27 across the focusing distance 34, as indicated by motion arrow 72. Still another way is to move the lens 12 across the focusing distance 34, as indicated by motion arrow 13. In each of the indicated ways of moving across the focusing distance 34, the other items are held still, in that just one item is moved. Moving across the focusing distance 34 means that the movement begins on one side of the distance 34 and ends up on the other side of the focusing distance 34. During movement across the focusing distance in each of these ways, a sequence of images may be captured at a fast rate. Each of the movements may be “continuous” or “stop-and-go”.


For illustrative purposes, the way of moving the lens 12 across the focusing distance 34 may serve as the example for the description herein. The optics assembly 11 of FIG. 2 may have a lens 12 which may be moved back and forth in direction 13 within a housing 14 for holding the lens, and which may be used for focusing an image of subject 20, or a portion of it, on a sensor array 27. The lens 12 or sensor array 27 may be moved with a motor, drive or mover mechanism 18 which may be connected to housing 14. The lens housing 14 may be moved in a continuous mode, resulting in a “continuous focus sweep” approach, or in a step mode, resulting in a “stop-and-go” approach having focusing increments.


In the “stop-and-go” approach, at each stop, the lens focus may be set and then a picture is taken and detected at sensor array 27 in camera 16. The lens 12 may be set again for another focus at the next step, and then a picture is taken, and so on. The optics assembly 11 may contain one or more lenses.


In the continuous approach, a sequence of images or pictures may be taken while the lens housing 14 is moving. The focusing of lens 12 does not necessarily stop during image acquisition or picture taking. Relative to the “continuous focus lens sweep” approach, the image or picture exposure time should be shorter than the time it takes the lens to get out of its depth of field. Light 23 from subject 20 may be conveyed through lens 12 of lens housing 14, and into camera 16 onto array 27.


A mechanical or electronic shutter may be controlled with a signal from a shutter control 25 of a subsystem 26, which may be a part of the computer 30. The shutter may be electronically controlled, or may in effect be a picking off, or electronic receiving, of an image from sensor array 27 for a specified duration as desired. The exposure time of the image sensor for an image may be less than 100 milliseconds. In some instances, it may be less than 10 milliseconds or even less than 2 milliseconds, depending on the design of the system. The shutter may in effect be an illuminator 48 or other non-mechanical type of device. Alternatively, the shutter could be a mechanical mechanism.


There might be no explicit shutter as such in system 10. The image capture or acquisition may occur during a time of a flash or of constant supplemental illumination, such as LED sources, to assist in image exposure or capture. During no-flash time, an image sensor may be set, configured or designed not to detect any light. There may be a threshold which a light intensity, whether IR or visible, has to reach before the camera sensor 27 will sense and capture an image projected onto it. The duration of the illumination or flash, particularly relative to sensor 27, may be equivalent to the speed of a shutter opening. For instance, system 10 may have an IR illuminator or flash 48 which may provide a basis for a short duration exposure of an image on the sensor 27. The IR illuminator 48 may be of a wavelength which is not readily visible to, but may have some effect on, the subject 20, such as a person. There may instead be a visible light illuminator having an intensity which is inconspicuous to the person targeted by system 10.


IR flash or illuminator 48 may be electronically controlled by a shutter signal from shutter control 25. Alternatively, sensor 27 may capture or acquire an image by being electronically controlled in terms of the amount of time the sensor is allowed to be sensitive to light from the subject. Such sensor control may emulate a shutter effect. Camera 16 may monitor subject 20 and its eye or eyes 28 for purposes of aiming, focusing and capturing an image of the subject. The focusing change of lens 12 may be provided by a sweep signal from a module 33 to the drive or mover mechanism 18. An input to module 33 may be a “preset done” signal, which occurs when a preset signal from a module 37 indicates an object plane or focusing distance (dF) 34 at an initial time (t1) for the start of a sequence of images to be captured of subject 20. The preset signal (dF(t1)) of module 37 may be based on an estimate of the distance 34 between the subject 20 and camera 16. The point from which distance 34 is measured may be lens 12 or some other item of the camera 16 arrangement. A module 39 may provide a signal of the distance 34 estimate d̃s to preset module 37. The signal from module 39 may also go to a decision item, represented by a symbol 36, which asks whether the preset has been done. If it has, a “yes” signal may go to the sweep module 33 and the shutter control module 25 as a go-ahead for the sweep and shutter control to begin. If the signal from module 39 has not been received by preset module 37 and the decision item at symbol 36, then a “no” signal may go to a “wait” module 47, meaning that modules 25 and 33 should wait until such a signal has been received before starting.
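
The gating around decision item 36 may be pictured as in the following sketch, where the module numbering follows FIG. 2 but the event-based API and the set_focus() and start_sweep_and_shutter() hooks are assumptions for illustration.

    import threading

    preset_done = threading.Event()

    def preset_focus(estimated_distance_m, set_focus):
        """Module 37: preset focus from the range estimate, then signal done."""
        set_focus(estimated_distance_m - 0.100)  # dF(t1) = estimate - 100 mm
        preset_done.set()                        # "preset done" toward 25, 33

    def run_when_ready(start_sweep_and_shutter):
        """Modules 25/33 wait (module 47) until the preset is done."""
        preset_done.wait()
        start_sweep_and_shutter()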



FIG. 3 shows an image 29 of subject 20 on sensor 27. Image 29 may be forwarded on to computer 30. Sensor 27 may be, for example, a 2560 by 1920 pixel array providing about 4.9 mega-pixels of imagery of the subject. A window or portion 31 containing a target or an area of interest may be extracted from image 29. It may be a VGA-format image of 640 by 480 pixels, or some other size such as the 480 by 290 pixels used in the present example. An eye 28 of the subject or person may be the target to be extracted with window 31. Eye finding software may be used here in conjunction with the window. The window covers a portion of the camera's field of view which should include the target, e.g., an eye or iris, in full, but need not be much larger than required, for better overall accurate focusing on the target (i.e., eye 28). Given the larger image 29 of subject 20, the pointing or aiming of the camera 16 is not necessarily so critical. Further, there may be a significant aiming or pointing error tolerance in system 10. A diagram of window 31 is shown in FIG. 4. About 150 pixels may be allowed for the diameter of an iris 46 of eye 28. The iris may be somewhat centered in window 31 as shown by the pixel dimensions. However, it need not necessarily be centered. Also, window 31 may have other pixel dimensions. The dimensions may be adjustable.
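
A sketch of extracting such a window from the full sensor image follows; the 2560 by 1920 sensor and 480 by 290 window sizes come from the example above, while the eye-center coordinates are assumed to come from the eye-finding software.

    import numpy as np

    def extract_window(image, eye_cy, eye_cx, win_h=290, win_w=480):
        """Crop a window roughly centered on the detected eye."""
        y0 = int(np.clip(eye_cy - win_h // 2, 0, image.shape[0] - win_h))
        x0 = int(np.clip(eye_cx - win_w // 2, 0, image.shape[1] - win_w))
        return image[y0:y0 + win_h, x0:x0 + win_w]

    sensor_image = np.zeros((1920, 2560), dtype=np.uint8)  # ~4.9 megapixels
    window = extract_window(sensor_image, 900, 1300)
    print(window.shape)  # (290, 480)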



FIG. 5 is a diagram of an estimate d̃s 49 of the actual distance 34 between subject 20 and camera 16, e.g., its lens 12. The estimate 49 of the distance ds 34 may be more or less than the actual distance 34. For an illustrative example, the estimate 49 may be less than distance 34. When an estimate is received by the preset module 37 and forwarded on to the sweep module 33, the focusing distance of lens 12 for camera 16 may be set at a distance behind the estimated distance 49 for the initial start of focusing and taking images of the subject 20. The initial focusing distance may be about 100 mm (or another distance) behind the estimated distance 49, and be designated as dF(t1) in FIG. 5. The distance at the other end of the focusing range may be about 100 mm or so ahead of the estimated distance 49, and designated as dF(tN), where N may indicate the number of frame periods, or frame shots taken tF milliseconds apart, or it may be the last unit of N units of time for the focus sweep, whether continuous or discrete, of the subject by lens 12 of optics assembly 11. Starting at time t1, the frames are taken periodically at times t1, t2, t3, . . . through tN. Somewhere in the course of the focus sweep between t1 and tN, say at tL, the focusing distance dF(tL) equals the actual distance 34. FIG. 5a shows a graph of the nominal focus distance dF(t), with the plus/minus delta distance of focus +/−½ΔDOF, versus time t. In the figure, tL is intentionally chosen so that dF(tL) happens to fall within the depth of field of both the (n−1)th and nth frames. In more typical cases, however, the iris will be found in focus in just one frame.
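
The schedule of FIG. 5 may be computed as in this sketch, which spaces N frames tF apart while moving the nominal focus distance linearly across the bracket around the estimate; the specific numbers (11 frames, 20 ms apart, 100 mm margin, 1.5 m estimate) are illustrative assumptions drawn from the examples in this description.

    import numpy as np

    def sweep_schedule(estimate_m, margin_m=0.100, n_frames=11, t_f_s=0.020):
        """Return (times, focus distances) for frames t1..tN."""
        times = np.arange(n_frames) * t_f_s
        d_f = np.linspace(estimate_m - margin_m, estimate_m + margin_m,
                          n_frames)
        return times, d_f

    times, d_f = sweep_schedule(1.50)
    print(d_f[0], d_f[-1])  # 1.4 1.6 -- dF(t1) and dF(tN)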



FIG. 6 graphically illustrates the focusing distance dF between camera 16, or its lens 12, and subject 20, particularly an eye or eyes 28 and more particularly an iris or irises 46 of the eyes, over a number of time increments t, as shown by line 51. In the graphs of FIG. 6, there are about ten instances 52 of time through the range of focusing at which an image of subject 20 may be captured. There could be more or fewer than ten instances. At each instance of time 52, a trigger signal voltage 53 may go from shutter control module 25 to camera 16 for the shutter or other mechanism to initiate a capture or acquisition of an image of subject 20. At one of these instances of time 52, an image of an iris 46 of at least one eye 28 of subject 20 may be captured on array 27 at an in-focus distance relative to the subject 20 and camera 16.


Movement of subject 20, as illustrated in FIG. 7, is not an intended aspect of the present system 10. Even though subject 20 may have some inadvertent movement, it typically is not critical for obtaining a good focus of the subject. Lateral movement 54 of subject 20 does not necessarily affect the focus distance of the subject from camera 16 unless it is particularly large. This movement may be arrested by choosing a short exposure time. The present system may address radial or forward/backward movement 55 by appropriately choosing the frame rate. As long as the depths of field of individual frames overlap, the present system is immune to radial movement. Moreover, since subject 20 may be directed to take a certain position as a steady subject, the radial movement 55 would not necessarily be a factor in attaining an in-focus image, particularly since the exposure time of camera 16 would be about one to two milliseconds. If there is a concern, the exposure time TE may be shortened to avoid the effects of movement. The lateral movement 54 or radial movement 55 would not necessarily exceed the depth of field of the optics assembly 11 for a given focus.



FIG. 8 is a diagram showing the relationship of exposure time TE 56 and frame period TF 57. A shutter, exposure, capture or trigger signal may be indicated by a pulse 58, during which there is an exposure time 56 of the subject 20 on image array 27 of camera 16. Time t1 may begin when the exposure time TE starts at the rising edge, or beginning, of pulse 58. The exposure time TE ends at the falling edge, or end, of the pulse 58. The next exposure time TE may begin at the end of t1 with the start of pulse 58 at t2. The same may occur for each time t3, t4, t5, and so on, until pulse 58 at tN, where N is the total number of pulses.


The depth of field of the optics 11 may, for one example, be about 10 mm. The subject 20 moving forward, for instance, or the focus moving forward at a velocity VF, may be a factor to consider. Δd may be regarded as the depth of field. The formula TF ≤ Δd/VF should apply. If TF is much shorter than Δd/VF, then there may be a waste of resources. If TF is longer than Δd/VF, then the system may be unworkable because of gaps in focus coverage. Thus, in the present illustrative example, for still subjects, TF may be a period of up to 10 ms where the velocity VF approaches 1 meter/sec. The exposure time may be relatively much shorter, such as about one to two milliseconds. The focusing distance would sweep 0.2 m, during which the actual subject distance ds is within the depth of field of at least one frame. A sequence of images may be taken, and processing relating to them may generally be done later, or could be done in real time. The processing may take several seconds; however, this time is large relative to 10 milliseconds multiplied by the number of exposures. For 20 images, the total image acquisition time would be 200 milliseconds. A goal is to have at least one frame within the depth of field of the subject. This approach may permit a high frame rate (e.g., 100 frames per second) of image acquisition.
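
The arithmetic of this example can be checked directly:

    # Worked check of the bound TF <= delta_d / VF with the numbers above:
    # 10 mm depth of field and a focus velocity of 1 m/s.
    depth_of_field_m = 0.010
    focus_velocity_m_per_s = 1.0
    t_f_max = depth_of_field_m / focus_velocity_m_per_s
    print(t_f_max)             # 0.01 s, i.e., TF of up to 10 ms

    n_frames = 20
    print(n_frames * t_f_max)  # 0.2 s total acquisition for 20 images;
                               # a 0.2 m sweep at 1 m/s also takes 0.2 s.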


The above numbers may assume that the relative velocity |vF−vs| = 1 m/s. If this is the case, the system may be at its maximum speed and the 20 images will provide no depth-of-field overlap. For smaller relative velocities, there may be an overlap. The smaller the relative velocity, the larger the overlap may be.



FIG. 9 is a graph of intensity variance as a function of distance difference. There may be an actual in-focus point at distance 34 and an estimated in-focus point at distance 49. Distance 62 may represent an increment of the depth of focus at distance 34 which, as an illustrative example, may be about 25 mm. The other depths of focus 63, at and relative to distance 49, may be chosen to be about 25 millimeters, and the timing of image capture can be such that the images have about a 5 millimeter or so depth of focus overlap with adjacent depths of focus 63. That means a depth of focus 63 may extend about 12.5 mm on each side of its center. The depths of focus 63 may overlap other depths of focus 63 at 20 mm increments of distance difference throughout the roughly 200 mm portion shown in graph 35. The focus change may sweep forward (left to right) or backward (right to left) on the graph. The example noted herein may be the sweep-forward version. Thus, the focusing distance adjustment beginning at t1 may be designated as dF(t1) = d̃s − 100 mm at line 59, and ending at tN as dF(tN) = d̃s + 100 mm at line 67, as noted relative to FIG. 5. TF may equal 20 ms. So one may have images acquired at −100 mm, −80 mm, −60 mm, −40 mm, −20 mm, 0 mm, +20 mm, +40 mm, +60 mm, +80 mm and +100 mm, 11 images in total, at lines 59, 60, 41, 42, 43, 49, 44, 45, 65, 66 and 67, respectively, relative to the estimated focus distance d̃s 49. This setup may be performed under an assumption of a focus velocity of about one meter per second. As long as the focus distance estimate is within 100 mm of the depth of field 62 of the actual focus distance, there should be at least one image taken within the depth of field of focus of the subject. Also, as long as TF is less than or equal to the depth of focus Δd divided by VF, an image of the subject within the depth of field 62 of focus should be acquirable. For example, the estimated distance of focus may be at line 49, which can turn out to be, for instance, about 15 mm closer to lens 12 of the camera than the actual distance 34 of focus from subject 20, as may be determined from contrast analysis processing of the images when selecting the image which is in best focus. The distances as represented in FIG. 9 are for illustrative purposes and not necessarily drawn to scale.


The curves of graph 35 are not necessarily smooth, due to sensor noise, window location uncertainty and image discretization, as indicated by an example magnification 68. The lens 12 focus may be set, for instance, at infinity to start and at its closest focus to end, and then be changed through its focus range as a sequence of images of the subject 20 is captured during the change of focus. For illustrative purposes, five images of the right eye at focus distances 41, 42, 43, 49 and 44 may be captured. Incidentally, just three images might be sufficient. Images at the distances 41, 42, 43, 49 and 44 as designated by the lines may be regarded as images 41, 42, 43, 49 and 44, respectively. These images may be cropped images 31 of image 29 from FIG. 3 taken at various focus distances. Images 41, 42, 49 and 44, as shown in FIG. 10, may be regarded as out of focus, as indicated to an observer by blurriness, poor clarity, low contrast, a lack of discernible information in iris 46, and so forth, in view of image 43. However, mechanisms are available to indicate quality of focus. From left to right across the images, as the focus distance is changed, iris 46 details or information appear most unambiguous and discernible, and thus best focused, in image 43. The best focus of image 43 may be machine evaluated within system 10, via an aspect of computer 30, in view of the intensity variance/contrast, and be detected before a human eye sees it.


Specifically, graph 35 shows the intensity variance increasing as the focus improves through the sequence of lines 41 through 43. At lines 49 and 44, the focus appears to degrade as the intensity variance decreases. It may be noted that the best focus may be at line 43, which appears within the depth of field 62 at the peak of the intensity variance of graph 35; this coincides with the capturing of image 43 within the depth of field 62.



FIG. 11 is a diagram of an iris 46 image 69 cropped from the focused captured image 43 in FIG. 10. This iris image 69 may be available for analysis, storage, matching, and so forth.


In the present specification, some of the matter may be of a hypothetical or prophetic nature although stated in another manner or tense.


Although the invention has been described with respect to at least one illustrative example, many variations and modifications will become apparent to those skilled in the art upon reading the present specification. It is therefore the intention that the appended claims be interpreted as broadly as possible in view of the prior art to include all such variations and modifications.

Claims
  • 1. A standoff image acquisition system comprising: an image sensor; an optics assembly coupled to the image sensor; and a processor coupled to the image sensor and the optics assembly; wherein: the optics assembly is configured to convey images of a subject suitable for biometric imaging to the image sensor with a varying focus distance while the standoff image acquisition system is disposed at a standoff distance from the subject; and the processor is programmed to control the image sensor to acquire a sequence of images from the optics assembly of the subject and control the optics assembly to vary the focus distance monotonically during acquisition of the sequence of images.
  • 2. The system of claim 1, wherein at least one image of the sequence of images acquired comprises a window within the at least one image enclosing a biometric target on the subject, the window enclosing an area of the image substantially less than the total area of the image.
  • 3. The system of claim 1, wherein the focus distance of the optics assembly is varied in a continuous manner during the conveying of images.
  • 4. The system of claim 1, wherein the focus distance of the optics assembly is varied in a discrete manner during the conveying of images.
  • 5. The system of claim 1, wherein: the processor is programmed to determine a best focused image based on spatial frequency content relative to that of other images in the sequence of images; and the sequence of images is acquired at a frame rate independent of the rate at which the processor processes the images for determining a best focused image in the sequence of images.
  • 6. The system of claim 1, wherein: a frame period between exposures for images is less than or equal to a depth of field of focus divided by focus velocity; and the focus velocity is a change in focus distance relative to the optics assembly per unit time.
  • 7. The system of claim 1, wherein: the image sensor is for capturing infrared images; and an exposure time of the image sensor for an image is equivalent to a duration of illumination of an infrared illuminator.
  • 8. The system of claim 1, further comprising a shutter; and wherein: the shutter of the image sensor is electronic; and an exposure time of the image sensor for an image is less than 100 milliseconds.
  • 9. The system of claim 1, wherein: an estimate of an in-focus distance of the subject is provided to the optics assembly; and the focus of the optics assembly is set to vary between a first distance behind and a second distance in front of the estimate of the in-focus distance.
  • 10. A method for iris image acquisition comprising: providing a sensor for capturing images; conveying a plurality of images with a lens onto the sensor; varying a focus with the lens of at least two of the plurality of images; selecting an image from at least two of the plurality of images having a sufficient focus on a subject; and providing a window within at least one of the plurality of images to capture an eye target on the subject, wherein the window encloses an area of the at least one of the plurality of images substantially less than the total area of the image.
  • 11. The method of claim 10, wherein: a rate of processing images is independent of a rate of capturing images; and processing images comprises selecting an image out of at least two images of the plurality of images as having a good focus.
  • 12. The method of claim 10, wherein capturing images comprises providing an illuminator flash for freezing motion of a subject of the images.
  • 13. The method of claim 11, wherein: an item of the images is at least a portion of a person; and the sufficient focus is of the iris target within the window in the at least one of the plurality of images.
  • 14. An iris image acquisition system comprising: a camera configured to acquire a plurality of images; and a variable focus mechanism coupled to the camera, the variable focus mechanism configured to vary, during the acquisition of the plurality of images, a focus distance of the images; wherein: TF ≤ ΔD/VF; ΔD is a depth of field of focus; VF is a velocity at which the focus distance changes during the acquisition of the plurality of images; and TF is a frame time between acquisitions of images of the plurality of images.
  • 15. The system of claim 14, wherein: an estimate of a correct focus distance is determined; the focus mechanism is set at the estimated focus distance offset by a first distance; during acquisition of the plurality of images, the focus mechanism is varied from the estimated focus distance plus the first distance to the estimated focus distance minus a second distance; and the first and second distances are either plus or minus distances.
  • 16. The system of claim 15, wherein, during or after the acquisition of the plurality of images, an image having a sufficient focus is selected from the images.
  • 17. The system of claim 16, wherein an image having the sufficient focus is selected according to an image intensity variance of the image.
  • 18. The system of claim 14, wherein: the images comprise an area of interest on a subject; a field of view of the camera is sufficiently larger than the area of interest to minimize pointing of the camera to capture an image that includes the area of interest; and the area of interest is delineated by a window in the image.
  • 19. The system of claim 18, wherein the area of interest comprises an eye and/or an iris.
Related Publications (1)
Number Date Country
20100033677 A1 Feb 2010 US