The present invention pertains to capturing images and particularly capturing images of subjects. More particularly, the invention pertains to focusing for such images.
The invention is a predictive autofocusing system for still or moving subjects using image data.
An iris recognition system may work with iris images acquired in the near infrared (NIR) spectrum. NIR illumination may be provided by a special flash. An operational scenario of the system appears to raise a question, namely, how to focus an iris camera. First, given its very small depth of field, the iris camera should be focused on a particular eye being photographed. Second, the focusing should be done prior to the flash discharge, using only ambient light. Third, determining the correct focus and adjusting the lens to achieve the focus may take a certain amount of time. If the subject is moving, the time needed to do the focusing should be properly accounted for relative to the subject speed, if the system is to produce well focused iris images. An autofocusing system should predict where the subject's eye is going to be in the near future and calculate the nearest time when its lens' focus can “catch up” with the eye given the system's computational and physical limitations.
The autofocusing approach of the present system may operate on optical principles, i.e., such approach does not necessarily explicitly measure the distance to the subject using a ranging device like lidar. The approach may be based on trial-and-error techniques when the system takes a few test images using different focus settings and uses the information gleaned from them to determine both the correct focus lens position and when to fire the iris camera shot.
Predictive autofocusing may involve several phases. In phase 1, the system may take a sequence of test images and use the sequence to estimate the subject's position and/or speed relative to the camera. In phase 2, the position and/or speed may be used to solve a dynamic pursuer-evader problem, whose solution is the location and/or time of their earliest encounter. Here, the pursuer may be the camera focus, which is “chasing” the “evading” subject. As with any evader-pursuer problem, the solution should be computed ahead of real time to allow the pursuer enough time to actually reach the pre-calculated location of the encounter. In phase 3, the pursuer may set out to move into the pre-calculated location of the encounter and fire the shot when it gets there.
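For illustration, the encounter computation at the heart of phase 2 may be sketched as follows, under the simplifying assumption that both the subject and the lens focus move at constant velocities along the optical axis (the function name and units are illustrative only, not part of the system described herein):

```python
def encounter_time(d_subject, v_subject, d_focus, v_focus):
    """Return the time at which the pursuing lens focus meets the
    evading subject, or None if it never catches up.

    Distances are measured from the camera along the optical axis,
    and both trajectories are assumed linear: d(t) = d0 + v * t.
    """
    closing_speed = v_subject - v_focus
    gap = d_focus - d_subject
    if closing_speed == 0:
        return 0.0 if gap == 0 else None
    t = gap / closing_speed
    return t if t >= 0 else None
```

For example, a subject 1 meter short of the focus point, approaching the camera at 1 m/s while the focus sweeps toward the camera at 2 m/s, would be caught in 1 second; computing this instant ahead of real time is what gives the pursuer time to actually get there.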
In order to focus on the eye, the predictive autofocusing algorithm may rely on an eye finding algorithm to locate the eye in the image. The eye finding and optical autofocusing algorithms may require test images taken using only ambient light. Moreover, the algorithms may work with frame rates higher than the rate a near infrared, large image size iris camera can support. To overcome certain constraints, the present system may be implemented using a custom-built single-lens-splitter (SLS) camera that uses a beam splitter to separate the NIR and visible light bouncing off the subject and direct it into two separate cameras. The SLS camera is described herein. Other kinds of cameras may instead be used.
In contrast to the autofocusing approaches used in digital photographic cameras, the present system may use genuine image data from a small window around the eye taken in the test images of a sequence, use the image sequence to estimate the position and/or speed of the subject, and feed the data and estimate into a pursuer-evader problem solver to accurately predict the encounter, rather than just using simple feedback from a few fast point sensors to catch up with the moving subject. While such simple approaches may work when taking conventional pictures, their performance appears to fall short for iris imaging due to the very short depth of field of the iris camera optics and the need for precise focusing on a small, well defined area of the image.
A combined face and iris recognition system (CFAIRS) may work with high resolution iris images acquired by its iris camera in the near infrared (NIR) spectrum. Other kinds of recognition system may be used. The illumination may be provided by a special NIR flash, whose duration, on the order of one millisecond, may be short enough to freeze the subject's motion during image exposure.
The operational scenario should have a way to focus the iris camera. First, the focusing should be done prior to the flash discharge, using only the ambient light. Second, given the extremely small depth of field of the iris camera optics, the iris camera should be focused not on a vaguely defined “scene”, but on the particular eye being photographed.
Of the several phases, in phase 1, the system may acquire data from images to determine the subject's position and/or speed. Speed may include radial speed, which is measured along the optical axis of the camera. However, there may be situations where the subject motion has a high lateral speed (i.e., in a plane perpendicular to the optical axis) as well, in which case the system should determine the complete speed vector.
In phase 2, the position and/or speed may be used to solve a dynamic pursuer-evader problem, a solution of which is the location and/or time of their encounter. Here, the focus lens in the camera objective, whose position determines the focus distance, may be "chasing" the "evading" subject. When they meet, the system needs to recognize this encounter and make the camera fire its shot of the eye or iris of the subject. One may note that, as with any evader-pursuer issue, the solution needs to be computed ahead of real time to allow time for the pursuer, whose velocity is always limited, to actually reach the pre-calculated location of the encounter so as to be prepared for an iris image capture. How long this prediction needs to be depends on the relative pursuer-evader speed.
In phase 3, the pursuer may set out to move into the pre-calculated location of the encounter. If the pursuer has the ability to update its estimates of the subject's position and/or velocity during the pursuit, the pursuer may counter the subject's “evasive” maneuvers using feedback and improve its odds of obtaining a well focused image of the subject, particularly the subject's iris. The feedback may consist of just a periodic re-run of phases 1 and 2.
The optical autofocusing approaches for moving subjects may differ from other approaches just in their implementation of phase 1. A disadvantage of other approaches may be their inability to precisely locate the focus target. As a tradeoff for precision, a disadvantage of optical approaches may be the relatively long time before they determine the subject's position and/or velocity, which is largely determined by the time needed to collect the test images. The time requirement appears to be of particular concern when the subjects are moving. To speed up the collection, one needs a camera with a fast readout. However, fast and large image size sensors (i.e., ten to sixteen Mpixels or so) of the kind needed for iris capture may not be currently available at a reasonable price. To manage the size constraint, a single lens splitter (SLS) camera that uses a beam splitter may support a division-of-labor approach to the issue. The single lens splitter camera is merely an example camera. Two separate cameras may instead be used for visible and infrared image capture, respectively. Or one camera may be used for both visible and infrared image capture.
The SLS camera may have two cameras that share a common high power zoom lens followed by a splitter that separates the incoming light into its NIR and visible components. The NIR component may go straight into the high resolution, but slow, iris camera. The visible component may be deflected and it may enter a fast, low resolution focus camera, whose purpose is to provide data to enable proper and fast autofocusing.
Another limitation may stem from a need to repeatedly change focus in the course of autofocusing. A general and conceptually simplest approach may be the stop-and-go autofocusing in which the system moves the focus lens, waits until it stops, then starts the image exposure and waits until the exposure is over before it begins to move the lens again to acquire the next image. This is an approach which may involve moving mechanical parts with inertia. Thus, getting the parts moving and stopping them may be slowed down by dynamic transients. A continuous focus sweep autofocusing may improve the speed and reduce mechanical stresses on the optics by not requiring the focus lens to stop and start during each test image acquisition cycle.
The SLS camera's apparent complexity or another camera's properties might not be a consequence of using the particular autofocusing approaches presented herein. An issue may be that, given the sensor technology limitations, generally an iris camera cannot focus by itself. Regardless of what autofocusing approach one uses, the solution may require an auxiliary device, be it another camera, a lidar or some other sensor, whose presence could complicate the design, and whose role is to find the focus target in the scene, i.e., the particular eye that the iris camera will photograph, and autofocus on it using an approach of choice. Thus, a target that moves radially (and possibly also laterally) might be handled only by predictive autofocusing approaches. One reason for predictive autofocusing is that, due to the limited focus lens velocity, the lens focus distance dF(t) can be changed only so fast.
Camera 22 and CFAIRS system 22 may be terms used at times interchangeably in the present description. At time t0, system 22 may lock onto the subject 21 and initiate an iris image capture approach. The system's ranging subsystem may obtain a reading on the subject's approximate distance, which can be used to preset the focus lens at a location, position or distance 12 dF where the subject 21 would be in focus if the subject were standing that far from the camera. Because the subject's actual distance 11 is dS, the focus lens setting may be off by (dS−dF) meters. One may assume that the initial distance 12 dF is virtually identical with an end point of the range 23 and is beyond the subject (i.e., dS<dF, farther away from camera 22 than the subject 21 is), and that at time t1 the focus lens will have already arrived at this point.
A focus camera of camera 22 may start the autofocusing cycle at time t1 by taking its first image. The exposure time 25 for taking the first image may be designated as TE. The focus during that time TE may be designated as a focus period 26, whether the focus distance 12 is changing or not, and appears as a dashed line portion of the focus distance 12 graph. As soon as the exposure ends at t1+TE, the system may once again start moving the focus lens and thus changing the focus distance 12. After it stops at time t2, the focus camera may take a second test image during TE 25 and during the focus holding distance 26, and so on.
The number of images taken, N, may depend on the subject speed, which is not known at this point in time. The faster the subject 21 moves, the more test images the focus camera needs to take, since it takes the focus lens longer to catch up at time 16 tF with the moving subject 21. In order to determine the image sequence length dynamically, the system 10 would need to process the test images in real time. This means that the system has at most TL seconds, from when the current image exposure terminates until the next image data becomes available, to locate the eye 27, extract data from a window 29 surrounding the eye, calculate the image intensity variance over the window data, and decide whether to terminate the test sequence acquisition. Upon the capturing of a test image, virtually immediately in real time, the intensity variance over the target window 29 may be calculated before capturing the next image. Completing the test image sequence may conclude phase 1.
The system may next enter phase 2, when it uses the data to find the time 16 tF of the system's passing through the correct focus point 15, to compute the estimate of the subject's radial speed, νS (i.e., down the iris camera's optical axis), to compute the estimate of the subject's location, dS(tF), at which the focus occurred, and to compute the prediction of the time tP of the encounter.
When done computing the encounter specifics, the system may enter phase 3, moving the focus lens toward the predicted encounter point.
The next step may be to extend the concept to moving subjects. However, one may skip this extension and move on to the continuous focus sweep autofocusing since the basic ideas appear the same. A forward sweep approach may be considered.
Dynamic transients associated with the repeated moving and stopping of the focus lens tend to slow down the image acquisition process, mechanically stress the lens drive and increase power consumption. A better solution may be not to stop the lens movement but to expose pictures while the lens is moving. An added benefit of this approach is that the exposure and move intervals overlap so that the lens is already closer to its new position 23 when the exposure ends and thus the lens gets there sooner.
As the Figures show, the lens focus may be in error by being incorrectly set either before or beyond the subject 21. The focus or focusing error e(t) at time t may be introduced as the difference between the subject distance dS(t) and the focus distance dF(t),
e(t)=dS(t)−dF(t). (1)
The focusing error at the start of the autofocusing cycle in the forward sweep design,
e(t1)=dS(t1)−dF(t1)<0, (2)
may be negative, but change its sign later at time tF. Assuming that both the subject and lens focus are moving at constant velocities νS and νF, respectively, they may advance in time Δt to new positions,
dS(t+Δt)=dS (t)+νSΔt, and dF(t+Δt)=dF(t)+νFΔt, (3)
thus changing the focusing error to a new value,
e(t+Δt)=e(t)+(νS−νF)Δt. (4)
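Equations (1) through (4) may be checked numerically with a short sketch (the distances, velocities and time step below are arbitrary illustrations):

```python
def focusing_error(d_subject, d_focus):
    # equation (1): e(t) = dS(t) - dF(t)
    return d_subject - d_focus

def advance(d_subject, d_focus, v_subject, v_focus, dt):
    # equation (3): both advance at constant velocities for dt seconds
    return d_subject + v_subject * dt, d_focus + v_focus * dt

# equation (4): the error changes by exactly (vS - vF) * dt
dS, dF, vS, vF = 5.0, 5.5, -0.5, -1.5
e0 = focusing_error(dS, dF)
dS1, dF1 = advance(dS, dF, vS, vF, 0.2)
assert abs(focusing_error(dS1, dF1) - (e0 + (vS - vF) * 0.2)) < 1e-12
```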
A significant requirement for the continuous focus sweep to work is that the focus lens may move only so fast: the change in focusing error during the exposure must be smaller than the depth of field of the camera objective. If the focus camera exposure lasts TE seconds, then the following inequality must hold.
|νS−νF|TE≦depth of field for all n=1, 2, . . . (5)
for the continuous forward focus sweep to work. Also, if the lens focus point is to ever get ahead of the subject, the velocities need to satisfy the inequality,
νF<νS≦0 (6)
It may happen that the subject 21 is moving so fast that the focus lens drive lacks the power to get the lens focus position 23 ahead of the subject and the inequality (6) is not met. In reality, the autofocus may likely “time out” sooner, even though getting lens focus ahead of the subject is still theoretically possible, since getting ahead would likely take too much time to achieve.
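The two feasibility conditions, the exposure blur bound (5) and the velocity ordering (6), may be expressed as simple checks (a sketch; velocities are signed and in meters per second, both assumptions for illustration):

```python
def exposure_stays_sharp(v_subject, v_focus, exposure_time, depth_of_field):
    # inequality (5): the focusing error drift during one exposure
    # must stay within the depth of field, or the test image blurs
    return abs(v_subject - v_focus) * exposure_time <= depth_of_field

def sweep_can_pass_subject(v_subject, v_focus):
    # inequality (6): the forward sweep requires vF < vS <= 0
    return v_focus < v_subject <= 0
```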
The roles of the lens of camera 22 and the subject 21 may be swapped.
The backward sweep approach may start from an initial position where the focus lens distance 12 is preset so as to be before the subject 21 (i.e., closer to camera 22 than subject 21),
e(t1)=dS(t1)−dF(t1)>0. (7)
The focus velocity may now need to head away from the camera,
νF≧0, (8)
for the approach to work. While possible, the backward sweep approach may be slower than the forward sweep approach. A reason may be that even if the image sequence is shorter, the time tP of the encounter may come later, since after the first passing the lens focus has to turn around and chase the subject rather than let the subject come to it.
The timings in the forward and backward focus sweep approaches are shown in the accompanying figures.
Focus quality may be measured by the image intensity variance computed over a window (patch) 29 of an area centered on eye 27.
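This focus measure may be sketched as follows, assuming a grayscale image stored as a list of pixel rows (the window size and indexing convention are illustrative only):

```python
def focus_measure(image, center, half_size):
    """Intensity variance over a square window centered on the focus
    target (e.g., the eye); sharper focus yields a higher variance."""
    r0, c0 = center
    pixels = [image[r][c]
              for r in range(r0 - half_size, r0 + half_size + 1)
              for c in range(c0 - half_size, c0 + half_size + 1)]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)
```

A defocused window washes out toward its mean intensity, so this variance peaks when the lens focus passes through the subject.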
Plot 31, with its top 32, may show this variance as a function of time.
Once the test images are collected, the sampling instants may be expressed as
tn=t1+(n−1)TL for n=1, 2, . . . N. (9)
In system 10, the sampling period TL may be fixed.
Once subject 21 is allowed to move, it may be noted that the relationship between the focusing error and time depends on the combined velocity (νS−νF), as equation (4) states. The error measured at the sampling instants may be
e(tn)=e(t1)+(νS−νF) (n−1) TL for n=1, 2, . . . , (10)
with e(t1)<0 being the forward sweep design assumption (2). The inequality (6) may be rewritten as
0<νS−νF≦−νF, (11)
from which it follows that the largest focus error increments, −νFTL, may occur when the subject 21 is standing, i.e., when the subject's velocity νS=0. In other words, the faster the subject 21 moves, the smaller the increments, which may be manifested on the time axis 34 by a shortening of its scale, as if the samples were denser in time. A flat top 38, like top 32, may occur when the subject moves as fast as the lens focus, νS=νF, in which case
e(tn)=e(t1) for n=1, 2, . . . (12)
Increasing the subject's velocity even further may produce an error growing in magnitude, |e(tn)|>|e(t1)|, with the lens focus never catching up with the subject. This phenomenon may correspond to the case shown in the accompanying figures.
Using the focus error scale 33 as the independent variable may make the time axis scale 34 vary as appearing in the figures.
Second, the faster the subject 21 moves, the smaller are the variance increments per sample. If the variance data is noisy, the diminishing increments mean a lower signal-to-noise ratio. Since the slope of the variance curve may have to be estimated to determine the subject velocity, the number of images needed to maintain the same level of accuracy may go up with the growing subject (21) speed, because while the noise remains the same, the underlying focusing error increments become smaller. The level of noise present in the images may thus indirectly determine the maximum speed the system 10 can reliably handle.
The subject's speed may be determined from the samples (i.e., test images) obtained before the system 10 passes through the first focus, i.e., for tn≦tF. The number of these images, Nbefore, should be such as to allow a reliable estimation of the slope. The number of samples, Nafter, that need to be collected past the focus point should be such as to allow the algorithm to safely decide that, first, the passing has indeed happened and, second, to estimate or reconstruct the slope to the right of it well enough to determine the time tF when it took place. Nafter is generally smaller than Nbefore.
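One way to carry out such an estimate may be sketched as follows, fitting one straight line to the Nbefore samples and another to the Nafter samples and intersecting them (a simplification that assumes the sharpness measure is roughly linear on either side of the focus passing):

```python
def fit_line(ts, ys):
    # ordinary least-squares fit of y = a*t + b
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    a = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
         / sum((t - mt) ** 2 for t in ts))
    return a, my - a * mt

def estimate_focus_time(times, sharpness, n_before, n_after):
    """Estimate tF, the instant of passing through the focus, as the
    intersection of the rising and falling sharpness trends."""
    a1, b1 = fit_line(times[:n_before], sharpness[:n_before])
    a2, b2 = fit_line(times[-n_after:], sharpness[-n_after:])
    return (b2 - b1) / (a1 - a2)
```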
The prediction may be made as follows. An optical approach to autofocusing may determine the subject 21 distance from the relationship relating the distance at which a lens is focused, dF, to the values of the lens' zoom, sZ, and focus, sF, servo set points.
dF=f(sZ, sF) (14)
For a given lens and its instrumentation, this focus calibration function may be fixed. The focus calibration function may be determined once the system is built and stored as a regression function of the calibration data. When using the regression function, one should first ensure that the lens is properly focused on the target whose distance is being estimated. This may explain the interest in determining virtually exactly the time tF when the lens focus happens to be aligned with the subject's eye. Knowing this time allows a recovery of the zoom, sZ(tF), and focus, sF(tF), drive positions at that instant and, consequently, also the subject's distance,
dS(tF)=dF(tF)=f(sZ(tF), sF(tF)), (15)
which may be used as the initial conditions in the equations for computing an encounter as noted herein. The encounter may be a future situation when the lens focus and subject 21 are aligned again, that is,
dS(tP)=dF(tP), (16)
where tP is the predicted time of the encounter.
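The focus calibration function of equations (14) and (15) may be sketched as a stored table with interpolation; the set points and distances below are hypothetical illustrations, not real lens data:

```python
# Hypothetical calibration samples at one fixed zoom set point sZ:
# focus drive set point sF -> focused distance dF in meters.
CAL_SETPOINTS = [100.0, 200.0, 300.0, 400.0]
CAL_DISTANCES = [1.0, 2.0, 4.0, 8.0]

def calibrated_distance(s_focus):
    """Equation (14), dF = f(sZ, sF), approximated here by
    piecewise-linear interpolation of the calibration table."""
    if s_focus <= CAL_SETPOINTS[0]:
        return CAL_DISTANCES[0]
    for i in range(1, len(CAL_SETPOINTS)):
        if s_focus <= CAL_SETPOINTS[i]:
            s0, s1 = CAL_SETPOINTS[i - 1], CAL_SETPOINTS[i]
            d0, d1 = CAL_DISTANCES[i - 1], CAL_DISTANCES[i]
            return d0 + (d1 - d0) * (s_focus - s0) / (s1 - s0)
    return CAL_DISTANCES[-1]
```

Reading the drive positions at the instant tF and evaluating such a function is what recovers the subject distance dS(tF) of equation (15).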
To obtain the prediction, the equations of motion (3) with the terminal condition (16) should be solved. Since the focus lens velocity has discontinuities, the entire time interval from tF to tP may be split into phases within which the velocities are constant.
Phase 3 may be skipped altogether if executing it would not offer any significant time improvement over just waiting for the subject 21 to move into the encounter distance. If this is the case, then tP may be computed as if the system just passively waited for the subject, as in equation (22) below.
The following solution may be generic, with all three phases present. Phase 1 equations may be
dS(tF)=dF(tF), dS(tN)=dS(tF)+νS(tN−tF) and dF(tN)=dF(tF)+νF(tN−tF) (17)
Phase 2 equations may be
dS(tN+TC)=dS(tN)+νSTC and dF(tN+TC)=dF(tN), (18)
where TC is the duration of the encounter computation, during which the focus lens may be held still.
Phase 3 equations may be
dS(tP)=dS(tN+TC)+νS(tP−tN−TC) and dF(tP)=dF(tN+TC)−νF(tP−tN−TC), (19)
with the lens focus now heading back toward the subject at the velocity −νF.
Their solution may be the predicted encounter time,
(tP−tF)=(2νF/(νS+νF))(tN−tF)+(νF/(νS+νF))TC. (20)
As could be expected, the encounter time may be a function of time increments and thus independent of the absolute value of the times involved. Thus, one may be free to choose the instant from which one starts measuring time.
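Because the focus lens velocity is piecewise constant across the phases, the encounter may also be found numerically by stepping through velocity segments; the sketch below measures time from the end of the test sequence and assumes a hold segment for the computation (phase 2) followed by an open-ended pursuit segment (phase 3):

```python
def predict_encounter(d_subject, v_subject, d_focus, segments):
    """Advance the subject (constant velocity) and the lens focus
    (piecewise-constant velocity segments) and return the time of
    their meeting, or None if they never meet.

    segments: list of (duration, v_focus) pairs; the last duration
    may be float('inf') for the final pursuit phase.
    """
    t = 0.0
    for duration, v_focus in segments:
        closing = v_subject - v_focus
        gap = d_focus - d_subject
        if closing != 0:
            t_meet = gap / closing
            if 0 <= t_meet <= duration:
                return t + t_meet
        elif gap == 0:
            return t
        # no meeting inside this segment; advance both to its end
        d_subject += v_subject * duration
        d_focus += v_focus * duration
        t += duration
    return None
```

For example, a subject at 5 m closing at 0.5 m/s, with the lens focus held at 4 m for a 0.5 s computation and then driven toward the subject at 1 m/s, meets the focus 1 second after the sequence ends.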
The formula (20) may be valid only as long as tP leaves enough time for the computation, which may place a lower bound,
(tP−tN)≧TC min, (21)
on the time tP. If the system instead just passively waited for the subject with the lens focus held at its last position, the encounter time may become
(tP−tF)=(νF/νS)(tN−tF). (22)
Equation (22) may also explain why the autofocusing system cycle generally needs to have phase 3. If the autofocusing system only passively waited until the subject 21 moved into the right position, then for νS→0, the time of encounter may grow without bound, tP→∞.
Since the encounter instant computation cannot be initiated until the test image sequence is completed, a lower bound on tP may be effectively set by the computation time.
The bounds may confirm what has been already established herein, namely, that the speed of the lens focus distance change should not be smaller than the subject speed for the continuous focus sweep autofocusing to work. Additionally, the bounds may also define what the "real time processing" means in the continuous focus sweep autofocusing context. The time available for computing the encounter instant,
TC=tP−tN=((νF/νS)−1)(tN−tF), (23)
may decrease as the velocity ratio approaches
νF/νS→1 (24)
and shrinks to zero, that is, tP→tN. Requiring at least TC min seconds for the computation may give
(νF/νS)min=1+(TC min/(tN−tF)) (25)
Since νF is fixed by the optics design, the inequality (25) may limit the maximum subject speed that the autofocusing can handle.
νSmax=(1/(1+(TC min/(tN−tF))))νF. (26)
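Equation (26) may be sketched directly (the speeds here are treated as magnitudes, and the numbers in the example are arbitrary):

```python
def max_subject_speed(v_focus, t_compute_min, sweep_interval):
    """Equation (26): the fastest subject the autofocusing can handle,
    given the lens focus speed vF, the minimum computation time
    TC min, and the sweep interval (tN - tF)."""
    return v_focus / (1.0 + t_compute_min / sweep_interval)
```

With a zero computation time the bound degenerates to the lens focus speed itself, recovering the earlier requirement that the focus sweep be no slower than the subject.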
The difference between equations (22) and (20),
((νF/νS)(tN−tF))−((2νF/(νS+νF))(tN−tF))−((νF/(νS+νF))TC),
may show that if one wants to maximize the speedup through the forthcoming focus lens motion, then one should strive to make the computation time TC as short as possible.
As a timing device, one may use the computer clock, design the code so that the computer's components execute in known times, or use timing derived from the stepping of the focus lens drive.
The solution (20) may become known to the system 10 sometime during phase 2. Before accepting it, the system should check whether it is too close to the current time (of which the system is aware) to be realizable, given the system components' timing constraints. If the time is far out, it may make sense to actually execute phase 3. If it is close, however, it may make sense to drop phase 3 and recompute the predicted encounter time without it.
Once the autofocusing algorithm decides on tP, the system may move the focus lens so that at the predicted instant the lens focus and the subject meet again,
dS(tP)=dF(tP),
and fire the iris camera shot.
The algorithm may start its clock at an arbitrarily chosen time, which may be marked as t0, because its choice has no effect on the end result. A more important finding is, however, that the encounter time computation appears to make no direct use of anything that happened before the time, tF, of the first passing through the focus. True, the time as well as the subject 21 speed at that time can be established only in retrospect, no sooner than at tN, and for that purpose, the whole sequence of N test images had to be taken, most of them before the passing through the focus. In this respect, the diagrams sketched in the accompanying figures may be somewhat simplified.
First, one may note that phases 2 and 3 may exist in any predictive autofocusing concept. The formulae used in phase 2 to compute the encounter specifics may be slightly different from those derived herein, but this should be an inconsequential difference. Regardless of the approach used, at the time tP the lens focus should be aligned with the subject's eye so that the iris camera can fire its shot.
It may be the case that a lidar can provide the position and speed measurements of the subject in a shorter time. It might seem, then, that as far as agility is concerned, the optical autofocusing appears much slower compared to the approaches based on ranging. An actual benefit of ranging approaches, however, may not be as great as a first look may suggest. A reason is that in actual operation, much of the test image sequence may be taken while the SLS camera 22 is transitioning from one subject 21 to the next, an operation that generally takes place regardless of how the camera is going to be focused.
An example single lens splitter camera 22 may provide high quality iris images that can be used for identification and/or tracking of subjects or individuals 21. A camera system may include a focus camera and an iris camera. The latter may be referred to as sub-cameras. The focus camera may be sensitive to ambient light or some spectrum thereof, while the iris camera may be sensitive to infrared or another spectrum of light. The focus camera and the iris camera may share an optical path that includes one or more lenses that capture light, as well as a beam splitter or other optical element that directs light of some wavelengths to the focus camera and allows other wavelengths to reach the iris camera.
Focus camera 52 may be sensitive to ambient light or some spectrum thereof. Focus camera 52 may be any suitable camera that has a sufficiently high frame rate, allows region of interest selection and offers the sensitivity needed to perform an auto-focusing function, such as, for example, a PixeLink™ PL-A741 camera. Having a relatively high frame rate may mean that focus camera 52 has a relatively lower resolution, but this is not always the case. In some cases, focus camera 52 may have a frame rate of at least about 100 frames per second, or a frame every ten milliseconds.
It is contemplated that iris camera 54 may be any suitable camera that is capable of acquiring an iris image in a desired light spectrum and with a desired quality, such as, for example, a REDLAKE™ ES11000 or an ES16000 digital camera. The light spectra used may include, but are not limited to, visible and infrared wavelengths. The desired image quality may depend on an intended security application. For example, higher security level applications typically require higher image quality. The image quality is typically dependent on the entire optical path including both the camera and its optics. For some applications, the minimum iris image quality for various security levels is defined in ANSI standard INCITS M1/03-0590.
Camera system 22 may include a lens 58. While a single lens 58 is illustrated, it will be recognized that in some applications, depending for example on a distance between camera system 22 and a possible subject 21, or perhaps depending at least in part on the particular optics, two or more lenses 58 may be deployed, as desired. Lens or lenses 58 may be configured to provide any desired degree of magnification.
A beam splitter 62 or other optical element may be deployed downstream of lens 58. Beam splitter 62 may be configured to permit some wavelengths of light to pass straight through while other wavelengths of light are deflected at an angle as shown. In some instances, beam splitter 62 may be configured to permit infrared light such as near infrared light (about 700 to about 900 nanometers) to pass through beam splitter 62 towards iris camera 54 while deflecting visible light (about 400 to about 700 nanometers) or some spectrum thereof towards focus camera 52.
As a result, focus camera 52 and iris camera 54 may see the same image, albeit in different wavelengths, and may be considered as sharing an optical path, i.e., through lens 58. Focus camera 52 may be considered as having an optical axis 64 while iris camera 54 may be considered as having an optical axis 66. In some cases, optical axis 64 is perpendicular or at least substantially perpendicular to optical axis 66, but this is not required. Rather, this may be a feature of the optical properties of beam splitter 62. In some instances, a zoom lens 58 may be considered as being disposed along optical axis 66. In some cases, beam splitter 62 may be disposed at or near an intersection of optical axis 64 and optical axis 66, but this is not necessarily required.
Focus camera 52 may be used to move or focus a lens that is part of lens 58. Since focus camera 52 and iris camera 54 see the same image, by virtue of their common optical path, it should be recognized that focusing lens 58 via focus camera 52 may provide an initial focusing for iris camera 54, under ambient lighting conditions. In some instances, focus camera 52 may move the focus lens within lens 58 using one or more servo motors under the control of any suitable auto-focusing algorithm. In some cases, a controller (not shown) may execute the auto-focusing algorithm and command the servo motors.
Because light of differing wavelengths is refracted differently as it passes through particular materials (glass lenses and the like, for example), focusing lens 58 via one wavelength of light may not provide a precise focus for iris camera 54 at another wavelength of light. In some cases, it may be useful to calculate or otherwise determine a correction factor that may be used to correct the focus of lens 58 after lens 58 has been auto-focused using the focus camera 52, but before the iris camera 54 captures an image. Information regarding such correction may be found in, for example, U.S. patent application Ser. No. 11/681,251, filed Mar. 2, 2007, which is hereby incorporated by reference.
Once camera system 22 is pointed at a face, the focus camera 52 (or a separate controller or the like) is tasked with finding a focus target within an image seen or sensed by focus camera 52. In some cases, the focus target may be a predefined point on the face, such as an eye pupil or the nose bridge. Once the focus target is located at functionality 68 and focus camera 52 is precisely autofocused on it via functionality 70, it may be necessary to provide a focus correction pertaining to the difference in focal length between the ambient light or some spectrum thereof used to auto-focus the lens and the wavelength or wavelengths to be captured by the iris camera 54, as indicated at item 70. If or when the subject moves, such as by walking, bending, turning its head, and the like, focus camera 52 may be tasked to focus lens 58 in an ongoing process. Once focus has been achieved, camera system 22 may provide an in-focus flag 72 to initiate iris camera shutter control 74, and in some cases, a flash controller. Iris image data 55 may be provided from camera 54.
In some situations, camera system 22 may be deployed in a position that permits detection and identification of people who are standing or walking in a particular location such as a hallway, airport concourse, and the like.
The present illustration makes several assumptions. For example, a steering angle of plus or minus 22.5 degrees (or a total path width of about 45 degrees) may be assumed. It may also be assumed, for purposes of this illustration, that the individual is unaware of being identified and thus is being uncooperative. As a result, the individual happens to walk in a manner that increases the relative angle between the camera and the individual. The person may be detected at a distance of about 2 to 5 meters in this example.
It may be recognized that digital tilt and pan permit a camera to remain pointed at a face without requiring mechanical re-positioning, as long as a desired portion of the image, such as a face or a portion of a face, remains within the viewable image. Because focus camera 52 and iris camera 54 have about the same field of view, they may have about the same digital tilt and pan. A focus target algorithm may find the focus target (such as an eye pupil or nose bridge) within the focus camera image and then precisely focus on it.
At block 98, a focus target may be found within the focus camera image. Image data from a small area surrounding the focus target can be extracted from the focus camera image at block 100, and the extracted data may be used to precisely auto focus the focus camera 52. Control may pass to block 102, where the focus setting is corrected, if necessary, for any differences between the light spectrum used for focusing and the light spectrum used for image acquisition by iris camera 54. Control may pass to block 104, where an iris image is captured using, for example, infrared light sometimes aided by a flash discharge.
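The block 98 through 104 flow may be sketched as a simple sequence; all four callables below are placeholders standing in for the functionality described above, not a real camera API:

```python
def acquire_iris_image(frame, find_target, autofocus, nir_correction, fire):
    """Run the capture pipeline once on a single focus-camera frame."""
    target = find_target(frame)   # block 98: locate eye pupil or nose bridge
    autofocus(frame, target)      # block 100: focus on a window around it
    nir_correction()              # block 102: correct focus for the NIR band
    return fire()                 # block 104: capture the iris image
```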
In the present specification, some of the matter may be of a hypothetical or prophetic nature although stated in another manner or tense.
Although the invention has been described with respect to at least one illustrative example, many variations and modifications will become apparent to those skilled in the art upon reading the present specification. It is therefore the intention that the appended claims be interpreted as broadly as possible in view of the prior art to include all such variations and modifications.
The U.S. Government may have rights in the present invention.
Related U.S. publication: US 20100034529 A1, Feb. 2010.