The present invention relates generally to security systems and, more particularly, to methods and systems for screening receptacles including, for example, luggage, mail parcels or cargo containers, to identify certain objects located therein, where such methods and systems implement image distortion correction functionality.
Security in airports, train stations, ports, office buildings and other public or private venues is becoming increasingly important particularly in light of recent violent events.
Typically, security screening systems make use of devices generating penetrating radiation, such as x-ray devices, to scan individual pieces of luggage to generate an image conveying the contents of the luggage. The image is displayed on a screen and is examined by a human operator whose task it is to detect and possibly identify, on the basis of the image, potentially threatening objects located in the luggage. In certain cases, some form of object recognition technology may be used to assist the human operator.
A deficiency with current systems is that they are mostly reliant on the human operator to detect and identify potentially threatening objects. However, the performance of the human operator varies greatly according to factors such as the quality of training and fatigue. As such, the detection and identification of threatening objects is highly susceptible to human error. Furthermore, it will be appreciated that failure to identify a threatening object, such as a weapon for example, may have serious consequences, such as property damage, injuries and fatalities.
Another deficiency with current systems is that the labour costs associated with such systems are significant since human operators must view the images.
Consequently, there is a need in the industry for a method and system for use in screening luggage items, cargo containers, mail parcels or persons to identify certain objects, which alleviate at least in part the deficiencies of the prior art.
In accordance with a first broad aspect, the present application seeks to provide an apparatus suitable for screening a receptacle. The apparatus comprises an input for receiving an image signal associated with the receptacle, the image signal conveying an input image related to contents of the receptacle, the image signal having been produced by a device that is characterized by introducing distortion into the input image. The apparatus also comprises a processing unit in communication with the input, and operative for: applying a distortion correction process to the image signal to remove at least part of the distortion from the input image, thereby to generate a corrected image signal conveying at least one corrected image related to the contents of the receptacle; processing the corrected image signal in combination with a plurality of data elements associated with a plurality of target objects in an attempt to detect a presence of at least one of said target objects in the receptacle; and generating a detection signal in response to detection of the presence of at least one of said target objects in the receptacle. The apparatus further comprises an output for releasing the detection signal.
In accordance with a second broad aspect, the present application seeks to provide an apparatus for detecting the presence of one or more prohibited objects in a receptacle. The apparatus comprises an input for receiving an input image conveying graphic information regarding contents of a receptacle, the image having been produced by a device that introduces distortion into the input image; a distortion correction functional unit operable for processing the input image to remove at least part of the distortion from the input image in order to derive at least one corrected image; an optical correlator operable for processing the at least one corrected image in an attempt to detect whether at least one of said one or more prohibited objects is depicted in at least one of the at least one corrected image; and an output for releasing a signal in response to detecting that at least one of said one or more prohibited objects is depicted in at least one of the at least one corrected image.
In accordance with a third broad aspect, the present application seeks to provide a method for screening a receptacle, which comprises receiving an image signal associated with the receptacle, the image signal conveying an input image related to contents of the receptacle, the image signal having been produced by a device that is characterized by introducing distortion into the input image; applying a distortion correction process to the image signal to remove at least part of the distortion from the input image, thereby to generate a corrected image signal conveying at least one corrected image related to the contents of the receptacle; processing the corrected image signal in combination with a plurality of data elements associated with a plurality of target objects in an attempt to detect a presence of at least one of said target objects in the receptacle; generating a detection signal in response to detection of the presence of at least one of said target objects in the receptacle; and releasing the detection signal.
In accordance with a fourth broad aspect, the present application seeks to provide a computer-readable medium comprising computer-readable program code which, when interpreted by a computing apparatus, causes the computing apparatus to execute a method of screening a receptacle. The computer-readable program code comprises first computer-readable program code for causing the computing apparatus to be attentive to receipt of an image signal associated with the receptacle, the image signal conveying an input image related to contents of the receptacle, the image signal having been produced by a device that is characterized by introducing distortion into the input image; second computer-readable program code for causing the computing apparatus to apply a distortion correction process to the image signal to remove at least part of the distortion from the input image, thereby to generate a corrected image signal conveying at least one corrected image related to the contents of the receptacle; third computer-readable program code for causing the computing apparatus to process the corrected image signal in combination with a plurality of data elements associated with a plurality of target objects in an attempt to detect a presence of at least one of said target objects in the receptacle; fourth computer-readable program code for causing the computing apparatus to generate a detection signal in response to detection of the presence of at least one of said target objects in the receptacle; and fifth computer-readable program code for causing the computing apparatus to release the detection signal.
In accordance with a fifth broad aspect, the present application seeks to provide an apparatus for screening a receptacle. The apparatus comprises means for receiving an image signal associated with the receptacle, the image signal conveying an input image related to contents of the receptacle, the image signal having been produced by a device that is characterized by introducing distortion into the input image; means for applying a distortion correction process to the image signal to remove at least part of the distortion from the input image, thereby to generate a corrected image signal conveying at least one corrected image related to the contents of the receptacle; means for processing the corrected image signal in combination with a plurality of data elements associated with a plurality of target objects in an attempt to detect a presence of at least one of said target objects in the receptacle; means for generating a detection signal in response to detection of the presence of at least one of said target objects in the receptacle; and means for releasing the detection signal.
In accordance with a sixth broad aspect, the present application seeks to provide a system for screening a receptacle. The system comprises an image generation device operable to generate an image signal associated with the receptacle, the image signal conveying an input image related to contents of the receptacle, the input image containing distortion introduced by said image generation device. The system also comprises an apparatus in communication with said image generation device, and operable for: applying a distortion correction process to the image signal to remove at least part of the distortion from the input image, thereby to generate a corrected image signal conveying at least one corrected image related to the contents of the receptacle; processing the corrected image signal in combination with a plurality of data elements associated with a plurality of target objects in an attempt to detect a presence of at least one of said target objects in the receptacle; and generating a detection signal in response to detection of the presence of at least one of said target objects in the receptacle. The system further comprises an output module for conveying information derived at least in part on the basis of said detection signal to a user of the system.
For the purpose of this specification, the expression “receptacle” is used to broadly describe an entity adapted for receiving objects therein such as, for example, a luggage item, a cargo container or a mail parcel.
For the purpose of this specification, the expression “luggage item” is used to broadly describe luggage, suitcases, handbags, backpacks, briefcases, boxes, parcels or any other similar type of item suitable for containing objects therein.
For the purpose of this specification, the expression “cargo container” is used to broadly describe an enclosure for storing cargo such as would be used, for example, in a ship, train, truck or any other suitable type of cargo container.
Other aspects and features of the present invention will become apparent to those ordinarily skilled in the art upon review of the following description of specific embodiments of the invention in conjunction with the accompanying Figures.
A detailed description of embodiments of the present invention is provided herein below, by way of example only, with reference to the accompanying drawings, in which:
FIGS. 4a and 4b depict specific examples of visual outputs conveying the presence of at least one target object in the receptacle in accordance with specific examples of implementation of the present invention;
In the drawings, the embodiments of the invention are illustrated by way of examples. It is to be expressly understood that the description and drawings are only for the purpose of illustration and are an aid for understanding. They are not intended to be a definition of the limits of the invention.
Shown in
It should be noted that the image generation device 102 may introduce distortion into the input image 800. More specifically, different objects appearing in the input image 800 may be distorted to different degrees, depending on the position of the object in question within the input image 800 and depending on the height of the object in question within the receptacle 104 (which sets the distance between the object in question and the image generation device 102).
The apparatus 106 receives the image signal 150 and processes the image signal 150 in combination with a plurality of data elements associated with a plurality of target objects in an attempt to detect a presence of one or more target objects in the receptacle 104. In a specific non-limiting implementation, the data elements associated with the plurality of target objects are stored in a database 110.
In response to detection of the presence of one or more target objects in the receptacle 104, the apparatus 106 generates a detection signal 160 which conveys the presence of one or more target objects in the receptacle 104. Examples of the manner in which the detection signal 160 can be generated are described later on in the specification. The output module 108 conveys information derived at least in part on the basis of the detection signal 160 to a user of the system.
Advantageously, the system 100 provides assistance to the human security personnel using the system 100 in detecting certain target objects and decreases the susceptibility of the screening process to human error.
In a specific example of implementation, the image generation device 102 uses penetrating radiation or emitted radiation to generate the image signal 150. Specific examples of such devices include, without being limited to, x-ray, gamma ray, computed tomography (CT scans), thermal imaging and millimeter wave devices. Such devices are known in the art and as such will not be described further here. In a non-limiting example of implementation, the image generation device 102 is a conventional x-ray machine and the input image 800 related to the contents of the receptacle 104 is an x-ray image of the receptacle 104 generated by the x-ray machine.
The input image 800 related to the contents of the receptacle 104, which is conveyed by the image signal 150, may be a two-dimensional (2-D) image or a three-dimensional (3-D) image, and may be in any suitable format such as, without limitation, VGA, SVGA, XGA, JPEG, GIF, TIFF and bitmap amongst others. Preferably, the input image 800 related to the contents of the receptacle 104 is in a format that can be displayed on a display screen.
In some embodiments (e.g., where the receptacle 104 is large, as is the case with a cargo container), the image generation device 102 may be configured to scan the receptacle 104 along various axes to generate an image signal conveying multiple input images related to the contents of the receptacle 104. Scanning methods for large objects are known in the art and as such will not be described further here. Each of the multiple images is then processed in accordance with the method described herein below to detect the presence of one or more target objects in the receptacle 104.
In a specific example of implementation, the database 110 includes a plurality of entries associated with respective target objects that the system 100 is designed to detect. A non-limiting example of a target object includes a weapon. The entry in the database 110 that is associated with a particular target object includes a data element associated with the particular target object.
In a first non-limiting example of implementation, the data element associated with the particular target object comprises one or more images of the particular target object. The format of the one or more images of the particular target object will depend upon the image processing algorithm implemented by the apparatus 106, to be described later. Where plural images of the particular target object are provided, these images may depict the particular target object in various orientations.
In a second non-limiting example of implementation, the data element associated with the particular target object comprises the Fourier transform of one or more images of the particular target object.
Optionally, for the entry associated with a particular target object, characteristics of the particular target object are provided. Such characteristics may include, without being limited to, the name of the particular target object, its associated threat level, the recommended handling procedure when the particular target object is detected and any other suitable information. Optionally still, the entry associated with a particular target object is also associated with a respective target object identifier data element.
Although the database 110 has been shown in
In a specific example of implementation, the output module 108 conveys to a user of the system information derived at least in part on the basis of the detection signal 160. A specific example of implementation of the output module 108 is shown in
The output controller unit 200 receives from the apparatus 106 (shown in
In a first specific example of implementation, the output controller unit 200 is adapted to cause a display unit to convey information related to the one or more detected target objects. In a non-limiting example of implementation, the output controller unit 200 generates image data conveying the location of the one or more detected target objects within the receptacle 104. Optionally, the output controller unit 200 also extracts characteristics of the one or more detected target objects from the database 110 on the basis of the target object identifier data element and generates image data conveying the characteristics of the one or more detected target objects. In yet another non-limiting example of implementation, the output controller unit 200 generates image data conveying the location of the one or more detected target objects within the receptacle 104 in combination with the input image generated by the image generation device 102 (shown in
In a second specific example of implementation, the output controller unit 200 is adapted to cause an audio unit to convey information related to the one or more detected target objects. In a specific non-limiting example of implementation, the output controller unit 200 generates audio data conveying the presence of the one or more detected target objects, the location of the one or more detected target objects within the receptacle 104 and the characteristics of the one or more detected target objects.
The output controller unit 200 then releases a signal for causing the output device 202 to convey the desired information to a user of the system.
The output device 202 may be any device suitable for conveying information to a user of the system 100 regarding the presence of one or more target objects in the receptacle 104. The information may be conveyed in visual format, audio format or as a combination of visual and audio formats.
In a first specific example of implementation, the output device 202 includes a display screen adapted for displaying in visual format information related to the presence of the one or more detected target objects. In a second specific example of implementation, the output device 202 includes a printer adapted for displaying in printed format information related to the presence of the one or more detected target objects.
In a third specific example of implementation, the output device 202 includes an audio output unit adapted for releasing an audio signal conveying information related to the presence of one or more target objects in the receptacle 104.
In a fourth specific example of implementation, the output device 202 includes a set of visual elements, such as lights or other suitable visual elements, adapted for conveying in visual format information related to the presence of one or more target objects in the receptacle 104.
The person skilled in the art will readily appreciate, in light of the present specification, that other suitable types of output devices may be used without detracting from the spirit of the invention.
The apparatus 106 will now be described in greater detail with reference to
The first input 310 is for receiving an image signal 150 associated with the receptacle 104 from the image generation device 102 (shown in
Generally speaking, the processing unit of the apparatus 106 receives the image signal 150 associated with the receptacle 104 from the first input 310 and processes the image signal 150 in combination with the data elements associated with target objects (received from the database 110 at the second input 314) in an attempt to detect the presence of one or more target objects in the receptacle 104. In response to detection of one or more target objects (hereinafter referred to as “detected target objects”) in the receptacle 104, the processing unit of the apparatus 106 generates and releases at the output 312 the detection signal 160 which conveys the presence of the one or more detected target objects in the receptacle 104.
The various functional elements of the processing unit of the apparatus 106 implement a process, which is depicted in
Step 500
At step 500, the pre-processing module 300 receives the image signal 150 associated with the receptacle 104 via the first input 310. It is recalled that the image signal 150 conveys an input image 800 related to the contents of the receptacle 104.
Step 501A
At step 501A, the pre-processing module 300 processes the image signal 150 in order to enhance the input image 800 related to the contents of the receptacle 104, remove extraneous information therefrom and remove noise artefacts, thereby to help obtain more accurate comparison results later on. The complexity of the requisite level of pre-processing and the related trade-offs between speed and accuracy depend on the application. Examples of pre-processing may include, without being limited to, brightness and contrast manipulation, histogram modification, noise removal and filtering amongst others. As part of step 501A, the pre-processing module 300 releases a modified image signal 170 for processing by the distortion correction module 350 at step 501B. The modified image signal 170 conveys a pre-processed version of the input image 800 related to the contents of the receptacle 104.
Step 501B
One recalls at this point that the image generation device 102 may have introduced distortion into the input image 800 related to the contents of the receptacle 104. At step 501B, the distortion correction module 350 processes the modified image signal 170 in order to remove distortion from the pre-processed version of the input image 800. The complexity of the requisite amount of distortion correction and the related trade-offs between speed and accuracy depend on the application. As part of step 501B, the distortion correction module 350 releases a corrected image signal 180 for processing by the image comparison module 302 at step 502. The corrected image signal 180 conveys at least one corrected image related to the contents of the receptacle 104.
With reference now to
Assuming that the receptacle 104 were flat (in the Z-direction), one could model the distortion introduced by the image generation device 102 as a spatial transformation T on a “true” image to arrive at the input image 800. Thus, T would represent a spatial transformation that models the distortion affecting a target object having a given shape and location in the “true” image, resulting in that object's “distorted” shape and location in the input image 800. Thus, to obtain the object's “true” shape and location, it is reasonable to want to make the distortion correction process resemble the inverse of T as closely as possible, so as to facilitate accurate identification of a target object in the input image 800. However, not only is T generally unknown in advance, but moreover it will actually be different for objects appearing at different heights within the receptacle 104.
More specifically, different objects appearing in the input image 800 may be distorted to different degrees, depending on the position of those objects within the input image 800 and depending on the height of those objects within the receptacle 104 (i.e., the distance between the object in question and the image generation device 102). Stated differently, assume that a particular target object 890 is located at a given height H890 within the receptacle 104. An image taken of the particular target object 890 will manifest itself as a corresponding image element 800I in the input image 800, containing a distorted version of the particular target object 890. To account for the distortion of the shape and location of the image element 800I within the input image 800, one can still use the spatial transformation approach mentioned above, but this approach needs to take into consideration the height H890 at which the particular target object 890 appears within the receptacle 104. Thus, one can denote the spatial transformation for a given candidate height H by TH, which therefore models the distortion that affects the “true” images of target objects when such target objects are located at the candidate height H within the receptacle 104.
Now, although TH is not known, it may be inferred, and an inverse can be obtained from the inferred version. The inferred version of TH is denoted TH* and is hereinafter referred to as an “inferred spatial transformation” for a given candidate height H. Basically, TH* can be defined as a data structure that represents an estimate of TH. Although the height that a target object may occupy within the receptacle 104 is a continuous variable, it may be possible to reduce the range of possible heights to a limited set of “candidate heights” (e.g., 5 to 10 of them) without introducing a significant detection error. Of course, the number of candidate heights in a given embodiment may be as low as one, while the upper bound on the number of candidate heights is not particularly limited.
The data structure that represents the inferred spatial transformation TH* for a given candidate height H may be characterized by a set of parameters derived from the coordinates of a set of “control points” in both the input image 800 and an “original” image for that candidate height. An “original” image for a given candidate height would contain non-distorted images of objects only if those objects appeared within the receptacle 104 at the given candidate height. Of course, while the original image for a given candidate height is unknown, it may be possible to identify picture elements in the input image 800 that are known to have originated from specific picture elements in the (unknown) original image. Thus, a “control point” corresponds to a picture element that occurs at a known location in the original image for a given candidate height H, and whose “distorted” position can be located in the input image 800.
In one specific non-limiting embodiment, to obtain control points specific to a given image generation device 102, and with reference to
To obtain the inferred spatial transformation TH* for a given candidate height H, one may utilize a “transformation model”. The transformation model that is used may fall into one or more of the following non-limiting categories, depending on the type of distortion that is sought to be corrected:
The use of the function cp2tform in the Image Processing Toolbox of Matlab® (available from Mathworks Inc.) is particularly suitable for the computation of inferred spatial transformations such as TH* based on coordinates for a set of control points. Other techniques will now be apparent to persons skilled in the art to which the present invention pertains.
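By way of non-limiting illustration only, the following Python sketch (using the scikit-image library rather than Matlab, and using made-up control-point coordinates) shows one possible way of estimating an inferred spatial transformation TH* for a single candidate height from control-point correspondences; the choice of a projective model is an assumption made purely for the purpose of the example.

```python
import numpy as np
from skimage import transform

# Control points at known locations in the (undistorted) "original" image for one
# candidate height H (placeholder coordinates, not values from the specification)...
original_pts = np.array([[10, 10], [630, 12], [625, 470], [8, 465]], dtype=float)
# ...and their observed ("distorted") positions in the input image 800.
distorted_pts = np.array([[22, 18], [610, 25], [600, 455], [30, 440]], dtype=float)

# Fit a projective model mapping original -> distorted coordinates; this plays the
# role of the inferred spatial transformation T_H* for the candidate height H.
t_h = transform.ProjectiveTransform()
t_h.estimate(src=original_pts, dst=distorted_pts)
print(t_h.params)  # 3x3 matrix of parameters characterizing T_H*
```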
The above process can be repeated several times, for different candidate heights, thus obtaining TH* for various candidate heights. It is noted that the derivation of TH* for various candidate heights can be performed off-line, i.e., before scanning of the receptacle 104. In fact, the derivation of TH* is independent of the contents of the receptacle 104.
Returning now to
It is noted that applying the inverse of TH* for each of the various candidate heights yields a corresponding number of corrected images 800C. Those skilled in the art will appreciate that each of the corrected images 800C will contain areas of reduced distortion where those areas contained objects located at the candidate height for which that particular corrected image 800C was generated.
It will be appreciated that TH*−1 is not always computable in closed form based on the corresponding TH*. Nevertheless, the corrected image 800C for the given candidate height can be obtained from the input image 800 using interpolation methods, based on the corresponding TH*. Examples of suitable interpolation methods that may be used include bicubic, bilinear and nearest-neighbor, to name a few.
The use of the function imtransform in the Image Processing Toolbox of Matlab® (available from Mathworks Inc.) is particularly suitable for the computation of an output image (such as one of the corrected images 800C) based on an input image (such as the input image 800) and an inferred spatial transformation such as TH*. Other techniques will now be apparent to persons skilled in the art to which the present invention pertains.
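Again by way of non-limiting illustration only, the following Python sketch shows how a corrected image 800C for one candidate height might be obtained by resampling the input image 800 through the inferred spatial transformation, in the spirit of imtransform; the image dimensions and the identity transformation used here are placeholders.

```python
import numpy as np
from skimage import transform

# Placeholder for the distorted input image 800 (in practice, the scanned image).
input_image = np.zeros((480, 640), dtype=float)

# t_h stands for the inferred spatial transformation T_H* for one candidate height,
# e.g. the ProjectiveTransform estimated from control points in the earlier sketch.
t_h = transform.ProjectiveTransform(matrix=np.eye(3))

# warp() resamples the input image through t_h by interpolation, so the closed-form
# inverse of T_H* is never needed; order=3 selects bicubic interpolation
# (1 = bilinear, 0 = nearest-neighbour).
corrected_800c = transform.warp(input_image, inverse_map=t_h, order=3, preserve_range=True)
```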
It is noted that certain portions of the corrected image 800C for a given candidate height might not exhibit less distortion than in the input image 800, for the simple reason that the objects contained in those portions appeared at a different height within the receptacle 104 when they were being scanned. Nevertheless, if a certain target object was in the receptacle 104, then it is likely that at least one portion of the corrected image 800C for at least one candidate height will show a reduction in distortion in its representation of that target object relative to the input image 800, thus facilitating comparison with the data elements in the database 110 as described later on.
Naturally, the precise numerical values in the transformations used in the selected distortion correction technique may vary from one image generation device 102 to the next, as different image generation devices introduce different amounts of distortion of different types, which appear in different regions of the input image 800.
Of course, those skilled in the art will appreciate that similar reasoning and calculations apply when taking into account the effect of the pre-processing module 300, the only difference being that one would be dealing with observations made in the pre-processed version of the input image 800 rather than in the input image 800 itself.
It will also be appreciated that the functionality of the pre-processing module 300 and the distortion correction module 350 can be performed in reverse order. In other embodiments, all or part of the functionality of the pre-processing module 300 and/or the distortion correction module 350 may be external to the apparatus 106, e.g., such functionality may be integrated with the image generation device 102 or performed by external components. It will also be appreciated that the pre-processing module 300 and/or the distortion correction module 350 (and hence steps 501A and/or 501B) may be omitted in certain embodiments of the present invention without detracting from the spirit of the invention.
Step 502
At step 502, the image comparison module 302 verifies whether there remain any unprocessed data elements in the database 110. In the affirmative, the image comparison module 302 proceeds to step 503 where the next one of the data elements is accessed and the image comparison module 302 then proceeds to step 504. If at step 502 all data elements in the database 110 have been processed, the image comparison module 302 proceeds to step 508 and the process is completed.
Step 504
Assuming for the moment that the data elements in the database 110 represent images of target objects, the data element accessed at step 503 conveys a particular image of a particular target object. Thus, at step 504, the image comparison module 302 effects a comparison between at least one corrected image related to the contents of the receptacle 104 (which are conveyed in the corrected image signal 180) and the particular image of the particular target object to determine whether a match exists. It is noted that more than one corrected image may be provided, namely when more than one candidate height is accounted for. The comparison may be effected using any image processing algorithm suitable for comparing two images. Examples of algorithms that can be used to perform image processing and comparison include without being limited to:
The above algorithms are well known in the field of image processing and as such will not be described further here.
In a specific example of implementation, the image comparison module 302 includes an edge detector to perform part of the comparison at step 504.
In another specific example of implementation, the comparison performed at step 504 includes effecting a “correlation operation” between the at least one corrected image related to the contents of the receptacle 104 (which are conveyed in the corrected image signal 180) and the particular image of the particular target object. Again, it is recalled that when multiple candidate heights are accounted for, then multiple corrected images may need to be processed, either serially, in parallel or a combination thereof.
In a specific non-limiting embodiment, the correlation operation involves computing the Fourier transform of the at least one corrected image related to the contents of the receptacle 104 (which are conveyed in the corrected image signal 180), computing the Fourier transform complex conjugate of the particular image of the particular target object, multiplying the two Fourier transforms together and then taking the Fourier transform (or inverse Fourier transform) of the product. Simply put, the result of the correlation operation provides a measure of the degree of similarity between the two images.
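For illustrative purposes only, the following Python sketch shows a purely numerical (digital) version of the correlation operation described above; the array names are illustrative and the sketch is not intended to represent the optical implementation described later.

```python
import numpy as np

def correlate_fft(corrected_image: np.ndarray, target_image: np.ndarray) -> np.ndarray:
    """Return the correlation plane; a sharp peak suggests a potential match."""
    F = np.fft.fft2(corrected_image)
    H = np.fft.fft2(target_image, s=corrected_image.shape)  # zero-pad to a common size
    C = np.fft.ifft2(F * np.conj(H))                         # correlation theorem
    return np.abs(C)

# The location of the maximum of the correlation plane indicates the candidate
# position of the target object, e.g.:
# peak = np.unravel_index(np.argmax(corr_plane), corr_plane.shape)
```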
In a specific example of implementation, the correlation operation is performed by an optical correlator. A specific example of implementation of an optical correlator suitable for use in comparing two images will be described later on in the specification. In an alternative example of implementation, the correlation operation is performed by a digital correlator.
The image comparison module 302 then proceeds to step 506.
Step 506
The result of the comparison effected at step 504 is processed to determine whether a match exists between (I) at least one of the at least one corrected image 800C related to the contents of the receptacle 104 and (II) the particular image of the particular target object. In the absence of a match, the image comparison module 302 returns to step 502. However, in response to detection of a match, it is concluded that the particular target object has been detected in the receptacle and the image comparison module 302 triggers the detection signal generation module 306 to execute step 510. Then, the image comparison module 302 returns to step 502 to continue processing with respect to the next data element in the database 110.
Step 510
At step 510, the detection signal generation module 306 generates the aforesaid detection signal 160 conveying the presence of the particular target object in the receptacle 104. The detection signal 160 is released via the output 312. The detection signal 160 may simply convey the fact that the particular target object has been detected as present in the receptacle 104, without necessarily specifying the identity of the particular target object. Alternatively, the detection signal 160 may convey the actual identity of the particular target object. As previously indicated, the detection signal 160 may include information related to the positioning of the particular target object within the receptacle 104 and optionally a target object identifier data element associated with the particular target object.
It should be noted that generation of the detection signal 160 may also be deferred until multiple or even all of the data elements in the database 110 have been processed. Accordingly, the detection signal may convey the detection of multiple particular target objects in the receptacle 104 and/or their respective identities.
It will be appreciated that the correlation operation may be implemented using a digital correlator. The correlation operation is computationally intensive and, in certain implementations requiring real-time performance, the use of a digital correlator may not provide suitable performance. Under such conditions, an optical correlator may be preferred.
Advantageously, an optical correlator performs the correlation operation physically through light-based computation, rather than by using software running on a silicon-based computer, which allows computations to be performed at a higher speed than is possible with a software implementation and thus provides for improved real-time performance. Specific examples of implementation of the optical correlator include a joint transform correlator (JTC) and a focal plane correlator (FPC). Two specific non-limiting embodiments of a suitable optical correlator are shown in
In a first embodiment, now described with reference to
In a second embodiment, now described with reference to
In this second embodiment, the data element accessed at step 503 conveys a particular template (or filter) 804′ for a particular image 804. Thus, in a modified version of step 504, and with continued reference to
The content and format of the database 110 may be further varied from one implementation to the next, and the skilled person in the art will readily appreciate in light of the present description that other approaches to generating templates (or filters) may be used without detracting from the spirit of the invention.
Many methods for generating filters are known in the art and a few such methods will be described later on in the specification. The reader is invited to refer to the following document for additional information regarding phase only filters (POF): “Phase-Only Matched Filtering”, Joseph L. Horner and Peter D. Gianino, Appl. Opt. Vol. 23 no. 6, 15 Mar. 1984, pp. 812-816. The contents of this document are incorporated herein by reference.
In a non-limiting example of implementation, the generation of the template (or filter) is performed in a few steps. First, the background is removed from the particular image of the particular target object; in other words, the particular target object is extracted from the background and the background is replaced by a black background. The resulting image is then processed through a Fourier transform function, the result of which is a complex image. A phase only filter (POF), for example, will only contain the complex conjugate of the phase information (between zero and 2π), which is mapped to values in the range 0 to 255. These 256 values correspond to the 256 gray levels of an image.
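By way of non-limiting illustration only, the following Python sketch outlines these template-generation steps, under the assumption that the background of the target-object image has already been replaced by black.

```python
import numpy as np

def make_pof(target_image: np.ndarray) -> np.ndarray:
    """Return an 8-bit image encoding the conjugate phase of the target's Fourier transform."""
    H = np.fft.fft2(target_image)
    phase = np.angle(np.conj(H))       # complex-conjugate phase, in (-pi, pi]
    phase = np.mod(phase, 2 * np.pi)   # map the phase to [0, 2*pi)
    # Map [0, 2*pi) to the 256 gray levels (0 to 255) of an image.
    return np.round(phase / (2 * np.pi) * 255).astype(np.uint8)
```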
As a variant, in order to reduce the amount of data needed to represent the whole range of 3D orientations that a single target object can take, a MACE (Minimum Average Correlation Energy) filter is used to generate a template (or filter) for a given target object. Typically, the MACE filter combines several different 2D projections of a given object and encodes them in a single MACE filter instead of having one 2D projection per filter. One of the benefits of using MACE filters is that the resulting database 110 would take less space since it would include fewer items. Also, since the number of correlation operations needed to identify a single target object would be reduced, the total processing time to determine whether a given object is present would also be reduced. The reader is invited to refer to the following document for additional information regarding MACE filters: Mahalanobis, A., B. V. K. Vijaya Kumar, and D. Casasent (1987); Minimum Average Correlation Energy Filters, Appl. Opt. 26 no. 17, 3633-3640. The contents of this document are incorporated herein by reference.
Another way of reducing the processing time of the correlation operation is to take advantage of the linear properties of the Fourier transform. By dividing the particular image into several sub-images, a composite image can be formed, herein referred to as a mosaic. When a mosaic is displayed at the input of the correlator, the correlation is computed simultaneously on all the sub-images without incurring any substantial time penalty. A mosaic may contain several different target objects or several different orientations of the same target object or a combination of both.
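For illustration only, the following Python sketch shows one way of assembling such a mosaic from several sub-images; the tile size and the number of sub-images are arbitrary assumptions.

```python
import numpy as np

# e.g. four sub-images, such as four orientations of one target object (placeholders).
views = [np.zeros((128, 128)) for _ in range(4)]

# Tile the sub-images into a single composite (mosaic) image, so that one correlation
# pass at the input of the correlator covers all of them simultaneously.
mosaic = np.block([[views[0], views[1]],
                   [views[2], views[3]]])   # 256x256 composite input
```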
The inner workings of the aforementioned non-limiting example optical correlator are illustrated in
The light beam modulated by the first image on the first LCD screen 904 is then propagated through a second set of lenses 906, referred to as a Fourier lens since it performs the equivalent of the Fourier transform mathematical operation. The inherent properties of light are used to physically perform the appropriate calculations. Specifically, the propagation of light is a function which corresponds to the kernel of the Fourier transform operation; thus, the propagation of light along the axis of a Fourier lens represents a sufficiently strong approximation of this natural phenomenon to assert that the light beam undergoes a Fourier transform. Otherwise stated, a lens has the inherent property of producing, at its back focal plane, the Fourier transform of an image displayed at its front focal plane. The Fourier transform, which can normally be rather computation-intensive when calculated by a digital computer, is performed in the optical correlator simply by the propagation of the light. The mathematics behind this optical realization is equivalent to the exact Fourier transform function and can be modeled with standard fast Fourier transform algorithms. For more information regarding Fourier transforms, the reader is invited to consider B. V. K. Vijaya Kumar, Marios Savvides, Krithika Venkataramani, and Chunyan Xie, “Spatial frequency domain image processing for biometric recognition”, Biometrics ICIP Conference 2002, or alternatively J. W. Goodman, Introduction to Fourier Optics, 2nd Edition, McGraw-Hill, 1996. The contents of these documents are incorporated herein by reference.
After going through the Fourier lens 906, the signal is projected on a second LCD screen 908 on which is displayed the target template (or filter) 804′ (i.e., the Fourier transform) for the particular image 804. The template (or filter) 804′ displayed on the second LCD screen 908 induces a phase variation on the incoming light beam; each pixel can potentially induce a phase change whose magnitude is equivalent to its gray level. As such, when the Fourier transform of the image associated with the receptacle 104 passes through the second LCD screen 908, it is multiplied by the Fourier transform of the template (or filter) 804′ for the particular image 804. The light beam then crosses a second Fourier lens 910 which, again, optically computes the equivalent of a Fourier transform, this time of the product of the two Fourier transforms; this operation corresponds to a correlation in the spatial domain.
The second Fourier lens 910 finally concentrates the light beam on a small area camera 912 where the result of the correlation is measured. The camera 912 in fact measures energy peaks on the correlation plane. The position of a correlation peak corresponds to the location of the center of the target object in the input image 800.
Referring back to
Fourier Transform and Spatial Frequencies
The Fourier transform as applied to images will now be described in general terms. The Fourier transform is a mathematical tool used to convert the information present within an object's image into its frequency representation. In short, an image can be seen as a superposition of various spatial frequencies and the Fourier transform is a mathematical operation used to compute the intensity of each of these frequencies within the image. The spatial frequencies represent the rate of variation of image intensity in space. Consequently, a smooth or uniform pattern mainly contains low frequencies. Sharply contoured patterns, by contrast, exhibit a higher frequency content.
The Fourier transform of an image f(x,y) is given by:
F(u,v)=∫∫f(x,y)e−j2π(ux+vy)dxdy (1)
where u, v are the coordinates in the frequency domain. Thus, the Fourier transform is a global operator: changing a single frequency of the Fourier transform affects the whole object in the spatial domain.
A correlation operation can be mathematically described by:
C(ε,ξ)=∫∫f(x,y)h*(x−ε,y−ξ)dxdy (2)
where ε and ξ represent the pixel coordinates in the correlation plane, C(ε,ξ) stands for the correlation, x and y identify the pixel coordinates of the input image, f(x,y) is the original input image and h* is the complex conjugate of the correlation filter h(x,y).
In the frequency domain the same expression takes a slightly different form:
C(ε,ξ)=ℑ−1(F(u,v)H*(u,v)) (3)
where ℑ is the Fourier transform operator (ℑ−1 denoting its inverse), u and v are the pixel coordinates in the Fourier plane, F(u,v) is the Fourier transform of the image f(x,y) acquired with the camera, and H*(u,v) is the complex conjugate of the Fourier transform of the template (or filter). Thus, the correlation between an input image and a template (or filter) is equivalent, in mathematical terms, to the multiplication of their respective Fourier transforms, provided that the complex conjugate of the template (or filter) is used. Consequently, the correlation can be defined in the spatial domain as the search for a given pattern (template/filter), or in the frequency domain, as a filtering operation with a specially designed matched filter.
Advantageously, the use of optics for computing a correlation operation allows the computation to be performed in a shorter time than by using a digital implementation of the correlation. It turns out that an optical lens properly positioned (i.e. input and output images are located on the lens's focal planes) automatically computes the Fourier transform of the input image. In order to speed up the computation of the correlation, the Fourier transform of a particular image can be computed beforehand and submitted to the correlator as a template (or filter). This type of filter is called a matched filter.
Generation of Templates (or Filters)
Matched filters, as their name implies, are specifically adapted to respond to one image in particular: they are optimized to respond to an object with respect to its energy content. Generally, the contour of an object corresponds to its high frequency content. This is easily understood, as contours represent areas where the intensity varies rapidly (hence a high frequency).
In order to emphasize the contour of an object, the matched filter can be divided by its modulus (i.e., normalized) over the whole Fourier transform image. The resulting filter is called a Phase-Only Filter (POF) and is defined by:
HPOF(u,v)=H*(u,v)/|H(u,v)| (4)
The reader is invited to refer to the following document for additional information regarding phase only filters (POF): “Phase-Only Matched Filtering”, Joseph L. Horner and Peter D. Gianino, Appl. Opt. Vol. 23 no. 6, 15 Mar. 1984, pp. 812-816. The contents of this document are incorporated herein by reference.
Because these filters are defined in the frequency domain, normalizing over the whole spectrum of frequencies implies that each of the frequency components is considered with the same weight. In the spatial domain (i.e., the usual real-world domain), this means that the emphasis is given to the contours (or edges) of the object. As such, the POF filter provides a higher degree of discrimination, sharper correlation peaks and higher energy efficiency.
The discrimination provided by the POF filter, however, has some disadvantages. It turns out that, although the optical correlator is somewhat insensitive to the size of the objects to be recognized, the images are expected to be properly sized, otherwise the features might not be registered properly. To understand this requirement, imagine a filter defined from a given instance of a ‘2’. If that filter is applied to a second instance of a ‘2’ whose contour is slightly different, the correlation peak will be significantly reduced as a result of the great sensitivity of the filter to the original shape. A different type of filter, termed a composite filter, was introduced to overcome these limitations. The reader is invited to refer to the following document for additional information regarding this different type of composite filter: H. J. Caulfield and W. T. Maloney, Improved Discrimination in Optical Character Recognition, Appl. Opt., 8, 2354, 1969. The contents of this document are incorporated herein by reference.
In accordance with specific implementations, filters can be designed by:
The latter procedure forms the basis for the generation of composite filters. Thus composite filters are composed of the response of individual POF filters to the same symbol. Mathematically, this can be expressed by:
hcomp(x,y)=αaha(x,y)+αbhb(x,y)+ . . . +αxhx(x,y) (5)
A filter generated in this fashion is likely to be more robust to minor signature variations as the irrelevant high frequency features will be averaged out. In short, the net effect is an equalization of the response of the filter to the different instances of a given symbol.
Composite filters can also be used to reduce the response of the filter to the other classes of symbols. In equation (5) above, if the coefficient αb, for example, is set to a negative value, then the filter response to a symbol of class b will be significantly reduced. In other words, the correlation peak will be high if a symbol of class a is present in the input image, and low if a symbol of class b is present. A typical implementation of composite filters is described in: “Optical Character Recognition (OCR) in Uncontrolled Environments Using Optical Correlators”, Andre Morin, Alain Bergeron, Donald Prevost and Ernst A. Radloff, Proc. SPIE Int. Soc. Opt. Eng. 3715, 346 (1999). The contents of this document are incorporated herein by reference.
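By way of non-limiting illustration only, the following Python sketch expresses equation (5) as a weighted sum of individual filters; the filter sizes and coefficient values are arbitrary assumptions made for the example.

```python
import numpy as np

def composite_filter(filters, alphas):
    """Weighted sum of individual filters h_a, h_b, ..., in the spirit of equation (5)."""
    return sum(a * h for a, h in zip(alphas, filters))

# Two 64x64 placeholder filters; a negative weight on the second one reduces the
# composite filter's response to symbols of that class, as discussed above.
h_a, h_b = np.ones((64, 64)), np.ones((64, 64))
h_comp = composite_filter([h_a, h_b], [1.0, -0.5])
```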
Those skilled in the art will appreciate that the concepts described above can also be readily applied to the screening of people. For example, in an alternative embodiment, a system for screening people is provided. The system includes components similar to those described in connection with the system depicted in
Those skilled in the art will appreciate that certain portions of the apparatus 106 can be implemented on a general purpose digital computer 1300, of the type depicted in
Alternatively, the above-described apparatus 106 can be implemented on a dedicated hardware platform where electrical/optical components implement the functional blocks described in the specification and depicted in the drawings. Specific implementations may be realized using ICs, ASICs, DSPs, FPGAs, an optical correlator, digital correlator or other suitable hardware platform.
In a specific example of implementation, the optical correlator suitable for use in the system described herein includes a video input and a digital input. The video input is suitable for receiving a signal derived from an image generation device and the digital input is suitable for receiving a signal derived from images in a database. In a specific implementation, the video input is suitable for receiving a signal in an NTSC compatible format and the digital input is suitable for receiving a signal in a VGA compatible format. It will be appreciated that the digital input suitable for receiving a signal in a VGA compatible format may be replaced by any other suitable digital input interface adapted for receiving signals of lower or higher resolution than the VGA compatible format signal. Similarly, it will also be appreciated that the video input suitable for receiving a signal in an NTSC compatible format may be replaced by any other suitable analog or digital video signal interface for receiving signals in formats such as, but not limited to, PAL and SECAM. In a non-limiting implementation, the optical correlator is adapted to process an image received at the video input having an area of 640×480 pixels. However, it will be readily apparent that, by providing suitable interfaces, larger or smaller images can be handled since the optical correlator's processing capability is independent of the size of the image, as opposed to digital systems that require more processing time and power as images get larger. In yet another alternative implementation, the video input is replaced by a second digital input adapted for receiving an image signal in any suitable digital image format. In such an implementation, the image generation device 102 (shown in
In a variant, a single optical correlator can be shared by multiple image generation devices. In such a variant, conventional parallel processing techniques can be used for sharing a common hardware resource.
Although the present invention has been described in considerable detail with reference to certain preferred embodiments thereof, variations and refinements are possible without departing from the spirit of the invention. Therefore, the scope of the invention should be limited only by the appended claims and their equivalents.
This application is a continuation-in-part claiming the benefit under 35 USC §120 of PCT international patent application serial number PCT/CA2005/000716, filed on May 11, 2005 and designating the United States, the contents of which are incorporated herein by reference. This application is also a continuation-in-part claiming the benefit under 35 USC §120 of U.S. patent application Ser. No. 11/268,749 entitled “METHOD AND SYSTEM FOR SCREENING CARGO CONTAINERS”, filed on Nov. 8, 2005 now U.S. Pat. No. 7,734,102 by Eric Bergeron et al., the contents of which are incorporated herein by reference.
5943388 | Tumer | Aug 1999 | A |
5951474 | Matsunaga et al. | Sep 1999 | A |
5953452 | Boone et al. | Sep 1999 | A |
5960104 | Conners et al. | Sep 1999 | A |
5974111 | Krug et al. | Oct 1999 | A |
5978440 | Kang et al. | Nov 1999 | A |
5981949 | Leahy et al. | Nov 1999 | A |
5987095 | Chapman et al. | Nov 1999 | A |
6005916 | Johnson et al. | Dec 1999 | A |
6008496 | Winefordner et al. | Dec 1999 | A |
6009142 | Sauer et al. | Dec 1999 | A |
6011620 | Sites et al. | Jan 2000 | A |
6018561 | Tam | Jan 2000 | A |
6018562 | Willson | Jan 2000 | A |
6031890 | Bermbach et al. | Feb 2000 | A |
6035014 | Hiraoglu et al. | Mar 2000 | A |
6043870 | Chen | Mar 2000 | A |
6049381 | Reintjes et al. | Apr 2000 | A |
6057761 | Yukl | May 2000 | A |
6057909 | Yahav et al. | May 2000 | A |
6058159 | Conway et al. | May 2000 | A |
6060677 | Ulrichsen et al. | May 2000 | A |
6070583 | Perelman et al. | Jun 2000 | A |
6075591 | Vokhmin | Jun 2000 | A |
6075880 | Kollhof et al. | Jun 2000 | A |
6078638 | Sauer et al. | Jun 2000 | A |
6080994 | Carrott et al. | Jun 2000 | A |
6081580 | Grodzins et al. | Jun 2000 | A |
6084939 | Tamura | Jul 2000 | A |
6088423 | Krug et al. | Jul 2000 | A |
6094472 | Smith | Jul 2000 | A |
6097427 | Dey et al. | Aug 2000 | A |
6097483 | Komatsu | Aug 2000 | A |
6149300 | Greenway et al. | Nov 2000 | A |
6153873 | Wolf | Nov 2000 | A |
6155179 | Aust et al. | Dec 2000 | A |
6157730 | Roever et al. | Dec 2000 | A |
6163403 | Carrott et al. | Dec 2000 | A |
6175417 | Do et al. | Jan 2001 | B1 |
6175613 | Boutenko et al. | Jan 2001 | B1 |
6185272 | Hiraoglu et al. | Feb 2001 | B1 |
6188747 | Geus et al. | Feb 2001 | B1 |
6195413 | Geus et al. | Feb 2001 | B1 |
6195444 | Simanovsky et al. | Feb 2001 | B1 |
6198795 | Naumann et al. | Mar 2001 | B1 |
6205195 | Lanza | Mar 2001 | B1 |
6205243 | Migdal et al. | Mar 2001 | B1 |
6218943 | Ellenbogen | Apr 2001 | B1 |
6222902 | Lin et al. | Apr 2001 | B1 |
6229872 | Amos | May 2001 | B1 |
6233303 | Tam | May 2001 | B1 |
6236704 | Navab et al. | May 2001 | B1 |
6236708 | Lin et al. | May 2001 | B1 |
6249341 | Basiji et al. | Jun 2001 | B1 |
6252929 | Swift et al. | Jun 2001 | B1 |
6256370 | Yavuz | Jul 2001 | B1 |
6256404 | Gordon et al. | Jul 2001 | B1 |
6263044 | Joosten | Jul 2001 | B1 |
6263231 | Reitter | Jul 2001 | B1 |
6272204 | Amtower et al. | Aug 2001 | B1 |
6272233 | Takeo | Aug 2001 | B1 |
6278760 | Ogawa et al. | Aug 2001 | B1 |
6288974 | Nelson | Sep 2001 | B1 |
6289235 | Webber et al. | Sep 2001 | B1 |
6292260 | Lin et al. | Sep 2001 | B1 |
6292530 | Yavus et al. | Sep 2001 | B1 |
6292533 | Swift et al. | Sep 2001 | B1 |
6324245 | Tam | Nov 2001 | B1 |
6335742 | Takemoto | Jan 2002 | B1 |
6353673 | Shnitser et al. | Mar 2002 | B1 |
6366638 | Hsieh et al. | Apr 2002 | B1 |
6370222 | Cornick | Apr 2002 | B1 |
6373916 | Inoue et al. | Apr 2002 | B1 |
6373970 | Dong et al. | Apr 2002 | B1 |
6373979 | Wang | Apr 2002 | B1 |
6381297 | Hsieh | Apr 2002 | B1 |
6388788 | Harris et al. | May 2002 | B1 |
6403960 | Wellnitz et al. | Jun 2002 | B1 |
6404841 | Pforr et al. | Jun 2002 | B1 |
6408042 | Hsieh | Jun 2002 | B1 |
6415012 | Taguchi et al. | Jul 2002 | B1 |
6418184 | Wang et al. | Jul 2002 | B1 |
6418189 | Schafer | Jul 2002 | B1 |
6424692 | Suzuki | Jul 2002 | B1 |
6442288 | Haerer et al. | Aug 2002 | B1 |
6445765 | Frank et al. | Sep 2002 | B1 |
6448545 | Chen | Sep 2002 | B1 |
6453003 | Springer et al. | Sep 2002 | B1 |
6459755 | Li | Oct 2002 | B1 |
6463181 | Duarte | Oct 2002 | B2 |
6473489 | Bani-Hashemi et al. | Oct 2002 | B2 |
6477221 | Ning | Nov 2002 | B1 |
6480285 | Hill | Nov 2002 | B1 |
6480564 | Kim et al. | Nov 2002 | B1 |
6483894 | Hartick et al. | Nov 2002 | B2 |
6487307 | Hennessey et al. | Nov 2002 | B1 |
6502984 | Ogura et al. | Jan 2003 | B2 |
6507025 | Verbinski et al. | Jan 2003 | B1 |
6507278 | Brunetti et al. | Jan 2003 | B1 |
6525331 | Ngoi et al. | Feb 2003 | B1 |
6526120 | Gray et al. | Feb 2003 | B1 |
6532276 | Hartick et al. | Mar 2003 | B1 |
6542574 | Grodzins | Apr 2003 | B2 |
6542578 | Ries et al. | Apr 2003 | B2 |
6542579 | Takasawa | Apr 2003 | B1 |
6542580 | Carver et al. | Apr 2003 | B1 |
6542628 | Muller et al. | Apr 2003 | B1 |
6549683 | Bergeron et al. | Apr 2003 | B1 |
6552809 | Bergeron et al. | Apr 2003 | B1 |
6559769 | Anthony et al. | May 2003 | B2 |
6570177 | Struckhoff et al. | May 2003 | B1 |
6570708 | Bergeron et al. | May 2003 | B1 |
6570951 | Hsieh | May 2003 | B1 |
6570956 | Rhee et al. | May 2003 | B1 |
6574296 | Stierstorfer | Jun 2003 | B2 |
6574297 | Tam | Jun 2003 | B2 |
6580777 | Ueki et al. | Jun 2003 | B1 |
6580778 | Meder | Jun 2003 | B2 |
6583895 | Kuwahara et al. | Jun 2003 | B1 |
6584170 | Aust et al. | Jun 2003 | B2 |
6586193 | Yguerabide et al. | Jul 2003 | B2 |
6587575 | Windham et al. | Jul 2003 | B1 |
6587595 | Henkel et al. | Jul 2003 | B1 |
6597760 | Beneke et al. | Jul 2003 | B2 |
6603536 | Hasson et al. | Aug 2003 | B1 |
6608921 | Inoue et al. | Aug 2003 | B1 |
6611575 | Alyassin et al. | Aug 2003 | B1 |
6618466 | Ning | Sep 2003 | B1 |
6621887 | Albagli et al. | Sep 2003 | B2 |
6621888 | Grodzins et al. | Sep 2003 | B2 |
6621925 | Ohmori et al. | Sep 2003 | B1 |
6628982 | Thomas et al. | Sep 2003 | B1 |
6628983 | Gagnon | Sep 2003 | B1 |
6654443 | Hoffman | Nov 2003 | B1 |
6663280 | Doenges | Dec 2003 | B2 |
6665373 | Kotowski et al. | Dec 2003 | B1 |
6707879 | McClelland et al. | Mar 2004 | B2 |
6714623 | Sako et al. | Mar 2004 | B2 |
6721387 | Naidu et al. | Apr 2004 | B1 |
6721391 | McClelland et al. | Apr 2004 | B2 |
6724922 | Vilsmeier | Apr 2004 | B1 |
6731819 | Fukushima et al. | May 2004 | B1 |
6735274 | Zahavi et al. | May 2004 | B1 |
6735279 | Jacobs et al. | May 2004 | B1 |
6738450 | Barford | May 2004 | B1 |
6744909 | Kostrzewski et al. | Jun 2004 | B1 |
6746864 | McNeil et al. | Jun 2004 | B1 |
6751349 | Matama | Jun 2004 | B2 |
6754374 | Miller et al. | Jun 2004 | B1 |
6763148 | Sternberg et al. | Jul 2004 | B1 |
6785410 | Vining et al. | Aug 2004 | B2 |
H2110 | Newman | Oct 2004 | H |
6801647 | Arakawa | Oct 2004 | B1 |
6803997 | Stanek | Oct 2004 | B2 |
6804412 | Wilkinson | Oct 2004 | B1 |
6813395 | Kinjo | Nov 2004 | B1 |
6825854 | Beneke et al. | Nov 2004 | B1 |
6837422 | Meder | Jan 2005 | B1 |
6839403 | Kotowski et al. | Jan 2005 | B1 |
6839406 | Ries et al. | Jan 2005 | B2 |
6843599 | Le et al. | Jan 2005 | B2 |
6856272 | Levitan et al. | Feb 2005 | B2 |
6865287 | Beneke | Mar 2005 | B1 |
6865509 | Hsiung et al. | Mar 2005 | B1 |
6868138 | Clinthorne et al. | Mar 2005 | B2 |
6873261 | Anthony et al. | Mar 2005 | B2 |
6876322 | Keller | Apr 2005 | B2 |
6895072 | Schrock et al. | May 2005 | B2 |
6895338 | Hsiung et al. | May 2005 | B2 |
6899540 | Neiderman et al. | May 2005 | B1 |
6918541 | Knowles et al. | Jul 2005 | B2 |
6928141 | Carver et al. | Aug 2005 | B2 |
6936828 | Saccomanno | Aug 2005 | B2 |
6938488 | Diaz et al. | Sep 2005 | B2 |
6940943 | Claus et al. | Sep 2005 | B2 |
6950492 | Besson | Sep 2005 | B2 |
6952163 | Huey et al. | Oct 2005 | B2 |
6970531 | Eberhard et al. | Nov 2005 | B2 |
6980681 | Hsieh | Dec 2005 | B1 |
6982643 | Garfinkle | Jan 2006 | B2 |
6990171 | Toth et al. | Jan 2006 | B2 |
7000827 | Meder | Feb 2006 | B2 |
7012256 | Roos et al. | Mar 2006 | B1 |
7020241 | Beneke et al. | Mar 2006 | B2 |
7043474 | Mojsilovic et al. | May 2006 | B2 |
7045787 | Verbinski et al. | May 2006 | B1 |
7046761 | Ellenbogen et al. | May 2006 | B2 |
7050616 | Hsieh et al. | May 2006 | B2 |
7062074 | Beneke | Jun 2006 | B1 |
7065175 | Green | Jun 2006 | B2 |
7068751 | Toth et al. | Jun 2006 | B2 |
7092485 | Kravis | Aug 2006 | B2 |
7098461 | Endo | Aug 2006 | B2 |
7099004 | Masten | Aug 2006 | B2 |
7099432 | Ichihara et al. | Aug 2006 | B2 |
7100165 | Eldridge et al. | Aug 2006 | B2 |
7103137 | Seppi et al. | Sep 2006 | B2 |
7105828 | Unger et al. | Sep 2006 | B2 |
7116749 | Besson | Oct 2006 | B2 |
7130456 | Hillmann | Oct 2006 | B2 |
7136716 | Hsiung et al. | Nov 2006 | B2 |
7139406 | McClelland et al. | Nov 2006 | B2 |
7142633 | Eberhard et al. | Nov 2006 | B2 |
7154650 | Lettington | Dec 2006 | B2 |
7164750 | Nabors et al. | Jan 2007 | B2 |
7183906 | Zanovitch et al. | Feb 2007 | B2 |
7193515 | Roberts et al. | Mar 2007 | B1 |
7212113 | Zanovitch | May 2007 | B2 |
7212661 | Samara et al. | May 2007 | B2 |
7233682 | Levine | Jun 2007 | B2 |
7244941 | Roos et al. | Jul 2007 | B2 |
7253766 | Foote et al. | Aug 2007 | B2 |
7257189 | Modica et al. | Aug 2007 | B2 |
20010016030 | Nicolas et al. | Aug 2001 | A1 |
20010021013 | Hecht et al. | Sep 2001 | A1 |
20010021244 | Suzuki et al. | Sep 2001 | A1 |
20010028696 | Yamada et al. | Oct 2001 | A1 |
20010033636 | Hartick et al. | Oct 2001 | A1 |
20010038681 | Stanton et al. | Nov 2001 | A1 |
20010038705 | Rubbert et al. | Nov 2001 | A1 |
20010038707 | Ohara | Nov 2001 | A1 |
20010048734 | Uppaluri et al. | Dec 2001 | A1 |
20010053197 | Murayama et al. | Dec 2001 | A1 |
20020001366 | Tamura et al. | Jan 2002 | A1 |
20020015475 | Matsumoto et al. | Feb 2002 | A1 |
20020016546 | Cerofolini | Feb 2002 | A1 |
20020017620 | Oomori et al. | Feb 2002 | A1 |
20020018199 | Blumenfeld et al. | Feb 2002 | A1 |
20020024016 | Endo | Feb 2002 | A1 |
20020027970 | Chapman et al. | Mar 2002 | A1 |
20020028994 | Kamiyama | Mar 2002 | A1 |
20020031246 | Kawano | Mar 2002 | A1 |
20020037068 | Oikawa | Mar 2002 | A1 |
20020044691 | Matsugu | Apr 2002 | A1 |
20020054694 | Vachtsevanos et al. | May 2002 | A1 |
20020067259 | Fufidio et al. | Jun 2002 | A1 |
20020067793 | Stierstorfer | Jun 2002 | A1 |
20020085046 | Furuta et al. | Jul 2002 | A1 |
20020088952 | Rao et al. | Jul 2002 | A1 |
20020094062 | Dolazza et al. | Jul 2002 | A1 |
20020094119 | Sahadevan | Jul 2002 | A1 |
20020098518 | Levinson | Jul 2002 | A1 |
20020106052 | Menhardt | Aug 2002 | A1 |
20020122528 | Besson | Sep 2002 | A1 |
20020124664 | Call et al. | Sep 2002 | A1 |
20020126800 | Matsumoto et al. | Sep 2002 | A1 |
20020127586 | Mortensen | Sep 2002 | A1 |
20020141625 | Nelson | Oct 2002 | A1 |
20020150200 | Zonneveld | Oct 2002 | A1 |
20020161534 | Adler et al. | Oct 2002 | A1 |
20020168083 | Garms et al. | Nov 2002 | A1 |
20020168657 | Chen et al. | Nov 2002 | A1 |
20020172324 | Ellenbogen | Nov 2002 | A1 |
20020172409 | Saito et al. | Nov 2002 | A1 |
20020175921 | Xu et al. | Nov 2002 | A1 |
20020176534 | Meder | Nov 2002 | A1 |
20020186862 | McClelland et al. | Dec 2002 | A1 |
20020188197 | Bishop et al. | Dec 2002 | A1 |
20020191209 | Yasumaru | Dec 2002 | A1 |
20030012420 | Verwoerd et al. | Jan 2003 | A1 |
20030023592 | Modica et al. | Jan 2003 | A1 |
20030024315 | Merkel et al. | Feb 2003 | A1 |
20030031289 | Hsieh | Feb 2003 | A1 |
20030031291 | Yamamoto et al. | Feb 2003 | A1 |
20030036006 | Feke et al. | Feb 2003 | A1 |
20030038945 | Mahner | Feb 2003 | A1 |
20030072414 | Sakaida | Apr 2003 | A1 |
20030072418 | Albagli et al. | Apr 2003 | A1 |
20030072484 | Kokko et al. | Apr 2003 | A1 |
20030076924 | Mario et al. | Apr 2003 | A1 |
20030081720 | Swift et al. | May 2003 | A1 |
20030081859 | Kasutani | May 2003 | A1 |
20030082516 | Straus | May 2003 | A1 |
20030085348 | Megerle | May 2003 | A1 |
20030085353 | Almogy et al. | May 2003 | A1 |
20030091145 | Mohr et al. | May 2003 | A1 |
20030095633 | Van Woezik | May 2003 | A1 |
20030095692 | Mundy et al. | May 2003 | A1 |
20030128812 | Appleby et al. | Jul 2003 | A1 |
20030138147 | Ongkojoyo | Jul 2003 | A1 |
20030148393 | Woodbury et al. | Aug 2003 | A1 |
20030149346 | Arnone et al. | Aug 2003 | A1 |
20030165213 | Maglich | Sep 2003 | A1 |
20030179853 | Amemiya et al. | Sep 2003 | A1 |
20030194121 | Eberhard et al. | Oct 2003 | A1 |
20030205676 | Nelson et al. | Nov 2003 | A1 |
20030206649 | Moshe | Nov 2003 | A1 |
20030210139 | Brooks et al. | Nov 2003 | A1 |
20030215051 | Suzuki | Nov 2003 | A1 |
20030215143 | Zakrzewski et al. | Nov 2003 | A1 |
20030231788 | Yukhin et al. | Dec 2003 | A1 |
20030231791 | Torre-Bueno et al. | Dec 2003 | A1 |
20040012853 | Garcia et al. | Jan 2004 | A1 |
20040013239 | Gregerson et al. | Jan 2004 | A1 |
20040016271 | Shah et al. | Jan 2004 | A1 |
20040017882 | Misawa et al. | Jan 2004 | A1 |
20040017883 | Takagi et al. | Jan 2004 | A1 |
20040017888 | Seppi et al. | Jan 2004 | A1 |
20040017935 | Avinash et al. | Jan 2004 | A1 |
20040022425 | Avinash et al. | Feb 2004 | A1 |
20040027127 | Mills | Feb 2004 | A1 |
20040037462 | Lewis et al. | Feb 2004 | A1 |
20040041082 | Harmon | Mar 2004 | A1 |
20040051030 | Olszak et al. | Mar 2004 | A1 |
20040062342 | Cahill | Apr 2004 | A1 |
20040062349 | Schuster | Apr 2004 | A1 |
20040062351 | Yoshioka | Apr 2004 | A1 |
20040066882 | Eberhard et al. | Apr 2004 | A1 |
20040066884 | Hermann Claus et al. | Apr 2004 | A1 |
20040066890 | Dalmijn et al. | Apr 2004 | A1 |
20040075058 | Blevis et al. | Apr 2004 | A1 |
20040080315 | Beevor et al. | Apr 2004 | A1 |
20040082846 | Johnson et al. | Apr 2004 | A1 |
20040083958 | Saidman et al. | May 2004 | A1 |
20040086075 | Hein et al. | May 2004 | A1 |
20040086160 | Zimmermann | May 2004 | A1 |
20040087844 | Yen | May 2004 | A1 |
20040101097 | Wakayama et al. | May 2004 | A1 |
20040102700 | Asafusa | May 2004 | A1 |
20040109231 | Haisch et al. | Jun 2004 | A1 |
20040120009 | White et al. | Jun 2004 | A1 |
20040120857 | Smith et al. | Jun 2004 | A1 |
20040134986 | Studer et al. | Jul 2004 | A1 |
20040141056 | Izumi et al. | Jul 2004 | A1 |
20040142386 | Rigler et al. | Jul 2004 | A1 |
20040160599 | Hamamatsu et al. | Aug 2004 | A1 |
20040161073 | Nokita | Aug 2004 | A1 |
20040175041 | Miller | Sep 2004 | A1 |
20040176677 | Hwu et al. | Sep 2004 | A1 |
20040212492 | Boesch et al. | Oct 2004 | A1 |
20040213377 | Endo | Oct 2004 | A1 |
20040213600 | Watanabe et al. | Oct 2004 | A1 |
20040218729 | Xue et al. | Nov 2004 | A1 |
20040225222 | Zeng et al. | Nov 2004 | A1 |
20040236520 | Williams et al. | Nov 2004 | A1 |
20040240612 | Suzuki | Dec 2004 | A1 |
20040247071 | Dafni | Dec 2004 | A1 |
20040247171 | Hashimoto et al. | Dec 2004 | A1 |
20040252024 | Huey et al. | Dec 2004 | A1 |
20040252870 | Reeves et al. | Dec 2004 | A1 |
20040253660 | Gibbs et al. | Dec 2004 | A1 |
20040258198 | Carver et al. | Dec 2004 | A1 |
20040258202 | Wernick et al. | Dec 2004 | A1 |
20040263379 | Keller | Dec 2004 | A1 |
20040264624 | Tanaka et al. | Dec 2004 | A1 |
20040264648 | Claus et al. | Dec 2004 | A1 |
20040265175 | Witty et al. | Dec 2004 | A1 |
20050008119 | McClelland et al. | Jan 2005 | A1 |
20050008203 | Dixon | Jan 2005 | A1 |
20050017181 | Kearfott et al. | Jan 2005 | A1 |
20050018812 | Wolfs | Jan 2005 | A1 |
20050025280 | Schulte | Feb 2005 | A1 |
20050025350 | Engelbart et al. | Feb 2005 | A1 |
20050025377 | Avinash et al. | Feb 2005 | A1 |
20050031069 | Kaucic et al. | Feb 2005 | A1 |
20050053307 | Nose et al. | Mar 2005 | A1 |
20050057354 | Jenkins et al. | Mar 2005 | A1 |
20050058242 | Peschmann | Mar 2005 | A1 |
20050058350 | Dugan et al. | Mar 2005 | A1 |
20050061955 | Endo | Mar 2005 | A1 |
20050069085 | Lewis | Mar 2005 | A1 |
20050074088 | Ichihara et al. | Apr 2005 | A1 |
20050085721 | Fauver et al. | Apr 2005 | A1 |
20050094856 | Warren | May 2005 | A1 |
20050098728 | Alfano et al. | May 2005 | A1 |
20050105680 | Nabors et al. | May 2005 | A1 |
20050110672 | Cardiasmenos et al. | May 2005 | A1 |
20050111618 | Sommer, Jr. et al. | May 2005 | A1 |
20050113961 | Sabol et al. | May 2005 | A1 |
20050117693 | Miyano | Jun 2005 | A1 |
20050117700 | Peschmann | Jun 2005 | A1 |
20050123093 | Lawaczeck et al. | Jun 2005 | A1 |
20050123174 | Gorsky et al. | Jun 2005 | A1 |
20050128069 | Skatter | Jun 2005 | A1 |
20050133708 | Eberhard et al. | Jun 2005 | A1 |
20050147199 | Dunham et al. | Jul 2005 | A1 |
20050153356 | Okawa et al. | Jul 2005 | A1 |
20050163354 | Ziegler | Jul 2005 | A1 |
20050173284 | Ambrefe, Jr. | Aug 2005 | A1 |
20050189412 | Hudnut et al. | Sep 2005 | A1 |
20050190882 | McGuire | Sep 2005 | A1 |
20050206514 | Zanovitch et al. | Sep 2005 | A1 |
20050207655 | Chopra et al. | Sep 2005 | A1 |
20050212913 | Richter | Sep 2005 | A1 |
20050219523 | Onuma et al. | Oct 2005 | A1 |
20050220264 | Hornegger | Oct 2005 | A1 |
20050226375 | Eberhard et al. | Oct 2005 | A1 |
20050231421 | Fleisher et al. | Oct 2005 | A1 |
20050240858 | Croft et al. | Oct 2005 | A1 |
20050248450 | Zanovitch | Nov 2005 | A1 |
20050249416 | Leue et al. | Nov 2005 | A1 |
20050251397 | Zanovitch et al. | Nov 2005 | A1 |
20050251398 | Zanovitch et al. | Nov 2005 | A1 |
20050259868 | Sones | Nov 2005 | A1 |
20050265517 | Gary | Dec 2005 | A1 |
20050271184 | Ovadia | Dec 2005 | A1 |
20050275831 | Silver | Dec 2005 | A1 |
20050276376 | Eilbert | Dec 2005 | A1 |
20050276443 | Slamani et al. | Dec 2005 | A1 |
20050279936 | Litman et al. | Dec 2005 | A1 |
20050283079 | Steen et al. | Dec 2005 | A1 |
20060000911 | Stekel | Jan 2006 | A1 |
20060002504 | De Man et al. | Jan 2006 | A1 |
20060008054 | Ohara | Jan 2006 | A1 |
20060009269 | Hoskinson et al. | Jan 2006 | A1 |
20060013455 | Watson et al. | Jan 2006 | A1 |
20060013464 | Ramsay et al. | Jan 2006 | A1 |
20060017605 | Lovberg et al. | Jan 2006 | A1 |
20060018434 | Jacobs et al. | Jan 2006 | A1 |
20060018517 | Chen et al. | Jan 2006 | A1 |
20060019409 | Nelson et al. | Jan 2006 | A1 |
20060034503 | Shimayama | Feb 2006 | A1 |
20060036167 | Shina | Feb 2006 | A1 |
20060045235 | Bruder et al. | Mar 2006 | A1 |
20060045323 | Ateya | Mar 2006 | A1 |
20060064246 | Medberry et al. | Mar 2006 | A1 |
20060065844 | Zelakiewicz et al. | Mar 2006 | A1 |
20060072702 | Chapman | Apr 2006 | A1 |
20060083418 | Watson et al. | Apr 2006 | A1 |
20060084872 | Ichikawa et al. | Apr 2006 | A1 |
20060086794 | Knowles et al. | Apr 2006 | A1 |
20060093088 | Sowerby et al. | May 2006 | A1 |
20060098773 | Peschmann | May 2006 | A1 |
20060098866 | Whitson et al. | May 2006 | A1 |
20060115109 | Whitson et al. | Jun 2006 | A1 |
20060116566 | Bruijns | Jun 2006 | A1 |
20060119837 | Raguin et al. | Jun 2006 | A1 |
20060133650 | Xie et al. | Jun 2006 | A1 |
20060133659 | Hammond | Jun 2006 | A1 |
20060142662 | Van Beek | Jun 2006 | A1 |
20060142984 | Weese et al. | Jun 2006 | A1 |
20060173268 | Mullick et al. | Aug 2006 | A1 |
20060176062 | Yang et al. | Aug 2006 | A1 |
20060203960 | Schlomka et al. | Sep 2006 | A1 |
20060204080 | Sones et al. | Sep 2006 | A1 |
20060215811 | Modica et al. | Sep 2006 | A1 |
20060255929 | Zanovitch et al. | Nov 2006 | A1 |
20060257005 | Bergeron et al. | Nov 2006 | A1 |
20060262902 | Wattenburg | Nov 2006 | A1 |
20060269135 | Ramsay et al. | Nov 2006 | A1 |
20060273257 | Roos et al. | Dec 2006 | A1 |
20060274916 | Chan et al. | Dec 2006 | A1 |
20060282886 | Gaug | Dec 2006 | A1 |
20070003122 | Sirohey et al. | Jan 2007 | A1 |
20070058037 | Bergeron et al. | Mar 2007 | A1 |
20070147585 | Eilbert et al. | Jun 2007 | A1 |
20070168467 | Hu et al. | Jul 2007 | A1 |
20070195994 | McClelland et al. | Aug 2007 | A1 |
20070200566 | Clark et al. | Aug 2007 | A1 |
20070206719 | Suryanarayanan et al. | Sep 2007 | A1 |
20070210921 | Volpi et al. | Sep 2007 | A1 |
20070269005 | Chalmers et al. | Nov 2007 | A1 |
20080236275 | Breed et al. | Oct 2008 | A1 |
20080260097 | Anwar et al. | Oct 2008 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
2307439 | May 2000 | CA |
2319958 | Sep 2000 | CA |
2574402 | Jan 2006 | CA |
2651131 | Nov 2007 | CA |
0 577 380 | Jan 1994 | EP |
WO 0127601 | Apr 2001 | WO |
WO 02082290 | Oct 2002 | WO |
WO 03069498 | Aug 2003 | WO |
WO 03107113 | Dec 2003 | WO |
WO 2005086616 | Sep 2005 | WO |
PCT/CA2005/000716 | Feb 2006 | WO |
PCT/CA2005/001930 | Apr 2006 | WO |
PCT/CA2005/001930 | Apr 2006 | WO |
PCT/CA2006/000655 | Aug 2006 | WO |
PCT/CA2006/000751 | Aug 2006 | WO |
WO 2006119603 | Nov 2006 | WO |
PCT/CA2007/000779 | Aug 2007 | WO |
PCT/CA2007/000840 | Aug 2007 | WO |
PCT/CA2005/000716 | Nov 2007 | WO |
PCT/CA2005/001930 | Nov 2007 | WO |
PCT/CA2006/000655 | Nov 2007 | WO |
PCT/CA2006/000751 | Nov 2007 | WO |
PCT/CA2007/001297 | Nov 2007 | WO |
PCT/CA2007/001298 | Nov 2007 | WO |
PCT/CA2007/001658 | Jan 2008 | WO |
PCT/CA2007/001749 | Jan 2008 | WO |
Prior Publication Data

Number | Date | Country |
---|---|---|
2007/0041612 A1 | Feb 2007 | US |
Related U.S. Application Data

Relation | Number | Date | Country |
---|---|---|---|
Parent | PCT/CA2005/000716 | May 2005 | US |
Child | 11/431,627 | | US |
Parent | 11/268,749 | Nov 2005 | US |
Child | PCT/CA2005/000716 | | US |