This application is related to U.S. patent application Ser. No. 11/407,217 filed Apr. 20, 2006 by Eric Bergeron et al. and presently pending, which was a continuation-in-part application of international PCT patent application Ser. No. PCT/CA2005/000716 filed May 11, 2005 by Eric Bergeron et al. designating the United States.
The contents of the above referenced applications are incorporated herein by reference.
The present invention relates generally to security systems and, more particularly, to a user interface providing image enhancement capabilities for use in screening luggage, mail parcels or cargo containers to identify certain objects located therein or for screening persons to identify certain objects located thereon and to a method and system for implementing such a user interface.
Security in airports, train stations, ports, mail sorting facilities, office buildings and other public or private venues is becoming increasingly important in particular in light of recent violent events.
Typically, security-screening systems at airports make use of devices generating penetrating radiation, such as x-ray devices, to scan individual pieces of luggage to generate an image conveying the contents of the luggage. The image is displayed on a screen and is examined by a human operator whose task it is to identify, on the basis of the image, potentially threatening objects located in the luggage.
A deficiency with current systems is that they are entirely reliant on the human operator to identify potentially threatening objects. However, the performance of the human operator varies greatly according to such factors as training and fatigue. As such, the process of detection and identification of threatening objects is highly susceptible to human error.
Another deficiency with current systems is that the labour costs associated with such systems are significant since human operators must view the images.
Yet another deficiency is that the images displayed on the x-ray machines provide little, if any, guidance as to what is being observed. It will be appreciated that failure to identify a threatening object, such as a weapon for example, may have serious consequences, such as property damage, injuries and human deaths.
Consequently, there is a need in the industry for providing a device for facilitating visual identification of a prohibited object in an image during security screening that alleviates at least in part the deficiencies of the prior art.
In accordance with a broad aspect, the invention provides a method for facilitating visual identification of a prohibited object in an image during security screening. The method comprises receiving data conveying an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation. The method also comprises receiving information from an automated threat detection processor indicating an area of interest in the image potentially containing a prohibited object. The method also comprises processing the image to generate an enhanced image in which portions outside the area of interest are visually de-emphasized. The method also comprises displaying the enhanced image on a display device.
Advantageously, by de-emphasizing portions of the image outside the area of interest, visual information located in portions of the image outside the area of interest is filtered out. As a result, the enhanced image displayed to the user conveys the area of interest in a visually contrasting manner relative to portions of the image outside the area of interest. As such, during security screening, the attention of screening operators is led to an area of the enhanced image identified as an area of interest and likely to represent a higher risk of hiding a potential threat, thereby facilitating visual identification of a prohibited object in the image.
In accordance with a specific implementation, the method comprises receiving information from an automated threat detection processor indicating a plurality of areas of interest in the image potentially containing respective prohibited objects. The method also comprises processing the image to generate the enhanced image in which portions outside the areas of interest are visually de-emphasized.
In accordance with a specific example of implementation, portions of the enhanced image inside the area of interest are visually emphasized.
Advantageously, generating an enhanced image by concurrently de-emphasizing portions of the image outside the area of interest and emphasizing portions of the image inside the area of interest, provides an improved visual cue for directing the visual attention of a screener to an area of the image most likely to contain a prohibited object.
In accordance with a specific example of implementation, the method comprises providing a user control allowing a user to select either one of the image conveyed by the data received and the enhanced image to be displayed on the display device.
In accordance with a specific example of implementation, the method comprises providing a user control allowing a user to select a level of enhancement from a set of possible levels of enhancement. In a first example, the method comprises processing the image to generate the enhanced image such that portions outside the area of interest in the enhanced image are visually de-emphasized at least in part based on the selected level of enhancement. In a second example, the method comprises processing the image to generate the enhanced image such that portions inside the area of interest in the enhanced image are visually emphasized at least in part based on the selected level of enhancement. Optionally, the method comprises providing a first user control and a second user control for allowing a user to select a first level of enhancement and a second level of enhancement from a set of possible levels of enhancement. The method comprises processing the image to generate the enhanced image such that portions inside the area of interest in the enhanced image are visually emphasized at least in part based on the selected second level of enhancement and portions outside the area of interest are visually de-emphasized at least in part based on the selected first level of enhancement.
In accordance with a specific example of implementation, the method comprises providing a user control allowing a user to select a level of enlargement from a set of possible levels of enlargement. The method also comprises processing the image to generate the enhanced image such that portions inside the area of interest are enlarged at least in part based on the selected level of enlargement such that features of the portion of the enhanced image inside the area of interest appear on a larger scale than features in portions of the enhanced image outside the area of interest.
In accordance with another broad aspect, the invention provides an apparatus suitable for implementing a user interface for facilitating visual identification of a prohibited object in an image during security screening in accordance with the above described method.
In accordance with another broad aspect, the invention provides a computer readable storage medium including a program element suitable for execution by a CPU for implementing a graphical user interface module for facilitating visual identification of a prohibited object in an image during security screening in accordance with the above described method.
In accordance with yet another broad aspect, the invention provides a system for detecting the presence of one or more prohibited objects in a receptacle. The system includes an image generation apparatus, an automated threat detection processor, a display module and an apparatus for implementing a user interface module. The image generation apparatus is suitable for scanning a receptacle with penetrating radiation to generate data conveying an image of contents of the receptacle. The automated threat detection processor is in communication with the image generation apparatus and is adapted for processing the image to identify an area of interest in the image potentially containing a prohibited object. The apparatus is in communication with the image generation apparatus, the automated threat detection processor and the display module and implements a user interface module for facilitating visual identification of a prohibited object in an image during security screening. The apparatus comprises a first input for receiving the data conveying the image of the contents of the receptacle and a second input for receiving information indicating the area of interest in the image. The apparatus also comprises a processing unit in communication with the first input and the second input. The processing unit is operative for implementing the user interface module. The user interface module is adapted for processing the image to generate an enhanced image in which portions outside the area of interest are visually de-emphasized and for displaying the enhanced image on the display module.
In accordance with yet another broad aspect, the invention provides a client-server system for implementing a graphical user interface module for facilitating visual identification of a prohibited object in an image during security screening. The client-server system comprises a client system and a server system operative to exchange messages over a data network. The server system stores a program element for execution by a CPU. The program element comprises a first program element component executed on the server system for receiving data conveying an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation and for receiving information from an automated threat detection processor indicating an area of interest in the image potentially containing a prohibited object. The program element also comprises a second program element component executed on the server system for processing the image to generate an enhanced image in which portions outside the area of interest are visually de-emphasized. The program element also comprises a third program element component executed on the server system for sending messages to the client system for causing the client system to display the enhanced image on a display device.
In accordance with another broad aspect, the invention provides an apparatus for implementing a user interface module for facilitating visual identification of a prohibited object in an image during security screening. The apparatus comprises means for receiving data conveying an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation. The apparatus also comprises means for receiving information from an automated threat detection processor indicating an area of interest in the image potentially containing a prohibited object. The apparatus also comprises means for processing the image to generate an enhanced image in which portions outside the area of interest are visually de-emphasized. The apparatus also comprises means for releasing a display signal, the display signal being suitable for causing a display device to display the enhanced image.
In accordance with another broad aspect, the invention provides a method for facilitating visual identification of a prohibited object in an image during security screening. The method comprises receiving data conveying an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation. The method also comprises receiving information from an automated threat detection processor indicating an area of interest in the image potentially containing a prohibited object. The method also comprises processing the image to generate an enhanced image in which features appearing inside the area of interest are visually emphasized. The method also comprises displaying the enhanced image on a display device.
In accordance with a specific example of implementation, the method comprises providing a user control allowing a user to select either one of the received image and the enhanced image for display on the display device.
In accordance with another broad aspect, the invention provides an apparatus suitable for implementing a user interface for facilitating visual identification of a prohibited object in an image during security screening in accordance with the above described method.
In accordance with another broad aspect, the invention provides a computer readable storage medium including a program element suitable for execution by a CPU for implementing a graphical user interface module for facilitating visual identification of a prohibited object in an image during security screening in accordance with the above described method.
In accordance with yet another broad aspect, the invention provides a system for detecting the presence of one or more prohibited objects in a receptacle. The system includes an image generation apparatus, an automated threat detection processor, a display module and an apparatus for implementing a user interface module. The image generation apparatus is suitable for scanning a receptacle with penetrating radiation to generate data conveying an image of contents of the receptacle. The automated threat detection processor is in communication with the image generation apparatus and is adapted for processing the image to identify an area of interest in the image potentially containing a prohibited object. The apparatus is in communication with the image generation apparatus, the automated threat detection processor and the display module and implements a user interface module for facilitating visual identification of a prohibited object in an image during security screening. The apparatus comprises a first input for receiving the data conveying the image of the contents of the receptacle and a second input for receiving information indicating the area of interest in the image. The apparatus also comprises a processing unit in communication with the first input and the second input. The processing unit is operative for implementing the user interface module. The user interface module is adapted for processing the image to generate an enhanced image in which features appearing inside the area of interest are visually emphasized and for displaying the enhanced image on the display module.
In accordance with yet another broad aspect, the invention provides a client-server system for implementing a graphical user interface module for facilitating visual identification of a prohibited object in an image during security screening. The client-server system comprises a client system and a server system operative to exchange messages over a data network. The server system stores a program element for execution by a CPU. The program element comprises a first program element component executed on the server system for receiving data conveying an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation and information from an automated threat detection processor indicating an area of interest in the image potentially containing a prohibited object. The program element also comprises a second program element component executed on the server system for processing the image to generate an enhanced image in which features appearing inside the area of interest are visually emphasized. The program element also comprises a third program element component executed on the server system for sending messages to the client system for causing the client system to display the enhanced image on a display device.
In accordance with another broad aspect, the invention provides an apparatus for implementing a user interface module for facilitating visual identification of a prohibited object in an image during security screening. The apparatus comprises means for receiving data conveying an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation. The apparatus also comprises means for receiving information from an automated threat detection processor indicating an area of interest in the image potentially containing a prohibited object. The apparatus also comprises means for processing the image to generate an enhanced image in which features appearing inside the area of interest are visually emphasized. The apparatus also comprises means for displaying the enhanced image on a display module.
In accordance with another broad aspect, the invention provides a method for facilitating visual identification of prohibited objects in images associated with previously screened receptacles. The method comprises providing a plurality of records associated to respective previously screened receptacles. Each record includes an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation and information derived from an automated threat detection processor and indicating an area of interest in the image potentially containing a prohibited object. The method also comprises displaying on a display device a first viewing space conveying a set of thumbnail images, each thumbnail image in the set of thumbnail images being derived from a record in the plurality of records. The method also comprises enabling a user to select at least one thumbnail image in the set of thumbnail images. The method also comprises displaying on the display device a second viewing space conveying an enhanced image derived from a certain record in the plurality of records corresponding to the selected at least one thumbnail image.
In accordance with a specific example of implementation, the enhanced image is an enhanced previous image. The method further comprises receiving data conveying a current image of the contents of a currently screened receptacle derived from an apparatus that scans the currently screened receptacle with penetrating radiation. The method also includes receiving information from an automated threat detection processor indicating an area of interest in the current image potentially containing a prohibited object. The method also includes processing the current image to generate an enhanced current image in which portions outside the area of interest are visually de-emphasized. The method also includes enabling the user to select between the enhanced previous image and the enhanced current image. The method also includes displaying the selected one of the enhanced current image and the enhanced previous image on a display module.
In accordance with another broad aspect, the invention provides a computer readable storage medium including a program element suitable for execution by a CPU for implementing a graphical user interface module for facilitating visual identification of a prohibited object in an image during security screening in accordance with the above described method.
In accordance with another broad aspect, the invention provides an apparatus suitable for implementing a user interface for facilitating visual identification of a prohibited object in an image during security screening. The apparatus comprises a memory and a processing unit. The memory is suitable for storing a plurality of records associated to respective previously screened receptacles, each record including: an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation; and information derived from an automated threat detection processor and indicating an area of interest in the image potentially containing a prohibited object. The processing unit is in communication with the memory and implements the user interface module. The user interface module is adapted for displaying on a display device a first viewing space conveying a set of thumbnail images, each thumbnail image in the set of thumbnail images being derived from a record in the plurality of records. The user interface module is also adapted for enabling a user to select at least one thumbnail image in the set of thumbnail images. The user interface module is also adapted for displaying on the display device a second viewing space conveying an enhanced image derived from a certain record in the plurality of records corresponding to the selected at least one thumbnail image.
In accordance with another broad aspect, the invention provides a method for displaying information associated to a receptacle for use in detecting the presence of a threat in the receptacle during security screening. The method comprises receiving data conveying an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation. The method also comprises receiving data conveying a level of confidence that a threat has been detected in the receptacle. The method also comprises deriving a perceived level of threat associated with the receptacle at least in part based on the level of confidence. The method also comprises displaying a screening image derived at least in part based on the data conveying the image of the contents of the receptacle. The method also comprises displaying concurrently with the screening image a threat probability scale, the threat probability scale conveying the perceived level of threat associated with the receptacle in graphical format.
In accordance with another broad aspect, the invention provides an apparatus suitable for implementing a user interface for displaying information associated to a receptacle for use in detecting the presence of a threat in the receptacle during security screening in accordance with the above described method.
In accordance with another broad aspect, the invention provides a computer readable storage medium including a program element suitable for execution by a CPU for implementing a graphical user interface module for displaying information associated to a receptacle for use in detecting the presence of a threat in the receptacle during security screening in accordance with the above described method.
Other aspects and features of the present invention will become apparent to those ordinarily skilled in the art upon review of the following description of specific embodiments of the invention in conjunction with the accompanying Figures.
A detailed description of the embodiments of the present invention is provided herein below, by way of example only, with reference to the accompanying drawings, in which:
FIGS. 13a and 13b depict a first example of an original image conveying contents of a receptacle and a corresponding enhanced image in accordance with a specific example of implementation of the present invention;
FIGS. 13c and 13d depict a second example of an original image conveying contents of a receptacle and a corresponding enhanced image in accordance with a specific example of implementation of the present invention.
FIGS. 13e, 13f and 13g depict a third example of an original image conveying contents of a receptacle and two (2) corresponding enhanced images in accordance with a specific example of implementation of the present invention.
In the drawings, the embodiments of the invention are illustrated by way of examples. It is to be expressly understood that the description and drawings are only for the purpose of illustration and are an aid for understanding. They are not intended to be a definition of the limits of the invention.
Shown in
As depicted, the system 100 includes an image generation apparatus 102, an automated threat detection processor 106 in communication with the image generation apparatus 102 and an output module 108.
The image generation apparatus 102 is adapted for scanning a receptacle 104 to generate data conveying an image of contents of the receptacle 104. The automated threat detection processor 106 receives the data conveying the image of contents of the receptacle 104 and processes that image to identify an area of interest in the image potentially containing a prohibited object. Optionally, as shown in the embodiment depicted in
Advantageously, the system 100 provides assistance to the human security personnel using the system by facilitating visual identification of a prohibited object in an image during security screening. More specifically, displaying the enhanced image allows focusing an operator's attention on the areas most likely to contain a prohibited object, thereby improving the security personnel's efficiency and productivity.
Image Generation Apparatus 102
In a specific example of implementation, the image generation apparatus 102 uses penetrating radiation or emitted radiation to generate data conveying an image of the contents of the receptacle 104. Specific examples of such devices include, without being limited to, x-ray, gamma ray, computed tomography (CT scans), thermal imaging, TeraHertz and millimeter wave devices. Such devices are known in the art and as such will not be described further here. In a non-limiting example of implementation, the image generation apparatus 102 is a conventional x-ray machine suitable for generating data conveying an x-ray image of the receptacle 104. The x-ray image conveys, amongst others, material density information in relation to objects within the receptacle.
The data generated by the image generation apparatus 102 and conveying an image of the contents of the receptacle 104 may convey a two-dimensional (2-D) image or a three-dimensional (3-D) image and may be in any suitable format. Possible formats include, without being limited to, JPEG, GIF, TIFF and bitmap amongst others.
Database of Prohibited Objects 110
In a specific example of implementation, the database of prohibited objects 110 includes a plurality of entries associated to respective prohibited objects that the system 100 is designed to detect.
In a non-limiting implementation, for each entry associated to a prohibited object at least one image (hereinafter referred to as a “target image”) is provided in the database of prohibited objects 110. The format of the target images will depend upon the image processing algorithm implemented by the automated threat detection processor 106. More specifically, the format of the target images is such that a comparison operation can be performed by the automated threat detection processor 106 between the target images and data conveying an image of contents of the receptacle 104.
Optionally, for each entry associated to a prohibited object, a set of images is provided in the database of prohibited objects 110. For example, images depicting the prohibited object in various orientations may be provided.
Optionally still, for each entry associated to a target object, characteristics of the prohibited object are provided. Such characteristics may include, without being limited to, the name of the prohibited object, its associated threat level, the recommended handling procedure when such a prohibited object is detected and any other suitable information. In a specific implementation, the threat level information associated to the target object conveys the relative threat level of a prohibited object compared to other prohibited objects in the database of prohibited objects 110. For example, a gun would be given a relatively high threat level, while a metallic nail file would be given a relatively low threat level and a pocket knife would be given a threat level between that of the nail file and the gun. Optionally still, each entry in the database of prohibited objects 110 is also associated to a respective prohibited object identifier data element.
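By way of a non-limiting illustration only, the following Python sketch shows one possible in-memory representation of such database entries. The class name, field names and sample values are illustrative assumptions and are not drawn from the specification.

```python
# A minimal sketch of one possible representation of entries in the
# database of prohibited objects 110. All names and values below are
# illustrative assumptions, not part of the specification.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProhibitedObjectEntry:
    object_id: str            # prohibited object identifier data element
    name: str                 # name of the prohibited object
    threat_level: int         # relative threat level (higher = more severe)
    handling_procedure: str   # recommended handling procedure upon detection
    target_images: List[str] = field(default_factory=list)  # images in various orientations

# Example entries mirroring the relative threat levels described above.
DATABASE_110 = [
    ProhibitedObjectEntry("obj-001", "gun", 9, "evacuate and alert authorities",
                          ["gun_0deg.png", "gun_45deg.png"]),
    ProhibitedObjectEntry("obj-002", "pocket knife", 5, "confiscate item",
                          ["knife_0deg.png"]),
    ProhibitedObjectEntry("obj-003", "metallic nail file", 2, "confiscate item",
                          ["nail_file_0deg.png"]),
]
```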
In the case of luggage screening (in an airport facility for example) the prohibited object typically constitutes a potential threat to the safety of the passenger or aircraft.
In the case of mail parcel screening, the prohibited object is typically an object that is normally not permitted to be sent through the mail, such as guns (in Canada, for example) due to registration requirements, permits and so on.
In a non-limiting example of implementation, the database of prohibited objects 110 includes one or more entries associated to objects which are not prohibited but which may represent potential threats. For example, the presence of a metal plate or a metal canister in a piece of luggage going through luggage security screening is not prohibited in itself. However such objects may conceal one or more dangerous objects. As such, it is desirable to be able to detect the presence of such objects in a receptacle so as to bring them to the attention of the security screeners.
The specific design and content of the database of prohibited objects 110 may vary from one implementation to the next without detracting from the spirit of the invention. The design of the database is not critical to the present invention and as such will not be described further here.
Although the database of prohibited objects 110 has been shown in
Output Module 108
In a specific example of implementation, the output module 108 displays to a user of the system 100 a user interface conveying an enhanced image of contents of the receptacle 104 for facilitating visual identification of a prohibited object.
A specific example of implementation of the output module 108 is shown in
The output device 202 may be any device suitable for conveying an enhanced image of contents of the receptacle to a user of the system 100. In a specific example of implementation, the output device 202 is in communication with the apparatus implementing a user interface module 200 and includes a display unit adapted for displaying in visual format information related to the presence of a prohibited object in the receptacle 104. The display unit may be part of a computing station or may be integrated into a hand-held portable device for example. In another specific example of implementation, the output device 202 includes a printer adapted for displaying in printed format information related to the presence of a prohibited object in the receptacle 104. The person skilled in the art will readily appreciate, in light of the present specification, that other suitable types of output devices may be used here without detracting from the spirit of the invention.
The apparatus implementing a user interface module 200 receives data conveying an image of the contents of a receptacle derived from the image generation apparatus 102.
The apparatus 200 also receives information from the automated threat detection processor 106 indicating an area of interest in the image potentially containing a prohibited object. Optionally, the information received from the automated threat detection processor 106 also conveys a level of confidence associated to the area of interest that the area of interest contains a prohibited object. Optionally still, the information received from the automated threat detection processor 106 also conveys an identification of the prohibited object potentially detected in the image.
In a specific example of implementation, the information received from the automated threat detection processor 106 conveying the area of interest includes location information conveying a location in the image of the contents of a receptacle derived from the image generation apparatus 102.
In a first non-limiting example of implementation, the location information is an (X,Y) pixel location conveying the center of an area in the image. The area of interest is established based on the center location (X,Y) provided by the automated threat detection processor 106 in combination with a shape for the area. The shape of the area may be pre-determined, in which case it may be of any suitable geometric shape and size. Alternatively, the shape and/or size of the area of interest may be determined by the user on the basis of a user configuration command.
In a second non-limiting example of implementation, the shape and/or size of the area of interest is determined on the basis of information provided by the automated threat detection processor 106. For example, the information may convey a plurality of (X,Y) pixel locations defining an area in the image of the contents of a receptacle. In such a case, the information received will convey both the shape of the area of interest in the image and the position of the area of interest in that image.
In yet another non-limiting example of implementation, the automated threat detection processor 106 may provide an indication of a type of prohibited object potentially identified in the receptacle being screened in addition to a location of that potentially prohibited object in the image. Based on this potentially identified prohibited object, an area of interest having a shape and size conditioned on the basis of the potentially identified prohibited object may be determined.
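The following Python sketch, provided as a non-limiting illustration, shows one possible way of constructing an area of interest from an (X,Y) center location combined with a pre-determined shape and size, as in the first example above. The function name, default values and the use of the NumPy library are implementation assumptions.

```python
# A minimal sketch, assuming the area of interest is built from an (X, Y)
# center location supplied by the automated threat detection processor,
# combined with a pre-determined shape and size.
import numpy as np

def area_of_interest_mask(center_xy, image_shape, shape="rectangle", size=(120, 80)):
    """Return a boolean mask (True inside the area of interest)."""
    mask = np.zeros(image_shape, dtype=bool)
    cx, cy = center_xy          # pixel location of the center of the area
    w, h = size                 # pre-determined (or user-configured) size
    if shape == "rectangle":
        x0, x1 = max(cx - w // 2, 0), min(cx + w // 2, image_shape[1])
        y0, y1 = max(cy - h // 2, 0), min(cy + h // 2, image_shape[0])
        mask[y0:y1, x0:x1] = True
    elif shape == "ellipse":
        yy, xx = np.ogrid[:image_shape[0], :image_shape[1]]
        mask[((xx - cx) / (w / 2)) ** 2 + ((yy - cy) / (h / 2)) ** 2 <= 1.0] = True
    return mask
```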
A functional block diagram of apparatus 200 is depicted in
The apparatus 200 implements a user interface module for facilitating visual identification of a prohibited object in an image during security screening. A specific example of a method implemented by the apparatus 200 will now be described with reference to
As depicted in
The first input 304 is for receiving data conveying an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation. In a specific implementation, the image signal is derived from a signal generated by the image generation apparatus 102 (shown in
The second input 306 is for receiving information from an automated threat detection processor indicating an area of interest in the image potentially containing a prohibited object. In a specific implementation, the information is provided by the automated threat detection processor 106. The type of information received at the second input 306 depends on the specific implementation of the automated threat detection processor 106 and may vary from one implementation to the next without detracting from the spirit of the invention. Examples of the type of information that may be received include information on the position of the prohibited object detected within the image, information about the level of confidence of the detection and data allowing identifying the prohibited object detected.
The user input 308, which is an optional feature, is for receiving signals from a user input device, the signals conveying commands for controlling the type of information displayed by the user interface module or for annotating the information displayed. Any suitable user input device for providing user commands may be used such as, for example, a mouse, keyboard, pointing device, speech recognition unit or touch sensitive screen.
The processing unit 300 is in communication with the first input 304, the second input 306 and the user input 308 and implements a user interface module for facilitating visual identification of a prohibited object in an image of contents of a receptacle.
More specifically, the processing unit 300 is adapted for processing the image received at the first input 304 to generate an enhanced image based at least in part on the information received at the second input 306 and optionally on commands received at the user input 308.
In a specific non-limiting example of implementation, the processing unit 300 is adapted for generating an image mask on the basis of the information received indicating an area of interest in the image. The image mask includes a first enhancement area corresponding to the area of interest and a second enhancement area corresponding to portions of the image outside the area of interest. The image mask allows applying a different type of image enhancement processing to portions of the image corresponding to the first enhancement area and the second enhancement area to generate an enhanced image.
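As a non-limiting illustration, the following Python sketch shows one possible realization of such an image mask and of applying different image enhancement processing to each enhancement area. The image is assumed to be a NumPy array, and the `emphasize` and `de_emphasize` callables are placeholders for the techniques described below.

```python
# A minimal sketch of an image mask with a first enhancement area (the
# area of interest) and a second enhancement area (the remainder of the
# image), each processed by a different function. The image is assumed
# to be an HxWx3 NumPy array of 8-bit pixels.
import numpy as np

def apply_image_mask(image, roi_mask, emphasize, de_emphasize):
    first_area = roi_mask              # corresponds to the area of interest
    second_area = ~roi_mask            # portions outside the area of interest
    enhanced = image.astype(np.float32)
    enhanced[first_area] = emphasize(enhanced[first_area])
    enhanced[second_area] = de_emphasize(enhanced[second_area])
    return np.clip(enhanced, 0, 255).astype(np.uint8)

# Example usage: brighten the area of interest, fade the rest toward white.
# enhanced = apply_image_mask(image, mask,
#                             emphasize=lambda p: 1.3 * p,
#                             de_emphasize=lambda p: 230 + 0.4 * (p - 230))
```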
FIGS. 13a to 13g depict various illustrative examples of images and corresponding enhanced images generated by the processing unit 300 in accordance with specific examples of implementation of the invention.
FIG. 13a depicts a first exemplary image 400 conveying contents of a receptacle that was generated by an x-ray machine. The processing unit 300 processes the first exemplary image 400 to derive information conveying an area of interest, denoted as area of interest 402 in the figure.
FIG. 13c depicts a second exemplary image 410 conveying contents of another receptacle that was generated by an x-ray machine. The processing unit 300 processes the second exemplary image 410 to derive information conveying a plurality of areas of interest, denoted as areas of interest 462a, 462b and 462c in the figure.
FIG. 13e depicts a third example of an illustrative image 1300 conveying contents of a receptacle. The processing unit 300 processes the image 1300 to derive information conveying an area of interest, denoted as area of interest 1302 in the figure.
De-emphasizing Portions Outside the Area of Interest
In a first example of implementation, the processing unit 300 processes the image received at input 304 to generate an enhanced image wherein portions outside the area of interest, conveyed by the information received at second input 306, are visually de-emphasized. Any suitable image manipulation technique for de-emphasizing the visual appearance of portions of the image outside the area of interest may be used by the processing unit 300. Such image manipulation techniques are well-known in the art and as such will not be described in detail here.
In a specific example, the processing unit 300 processes the image to attenuate portions of the image outside the area of interest. In a non-limiting example, the processing unit 300 processes the image to reduce contrasts between feature information appearing in portions of the image outside the area of interest and background information appearing in portions of the image outside the area of interest. Alternatively, the processing unit 300 processes the image to remove features from portions of the image outside the area of interest. In yet another alternative embodiment, the processing unit 300 processes the image to remove all features appearing in portions of the image outside the area of interest such that only features in the areas of interest remain in the enhanced image.
In another example, the processing unit 300 processes the image to overlay or replace portions of the image outside the area of interest with a pre-determined visual pattern. The pre-determined visual pattern may be a suitable textured pattern or may be a uniform pattern. The uniform pattern may be a uniform color or other uniform pattern.
In yet another example, where the image includes color information, the processing unit 300 processes the image to modify color information associated to features of the image appearing outside the area of interest. In a non-limiting example of implementation, portions of the image outside the area of interest are converted into grayscale or other monochromatic color palette.
In yet another example of implementation, the processing unit 300 processes the image to reduce the resolution associated to portions of the image outside the area of interest. This type of image manipulation results in portions of the enhanced image outside the area of interest appearing blurred compared to portions of the image inside the area of interest.
In yet another example of implementation, the processing unit 300 processes the image to shrink portions of the image outside the area of interest such that at least some features of the enhanced image located inside the area of interest appear on a larger scale than features in portions of the enhanced image located outside the area of interest.
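As a non-limiting illustrative sketch only, the following Python function combines three of the de-emphasis techniques described above: contrast attenuation, grayscale conversion and resolution reduction outside the area of interest. The use of the NumPy and Pillow libraries, as well as the function name and parameter values, are implementation assumptions.

```python
# A non-limiting sketch of de-emphasizing portions of the image outside
# the area of interest; roi_mask is an HxW boolean array, True inside
# the area of interest, and image_rgb is an HxWx3 uint8 array.
import numpy as np
from PIL import Image, ImageFilter

def de_emphasize_outside(image_rgb, roi_mask, attenuation=0.4, blur_radius=3):
    img = image_rgb.astype(np.float32)
    outside = ~roi_mask

    # 1. Attenuate: pull pixel values outside the area of interest toward
    #    a light background value so features there appear paler.
    background = 230.0
    img[outside] = background + attenuation * (img[outside] - background)

    # 2. Convert the outside portion to a monochromatic (grayscale) palette.
    gray = np.repeat(img.mean(axis=2, keepdims=True), 3, axis=2)
    img[outside] = gray[outside]

    # 3. Reduce apparent resolution: blur the image and keep the blurred
    #    pixels only outside the area of interest.
    blurred = np.asarray(
        Image.fromarray(img.astype(np.uint8)).filter(
            ImageFilter.GaussianBlur(blur_radius)),
        dtype=np.float32)
    img[outside] = blurred[outside]

    return img.astype(np.uint8)
```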
It will be appreciated that the above-described techniques for de-emphasizing the visual appearance of portions of the image outside the area of interest may be used individually or in combination with one another. It will also be appreciated that the above described exemplary techniques for de-emphasizing the visual appearance of portions of the image outside the area of interest are not meant as an exhaustive list of such techniques and that other suitable techniques may be used without detracting from the spirit of the invention.
Emphasizing Features Appearing Inside the Area of Interest
In a second example of implementation, the processing unit 300 processes the image received at input 304 to generate an enhanced image wherein features appearing inside the area of interest, conveyed by the information received at step 402, are visually emphasized. Any suitable image manipulation technique for emphasizing the visual appearance of features of the image inside the area of interest may be used. Such image manipulation techniques are well-known in the art and as such will not be described in detail here.
In a specific example, the processing unit 300 processes the image to increase contrasts between feature information appearing in portions of the image inside the area of interest and background information appearing in portions of the image inside the area of interest. For example, contour lines defining objects inside the area of interest are made to appear darker and/or thicker compared to the background. In a non-limiting example, contrast-stretching tools with settings highlighting the metallic content of portions of the image inside the area of interest are used to enhance the appearance of such features.
In another specific example, the processing unit 300 processes the image to overlay portions of the image inside the area of interest with a pre-determined visual pattern. The pre-determined visual pattern may be a suitable textured pattern or may be a uniform pattern. The uniform pattern may be a uniform color or other uniform pattern. In a non-limiting example, portions of the image inside the area of interest are highlighted by overlaying the area of interest with a brightly colored pattern. Preferably the visual pattern has transparent properties in that a user can see features of the image in portions of the image inside the area of interest through the visual pattern once the pattern is overlaid on the image.
In another non-limiting example, the processing unit 300 processes the image to modify color information associated to features of the image appearing inside the area of interest. For example, colors of features of the image appearing inside the area of interest may be made to appear brighter or may be replaced by other more visually contrasting colors. In particular, the color associated to metallic objects in an x-ray image may be made to appear more prominent by either replacing it with a different color or changing its intensity. For example, the processing unit 300 may transform features appearing in blue inside the area of interest such that these same features appear in red in the enhanced image.
In another non-limiting example, processing unit 300 processes the image to enlarge a portion of the image inside the area of interest such that at least some features of the enhanced image located inside the area of interest appear on a larger scale than features in portions of the enhanced image located outside the area of interest.
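The following Python sketch, offered as a non-limiting illustration, combines two of the emphasis techniques described above: a linear contrast stretch of the area of interest and a translucent colored overlay through which the underlying features remain visible. Names and parameter values are illustrative assumptions.

```python
# A non-limiting sketch of emphasizing features inside the area of
# interest (NumPy assumed; overlay color and alpha are illustrative).
import numpy as np

def emphasize_inside(image_rgb, roi_mask, overlay_rgb=(255, 255, 0), alpha=0.25):
    img = image_rgb.astype(np.float32)
    inside = roi_mask

    # Contrast stretch: spread the pixel values inside the area of
    # interest over the full 0..255 range.
    lo, hi = img[inside].min(), img[inside].max()
    if hi > lo:
        img[inside] = (img[inside] - lo) * (255.0 / (hi - lo))

    # Translucent overlay: blend a bright color over the area of interest
    # while keeping the features visible through the pattern.
    img[inside] = (1 - alpha) * img[inside] + alpha * np.asarray(overlay_rgb, dtype=np.float32)

    return np.clip(img, 0, 255).astype(np.uint8)
```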
It will be appreciated that the above described techniques for emphasizing the visual appearance of portions of the image inside the area of interest may be used individually or in combination with one another or with other suitable techniques without detracting from the spirit of the invention. For example, processing the image may include modifying color information associated to features of the image appearing inside the area of interest and enlarging a portion of the image inside the area of interest. It will also be appreciated that the above described exemplary techniques for emphasizing portions of the image inside the area of interest are not meant as an exhaustive list of such techniques and that other suitable techniques may be used without detracting from the spirit of the invention.
Concurrently De-emphasizing Portions Outside the Area of Interest and Emphasizing Features Inside the Area of Interest
In addition, it will be appreciated that embodiments of the invention may also concurrently de-emphasize portions of the image outside the area of interest and emphasize features of the image inside the area of interest without detracting from the spirit of the invention.
Portions Surrounding the Area of Interest
Optionally, the processing unit 300 processes the image received at input 304 to modify portions of areas surrounding the area of interest to generate the enhanced image. In a specific example, modifying portions of areas surrounding the area of interest includes applying a blurring function to the edges surrounding the area of interest. In a specific example of implementation, the edges of the area of interest are blurred. Advantageously, blurring the edges of the area of interest accentuates the contrast between the area of interest and the portions of the image outside the area of interest.
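As a non-limiting sketch of one way to achieve such blurred edges, the following Python function smooths the boundary of the area of interest by blurring the mask itself and using it as a per-pixel blending weight between an emphasized version and a de-emphasized version of the image. The approach and all names are implementation assumptions.

```python
# A non-limiting sketch of blurring the edges of the area of interest:
# the boolean mask is blurred and used as a per-pixel blending weight
# (NumPy and Pillow assumed).
import numpy as np
from PIL import Image, ImageFilter

def blend_with_soft_edges(emphasized, de_emphasized, roi_mask, edge_radius=8):
    soft = np.asarray(
        Image.fromarray((roi_mask * 255).astype(np.uint8)).filter(
            ImageFilter.GaussianBlur(edge_radius)),
        dtype=np.float32) / 255.0
    soft = soft[..., None]  # HxW -> HxWx1 so the weight broadcasts over RGB
    blended = (soft * emphasized.astype(np.float32)
               + (1.0 - soft) * de_emphasized.astype(np.float32))
    return blended.astype(np.uint8)
```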
Multiple Areas of Interest
It will be appreciated that, although the above described examples describe situations in which a single area of interest is conveyed by the information received from the automated threat detection processor 106, implementations of the invention adapted for processing information indicating a plurality of areas of interest in the image are within the scope of the invention. As such, the processing unit 300 is adapted for receiving at input 306 information from an automated threat detection processor, such as automated threat detection processor 106, indicating a plurality of areas of interest in the image potentially containing respective prohibited objects. The processing unit 300 then processes the image received at input 304 to generate the enhanced image. The processing of the image is performed using the same principles as those described above with reference to information conveying a single area of interest. The person skilled in the art will readily appreciate, in light of the present description, the manner in which the processing unit 300 may be adapted for processing information conveying a plurality of areas of interest without requiring further guidance.
Returning to
Graphical User Interface Module Example
With reference to
As depicted, the user interface module provides a viewing window 500 including a viewing space 570 for displaying an enhanced image 502 wherein areas of interest 504a and 504b are displayed to the user in a visually contrasting manner relative to portions of the image outside the areas of interest 506. In this fashion, an operator's attention can be focused on the areas of interest 504a and 504b of the image, which are the areas most likely to contain prohibited objects.
In the example depicted, portions of the image outside the areas of interest 504a and 504b have been de-emphasized. Amongst possible other processing, portions of the image outside the areas of interest 504a and 504b, generally designated with reference numeral 506, have been attenuated by reducing contrasts between the features and the background. These portions appear paler relative to the areas of interest 504a and 504b. In the example depicted, features depicted in the areas of interest 504a and 504b have also been emphasized by using contrast-stretching tools to increase the level of contrast between the features depicted in the areas of interest 504a and 504b and the background. Finally, as depicted in the figure, the edges 508a and 508b surrounding the areas of interest 504a and 504b have been blurred to accentuate the contrast between the areas of interest 504a and 504b and the portions of the image outside the areas of interest 504a and 504b. The location of the areas of interest 504a and 504b was derived on the basis of the information received at input 306 (shown in
Optionally, the user interface module also provides a set of controls 510, 512, 514, 516, 550, 518 and 520 for allowing a user to provide commands for modifying features of the graphical user interface module to change the appearance of the enhanced image 502 displayed in the viewing window 500.
In a specific implementation, the controls in the set of controls 510, 512, 514, 516, 550 and 518 allow the user to change the appearance of the enhanced image displayed in the viewing window 500 by using an input device in communication with the apparatus 200 (shown in
It will be apparent that certain controls in the set of controls 510, 512, 514, 516, 550 and 518 may be omitted from certain implementations and that additional controls may be included in alternative implementations of a user interface without detracting from the spirit of the invention.
In the specific example of implementation depicted, functionality is provided to the user for allowing the latter to select for display in viewing window 500 the “original” image received at input 304 of apparatus 200 (shown in
In the specific example of implementation depicted, functionality is also provided to the user for allowing the latter to select a level of enlargement from a set of possible levels of enlargement to be applied to the image in order to derive the enhanced image for display in the viewing window 500. The functionality allows the user to independently control the scale of features appearing in areas of interest 504a and 504b relative to the scale of features in portions of the image outside the areas of interest 504a and 504b. In a specific example, such functionality may be enabled by displaying a control on the user interface allowing a user to effect the selection of the level of enlargement. In
In another specific example of implementation, not depicted in the figure, functionality is also provided to the user for allowing the latter to select a zoom level to be applied to derive the enhanced image 502 for display in the viewing window 500. The functionality allows the user to change the zoom amount to be applied to the image depicted in the viewing space of viewing window 500. This zoom level functionality differs from the level of enlargement functionality described above, and enabled by buttons 512 and 514, in that the zoom level functionality affects the entire image with a selected zoom level. In other words, modifying the zoom level does not affect the relative scale between the areas of interest and portions of the image outside the areas of interest. In a specific example, such functionality may be enabled by displaying a control on the user interface allowing a user to effect the selection of the zoom level. Any suitable type of control for allowing a user to select a zoom level may be envisaged in specific implementations of the user interface module.
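To make the distinction concrete, the following Python sketch, provided as a non-limiting illustration and assuming the Pillow and NumPy libraries, contrasts a uniform zoom of the entire image with an enlargement applied only to the area of interest. All names and values are illustrative assumptions.

```python
# Zoom scales the whole image, leaving the relative scale of the area of
# interest unchanged; enlargement scales only the area of interest.
import numpy as np
from PIL import Image

def zoom_entire_image(image_rgb, zoom=2.0):
    h, w = image_rgb.shape[:2]
    resized = Image.fromarray(image_rgb).resize((int(w * zoom), int(h * zoom)))
    return np.asarray(resized)

def enlarge_area_of_interest(image_rgb, box, enlargement=2.0):
    # box = (x0, y0, x1, y1) bounding the area of interest
    img = Image.fromarray(image_rgb)
    x0, y0, x1, y1 = box
    roi = img.crop(box).resize(
        (int((x1 - x0) * enlargement), int((y1 - y0) * enlargement)))
    # Paste the enlarged region back, centered on the original area.
    cx, cy = (x0 + x1) // 2, (y0 + y1) // 2
    img.paste(roi, (cx - roi.width // 2, cy - roi.height // 2))
    return np.asarray(img)
```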
In the specific example of implementation depicted, functionality is also provided to the user for allowing the latter to select a level of enhancement from a set of possible levels of enhancement. The functionality allows the user to independently control the type of enhancement to be applied to the original image to generate the enhanced image for display in the viewing window 500. In a specific example of implementation, the set of possible levels of enhancement includes at least two levels of enhancement. In a non-limiting example, one of the levels of enhancement is a "NIL" level wherein the areas of interest are not emphasized and the portions of the images outside the areas of interest are not de-emphasized. In other examples of implementation, the set of possible levels of enhancement includes two or more distinct levels of enhancement other than the "NIL" level. In a specific example of implementation, each level of enhancement in the set of levels of enhancement is adapted for causing an enhanced image to be derived wherein:
For example, the different levels of enhancement may cause the processing unit 300 to apply different types of image processing functions or different degrees of image processing so as to modify the appearance of the enhanced image depicted in the viewing window 500. Advantageously, this allows the user to adapt the appearance of the enhanced image 502 either based on user preferences or in order to view an image in a different manner so as to facilitate visual identification of a prohibited object. In a specific example, the above-described functionality may be enabled by providing a control on the user interface allowing a user to effect the selection of the level of enhancement. In
In a specific example of implementation, not shown in the figures, functionality is also provided to the user for allowing the latter to independently control the amount of enhancement to be applied to the area(s) of interest of the images and the amount of enhancement to be applied to portions of the image outside of the area(s) of interest. In a specific example, the above-described functionality may be enabled by providing on a user interface a first user control for enabling the user to select a first level of enhancement and a second user control for allowing the user to select a second level of enhancement. The processing unit generates the enhanced image such that:
Optionally still, the user interface module is adapted for displaying a control 518 for allowing a user to modify other configuration elements of the user interface. In accordance with a non-limiting specific implementation, actuating control 518 causes the user interface module to display a control window 600 of the type depicted in
The person skilled in the art in light of the present description will readily appreciate that other options may be provided to the user and that certain options described above may be omitted from certain implementations without detracting from the spirit of the invention. As a variant, certain options may be selectively provided to certain users or, alternatively, may require a password to be modified. For example, the setting threshold sensitivity/confidence level 608 may only be made available to users having certain privileges (for example, screening supervisors or security directors). As such, the user interface module may include some type of user identification functionality, such as a login process, to identify the user of the screening system. Alternatively, the user interface module, upon selection by the user of the setting threshold sensitivity/confidence level 608 option, may prompt the user to enter a password for allowing the user to modify the detection sensitivity level of the screening system.
Optionally still, the user interface module is adapted for displaying a control 520 for allowing a user to login/log-out of the system in order to provide user identification functionality. Manners in which user identification functionality can be provided are well-known in the art and are not critical to the present invention and as such will not be described further here.
Optionally still, not shown in the figures, the user interface module is adapted to allow the user to add complementary information to the information being displayed on the user interface. In a specific example of implementation, the user is enabled to insert markings in the form of text and/or visual indicators in the image displayed on the user interface. The marked-up image may then be transmitted to a third party location, such as a checking station, so that the checking station is alerted to verify the marked portion of the receptacle to locate a prohibited object. In such an implementation, the user input 308 (depicted in
Previously Screened Receptacles
In accordance with a specific example of implementation, apparatus 200 shown in
The generation of a record may be effected for all receptacles being screened or for selected receptacles only. In practical implementations of the invention, in particular in areas where traffic levels are high and a large number of receptacles are screened, it may be preferred to selectively store the images of the receptacles rather than storing images for all the receptacles. The selection of which images to store may be effected by the user of the user interface by providing a suitable control on the user interface for receiving user commands to that effect. Alternatively, the selection of which images to store may be effected on the basis of information received from the automated threat detection processor 106. For example, a record may be generated for a given receptacle when a prohibited object was potentially detected in the receptacle, as could be conveyed by a signal received from the automated threat detection processor 106.
A process for facilitating visual identification of prohibited objects in images associated with previously screened receptacles is depicted in FIG. 7.
As shown, at step 700, a plurality of records associated with previously screened receptacles is provided. In a non-limiting example of implementation, apparatus 200 enables step 700 by providing the memory 350 for storing a plurality of records associated with respective previously screened receptacles. As described above, each record includes an image of the contents of a receptacle derived from an apparatus that scans the receptacle with penetrating radiation, as well as information derived from an automated threat detection processor and indicating an area of interest in the image potentially containing a prohibited object.
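Purely for illustration, such a record could be modelled as follows (a minimal Python sketch; the class and field names are assumptions and do not appear in the specification):

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional
    import numpy as np

    @dataclass
    class ScreeningRecord:
        image: np.ndarray            # image derived from the penetrating-radiation scan
        interest_mask: np.ndarray    # area of interest reported by the threat detector
        time_stamp: datetime         # time at which the receptacle was screened
        confidence: Optional[float] = None      # optional confidence level
        object_identity: Optional[str] = None   # optional identity of the object

    # Memory 350 can then be modelled as a simple collection of such records.
    memory_350 = []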
At step 702, a set of thumbnail images derived from the plurality of records is displayed in viewing space 572. In a specific example of implementation, processing unit 300 is adapted for displaying a set of thumbnail images 522 in viewing space 572, each thumbnail image 526a, 526b, 526c in the set of thumbnail images 522 being derived from a record in the plurality of records stored in memory unit 350 (shown in FIG. 3).
At step 704, a user is enabled to select at least one thumbnail image from the set of thumbnail images 522. The selection may be effected on the basis of the images themselves or by allowing the user to specify a time or time period associated with the records. In the specific example depicted, the user can select a thumbnail image from the set of thumbnail images 522 by using a user-input device to actuate the desired thumbnail image. Any suitable user input device for providing user commands may be used such as, for example, a mouse, keyboard, pointing device, speech recognition unit or touch sensitive screen.
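The time-based selection mentioned above could be sketched, under the same illustrative record structure introduced earlier, as a simple filter over the stored records:

    from datetime import datetime

    def records_in_period(records, start, end):
        # Return the stored records whose time stamp falls within the
        # user-specified period [start, end].
        return [r for r in records if start <= r.time_stamp <= end]

    # e.g. thumbnails for a morning shift:
    # selected = records_in_period(memory_350,
    #                              datetime(2006, 1, 1, 8, 0),
    #                              datetime(2006, 1, 1, 12, 0))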
At step 706, an enhanced image derived from the record corresponding to the selected thumbnail image is displayed in a second viewing space 570. More specifically, in response to a selection of a thumbnail image from the set of thumbnail images, an enhanced image derived from the record corresponding to the selected thumbnail image is displayed in viewing space 570. When multiple thumbnail images are selected, the corresponding enhanced images may be displayed concurrently with one another or may be displayed separately in viewing space 570.
The enhanced image derived from the record corresponding to the selected thumbnail image may be derived in a manner similar to that described previously in the present specification. For example, for a given record in the database of records including a certain image and information conveying a certain area of interest in the image, portions of the certain image outside the certain area of interest may be visually de-emphasized to generate the enhanced image. In a second example of implementation, features appearing inside the certain area of interest are visually emphasized to generate the enhanced image. In yet another example, the portions of the image outside the certain area of interest are visually de-emphasized and features appearing inside the certain area of interest are visually emphasized to generate the enhanced image. The manners in which the portions of the certain image outside the certain area of interest may be visually de-emphasized and features appearing inside the certain area of interest may be visually emphasized have been previously described in the present application and as such will not be described further here.
In the specific example of implementation depicted, functionality is also provided to the user for scrolling through a plurality of thumbnail images so that different sets of thumbnail images may be displayed in viewing space 572. In a specific example, such functionality may be enabled by displaying a control on the user interface allowing a user to scroll through the plurality of thumbnail images. In the example depicted in FIG. 5, such a control is provided in connection with viewing space 572.
Optionally, each thumbnail image in the set of thumbnail images conveys information derived from an associated time stamp data element. In the example depicted in FIG. 5, each thumbnail image is displayed in combination with an indication of the time at which the corresponding receptacle was screened.
Optionally, the user interface module implemented by apparatus 200 (shown in
Optionally, as depicted in
Automated Threat Detection Processor 106
The automated threat detection processor 106 shown in FIG. 8 includes a pre-processing module 800, an image comparison module 802, an area of interest locator module 804 and an output signal generator module 806, as well as a first input 810, a second input 814 and an output 812.
In a specific example of implementation of the invention, either one or both of the area of interest locator module 804 and the image comparison module 802 may generate information conveying an area of interest. In a non-limiting example of implementation, the area of interest locator module 804 is adapted for generating information conveying one or more regions of interest based on characteristics inherent to the image conveying the contents of a receptacle. In a non-limiting example of implementation where the image is an x-ray image, the characteristics inherent to the image include, without being limited to, density information conveyed by the x-ray image. Conversely, in this non-limiting example of implementation, the image comparison module 802 is adapted for generating information conveying one or more regions of interest based on a comparison between the image conveying the contents of a receptacle and images in a database of target objects. It will be readily appreciated that specific examples of implementation may omit either one of the image comparison module 802 and the area of interest locator module 804 for implementing the functionality for generating information conveying an area of interest, without detracting from the spirit of the invention.
The first input 810 is for receiving data conveying an image of the contents of a receptacle from the image generation apparatus 102 (shown in FIG. 1).
The second input 814 is for receiving target images from the database of prohibited objects 110. It will be appreciated that in embodiments where the database of prohibited objects 110 is part of the automated threat detection processor 106, the second input 814 may be omitted.
The output 812 is for releasing information for transmittal to output module 108 indicating an area of interest in the image potentially containing a prohibited object. Optionally, the information released also conveys a level of confidence that the area of interest contains a prohibited object, as well as the identity of the prohibited object potentially detected.
The processing unit of the automated threat detection processor 106 receives the data conveying an image of the contents of the receptacle 104 from the first input 810 and processes that image to derive an area of interest in the image and, optionally, to identify a prohibited object in the receptacle 104. The processing unit of the automated threat detection processor 106 generates and releases at output 812 information conveying an area of interest in the image and, optionally, information conveying the identity of a detected prohibited object.
The process implemented by the various functional elements of the processing unit of the automated threat detection processor 106 is depicted in FIG. 9. At step 901, the pre-processing module 800 processes the data conveying the image of the contents of the receptacle 104 to generate a modified image.
The complexity of the requisite level of pre-processing and the related trade-offs between speed and accuracy depend on the application. Examples of pre-processing may include, without being limited to, brightness and contrast manipulation, histogram modification, noise removal and filtering amongst others. It will be appreciated that all or part of the functionality of the pre-processing module 800 may actually be external to the automated threat detection processor 106, e.g., it may be integrated as part of the image generation apparatus 102 or as an external component. It will also be appreciated that the pre-processing module 800 (and hence step 901) may be omitted in certain embodiments of the present invention without detracting from the spirit of the invention. As part of step 901, the pre-processing module 800 releases data conveying a modified image of the contents of the receptacle 104 for processing by the image comparison module 802 and by the area of interest locator module 804.
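As a non-authoritative sketch of the kind of operations step 901 may perform, the following applies a brightness/contrast adjustment followed by histogram equalization to an 8-bit greyscale image (Python/numpy; the parameter values are illustrative):

    import numpy as np

    def preprocess(image, gain=1.2, bias=10):
        # Brightness and contrast manipulation on an 8-bit greyscale image.
        adjusted = np.clip(image.astype(np.float32) * gain + bias, 0, 255)
        adjusted = adjusted.astype(np.uint8)
        # Histogram modification: remap grey levels through the normalized
        # cumulative histogram (a simple histogram equalization).
        hist = np.bincount(adjusted.ravel(), minlength=256)
        cdf = hist.cumsum().astype(np.float64)
        cdf = (cdf - cdf.min()) * 255.0 / max(cdf.max() - cdf.min(), 1.0)
        return cdf.astype(np.uint8)[adjusted]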
At step 950, the area of interest locator module 804 processes the data conveying the modified image received from the pre-processing module 800 (or the data conveying the image of the contents of the receptacle received via the first input 810) to generate information conveying an area of interest in the image. The area of interest in the image is an area that potentially contains a prohibited object. Any suitable method for determining an area of the image (or modified image) of the contents of a receptacle that potentially contains a prohibited object may be used. In a specific example, the area of interest locator module 804 is adapted for generating information conveying an area of interest based on characteristics inherent to the input image. In a first specific example of implementation, the image is an x-ray image conveying information related to the material density associated with the contents of the receptacle. The area of interest locator module 804 is adapted to process the image, identify areas including a certain concentration of elements characterized by a certain material density, say for example metallic-type elements, and label these areas as areas of interest. Characteristics such as the size of the area exhibiting the certain density may also be taken into account to identify an area of interest. It will be apparent to the person skilled in the art that other suitable methods for identifying regions of interest in an image may be used. Many such methods are known in the art and as such will not be described further here.
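A minimal sketch of such a density-based locator is given below (Python, using scipy.ndimage for connected-component labelling; the threshold and size values are illustrative assumptions, and higher pixel values are assumed to correspond to denser material):

    import numpy as np
    from scipy import ndimage

    def locate_areas_of_interest(image, density_threshold=200, min_size=50):
        # Pixels at or above the threshold are treated as high-density
        # (e.g. metallic-type) material; the threshold is illustrative.
        dense = image >= density_threshold
        labels, _ = ndimage.label(dense)          # connected components
        areas = []
        for region in ndimage.find_objects(labels):
            ys, xs = region
            # Retain only regions whose bounding box is large enough,
            # reflecting the size criterion mentioned above.
            if (ys.stop - ys.start) * (xs.stop - xs.start) >= min_size:
                centre = ((xs.start + xs.stop) // 2, (ys.start + ys.stop) // 2)
                areas.append({"bbox": region, "centre_xy": centre})
        return areas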
At step 902, the image comparison module 802 verifies whether there remain any unprocessed target images in the database of prohibited objects 110. In the affirmative, the image comparison module 802 proceeds to step 903 where the next target image is accessed and the image comparison module 802 then proceeds to step 904. If at step 902 all target images in the database of prohibited objects 110 have been processed, the image comparison module 802 proceeds to step 908 and the process is completed.
At step 904, the image comparison module 802 compares the image (or modified image) of the contents of the receptacle 104 against the target image accessed at step 903 to determine whether a match exists. The comparison may be effected using any image processing algorithm suitable for comparing two images. Optionally, the comparison may make use of the area of interest information generated by the area of interest locator module to limit the comparison operation to the area of interest. Examples of algorithms that can be used to perform image processing and comparison include without being limited to:
A—Image Enhancement
B—Image Segmentation
C—General detection
D—Edge Detection
E—Morphological Image Processing
F—Frequency Analysis
G—Shape Analysis and Representations
H—Feature Representation and Classification
The above algorithms are well known in the field of image processing and as such will not be described further here.
In a specific example of implementation, the image comparison module 802 includes an edge detector to perform part of the comparison at step 904. In another specific example of implementation, the comparison performed at step 904 includes effecting a correlation operation between the image (or modified image) of contents of the receptacle and the target images in the database 110. In a specific example of implementation, the correlation operation is performed by an optical correlator. In an alternative example of implementation, the correlation operation is performed by a digital correlator. In yet another implementation, a combination of methods is used to effect the comparison of step 904. The results of the comparisons are then combined to obtain a joint comparison result.
The image comparison module 802 then proceeds to step 906, where the result of the comparison effected at step 904 is processed to determine whether a match exists between the image (or modified image) of the contents of receptacle 104 and the target image. In a specific example of implementation, the comparison at step 904 generates a score conveying a likelihood that there is a match between a portion of the image (or modified image) of the contents of receptacle 104 and the target image. A match is detected if the score obtained by the comparison at step 904 is above a certain threshold score. This score can also be considered as the confidence level associated with the detection of a match. In the absence of a match, the image comparison module 802 returns to step 902. In response to detection of a match, the image comparison module 802 triggers the output signal generator module 806 to execute step 910. The image comparison module 802 then returns to step 902 to continue processing with respect to the next target image.
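By way of illustration of a digital correlation of the kind mentioned at step 904, together with the threshold decision of step 906, the following Python/numpy sketch computes a circular cross-correlation peak via the FFT and compares it against a threshold score (the threshold value and normalization are illustrative choices, not those of the invention):

    import numpy as np

    def correlation_score(image, target):
        # Zero-pad the target to the image size, then compute the circular
        # cross-correlation via the FFT and take its peak value.
        padded = np.zeros(image.shape, dtype=np.float64)
        padded[:target.shape[0], :target.shape[1]] = target
        spectrum = np.fft.fft2(image) * np.conj(np.fft.fft2(padded))
        corr = np.fft.ifft2(spectrum).real
        # Normalize by the global energies; the resulting score is only
        # meaningful relative to an application-specific threshold.
        norm = np.linalg.norm(image) * np.linalg.norm(target)
        return corr.max() / max(norm, 1e-12)

    def is_match(image, target, threshold=0.5):
        # Step 906 style decision: declare a match when the score
        # (confidence level) exceeds the threshold score.
        score = correlation_score(image, target)
        return score >= threshold, score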
At step 910, the output signal generator module 806 generates information conveying the presence of a prohibited object in the receptacle 104, and the information is released at output 812. The information conveys positioning information associated with the prohibited object within the image received at input 810. The positioning information conveys an area of interest in the image where the prohibited object is potentially located, derived either from the area of interest locator module 804 or from the image comparison module 802. The information may be conveyed in any suitable format. In a non-limiting example, the information may convey a plurality of (X,Y) pixel locations defining an area in the image of the contents of a receptacle. In another non-limiting example of implementation, the information conveys an (X,Y) pixel location conveying the center of an area in the image. Optionally, the information released also conveys a level of confidence that the area of interest contains a prohibited object. Optionally still, an identification of the prohibited object potentially detected in the image is also provided.
Although the screening system was described above in connection with the screening of receptacles, the concepts described can also be applied to the screening of people.
For example, in an alternative embodiment, a system for screening people is provided. The system includes components similar to those described in connection with the system depicted in FIG. 1, adapted for scanning a person to detect prohibited objects located on that person.
Optionally, in the case of a system for screening people, the database of prohibited objects 110 may further include entries associated with non-prohibited objects and/or objects that do not represent a potential threat. Such entries may be used to detect objects commonly carried by people, such as cell phones, watches and rings, for example, which are neither prohibited nor threatening. Advantageously, by identifying such objects, unnecessary manual verifications can be avoided.
Specific Physical Implementation
Certain portions of the apparatus 200 for implementing a user interface (shown in FIG. 2) can be implemented on a general purpose digital computer including a processing unit and a memory connected by a communication bus, the memory storing data as well as program instructions which, when executed by the processing unit, implement the functionality described in the present specification.
Similarly, certain portions of the automated threat detection processor 106 can also be implemented on a general purpose digital computer having a structure similar to that described above in connection with the apparatus 200.
Alternatively, the above-described automated threat detection processor 106 can be implemented on a dedicated hardware platform where electrical/optical components implement the functional blocks described in the specification and depicted in the drawings. Specific implementations may be realized using ICs, ASICs, DSPs, FPGAs, an optical correlator, a digital correlator or other suitable hardware platforms.
Other alternative implementations of the automated threat detection processor 106 and the apparatus 200 for implementing a user interface can be realized as a combination of dedicated hardware and software, such as apparatus 1000 of the type depicted in FIG. 10.
It will be appreciated that the screening system 100 (depicted in FIG. 1) may also be of a distributed nature, whereby images of receptacles are obtained at one or more locations and transmitted over a network 1112 to a server system 1110 that implements the methods described above, the results being conveyed to one or more client systems 1102, 1104, 1106 and 1108, as depicted in FIG. 11.
The server system 1110 includes a program element 1116 for execution by a CPU. Program element 1116 includes functionality to implement the methods described above, including a method for displaying information associated with a receptacle and for facilitating visual identification of a prohibited object in an image during security screening, and includes the necessary networking functionality to allow the server system 1110 to communicate with the client systems 1102, 1104, 1106 and 1108 over network 1112. In a specific implementation, the client systems 1102, 1104, 1106 and 1108 include display units responsive to signals received from the server system 1110 for displaying a user interface implemented by the server system 1110.
Although the present invention has been described in considerable detail with reference to certain preferred embodiments thereof, variations and refinements are possible without departing from the spirit of the invention. Therefore, the scope of the invention should be limited only by the appended claims and their equivalents.