Probabilistic image analysis

Information

  • Patent Grant
  • Patent Number
    12,181,422
  • Date Filed
    Tuesday, September 15, 2020
  • Date Issued
    Tuesday, December 31, 2024
  • Inventors
  • Original Assignees
    • Rapiscan Holdings, Inc. (Hawthorne, CA, US)
  • Examiners
    • Kiknadze; Irakli
  • Agents
    • Novel IP
Abstract
A method for detecting at least one object of interest in at least one raw data x-ray image includes the steps of emitting an incident x-ray radiation beam through a scanning volume having an object therein, detecting x-ray signals transmitted through at least one of the scanning volume and the object, deriving the at least one raw data x-ray image from the detected x-ray signals, inputting the raw data x-ray image, expressed according to an attenuation scale, into a neural network, for each pixel in the raw data x-ray image, outputting from the neural network a probability value assigned to that pixel, and classifying each pixel in the raw data x-ray image into a first classification if the probability value associated with the pixel exceeds a predetermined threshold probability value and into a second classification if the probability value associated with the pixel is below the predetermined threshold probability value.
Description
CROSS-REFERENCE

The present application is a 371 national stage application of PCT Application Number PCT/CA2020/051239, titled “Probabilistic Image Analysis” and filed on Sep. 15, 2020.


TECHNICAL FIELD

The present disclosure generally relates to a system for detection of objects or materials. More particularly, the present disclosure relates to a system for detection of objects of interest using a probabilistic analysis technique.


BACKGROUND

Conventional X-ray detection usually relies on transmission signal levels or attenuation, or on the conversion of detected x-ray transmission signals into information representing the effective atomic number, mass attenuation, density or other property or characteristic of the material being scanned, provided, for example, by way of trace material detection. These values are then analyzed to detect the presence of certain materials which may be prohibited, such as drugs, or materials which may potentially be dangerous, such as explosive materials or the metal from weapons. However, the shape and the visual details of the prohibited or dangerous objects, which contain relevant information as to what the object might be, are not utilized in such an analysis.


When a trained operator looks at the image produced by an X-ray scanning machine or data provided by a trace detection device, it is the operator who performs the analysis to assess the presence of objects or materials of interest, such as potential threats, based on their combined shape and/or composition as interpreted on visual review. Manual reviews of this type are time-consuming and subject to human error. Accordingly, they are subject to a higher rate of false positive or false negative readings. Moreover, manual review does not produce data or information which can be used automatically to improve other review processes or to influence the behavior of other components operably connected to the X-ray scanning device or trace material detection device.


It is therefore desired to have a system which automatically recognizes objects or materials of interest in an inspected object, preferably in real-time or near real-time, and which produces useful information to be applied in future processes.


Machine learning has been applied in many ways for recognition of objects in images. Applications of machine learning have been contemplated for use in interpreting images produced by x-ray scans. As an improvement to the machine learning field, the machine learning sub-class known as “deep learning” aims to simulate human interpretation of image data. Deep learning is often characterized by the use of an algorithm or series of algorithms known as “artificial neural networks”, or simply “neural networks”.


In prior applications of machine learning to x-ray image analysis, observations have been represented in a variety of ways, such as a vector of each pixel intensity value, or more abstractly represented as a series of edges or regions of a particular shape, and the like. One advantage of deep learning applications to image analysis is that the neural networks may be trained in an unsupervised or semi-supervised manner to learn features and hierarchical feature extraction using efficient algorithms instead of manual acquisition of features. To simplify image analysis, a process known as “image segmentation” is used to split the input image information into segments that represent objects or parts of objects. This allows for analysis of the images in larger components.


Some conventional applications of neural networks to analyze x-ray scan images include identifying regions of a digital x-ray scan image which has been normalized and processed. A neural network may be used to identify one or more regions of the image that are likely to contain an object of interest. To do so, pixels may be analyzed in groups, possibly sequential groups, to identify one or more features indicative of an object of interest. Features may include, for example, edges, areas of a particular shape, concavities, convexities or any other aspect. The features identified in the pixel groups or “regions” of the image may then be input into a classification network to classify the object of interest according to one or more known objects. The classification network typically outputs one or more probabilities or “scores” that the object represented in the image belongs to a particular type or “class” of object.


Segmentation of an x-ray scan image by way of the features, such as those of shape, identified in the image is known as “instance segmentation”. Instance segmentation approaches to object classification include pre-classification steps associated with feature detection because such methods are used for “recognition” of objects. Therefore, they are typically more computationally intensive and can be slower to output a classification. In applications of x-ray scanning for security purposes, it is not necessarily required to “recognize” an object of interest, but rather only to “detect” the presence of an object of interest, such as detection of an object that could be classified as “a potential threat” or “not a potential threat.”


By foregoing the computationally intensive and time-consuming steps associated with object recognition, the process of detecting the presence of a potential threat may be accelerated.


SUMMARY

The present disclosure generally relates to a system for detection of objects or materials. More particularly, the present disclosure relates to a system for detection of objects of interest using a probabilistic analysis technique.


The present disclosure is in the context of probabilistic analysis of raw or unprocessed data in the form of x-ray scan images as produced by transmission x-ray scanning devices for inspection. The present disclosure would also apply to other forms of data which may be extracted from an inspected object. Such other forms of data may include images provided by dual-energy channel x-ray scans, multi-channel x-ray scans, trace material detection, millimeter wave scans, spectral analysis, x-ray diffraction information, x-ray backscatter images and any other means of inspection for extracting data suitable for analyzing the properties of an object subject to inspection. It should be further understood that the extracted data used for analysis may be unprocessed or processed data.


In one aspect, there is provided a method for detecting at least one object of interest in at least one raw data x-ray image. The method includes the steps of emitting an incident x-ray radiation beam through a scanning volume having an object therein; detecting x-ray signals transmitted through at least one of the scanning volume and the object; deriving the at least one raw data x-ray image from the detected x-ray signals; inputting the raw data x-ray image, expressed according to an attenuation scale, into a neural network; for each pixel in the raw data x-ray image, outputting from the neural network a probability value assigned to that pixel; and classifying each pixel in the raw data x-ray image into a first classification if the probability value associated with the pixel exceeds a predetermined threshold probability value and into a second classification if the probability value associated with the pixel is below the predetermined threshold probability value. The neural network may be a convolutional neural network. Further, the convolutional neural network may be a FC-Densenet.


After the deriving step, the step of inputting the raw data x-ray image expressed according to an attenuation scale may further comprise the steps of determining a transmittance value for each pixel in the raw data x-ray image; and determining an attenuation value from each transmittance value. The outputting step may output a probability map for each pixel in the raw data x-ray image.


The classifying step may use semantic segmentation. The first classification may indicate that the pixel is likely associated with a potential threat and the second classification may indicate that the pixel is not likely to be associated with a potential threat.


The method may further provide a colour-mapped image based on the probability map showing pixels classified in the first classification in a first colour scheme and pixels classified in the second classification in a second colour scheme. The first colour scheme and the second colour scheme may be at least one of flashing, shifting hue and shifting luma.


In another aspect, there is provided a system for detecting at least one object of interest in at least one raw data x-ray image. The system may include an x-ray emitter for emitting an incident x-ray radiation beam through a scanning volume having an object therein; at least one detector for detecting x-ray signals transmitted through at least one of the scanning volume and the object; at least one processor for deriving at least one raw data x-ray image from the detected x-ray signal; at least one processor configured to: input the raw data x-ray image, expressed according to an attenuation scale, into a neural network; output from the neural network a probability value assigned to each pixel in the raw data x-ray image; and classify each pixel in the raw data x-ray image into a first classification if the probability value associated with the pixel exceeds a predetermined threshold probability value and into a second classification if the probability value associated with the pixel is below the predetermined threshold probability value. The neural network may be configured to classify each pixel in the raw data x-ray image by way of semantic segmentation. The neural network may be a convolutional neural network. The convolutional neural network may be a FC-Densenet.


The at least one processor may be further configured to determine a transmittance value for each pixel in the raw data x-ray image; and determine an attenuation value from each transmittance value.


The at least one processor may be configured to output a probability map for each pixel in the raw data x-ray image. The at least one processor may further be configured to provide a colour-mapped image showing pixels in the first classification in a first colour scheme and pixels in the second classification in a second colour scheme.


In another aspect, there is provided a method for determining a presence of an object of interest. The method may include the steps of: deriving a raw data image representative of at least a portion of an object; inputting the raw data image, expressed according to an attenuation scale, into a neural network; for each pixel in the raw data image, outputting from the neural network a probability value assigned to that pixel; and classifying each pixel in the raw data image into a first classification if the probability value associated with the pixel exceeds a predetermined threshold probability value and into a second classification if the probability value associated with the pixel is below the predetermined threshold probability value.





BRIEF DESCRIPTION OF THE DRAWINGS

Exemplary non-limiting embodiments are described with reference to the accompanying drawings in which:



FIG. 1 is an illustration of an exemplary x-ray scanning device which may be used in accordance with the invention;



FIG. 2 is a diagram representation of a system which may be used in one aspect of the invention;



FIG. 3 is a flow chart diagram of the operational process according to one aspect of the invention;



FIG. 4 is a diagram representation of an artificial neural network as may be used in accordance with the invention; and,



FIG. 5 is a diagram representation of the training process for the artificial neural network according to one aspect of the invention.





DETAILED DESCRIPTION

The present disclosure generally relates to a system for detection of objects or materials. More particularly, the present disclosure relates to a system for detection of objects or materials of interest using a probabilistic analysis technique.


According to the aspect shown in FIG. 1, there is provided an exemplary x-ray scanning device 100. The x-ray scanning device 100 includes a housing 102 having openings 104 at either end thereof. The openings 104 provide access to a scanning chamber 106 passing through the housing 102. The system 100 may further include a displacement assembly 108, such as a conveyor, which extends through the scanning chamber 106 and which may be used to displace at least one object of interest to be scanned using the x-ray scanning device 100. The x-ray scanning device 100 further includes a source assembly 110. The source assembly 110 includes a source (not shown) for emitting electromagnetic radiation such as x-rays, a source assembly housing 112 at least partially enclosing the source, a pedestal 114 to which the source assembly housing 112 is mounted and a collimator 116 mounted to the source assembly housing 112 for directing x-rays emitted from the source. Collimator 116 may for example be a fan-shaped collimator for directing the x-rays in a fan-shaped beam. However, collimator 116 may be of any suitable shape and not only fan-shaped.


The x-ray scanning device 100 may further include a group of detectors including at least one detector 120 and preferably a plurality of detectors 120, each mounted to a bracket 122. In one aspect, the bracket is an L-shaped bracket which is positioned within the scanning chamber 106 such that the plurality of detectors 120 are mounted at least partially about the scanning chamber 106. In the aspect shown in FIG. 1, a single bracket 122 is mounted within the scanning chamber. In other aspects, the scanning chamber may include more than one bracket, and the brackets do not have to have the same orientation or angular position. It should be further understood that the bracket 122 does not have to be L-shaped. Rather, the bracket 122 may be linear or arc-shaped or any other suitable shape.


In some embodiments, each detector 120 includes a detector card having a center point and edges. The center point corresponds to the geometric center of the detector card. The edges of each detector card define the boundaries of the detector 120.


As shown in FIG. 2, each detector 120 may comprise a first scintillator 202, a filter 204 and a second scintillator 206. All of these may be sandwiched together as shown in FIG. 2 or may be otherwise suitably arranged. In a scanning operation, broad-spectrum x-rays are emitted by the source and are directed by the collimator 116 toward the plurality of detectors 120 within the scanning chamber 106. In the case of each detector 120, a plurality of the emitted x-rays encounters the first scintillator 202 which may be configured to detect the lower portion of the emitted x-ray signal spectrum. Residual low energy x-ray signals may then be stopped by the filter 204 and remaining x-ray signals from the emitted x-rays reach the second scintillator 206 which may be configured to detect a higher portion of the x-ray signal spectrum.


With further reference to FIG. 2, in one aspect, each of the scintillators 202, 206 converts the detected x-ray energy to light. Each of these scintillators 202, 206 is coupled with a photodiode 208 which captures the light from the respective scintillator 202, 206 and generates a corresponding analog electric signal, such as a photo current signal. The electric signal is further digitized by a converter 210. The digitized signal value is associated with a pixel of an image for providing a visual representation of a portion of an object within the scanning volume being scanned. The detectors thus measure to what degree the x-ray signal has attenuated due to passing through a defined inspection volume.


In the conversion of the light into an electric signal by the photodiodes 208, some uncertainties may be introduced in that a given light source may result in different electric signals, since every detector card reacts slightly differently to the presence or absence of the electromagnetic radiation of an x-ray. In order to correct these variations so that the final image appears more homogeneous, each pixel of the image may be normalized by correcting an offset and gain in the light conversion. Such a normalization procedure may be executed, for example, using a normalization module 212 as shown in FIG. 2 in order to compensate for slight variations in offset and gain for each detector, as well as to estimate the expected uncertainties in the low-energy and high-energy signals and/or attenuation for each detector.
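

As an illustration only, and not a procedure specified in this disclosure, a per-detector offset and gain correction of the kind described above might look like the following sketch; the array names and calibration readings are hypothetical:

```python
import numpy as np

def normalize_pixels(raw, offset, gain):
    """Per-channel offset and gain correction (hypothetical sketch).

    raw    -- 2D array of digitized readings (rows = scan lines,
              columns = detector channels)
    offset -- per-channel dark reading, taken with the x-ray source off
    gain   -- per-channel bright reading, taken with an empty tunnel
    Returns readings rescaled so each channel spans roughly [0, 1].
    """
    raw = raw.astype(np.float64)
    # Subtract the dark level, then divide by the usable dynamic range
    # so an empty tunnel reads about 1.0 on every channel.
    return (raw - offset) / np.maximum(gain - offset, 1e-9)
```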


Detectors 120 and the x-ray scanning device 100 may be linked to one or more local central processing units (CPUs) 200 or other local processing device coupled with the x-ray scanning device 100 via a suitable communication means such as input port 203. Thereby, x-ray signals detected by the detectors 120 may be analyzed locally using, for example, analysis module 214a. The information output from the analysis module 214a may be output locally. Such output may include output of an image to a display 228 for review by security personnel or to a suitable data storage volume, database or preferably data management system 226. Alternatively, the CPU may be configured to provide the x-ray scanning data to a remote location or cloud system for remote analysis 214b, via a suitable communication means, such as a network connection, for processing and may be further configured to receive from the remote location 214b the processed information sent back to the x-ray scanning device or a computer or monitor operably coupled therewith.


The detected x-ray energy signals resulting from the steps described above, once digitized, provide one or more data sets which can be displayed in graphical form and can be recognized by a human technician as indicating the presence of particular structures representing a specific class of objects or materials in the object. In an automatic method, however, the data must be evaluated by one or more computer processors.



FIG. 3 is a flowchart summarizing the operational process 300 of one aspect of the invention. In a first step 302, at least one x-ray image, composed of unprocessed or raw data, is produced. The at least one raw data x-ray image may include, for example, a set of dual-energy x-ray images. In one aspect, such raw data images may be retrieved from a suitable data storage medium, such as an archive or library of dual-energy x-ray images. In another aspect, the images may be produced de novo by performing a dual-energy x-ray scanning operation on an object using an x-ray scanning machine that produces raw data dual-energy x-ray scan images, for example, in the manner described above with reference to FIG. 1 and FIG. 2.


In a preferred aspect, the raw data image inputs for the neural network are expressed according to an attenuation scale. At step 304, the transmittance value of each pixel in the raw data x-ray images is determined. In one aspect, the transmittance value of a pixel may be determined from the corresponding raw detector pixel signal. Attenuation values are determined using the transmittance values of each pixel, as at step 306. The determination of attenuation values for each pixel from the corresponding transmittance values may be accomplished by any suitable means, but preferably by applying a logarithmic transformation and affine transformation to the subject transmittance value. Once the attenuation values for each pixel are determined, then the raw data x-ray images may be input into a neural network according to an attenuation scale for probabilistic analysis, as at step 308.
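

As a hedged sketch of steps 304 and 306: the disclosure specifies a logarithmic transformation followed by an affine transformation, but not their exact form, so the epsilon floor and scaling below are assumptions chosen to match the attenuation scale described later (0 for no attenuation, 1 for epsilon transmittance):

```python
import numpy as np

EPSILON = 1e-4  # assumed transmittance floor; avoids log(0)

def attenuation_from_transmittance(transmittance):
    """Map transmittance values in (0, 1] onto an attenuation scale in [0, 1].

    The logarithm linearizes the Beer-Lambert response; the division by
    log(EPSILON) is the affine rescaling that pins full transmittance to 0
    and epsilon transmittance to 1.
    """
    t = np.clip(transmittance, EPSILON, 1.0)
    return np.log(t) / np.log(EPSILON)
```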


Probabilistic analysis is performed on the raw data image on a pixel-by-pixel basis to associate each pixel with a class label. As an example, such class labels may include “threat” or “not a threat” or the like. This analysis is performed using a neural network which, in one preferred aspect, is a convolutional neural network (CNN). Pixels which are adjacent or connected and which receive the same classification from the neural network form an object. The raw data for input is subject to no processing or very limited processing to normalize images from different scanners. The raw data images are not false colour images, as in other systems. Preferably, the raw dual-energy image is input to the CNN in patches. Patch overlap is ideally above the size of the largest expected potential threat object to be detected. For example, CD-ROMs have a large footprint, but poor attenuation. Accordingly, patches which are too small may result in false negatives.
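

A minimal sketch of the patch-wise input described above, assuming a fixed square patch size and an overlap parameter (both values are hypothetical; the disclosure only requires the overlap to exceed the largest expected threat object):

```python
def extract_patches(image, patch=256, overlap=128):
    """Yield overlapping square patches from a 2D attenuation image.

    Neighbouring patches share `overlap` pixels, so an object smaller than
    the overlap appears whole in at least one patch. Remainder pixels at
    the right and bottom edges are ignored here for brevity; a real
    pipeline would pad or shift the final patches.
    """
    stride = patch - overlap
    h, w = image.shape
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            yield image[y:y + patch, x:x + patch]
```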


In this preferred aspect, the purpose is to distinguish between detected objects which may pose a threat and should be investigated further, and those which are unlikely to pose a threat and do not necessarily require further investigation. This distinction may be made based on a threshold probability value, which may be predetermined. At step 308, a probability value is assigned to each pixel on the basis of the probabilistic analysis of the raw data x-ray image, expressed according to an attenuation scale, input into the neural network. The assignment of probability values to each pixel may be provided in a probability map. At step 310, pixels are classified according to the probability assigned by the neural network and the threshold probability value. Pixels having a probability value which exceeds the threshold probability value may be classified in a first classification. Pixels having a probability value which is below the threshold probability value may be classified in a second classification. As an example, the first classification may indicate that the pixel is likely associated with a potential threat and the second classification may indicate that the pixel is not likely to be associated with a potential threat. The threshold can be automatically determined by the network after the training process. Alternatively, the threshold can be predetermined or assigned by the operator in advance of the real-time object scanning.
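

Step 310 reduces to a comparison of each pixel's probability against the threshold. A minimal sketch, assuming a probability map as produced at step 308 and an example threshold of 0.5 (the actual value may be learned or operator-assigned, as noted above):

```python
import numpy as np

def classify_pixels(probability_map, threshold=0.5):
    """Split a per-pixel probability map into the two classifications.

    Returns a boolean mask: True where the pixel falls in the first
    classification (likely a potential threat), False where it falls in
    the second (not likely a potential threat).
    """
    return probability_map > threshold
```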


As shown at step 312, the output may include a colour-mapped image wherein pixels representing potential threat objects are in one colour, such as red, and pixels representing non-threat objects are in another colour, such as blue. Since pixels making up the same object are grouped and classed together by the CNN, the objects in the output image may be easily distinguishable to an operator by the difference in colour. Preferably, the colour scheme may include flashing violet hues and shifted luma. Regions of interest may be further identified or made apparent by fitting rectangles on sets of connected pixels in the image.
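

A sketch of the colour mapping at step 312, using the red/blue example colours from the text (rendering details such as flashing or luma shifts are omitted):

```python
import numpy as np

def colour_map(threat_mask):
    """Render the pixel classifications as an RGB image.

    threat_mask is the boolean output of the classification step; threat
    pixels are drawn red and non-threat pixels blue.
    """
    h, w = threat_mask.shape
    rgb = np.zeros((h, w, 3), dtype=np.uint8)
    rgb[threat_mask] = (255, 0, 0)    # first classification: potential threat
    rgb[~threat_mask] = (0, 0, 255)   # second classification: non-threat
    return rgb
```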


The colour-mapped image may be transmitted over a wide area network (WAN) in real-time or non-real-time. The image may be compressed using lossy compression, with the degree of lossiness set to balance the impact on detection performance against detection latency (and scanning throughput). The scanning device runs as an HTTP service which can be hosted on-premises or in the cloud. To improve detector performance, the system may include an online feedback loop which allows operators to flag false positives or false negatives, which may then be used as inputs to the neural network to improve performance.


The input of the neural network is preferably a raw data dual-energy image with values expressed in an attenuation scale. For example, on such an attenuation scale, 0 could represent no attenuation and 1 could represent maximum attenuation (i.e., epsilon transmittance). The attenuation value of a pixel is more linear than its corresponding transmittance value because the signal at least generally follows the Beer-Lambert law. This makes the convolutional neural network (CNN) less sensitive to the “attenuation context” in which an object of a certain “relative attenuation” is present. Further, operation of the neural network using attenuation as input is more efficient since, in that case, the neural network does not have to be trained using, or “learn”, a significant non-linearity.
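

The linearity claim follows directly from the Beer-Lambert law; a short worked form of the relation (with epsilon denoting the minimum representable transmittance) is:

```latex
% Transmittance decays exponentially with attenuation coefficient \mu
% and material thickness d, so its logarithm is linear in both:
T = \frac{I}{I_0} = e^{-\mu d}
\quad\Longrightarrow\quad
-\ln T = \mu d
% Affine rescaling onto the [0, 1] attenuation scale used as network input:
A = \frac{\ln T}{\ln \varepsilon}, \qquad A(T{=}1) = 0, \qquad A(T{=}\varepsilon) = 1
```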


Potential threat objects may include, for example, a potentially dangerous object, such as a weapon, drugs, contraband or potentially toxic or explosive materials or devices. If the presence of a potential threat object is probable at step 314, then an alert condition may be raised, as at step 316, to notify one or more operators that subsequent action is required. If the presence of a potential threat object is improbable at step 314, then no alert condition is raised, as at step 318.


In a preferred aspect, the analysis or classification of the raw dual-energy x-ray image data is performed automatically and preferably in real-time or near real-time using the probabilistic image analysis technique described herein in which a plurality of input data points, obtained from the raw dual-energy x-ray scan image data, contributes to the determination of the presence of a potential threat object. Although probabilistic classification techniques can include explicit, identifiable rules created by a programmer, a classification procedure that incorporates the results of training is preferred. For example, a classification algorithm can be used to process a training set consisting of patterns for structures of known classification. The results of this processing are used to adjust the algorithm, so that the classification accuracy improves as the algorithm learns by processing the training sets.


Trainable classifiers, such as the neural networks described herein within the context of the present invention, classify each pixel of the image into one of a plurality of classes. Artificial neural networks are used to perform pattern recognition and data classification tasks. Neural networks are fine-grained parallel processing architectures composed of non-linear processing units, known as neurons or nodes. The neural network passes a signal by links from input nodes to output nodes. In some cases, such as with a feed-forward neural network, the signal passes in one direction only. A CNN includes at least one convolutional layer wherein the outputs of two or more other layers may be convolved and output as input to the next layer. In most implementations, the nodes are organized into multiple layers: the input layer, output layer, and several intermediate or “hidden layers” in between. Each hidden layer successively applies a filter or performs an operation on the input data.


In order to perform semantic segmentation, the algorithm must determine the classification of each of the pixels and determine which pixels correspond to the same object. As is described in more detail hereinbelow, neural networks suitable for semantic segmentation, such as CNNs, and more specifically FC-Densenet, can be trained by inputting new raw dual-energy x-ray scan images of known objects or images retrieved from a library or archive of images saved in a data management system or on a data storage medium. The training images may be pre-labeled manually or automatically prior to inputting to the network so that the neural network can make an appropriate association between the output and the input.


For illustration of the general architecture of a basic neural network, there is provided in FIG. 4 a schematic representation of an artificial neural network 400 consisting of an input layer 402 of neurons or nodes 404, at least one hidden layer 406, and an output layer 408. The neuron layers are linked via a set of synaptic interconnections 410. Each neuron 404 in the input layer 402 is typically connected to each neuron 404 in the hidden layer 406, and each neuron 404 in the hidden layer 406 is typically connected to each neuron 404 in the output layer 408, via a synaptic connection 410. Connections 410 between nodes may be physical, electronic hardware connections, or they may be embodied in software, as may be the neurons 404 themselves, which software operates on computers.


The neurons or nodes in a neural network typically accept several inputs as a weighted sum (a vector dot product). This sum is then tested against an activation function, which is typically a threshold, and then is processed through an output function. In artificial neural networks, the activation function may also be referred to as a “transfer function”. The activation function of a node defines the output of that node given an input or a set of inputs. The inputs for the nodes comprising the input layer come from external sources, such as input data. The inputs for the nodes comprising the intermediate or hidden layers are the outputs from the nodes of the input layer, for the first hidden layer, or from preceding hidden layers in the neural network. The inputs for the nodes comprising the output layer are the outputs from the last hidden layer in the neural network. The output function could be a non-linear function such as a hard-limiter, a sigmoid function, a convolution, a sine-function or any other suitable function known to a person of ordinary skill in the art.
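

The weighted-sum-plus-activation behaviour of a single node can be written compactly; a minimal sketch, using a sigmoid as the output function (the text notes that hard-limiters, sine functions, convolutions or other functions may equally serve):

```python
import numpy as np

def node_output(inputs, weights, bias):
    """Forward pass of one node: weighted sum, then activation.

    inputs and weights are equal-length vectors; the dot product forms the
    weighted sum, and the sigmoid maps it onto (0, 1). The bias plays the
    role of the node's threshold.
    """
    z = np.dot(inputs, weights) + bias   # weighted sum (vector dot product)
    return 1.0 / (1.0 + np.exp(-z))      # sigmoid output function
```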


The activation function threshold determines how high the input to that node must be in order to generate a positive output of that node. For example, a node may be considered to be turned “ON” whenever its value is above a predetermined value such as, for instance, 0.8 and turned “OFF” with a value of less than another value such as 0.25. The node may have an undefined “maybe” state between those values. Between two layers, multiple node connection patterns are possible. In a fully interconnected network, every node in one layer is connected to every node in the next layer. “Pooling” is another arrangement wherein multiple nodes in one layer may connect to a single node in the next layer. This allows for a reduction in the number of neurons in a subsequent layer. Other arrangements are possible.


The connectivity pattern between any two layers defines which node receives the output value of one or more previous nodes as their input. Each connection between nodes is assigned a weight that represents its relative importance. The relative importance is determined by training the neural network, which is discussed hereinafter. A propagation function computes the input to a neuron from the outputs of its predecessor nodes and the strength of their connections. The connection between two nodes is thus realized in mathematical terms by multiplying the output of the one or more lower level nodes by the strength of that connection (weight). At each instant of propagation, the values for the inputs define an activity state. The initial activity state is defined upon presentation of the inputs to the network.


The output response of any hidden layer node and any output layer node is a function of the network input to that node defined by the difference of the threshold of that node and the input to it. The value of the input into each hidden or output layer node is weighted with the weight stored for the connection strengths between each of the input and hidden layer nodes, and the hidden and output layer nodes, respectively. Summing over all connections into a particular node and subtracting this sum from the threshold value may be performed according to sigmoid-type functions, sine-type functions, or any other suitable function known in the art that may be used to obtain the desired type of response function for the output of a node. The weights are chosen to minimize the error between the produced result and the correct result. A learning rule defines how to choose the weight values and adjust them with subsequent instances of training. Several commonly used learning rules are back-propagation, competitive learning, adaptive resonance, reinforcement learning, supervised learning, unsupervised learning and self-organization, though other learning rules may be relied upon within the context of the present invention.


In a preferred aspect, the artificial neural network uses back-propagation learning. The back-propagation learning algorithm is derived from the chain rule for partial derivatives and provides a gradient descent learning method in the space of weights. Back-propagation learning is a supervised learning method. The purpose for back-propagation learning is to find a function that best maps a set of inputs to their correct output. Accordingly, back-propagation learning involves a set of pairs of input and output vectors. The artificial neural network uses an input vector to generate its own, or actual, output vector. The actual output vector is compared with a desired output, or target, vector. The target vector may be defined in the course of training but correlates with the input vector. During the back-propagation training process, the connection weights are adjusted iteratively to best map the target vector and the actual output vector. The conventional delta rule may be used for this calculation where the weight for a particular synapse or connection between nodes is adjusted proportionally to the product of an error signal, delta, available to the node receiving input via the connection and the output of the node sending a signal via the connection. If a node is an output node, the error signal is proportional to the difference between the actual and target value of the node. If it is a hidden layer, it is determined recursively in terms of the error signals of the nodes to which it directly connects and the weights of those connections.
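

The delta rule described above can be stated in a few lines. A sketch, assuming sigmoid activations (so the derivative term takes the familiar actual * (1 - actual) form); the learning rate and function names are illustrative:

```python
def output_node_delta(target, actual):
    """Error signal for an output node with a sigmoid activation.

    Proportional to the difference between target and actual output,
    scaled by the sigmoid derivative; hidden-layer deltas are computed
    recursively from downstream deltas and connection weights, as
    described in the text.
    """
    return (target - actual) * actual * (1.0 - actual)

def delta_rule_update(weight, learning_rate, delta, sender_output):
    """One weight adjustment under the conventional delta rule.

    The change is proportional to the product of the error signal at the
    receiving node and the output of the sending node.
    """
    return weight + learning_rate * delta * sender_output
```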


Thus, the training of a neural network is the process of setting the connection weights so that the network produces a desired output in response to any input that is normal for the situation. Supervised training refers to training which requires a training set, i.e. a set of input-target output patterns. The back-propagation algorithm is an efficient technique to train some types of neural network. It operates to send an error back through the neural network during the training process, thereby adjusting all the node connection weights in correspondence with their contribution to the error. The weights of the network therefore gradually drift to a set of values which better maps the input vector with the correct or target output vector. The initial weights may be chosen randomly, within reasonable limits, and adjustments are left to the training process.


The artificial neural network 400 of FIG. 4 is preferably trained on a suitably large set of dual-energy x-ray scan images composed of raw data or values calculated from raw data, such as transmittance or attenuation. The set of images includes images of objects of different shapes and composed of different materials scanned at various angles and orientations. The set of images will include images of objects which may or may not be potentially harmful. Such a set of images, for example, may be generated by new x-ray scans of objects or may be retrieved from a library or archive of images saved in a data management system or on a data storage medium. The images in the training data set must be labeled or tagged to identify the contents of the image including the names and positions of objects or materials. This labeled raw scan data is used as an input-set to be used for training the neural network. In this context, the labeled raw scan data becomes “training data”. The training data is input to the neural network to generate an output 408, in accordance with the error back-propagation learning method described above. Thus, the input data 412 to be used to train the neural network 400 preferably includes dual-energy x-ray images composed of raw signals, expressed according to an attenuation scale.


The purpose of training the neural network is to have a processing means capable of recognizing a signature representing an object or material of interest, particularly if the material or object is potentially harmful. This signature is defined as an array of numbers corresponding, on a one-to-one basis, to the discretized values of a physical quantity, such as the energy of X-rays, and could include unrelated, but relevant, other values, such as transmission detector array data, position and volume of the scanned object in the x-ray scanning machine, and other environmental factors. The array may consist of any number of data points.


The training process is repeated using labeled scan data of a sufficiently large number of raw data dual energy images containing objects and materials of interest in a variety of permutations and combinations to model real-world scenarios. Since the training data is obtained by scanning objects having known configuration and including known materials, each output data during the training process may be further labeled or tagged to identify whether the respective training data represents a defined or known object or material of interest. This output data of the training step may be further stored in a suitable library, data management system or database, such as a file server on a digital computer system, along with the tagged identification information. Furthermore, the library, data management system or database of training data may be enhanced to incorporate and reflect all previously known objects or materials of interest, including threat materials or objects or potentially harmful materials or objects, and their corresponding raw dual-energy x-ray scan data.



FIG. 5 is a flow diagram of the back-propagation training process 500 for an artificial neural network, in accordance with one aspect of the invention. One of ordinary skill in the art would appreciate that the processing is conducted using one or more computers having a plurality of processors and system architecture for executing the machine learning analytical processes described herein, embodied in at least one software program, a plurality of storage devices or a data management system for storing the requisite data, library information, and other information necessary to conduct these analyses, and at least one output device, such as one or more other computing devices, servers or data management systems, networks, “cloud” systems, monitors or other computing devices and peripherals. It should also be understood that the software including the neural network may be housed on a computer system or data management system at a remote location from the x-ray scanning device. X-ray imaging data produced by the scanning device may be sent via a suitable network connection to the remote computer system or data management system for processing. The output of the neural network may then be sent back to the location of the x-ray scanning device for review by an operator.


At the beginning of the training process 502, the synaptic weights and thresholds of the neural network are initialized 504 with, for example, random or arbitrary numbers that are within reason to a person skilled in the art. After initialization 504, the input layer of the neural network is introduced 506 to a first set of training data and the neural network is run to receive 508 an actual output. The neural network makes use of the randomly assigned weights and thresholds to generate at least one output based on a suitable resolving function, as described above. The outputs may, for example, be in the form of differentiable signals such as numerals between 0 and 1, in the form of positive or negative states implied by an output numeral of greater than or less than 0 respectively, or any other suitable indication as evident to a person of ordinary skill in the art. One form the outputs may take in accordance with the present invention includes one or more values between 0 and 1 indicating a probability as to the presence of an object or material of interest in an image, such as an object which constitutes a potential threat. As previously mentioned, the output may include a colour-mapped image to be shown to an operator wherein potential threat objects and non-threat objects are shown in different colours.


The first set of training data is introduced into the system and, based on the random weights and thresholds, produces an actual output, such as, for example, a numeral greater than 0. If the training data represents an object or material of interest, this output indication is set as a benchmark to identify an object or material of interest while, for example, a numeral less than 0 may be set to identify an object or material that is not of interest. Once a suitable benchmark is set, the training process is repeated with the next set of training data and corresponding actual outputs are received. At step 510, the actual output is compared with the desired or target output, defined by an operator with knowledge as to whether input data is or is not representative of an object or material of interest, for the corresponding set of training data that was input to the neural network in step 506. If the actual output is commensurate with the desired or target output, or if the difference between the target and actual output falls below a predefined acceptable level, a check 512 is made to determine whether the neural network has been trained on the entire set of training data. If not, then the next set of training data is introduced to the neural network at step 506 and the foregoing steps 502 to 510 are repeated. The training process 500 continues until the neural network has been trained on the entire set of training data.


If the comparison 510 suggests that the actual output is not in agreement with the desired or targeted output, the ensuing additional steps are performed. At step 514, the difference between the actual output and the target output is used to generate an error pattern in accordance with a suitable back-propagation rule such as the ‘delta rule’ or any other error estimation rule known to a person of ordinary skill in the art. The error pattern is used to adjust, at step 516, the synaptic weights of the output layer such that the error pattern would be reduced at the next instance the training process 500 is performed, if the same set of training data were presented as the input data. Then, at step 518, the synaptic weights of the hidden layers, preceding the output layer, are adjusted by comparing the hidden layer node actual outputs with the results of nodes in the output layer to form an error pattern for the hidden layer.


The error can thus be propagated as far back over as many hidden layers as are present in the artificial neural network. Finally, the weights for the input layer are similarly adjusted at step 520, and the next set of training data is introduced to the neural network to iterate through the learning cycle again. The neural network is therefore trained by presenting each set of training data in turn at the inputs and propagating forwards and backwards, followed by the next input data, and repeating this cycle a sufficient number of times such that the neural network iteratively adjusts the weights of the synaptic connections between layers to establish a set of weights and thresholds which may be relied upon to produce a pattern of actual output that is in agreement with the target output for the presented input data. Once the desired set of weights and thresholds is established, preferably when all training data has been input to the neural network, then the learning process may be terminated, as shown at step 524. The learned information of a neural network is contained in the values of the set of weights and thresholds.
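

Gathering the steps of FIG. 5 into one place, a minimal training loop might look like the sketch below; the network object and its forward/backward/step methods are hypothetical stand-ins for the operations described in steps 506 to 520:

```python
import numpy as np

def train(network, training_set, learning_rate=0.1, tolerance=1e-3, max_epochs=1000):
    """Back-propagation training loop in the shape of FIG. 5 (sketch).

    network is assumed to expose forward(x) returning the actual output,
    backward(error) propagating the error pattern through the layers, and
    step(lr) applying the resulting weight adjustments.
    """
    for epoch in range(max_epochs):
        worst_error = 0.0
        for inputs, target in training_set:       # introduce each training pair
            actual = network.forward(inputs)      # receive the actual output
            error = target - actual               # compare with the target output
            worst_error = max(worst_error, float(np.max(np.abs(error))))
            network.backward(error)               # generate and propagate the error pattern
            network.step(learning_rate)           # adjust synaptic weights layer by layer
        if worst_error < tolerance:               # outputs agree within tolerance: done
            break
```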


Once the neural network has been trained using the training data, then recognition and classification of pixels representing objects in an image may be performed using live input data. Live input data may be provided from stored or archived scan images or may be provided by performing new scans using an x-ray scanning device such as the one described above with reference to FIG. 1 and FIG. 2. Depending on the input data, filters may be applied, or specific operations performed, in order to achieve the best performance from the neural network for the live input data. The output of the neural network may be used to modify the display provided to an operator in a manner which would draw the attention of the operator to a specific object or material automatically and in real-time or in near-real-time. The operator may then subsequently raise an alert condition if the object or material of interest identified by the neural network constitutes a potentially harmful object or material. In another aspect, the alert condition may be automatically initiated based on the output of the neural network.


It should be further understood that the training operation can be performed on one machine and the results can be replicated in additional machines. For example, training of a neural network results in a set of weight values defining the association between nodes of the neural network. This set can be recorded and incorporated in other, similar neural networks.


The present disclosure is in the context of probabilistic analysis of raw or unprocessed data in the form of x-ray scan images as produced by transmission x-ray scanning devices, preferably using dual-energy channels for inspection. The present disclosure would also apply to other forms of data which may be extracted from an inspected object. Such other forms of data may include images provided by multi-channel x-ray scans, trace material detection, millimeter wave scans, spectral analysis, x-ray diffraction information, x-ray backscatter images and any other means of inspection for extracting data suitable for analyzing the physical or chemical properties of an object or volume subject to inspection. It should be further understood that the extracted data used for analysis may be unprocessed or processed data.


While the invention has been described in terms of specific embodiments, it is apparent that other forms could be adopted by one skilled in the art. For example, the methods described herein could be performed in a manner which differs from the embodiments described herein. The steps of each method could be performed using similar steps or steps producing the same result, but which are not necessarily equivalent to the steps described herein. Some steps may also be performed in different order to obtain the same result. Similarly, the apparatuses and systems described herein could differ in appearance and construction from the embodiments described herein, the functions of each component of the apparatus could be performed by components of different construction but capable of a similar though not necessarily equivalent function, and appropriate materials could be substituted for those noted. Accordingly, it should be understood that the invention is not limited to the specific embodiments described herein. It should also be understood that the phraseology and terminology employed above are for the purpose of disclosing the illustrated embodiments, and do not necessarily serve as limitations to the scope of the invention.

Claims
  • 1. A method for detecting at least one object of interest in at least one raw data x-ray image, the method comprising the steps of: emitting an incident x-ray radiation beam through a scanning volume having an object therein; detecting x-ray signals transmitted through at least one of the scanning volume and the object; deriving the at least one raw data x-ray image from the detected x-ray signals; inputting the raw data x-ray image, expressed according to an attenuation scale, into a neural network; for each pixel in the raw data x-ray image, outputting from the neural network a probability value assigned to that pixel, wherein the probability value is indicative of whether the pixel is likely associated, or not likely associated, with a potential threat; and, classifying each pixel in the raw data x-ray image into at least one of a first classification or second classification based on whether the probability value associated with the pixel exceeds, or does not exceed, a predetermined threshold probability value.
  • 2. The method of claim 1, wherein the step of inputting the raw data x-ray image expressed according to an attenuation scale further comprises the steps of: determining a transmittance value for each pixel in the raw data x-ray image; and, determining an attenuation value from each transmittance value.
  • 3. The method of claim 1, wherein the outputting step outputs a probability map for each pixel in the raw data x-ray image.
  • 4. The method of claim 3, wherein the method further comprises: providing a colour-mapped image based on the probability map showing pixels classified in the first classification in a first colour scheme and pixels classified in the second classification in a second colour scheme.
  • 5. The method of claim 4, wherein the first colour scheme and the second colour scheme at least one of flashes, shifts hue and shifts luma.
  • 6. The method of claim 1, wherein the classifying step is by way of semantic segmentation.
  • 7. The method of claim 1, wherein the neural network is a convolutional neural network.
  • 8. The method of claim 7, wherein the convolutional neural network is a FC-Densenet.
  • 9. The method of claim 1, wherein the at least one raw data x-ray image includes a set of raw data dual-energy x-ray images.
  • 10. A system for detecting at least one object of interest in at least one raw data x-ray image, comprising: an x-ray emitter for emitting an incident x-ray radiation beam through a scanning volume having an object therein; at least one detector for detecting x-ray signals transmitted through at least one of the scanning volume and the object; at least one processor for deriving at least one raw data x-ray image from the detected x-ray signal; at least one processor configured to: input the raw data x-ray image, expressed according to an attenuation scale, into a neural network; output from the neural network a probability value assigned to each pixel in the raw data x-ray image, wherein the probability value is indicative of whether the pixel is likely associated, or not likely associated, with a potential threat; and, classify each pixel in the raw data x-ray image into at least one of a first classification or second classification based on whether the probability value associated with the pixel exceeds, or does not exceed, a predetermined threshold probability value.
  • 11. The system of claim 10, wherein to express the raw data x-ray image according to an attenuation scale, the at least one processor is further configured to: determine a transmittance value for each pixel in the raw data x-ray image; and, determine an attenuation value from each transmittance value.
  • 12. The system of claim 10, wherein to output the probability value assigned to each pixel in the raw data x-ray image, the at least one processor is further configured to output a probability map for each pixel in the raw data x-ray image.
  • 13. The system of claim 12, wherein the at least one processor is further configured to provide a colour-mapped image showing pixels in the first classification in a first colour scheme and pixels in the second classification in a second colour scheme.
  • 14. The system of claim 13, wherein the first colour scheme and the second colour scheme at least one of flashes, shifts hue and shifts luma.
  • 15. The system of claim 10, wherein the neural network is configured to classify each pixel in the raw data x-ray image by way of semantic segmentation.
  • 16. The system of claim 10, wherein the neural network is a convolutional neural network.
  • 17. The system of claim 16, wherein the convolutional neural network is a FC-Densenet.
PCT Information
Filing Document: PCT/CA2020/051239; Filing Date: 9/15/2020; Country: WO
Publishing Document: WO2021/051191; Publishing Date: 3/25/2021; Country: WO; Kind: A
US Referenced Citations (638)
Number Name Date Kind
2831123 Daly Apr 1958 A
3239706 Farrell Mar 1966 A
3766387 Heffan Oct 1973 A
3768645 Conway Oct 1973 A
3784837 Holmstrom Jan 1974 A
4020346 Dennis Apr 1977 A
4047035 Dennhoven Sep 1977 A
4057725 Wagner Nov 1977 A
4105922 Lambert Aug 1978 A
4139771 Dennhoven Feb 1979 A
4210811 Dennhoven Jul 1980 A
4216499 Dennhoven Aug 1980 A
4228353 Johnson Oct 1980 A
4259721 Kuznia Mar 1981 A
4266425 Allport May 1981 A
4274005 Yamamura Jun 1981 A
4352021 Boyd Sep 1982 A
4366382 Kotowski Dec 1982 A
4430568 Yoshida Feb 1984 A
4566113 Doenges Jan 1986 A
4593355 Chase Jun 1986 A
4599740 Cable Jul 1986 A
4618978 Cosman Oct 1986 A
4641330 Herwig Feb 1987 A
4675890 Plessis Jun 1987 A
4736401 Donges Apr 1988 A
4754469 Harding Jun 1988 A
4788704 Donges Nov 1988 A
4789930 Sones Dec 1988 A
4825454 Annis Apr 1989 A
4868856 Frith Sep 1989 A
4872188 Lauro Oct 1989 A
4884289 Glockmann Nov 1989 A
4887604 Shefer Dec 1989 A
4956856 Harding Sep 1990 A
4979202 Siczek Dec 1990 A
4987584 Doenges Jan 1991 A
4991189 Boomgaarden Feb 1991 A
5007072 Jenkins Apr 1991 A
5008911 Harding Apr 1991 A
5022062 Annis Jun 1991 A
5033106 Kita Jul 1991 A
5044002 Stein Aug 1991 A
5065418 Bermbach Nov 1991 A
5091924 Bermbach Feb 1992 A
5098640 Gozani Mar 1992 A
5164590 Coles Nov 1992 A
5179581 Annis Jan 1993 A
5181234 Smith Jan 1993 A
5182764 Peschmann Jan 1993 A
5224144 Annis Jun 1993 A
5237598 Albert Aug 1993 A
5247561 Kotowski Sep 1993 A
5253283 Annis Oct 1993 A
5263075 McGann Nov 1993 A
5265144 Harding Nov 1993 A
5272627 Maschhoff Dec 1993 A
5313511 Annis May 1994 A
5319547 Krug Jun 1994 A
5367552 Peschmann Nov 1994 A
5379334 Zimmer Jan 1995 A
5410156 Miller Apr 1995 A
5420905 Bertozzi May 1995 A
5467377 Dawson Nov 1995 A
5490196 Rudich Feb 1996 A
5490218 Krug Feb 1996 A
5493596 Annis Feb 1996 A
5524133 Neale Jun 1996 A
5532492 He Jul 1996 A
5541856 Hammermeister Jul 1996 A
5557108 Tumer Sep 1996 A
5600303 Husseiny Feb 1997 A
5600700 Krug Feb 1997 A
5606167 Miller Feb 1997 A
5633907 Gravelle May 1997 A
5638420 Armistead Jun 1997 A
5642393 Krug Jun 1997 A
5642394 Rothschild Jun 1997 A
5661774 Gordon Aug 1997 A
5666393 Annis Sep 1997 A
5687210 Maitrejean Nov 1997 A
5692028 Geus Nov 1997 A
5712926 Eberhard Jan 1998 A
5745543 De Bokx Apr 1998 A
5751837 Watanabe May 1998 A
5764683 Swift Jun 1998 A
5768334 Maitrejean Jun 1998 A
5787145 Geus Jul 1998 A
5796802 Gordon Aug 1998 A
5805660 Perion Sep 1998 A
5818897 Gordon Oct 1998 A
5838758 Krug Nov 1998 A
5838759 Armistead Nov 1998 A
5859891 Hibbard Jan 1999 A
5872829 Wischmann Feb 1999 A
5881122 Crawford Mar 1999 A
5887047 Bailey Mar 1999 A
5901198 Crawford May 1999 A
5903623 Swift May 1999 A
5905806 Eberhard May 1999 A
5909477 Crawford Jun 1999 A
5910973 Grodzins Jun 1999 A
5930326 Rothschild Jul 1999 A
5940468 Huang Aug 1999 A
5974111 Krug Oct 1999 A
5982843 Bailey Nov 1999 A
6005912 Ocleppo Dec 1999 A
6018562 Willson Jan 2000 A
6021174 Campbell Feb 2000 A
6026143 Simanovsky Feb 2000 A
6026171 Hiraoglu Feb 2000 A
6031890 Bermbach Feb 2000 A
6035014 Hiraoglu Mar 2000 A
6037597 Karavolos Mar 2000 A
6054712 Komardin Apr 2000 A
6058158 Eiler May 2000 A
6058159 Conway May 2000 A
6067344 Grodzins May 2000 A
6067366 Simanovsky May 2000 A
6075871 Simanovsky Jun 2000 A
6076400 Bechwati Jun 2000 A
6078642 Simanovsky Jun 2000 A
6081580 Grodzins Jun 2000 A
6088423 Krug Jul 2000 A
6091795 Schafer Jul 2000 A
6094472 Smith Jul 2000 A
6108396 Bechwati Aug 2000 A
6111974 Hiraoglu Aug 2000 A
6118850 Mayo Sep 2000 A
6118852 Rogers Sep 2000 A
6122343 Pidcock Sep 2000 A
6128365 Bechwati Oct 2000 A
6151381 Grodzins Nov 2000 A
6163591 Benjamin Dec 2000 A
6181765 Sribar Jan 2001 B1
6183139 Solomon Feb 2001 B1
6185272 Hiraoglu Feb 2001 B1
6188745 Gordon Feb 2001 B1
6188747 Geus Feb 2001 B1
6192101 Grodzins Feb 2001 B1
6192104 Adams Feb 2001 B1
6195413 Geus Feb 2001 B1
6195444 Simanovsky Feb 2001 B1
6198795 Naumann Mar 2001 B1
6216540 Nelson Apr 2001 B1
6218943 Ellenbogen Apr 2001 B1
6236709 Perry May 2001 B1
6249567 Rothschild Jun 2001 B1
6252929 Swift Jun 2001 B1
6256369 Lai Jul 2001 B1
6256404 Gordon Jul 2001 B1
6269142 Smith Jul 2001 B1
6272230 Hiraoglu Aug 2001 B1
6278115 Annis Aug 2001 B1
6282260 Grodzins Aug 2001 B1
6292533 Swift Sep 2001 B1
6301326 Bjorkholm Oct 2001 B2
6304629 Conway Oct 2001 B1
6317509 Simanovsky Nov 2001 B1
6320933 Grodzins Nov 2001 B1
6324249 Fazzio Nov 2001 B1
6345113 Crawford Feb 2002 B1
6356620 Rothschild Mar 2002 B1
6379043 Zylka Apr 2002 B1
6418189 Schafer Jul 2002 B1
6421420 Grodzins Jul 2002 B1
6424695 Grodzins Jul 2002 B1
6429578 Danielsson Aug 2002 B1
6430255 Fenkart Aug 2002 B2
6434219 Rothschild Aug 2002 B1
6435715 Betz Aug 2002 B1
6442233 Grodzins Aug 2002 B1
6445765 Frank Sep 2002 B1
6453003 Springer Sep 2002 B1
6453007 Adams Sep 2002 B2
6456684 Mun Sep 2002 B1
6459755 Li Oct 2002 B1
6459761 Grodzins Oct 2002 B1
6459764 Chalmers Oct 2002 B1
6473487 Le Oct 2002 B1
RE37899 Grodzins Nov 2002 E
6483894 Hartick Nov 2002 B2
6490477 Zylka Dec 2002 B1
6507025 Verbinski Jan 2003 B1
6532276 Hartick Mar 2003 B1
6542574 Grodzins Apr 2003 B2
6542578 Ries Apr 2003 B2
6542580 Carver Apr 2003 B1
6546072 Chalmers Apr 2003 B1
6552346 Verbinski Apr 2003 B2
6556653 Hussein Apr 2003 B2
6563903 Kang May 2003 B2
6563906 Hussein May 2003 B2
6580778 Meder Jun 2003 B2
6584170 Aust Jun 2003 B2
6590956 Fenkart Jul 2003 B2
6597760 Beneke Jul 2003 B2
6606516 Levine Aug 2003 B2
6618466 Ning Sep 2003 B1
6621888 Grodzins Sep 2003 B2
6628745 Annis Sep 2003 B1
6636581 Sorenson Oct 2003 B2
6647091 Fenkart Nov 2003 B2
6647094 Harding Nov 2003 B2
6647095 Hsieh Nov 2003 B2
6653588 Gillard-Hickman Nov 2003 B1
6658087 Chalmers Dec 2003 B2
6663280 Doenges Dec 2003 B2
6665373 Kotowski Dec 2003 B1
6665433 Roder Dec 2003 B2
6687333 Carroll Feb 2004 B2
6690766 Kresse Feb 2004 B2
6707879 McClelland Mar 2004 B2
6715533 Kresse Apr 2004 B2
6721387 Naidu Apr 2004 B1
6735271 Rand May 2004 B1
6737652 Lanza May 2004 B2
6748043 Dobbs Jun 2004 B1
6754298 Fessler Jun 2004 B2
6770884 Bryman Aug 2004 B2
6775348 Hoffman Aug 2004 B2
6785357 Bernardi Aug 2004 B2
6788761 Bijjani Sep 2004 B2
6798863 Sato Sep 2004 B2
6812426 Kotowski Nov 2004 B1
6813374 Karimi Nov 2004 B1
6816571 Bijjani Nov 2004 B2
6827265 Knowles Dec 2004 B2
6830185 Tsikos Dec 2004 B2
6837422 Meder Jan 2005 B1
6837432 Tsikos Jan 2005 B2
6839403 Kotowski Jan 2005 B1
6856667 Ellenbogen Feb 2005 B2
6859514 Hoffman Feb 2005 B2
6895072 Schrock May 2005 B2
6901135 Fox May 2005 B2
6906329 Bryman Jun 2005 B2
6907101 Hoffman Jun 2005 B2
6922455 Jurczyk Jul 2005 B2
6922460 Skatter Jul 2005 B2
6922461 Kang Jul 2005 B2
6928137 Bruder et al. Aug 2005 B2
6928141 Carver Aug 2005 B2
6933504 Hoffman Aug 2005 B2
6934354 Hoffman Aug 2005 B2
6940071 Ramsden Sep 2005 B2
6944264 Bijjani Sep 2005 B2
6947517 Hoffman Sep 2005 B2
6950492 Besson Sep 2005 B2
6950493 Besson Sep 2005 B2
6952163 Huey Oct 2005 B2
6953935 Hoffman Oct 2005 B1
6957913 Fenkart Oct 2005 B2
6962289 Vatan Nov 2005 B2
6968030 Hoffman Nov 2005 B2
6968034 Ellenbogen Nov 2005 B2
6971577 Tsikos Dec 2005 B2
6973158 Besson Dec 2005 B2
6975698 Katcha Dec 2005 B2
6978936 Tsikos Dec 2005 B2
6980627 Qiu Dec 2005 B2
6990171 Toth Jan 2006 B2
6990172 Toth Jan 2006 B2
6991371 Georgeson Jan 2006 B2
6993111 Annis Jan 2006 B1
6996209 Marek Feb 2006 B2
7010083 Hoffman Mar 2006 B2
7016459 Ellenbogen Mar 2006 B2
7020241 Beneke Mar 2006 B2
7020242 Ellenbogen Mar 2006 B2
7023956 Heaton Apr 2006 B2
7023957 Bijjani Apr 2006 B2
7027553 Dunham Apr 2006 B2
7027554 Gaultier Apr 2006 B2
7031430 Kaucic, Jr. Apr 2006 B2
7031434 Saunders Apr 2006 B1
7034313 Hoffman Apr 2006 B2
7039154 Ellenbogen May 2006 B1
7045787 Verbinski May 2006 B1
7046756 Hoffman May 2006 B2
7046761 Ellenbogen May 2006 B2
7050536 Fenkart May 2006 B1
7054408 Jiang May 2006 B2
7062009 Karimi Jun 2006 B2
7062011 Tybinkowski Jun 2006 B1
7062074 Beneke Jun 2006 B1
7064334 Hoffman Jun 2006 B2
7065175 Green Jun 2006 B2
7065179 Block Jun 2006 B2
7068750 Toth Jun 2006 B2
7068751 Toth Jun 2006 B2
7072434 Tybinkowski Jul 2006 B1
7076029 Toth Jul 2006 B2
7078699 Seppi Jul 2006 B2
7081628 Granfors Jul 2006 B2
7084404 Hoffman Aug 2006 B2
7087902 Wang Aug 2006 B2
7088799 Hoffman Aug 2006 B2
7090133 Zhu Aug 2006 B2
7092481 Hoffman Aug 2006 B2
7092485 Kravis Aug 2006 B2
7103137 Seppi Sep 2006 B2
7106830 Rosner Sep 2006 B2
7110488 Katcha Sep 2006 B2
7112797 Hoge Sep 2006 B2
7116749 Besson Oct 2006 B2
7116751 Ellenbogen Oct 2006 B2
7119553 Yang Oct 2006 B2
7123681 Ellenbogen Oct 2006 B2
7127027 Hoffman Oct 2006 B2
7130374 Jacobs Oct 2006 B1
7133491 Bernardi Nov 2006 B2
7136450 Ying Nov 2006 B2
7136451 Naidu Nov 2006 B2
7139367 Le Nov 2006 B1
7139406 McClelland Nov 2006 B2
7149278 Arenson Dec 2006 B2
7149339 Veneruso Dec 2006 B2
7155812 Peterson Jan 2007 B1
7158611 Heismann Jan 2007 B2
7162005 Bjorkholm Jan 2007 B2
7164747 Ellenbogen Jan 2007 B2
7164750 Nabors Jan 2007 B2
7166458 Ballerstadt Jan 2007 B2
7167539 Hoffman Jan 2007 B1
7173998 Hoffman Feb 2007 B2
7177387 Yasunaga Feb 2007 B2
7177391 Chapin Feb 2007 B2
7190757 Ying Mar 2007 B2
7197113 Katcha Mar 2007 B1
7197172 Naidu Mar 2007 B1
7207713 Lowman Apr 2007 B2
7215731 Basu May 2007 B1
7215738 Muenchau May 2007 B2
7218704 Adams May 2007 B1
7221732 Annis May 2007 B1
7224763 Naidu May 2007 B2
7224765 Ellenbogen May 2007 B2
7224766 Jiang May 2007 B2
7224769 Turner May 2007 B2
7233640 Ikhlef Jun 2007 B2
7236564 Hopkins Jun 2007 B2
7238945 Hoffman Jul 2007 B2
7247856 Hoge Jul 2007 B2
7251310 Smith Jul 2007 B2
7260170 Arenson Aug 2007 B2
7260171 Arenson Aug 2007 B1
7260172 Arenson Aug 2007 B2
7260173 Wakayama Aug 2007 B2
7260174 Hoffman Aug 2007 B2
7260182 Toth Aug 2007 B2
7263160 Schlomka Aug 2007 B2
7266180 Saunders Sep 2007 B1
7272429 Walker Sep 2007 B2
7274767 Clayton Sep 2007 B2
7277577 Ying Oct 2007 B2
7279120 Cheng Oct 2007 B2
7280631 De Man Oct 2007 B2
7282727 Retsky Oct 2007 B2
7283604 De Man Oct 2007 B2
7283609 Possin Oct 2007 B2
7295019 Yang Nov 2007 B2
7298812 Tkaczyk Nov 2007 B2
7302083 Larson Nov 2007 B2
7308073 Tkaczyk Dec 2007 B2
7308074 Jiang Dec 2007 B2
7308077 Bijjani Dec 2007 B2
7317195 Eikman Jan 2008 B2
7317390 Huey Jan 2008 B2
7319737 Singh Jan 2008 B2
7322745 Agrawal Jan 2008 B2
7324625 Eilbert Jan 2008 B2
7327853 Ying Feb 2008 B2
7330527 Hoffman Feb 2008 B2
7330535 Arenson Feb 2008 B2
7333589 Ellenbogen Feb 2008 B2
7335887 Verbinski Feb 2008 B1
7336769 Arenson Feb 2008 B2
7339159 Juh Mar 2008 B2
7366282 Peschmann Apr 2008 B2
7369643 Kotowski May 2008 B2
7384194 Gatten Jun 2008 B2
7417440 Peschmann Aug 2008 B2
7453987 Richardson Nov 2008 B1
7486772 Lu Feb 2009 B2
7505562 Dinca Mar 2009 B2
7551718 Rothschild Jun 2009 B2
7555099 Rothschild Jun 2009 B2
7579845 Peschmann Aug 2009 B2
7606348 Foland Oct 2009 B2
7609807 Leue Oct 2009 B2
7634051 Robinson Dec 2009 B2
7636418 Anwar Dec 2009 B2
7656995 Robinson Feb 2010 B2
7668289 Proksa Feb 2010 B2
7672427 Chen Mar 2010 B2
7693261 Robinson Apr 2010 B2
7706507 Williamson Apr 2010 B2
7734066 DeLia Jun 2010 B2
7796733 Hughes Sep 2010 B2
7831012 Foland Nov 2010 B2
7856081 Peschmann Dec 2010 B2
7864920 Rothschild Jan 2011 B2
7873201 Eilbert Jan 2011 B2
7876879 Morton Jan 2011 B2
7924979 Rothschild Apr 2011 B2
7945017 Chen May 2011 B2
7965816 Kravis Jun 2011 B2
7995707 Rothschild Aug 2011 B2
8009799 Doyle Aug 2011 B2
8009800 Doyle Aug 2011 B2
8014493 Roux Sep 2011 B2
8031903 Paresi Oct 2011 B2
8098794 Fernandez Jan 2012 B1
8116428 Gudmundson Feb 2012 B2
8135110 Morton Mar 2012 B2
8135112 Hughes Mar 2012 B2
8138770 Peschmann Mar 2012 B2
D658294 Awad Apr 2012 S
8189889 Pearlstein May 2012 B2
8204173 Betcke Jun 2012 B2
8223919 Morton Jul 2012 B2
8233588 Gibson Jul 2012 B2
8284896 Singh Oct 2012 B2
8311309 Siedenburg Nov 2012 B2
8320523 Zhang Nov 2012 B2
8401270 Eilbert Mar 2013 B2
8428217 Peschmann Apr 2013 B2
8442186 Rothschild May 2013 B2
8478016 Robinson Jul 2013 B2
8503606 Rothschild Aug 2013 B2
8515010 Hurd Aug 2013 B1
8537968 Radley Sep 2013 B2
8559592 Betcke Oct 2013 B2
8633823 Armistead, Jr. Jan 2014 B2
8674706 Peschmann Mar 2014 B2
8724774 Langeveld May 2014 B2
8750454 Gozani Jun 2014 B2
8774357 Morton Jul 2014 B2
8774362 Hughes Jul 2014 B2
8781066 Gudmundson Jul 2014 B2
8804899 Morton Aug 2014 B2
8831331 Gudmundson Sep 2014 B2
8842808 Rothschild Sep 2014 B2
8861684 Al-Kofahi Oct 2014 B2
8867816 Bouchard Oct 2014 B2
8879791 Drouin Nov 2014 B2
8885794 Morton Nov 2014 B2
8903046 Morton Dec 2014 B2
8958526 Morton Feb 2015 B2
9042511 Peschmann May 2015 B2
9099279 Rommel Aug 2015 B2
9111331 Parikh Aug 2015 B2
9113839 Morton Aug 2015 B2
9170212 Bouchard Oct 2015 B2
9183647 Morton Nov 2015 B2
9189846 Wismuller Nov 2015 B2
9194975 Drouin Nov 2015 B2
9196082 Pearlstein Nov 2015 B2
9268058 Peschmann Feb 2016 B2
9311277 Rinkel Apr 2016 B2
9404875 Langeveld Aug 2016 B2
9417060 Schubert Aug 2016 B1
9466456 Rommel Oct 2016 B2
9535019 Rothschild Jan 2017 B1
9632205 Morton Apr 2017 B2
9681851 Rohler Jun 2017 B2
9733385 Franco Aug 2017 B2
9746431 Grader Aug 2017 B2
9747705 Morton Aug 2017 B2
9772426 Armistead, Jr. Sep 2017 B2
9823383 Hanley Nov 2017 B2
9880314 Pfander Jan 2018 B2
9989508 Awad Jun 2018 B2
9996890 Cinnamon Jun 2018 B1
10089956 Awad et al. Oct 2018 B2
10168445 Morton Jan 2019 B2
10180483 Holdsworth Jan 2019 B2
10210631 Cinnamon Feb 2019 B1
10254436 Awad Apr 2019 B2
10295483 Morton May 2019 B2
10302807 Yu May 2019 B2
10366293 Faviero Jul 2019 B1
10386532 Morton Aug 2019 B2
10408967 Morton Sep 2019 B2
10452959 Gautam Oct 2019 B1
10453223 Cinnamon Oct 2019 B2
10504261 Cinnamon Dec 2019 B2
10510319 Awad Dec 2019 B2
10555716 Rohler Feb 2020 B2
10557911 Holdsworth Feb 2020 B2
10572963 Cinnamon Feb 2020 B1
10598812 Franco Mar 2020 B2
10650783 Awad May 2020 B2
10706335 Gautam Jul 2020 B2
10768338 Yu Sep 2020 B2
10782440 Hanley Sep 2020 B2
10795047 St-Aubin Oct 2020 B2
10795048 St-Aubin Oct 2020 B2
10795049 St-Aubin Oct 2020 B2
10809414 St-Aubin Oct 2020 B2
10901113 Morton Jan 2021 B2
10901114 St-Aubin Jan 2021 B2
11010605 Nord May 2021 B2
11073486 Siegrist Jul 2021 B2
11116471 Rohler Sep 2021 B2
11263499 Gautam Mar 2022 B2
11275194 Morton Mar 2022 B2
11276213 Cinnamon Mar 2022 B2
11287391 Yu Mar 2022 B2
11307325 Morton Apr 2022 B2
11423592 Cinnamon Aug 2022 B2
11478214 Siewerdsen Oct 2022 B2
11561320 Morton Jan 2023 B2
11790575 Cinnamon Oct 2023 B2
11822041 Morton Nov 2023 B2
20010014137 Bjorkholm Aug 2001 A1
20010022346 Katagami Sep 2001 A1
20020031202 Callerame Mar 2002 A1
20020094064 Zhou Jul 2002 A1
20020176531 McClelland Nov 2002 A1
20030031352 Nelson Feb 2003 A1
20030085348 Megerle May 2003 A1
20040091078 Ambrefe May 2004 A1
20040120454 Ellenbogen Jun 2004 A1
20040141584 Bernardi Jul 2004 A1
20040179643 Gregerson Sep 2004 A1
20040213378 Zhou Oct 2004 A1
20040252807 Skatter Dec 2004 A1
20040258305 Burnham Dec 2004 A1
20050008126 Juh Jan 2005 A1
20050025280 Schulte Feb 2005 A1
20050031075 Hopkins Feb 2005 A1
20050053189 Gohno Mar 2005 A1
20050058242 Peschmann Mar 2005 A1
20050105682 Heumann May 2005 A1
20050111610 De Man May 2005 A1
20050117700 Peschmann Jun 2005 A1
20050157925 Lorenz Jul 2005 A1
20050180542 Leue Aug 2005 A1
20050281390 Johnson Dec 2005 A1
20060018428 Li Jan 2006 A1
20060098866 Whitson May 2006 A1
20060113163 Hu Jun 2006 A1
20060273259 Li Dec 2006 A1
20070003003 Seppi Jan 2007 A1
20070003009 Gray Jan 2007 A1
20070096030 Li May 2007 A1
20070110215 Hu May 2007 A1
20070116177 Chen May 2007 A1
20070132580 Ambrefe, Jr. Jun 2007 A1
20070133740 Kang Jun 2007 A1
20070133742 Gatten Jun 2007 A1
20070172129 Tortora Jul 2007 A1
20070183568 Kang Aug 2007 A1
20070235652 Smith Oct 2007 A1
20070280502 Paresi Dec 2007 A1
20080025470 Streyl Jan 2008 A1
20080063140 Awad Mar 2008 A1
20080232668 Kitamura Sep 2008 A1
20090010386 Peschmann Jan 2009 A1
20090060135 Morton Mar 2009 A1
20090196396 Doyle Aug 2009 A1
20090285353 Ellenbogen Nov 2009 A1
20100002834 Gudmundson Jan 2010 A1
20100027741 Doyle Feb 2010 A1
20100086185 Weiss Apr 2010 A1
20100098218 Vermilyea Apr 2010 A1
20100207741 Gudmundson Aug 2010 A1
20100208972 Bouchard Aug 2010 A1
20100223016 Gibson Sep 2010 A1
20100277312 Edic Nov 2010 A1
20100295689 Armistead, Jr. Nov 2010 A1
20100302034 Clements Dec 2010 A1
20110007870 Roux Jan 2011 A1
20110019797 Morton Jan 2011 A1
20110033118 Yildiz Feb 2011 A1
20110172972 Gudmundson Jul 2011 A1
20110228896 Peschmann Sep 2011 A1
20110235777 Gozani Sep 2011 A1
20120069964 Scholling Mar 2012 A1
20120093367 Gudmundson Apr 2012 A1
20120140879 Gudmundson Jun 2012 A1
20120230463 Morton Sep 2012 A1
20120275646 Drouin Nov 2012 A1
20130034268 Perron Feb 2013 A1
20130085788 Rowlan Apr 2013 A1
20130114788 Crass May 2013 A1
20130163811 Oelke Jun 2013 A1
20130251098 Morton Sep 2013 A1
20130292574 Levene Nov 2013 A1
20130294574 Peschmann Nov 2013 A1
20130301794 Grader Nov 2013 A1
20130336447 Morton Dec 2013 A1
20140072108 Rohler Mar 2014 A1
20140185923 Chen Jul 2014 A1
20140205059 Sharpless Jul 2014 A1
20140211917 Chen Jul 2014 A1
20140211980 Bouchard Jul 2014 A1
20140222385 Muenster Aug 2014 A1
20140241495 Gudmundson Aug 2014 A1
20140249536 Jajeh Sep 2014 A1
20150021342 Crass Jan 2015 A1
20150186732 Perron Jul 2015 A1
20150268016 Eshetu Sep 2015 A1
20150282781 Rohler Oct 2015 A1
20150355117 Morton Dec 2015 A1
20160025888 Peschmann Jan 2016 A1
20160252647 Awad Sep 2016 A1
20160260412 Awad Sep 2016 A1
20170103513 Heilmann Apr 2017 A1
20170184737 Dujmic Jun 2017 A1
20170184756 Miao Jun 2017 A1
20170236232 Morton Aug 2017 A1
20170242148 Yu Aug 2017 A1
20170309043 Li Oct 2017 A1
20170319169 Rohler Nov 2017 A1
20170328844 Li Nov 2017 A1
20170371010 Shanbhag Dec 2017 A1
20180106733 Li Apr 2018 A1
20180162584 Tauber Jun 2018 A1
20190003989 Miyazaki Jan 2019 A1
20190219729 St-Aubin Jul 2019 A1
20190346379 Awad Nov 2019 A1
20190346381 Awad Nov 2019 A1
20200085404 Siewerdsen Mar 2020 A1
20200103548 Yu Apr 2020 A1
20200110043 Marín Apr 2020 A1
20200146648 Rohler May 2020 A1
20200158909 Morton May 2020 A1
20200211186 Gong Jul 2020 A1
20200249179 Yamakawa Aug 2020 A1
20200348247 Bur Nov 2020 A1
20200355631 Yu Nov 2020 A1
20210004994 Kubo Jan 2021 A1
20210361254 Rohler Nov 2021 A1
20210381991 Desjeans-Gauthier Dec 2021 A1
20220291148 Gill Sep 2022 A1
Foreign Referenced Citations (58)
Number Date Country
1301371 May 1992 CA
2163884 Dec 1994 CA
2574402 Jan 2006 CA
2744690 Jun 2009 CA
2692662 Mar 2010 CA
2697525 Mar 2010 CA
2709468 Mar 2010 CA
2690163 Aug 2011 CA
2869201 Oct 2013 CA
102175698 Sep 2011 CN
103327901 Sep 2013 CN
104165896 Nov 2014 CN
108937992 Dec 2018 CN
116359257 Jun 2023 CN
2729353 Jan 1979 DE
0432568 Jun 1991 EP
0531993 Mar 1993 EP
0584871 Mar 1994 EP
0924742 Jun 1999 EP
0930046 Jul 1999 EP
1277439 Jan 2003 EP
1374776 Jan 2004 EP
2328280 May 1977 FR
3037401 Dec 2016 FR
1497396 Jan 1978 GB
1526041 Sep 1978 GB
2015245 Sep 1979 GB
2089109 Jun 1982 GB
2212903 Aug 1989 GB
2299251 Sep 1996 GB
2356453 May 2001 GB
2437777 Nov 2007 GB
S57175247 Oct 1982 JP
600015546 Jan 1985 JP
600021440 Feb 1985 JP
H0479128 Mar 1992 JP
H10211196 Aug 1998 JP
2001176408 Jun 2001 JP
2004079128 Mar 2004 JP
3946612 Jul 2007 JP
1022236 Jun 1983 SU
9423458 Oct 1994 WO
9528715 Oct 1995 WO
9960387 Nov 1999 WO
03051201 Jun 2003 WO
03105159 Dec 2003 WO
2004097889 Nov 2004 WO
2004111625 Dec 2004 WO
2005084351 Sep 2005 WO
2006135586 Dec 2006 WO
2006137919 Dec 2006 WO
2008133765 Nov 2008 WO
2008139167 Nov 2008 WO
2008157843 Dec 2008 WO
2009114928 Sep 2009 WO
2010025538 Mar 2010 WO
2013149788 Oct 2013 WO
2018121444 Jul 2018 WO
Non-Patent Literature Citations (25)
Entry
International Search Report and Written Opinion for International Application No. PCT/CA2020/051239, dated Dec. 16, 2020, (17 pages).
International Search Report for International Application No. PCT/CA2019/051489, dated Dec. 30, 2019, (4 pages).
International Search Report for International Application No. PCT/CA2013/050744, dated Jun. 10, 2014, (5 pages).
K. Wells; D. A. Bradley, "A review of X-ray explosives detection techniques for checked baggage," Applied Radiation and Isotopes, Elsevier, Oxford, GB (Jan. 12, 2012), vol. 70, No. 8, doi:10.1016/j.apradiso.2012.01.011, ISSN 0969-8043, pp. 1729-1746, XP028401820.
Richard D. R. Macdonald, "Design and implementation of a dual-energy x-ray imaging system for organic material detection in an airport security application," Proceedings of SPIE, SPIE (Apr. 4, 2001), vol. 4301, doi:10.1117/12.420922, ISSN 0277786X, pp. 31-41, XP055104503.
International Search Report for corresponding International Patent Application No. PCT/CA2014/050981, dated Jan. 5, 2015, (6 pages).
International Search Report for corresponding International Patent Application No. PCT/CA2014/051074 dated Jan. 20, 2015.
International Search Report & Written Opinion for PCT/CA2019/050616, dated Jul. 5, 2019, (15 pages).
International Search Report and Written Opinion for International Application No. PCT/CA2018/051673, dated Mar. 14, 2019, (8 pages).
International Search Report and Written Opinion for International Application No. PCT/CA2019/050617, dated Jul. 30, 2019, (11 pages).
International Search Report and Written Opinion for International Application No. PCT/CA2018/051674, dated Mar. 29, 2019, (8 pages).
International Search Report and Written Opinion for International Application No. PCT/CA2018/051675, dated Mar. 21, 2019, (11 pages).
International Search Report and Written Opinion for International Application No. PCT/CA2018/051676, dated Mar. 26, 2019, (7 pages).
Hurd et al. (U.S. Pat. No. 8,515,010, hereafter "Hurd"); Ying et al., "Dual Energy Volumetric X-ray Tomographic Sensor for Luggage Screening," IEEE SAS, Feb. 2007.
International Search Report and Written Opinion for International Application No. PCT/CA2018/051677, dated Mar. 29, 2019, (8 pages).
Lehmann et al., "Generalized image combinations in dual KVP digital radiography," Medical Physics, vol. 8, No. 5, Sep. 1981, pp. 659-667, American Association of Physicists in Medicine.
Bond et al., "ZeCalc Algorithm Details," Lawrence Livermore National Laboratory, Livermore, U.S.A., Jan. 7, 2013.
Hassanpour et al., "Illicit Material Detection using Dual-Energy X-ray Images," The International Arab Journal of Information Technology, vol. 13, No. 4, Jul. 2016, p. 8.
International Search Report for PCT/GB2004/001729, Aug. 12, 2004.
International Search Report for PCT/GB2004/001741, Mar. 3, 2005.
International Search Report for PCT/GB2004/001731, May 27, 2005.
International Search Report for PCT/GB2004/001732, Feb. 25, 2005.
International Search Report for PCT/GB2004/001751, Mar. 21, 2005.
International Search Report for PCT/GB2004/001747, Aug. 10, 2004, CXR Ltd.
Hori, K.; Fujimoto, T.; Kawanishi, K., "Development of ultra-fast X-ray computed tomography scanner system," in 1997 IEEE Nuclear Science Symposium, Conference Record (Cat. No. 97CH36135), O. Nalcioglu, Ed., vol. 2, 1997, pp. 1003-1008, ISBN 0-7803-4258-5.
Related Publications (1)
Number Date Country
20220323030 A1 Oct 2022 US
Provisional Applications (1)
Number Date Country
62900713 Sep 2019 US