Raw material identification may be utilized for quality control of material products (e.g., pharmaceutical products, medical products, or food products, among other examples). For example, raw material identification may be performed on a material to determine whether component ingredients of the material correspond to a packaging label associated with the material. Spectroscopy may facilitate non-destructive raw material identification with reduced preparation and data acquisition time relative to other analytical techniques.
Some implementations described herein relate to a method. The method may include obtaining, by a device, a spectroscopic measurement associated with a sample. The method may include generating, by the device and based on the spectroscopic measurement and a global classification model, a local classification model that includes a plurality of classes. The method may include identifying, by the device and based on the spectroscopic measurement, a particular class of the plurality of classes of the local classification model. The method may include identifying, by the device, a prediction threshold associated with the particular class. The method may include classifying, by the device and based on the particular class and the prediction threshold, the spectroscopic measurement. The method may include providing, by the device and based on classifying the spectroscopic measurement, information indicating whether the sample belongs to the particular class.
Some implementations described herein relate to a device. The device may include one or more memories and one or more processors coupled to the one or more memories. The one or more processors may be configured to obtain a spectroscopic measurement associated with a sample. The one or more processors may be configured to identify, based on the spectroscopic measurement, a particular class of a plurality of classes of a local classification model. The one or more processors may be configured to identify a prediction threshold associated with the particular class. The one or more processors may be configured to classify, based on the particular class and the prediction threshold, the spectroscopic measurement. The one or more processors may be configured to provide information indicating whether the sample belongs to the particular class based on classifying the spectroscopic measurement.
Some implementations described herein relate to a non-transitory computer-readable medium that stores a set of instructions for a device. The set of instructions, when executed by one or more processors of the device, may cause the device to obtain a spectroscopic measurement associated with a sample. The set of instructions, when executed by one or more processors of the device, may cause the device to identify, based on the spectroscopic measurement, a particular class of a plurality of classes of a classification model. The set of instructions, when executed by one or more processors of the device, may cause the device to identify a prediction threshold associated with the particular class. The set of instructions, when executed by one or more processors of the device, may cause the device to classify, based on the particular class and the prediction threshold, the spectroscopic measurement. The set of instructions, when executed by one or more processors of the device, may cause the device to provide information indicating whether the sample belongs to the particular class based on classifying the spectroscopic measurement.
The following detailed description of example implementations refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements. The following description uses a spectrometer as an example. However, the techniques, principles, procedures, and methods described herein may be used with any sensor, including but not limited to other optical sensors and spectral sensors.
Raw material identification (RMID) is a technique utilized to identify components (e.g., ingredients) of a particular sample for identification and/or verification. For example, RMID may be utilized to verify that ingredients in a pharmaceutical material correspond to a set of ingredients identified on a label. In some cases, a spectrometer may be utilized to perform spectroscopy on a sample (e.g., of the pharmaceutical material) to determine components of the sample. The spectrometer may determine a set of measurements of the sample and may provide the set of measurements for a spectroscopic determination. A spectroscopic classification technique (e.g., a classifier) may facilitate determination of the components of the sample based on the set of measurements of the sample.
However, some unknown samples (e.g., samples that have not been identified) that are to be subjected to a spectroscopic classification may not be included in the classes that a classification model is configured to classify. For example, for a classification model trained to distinguish between types of fish, a user may inadvertently provide beef for classification. In this case, a control device may perform a spectroscopic classification of the beef and may provide an inaccurate, false positive identification of the beef as a particular type of fish.
As another example, a classification model may be trained to classify types of sugar (e.g., glucose, fructose, galactose, and/or the like) and quantify respective concentrations of each type of sugar in unknown samples. However, a user of a spectrometer and a control device may inadvertently attempt to classify an unknown sample of sugar based on incorrectly using the spectrometer to perform a measurement. For example, the user may operate the spectrometer at an incorrect distance from the unknown sample, under environmental conditions different from the calibration conditions under which the spectrometer was operated to train the classification model, and/or the like. In this case, the control device may receive an inaccurate spectrum for the unknown sample, resulting in a false positive identification of the unknown sample as a first type of sugar at a first concentration, when the unknown sample is actually a second type of sugar at a second concentration.
Some implementations described herein utilize prediction thresholds to facilitate spectroscopic classification and therefore reduce false positive identifications. For example, a control device that receives a spectroscopic measurement of an unknown sample may generate a local classification model, based on the spectroscopic measurement, and a global classification model. The control device may identify a particular class of a plurality of classes of the local classification model (e.g., a class of which the unknown sample is most likely to belong) and may identify a prediction threshold associated with the particular class. The prediction threshold may be used to define a tolerance area around a boundary of the particular class. In this way, when the spectroscopic measurement for the unknown sample satisfies the prediction threshold of the particular class (e.g., within the tolerance area of the boundary of the particular class), the control device may determine that the unknown sample belongs to the particular class. Alternatively, when the spectroscopic measurement for the unknown sample is not within the prediction threshold of the particular class (e.g., not within the tolerance area of the boundary of the particular class), the control device may determine that the unknown sample does not belong to the particular class (and/or to any other class of the local classification model).
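The tolerance-area check described above reduces to a simple comparison between a sample's decision value for the particular class and that class's prediction threshold. A minimal illustrative sketch follows; the function name and the numeric values are hypothetical, not prescribed by the implementations described herein.

```python
def belongs_to_class(decision_value, prediction_threshold):
    # The sample is assigned to the particular class only when its decision
    # value satisfies (here: meets or exceeds) the class's prediction
    # threshold, i.e., falls within the tolerance area around the class
    # boundary; otherwise a classification failure is reported.
    return decision_value >= prediction_threshold

print(belongs_to_class(0.4, -0.1))   # True: within the tolerance area
print(belongs_to_class(-0.5, -0.1))  # False: classification failure
```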
In this way, an accuracy of spectroscopic classification is improved relative to classification performed without use of a local classification model and/or a prediction threshold. This reduces a likelihood of reporting a false positive identification of the unknown sample, which reduces a use of computing resources (e.g., processing resources, memory resources, communication resources, and/or power resources, among other examples) to recheck and/or verify identification of the unknown sample and/or to address issues that arise from false positive identification of the unknown sample.
As further shown in
The training set and validation set 108 may be selected to include a threshold quantity of samples for each class of the global classification model. A “class” of the global classification model may refer to a grouping of similar materials, such as (in a pharmaceutical context) lactose materials, fructose materials, acetaminophen materials, ibuprofen materials, and/or aspirin materials, among other examples. Materials used to train the global classification model, and for which raw material identification is to be performed using the global classification model, may be termed materials of interest.
As shown by reference number 110, the spectrometer 104 may perform the set of spectroscopic measurements on the training set and validation set 108 (e.g., based on receiving the instruction from the control device 102). For example, the spectrometer 104 may determine a spectrum for each sample of the training set and validation set 108 to enable the control device 102 to generate a set of classes for classifying an unknown sample as one of the materials of interest for the global classification model.
As shown by reference number 112, the spectrometer 104 may provide the set of spectroscopic measurements to the control device 102. For example, the control device 102 may receive a first set of spectra for the training samples and a second set of spectra for the validation samples. The control device 102 may store information identifying each sample of the training set and validation set 108.
As shown by reference number 114, the control device 102 may generate a global classification model based on the set of spectroscopic measurements. For example, the control device 102 may utilize the set of spectroscopic measurements to associate classes of spectra with types or concentrations of materials. In some implementations, the control device 102 may train the global classification model when generating the global classification model. For example, the control device 102 may cause the global classification model to be trained using the first set of spectra (e.g., measurements associated with the training samples). Additionally, or alternatively, the control device 102 may perform an assessment of the global classification model. For example, the control device 102 may validate the global classification model (e.g., for predictive strength) utilizing the second set of spectra (e.g., measurements associated with the validation samples).
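The train-then-validate flow described above may be sketched, for illustration only, in Python using scikit-learn's SVC (an assumed toolchain; the implementations described herein are not limited to any particular library). The spectra, class labels, and peak positions below are hypothetical placeholders for measurements of materials of interest.

```python
# Sketch of generating (training) and assessing (validating) a global
# classification model from spectroscopic measurements. All data here is
# synthetic; real training and validation sets would come from the
# spectrometer's measurements of known samples.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
x_axis = np.arange(128)  # hypothetical wavelength channels

def synth_spectra(peak_position, n_samples):
    # Hypothetical spectra: a class-specific absorption peak plus noise.
    peak = np.exp(-0.5 * ((x_axis - peak_position) / 4.0) ** 2)
    return peak + rng.normal(scale=0.05, size=(n_samples, 128))

# Hypothetical training and validation sets for three materials of interest.
train_spectra = np.vstack([synth_spectra(p, 20) for p in (30, 64, 98)])
train_labels = np.repeat(["lactose", "fructose", "acetaminophen"], 20)
valid_spectra = np.vstack([synth_spectra(p, 5) for p in (30, 64, 98)])
valid_labels = np.repeat(["lactose", "fructose", "acetaminophen"], 5)

# Train the global classification model using the first set of spectra.
global_model = SVC(kernel="rbf")
global_model.fit(train_spectra, train_labels)

# Validate predictive strength utilizing the second set of spectra.
accuracy = global_model.score(valid_spectra, valid_labels)
print(f"validation accuracy: {accuracy:.2f}")
```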
In some implementations, the control device 102 may validate the global classification model using a multi-stage determination technique. For example, for in-situ local modeling based classification, the control device 102 may determine that the global classification model is accurate when utilized in association with a local classification model. In this way, the control device 102 ensures that the global classification model is generated with a threshold accuracy prior to providing the global classification model for utilization, such as by the control device 102 or another control device 102 associated with another spectrometer 104.
In some implementations, the control device 102 may generate the global classification model using a particular determination technique and based on the set of spectroscopic measurements. For example, the control device 102 may generate the global classification model using a support vector machine (SVM) technique (e.g., a machine learning technique for information determination). “SVM” may refer to a supervised learning model that performs pattern recognition and uses confidence metrics for classification.
In some implementations, the control device 102 may utilize a particular type of kernel function to determine a similarity of two or more inputs (e.g., spectra) when generating the global classification model using the SVM technique. For example, the control device 102 may utilize a radial basis function (RBF) type of kernel function (e.g., termed SVM-rbf), which may be represented as k(x,y)=exp(−∥x−y∥^2) for spectra x and y; a linear function type of kernel function (e.g., termed SVM-linear, and termed hier-SVM-linear when utilized for a multi-stage determination technique), which may be represented as k(x,y)=x·y; a sigmoid function type of kernel function (e.g., termed SVM-sigmoid); a polynomial function type of kernel function (e.g., termed SVM-polynomial); and/or an exponential function type of kernel function (e.g., termed SVM-exponential); among other examples.
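The kernel functions named above have standard closed forms. A minimal numeric sketch follows, using NumPy (an assumption about tooling); the gamma, coef0, and degree values are illustrative defaults rather than values prescribed by the implementations described herein.

```python
# Illustrative sketches of common SVM kernel functions for two spectra x
# and y represented as vectors. With gamma=1, rbf_kernel matches the form
# k(x,y)=exp(-||x-y||^2) given above.
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2)
    return np.exp(-gamma * np.sum((x - y) ** 2))

def linear_kernel(x, y):
    # k(x, y) = x . y
    return np.dot(x, y)

def sigmoid_kernel(x, y, gamma=1.0, coef0=0.0):
    # k(x, y) = tanh(gamma * x . y + coef0)
    return np.tanh(gamma * np.dot(x, y) + coef0)

def polynomial_kernel(x, y, gamma=1.0, coef0=0.0, degree=3):
    # k(x, y) = (gamma * x . y + coef0)^degree
    return (gamma * np.dot(x, y) + coef0) ** degree

x = np.array([1.0, 0.0])
y = np.array([1.0, 0.0])
print(rbf_kernel(x, y))     # identical spectra give similarity 1.0
print(linear_kernel(x, y))  # 1.0
```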
In some implementations, the control device 102 may utilize a particular type of confidence metric for SVM, such as a probability value (e.g., determination based on determining, using a probability value technique, a probability that a sample is a member of a class of a set of classes) and/or a decision value (e.g., determination utilizing a decision function based on pattern similarities to vote for a class, of a set of classes, as being the class of which the sample is a member). For example, during use of the global classification model with a decision value based SVM, the control device 102 may determine whether an unknown sample is located within a boundary of a constituent class based on a plotting of a spectrum of the unknown sample, and may assign the sample to a class based on whether the unknown sample is located within the boundary of the constituent class. In this way, the control device 102 may determine whether to assign an unknown spectrum to a particular class.
In some implementations, the control device 102 may utilize a particular class comparison technique for determining decision values. For example, the control device 102 may utilize a one-versus-all decision value technique (sometimes termed a one-versus-all-others decision value technique), where the global classification model is divided into a group of sub-models with each sub-model being compared to all other sub-models, and the decision values being determined based on the comparison for each sub-model. Additionally, or alternatively, the control device 102 may utilize an all-pairs decision value technique, where the global classification model is divided into each possible pair of classes to form sub-models from which to determine decision values.
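The distinction between probability values, one-versus-all decision values, and all-pairs decision values can be illustrated with scikit-learn's SVC, which exposes the two class comparison techniques through its decision_function_shape options (an assumption about tooling; the data below is hypothetical and stands in for spectra of three materials).

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Hypothetical three-class data standing in for spectra of three materials.
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(20, 4)) for c in (0, 2, 4)])
y = np.repeat([0, 1, 2], 20)

# One-versus-all decision values: one value per class, each comparing a
# sub-model for that class against all other classes.
ova = SVC(kernel="rbf", decision_function_shape="ovr").fit(X, y)
print(ova.decision_function(X[:1]).shape)  # (1, 3): one value per class

# All-pairs decision values: one value per pair of classes.
ovo = SVC(kernel="linear", decision_function_shape="ovo").fit(X, y)
print(ovo.decision_function(X[:1]).shape)  # (1, 3): 3 classes -> 3 pairs

# Probability values: a probability that the sample is a member of each
# class, which sums to 1 across the set of classes.
prob = SVC(kernel="rbf", probability=True, random_state=0).fit(X, y)
print(np.isclose(prob.predict_proba(X[:1]).sum(), 1.0))  # True
```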
As shown in
As further shown in
As shown in
In some implementations, to identify the particular class, the control device 102 may determine a set of decision values associated with each class of the plurality of classes of the local classification model. For example, the control device 102 may determine the set of decision values using an SVM-rbf kernel function that utilizes a one-versus-all decision value technique. Further, the control device 102 may determine whether a threshold amount of decision values of a particular set of decision values (e.g., a majority of the decision values of the particular set of decision values, all of the decision values of the particular set of decision values, or other examples) are each less than a particular decision value threshold (e.g., zero, or another value). When the control device 102 determines that the threshold amount of the decision values of the particular set of decision values are each less than the particular decision value threshold, the control device 102 may determine a set of probability values associated with each class of the plurality of classes of the local classification model and may determine the particular class based on the set of probability values. For example, the control device 102 may determine the set of probability values using an SVM-rbf kernel function that utilizes a probability value technique and may select a class associated with a greatest probability value as the particular class. Alternatively, when the control device 102 determines that the threshold amount of the decision values of the particular set of decision values are each greater than or equal to the particular decision value threshold, the control device 102 may determine an additional set of decision values associated with each class of the plurality of classes of the local classification model and determine the particular class based on the additional set of decision values. 
For example, the control device 102 may determine the additional set of decision values using an SVM-linear kernel function that utilizes an all-pairs decision value technique and may select a class associated with a greatest decision value as the particular class.
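The multi-stage class identification described above may be sketched as follows. The function name is hypothetical, the threshold amount is taken to be "all of the decision values," and the decision value threshold is taken to be zero — both are only examples of the values the implementations permit.

```python
import numpy as np

def identify_class(ovr_decision_values, probability_values,
                   all_pairs_scores, decision_threshold=0.0):
    """Illustrative sketch of the multi-stage identification.

    ovr_decision_values: one-versus-all decision value per class.
    probability_values: probability value per class.
    all_pairs_scores: per-class score from an all-pairs decision technique.
    """
    values = np.asarray(ovr_decision_values)
    # When the threshold amount (here: all) of one-versus-all decision
    # values fall below the decision value threshold, fall back to the
    # probability value technique and pick the greatest probability.
    if np.all(values < decision_threshold):
        return int(np.argmax(probability_values))
    # Otherwise select the class with the greatest all-pairs decision value.
    return int(np.argmax(all_pairs_scores))

# Hypothetical values for a three-class local classification model.
print(identify_class([-1.2, -0.8, -0.4], [0.2, 0.7, 0.1], [1, 2, 0]))  # 1
print(identify_class([0.5, -0.8, -0.4], [0.2, 0.7, 0.1], [2, 0, 1]))   # 0
```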
As shown in
In some implementations, to identify the prediction threshold associated with the particular class, the control device 102 may determine a first set of decision values associated with the particular class, may determine a second set of decision values associated with the particular class, and/or may determine a third set of decision values associated with at least one other class of the plurality of classes of the local classification model (e.g., that does not include the particular class). For example, the control device 102 may determine the first set of decision values associated with the particular class using an SVM-rbf kernel function that utilizes a self-prediction technique. As another example, the control device 102 may determine the second set of decision values associated with the particular class using an SVM-rbf kernel function that utilizes a cross-validation technique (e.g., a leave-one-out cross-validation technique). In an additional example, the control device 102 may determine the third set of decision values associated with the at least one other class using an SVM-rbf kernel function that utilizes a binary classification technique (e.g., a one-versus-all technique, such as with the particular class as the “one” in one-versus-all). The control device 102 may determine the prediction threshold associated with the particular class based on at least two of the first set of decision values, the second set of decision values, or the third set of decision values.
For example, the control device 102 may determine that a first amount of decision values of the first set of decision values are less than a particular decision value threshold (e.g., zero, or another value) and that a second amount of decision values of the second set of decision values are less than the particular decision value threshold. The control device 102 may determine that the first amount (e.g., a first percentage) and the second amount (e.g., a second percentage) are each greater than or equal to an amount threshold (e.g., a percentage threshold, such as 50%, 70%, 95%, or 100%, among other examples). Accordingly, the control device 102 may determine the prediction threshold using a first prediction scheme. For example, when the control device 102 uses the first prediction scheme, the control device 102 may determine the prediction threshold based on a minimum decision value of the second set of decision values and/or a maximum decision value of the third set of decision values.
As an alternative example (e.g., after determining that at least one of the first amount and the second amount is less than the amount threshold), the control device 102 may determine a first minimum decision value of the first set of decision values and/or a second minimum decision value of the second set of decision values. The control device 102 may determine whether at least one of the first minimum decision value or the second minimum decision value is greater than or equal to a particular decision value threshold (e.g., zero, or another value). When the control device 102 determines that at least one of the first minimum decision value or the second minimum decision value is greater than or equal to the particular decision value threshold, the control device 102 may determine the prediction threshold using the first prediction scheme. Alternatively, when the control device 102 determines that both of the first minimum decision value and the second minimum decision value are less than the particular decision value threshold, the control device 102 may determine the prediction threshold using a second prediction scheme. For example, when the control device 102 uses the second prediction scheme, the control device 102 may determine the prediction threshold based on the first minimum decision value and/or the second minimum decision value.
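The selection between the two prediction schemes described above may be sketched as follows. The function name, the 95% amount threshold, the zero decision value threshold, and the particular combinations used inside each scheme are illustrative assumptions; the implementations permit other amounts, thresholds, and combinations.

```python
import numpy as np

def prediction_threshold(self_pred, cross_val, other_class,
                         amount_threshold=0.95, dv_threshold=0.0):
    """Illustrative sketch of prediction-threshold scheme selection.

    self_pred: decision values for the particular class (self-prediction).
    cross_val: decision values for the particular class (cross-validation).
    other_class: decision values for at least one other class.
    """
    self_pred = np.asarray(self_pred, dtype=float)
    cross_val = np.asarray(cross_val, dtype=float)
    other_class = np.asarray(other_class, dtype=float)

    def first_scheme():
        # Based on the minimum cross-validation decision value and/or the
        # maximum other-class decision value (one illustrative combination).
        return min(cross_val.min(), other_class.max())

    def second_scheme():
        # Based on the minimum self-prediction and/or minimum
        # cross-validation decision values.
        return min(self_pred.min(), cross_val.min())

    # Amounts of decision values below the decision value threshold.
    frac_self = np.mean(self_pred < dv_threshold)
    frac_cross = np.mean(cross_val < dv_threshold)
    if frac_self >= amount_threshold and frac_cross >= amount_threshold:
        return first_scheme()
    # Otherwise, use the first scheme only if at least one minimum decision
    # value meets the decision value threshold; else use the second scheme.
    if max(self_pred.min(), cross_val.min()) >= dv_threshold:
        return first_scheme()
    return second_scheme()

# Hypothetical decision values: both amounts exceed 95% -> first scheme.
print(prediction_threshold([-1.0, -2.0, -3.0], [-1.5, -2.5], [-4.0, -5.0]))
# Both minima below zero and amounts below 95% -> second scheme.
print(prediction_threshold([0.5, -0.2], [-0.1, 0.3], [-1.0]))
```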
As shown in
As shown by reference number 136, the control device 102 may provide information relating to identification of the unknown sample 120. For example, based on classifying the spectroscopic measurement as part of the particular class and/or determining that the unknown sample 120 belongs to the particular class, the control device 102 may provide (e.g., to a client device 140) information indicating that the spectroscopic measurement associated with the unknown sample 120 is determined to be associated with the particular class, thereby identifying the unknown sample. As another example, based on not classifying the spectroscopic measurement as part of the particular class and/or determining that the unknown sample 120 does not belong to the particular class, the control device 102 may provide (e.g., to the client device 140) information indicating a classification failure, thereby reducing a likelihood of a false-positive determination. In some implementations, the control device 102 may provide the information to cause the information to be displayed (e.g., as an alert, as a message, and/or as part of an informational dashboard, among other examples) on a display (e.g., of the client device 140).
In this way, the control device 102 enables spectroscopy for an unknown sample 120 with improved accuracy relative to other classification models based on reducing a likelihood of reporting a false positive identification of the unknown sample 120 as being a material of interest.
As indicated above,
The control device 102 includes one or more devices capable of storing, processing, and/or routing information associated with identifying an unknown sample based on a spectroscopic measurement. For example, the control device 102 may include a server, a computer (e.g., a desktop computer, a laptop computer, or a tablet computer), a wearable device, a cloud computing device in a cloud computing environment, a mobile device, a smart phone, or the like that generates a classification model using a particular classifier and based on a set of spectroscopic measurements of a training set, and/or utilizes the classification model to identify an unknown sample. In some implementations, multiple control devices 102 may utilize a common classification model. For example, a first control device 102 may generate the classification model and provide the classification model to a second control device 102, which may use the classification model to identify an unknown sample (e.g., at a restaurant, at a meat packaging plant, or at a pharmacy, among other examples). For example, the control device 102 may utilize an SVM type of classifier with a linear kernel, an rbf kernel, or another kernel. In this case, the control device 102 may perform a classification based on a confidence measure technique, a decision value technique, or another technique. In some implementations, the control device 102 may be associated with a particular spectrometer 104. In some implementations, the control device 102 may be associated with multiple spectrometers 104. In some implementations, the control device 102 may receive information from and/or transmit information to another device in environment 200, such as the spectrometer 104 and/or the client device 140.
The spectrometer 104 includes one or more devices capable of performing a spectroscopic measurement on a sample. For example, the spectrometer 104 may include a spectrometer device that performs spectroscopy (e.g., vibrational spectroscopy, such as near infrared (NIR) spectroscopy, mid-infrared spectroscopy (mid-IR), Raman spectroscopy, or the like). In some implementations, the spectrometer 104 may receive information from and/or transmit information to another device in environment 200, such as the control device 102 and/or the client device 140.
The client device 140 includes one or more devices capable of receiving, generating, storing, processing, and/or providing information associated with identifying an unknown sample based on a spectroscopic measurement, as described elsewhere herein. The client device 140 may include a communication device and/or a computing device. For example, the client device 140 may include a wireless communication device, a mobile phone, a user equipment, a laptop computer, a tablet computer, a desktop computer, or a similar type of device. In some implementations, the client device 140 may receive information from and/or transmit information to another device in environment 200, such as the control device 102 and/or the spectrometer 104.
Network 210 may include one or more wired and/or wireless networks. For example, network 210 may include a cellular network (e.g., a long-term evolution (LTE) network, a 3G network, a code division multiple access (CDMA) network, etc.), a public land mobile network (PLMN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a telephone network (e.g., the Public Switched Telephone Network (PSTN)), a private network, an ad hoc network, an intranet, the Internet, a fiber optic-based network, a cloud computing network, or the like, and/or a combination of these or other types of networks.
The number and arrangement of devices and networks shown in
Bus 310 includes one or more components that enable wired and/or wireless communication among the components of device 300. Bus 310 may couple together two or more components of
Memory 330 includes volatile and/or nonvolatile memory. For example, memory 330 may include random access memory (RAM), read only memory (ROM), a hard disk drive, and/or another type of memory (e.g., a flash memory, a magnetic memory, and/or an optical memory). Memory 330 may include internal memory (e.g., RAM, ROM, or a hard disk drive) and/or removable memory (e.g., removable via a universal serial bus connection). Memory 330 may be a non-transitory computer-readable medium. Memory 330 stores information, instructions, and/or software (e.g., one or more software applications) related to the operation of device 300. In some implementations, memory 330 includes one or more memories that are coupled to one or more processors (e.g., processor 320), such as via bus 310.
Input component 340 enables device 300 to receive input, such as user input and/or sensed input. For example, input component 340 may include a touch screen, a keyboard, a keypad, a mouse, a button, a microphone, a switch, a sensor, a global positioning system sensor, an accelerometer, a gyroscope, and/or an actuator. Output component 350 enables device 300 to provide output, such as via a display, a speaker, and/or a light-emitting diode. Communication component 360 enables device 300 to communicate with other devices via a wired connection and/or a wireless connection. For example, communication component 360 may include a receiver, a transmitter, a transceiver, a modem, a network interface card, and/or an antenna.
Device 300 may perform one or more operations or processes described herein. For example, a non-transitory computer-readable medium (e.g., memory 330) may store a set of instructions (e.g., one or more instructions or code) for execution by processor 320. Processor 320 may execute the set of instructions to perform one or more operations or processes described herein. In some implementations, execution of the set of instructions, by one or more processors 320, causes the one or more processors 320 and/or the device 300 to perform one or more operations or processes described herein. In some implementations, hardwired circuitry may be used instead of or in combination with the instructions to perform one or more operations or processes described herein. Additionally, or alternatively, processor 320 may be configured to perform one or more operations or processes described herein. Thus, implementations described herein are not limited to any specific combination of hardware circuitry and software.
The number and arrangement of components shown in
As shown in
As further shown in
As further shown in
As further shown in
As further shown in
As further shown in
Process 400 may include additional implementations, such as any single implementation or any combination of implementations described below and/or in connection with one or more other processes described elsewhere herein.
In a first implementation, identifying the particular class comprises determining a set of decision values associated with each class of the plurality of classes of the local classification model, determining that a threshold amount of decision values of a particular set of decision values are each less than a particular decision value threshold, determining, based on determining that the threshold amount of the decision values of the particular set of decision values are each less than the particular decision value threshold, a set of probability values associated with each class of the plurality of classes of the local classification model, and determining, based on the set of probability values associated with each class of the plurality of classes of the local classification model, the particular class.
In a second implementation, alone or in combination with the first implementation, the set of decision values associated with each class of the plurality of classes of the local classification model are determined using a support vector machine radial basis function (SVM-rbf) kernel function that utilizes a one-versus-all decision value technique, and the set of probability values associated with each class of the plurality of classes of the local classification model are determined using an SVM-rbf kernel function that utilizes a probability value technique.
In a third implementation, alone or in combination with one or more of the first and second implementations, identifying the particular class comprises determining a first set of decision values associated with each class of the plurality of classes of the local classification model, determining that a threshold amount of decision values of each of the first set of decision values are each greater than or equal to a particular decision value threshold, determining, based on determining that the threshold amount of the decision values of each of the first set of decision values are each greater than or equal to the particular decision value threshold, a second set of decision values associated with each class of the plurality of classes of the local classification model, and determining, based on the second set of decision values associated with each class of the plurality of classes of the local classification model, the particular class.
In a fourth implementation, alone or in combination with one or more of the first through third implementations, the first set of decision values associated with each class of the plurality of classes of the local classification model are determined using an SVM-rbf kernel function that utilizes a one-versus-all decision value technique, and the second set of decision values associated with each class of the plurality of classes of the local classification model are determined using an SVM-linear kernel function that utilizes an all-pairs decision value technique.
In a fifth implementation, alone or in combination with one or more of the first through fourth implementations, identifying the prediction threshold associated with the particular class comprises determining a first set of decision values associated with the particular class, determining a second set of decision values associated with the particular class, determining a third set of decision values associated with at least one other class of the plurality of classes, and determining the prediction threshold associated with the particular class based on at least two of the first set of decision values, the second set of decision values, or the third set of decision values.
In a sixth implementation, alone or in combination with one or more of the first through fifth implementations, the first set of decision values associated with the particular class are determined using an SVM-rbf kernel function that utilizes a self-prediction technique, the second set of decision values associated with the particular class are determined using an SVM-rbf kernel function that utilizes a cross-validation technique, and the third set of decision values associated with the at least one other class are determined using an SVM-rbf kernel function that utilizes a binary classification technique.
In a seventh implementation, alone or in combination with one or more of the first through sixth implementations, determining the prediction threshold associated with the particular class comprises determining that a first amount of decision values of the first set of decision values are less than a particular decision value threshold, determining that a second amount of decision values of the second set of decision values are less than the particular decision value threshold, determining that the first amount and the second amount are each greater than or equal to an amount threshold, and determining the prediction threshold based on at least one of a minimum decision value of the second set of decision values or a maximum decision value of the third set of decision values.
In an eighth implementation, alone or in combination with one or more of the first through seventh implementations, determining the prediction threshold associated with the particular class comprises determining a first minimum decision value of the first set of decision values, determining a second minimum decision value of the second set of decision values, determining that at least one of the first minimum decision value or the second minimum decision value is greater than or equal to a particular decision value threshold, and determining the prediction threshold based on at least one of the second minimum decision value or a maximum decision value of the third set of decision values.
In a ninth implementation, alone or in combination with one or more of the first through eighth implementations, determining the prediction threshold associated with the particular class comprises determining a first minimum decision value of the first set of decision values, determining a second minimum decision value of the second set of decision values, determining that each of the first minimum decision value and the second minimum decision value is less than a particular decision value threshold, and determining the prediction threshold based on at least one of the first minimum decision value or the second minimum decision value.
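The seventh through ninth implementations describe three alternative rules for deriving a per-class prediction threshold from the three sets of decision values (self-prediction, cross-validation, and binary classification against other classes). The sketch below is one hypothetical reading, not the disclosed implementation: the function name is illustrative, the ordering of the three checks is an assumption, and where the text says "at least one of" two candidate values, the sketch commits to one concrete option per branch (noted in comments).

```python
def prediction_threshold(self_pred, cross_val, other_cls,
                         value_threshold, amount_threshold):
    """Sketch of per-class prediction-threshold selection (hypothetical).

    self_pred: self-prediction decision values for the class (first set).
    cross_val: cross-validation decision values for the class (second set).
    other_cls: binary-classification decision values for at least one
        other class (third set).
    """
    low_self = sum(1 for v in self_pred if v < value_threshold)
    low_cv = sum(1 for v in cross_val if v < value_threshold)
    if low_self >= amount_threshold and low_cv >= amount_threshold:
        # Seventh implementation: enough low values in both in-class
        # sets; here we pick the minimum cross-validation value (the
        # text also permits the maximum out-of-class value).
        return min(cross_val)
    if min(self_pred) >= value_threshold or min(cross_val) >= value_threshold:
        # Eighth implementation: well-separated class; here we split
        # the gap between the in-class minimum and the out-of-class
        # maximum (one concrete reading of "at least one of").
        return (min(cross_val) + max(other_cls)) / 2
    # Ninth implementation: both in-class minima fall below the value
    # threshold; fall back to the smaller of the two minima.
    return min(min(self_pred), min(cross_val))
```

Each branch corresponds to one of the three implementations; a real system would presumably select among them based on which condition the measured decision values satisfy.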
Although
As shown in
As further shown in
As further shown in
As further shown in
As further shown in
Process 500 may include additional implementations, such as any single implementation or any combination of implementations described below and/or in connection with one or more other processes described elsewhere herein.
In a first implementation, process 500 includes determining a set of probability values associated with each class of the plurality of classes of the local classification model, and determining, based on the set of probability values associated with each class of the plurality of classes of the local classification model, the particular class.
In a second implementation, alone or in combination with the first implementation, process 500 includes determining a set of decision values associated with each class of the plurality of classes of the local classification model, and determining, based on the set of decision values associated with each class of the plurality of classes of the local classification model, the particular class.
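The first and second implementations of process 500 both reduce to selecting the class whose per-class score is highest, whether the scores are probability values or decision values. A minimal sketch, with an illustrative function name:

```python
def pick_class(scores):
    """Return the class with the highest score; `scores` maps each class
    of the local classification model to its probability value or its
    decision value for the spectroscopic measurement."""
    return max(scores, key=lambda c: scores[c])
```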
In a third implementation, alone or in combination with one or more of the first and second implementations, identifying the prediction threshold includes determining that a first amount of decision values of a first set of decision values associated with the particular class are less than a particular decision value threshold, determining that a second amount of decision values of a second set of decision values associated with the particular class are less than the particular decision value threshold, determining that the first amount and the second amount are each greater than or equal to an amount threshold, and determining, based on determining that the first amount and the second amount are each greater than or equal to the amount threshold, the prediction threshold based on a minimum decision value of the second set of decision values.
In a fourth implementation, alone or in combination with one or more of the first through third implementations, identifying the prediction threshold includes determining a first minimum decision value of a first set of decision values associated with the particular class, determining a second minimum decision value of a second set of decision values associated with the particular class, determining that at least one of the first minimum decision value or the second minimum decision value is greater than or equal to a particular decision value threshold, and determining, based on determining that at least one of the first minimum decision value or the second minimum decision value is greater than or equal to the particular decision value threshold, the prediction threshold based on the second minimum decision value.
In a fifth implementation, alone or in combination with one or more of the first through fourth implementations, identifying the prediction threshold includes determining a first minimum decision value of a first set of decision values associated with the particular class, determining a second minimum decision value of a second set of decision values associated with the particular class, determining that each of the first minimum decision value and the second minimum decision value is less than a particular decision value threshold, and determining, based on determining that each of the first minimum decision value and the second minimum decision value is less than the particular decision value threshold, the prediction threshold based on at least one of the first minimum decision value or the second minimum decision value.
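The third through fifth implementations of process 500 give three branching rules for the prediction threshold that use only the two in-class sets of decision values (no out-of-class set). The sketch below is a hypothetical reading: the function name is illustrative, the ordering of the checks is an assumption, and the final branch commits to the smaller of the two minima where the text permits "at least one of" them.

```python
def prediction_threshold_two_sets(self_pred, cross_val,
                                  value_threshold, amount_threshold):
    """Sketch of prediction-threshold selection from two in-class sets
    (hypothetical). `self_pred` is the first set and `cross_val` the
    second set of decision values associated with the particular class.
    """
    low_self = sum(1 for v in self_pred if v < value_threshold)
    low_cv = sum(1 for v in cross_val if v < value_threshold)
    if low_self >= amount_threshold and low_cv >= amount_threshold:
        # Third implementation: enough low values in both sets; use the
        # minimum of the second set.
        return min(cross_val)
    if min(self_pred) >= value_threshold or min(cross_val) >= value_threshold:
        # Fourth implementation: at least one minimum clears the value
        # threshold; use the second minimum decision value.
        return min(cross_val)
    # Fifth implementation: both minima fall below the value threshold;
    # fall back to the smaller of the two minima.
    return min(min(self_pred), min(cross_val))
```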
Although
As shown in
As further shown in
As further shown in
As further shown in
As further shown in
Process 600 may include additional implementations, such as any single implementation or any combination of implementations described below and/or in connection with one or more other processes described elsewhere herein.
In a first implementation, identifying the particular class includes determining a set of decision values associated with each class of the plurality of classes of the classification model, and determining, based on the set of decision values associated with each class of the plurality of classes of the classification model, the particular class.
In a second implementation, alone or in combination with the first implementation, identifying the particular class includes determining a set of probability values associated with each class of the plurality of classes of the classification model, and determining, based on the set of probability values associated with each class of the plurality of classes of the classification model, the particular class.
In a third implementation, alone or in combination with one or more of the first and second implementations, identifying the prediction threshold includes determining a first set of decision values associated with the particular class, determining a second set of decision values associated with the particular class, determining a third set of decision values associated with at least one other class of the plurality of classes, and determining the prediction threshold associated with the particular class based on at least two of the first set of decision values, the second set of decision values, or the third set of decision values.
Although
The foregoing disclosure provides illustration and description, but is not intended to be exhaustive or to limit the implementations to the precise forms disclosed. Modifications and variations may be made in light of the above disclosure or may be acquired from practice of the implementations.
As used herein, the term “component” is intended to be broadly construed as hardware, firmware, or a combination of hardware and software. It will be apparent that systems and/or methods described herein may be implemented in different forms of hardware, firmware, and/or a combination of hardware and software. The actual specialized control hardware or software code used to implement these systems and/or methods is not limiting of the implementations. Thus, the operation and behavior of the systems and/or methods are described herein without reference to specific software code—it being understood that software and hardware can be used to implement the systems and/or methods based on the description herein.
As used herein, satisfying a threshold may, depending on the context, refer to a value being greater than the threshold, greater than or equal to the threshold, less than the threshold, less than or equal to the threshold, equal to the threshold, not equal to the threshold, or the like.
Even though particular combinations of features are recited in the claims and/or disclosed in the specification, these combinations are not intended to limit the disclosure of various implementations. In fact, many of these features may be combined in ways not specifically recited in the claims and/or disclosed in the specification. Although each dependent claim listed below may directly depend on only one claim, the disclosure of various implementations includes each dependent claim in combination with every other claim in the claim set. As used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover a, b, c, a-b, a-c, b-c, and a-b-c, as well as any combination with multiple of the same item.
No element, act, or instruction used herein should be construed as critical or essential unless explicitly described as such. Also, as used herein, the articles “a” and “an” are intended to include one or more items, and may be used interchangeably with “one or more.” Further, as used herein, the article “the” is intended to include one or more items referenced in connection with the article “the” and may be used interchangeably with “the one or more.” Furthermore, as used herein, the term “set” is intended to include one or more items (e.g., related items, unrelated items, or a combination of related and unrelated items), and may be used interchangeably with “one or more.” Where only one item is intended, the phrase “only one” or similar language is used. Also, as used herein, the terms “has,” “have,” “having,” or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise. Also, as used herein, the term “or” is intended to be inclusive when used in a series and may be used interchangeably with “and/or,” unless explicitly stated otherwise (e.g., if used in combination with “either” or “only one of”).