The present disclosure relates to systems and methods to create and test food products using a variety of ingredients at the molecular level.
Existing techniques for creating new food products and associated recipes often require significant experimentation and considerable human tasting. Additionally, these existing techniques may require an experienced chef or other food product designer to create new combinations of ingredients that are likely to taste good to a human.
The techniques that require an experienced chef, significant experimentation, and many human tasting tests can be expensive and time-consuming. Further, those techniques can be limited to the chef's personal experience with different types of recipes and ingredients. The need exists for systems and methods that can create new food products and develop new recipes in a manner that accesses a wider universe of ingredients, is less expensive, and requires less trial-and-error to implement.
Non-limiting and non-exhaustive embodiments of the present disclosure are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various figures unless otherwise specified.
The perception of taste is a psychological experience based primarily on the structural and chemical molecular properties of various ingredients and their interactions with taste and smell receptors and with one another.
In some embodiments, the systems and methods discussed herein identify the objective properties of ingredients and molecules. The objective properties of various ingredients and molecules are translated into a subjective tasting experience that may include, for example, savor, smell, texture, and mouthfeel.
As discussed herein, the described systems and methods can evaluate actual human tasting results as well as the objective properties using a food processing unit (FPU) and other components or systems to generate reliable distributions of predicted user responses from a small number of actual human tastings. Thus, the systems and methods may provide alternate materials or ingredients that can be mixed and prepared to provide tasting experiences similar to traditional foods with minimal human tasting activities.
In some embodiments, the described systems and methods may identify new ingredients and preparation instructions for traditional foods that eliminate animal products, eliminate certain food allergens, replace expensive ingredients, replace ingredients that are in short supply, and the like.
In the following disclosure, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific implementations in which the disclosure may be practiced. It is understood that other implementations may be utilized and structural changes may be made without departing from the scope of the present disclosure. References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
Implementations of the systems, devices, and methods disclosed herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed herein. Implementations within the scope of the present disclosure may also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are computer storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, implementations of the disclosure can comprise at least two distinctly different kinds of computer-readable media: computer storage media (devices) and transmission media.
Computer storage media (devices) include RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
An implementation of the devices, systems, and methods disclosed herein may communicate over a computer network. A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a computer network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a computer network and/or data links, which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter is described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described herein. Rather, the described features and acts are disclosed as example forms of implementing the claims.
Those skilled in the art will appreciate that the disclosure may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, various storage devices, and the like. The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a communication network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
Further, where appropriate, functions described herein can be performed in one or more of: hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
It should be noted that the sensor embodiments discussed herein may comprise computer hardware, software, firmware, or any combination thereof to perform at least a portion of their functions. For example, a sensor may include computer code configured to be executed in one or more processors, and may include hardware logic/electrical circuitry controlled by the computer code. These example devices are provided herein for purposes of illustration, and are not intended to be limiting. Embodiments of the present disclosure may be implemented in further types of devices, as would be known to persons skilled in the relevant art(s).
At least some embodiments of the disclosure are directed to computer program products comprising such logic (e.g., in the form of software) stored on any computer useable medium. Such software, when executed in one or more data processing devices, causes a device to operate as described herein.
Various terms are used in this specification to describe systems, methods, ingredients, molecular structures, processing steps, data, and the like. For example, the following terms are briefly described for a particular embodiment. It should be understood that the following descriptions are presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that different descriptions may be provided for these terms without departing from the spirit and scope of the disclosure.
In some embodiments, vectors are represented as bold, roman, lower case. Scalars may be represented as italics. Matrices may be represented as bold, roman, upper case. T may represent transpose.
The systems and methods define the fundamental and indivisible constituent of a food product as a simple (mono-molecular) ingredient—a substance containing only one type of molecule. Simple ingredients may be mixed in some proportions forming composite ingredients. Ingredients may be further subjected to various types of transformations such as heating or cooling. From this perspective, any food product can be described as a sequence of mixing and transformation operations applied initially to the raw simple ingredients, and then to the intermediate products until the final product is obtained. We refer to such a sequence as “preparation instructions”, while the list of the initial ingredients and their quantities is referred to as the “formula” of the food product.
We henceforth refer to the set of characteristics of a food product pertaining to the flavor perception it generates as its “flavor profile”. A flavor profile may contain objective characteristics characterizing the sensory response (for example, the binding affinities of the different molecular constituents of the food product to a set of known taste receptor proteins, or mechanical properties such as elasticity as a function of temperature), as well as subjective characteristics (for example, a verbal description of the food product's taste and smell and its comparison to other reference products in the sense of some fixed flavor features such as sweetness, bitterness, sourness, texture, and mouthfeel).
In some embodiments, an ingredient is a natural or synthetic mixture of molecules in some concentrations (e.g., relative amounts). An ingredient can be simple (mono-molecular) or composite (comprising more than one molecule). Concentrations and the constituent molecules of an ingredient can be determined by chemical analytic methods such as liquid chromatography (LC) and mass spectrometry (MS).
A formula may include a list of ingredients with their quantities, which is different from a chemical formula. A mixture is the result of mixing various ingredients according to a formula. The chemical composition of a mixture may change based on chemical reactions between the constituent molecules.
A transformation is an operation or process applied to an ingredient, such as baking at 180 degrees Celsius for 5 minutes.
A preparation instruction is a directed graph starting from a formula and applying a sequence of mixtures and transformations resulting in a single output food (prepared according to the preparation instructions). In some embodiments, a food may also be an ingredient.
A subjective flavor profile may include a description of how the taste/smell of an ingredient is perceived by a human taster. It may also include one or more keywords that approximate the evoked perception, a vector of scores numerically grading different flavor features (sweetness, bitterness, and the like), or a comparison of the above features to another ingredient (e.g., A is sweeter than B, A is more bitter than C, A is as sour as D, and the like).
An objective flavor profile may include measurable physical and chemical characteristics such as pH, viscosity, hardness, and the like.
A flavor profile may be a combination of the subjective flavor profile and the objective flavor profile.
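The relationship among the subjective, objective, and combined flavor profiles described above can be sketched in code. This is an illustrative sketch only; the field names, types, and units below are assumptions and not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class SubjectiveFlavorProfile:
    keywords: list        # e.g., ["nutty", "caramel"]
    scores: dict          # flavor feature -> numeric grade, e.g. {"sweetness": 8.0}
    comparisons: list     # e.g., [("sweeter_than", "ingredient_B")]

@dataclass
class ObjectiveFlavorProfile:
    ph: float
    viscosity: float      # hypothetical units, e.g., mPa*s
    hardness: float

@dataclass
class FlavorProfile:
    # The combined flavor profile holds both components.
    subjective: SubjectiveFlavorProfile
    objective: ObjectiveFlavorProfile

profile = FlavorProfile(
    SubjectiveFlavorProfile(["sweet"], {"sweetness": 8.0}, []),
    ObjectiveFlavorProfile(ph=6.5, viscosity=1.2, hardness=0.1),
)
```

A data layout of this kind makes it straightforward to attach both measured and predicted characteristics to a single food product record.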
In some embodiments, the systems and methods described herein may receive an ingredient list and a reference food. Based on the ingredient list and reference food, the systems and methods generate preparation instructions for a particular food item using one or more ingredients that differ from those in the traditional preparation instructions.
In some implementations, FPU 102 may contain or access a digitization of one or more food features using various combinations of subjective food tastings, mixture prediction, and molecule taste prediction. FPU 102 then generates new preparation instructions for a food product similar to the food to be created. A profile of the food to be created may be generated from the subjective food tastings, analytical data (e.g., liquid chromatography mass spectrometry), and other information discussed herein.
As shown in
In some embodiments, molecular embedder 104 is implemented as a learned model that conceptually follows an auto-encoder architecture. The input to the encoder model is a molecular profile that includes the molecular structure and its chemical and physical properties, which is collectively denoted by the vector m. The output of the encoder model is a latent vector z=E(m). A decoder D is a learned model that receives a latent vector z representing the mono-molecular tastant substance and predicts a property of that substance.
In some embodiments, multiple decoding heads are used, such as:
1. Dauto ≈ E−1, a model predicting the molecular profile vector m itself. Enforcing Dauto∘E ≈ Id ensures that the latent representation retains complete information about the input molecule.
2. Dsens, a model predicting the sensory response of certain gustatory and olfactory receptor cells.
For simplicity of explanation, when describing the systems and methods herein, the explanation may refer to the encoder model as a deterministic one. A specific embodiment may instead represent, in some parametric form, the distribution of E(m) in the latent space.
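The encoder and decoder-head arrangement described above can be sketched as follows. The identity-like and linear maps below are placeholder assumptions standing in for learned neural networks, and the toy dimensions are illustrative.

```python
# Toy sketch of the auto-encoder with multiple decoder heads.
# All maps here are placeholders for learned models.

def encode(m):
    """E: molecular profile vector m -> latent vector z (placeholder)."""
    return list(m)

def decode_auto(z):
    """D_auto ~ E^-1: reconstruct the molecular profile from z."""
    return list(z)

def decode_sens(z):
    """D_sens: toy receptor-response prediction (weighted sum of latents)."""
    return sum(0.5 * v for v in z)

m = [0.2, 1.0, -0.3]          # hypothetical molecular profile vector
z = encode(m)
assert decode_auto(z) == m    # D_auto o E = Id holds trivially in this sketch
response = decode_sens(z)
```

In a trained embodiment, the shared encoder would be fit jointly with all decoder heads so that the latent vector supports each prediction task.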
As illustrated in
In some implementations, mixture modeler 106 is built to approximately satisfy homogeneity and additivity under the mixture, such as:
M(z1, z2, α, 1−α)=αM(z1)+(1−α)M(z2)
In some embodiments, for purposes of convenience, the coordinate system is defined such that water is represented as zero.
In particular implementations, using mixture modeler 106 and asserting one of the mixands to be a solvent (e.g., water), the systems and methods can define another decoder head operating on the mixture representation space:
Dsubj, a model predicting the subjective flavor profile. For example, in the case of a molecule m at concentration α in water, Dsubj(αM∘E(m))=f predicts the perceived flavor characteristics, such as flavor categories, flavor feature scores, and relations to reference flavors, which are collectively denoted by the (pseudo-) vector f.
In some embodiments, the described systems and methods may assume that the same space suits both mono-molecular and mixture embeddings. In these implementations, the systems and methods use z and M(z) interchangeably (e.g., referring to both as z), such that the systems and methods may assume M∘E in place of E.
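Under the homogeneity property and the water-as-origin convention above, diluting a molecule in water reduces to scaling its latent vector. A minimal sketch, assuming linear mixing in the latent space:

```python
# Sketch: linear mixing in the latent space with water at the origin.

def mix(z1, z2, alpha):
    """M(z1, z2, alpha) = alpha*z1 + (1 - alpha)*z2."""
    return [alpha * a + (1 - alpha) * b for a, b in zip(z1, z2)]

WATER = [0.0, 0.0, 0.0]       # coordinate system chosen so water is zero

z_m = [1.0, -2.0, 0.5]        # latent vector of a hypothetical molecule m
alpha = 0.3                   # concentration of m in water
diluted = mix(z_m, WATER, alpha)
assert diluted == [alpha * v for v in z_m]   # dilution is pure scaling
```

Choosing water as the zero vector is what makes the dilution case collapse to a single scalar multiplication.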
In some embodiments, FPU 102 further includes a preparation process modeler 108 capable of representing the effect of cooking and preparation processes on the latent representation. In some situations, a preparation process model may also be referred to as a precision graph or cooking graph.
In particular implementations, preparation process modeler 108 models a single step of the preparation process as a transformation of the latent space T(w)=w′. Using these terms, preparation instructions can be thought of as the composition of binary mixture and unary preparation operations. For example:
T2(M(T1(M(z1, z2, α)), z3, α′)) = T2(α′T1(αz1 + (1−α)z2) + (1−α′)z3).
In some embodiments, such a sequence can be represented as a tree with basic mono-molecular ingredients as the leaf nodes and the final food product at the root. The ingredients themselves, Z=(z1, z2, . . . , zn) and their relative quantities α=(α1, α2, . . . , αn) can be referred to as the formula of the food, which is different from the chemical formula. In some implementations, preparation instructions may be represented using a shorthand notation T(Z, α).
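The tree-structured preparation instructions T(Z, α) described above can be evaluated recursively from the leaves to the root. The dictionary-based tree encoding and the toy transformation below are illustrative assumptions, not the disclosed implementation.

```python
# Sketch: recursive evaluation of preparation instructions represented as a
# tree. Leaves hold latent vectors of base ingredients; internal nodes apply
# binary mixing M or a unary transformation T.

def evaluate(node):
    kind = node["op"]
    if kind == "ingredient":
        return node["z"]
    if kind == "mix":                       # M(z1, z2, alpha)
        z1, z2 = evaluate(node["a"]), evaluate(node["b"])
        alpha = node["alpha"]
        return [alpha * x + (1 - alpha) * y for x, y in zip(z1, z2)]
    if kind == "transform":                 # T(w)
        return node["T"](evaluate(node["arg"]))
    raise ValueError(f"unknown operation: {kind}")

def heat(w):
    """Hypothetical transformation T; a real model would be learned."""
    return [0.9 * v for v in w]

recipe = {"op": "transform", "T": heat, "arg":
          {"op": "mix", "alpha": 0.5,
           "a": {"op": "ingredient", "z": [1.0, 0.0]},
           "b": {"op": "ingredient", "z": [0.0, 1.0]}}}

result = evaluate(recipe)
assert all(abs(v - 0.45) < 1e-12 for v in result)
```

This mirrors the composition T2∘M∘T1∘M sketched above: each internal node consumes the latent representations of its children.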
As shown in
Virtual tasting system 110 may support food testing and feedback collection for new food products using a smaller group of human testers. Instead of testing food products with a large number of random people, virtual tasting system 110 can provide valuable feedback using a smaller number of carefully selected human testers. For example, the human testers for a new food product may be selected based on the testers' food preferences, previous tasting event results, and the like.
In some embodiments, a tasting event produces various results that may include data related to taster preferences for one or more food products or compounds. Based on these results, each taster's profile may be updated based on their tasting preferences, and each food product's profile may be updated based on the tasting results from multiple tasters.
In some embodiments, virtual tasting system 110 may implement graph learning methods by, for example, predicting a taster's response to a substance. Based on sparse data collected from multiple tasters related to multiple substances, a deep neural network may be trained that recreates the geometry of the taster space (e.g., the intra-taster relations) and the geometry of the substance space (e.g., the intra-substance relations). Additionally, the deep neural network may be trained to recreate the interrelation between the tasters' graph and the substances' graph. In some embodiments, virtual tasting system 110 also supports the generation of new tasters, based on the required demographic and other background questionnaires, and prediction of the new tasters' response to a variety of substances.
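One way to realize such a predictor is to give each taster and each substance an embedding vector and score their interaction. The hand-fixed vectors and dot-product scoring below are simplifying assumptions standing in for the trained deep neural network; in practice the embeddings would be learned from sparse tasting data.

```python
# Sketch: predicting a taster's response to a substance from taster and
# substance embeddings (fixed by hand here; learned in practice).

taster_vec = {"taster1": [1.0, 0.0], "taster2": [0.5, 0.5]}
substance_vec = {"subst_A": [0.8, 0.2], "subst_B": [0.1, 0.9]}

def predict_response(taster, substance):
    """Dot product of the two embeddings stands in for the trained model."""
    return sum(t * s for t, s in
               zip(taster_vec[taster], substance_vec[substance]))

response = predict_response("taster1", "subst_A")   # 1.0*0.8 + 0.0*0.2
```

A new taster could be handled by predicting an embedding from questionnaire answers and then reusing the same scoring function.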
In some embodiments, FPU 102 further includes a food model trainer 112 capable of training food models using a multi-task learning approach. In some implementations, individual models (e.g., molecular embedding models, mixture models, and preparation process models) can be pre-trained using individual sets of tasks followed by joint fine-tuning. Example learning tasks may include the following:
1. Homogeneity: asserting that, given a mixture of ingredients z1, z2 in concentrations α and 1−α, with measured flavor profile f of the mixture:
Dsubj(αz1 + (1−α)z2) = f
2. Transformed flavor profile: given pairs of flavor profiles (f, f′) of ingredients before and after a certain preparation process (e.g., heating to 180 degrees Celsius for 15 minutes), the transformation model can be trained by minimizing the discrepancy of the predicted taste profiles, Dsubj∘T(f) and Dsubj(f′).
3. Transformed chemistry: given pairs of chemical compositions ((Z, α), (Z′, α′)) of ingredients before and after a certain preparation process, the transformation model can be trained by minimizing the discrepancy of the predicted molecular profiles, T(α1z1+ . . . +αnzn) and α′1z′1+ . . . +α′n′z′n′.
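The homogeneity task (task 1 above) can be sketched as a training loss that penalizes the discrepancy between the predicted flavor of a latent mixture and the measured flavor f. The linear decoder head is a placeholder assumption for a learned model.

```python
# Sketch of the homogeneity learning task.

def d_subj(z):
    """Placeholder subjective-flavor decoder head (linear toy model)."""
    return [2.0 * v for v in z]

def homogeneity_loss(z1, z2, alpha, f_measured):
    # Mix in the latent space, decode, and compare to the measurement.
    z_mix = [alpha * a + (1 - alpha) * b for a, b in zip(z1, z2)]
    f_pred = d_subj(z_mix)
    return sum((p - q) ** 2 for p, q in zip(f_pred, f_measured))

# Perfectly consistent data yields zero loss:
z1, z2 = [1.0, 0.0], [0.0, 1.0]
f = d_subj([0.5, 0.5])
assert homogeneity_loss(z1, z2, 0.5, f) == 0.0
```

The transformed-flavor and transformed-chemistry tasks would add analogous loss terms, and the pre-trained components could then be fine-tuned jointly on the sum of the losses.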
As shown in
1. Number of ingredients
2. Similarity to a target flavor profile Dsubj∘T(E(M), α)=ftarget, where ftarget denotes the target flavor profile
3. Nutritional values of the molecular ingredients
4. Product cost, including the sum of the cost of each mi weighted by αi and, in some situations, the cost of all preparation stages comprising T.
The solution of the inverse problem can be carried out using regular backpropagation techniques.
In the case where the encoder E is stochastic, rather than getting a single solution, the systems and methods produce a posterior distribution from which multiple solution candidates can be sampled.
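The inverse problem described above can be sketched as gradient descent on the mixture proportions α under a toy forward model. The linear forward model, learning rate, iteration count, and simplex projection below are all illustrative assumptions, not the disclosed method.

```python
# Sketch: inverse design by gradient descent on mixture proportions alpha.

Z = [[1.0, 0.0], [0.0, 1.0]]   # latent vectors of two candidate ingredients

def forward(alpha):
    """Predicted flavor of the mixture sum_i alpha_i * z_i (toy model)."""
    return [sum(a * z[k] for a, z in zip(alpha, Z)) for k in range(2)]

target = [0.3, 0.7]            # f_target, the target flavor profile
alpha = [0.5, 0.5]
lr = 0.2
for _ in range(200):
    pred = forward(alpha)
    # Gradient of the squared error with respect to each alpha_i.
    grad = [sum(2 * (pred[k] - target[k]) * Z[i][k] for k in range(2))
            for i in range(2)]
    alpha = [a - lr * g for a, g in zip(alpha, grad)]
    alpha = [max(a, 0.0) for a in alpha]   # project back onto the
    s = sum(alpha)                         # probability simplex
    alpha = [a / s for a in alpha]
```

With a stochastic encoder, the same loop would instead be run from multiple sampled starting points to obtain several candidate solutions.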
In some embodiments, FPU 102 further includes a preparation instruction manager 116 capable of storing and managing various preparation instructions. For example, preparation instruction manager 116 may track various ingredients, mixture ratios, and processing steps for different preparation instructions. Additionally, preparation instruction manager 116 may record tasting results (both subjective and objective) for various preparation instructions so the data can be used for creating different preparation instructions in the future. Preparation instruction manager 116 may also monitor and record visual, mechanical, and chemical properties of the prepared food.
In some embodiments, environment 100 further includes subjective flavor measurement data 118, objective flavor measurement data 120, ingredient data 122, and preparation instruction data 124. Subjective flavor measurement data 118 may include subjective results associated with an ingredient or preparation instruction by a human user. For example, subjective flavor measurement data 118 may include human user opinions regarding taste, texture, odor, and the like for a particular preparation instruction or ingredient.
In some embodiments, objective flavor measurement data 120 includes objective results associated with an ingredient or preparation instruction. For example, objective flavor measurement data 120 may include objective flavor profile data that is created or predicted using the systems and methods described herein. The objective flavor profile data may include predicted data regarding taste, texture, odor, and the like for a particular preparation instruction or ingredient.
Ingredient data 122 may include information associated with particular ingredients, such as an ingredient flavor profile, taste testing results associated with the ingredient, preparation instructions that include the ingredients, and the like. Preparation instruction data 124 may include information associated with various preparation instructions. In some embodiments, preparation instruction data 124 includes preparation instruction ingredients, preparation instruction mixing instructions, preparation instruction process, preparation instruction flavor profiles, preparation instruction taste testing results, and the like.
In some embodiments, various ingredient data and preparation instruction data may be accessed or received from public databases combined with a measured outcome (e.g., objective or subjective features). In some implementations, the systems and methods described herein may collect pairwise comparisons or absolute taste grades with respect to different features, flavor keywords, and the like. In the case of absolute taste grades, the systems and methods may add heads that predict those characteristics.
It will be appreciated that the embodiment of
Process 200 continues by identifying 204 subjective flavor measurements associated with the target food. For example, the subjective flavor measurements may include taste, texture, smell, and the like. In particular implementations, the subjective flavor measurements are based on responses from human users who tasted the target food.
Process 200 then identifies 206 objective flavor measurements associated with the target food. For example, the objective flavor measurements may include physical and chemical information that may be used to predict taste, texture, smell, and the like. In some embodiments, the objective flavor measurements may be obtained as predictions from virtual tasting system 110 and other components of FPU 102.
The process continues by determining 208 a target flavor profile based on the subjective flavor measurements and the objective flavor measurements. This target flavor profile is used to create new preparation instructions with the same, or similar, flavor profiles as the existing food product. Process 200 then proposes 210 one or more candidate preparation instructions with predicted candidate flavor profiles based on the target flavor profile. In some embodiments, the candidate preparation instructions are expected to have predicted candidate profiles that are close to the target flavor profile.
The process continues by preparing 212 the one or more candidate preparation instructions and measuring the actual flavor profiles of the candidate preparation instructions. The process then compares the actual flavor profiles of the candidate preparation instructions to the target flavor profile. Process 200 continues by determining 214 whether the actual flavor profiles of the candidate preparation instructions are close to the target flavor profile. If the actual flavor profiles of the candidate preparation instructions are close to the target flavor profile, the process ends at 218. In some embodiments, the candidate preparation instructions that are close to the target flavor profile may be tested by one or more human users to determine whether the flavor of the food product created with one or more candidate preparation instructions is a viable replacement for the target food.
If the actual flavor profiles of the candidate preparation instructions are not close to the target flavor profile, process 200 updates 216 the candidate flavor profile based on the measured actual flavor profiles. The process then returns to 212, where the updated candidate preparation instructions are prepared and their actual flavor profiles are measured. The process further determines whether the actual flavor profiles of the updated candidate preparation instructions are close to the target flavor profile. This process of updating candidate preparation instructions and determining updated actual flavor profiles is repeated until the flavor profile of one or more candidate preparation instructions is close to the target flavor profile.
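The propose-prepare-measure loop of process 200 can be sketched as follows. The closeness tolerance, the stand-in measurement function, and the correction step are illustrative assumptions; in an actual embodiment, preparation and measurement would involve physical tasting and analytic instruments.

```python
# Sketch of the iterative candidate-refinement loop in process 200.

def close_enough(actual, target, tol=0.1):
    return max(abs(a - t) for a, t in zip(actual, target)) <= tol

def measure(candidate):
    """Stand-in for preparing a candidate and measuring its actual profile."""
    return [0.9 * c for c in candidate]

target = [1.0, 2.0]            # target flavor profile
candidate = [1.0, 2.0]         # initial candidate preparation parameters
for _ in range(50):
    actual = measure(candidate)
    if close_enough(actual, target):
        break                  # actual profile is close to the target; stop
    # Update the candidate toward the target (toy correction step).
    candidate = [c + (t - a) for c, a, t in zip(candidate, actual, target)]
```

The loop terminates when the measured profile of a candidate falls within the tolerance of the target, mirroring the decision at step 214.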
In process flow 300, each molecular embedder 308, 310, 312 generates a representation 314, 316, 318, respectively. Representations 314-318 of each molecule are vectors created via a (trainable) non-linear map of the input data. Each representation 314-318 contains enough dimensions such that the corresponding decoder heads can extract the required information with sufficient precision.
In some embodiments, the representations 314-318 are provided to a preparation process modeler 320. Preparation process modeler 320 may be similar to preparation process modeler 108 shown in
Preparation process modeler 320 receives the representations of the input ingredients and generates a representation 324 of the prepared ingredient.
In some embodiments, representation 324 is provided to a predictor 326. Predictor 326 represents decoder heads that extract different objective and subjective characteristics from the representation vector regarding the food product being represented. In some embodiments, predictor 326 generates any number of predicted characteristics 328 related to the food product associated with representation 324. For example, predicted characteristics 328 may include a flavor profile associated with the food product identified in representation 324.
In some embodiments, system 410 receives a list of ingredients 404-408 and instructions about their preparation 412 (e.g., preparation instructions), then predicts a set of characteristics 414 of the final food product. Optimizer 416 may decide how to modify the candidate preparation instructions 412 to better match the objective or constraints. In some implementations, system 410 is the forward model that is inverted in the inverse modeler.
As shown in
In the forward mode, given preparation instructions, the systems and methods described herein predict the characteristics of the preparation instructions. In the inverse mode, given particular target characteristics, the systems and methods find preparation instructions that satisfy the target characteristics.
As shown in the example of
In some embodiments, composite modeler 504 may receive a mixture definition that includes a list of base ingredients and their relative quantities. Composite modeler 504 may output a representation of a particular mixture. As discussed herein, vector generator 506 may generate a vector having multiple dimensions that represent features associated with a mixture. The features may include, for example, a taste, a smell, a texture, or a nutritional value associated with the mixture.
In particular implementations, pairwise comparator 508 may receive a pair of feature lists (e.g., from two different mixtures) and produce a list of pairwise comparisons based on the pair of feature lists. The pairwise comparator 508 may also determine whether one of the mixtures has a stronger presence of a feature than the other mixture. In some embodiments, projection matrix generator 510 may support handling of cases in which not all measurements are given or when the measurements are performed in a different basis. As discussed herein, ingredient optimizer 512 may optimize any number of ingredients in a mixture to achieve the desired results, such as desired taste, desired smell, desired texture, desired nutritional value, and the like.
In some embodiments, the purpose of mixture modeling (MM) is to produce a representation of composite tastants comprising multiple molecules. For example, define a universe of n base ingredients and refer to them by their index, i=1, . . . , n. Base ingredients can be mixed together in arbitrary proportions to produce new composite ingredients. A mixture will be represented by an n-dimensional vector α=(α1, . . . , αn) on the probability simplex (i.e., having non-negative entries summing to 1), representing the relative quantity of each base ingredient in the mixture. Using this notation, the base ingredients can be represented by the standard base vectors, e1, . . . , en (where each ek has 1 in coordinate k and zeros elsewhere).
In some embodiments, more complicated preparation processes can be applied to the base ingredients. In that case, the process can be represented as a directed tree-structured graph with the base ingredients on its leaf nodes. A subset of nodes can be mixed in proportions specified on the graph edges, producing a new node representing the composite tastant. A node representing a tastant can also undergo processing like heating or cooling, producing a new node representing the product tastant. In the latter case, the type of processing and its parameters are encoded as edge attributes connecting the two nodes. The root of the tree represents the final product of the preparation process.
Some embodiments further define a set of m features measured objectively (e.g., quantitative sensory response to an ingredient) or subjectively (e.g., the sweetness or sourness of an ingredient). Given a pair of ingredients represented by mixture coefficients α and α′, a pairwise comparison can determine how the first ingredient is compared to the second ingredient in terms of each of the features. These measurements may be represented as an m-dimensional vector y∈{−1, 0, 1}m, where +1 in coordinate k means that the first ingredient is “bigger” than the second ingredient in the sense of feature k (e.g., if feature k represents sweetness, then +1 implies that the first ingredient is sweeter); similarly, −1 implies the reverse, and 0 means that the two ingredients are about the same with respect to feature k.
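Computing the comparison vector y from two m-dimensional feature vectors can be sketched as follows; the tie tolerance is an assumption, and the feature labels are illustrative.

```python
# Sketch: computing the pairwise comparison vector y in {-1, 0, +1}^m from
# two m-dimensional feature vectors.

def compare(x, x_prime, tol=1e-6):
    y = []
    for a, b in zip(x, x_prime):
        if abs(a - b) <= tol:
            y.append(0)        # the two ingredients are about the same
        elif a > b:
            y.append(1)        # first ingredient is "bigger" (e.g., sweeter)
        else:
            y.append(-1)
    return y

# With feature 0 = sweetness, 1 = sourness, 2 = bitterness (illustrative):
assert compare([0.9, 0.4, 0.1], [0.2, 0.4, 0.7]) == [1, 0, -1]
```

In practice, y would typically come from human pairwise tasting judgments rather than being computed from known feature values; the sketch shows only the encoding convention.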
Sometimes, not all features may be measured. To model such partial measurement situations, some embodiments may use a projection matrix P defining a subspace of features where the measurements are available (for all measurements, P is set to the identity matrix).
In some embodiments, combinations of features may be measured instead of “pure” values of each individual feature. To model such superposition measurements, some embodiments may use a projection matrix P defining the measurement operator.
In some embodiments, a non-linear transformation of features may be measured instead of features or their combinations. The projection P in such cases should be interpreted as a general known non-linear map.
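The two linear cases above (partial and superposition measurements) can be illustrated with concrete projection matrices; the feature count and values are hypothetical:

```python
import numpy as np

m = 4  # hypothetical number of features

# Partial measurement: only features 0 and 2 are observed,
# so P selects the corresponding rows of the identity matrix.
P_partial = np.eye(m)[[0, 2]]

# Superposition measurement: each row of P mixes several features.
P_super = np.array([[0.5, 0.5, 0.0,  0.0],
                    [0.0, 0.0, 1.0, -1.0]])

y_full = np.array([1.0, -1.0, 0.0, 1.0])
assert np.allclose(P_partial @ y_full, [1.0, 0.0])   # observed subset of features
assert np.allclose(P_super @ y_full, [0.0, -1.0])    # measured combinations
```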
In some embodiments, the goal is to create a representation of the base ingredients in an m-dimensional embedding space, such that each ingredient is modeled by a vector x. In some implementations, the following properties may be satisfied by a good representation:
1. Order relation: Let a pair of ingredients represented by x and x′, respectively, be compared, producing a pairwise comparison vector y. Then, for each k = 1, . . . , m, y_k(x_k − x′_k) > 0 if y_k = ±1, and x_k ≈ x′_k if y_k = 0.
2. Homogeneity: A mixture of ingredients represented by x and x′ should be represented by αx+(1−α)x′, with α and 1−α being the mixing proportions of the first and the second ingredients, respectively.
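The homogeneity property can be checked numerically; the embedding vectors and mixing proportion below are hypothetical:

```python
import numpy as np

# Hypothetical embeddings of two ingredients in an m = 3 feature space.
x, x_prime = np.array([1.0, 0.0, 2.0]), np.array([0.0, 2.0, 1.0])

a = 0.25  # mixing proportion of the first ingredient
mixture_embedding = a * x + (1 - a) * x_prime  # homogeneity: a*x + (1 - a)*x'
assert np.allclose(mixture_embedding, [0.25, 1.5, 1.25])
```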
The following discussion describes the construction of the embedding from training data. As the input, the described systems and methods receive a collection of tuples of the form:
{(α_i, α′_i, P_i y_i)} for i = 1, . . . , N.
The learnable degrees of freedom are the embeddings of the base ingredients. In order to account for uncertainty in the latter, some embodiments represent each base ingredient k by a parametric m-variate probability distribution p_{θ_k}, for example a multivariate normal distribution with parameters θ_k = (μ_k, Σ_k).
The joint distribution of the base ingredients is assumed independent and is denoted collectively by the matrix of multivariate distributions

p_{M,Σ}(X) = (p_{μ_1,Σ_1}(x_1), . . . , p_{μ_n,Σ_n}(x_n)),

where all the learned parameters are captured by the means matrix M = (μ_1, . . . , μ_n) and the covariance tensor Σ = (Σ_1, . . . , Σ_n). In some embodiments, the number of degrees of freedom can be reduced by asserting structure on the covariance Σ, such as each Σ_k being diagonal or low rank.
In some embodiments, due to the ellipticity assumption, for any deterministic vector α, the mixture of the base ingredients is given by the distribution

p_{Mα, Σ_α},

where Σ_α = α_1²Σ_1 + . . . + α_n²Σ_n.
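Under a normal (elliptical) model, the mixture parameters Mα and Σ_α = Σ_k α_k²Σ_k can be computed directly; the means, covariances, and proportions below are hypothetical:

```python
import numpy as np

# Hypothetical model: n = 3 base ingredients embedded in m = 2 feature dimensions.
M = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 3.0]])                     # columns are the means mu_k
Sigmas = [np.eye(2) * s for s in (0.1, 0.2, 0.4)]   # per-ingredient covariances

alpha = np.array([0.5, 0.3, 0.2])                   # mixture on the simplex

mean = M @ alpha                                    # mixture mean: M @ alpha
cov = sum(a**2 * S for a, S in zip(alpha, Sigmas))  # Sigma_alpha = sum a_k^2 Sigma_k
```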
For each measurement (α, α′, Py), the two compared ingredients can be modeled as two independent random vectors X ∼ p_{Mα, Σ_α} and X′ ∼ p_{Mα′, Σ_α′}. The parameters M and Σ are then fit so that the difference ΔX = X − X′ is maximally consistent with the observed comparisons y.
In some embodiments, this task can be carried out by defining a pointwise negative log likelihood loss, for example of the form

ℓ_i(M, Σ) = −log p(P_i y_i | α_i, α′_i; M, Σ),

and solving the following (maximum likelihood) optimization problem:

min_{M, Σ} ∑_{i=1, . . . , N} ℓ_i(M, Σ).
In some embodiments, a Bayesian formulation is used, in which the posterior expectation of some loss function ρ(ΔX, y) is minimized. For example, in some embodiments, the negative correlation between ΔX and y in the subspace spanned by P, ρ(ΔX, y) = −sign(ΔX)^T P^T P y, may be minimized.
In the latter case, a closed-form expression exists for the expectation of the sign of a normal variable, which is derived below for completeness. Let Z ∼ N(μ, σ²). Then,

Pr(Z > 0) = Φ(μ/σ),

where Φ denotes the cumulative distribution function of the standard normal distribution. Hence,

E[sign(Z)] = Pr(Z > 0) − Pr(Z < 0) = 2Φ(μ/σ) − 1.
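The identity E[sign(Z)] = 2Φ(μ/σ) − 1 for Z ∼ N(μ, σ²) can be verified numerically with a Monte Carlo draw; the parameters and sample size below are arbitrary:

```python
import math
import numpy as np

def e_sign_normal(mu, sigma):
    """Closed form E[sign(Z)] = 2*Phi(mu/sigma) - 1 for Z ~ N(mu, sigma^2)."""
    Phi = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
    return 2.0 * Phi(mu / sigma) - 1.0

# Monte Carlo check: the empirical mean of sign(Z) matches the closed form.
rng = np.random.default_rng(0)
z = rng.normal(1.0, 2.0, size=200_000)
assert abs(np.sign(z).mean() - e_sign_normal(1.0, 2.0)) < 0.01
```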
Consequently, for measurement i, one embodiment can write the following pointwise loss:

ℓ_i(M, Σ) = −(2Φ(μ_i/σ_i) − 1)^T P_i^T P_i y_i,

where μ_i = M(α_i − α′_i), σ_i denotes the vector of coordinate-wise standard deviations of ΔX, and the function application is element-wise. As noted above, Bayesian loss minimization amounts to solving the following problem:

min_{M, Σ} ∑_{i=1, . . . , N} ℓ_i(M, Σ),

with the pointwise loss ℓ_i defined above.
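Assuming normal embeddings with diagonal covariances, a pointwise loss of the general form −E[sign(ΔX)]^T P^T P y can be sketched as follows; the function names and numeric values are hypothetical:

```python
import numpy as np
from math import erf, sqrt

def phi_cdf(t):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

def pointwise_loss(M, Sigmas, alpha, alpha_p, P, y):
    """Negative expected correlation between sign(dX) and y, where dX = X - X'
    and E[sign] is evaluated coordinate-wise via 2*Phi(mu/sigma) - 1."""
    mu = M @ (alpha - alpha_p)                  # mean of dX
    var = (sum(a**2 * np.diag(S) for a, S in zip(alpha, Sigmas))
           + sum(a**2 * np.diag(S) for a, S in zip(alpha_p, Sigmas)))
    e_sign = 2.0 * np.vectorize(phi_cdf)(mu / np.sqrt(var)) - 1.0
    return -float(e_sign @ P.T @ P @ y)

# Two base ingredients, two features; compare pure ingredient 0 against pure 1.
M = np.diag([2.0, 2.0])
Sigmas = [np.eye(2), np.eye(2)]
loss = pointwise_loss(M, Sigmas, np.array([1.0, 0.0]), np.array([0.0, 1.0]),
                      np.eye(2), np.array([1.0, -1.0]))
```

Here the comparison y agrees with the model's expected ordering, so the loss comes out negative (good agreement).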
The comparison result of the loss function 1002 (e.g., comparing the predicted feature f̂_i(α_i) with the target result) is provided to an optimizer 1004. Optimizer 1004 updates the candidate mixture definition based on the comparison result of the loss function 1002. The updated candidate mixture definition is then communicated to candidate mixture manager 1006, and the process is repeated iteratively.
The process flow 1000 optimizes the value of α while the parameters (θ) remain fixed. In some embodiments, optimizer 1004 tries different candidate mixture definitions until the predicted result matches (or is substantially close to) the target result based on taste, smell, texture, nutritional information, and the like.
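A minimal sketch of such an iterative loop, using projected gradient descent on the simplex with a hypothetical linear feature predictor f(α) = Mα and a squared loss standing in for loss function 1002:

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex (sort-based)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / (np.arange(len(v)) + 1.0) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

def optimize_mixture(M, target, steps=500, lr=0.1):
    """Iteratively predict features f(alpha) = M @ alpha, compare them with the
    target (the loss), and update the candidate mixture (the optimizer)."""
    alpha = np.full(M.shape[1], 1.0 / M.shape[1])   # uniform initial candidate
    for _ in range(steps):
        grad = 2.0 * M.T @ (M @ alpha - target)     # gradient of the squared loss
        alpha = project_simplex(alpha - lr * grad)  # keep alpha on the simplex
    return alpha
```

With M equal to the identity and a feasible target, the loop recovers the target mixture.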
After the embedding has been learned, the representation problem consists of finding the mixture coefficients of the base ingredients that optimally describe another given ingredient. In some embodiments, the systems and methods are given another set of measurements comprising a set of pairwise comparisons P_i y_i of the target ingredient against mixtures of the base ingredients, each mixture represented by α_i (comparisons are also possible with the new ingredient appearing in mixtures with base ingredients; however, for clarity of presentation this discussion stays with the simpler formulation). Some embodiments aim at finding a mixture of base ingredients β such that p_{Mβ, Σ_β} is maximally consistent with the set of measurements. The representation problem can again be considered as the minimization of one of the losses detailed above, this time with respect to β while keeping M and Σ fixed.
min_{β ∈ P_n} ∑_{i=1, . . . , N} ℓ_i(β),

where P_n = {β : β ≥ 0, β^T 1 = 1} is the probability simplex, and ℓ_i is a pointwise loss.
In some embodiments, following the maximum likelihood formulation, a pointwise negative log likelihood loss of the form

ℓ_i(β) = −log p(P_i y_i | β, α_i; M, Σ)

is used.
In some other embodiments, following the Bayesian formulation, a pointwise loss of the form

ℓ_i(β) = −(2Φ(μ_i/σ_i) − 1)^T P_i^T P_i y_i, with μ_i = M(β − α_i),

is used.
Another version of the representation problem consists of approximating a base ingredient with a fixed subset of other base ingredients (e.g., replacing an animal ingredient with vegan ingredients). In some embodiments, this subset is denoted by restricting the mixtures to the subspace α = Qβ, where Q is a projection matrix. In this case, the distribution model p_{μ_k, Σ_k} of the base ingredient k being approximated is matched to the mixture distribution p_{MQβ, Σ_{Qβ}} by solving

min_β D(p_{μ_k, Σ_k}, p_{MQβ, Σ_{Qβ}}),

possibly with additional constraints on β such as sparsity. In some embodiments, the distance D is chosen to be the Kullback-Leibler divergence or the Wasserstein distance.
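By way of a non-limiting sketch, the closed-form Kullback-Leibler divergence between two multivariate normal models (one possible choice of the distance D) can be computed as follows; the means and covariances are hypothetical:

```python
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """Closed-form KL( N(mu0, S0) || N(mu1, S1) ) between multivariate normals."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    d = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + d @ S1_inv @ d - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Identical distributions are at zero divergence.
mu = np.array([0.5, 1.0])
S = np.diag([0.2, 0.3])
assert np.isclose(kl_gaussian(mu, S, mu, S), 0.0)
```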
Computing device 1200 may be used to perform various procedures, such as those discussed herein. Computing device 1200 can function as a server, a client, or any other computing entity. Computing device 1200 can perform various functions as discussed herein, and can execute one or more application programs, such as the application programs described herein. Computing device 1200 can be any of a wide variety of computing devices, such as a desktop computer, a notebook computer, a server computer, a handheld computer, a tablet computer, and the like.
Computing device 1200 includes one or more processor(s) 1202, one or more memory device(s) 1204, one or more interface(s) 1206, one or more mass storage device(s) 1208, one or more Input/Output (I/O) device(s) 1210, and a display device 1230 all of which are coupled to a bus 1212. Processor(s) 1202 include one or more processors or controllers that execute instructions stored in memory device(s) 1204 and/or mass storage device(s) 1208. Processor(s) 1202 may also include various types of computer-readable media, such as cache memory.
Memory device(s) 1204 include various computer-readable media, such as volatile memory (e.g., random access memory (RAM) 1214) and/or nonvolatile memory (e.g., read-only memory (ROM) 1216). Memory device(s) 1204 may also include rewritable ROM, such as Flash memory.
Mass storage device(s) 1208 include various computer readable media, such as magnetic tapes, magnetic disks, optical disks, solid-state memory (e.g., Flash memory), and so forth. As shown in
I/O device(s) 1210 include various devices that allow data and/or other information to be input to or retrieved from computing device 1200. Example I/O device(s) 1210 include cursor control devices, keyboards, keypads, microphones, monitors or other display devices, speakers, printers, network interface cards, modems, lenses, CCDs or other image capture devices, and the like.
Display device 1230 includes any type of device capable of displaying information to one or more users of computing device 1200. Examples of display device 1230 include a monitor, display terminal, video projection device, and the like.
Interface(s) 1206 include various interfaces that allow computing device 1200 to interact with other systems, devices, or computing environments. Example interface(s) 1206 include any number of different network interfaces 1220, such as interfaces to local area networks (LANs), wide area networks (WANs), wireless networks, and the Internet. Other interface(s) include user interface 1218 and peripheral device interface 1222, such as interfaces for printers, pointing devices (mice, track pads, etc.), keyboards, and the like.
Bus 1212 allows processor(s) 1202, memory device(s) 1204, interface(s) 1206, mass storage device(s) 1208, and I/O device(s) 1210 to communicate with one another, as well as other devices or components coupled to bus 1212. Bus 1212 represents one or more of several types of bus structures, such as a system bus, PCI bus, IEEE 1394 bus, USB bus, and so forth.
For purposes of illustration, programs and other executable program components are shown herein as discrete blocks, although it is understood that such programs and components may reside at various times in different storage components of computing device 1200, and are executed by processor(s) 1202. Alternatively, the systems and procedures described herein can be implemented in hardware, or a combination of hardware, software, and/or firmware. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein.
While various embodiments of the present disclosure are described herein, it should be understood that they are presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the disclosure. Thus, the breadth and scope of the present disclosure should not be limited by any of the described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents. The description herein is presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Many modifications and variations are possible in light of the disclosed teaching. Further, it should be noted that any or all of the alternate implementations discussed herein may be used in any combination desired to form additional hybrid implementations of the disclosure.
This application is a Continuation in Part of U.S. application Ser. No. 17/691,662, entitled “Food Processing Systems and Methods,” filed Mar. 10, 2022, the disclosure of which is incorporated herein by reference in its entirety.
Relationship | Number | Date | Country
---|---|---|---
Parent | 17691662 | Mar 2022 | US
Child | 17826839 | | US