The present invention relates to random number generation. Specifically, but not exclusively, the invention relates to entropy sources for use in the generation of random numbers (or, random bit sequences) and to noise conditioners that can increase entropy in entropy sources and other random number generation pipelines.
True random number generation is one of the most important parts of any computer security system as this is where secret keys and other sensitive security parameters ultimately come from. Anything less than truly random numbers, or, rather, random bit sequences, increases opportunities for cyber-attacks, which may be based on the ability to predict sequences that are not truly random.
Many schemes are known for generating random numbers. For example, certain Intel® processors offer the RDRAND or RDSEED instructions, which return random numbers from on-chip hardware random number generators that have been seeded from an on-chip entropy source.
In the present description, an entropy source is generally a hardware bit source that produces a stream of bits having a random character. Entropy sources are commonly found on-chip, as external entropy may not be available or may not be reliable. Entropy sources may generate random bits using a noise source based on phenomena such as thermal noise in electronic circuits or systems, for example ring oscillators. For example, the entropy source for the RDSEED instruction runs asynchronously on a self-timed circuit and uses thermal noise within silicon to output a random stream of bits.
Entropy sources typically have a qualifiable level of entropy. High entropy implies high randomness whereas low entropy implies low randomness, and different entropy sources have different levels of entropy. Levels of entropy can be established using industry standard test procedures.
Often, entropy sources are not themselves used to deliver random numbers that are used by applications. Instead, the output from an entropy source may be used to seed a downstream random number generator, for example a cryptographic and/or deterministic pseudorandom number generator, which produces random numbers for use by software applications. Such downstream random number generators may be realized in hardware or software, based on random bits provided by a respective entropy source.
Certain known random number generators that are associated with large-scale integrated circuits, such as Intel™ and AMD™ processors, implement cryptographic conditioners using existing instructions, such as AES, which, for example, can take bit samples from a source and produce a conditioned (that is, an encrypted) entropy sample.
While known cryptographic conditioners perform adequately in some scenarios, they may not be appropriate for applications with resource-constrained, so-called lightweight processors and/or embedded systems, which may not have the chip real-estate or processing power to support functions such as AES. There is therefore a need for new kinds of entropy source suitable for lightweight microcontrollers, reduced instruction set architectures, such as RISC-V, and/or processors adapted for post-quantum cryptography.
The paper “Independent unbiased coin flips from a correlated biased source—a finite state Markov Chain”, by M Blum, Combinatorica 6 (2) (1986), pp 97-108 describes how to take von Neumann's scheme for simulating an absolutely unbiased coin using a biased coin and extend it to generate an independent unbiased sequence of Hs and Ts from any Markov chain in expected linear time. Blum presents two algorithms—A and B—and shows that algorithm A is “bad”, and that algorithm B is “good”. Blum's paper is theoretical and mathematical; it does not provide a concrete design for an entropy source suitable for low-cost and embedded computing platforms. Although Algorithm B is presented as a high-level method, no concrete implementation of the algorithm is provided.
Aspects and embodiments of the present invention are disclosed in the claims that are appended hereto.
Various features of the present disclosure will be apparent from the detailed description which follows, taken in conjunction with the accompanying drawings, which together illustrate features of the present disclosure, and wherein:
Examples described herein provide an entropy source for the secure and efficient generation of random number sequences. In preferred examples, the entropy source is implemented in hardware and is used as a resource by a secure kernel or operating system service for delivery of random bit sequences to applications upon a computing platform. In other examples, the entropy source may be implemented in firmware and/or software as a “protected” random number source or as a “virtual entropy source” for enclaves, simulators, and virtual machines. The entropy source may comprise a lightweight, adaptive non-cryptographic (arithmetic) noise conditioning mechanism with a particularly efficient computational arrangement. In “single vendor” implementations, where a manufacturer can control both hardware and software (e.g., firmware), the present examples may provide a set of components that ensure uncorrupted noise sources that allow for independent assessment. The present entropy source examples thus provide a minimalistic true random number generator that may be included as part of a secure microcontroller profile. The preferred hardware configuration of the entropy source may be cheaply constructed from standard cell logic components with easy synthesis via Application Specific Integrated Circuit (ASIC) flows as well as in the form of Field Programmable Gate Arrays (FPGAs) via popular FPGA development platforms.
In certain examples, both a noise source and a noise conditioner of the entropy source have continuous tests. A pre-processed noise signal from the noise source may also be available for evaluation. The entropy source may be implemented as part of a Reduced Instruction Set Computer (RISC) Central Processing Unit (CPU), such as a RISC-V CPU, and may be implemented with a noise source, noise conditioner, health tests, and a signaling mechanism for alarms. The global reset line of a CPU may serve a dual purpose of zeroization and initiating built-in self-tests (BIST). Noise conditioners as described herein use much less energy and have a much smaller implementation area than cryptographic conditioners, and can be free of computational hardness assumptions, making them essentially “quantum-safe”. The entropy source also supports numerous “post-quantum” security standards (e.g., those set by the National Institute of Standards and Technology—NIST) and thus forms a basis for large-scale post-quantum implementations.
The example entropy sources described herein are particularly suited to providing a lightweight and secure microcontroller true random number generator that exhibits good performance within a small (silicon) area while meeting a complex set of engineering and validation requirements (e.g., such as those set by a variety of cryptographic standards). Embedded systems may particularly benefit from an on-chip entropy source as per the described implementations as for these systems external entropy may not be available or may not meet high cryptographic standards (e.g., may be open to manipulation and/or may generally not be trustworthy).
The described examples may allow certain requirements for a true random number generator to be provided by a dedicated hardware (or software) entropy source. This then reduces the need for computationally intensive cryptographic conditioning in software prior to use. The present entropy source examples are a departure from previous CPU-based random number generation, which is often “30 parts” downstream cryptographic conditioning and “1 part” entropy generation. This is typically the case as these CPU-based random number generators were open to cryptographic attack or manipulation. The present examples further provide an entropy source wherein additional random bits may be supplied as desired (e.g., as polled). In comparative CPU-based random number generators (such as those used in Intel® processors), for larger bit sequences (e.g., over 128 bits) large blocks (e.g., 512-bit blocks) may need to be computed in a manner that is resource intensive, especially for battery-powered or other embedded implementations.
For the present purposes, a noise conditioner 114 is to increase the entropy of bits received from the noise source 112, and may be thought of as a device or processing module that aims to retain “significant” features or entropic bits of a bitstream while discarding features that are known (for instance from a model) to be redundant. For example, if it is assumed in an electronic circuit such as a ring oscillator that per-cycle jitter accumulates at a constant rate, a low-frequency input (which contains fewer cycles) may have more bits to ‘throw away’ than a high-frequency input. In the case of a low-frequency oscillator source, for example, discarded bits may comprise long runs of zeros or ones while the least significant bit of those run lengths might retain a significant amount of entropy. A noise conditioner 114 in that context may be seen as analogous with a dynamic “lossy compression algorithm”.
The architecture 100 also comprises software 120, including a software driver 122 and a software application 160. The software driver 122 comprises a polling interface 124 to interact with the hardware interface 118 by sending polling signals and receiving entropy bits. The software driver 122 may include a further conditioner 126, such as a cryptographic conditioner, and/or a deterministic random number generator (DRNG) 126, that takes the entropy bits and produces random bits, to a required standard, for use by the software application 160.
In the present example, the software application 160 may be a cryptographic application, which receives entropy bits from the entropy source 110 via the software driver 122. The software driver, when instructed by the software application 160, polls the buffer 116 via the hardware interface 118 and receives a stream of bits, for example 256 bits (if the buffer has 256 locations). The entropy source 110 may only deliver the entropy bits once the buffer 116 is full. The make-up of the software driver 122 and software application 160 varies according to the use case and further details are not provided herein. The entropy source 110 may, in certain cases, be implemented as part of a secure microcontroller profile for a RISC-based processor, such as a RISC-V processor.
Ring oscillators, of the kind illustrated in
The randomness of a ring oscillator is attributed to timing jitter. Timing jitter can be shown to have a strongly Gaussian character and is seen to accumulate in a phase difference against a reference clock, CLK, with variance σt² growing almost linearly from one cycle to the next.
It is noted that comparative noise sources such as an XOR sequence of multiple ring oscillators were considered. However, such constructions were found to lead to pseudorandom generators, where output sequences look random even without much jitter. This made it difficult to accurately estimate the produced entropy.
Under common conditions, transition time standard deviation (uncertainty) σt after time t can be estimated, for instance, for CMOS ring oscillators as:
σt² ≈ η·(8/3)·(kT/P)·(VDD/Vchar)·t
with time t, proportionality constant η˜1, Boltzmann constant k, absolute temperature T, power dissipation P, supply voltage VDD, and device characteristic voltage Vchar. It has been observed for the present purposes that the number, N, of NOT gates in the loop does not materially impact performance, although N and the frequency f affect P via known dynamic switching equations.
It is evident from the foregoing equation that variations in temperature and voltage can materially impact ring oscillator jitter and, consequently, the level of entropy of a respective noise source. It has therefore been appreciated for the present purposes that entropy sources may benefit from a degree of adaptive behavior, to adapt to increasing or decreasing levels of entropy, as will be described further herein.
Other examples may use alternative noise sources. These include continuous and shot (Poisson) electronic noise processes, such as those from noisy diodes. Physical sources that can be digitized as noise input include thermal, optical, mechanical, and quantum phenomena. Quantum phenomena yielding randomness include quantum state superposition, quantum state entanglement, Heisenberg uncertainty, quantum tunneling, spontaneous emission, and radioactive decay. Noise sources do not need to be directly physical; examples of derived random sources include system events and communications traffic in a computer system. Other examples based on the teachings herein may employ any such alternative noise sources.
Entropy sources may comprise a conditioner, sometimes referred to as a noise conditioner, to increase the entropy of a bit stream delivered by the entropy source. Entropy may be increased, for example, by removing redundant bits, such as long streams of 0s or 1s, or artifacts that characterize the noise source, which would otherwise act to reduce the entropy of the entropy source.
The noise conditioner 114 of the present example employs functions and logic that implement one or more Markov Chains to condition bits that are output from the noise source 112. Using Markov Chains in a so-called deterministic circuit can be criticized (for example, see David Johnston, “Random Number Generators—Principles and Practices: A Guide for Engineers and Programmers”, De Gruyter, 2018) because Markov Chains only have a finite set of states whereas real-world entropy bits, for example derived from a noise source 112 as described, arguably have an infinite number of states. It is therefore impossible for a practical Markov Chain to track all possible states of real-world entropy bitstreams. It is also perhaps counterintuitive to expect a ‘stateful conditioner’ to be able to make bits in a bitstream more independent. However, it has been determined herein that use of Markov Chains, even with only a few states, greatly helps to reduce bias and correlation in entropy bits produced by a noise source 112. Indeed, it has been shown that Markov Chains as applied herein in noise conditioners can increase entropy to well-above the lower threshold demanded by industry standards.
In certain examples described herein, a noise conditioner is used as a device that is configured to retain “significant” features or entropic bits while discarding bit sequences that are known from a model to be redundant. The noise conditioner may be seen as implementing a form of lossy compression algorithm on the raw noise source. In certain cases, the noise conditioning within the entropy source may be followed by cryptographic conditioning (e.g., within software following polling) such as a hash function that further evenly distributes input entropy to output bits. In examples, the noise conditioner may be configured to provide adequate independent entropy content under both normal and adversarial conditions. If noise (e.g., bits from the noise source) cannot be processed to meet a desired set of constraints, the entropy source has characteristic, identifiable failure modes and mechanisms to detect them. This is all provided within implementation circuitry that is small and power efficient.
Correlation can be mitigated using an n-state Markov model, as was proposed by Manuel Blum, where n>1.
For a Markov Chain, a single input bit, together with the current state 0≤i<n, determines the next state; there are again two choices with complementary probabilities pi and 1−pi for which exit from the state is taken. It is assumed that these “exit probabilities” are static, insofar as they do not change over time. With these assumptions, an arbitrary state may be selected, state transitions tracked, and exit bits recorded only from that single state. The resulting bit sequence would have an uncorrelated bias pi. It has been appreciated that one can then apply a von Neumann debiaser to only those bits to produce a bit sequence that is simultaneously decorrelated (via the foregoing state-correlation assumption) and unbiased.
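The per-state debiasing referred to above builds on the classic von Neumann debiaser. As a point of reference, that debiaser may be sketched in software as follows; this is an illustrative model, not part of the described hardware:

```python
def von_neumann_debias(bits):
    """Classic von Neumann debiaser: consume input bits in
    non-overlapping pairs; emit the first bit of an unequal pair
    ((0,1) -> 0, (1,0) -> 1) and discard equal pairs entirely."""
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out
```

For a biased but independent source with P(1)=p, each emitted bit has probability exactly 1/2, at the cost of a yield of at most one output bit per two input bits.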
Blum observed that a straightforward approach to increasing output rate by viewing every state as an independent von Neumann debiaser and directly merging their outputs does not yield independent, unbiased output bits. In an alternative approach, therefore, Blum proposed a modification to address this, assuming that the Markov assumption (that is, the memoryless property of a stochastic model) holds but without knowledge of its static exit probabilities. The modification is presented in Blum's so-called “Algorithm B” (as presented in “Independent unbiased coin flips from a correlated biased source—a finite state Markov Chain”, M Blum, Combinatorica 6 (2) (1986), pp 97-108). In this variant, the model outputs a (debiased) von Neumann bit from a state only after that state is re-entered. Due to this delay, the approach adopted by Algorithm B is said to be ‘patient’.
A conditioner herein that operates generally in accord with or implements Blum's Algorithm B is referred to as a Blum B Conditioner. Implementations described herein address a common criticism of Blum's algorithm, namely that real-world entropy source circuits tend to have infinite states, which cannot be tracked well in practice. A naïve implementation of Blum's algorithm therefore needs to address the need to track states. In the present examples, it has been found that a multiple stage, multiple state arrangement such as shown in
Experiments performed for the present purposes considered the Markov Chain 400 in
For the present purposes, the performance of different conditioners has been evaluated using a range of noise corpus datasets {Corpus A-Corpus E}. On the graph in
From a practical point of view, therefore, it has been shown that a 2³=8-state Blum B Conditioner is capable of offering relatively high entropy, in the form of unbiased and relatively uncorrelated bits, while having a minimal architectural and power overhead in, for example, an embedded processor. In other words, although a Blum B Conditioner with more states can be applied to examples herein, the architectural overhead of the additional states typically would not be warranted due to the relatively small increase in entropy of the resulting bit sequence. In addition, if it is assumed that correlation is the strongest with most recent bits inputted from a noise source, an 8-state Blum B Conditioner should be expected to perform decorrelation adequately most of the time.
A Markov Chain 700 for an 8-state Blum B Conditioner, which can be implemented according to the present example, is shown in
An architecture arranged to implement an 8-state Blum B Conditioner 800, implementing the Markov Chain 700 of
Because a shift register arrangement operates as a memory of the previous input bits, it has been appreciated that the shift register arrangement can approximate the state of the Blum B Conditioner (or, indeed, any circuit that realizes a Markov Chain) and represent or correspond with the respective transitions. Accordingly, a one-bit shift register can be used in a 2-state Markov Chain, a two-bit shift register can be used in a 4-state Markov Chain, and, as in the current example, a 3-bit shift register can be used in an 8-state Markov Chain, and so forth. Any appropriate arrangement of memory and instructions/logic circuitry may be applied in examples herein to perform the role of the shift register arrangement 808.
The selector 810 is arranged to select one of eight destination logic cells, in this example comprising von Neumann cells 818, which perform von Neumann debiasing, as will be described, according to addressing or selection determined by the bit values {Dj−1, Dj−2, Dj−3} in the shift register arrangement 808. The input D is sent to a selected von Neumann cell 818 by the selector 810. The selected von Neumann cell 818 produces an output Q of the Blum B Conditioner based on its current state S and the value of the new input bit D. The remaining seven von Neumann cells maintain their states and do not perform any function until they are again selected by the selector 810. On a next clock cycle CLK, the input bit D enters the shift register arrangement 808 at Dj−1 and the oldest bit in the shift register 808, Dj−3, is dropped or discarded. Any appropriate arrangement of memory and instructions/logic circuitry may be applied in examples herein to perform the role of the selector 810.
As will be appreciated, as input bits enter the shift register arrangement 808, the selector 810 selects the von Neumann cells 818 in accordance with the transitions between cells that are represented by the Markov Chain 700 in
The operation of the architecture of
Of course, feeding bits into the left-hand side of the shift register arrangement 808 is only one option. Feeding bits into the right-hand side of the shift register arrangement 808 is also viable, and, in that case, would deliver addresses: [abc, bcd, cde, def], and the shift register arrangement 808 would operate instead like a ‘sliding window’ over the input bit stream. In this case, certain transitions in the Markov Chain 700 would need to be altered to match the new set of possible transitions.
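The two feed directions may be illustrated with a short software model, with letters standing in for individual bits; this is an illustrative sketch of the register contents seen on successive clocks, not part of the described hardware:

```python
from collections import deque

def addresses(bits, feed="left", width=3):
    """Track the contents of a 3-bit shift register as successive
    input bits arrive, returning the selector address visible on
    each clock once the register is full. 'left' feed inserts each
    new bit at Dj-1 and drops the oldest bit; 'right' feed behaves
    as a sliding window over the input stream."""
    reg = deque(maxlen=width)
    out = []
    for b in bits:
        if feed == "left":
            reg.appendleft(b)   # new bit enters at Dj-1
        else:
            reg.append(b)       # sliding-window variant
        if len(reg) == width:
            out.append("".join(reg))
    return out
```

For input bits a, b, c, d, e, f, the right-feed variant yields the sliding-window addresses [abc, bcd, cde, def] noted above, whereas the left-feed variant yields the bit-reversed windows [cba, dcb, edc, fed].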
It will be appreciated that the architecture in
The von Neumann cells 818 follow logic to process input bits D according to the pseudo-code in
In operation of the von Neumann cell, and with reference to
As has already been indicated, the input bits D 802, when clocked by the clock signal CLK 804 (when the enable input W 806 is active), feed the shift register arrangement 808 and the selector 810 and produce an output Q 814 (or not) according to the logic. In producing an output, the state is set to the value of D independently of whether S=H or S=T. This can be thought of as “wipe-on-read”. Each von Neumann cell may be efficiently implemented using three internal flip-flops to store the state and externally each cell may be provided with a simple serial datastream interface.
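The combined behavior of the shift register arrangement 808, the selector 810, and the von Neumann cells 818 may be sketched in software as follows. This is an illustrative model of one plausible reading of the cell logic (non-overlapping pairs with a “wipe-on-read” reset to λ, here represented by None), not a definitive hardware implementation:

```python
LAMBDA = None  # the empty 'λ' state: cell holds no pending bit

class VonNeumannCell:
    """One delayed von Neumann debiasing cell. The cell remembers
    the bit it received on its previous selection; when selected
    again it emits a debiased bit for an unequal pair and wipes
    its state ('wipe on read')."""
    def __init__(self):
        self.state = LAMBDA

    def step(self, d):
        if self.state is LAMBDA:
            self.state = d            # remember first bit of a pair
            return None
        out = self.state if self.state != d else None
        self.state = LAMBDA           # wipe on read
        return out

class BlumB:
    """8-state (k=3) Blum B conditioner model: the k most recent
    input bits address one of 2**k von Neumann cells, and each new
    input bit is routed to the addressed cell."""
    def __init__(self, k=3):
        self.k = k
        self.hist = []                # shift register contents
        self.cells = [VonNeumannCell() for _ in range(2 ** k)]

    def step(self, d):
        out = None
        if len(self.hist) == self.k:  # register full: select a cell
            addr = int("".join(map(str, self.hist)), 2)
            out = self.cells[addr].step(d)
        self.hist = ([d] + self.hist)[: self.k]  # d enters, oldest drops
        return out

    def condition(self, bits):
        return [q for b in bits if (q := self.step(b)) is not None]
```

Note how a perfectly periodic input such as 0,1,0,1,... produces no output at all: each cell only ever sees equal pairs, so the periodic (pseudorandom) structure is rejected rather than passed through.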
One advantage of the two-stage noise conditioner as described in
In arriving at exemplary designs herein, various entropy source configurations have been modelled and tested assuming a sampled, digital oscillator noise source. For sampled, digital oscillator sources, signal amplitude is ignored and a pulse wave with a period T is considered with a relative pulse width (“duty cycle”) D. A constant sampling rate is assumed and the sample bits are used as a measure of time.
A sinusoidal phase ω is normalized with x=(ω−δ)/2π to the range 0≤x<1, where δ is a leading edge location. The base frequency F=1/T is the per-bit increment to the phase, and σ² is the variance of the per-bit jitter. The behavior of a (F, D, σ²) noise source sampler can then be modelled as
xi+1 = (xi + F + N$(0, σ²)) mod 1
zi+1 = 1 if xi+1 < D, and zi+1 = 0 otherwise
Given some initial state x0, and using a normal distribution random generator (sampler) N$, these two equations can be used to generate a simulated sequence of bits z1, z2, z3, . . . matching the model. The per-bit variance σ² is related to a “RMS jitter” measure tjit(cc) via the standard deviation √(σ²/F).
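The sampler model may be simulated with a few lines of software; this is an illustrative sketch under the stated assumptions, with the function name and parameters chosen here for clarity:

```python
import random

def sample_noise(n, F, D, sigma2, x0=0.0, seed=1):
    """Simulate n sampled bits from a (F, D, sigma^2) oscillator
    model: the normalized phase x advances by the base frequency F
    plus Gaussian per-bit jitter of variance sigma^2, wrapped mod 1;
    the sampled bit is 1 while the phase lies inside the pulse
    (x < D, where D is the duty cycle)."""
    rng = random.Random(seed)
    x, bits = x0, []
    for _ in range(n):
        x = (x + F + rng.gauss(0.0, sigma2 ** 0.5)) % 1.0
        bits.append(1 if x < D else 0)
    return bits
```

With D=1/2 the long-run proportion of ones approaches 1/2 regardless of F, while the run-length structure of the output depends on F and σ², which is what the conditioner yield measurements exercise.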
Yield (y axis) on the graphs is a ratio of the number of bits output from the conditioner to the number of bits input into the conditioner. Yield is an important parameter not only because it indicates the “output speed” of the conditioner for any given input bit rate but also because it indicates accumulated RMS jitter, which is approximately √(σ²/Y).
Since initially some bits are discarded by the Blum B Conditioner (as all states are initially “λ”), a continuous yield was approximated by feeding the conditioners 2000 bits with given parameters (F, D, σ2) and measuring how many bits were output. In the case of the von Neumann debiaser, the output can be between 0 and 500 (maximum yield 0.5, the first 1000 bits having been discarded). The simulations used n=40000 samples (of 1000 bits) each. As a von Neumann debiaser is expected to handle basic bias perfectly, bias was set to a duty cycle D=½. Frequency F was random modulo the sampling frequency—between 0 and the “Nyquist bound” ½.
As shown in
The adaptive nature of the Blum B Conditioner not only means that the conditioner will work with different kinds of noise source, it also means that as environmental parameters, especially temperature, change, a minimum level of entropy of the output bits will be maintained. This is extremely important for devices and systems that can be installed in all manner of different and/or extreme locations and environments, unlike computer servers and other computer systems that typically reside in temperature-controlled environments. For the present purposes, performance was tested with varying temperatures of operation down to −50° C. by freezing the circuitry using freeze spray, and the adaptive nature of the Blum B Conditioner was in accord with the modelling.
Considering a practical design for a Blum B Conditioner, it is noted that bit-pattern state i of a Blum B Conditioner is capable of “isolating” local entropy “features” such as leading and falling edges to perform decorrelation. However, the scope of such decorrelation is in relation to k input bits, rather than output bits, which is perceived to be a problem for low-entropy or variable-entropy noise sources. As has been alluded to, increasing k (and with it the number of states, 2^k) may improve the situation marginally, but doing so leads to diminishing returns and comes at the cost of an exponentially-growing circuit footprint. Hence, examples herein apply two stages of conditioner, each with a modest number of states (and relatively low real estate overhead), rather than a single conditioner with a far greater number of states. In one example, both stages of conditioner are Blum B Conditioners. In this way, two stages of variable-yield, adaptive noise conditioners combine to mitigate more macroscopic features. Conveniently, both stages of Blum B Conditioner employ k=3 (8 states). Having two 8-state stages has the potential to provide better adaptive performance than a single conditioner that is limited to a window of 6 bits, in addition to having a smaller implementation size. In other words, since both Blum B Conditioners are adaptively dynamic, the overall “memory” reaches well past 6 bits in the distance, and so the 2×8-state Blum B Conditioners should provide better decorrelation and yield than a single 64-state Blum B Conditioner. The configuration of
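The two-stage cascade may be sketched as follows, again as an illustrative software model of the 8-state stage described above (None standing in for the empty λ state), not a definitive hardware implementation:

```python
def blum_b_stage(bits, k=3):
    """One 8-state (k=3) Blum B stage: the last k input bits address
    one of 2**k delayed von Neumann cells; an unequal pair at a cell
    emits its first bit, and the cell is wiped after every pair."""
    cells = [None] * (2 ** k)
    hist, out = [], []
    for d in bits:
        if len(hist) == k:                 # register full: route the bit
            addr = int("".join(map(str, hist)), 2)
            s = cells[addr]
            if s is None:
                cells[addr] = d            # first bit of a pair
            else:
                if s != d:
                    out.append(s)          # debiased output bit
                cells[addr] = None         # wipe on read
        hist = ([d] + hist)[:k]            # d enters, oldest drops
    return out

def two_stage(bits):
    """Cascade two 8-state stages; the total yield is the product of
    the per-stage yields, and each second-stage input bit is already
    a function of many raw noise bits when the first-stage yield is
    low."""
    return blum_b_stage(blum_b_stage(bits))
```

Because each stage discards at least half of its routed bits, the cascaded yield never exceeds roughly one quarter, while the effective “memory” of the second stage reaches much further back into the raw noise stream than its own 3-bit window.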
This is useful for strongly biased or serially correlated input; for example, the first “feature detection” stage might have a yield of only 0.1; the 3 bits seen by the second stage are then a function of 3/0.1=30 raw noise bits rather than 6.
Certain examples described herein thus provide a two-stage noise conditioner that goes against conventional design principles. For example, it is normally desired to avoid pseudorandomness in the entropy path before any cryptographic conditioner. The present examples however use a stateful conditioner. Yet this stateful conditioner is applied to help make output bits more independent. For example, a differing input bit will cause a branch to a different delayed von Neumann cell and the input bit to the von Neumann cell sets the state change of the cell. This results in a cascade of branching states based on the input bits while the carry-over effect is limited via a “wipe on read” property as implemented by the von Neumann cell (e.g., when a bit is read from a state, the state bit Si is set to λ and no longer contains information about that output bit).
An example of a practical entropy source 900 is illustrated in
As shown in
The noise conditioner in
According to examples herein, two false-positive rates are applied to the online tests. A first rate α=2^−64 indicates fatal noise source failures and a second rate β=2^−20 indicates recoverable entropy source failures. The rates may be expressed “per bit tested”. The bounds are configurable and can be set according to use case.
For the present purposes, the fatal failure rate α is expressed in relation to the noise source bits. It is chosen so that it should not significantly affect an overall failure rate of the system, even if the continuously-running noise source is relatively fast. The false-positive rate reaches 50% for each test only after about 400 GHz-years of operation, scaled by the associated sampling frequency.
The secondary alarm rate β is interpreted in relation to noise conditioner output bits, assuming a software driver 122 has a cryptographic conditioner 126 that processes 16×16=256-bit ES16 blocks. Thus, for four tests, it can be seen that 1−(1−β)^(4·256)≈0.001, so that approximately 1 in 1000 entropy source blocks will produce a recoverable alarm. This was chosen to be high enough that a rate of non-fatal alarms can be monitored. It can also be seen that, after three immediately consecutive failures, the fatal bound is met if a 1/16 noise conditioner yield is assumed (then β³/16=α).
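The quoted rates and their relationships can be checked with a few lines of arithmetic; this is an illustrative numerical check, not part of the described hardware:

```python
import math

alpha = 2.0 ** -64   # fatal false-positive rate, per noise-source bit
beta = 2.0 ** -20    # recoverable false-positive rate, per output bit

# Probability that at least one of the four tests fires somewhere in
# a 4 x 256-bit span of conditioner output: about 1 in 1000 blocks.
p_block = 1.0 - (1.0 - beta) ** (4 * 256)

# Three immediately consecutive recoverable failures, at an assumed
# 1/16 conditioner yield, meet the fatal bound exactly.
fatal_equiv = beta ** 3 / 16

# Time for a per-bit rate of 2**-64 to reach a 50% cumulative
# false-positive probability at a 1 GHz sampling rate, in years
# (the '400 GHz-years' figure quoted above).
years_to_half = math.log(2) * 2 ** 64 / 1e9 / (365.25 * 24 * 3600)
```

The powers of two make the second identity exact in floating point, since β³/16 = 2^−60/2^4 = 2^−64 = α.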
The online test 920 for the noise source 905 comprises a repetition count test (RCT) 922 and an adaptive proportion test (APT) 924. Both tests enforce a threshold entropy level below which an alarm is generated, preventing entropy bits from being output from the entropy source 900. According to the present example, the entropy level (min-entropy) is set to H=0.5, which corresponds to a maximum probability of p=2^−0.5≈0.7071. Of course, the noise source (a ring oscillator, according to the example) is designed to be better than that and so if any bias worse than this emerges it is likely that there has been a degradation or failure in the respective circuitry.
According to the present example, the RCT 922 counts repeated identical samples (runs of 1s or 0s). A fatal cutoff threshold is set, above which an alarm is generated that prevents entropy bits from being output from the entropy source 900. According to the present example, the fatal cutoff is computed as:
Crct = 1 + ⌈−log₂(α)/H⌉ = 1 + ⌈64/0.5⌉ = 129
such that any run of a repeated sample above 129 bits generates the alarm. Other thresholds can be employed according to the use case.
According to the present example, the APT 924 is a simple bias test, as follows. Let B0 and B1 be the number of 0 or 1 bits in a window W of 1024 bits. A fatal alarm is triggered when max{B0, B1}≥Capt for a cutoff value of Capt.
At min-entropy H=0.5, Capt=841, whereby a count meeting or exceeding this threshold causes a fatal alarm which prevents entropy bits from being output from the entropy source 900. Other thresholds can be employed according to the use case.
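The RCT and APT described above may be sketched in software as follows. The cutoff computation follows the SP 800-90B-style formula and reproduces the 129-bit and 841-count thresholds quoted above; the functions are illustrative models, not the described hardware:

```python
import math

H = 0.5              # assumed min-entropy per noise-source bit
ALPHA_EXP = 64       # fatal false-positive rate alpha = 2**-64

# Repetition count cutoff: 1 + ceil(-log2(alpha) / H) = 129 at H = 0.5
C_RCT = 1 + math.ceil(ALPHA_EXP / H)

def rct_alarm(bits, cutoff=C_RCT):
    """Fatal alarm if any run of identical bits reaches the cutoff."""
    run, prev = 0, None
    for b in bits:
        run = run + 1 if b == prev else 1
        prev = b
        if run >= cutoff:
            return True
    return False

def apt_alarm(bits, window=1024, cutoff=841):
    """Adaptive proportion test over non-overlapping 1024-bit windows:
    alarm if either bit value occurs at least 'cutoff' times."""
    for i in range(0, len(bits) - window + 1, window):
        w = bits[i:i + window]
        ones = sum(w)
        if max(ones, window - ones) >= cutoff:
            return True
    return False
```

Both tests run continuously in the entropy path; in the hardware they gate the output buffer rather than returning a Boolean.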
Regarding performance of the noise conditioner 910, if, say, pseudorandom behavior of the noise source is causing significant overall entropy loss (due, for example, to external interference or a complex phase lock), bias or short repeating patterns might be expected. These can be tested for using an autocorrelation test, for example, by XORing an output bit sequence with a delayed (or shifted) version of itself. If the original sequence is sufficiently random, the XOR result should also be random, while serial correlation causes a bias. According to the present example, four noise conditioner output tests measure bias (no XOR), and bias in XOR with the sequence delayed by 1, 2, and 3-bit positions. These tests may be performed assuming H>0.5.
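The four bias measurements described above may be sketched as follows; this is an illustrative software model, and the alarm threshold used here is for demonstration only:

```python
def bias_tests(bits, max_lag=3, threshold=0.6):
    """Measure bias of the sequence itself (lag 0) and of the XOR of
    the sequence with copies of itself delayed by 1..max_lag bit
    positions. For a sufficiently random sequence every measured
    proportion of ones stays near 1/2; a strong deviation at some
    lag indicates serial correlation at that lag."""
    results = {}
    for lag in range(max_lag + 1):
        seq = bits if lag == 0 else [a ^ b for a, b in zip(bits, bits[lag:])]
        results[lag] = sum(seq) / len(seq)
    # flag any lag whose proportion of ones deviates too far from 1/2
    alarms = [lag for lag, p in results.items()
              if abs(p - 0.5) > threshold - 0.5]
    return results, alarms
```

For example, the strictly alternating sequence 0,1,0,1,... is unbiased at lag 0 but maximally correlated at lags 1 to 3, so only the delayed-XOR tests detect it.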
The BIST may be initiated either by a reset or when a non-fatal alarm is generated by the noise conditioner tests. The following BIST tests may be employed, according to an example:
The foregoing tests and associated parameters can be adjusted to work well with a particular ring oscillator or an operating frequency. According to an example, the BIST state persists until polled at least once by a software driver 122 so that the driver can register even non-fatal errors.
Tests carried out on a two-stage noise conditioner of the kind illustrated in
Entropy and yield assessments are made at the input 911 to the noise conditioner 910 (H0), between the two Blum B Conditioners (H1, Y1), and at the output 912 from the noise conditioner 910 (H2, Y2), without assuming that the data is IID (independent and identically distributed). In addition, Hmin is shown, which is the final minimum-entropy for the arrangement if the output is considered to produce IID bits (which is based on just bias). As can be seen for Sample A, H2 is not always higher than H1, which implies there is some variability in performance depending on the input samples; this is attributed to the nature of “randomness”. In general, however, the first stage Blum B Conditioner is shown to increase entropy and yield significantly, while the second stage Blum B Conditioner is shown to provide a further gain in entropy and yield, towards Hmin.
As shown, total yield is Y1·Y2 < 1/16. Therefore, the conditioner may be seen to have a nominal output speed of over 1 Mbps for a noise source that samples a ring oscillator against a 25 MHz reference clock (a yield approaching 1/16 of 25 MHz corresponds to roughly 1.56 Mbps), which, in practical terms, is fast enough for most applications that require a random bit stream.
Certain examples described herein may provide an output random bit sequence that is close to Independent and Identically Distributed (IID) and/or that contains at least 0.997 bits of Shannon entropy per output bit. This differs from comparative raw noise sources that are usually not IID and/or do not meet the aforementioned entropy criteria. The present examples provide “arithmetic” rather than cryptographic post processing in the form of the noise conditioners built into the entropy source. These are much more resource efficient than cryptographic conditioners and may be free of computational hardness assumptions, thus making them suitable for post-quantum computing and cryptography (e.g., making them essentially “quantum-safe”).
It is noted that many “randomness tests” that are used in comparative random number generators may be suitable for evaluating randomness for Monte Carlo simulations but may be ill-suited to security and cryptographic applications. For example, cryptanalysis with linear algebra shows that “random” bit sequences generated with linear-feedback shift registers used in certain random number generators may present a security risk. In these comparative cases, an internal state of a linear-feedback shift register may be derived from a relatively small amount of output, allowing future and past outputs to be reproduced with little effort. This may be a devastating outcome if the output of the “random” number generator is used for a cryptographic keyring. In contrast, the present examples provide random bit sequences that are well suited for cryptographic applications without a risk of past or future output discovery.
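The linear-feedback shift register weakness described above can be demonstrated concretely: for a simple Fibonacci LFSR that outputs its leading bit, the first n output bits are exactly the n-bit internal state, so an observer who sees them can replay the entire stream. The 16-bit register and tap positions below are arbitrary examples for illustration, not taken from the text.

```python
# Demonstration of LFSR predictability: knowing a short output prefix of a
# Fibonacci LFSR reveals its internal state and hence all future output.

def lfsr_stream(state, taps, n):
    """Generate n output bits from a Fibonacci LFSR (state as a bit list)."""
    state = list(state)
    out = []
    for _ in range(n):
        out.append(state[0])          # output the leading bit
        fb = 0
        for t in taps:                # feedback is the XOR of tapped bits
            fb ^= state[t]
        state = state[1:] + [fb]      # shift and append the feedback bit
    return out

taps = [0, 2, 3, 5]                                   # example tap positions
secret_state = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
observed = lfsr_stream(secret_state, taps, 16)        # attacker sees 16 bits
# For a 16-bit register, those 16 observed bits ARE the secret state, so the
# attacker can regenerate the generator's output indefinitely:
assert observed == secret_state
```

This is why such sequences, while adequate for simulation, are unsafe for key material.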
A flow diagram illustrating the operation of an entropy source according to an example is shown in
In a first step 1000, noise samples are generated by a noise source 905. Then 1010, a first Blum B Conditioner stage 913 of a noise conditioner 910 conditions the noise samples and, next 1020, the conditioned noise samples are further conditioned by a second Blum B Conditioner stage 914 of the noise conditioner 910. Next, the conditioned noise samples are stored in a buffer 915 from where a software driver 122 may poll and receive conditioned noise samples.
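The two-stage conditioning flow above can be sketched as a software behavioral model. The sketch below assumes each Blum Algorithm B stage uses k = 3 bits of history to select one of 2^3 delayed von Neumann cells, as in the examples; class and function names are illustrative, and this is not the hardware implementation itself.

```python
# Behavioral model of a two-stage Blum Algorithm B noise conditioner:
# the previous k bits address one of 2^k cells; each cell applies a
# von Neumann step (emit the first bit of a differing pair, discard
# equal pairs), which discards bits that would add bias or correlation.

class BlumBStage:
    def __init__(self, k=3):
        self.history = [0] * k          # shift register of previous bits
        self.cells = [None] * (2 ** k)  # pending bit held in each cell

    def push(self, bit):
        """Feed one input bit; return an output bit, or None if discarded."""
        addr = 0
        for h in self.history:          # address selected by prior k bits
            addr = (addr << 1) | h
        out = None
        pending = self.cells[addr]
        if pending is None:
            self.cells[addr] = bit      # first bit of a pair: hold it
        else:
            if pending != bit:          # differing pair: emit first bit
                out = pending
            self.cells[addr] = None     # equal pair is discarded
        self.history = self.history[1:] + [bit]
        return out

def condition(bits):
    """Pass bits through two Blum B stages in series (steps 1010, 1020)."""
    s1, s2 = BlumBStage(), BlumBStage()
    out = []
    for b in bits:
        m = s1.push(b)
        if m is not None:
            o = s2.push(m)
            if o is not None:
                out.append(o)
    return out
```

Note how the output rate adapts to input quality: a strictly alternating (perfectly correlated) input yields almost no output, since each cell keeps receiving equal pairs, while varied input passes through at up to a quarter of the rate per stage.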
The preceding description has been presented to illustrate and describe examples of the principles described. This description is not intended to be exhaustive or to limit these principles to any precise form disclosed. Many modifications and variations are possible in light of the above teaching. For example, different noise sources may be employed to deliver random bits into a noise conditioner. Such noise sources may be ‘on-chip’ and comprise circuits or may be derived from external devices or systems. In addition, the form of the noise conditioner may vary. For instance, any number of Blum B Conditioners may be employed. There may only be one Blum B Conditioner, there may be two (as exemplified) or there may be more than two, for instance, 3, 4, 5 or more Blum B Conditioners. For each Blum B Conditioner, the number of states may differ from 8, being any number 2^k. Or, one Blum B Conditioner may have 8 states and another may have a different number of states. The number of states in a Blum B Conditioner determines the configuration of the logic, for instance in terms of the number of stages in a shift register and the associated logic to select an output logic cell, for instance a von Neumann debiaser. In other examples, the noise conditioner may comprise at least one Blum B Conditioner and one or more other stages that perform debiasing and/or decorrelation in other ways. Various other tests or alarms, with associated parameters and/or thresholds, may be applied to different configurations of entropy source. Noise conditioners, and indeed entropy sources more generally, according to examples herein, may be realized, without limitation, in hardware, such as hard-wired or configurable circuitry, in firmware, in software or in any appropriate combination of the foregoing, as determined by the use-case.
An entropy source according to examples herein may be integrated with a central processing unit or operate and be realized independently thereof, for instance as a co-processor or other separate processing unit of an overall computer/processing architecture. In other examples, the entropy source may be integrated into a standalone processing environment, for instance an embedded processor that may be deployed in a connected, IoT device or apparatus. In examples, the entropy source is adapted to be deployed in lightweight microcontrollers, reduced instruction set architectures, such as RISC-V, and/or processors adapted for post-quantum cryptography.
It is to be understood that any feature described in relation to any one example may be used alone, or in combination with other features described, and may also be used in combination with any features of any other of the examples, or any combination of any other of the examples.
Certain un-claimed examples will now be presented as a set of clauses.
In one case, a processing module or noise conditioner is provided for reducing bias and/or increasing the entropy of an output bit sequence compared to an input bit sequence. The processing module is arranged to output the output bit sequence with an output bit rate that varies according to the entropy of the input bit sequence, such that an increase in the entropy of the input bit sequence is associated with an increased output bit rate and a decrease in the entropy of the input bit sequence is associated with a decrease in the output bit rate.
The processing module may have an input to receive input bits from a source and an output to produce output bits for use in random number generation. The processing module may implement a 2^k-state Markov Chain (e.g., where k=3), wherein each state has two entry transitions from one or more previous states and two exit transitions to one or more subsequent states. The processing module may comprise a processing block arranged to output bits according to Blum's Algorithm B, the processing block implementing the Markov Chain. The processing module may implement at least a first 2^k-state Markov Chain and a second 2^J-state Markov Chain, in series with the first Markov Chain. Both k and J may equal 3. The processing block may be arranged to output bits according to at least two sequential stages of Blum's Algorithm B, each stage implementing an instance of a Markov Chain.
In certain cases, a logic device is provided, comprising an input, a selector and 2^L output logic cells, the selector arranged to select an output logic cell based on a sequence of L input bits from an input bit sequence received at the input, the selected output logic cell arranged to produce an output bit according to the logic in the output logic cell operating on an input bit value. The logic device may comprise a von Neumann cell as described in the above examples.
The logic device may comprise a shift register to receive the input bit sequence, wherein L stages of the shift register are arranged to address the selector to select one of the 2^L output logic cells (e.g., where L=3). Each of the output logic cells may be arranged to perform a debiasing logic operation, wherein the debiasing logic operation may comprise a delayed von Neumann debiasing logic operation. The logic device may be arranged to output bits according to Blum's Algorithm B and/or to implement a Markov Chain. The selector may be arranged to select output logic cells, on the basis of the input bit sequence, thereby to implement the Markov Chain.
In one case, a processing module as described above may comprise at least one logic device as also described above. In one case, two or more logic devices may be arranged in series and an output bit from a first of the logic devices is fed as an input bit into a second of the logic devices.
In one case, an entropy source for use in random number generation, may comprise a noise source and a noise conditioner in the form of a processing module as described above. A hardware circuit for use in random number generation, may comprise this entropy source. The noise source may comprise a ring oscillator, wherein the ring oscillator may comprise a plurality of invertors, for example three invertors, connected in a free-running loop. Preferably there are an odd number of invertors. The hardware circuit may comprise a buffer to hold a number of bits comprised in an outputted random bit sequence. The buffer may be arranged to be polled and deliver a number of bits of the held random bit sequence.
In one case, an adaptive rate entropy source is provided for reducing bias and/or increasing the entropy of an output bit sequence compared to an input bit sequence, the adaptive rate entropy source arranged to output the output bit sequence with an output bit rate that adapts according to the entropy of the input bit sequence, such that an increase in the entropy of the input bit sequence is associated with an increased output bit rate and a decrease in the entropy of the input bit sequence is associated with a decrease in the output bit rate.
In one case, a processing module is provided that is arranged to reduce bias and/or increase the entropy of an output bit sequence compared to an input bit sequence, the processing module being further arranged to output the output bit sequence with an output bit rate that varies according to the entropy of the input bit sequence, such that an increase in the entropy of the input bit sequence is associated with an increased output bit rate and a decrease in the entropy of the input bit sequence is associated with a decrease in the output bit rate, and thereby reduce bias and/or increase the entropy of the output bit sequence compared to the input bit sequence, wherein the output bit rate is reduced by discarding input bits that would otherwise lead to increased bias and/or reduced entropy.
In one case, a noise conditioner is provided comprising an input to receive an input bitstream having a first entropy and an output for delivering an output bitstream having a second, increased entropy compared to the first entropy. The noise conditioner may comprise: a shift register arrangement operative on each clock cycle of a system clock to load a next input bit of the input bitstream into the shift register arrangement; 2^k destination logic cells each arranged to perform a debiasing operation on the basis of the next input bit and a previous input bit of the input bitstream that was clocked into the respective destination logic cell and either output an output bit or not output an output bit according to the outcome of the debiasing operation performed by the respective destination logic cell; and a selector operative to select a destination logic cell on the basis of an address determined by a sequence of k previous input bits of the input bitstream that are loaded in the shift register arrangement, whereby each next input bit introduced into the shift register is also clocked into the selected logic cell to cause a respective debiasing operation to be performed, wherein a sequence of addresses determined by the selector selects a respective sequence of destination logic cells in an order that corresponds with transitions of a predetermined 2^k-state Markov Chain. The noise conditioner may comprise two processing blocks as described previously, a second processing block having the same form as and operating in series with the first-mentioned processing block.
This application is a continuation of International Application No. PCT/GB2021/053088, filed Nov. 26, 2021, which claims priority to GB Application No. GB 2018650.8, filed Nov. 26, 2020 and GB Application No. GB 2020614.0, filed Dec. 24, 2020, under 35 U.S.C. § 119(a). Each of the above-referenced patent applications is incorporated by reference in its entirety.