This invention relates generally to wireless communications, and more particularly to encrypting and decrypting data using private keys.
Security in Wired and Wireless Networks
In wired communication networks, signals are mostly confined to physical transmission media, such as conductive wires and optical fibers. Hence, an eavesdropper can only access the signals by physically attaching to the media.
In contrast, wireless signals propagate in open space and can therefore be intercepted by any eavesdropper within radio range, without any physical connection to a transmission medium.
Public-Key Cryptography
To achieve secure wireless communication, public key cryptography is widely used. Because the key pair is asymmetric, a perfectly secure wireless channel is not required to exchange the keys between the transmitter and the receiver. The pair of keys includes a private key and a public key. To operate the network, a public key infrastructure (PKI) generates, distributes, and maintains the public keys, in which a trusted party, i.e., a certificate authority (CA), binds each public key to a receiver identity and issues a public key certificate to the receiver.
To establish secure wireless communication, the transmitter first verifies the public key certificate of the receiver. After the public key is verified, data are then encrypted using the receiver's public key. The data can only be decrypted using the corresponding private key.
However, for many wireless networks, access to the PKI is difficult or completely unavailable. In such cases, secure communication becomes challenging, and guaranteeing security in such wireless networks is therefore of great interest.
Channel Reciprocity
The reciprocity of wireless channels enables two nodes to generate private keys based on the channel responses of the reciprocal channels. However, due to noise, interference, and hardware impairments, the channel estimates at the two nodes are not perfectly correlated. Therefore, private keys generated independently by the nodes do not always match. If a conventional cryptographic technique is used for the data to be transmitted, the two keys must be identical; otherwise, the receiver cannot decrypt the data correctly.
Therefore, it is desired that each key has a low bit mismatch rate (BMR). The BMR is defined as the ratio of the number of mismatched bits to the total number of bits in each key.
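For illustration only, the BMR between two equal-length keys can be computed as in the following minimal Python sketch; the function name and the list-of-bits representation are illustrative choices and not part of the specification.

# Minimal sketch: bit mismatch rate (BMR) between two equal-length keys.
def bit_mismatch_rate(key_a, key_b):
    # Ratio of mismatched bits to the total number of bits in each key.
    assert len(key_a) == len(key_b), "keys must have equal length"
    mismatches = sum(a != b for a, b in zip(key_a, key_b))
    return mismatches / len(key_a)

# Example: 2 mismatched bits out of 8 gives a BMR of 0.25.
print(bit_mismatch_rate([0, 1, 1, 0, 1, 0, 0, 1],
                        [0, 1, 0, 0, 1, 1, 0, 1]))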
Embodiments of the invention provide a method for a first wireless node (user) and a second wireless node (user) to independently generate respective private keys that have a low bit mismatch rate (BMR) by quantizing correlated signal sources and using public message exchanges.
Each wireless node first independently estimates the channel response H to the other node. The first node quantizes the channel response to produce a first bit sequence and a feed-forward message, and transmits the feed-forward message to the second node via a public channel.
The second node quantizes the channel response based on the received feed-forward message. The second node generates a second bit sequence and a corresponding feed-back message, which is transmitted to the first node via the public channel. Based on the feed-forward and feed-back messages, the nodes delete selected bits in the first and second bit sequences to produce respective private keys.
Method Overview
As shown in the figures, the method includes channel sounding and estimating 301, quantization 302 and transmission 303 of a feed-forward message, quantization 305 and transmission 306 of a feed-back message, and generation 307 of the private keys.
During channel sounding and estimating 301, Alice 101 and Bob 102 estimate the response of the channel between the nodes. In one embodiment of the invention, Alice and Bob each transmit a known sounding signal S. When Alice transmits S, Bob receives S and estimates the channel parameters. When Bob transmits S, Alice receives S and estimates the channel parameters.
Due to noise and hardware impairments, the signals received by Alice and Bob can be expressed, respectively, as:
Ra = Hba*S + Za, and Rb = Hab*S + Zb,
where * is a convolution operator, Hab 201 and Hba 202 can be vectors, with elements corresponding to the channel response for a predetermined transmitter, receiver, frequency, and time. The channel reciprocity states that Hab = Hba, and Za and Zb denote noise.
The sounding signals S transmitted by Alice and Bob can also be different as long as the signals are known at the receiver. The sounding signal S can be transmitted multiple times to improve the signal-to-noise ratio (SNR).
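For illustration only, the sounding and estimation step can be simulated as in the following Python sketch, assuming a single-tap (flat-fading) channel so that the convolution reduces to a multiplication; the parameter values and variable names are illustrative assumptions.

# Minimal sketch of the sounding model Ra = Hba*S + Za and Rb = Hab*S + Zb
# with Hab = Hba, for a single-tap channel; repeated soundings are averaged
# to improve the SNR.
import numpy as np

rng = np.random.default_rng(0)
n_soundings = 8                      # number of repeated sounding signals
s = 1.0 + 0.0j                       # known sounding signal S
h = (rng.normal() + 1j * rng.normal()) / np.sqrt(2)   # reciprocal channel Hab = Hba
noise_std = 0.3

za = (rng.normal(size=n_soundings) + 1j * rng.normal(size=n_soundings)) * noise_std / np.sqrt(2)
zb = (rng.normal(size=n_soundings) + 1j * rng.normal(size=n_soundings)) * noise_std / np.sqrt(2)
ra = h * s + za                      # received by Alice when Bob sounds
rb = h * s + zb                      # received by Bob when Alice sounds

# Least-squares channel estimates, averaged over the repeated soundings.
h_hat_a = np.mean(ra / s)
h_hat_b = np.mean(rb / s)
print(h, h_hat_a, h_hat_b)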
Based on the received signals, Alice and Bob estimate the channel independently. The estimated channels are Ĥba and Ĥab, respectively, as
Ĥba = Hba + ηa, and Ĥab = Hab + ηb,
where ηa and ηb represent the channel estimation errors at Alice and Bob, respectively.
Then, Alice and Bob determine channel parameters X 311 and Y 321 as a function of the channel estimates Ĥba and Ĥab. The parameters can simply be Ĥba and Ĥab, or extracted parameters such as the channel gains, phases at predetermined frequencies, statistical parameters associated with the channel gains, or combinations thereof. If the total number of measurements is n, then the channel parameters X and Y are two length-n sequences Xn and Yn, respectively, which can be expressed as:
Xn = f(Ĥba) = G + FA,
Yn = f(Ĥab) = G + FB,
where G ~ N(0, P), FA ~ N(0, NA), and FB ~ N(0, NB) denote the extracted parameter from the noise-free channel responses, the estimation error at Alice, and the estimation error at Bob, respectively. For simplicity, we consider the case NA = NB = N.
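For illustration only, the correlated-source model can be simulated as in the following sketch; the values of n, P, and N are arbitrary assumptions.

# Minimal sketch of the model Xn = G + FA, Yn = G + FB with G ~ N(0, P)
# and FA, FB ~ N(0, N).
import numpy as np

rng = np.random.default_rng(1)
n, P, N = 1000, 1.0, 0.01                      # observations, signal power, error power
G = rng.normal(0.0, np.sqrt(P), size=n)        # noise-free extracted parameter
X = G + rng.normal(0.0, np.sqrt(N), size=n)    # Alice's channel parameters
Y = G + rng.normal(0.0, np.sqrt(N), size=n)    # Bob's channel parameters
print("correlation between X and Y:", np.corrcoef(X, Y)[0, 1])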
Alice quantizes 302 the sequence X using, e.g., a scalar equiprobable quantizer. The level of quantization L is predetermined. The boundaries for an L = 2^l-level quantization are (q0, q1], (q1, q2], . . . , (qL−1, qL], selected such that each interval has equal probability 1/L.
The quantizer further partitions each interval (qi−1, qi] into M = 2^m sub-regions to generate an m-bit feed-forward message based on the sequence X. Therefore, each input channel parameter is quantized into l + m bits, where l bits belong to the original bit sequence and m bits are part of the feed-forward message Va.
Alice transmits 303 the m-bit feed-forward message Va to Bob.
After Bob receives the feed-forward message Va, Bob quantizes 305 the sequence Y using, e.g., a maximum a posteriori probability (MAP) quantizer, which is expressed as
Tb(i) = argmaxj Pr(βXi ∈ (qj, qj+1] | Yi = yi, Va(i) = va(i)),
for i = 1, 2, . . . , n, where β is a power normalization factor. The MAP quantization process also produces a feed-back message Vb, which is transmitted 306 to Alice.
In the final steps 307, both Alice and Bob produce respective private keys 313-314 by deleting bits from the original sequences using the feed-back and feed-forward messages.
Feed-Forward and Feed-Back Message
The generation of feed-forward and feed-back bits is illustrated in detail in the following.
The quantization boundaries are selected such that each interval is equiprobable, i.e., Q(qi) − Q(qi+1) = 1/L for any i ∈ ZL, where Q(•) denotes a Gaussian tail function and ZL = {0, 1, . . . , L−1} denotes an integer set. As an example, we set q0 = −∞ and qL = ∞. Gray coding is used for mapping the quantizer indices to bits, e.g., if L = 4, the four quantization intervals are mapped to 00, 01, 11, and 10, respectively.
For generating the feed-forward message, each quantization interval (qi−1, qi] is further partitioned into m sub-intervals, (ti−1,0, ti−1,1], (ti−1,1, ti−1,2], . . . , (ti−1,m−1, ti−1,m], where ti−1,0 = qi−1 and ti−1,m = qi, such that each sub-interval (ti−1,k, ti−1,k+1] for k ∈ Zm has an identical probability of 1/(Lm).
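For illustration only, the boundaries qi and the sub-interval boundaries ti,k can be computed with the inverse Gaussian CDF, as in the following sketch; the values L = 4 and m = 2 and the use of the standard-library statistics.NormalDist are illustrative choices.

# Minimal sketch of the equiprobable boundaries: each interval (q_i, q_{i+1}]
# has probability 1/L under a unit-variance Gaussian, and each interval is
# split into m sub-intervals of probability 1/(L*m).
from statistics import NormalDist
import math

L, m = 4, 2
std_normal = NormalDist()

# Interval boundaries q_0 .. q_L, with q_0 = -inf and q_L = +inf.
q = [-math.inf] + [std_normal.inv_cdf(i / L) for i in range(1, L)] + [math.inf]

# Sub-interval boundaries t[i][k] of interval (q_i, q_{i+1}], k = 0 .. m.
t = []
for i in range(L):
    row = []
    for k in range(m + 1):
        idx = i * m + k
        if idx == 0:
            row.append(-math.inf)
        elif idx == L * m:
            row.append(math.inf)
        else:
            row.append(std_normal.inv_cdf(idx / (L * m)))
    t.append(row)

print("q =", q)
print("t =", t)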
Given each channel parameter Xi and Yi for any i ∈ {1, . . . , n}, Alice and Bob individually quantize the parameters into log2(L)-bit indices KA(i) and KB(i), using the L-level scalar equiprobable quantizer. Note that the quantization is performed with a power normalization so that the data being quantized have unit variance. The quantized data at each i are then given by
KA(i) = {j : βxi ∈ (qj, qj+1]},
KB(i) = {j : βyi ∈ (qj, qj+1]}.
After n observations, Alice and Bob obtain bit sequences KA = [KA(1), KA(2), . . . , KA(n)] and KB = [KB(1), KB(2), . . . , KB(n)], respectively. Both are n × log2(L) bits long.
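For illustration only, this quantization step can be sketched in Python as follows; the Gaussian boundaries are computed as described above, and the names and values are illustrative.

# Minimal sketch of the equiprobable quantization K(i) = {j : beta*x_i in (q_j, q_{j+1}]}
# with a power normalization of the input to unit variance.
from statistics import NormalDist
import math
import numpy as np

L = 4
std_normal = NormalDist()
q = [-math.inf] + [std_normal.inv_cdf(i / L) for i in range(1, L)] + [math.inf]

def quantize(samples, boundaries):
    beta = 1.0 / np.std(samples)                 # power normalization factor
    scaled = beta * np.asarray(samples)
    # searchsorted(..., side="left") gives the index i with boundaries[i-1] < v <= boundaries[i];
    # subtracting 1 gives the index j such that q_j < v <= q_{j+1}.
    return np.searchsorted(boundaries, scaled, side="left") - 1

rng = np.random.default_rng(2)
x = rng.normal(size=10)
print(quantize(x, q))                            # quantizer indices for the 10 samples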
Alice also generates the log2(m)-bit feed-forward message Va(i) from Xi, where Va(i) is the index of the sub-interval, within the quantization interval indexed by KA(i), that contains βxi. More specifically, Va(i) can be written as
Va(i) = {j : βxi ∈ (tKA(i),j, tKA(i),j+1]}.
Alice transmits the n × log2(m)-bit feed-forward message Va = [Va(1), Va(2), . . . , Va(n)] to Bob, where each sub-interval index Va(i) can be expressed by its binary natural code representation with log2(m) bits.
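For illustration only, the coarse index KA(i) and the sub-interval index Va(i) can be obtained together from a fine equiprobable quantizer with L·m levels, as in the following sketch; the names and parameter values are illustrative assumptions.

# Minimal sketch: the fine quantizer index splits into the coarse index K_A(i)
# (integer division) and the feed-forward sub-interval index V_a(i) (remainder).
from statistics import NormalDist
import math
import numpy as np

L, m = 4, 2
std_normal = NormalDist()
fine = [-math.inf] + [std_normal.inv_cdf(i / (L * m)) for i in range(1, L * m)] + [math.inf]

def quantize_with_feedforward(samples):
    beta = 1.0 / np.std(samples)                 # power normalization factor
    scaled = beta * np.asarray(samples)
    fine_idx = np.searchsorted(fine, scaled, side="left") - 1   # 0 .. L*m - 1
    k_a = fine_idx // m                          # coarse quantizer index K_A(i)
    v_a = fine_idx % m                           # feed-forward message V_a(i)
    return k_a, v_a

rng = np.random.default_rng(3)
x = rng.normal(size=10)
print(quantize_with_feedforward(x))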
After receiving the feed-forward message Va from Alice, and given his own observation Yn, Bob carries out a maximum a posteriori probability (MAP) estimation of KA. In the MAP estimation, for each Yi = yi and Va(i) = va(i) (for any i ∈ {1, . . . , n}), Bob searches for the index ji1 that maximizes the posterior probability, i.e.,
ji1 = argmaxj∈ZL Pr(KA(i) = j | Yi = yi, Va(i) = va(i)).
With the MAP estimate ji1 and the original quantized data KB(i), Bob generates the feed-back message Vb(i), which is set to one if ji1 and KB(i) do not match (i.e., ji1 ≠ KB(i)), and zero otherwise, for each i ∈ {1, . . . , n}. The feed-back message Vb = [Vb(1), Vb(2), . . . , Vb(n)] of n bits is transmitted to Alice.
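For illustration only, Bob's MAP estimation and the feed-back bit can be sketched as follows, assuming the jointly Gaussian model Xn = G + FA and Yn = G + FB, under which the normalized Xi given Yi = yi is Gaussian with mean ρyi and variance 1 − ρ^2, where ρ = P/(P + N); the parameter values and function names are illustrative.

# Minimal sketch of the MAP estimate argmax_j Pr(K_A(i) = j | y_i, v_a(i))
# and of the feed-back bit V_b(i); y is assumed to be power-normalized.
from statistics import NormalDist
import math

L, m = 4, 2
P, N = 1.0, 0.01
rho = P / (P + N)
std_normal = NormalDist()

def boundary(idx):
    # Boundaries of the fine (L*m-level) equiprobable quantizer.
    if idx <= 0:
        return -math.inf
    if idx >= L * m:
        return math.inf
    return std_normal.inv_cdf(idx / (L * m))

def map_estimate(y, v_a):
    # Conditional law of the normalized X given the normalized Y = y.
    cond = NormalDist(mu=rho * y, sigma=math.sqrt(1.0 - rho ** 2))
    probs = [cond.cdf(boundary(j * m + v_a + 1)) - cond.cdf(boundary(j * m + v_a))
             for j in range(L)]
    return max(range(L), key=probs.__getitem__)

def feedback_bit(y, v_a, k_b):
    # V_b(i) = 1 if the MAP estimate disagrees with Bob's own index K_B(i).
    return 1 if map_estimate(y, v_a) != k_b else 0

print(map_estimate(0.9, 1), feedback_bit(0.9, 1, 3))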
Based on the feed-back message Vb, Alice deletes 307 the corresponding KA(i) if Vb(i) = 1, for i ∈ {1, . . . , n}, and sets the remaining bits as her private key WA. Similarly, for each i ∈ {1, . . . , n}, Bob also deletes 307 the corresponding bits KB(i) if Vb(i) = 1, and produces his private key WB using the remaining bits.
The key-generation and deletion of the mismatched bits can be summarized as follows.
Alice and Bob first obtain the correlated observations Xn and Yn from the reciprocal channel, and each quantizes its observation to produce the quantized data KA and KB, respectively. Alice also generates a feed-forward message Va based on her received observation and the generated key KA, and sends the feed-forward message Va to Bob. Bob determines the MAP estimate of KA from his own observation and the feed-forward message Va, and feeds back the message Vb indicating the positions where the estimate and KB disagree. Finally, both Alice and Bob delete the bits at the positions indicated by Vb to produce the private keys WA and WB.
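For illustration only, the summarized procedure can be simulated end to end as in the following self-contained sketch; the parameter values are arbitrary, the Gray coding of indices to bits is omitted, and index mismatch rates are reported instead of the bit-level BMR.

# End-to-end sketch of the key-generation procedure under the Gaussian model
# Xn = G + FA, Yn = G + FB (illustrative parameters, no Gray coding).
from statistics import NormalDist
import math
import numpy as np

rng = np.random.default_rng(5)
n, P, N = 2000, 1.0, 0.05
L, m = 4, 2
rho = P / (P + N)
std_normal = NormalDist()

def boundary(idx):
    if idx <= 0:
        return -math.inf
    if idx >= L * m:
        return math.inf
    return std_normal.inv_cdf(idx / (L * m))

# 1) Correlated observations from the reciprocal channel, power-normalized.
G = rng.normal(0.0, math.sqrt(P), size=n)
x = G + rng.normal(0.0, math.sqrt(N), size=n)
y = G + rng.normal(0.0, math.sqrt(N), size=n)
x, y = x / np.std(x), y / np.std(y)

fine = np.array([boundary(i) for i in range(L * m + 1)])
coarse = fine[::m]

# 2) Alice: quantization indices K_A and feed-forward message V_a.
fine_idx = np.searchsorted(fine, x, side="left") - 1
KA, Va = fine_idx // m, fine_idx % m

# 3) Bob: own indices K_B, MAP estimates of K_A, feed-back message V_b.
KB = np.searchsorted(coarse, y, side="left") - 1
sigma = math.sqrt(1.0 - rho ** 2)
Vb = np.empty(n, dtype=int)
for i in range(n):
    cond = NormalDist(mu=rho * y[i], sigma=sigma)
    probs = [cond.cdf(boundary(j * m + Va[i] + 1)) - cond.cdf(boundary(j * m + Va[i]))
             for j in range(L)]
    est = max(range(L), key=probs.__getitem__)
    Vb[i] = 1 if est != KB[i] else 0

# 4) Both nodes delete the flagged positions to obtain the private keys.
WA, WB = KA[Vb == 0], KB[Vb == 0]
print("index mismatch rate before deletion:", np.mean(KA != KB))
print("index mismatch rate after deletion: ", np.mean(WA != WB))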
Universal Quantization
In another embodiment of the invention, a universal quantizer is used, so that the quantization boundaries are derived from the observed channel parameters themselves rather than from a predetermined distribution.
Alice receives the input channel parameters X 311, sorts 501 the channel parameters Xn = [X1, X2, . . . , Xn] in ascending order, partitions the sorted parameters into L intervals, each containing an equal number of parameters X(i), and then selects the quantization boundaries in ascending order to generate 502 a first quantization codebook QA = {q0, q1, . . . , qL}, i.e., q0 = X(1), q1 = X(└n/L┘), . . . , qL = X(n).
Bob similarly sorts 501 his input 321 and generates 502 a second quantization codebook QB. QA and QB are not necessarily identical.
Alice and Bob obtain KA and KB respectively by universal quantization 503 of the inputs X and Y using the codebooks QA and QB, respectively. Alice generates the feed-forward message VA using the same universal equiprobable quantizer principle, i.e., sorting the parameters within each quantization interval in ascending order and then partitioning them into M sub-intervals to produce the m-bit feed-forward message VA(i) for X(i), based on the index of the sub-interval that contains X(i).
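For illustration only, the universal codebook construction and quantization can be sketched as follows; the function names, the clipping of out-of-range samples, and the example values are illustrative assumptions.

# Minimal sketch of the universal equiprobable quantizer: the codebook is
# taken from the order statistics of the observed parameters themselves.
import numpy as np

def universal_codebook(samples, L):
    # Boundaries q_0 .. q_L taken from the sorted samples (empirical quantiles).
    s = np.sort(np.asarray(samples))
    n = len(s)
    idx = [0] + [(i * n) // L - 1 for i in range(1, L)] + [n - 1]
    return s[idx]

def universal_quantize(samples, codebook):
    # Index j such that the sample falls in (q_j, q_{j+1}]; out-of-range
    # samples (possible because Alice's and Bob's codebooks differ) are clipped.
    j = np.searchsorted(codebook, samples, side="left") - 1
    return np.clip(j, 0, len(codebook) - 2)

rng = np.random.default_rng(4)
x = rng.uniform(-1.0, 1.0, size=12)
qa = universal_codebook(x, L=4)
print(qa)
print(universal_quantize(x, qa))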
Bob generates the feed-back message VB as follows. For each i ∈ {1, . . . , n}, using the feed-forward message VA(i) = va(i), Bob determines his estimate ji1 = j ∈ ZL of Alice's index if yi falls in the corresponding decision interval, whose boundaries are determined from the median of the parameters in the sub-intervals indicated by va(i).
Then, Bob sets VB(i) = 0 if ji1 = KB(i) and VB(i) = 1 otherwise, for each i ∈ {1, . . . , n}.
The median used in this decision rule can be replaced by the mean.
The invention provides a method for generating private keys to encrypt and decrypt data to be transmitted in a wireless network. The keys are generated independently based on sounding signals exchanged between two nodes. The private keys have a low bit mismatch rate (BMR) by quantizing correlated signal sources and using public message exchanges.
By transmitting the feed-back and feed-forward messages between the two nodes, the BMR can be reduced, e.g., at 20 dB SNR, the BMR is reduced by 40%. The universal quantizer achieves performance similar to that of the MAP quantizer when the sources have a Gaussian distribution, and better BMR performance when the source distribution is uniform.
It is to be understood that various other adaptations and modifications may be made within the spirit and scope of the invention. Therefore, it is the object of the appended claims to cover all such variations and modifications as come within the true spirit and scope of the invention.