Multiple access channel Flashcards
Gaussian multiple access channel
We consider m transmitters, each with power P. Let
Y = \sum_{i=1}^m X_i + Z,
where Z is zero-mean Gaussian noise with variance N.
Let
C(P/N) = 1/2 log(1 + P/N)
denote the capacity of a single-user Gaussian channel with signal-to-noise ratio P/N. The achievable rate region for the Gaussian multiple access channel takes the simple form given in the following equations:
R_i < C(P/N)
R_i + R_j < C(2P/N)
R_i + R_j + R_k < C(3P/N)
…
\sum_{i=1}^m R_i < C(mP/N).
Note that when all the rates are the same, the last inequality dominates the others. Here we need m codebooks, the ith codebook having 2^{nR_i} codewords of power P.
Transmission is simple. Each of the independent transmitters chooses an arbitrary codeword from its own codebook. The users send these vectors simultaneously. The receiver sees these codewords added together with the Gaussian noise Z.
Optimal decoding consists of looking for the m codewords, one from each codebook, such that the vector sum is closest to Y in Euclidean distance. If (R_1, R_2, … , R_m) is in the capacity region given above, the probability of error goes to 0 as n tends to infinity.
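As a quick numerical sketch of the region above (function names here are my own, not from the source), each constraint bounds the sum of any k of the rates by C(kP/N):

```python
import math

def C(snr):
    # Single-user Gaussian capacity C(x) = 1/2 log2(1 + x), in bits per transmission
    return 0.5 * math.log2(1 + snr)

def gaussian_mac_sum_constraints(m, P, N):
    # Bound on the sum of any k of the m rates: sum of k rates < C(k*P/N)
    return {k: C(k * P / N) for k in range(1, m + 1)}

# Example: m = 3 users, power P = 10, noise variance N = 1
bounds = gaussian_mac_sum_constraints(3, 10.0, 1.0)
for k, b in bounds.items():
    print(f"any {k} rates must sum to less than {b:.3f} bits")

# When all rates are equal, the last (sum-rate) bound dominates:
print(f"symmetric rate per user < {bounds[3] / 3:.3f} bits")
```

Note that C(kP/N) grows only logarithmically in k, which is why the sum-rate bound is the binding one at equal rates.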
General multiple-access channel
A discrete memoryless multiple-access channel consists of three alphabets, X_1, X_2, and Y, and a probability transition matrix p(y|x_1, x_2).
A ((2^{nR_1}, 2^{nR_2}), n) code for the multiple-access channel consists of two sets of integers W_1 = {1, 2, …, 2^{nR_1}} and W_2 = {1, 2, …, 2^{nR_2}}, called the message sets, two encoding functions,
X_1 : W_1 → X_1^n
and
X_2 : W_2 → X_2^n
and a decoding function,
g : Y^n → W_1 × W_2.
There are two senders and one receiver for this channel. Sender 1 chooses an index W_1 uniformly from the set {1, 2, …, 2^{nR_1}} and sends the corresponding codeword over the channel. Sender 2 does likewise.
General multiple access channel capacity
The capacity of a multiple-access channel (X_1 × X_2, p(y|x_1, x_2), Y) is the closure of the convex hull of all (R_1, R_2) satisfying
R_1 < I (X_1; Y |X_2),
R_2 < I (X_2; Y |X_1),
R_1 + R_2 < I (X_1, X_2; Y )
for some product distribution p_1(x_1)p_2(x_2) on X_1 × X_2.
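The three mutual-information bounds can be computed by brute force for any finite MAC and product input distribution. A sketch (the function name and data layout are my own); for concreteness it is applied to the noiseless adder MAC Y = X_1 + X_2 (ordinary integer addition) with uniform inputs:

```python
import math
from collections import defaultdict

def mutual_info_bounds(p1, p2, channel):
    """Rate-region bounds for a discrete MAC.

    p1, p2: dicts mapping input symbols to probabilities (a product distribution).
    channel: dict mapping (x1, x2) to a dict {y: p(y | x1, x2)}.
    Returns (I(X1;Y|X2), I(X2;Y|X1), I(X1,X2;Y)) in bits.
    """
    # Joint distribution p(x1, x2, y) under the product input distribution
    joint = defaultdict(float)
    for x1, q1 in p1.items():
        for x2, q2 in p2.items():
            for y, py in channel[(x1, x2)].items():
                joint[(x1, x2, y)] += q1 * q2 * py

    def H(key_fn):
        # Entropy of the distribution obtained by grouping joint mass by key_fn
        dist = defaultdict(float)
        for (x1, x2, y), p in joint.items():
            dist[key_fn(x1, x2, y)] += p
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    Hy = H(lambda x1, x2, y: y)
    Hy_x1x2 = H(lambda x1, x2, y: (x1, x2, y)) - H(lambda x1, x2, y: (x1, x2))
    Hy_x1 = H(lambda x1, x2, y: (x1, y)) - H(lambda x1, x2, y: x1)
    Hy_x2 = H(lambda x1, x2, y: (x2, y)) - H(lambda x1, x2, y: x2)

    I1 = Hy_x2 - Hy_x1x2   # I(X1;Y|X2) = H(Y|X2) - H(Y|X1,X2)
    I2 = Hy_x1 - Hy_x1x2   # I(X2;Y|X1) = H(Y|X1) - H(Y|X1,X2)
    I12 = Hy - Hy_x1x2     # I(X1,X2;Y) = H(Y)    - H(Y|X1,X2)
    return I1, I2, I12

# Noiseless adder MAC: Y = X1 + X2, uniform inputs
uniform = {0: 0.5, 1: 0.5}
ch = {(a, b): {a + b: 1.0} for a in (0, 1) for b in (0, 1)}
I1, I2, I12 = mutual_info_bounds(uniform, uniform, ch)
print(I1, I2, I12)  # → 1.0 1.0 1.5
```

For this channel the individual bounds are 1 bit each, but the sum rate is capped at H(Y) = 1.5 bits, so both users cannot simultaneously transmit at full rate.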
Multiple access channel examples
Independent BSCs
Multiple users transmit bits X_1, X_2, … , X_k
Each user’s transmission passes through an independent BSC with crossover probability p.
The receiver observes Y_1, Y_2, … , Y_k, where Y_i is the noisy output of the i-th user’s BSC.
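Because the BSCs are independent and each user has a dedicated output, the channel decouples into k single-user channels: each user can achieve any rate up to the BSC capacity 1 − H(p). A small sketch of that per-user bound (function name is my own):

```python
import math

def bsc_capacity(p):
    # Capacity of one binary symmetric channel: 1 - H(p) bits per use,
    # where H(p) is the binary entropy of the crossover probability
    if p in (0.0, 1.0):
        return 1.0  # deterministic channel (p = 1 just flips every bit)
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - h

# With k users there is no interference, so each rate R_i < 1 - H(p)
k, p = 3, 0.1
print(f"each of the {k} users: R_i < {bsc_capacity(p):.3f} bits/use")
```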
Binary multiplier channel:
Transmitters send X_1 and X_2.
Receiver observes Y=X_1⋅X_2, which provides limited information about the individual inputs.
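To see how limited that information is, one can evaluate the sum-rate bound I(X_1, X_2; Y) for one particular product distribution, uniform inputs (this is an illustration, not a claim about the optimizing distribution):

```python
import math

# Binary multiplier channel Y = X1 * X2 with independent uniform inputs:
# P(Y = 1) = P(X1 = 1) * P(X2 = 1) = 1/4, and H(Y | X1, X2) = 0, so
# I(X1, X2; Y) = H(Y) = H_b(1/4)
q = 0.25
h_y = -q * math.log2(q) - (1 - q) * math.log2(1 - q)
print(f"sum-rate bound with uniform inputs: {h_y:.3f} bits")  # ≈ 0.811
```

Both users together convey less than one bit per use here, since Y is a single, heavily biased bit.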
Binary erasure MAC
Each transmitter sends a bit X_i∈{0,1}.
The channel delivers the bits to the receiver, but with probability p a bit is erased and replaced by an erasure symbol.
Gaussian broadcast channel
Two distant receivers, with noise powers N_1 and N_2. Without loss of generality, assume N_1 < N_2, so receiver 1 is the better receiver. The capacity region of the Gaussian broadcast channel is
R_1 < C(αP / N_1)
R_2 < C((1 − α)P / (αP + N_2))
where α may be chosen arbitrarily (0 ≤ α ≤ 1) to trade off rate R_1 against rate R_2 as the transmitter wishes.
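The trade-off can be traced by sweeping α. A sketch under the power-splitting interpretation (superposition coding: a fraction α of the power carries receiver 1's message, and the worse receiver treats that signal as extra noise; function names are my own):

```python
import math

def C(snr):
    # Single-user Gaussian capacity C(x) = 1/2 log2(1 + x), in bits per transmission
    return 0.5 * math.log2(1 + snr)

def gaussian_bc_rates(P, N1, N2, alpha):
    # Power alpha*P serves the better receiver (noise N1); the remaining
    # (1-alpha)*P serves the worse receiver, which sees alpha*P as added noise.
    assert N1 < N2 and 0 <= alpha <= 1
    R1 = C(alpha * P / N1)
    R2 = C((1 - alpha) * P / (alpha * P + N2))
    return R1, R2

# Sweep alpha to trace the boundary of the region (P = 10, N1 = 1, N2 = 4)
for alpha in (0.0, 0.25, 0.5, 0.75, 1.0):
    R1, R2 = gaussian_bc_rates(10.0, 1.0, 4.0, alpha)
    print(f"alpha={alpha:.2f}: R1={R1:.3f}, R2={R2:.3f}")
```

At α = 1 all power goes to the better receiver (R_2 = 0); at α = 0 all of it goes to the worse one.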
General broadcast channel
The capacity region of the broadcast channel is the closure of the set of achievable rates.
A rate triple (R_0, R_1, R_2) is said to be achievable for the broadcast channel if there exists a sequence of ((2^{nR_0}, 2^{nR_1}, 2^{nR_2}), n) codes with P_e^{(n)} → 0. The capacity region of a broadcast channel depends only on the conditional marginal distributions p(y_1|x) and p(y_2|x).