# Wireless network information flow

We have developed linear deterministic models that capture these interactions and show that, for such models, one can obtain an information-theoretic max-flow min-cut result, in analogy to the classical wireline Ford-Fulkerson result. We have extended these ideas to general deterministic functions and shown achievable rates for such networks with broadcasting at the transmitters and interference at the receivers. In particular, we show that if the optimizing distribution for the information-theoretic cut-set bound is a product distribution, then we have a complete characterization of the achievable rates for such networks. For linear deterministic finite-field models this is indeed the case, and for such models we obtain a generalization of the max-flow min-cut theorem. This result applies to arbitrary networks, including networks with cycles.
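As a toy illustration of the finite-field max-flow min-cut idea, the sketch below computes the min-cut of a small linear deterministic diamond network (source S, relays A and B, destination D) as the minimum, over all source-side cuts, of the GF(2) rank of the cut's transfer matrix. The network, the number of levels `q`, and the per-link level counts are invented for the example; the shift-matrix link model follows the usual deterministic-model convention.

```python
import itertools
import numpy as np

def gf2_rank(M):
    """Rank of a 0/1 matrix over GF(2) via Gaussian elimination."""
    M = np.array(M, dtype=np.uint8) % 2
    rank = 0
    nrows, ncols = M.shape
    for col in range(ncols):
        pivot = next((r for r in range(rank, nrows) if M[r, col]), None)
        if pivot is None:
            continue
        M[[rank, pivot]] = M[[pivot, rank]]
        for r in range(nrows):
            if r != rank and M[r, col]:
                M[r] ^= M[rank]
        rank += 1
    return rank

q = 3  # number of signal levels per node (a toy choice)

def H(n):
    """Shift matrix passing the top n of q bit levels (one link)."""
    return np.eye(q, k=-(q - n), dtype=np.uint8)

# Toy diamond network: per-link level counts chosen for illustration.
edges = {('S', 'A'): H(3), ('S', 'B'): H(2),
         ('A', 'D'): H(1), ('B', 'D'): H(3)}
relays = ['A', 'B']

def cut_value(omega):
    """GF(2) rank of the transfer matrix from the cut side to the rest."""
    far = [v for v in ['A', 'B', 'D'] if v not in omega]
    zero = np.zeros((q, q), dtype=np.uint8)
    blocks = [np.hstack([edges.get((i, j), zero) for i in omega])
              for j in far]
    return gf2_rank(np.vstack(blocks))

# Capacity of the linear deterministic network = min cut (bits per use).
min_cut = min(cut_value(['S'] + list(sub))
              for r in range(len(relays) + 1)
              for sub in itertools.combinations(relays, r))
print(min_cut)  # 3
```

Here the bottleneck is shared between the broadcast cut {S} and the cut {S, A}, both of rank 3, so the network carries 3 bits per channel use.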

Using the insights from the deterministic analysis, we develop approximate characterizations for noisy Gaussian networks. We show that the achievable rate, for single unicast or multicast (single source, multiple destinations interested in the complete source message), is within a constant number of bits of the information-theoretic cut-set upper bound on the capacity of these networks. This constant depends on the topology of the network, but not on the values of the channel gains. Therefore, we uniformly characterize the capacity of Gaussian relay networks within a constant number of bits, for all channel parameters. This result applies to arbitrary Gaussian relay networks.
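To make the cut-set bound concrete, the following sketch evaluates it for the simplest case, a single-relay Gaussian channel, using the two standard cuts: the broadcast cut around the source and the multiple-access cut around the destination (with full transmit cooperation). The SNR values are arbitrary test inputs; the point is that the bound grows with SNR while the achievability gap claimed above stays a fixed number of bits.

```python
import math

def cutset_bound(snr_sd, snr_sr, snr_rd):
    """Cut-set upper bound (bits/use) for a single-relay Gaussian channel.

    snr_sd: source->destination, snr_sr: source->relay,
    snr_rd: relay->destination (all in linear scale).
    """
    # Broadcast cut: source versus {relay, destination}.
    broadcast_cut = math.log2(1 + snr_sd + snr_sr)
    # MAC cut: {source, relay} versus destination, fully cooperating.
    mac_cut = math.log2(1 + (math.sqrt(snr_sd) + math.sqrt(snr_rd)) ** 2)
    return min(broadcast_cut, mac_cut)

# The bound scales like log(SNR), so a fixed additive gap becomes
# negligible at high SNR -- the sense in which the characterization
# is uniform over channel parameters.
for snr in [1.0, 100.0, 1e6]:
    print(round(cutset_bound(snr, snr, snr), 2))
```

With all three gains equal, the broadcast cut is the bottleneck, and the printed values grow without bound while the approximation gap does not.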

In order to show the approximate max-flow min-cut characterization for Gaussian networks, we developed a new relaying strategy called quantize-map-forward (QMF). None of the previously known strategies achieves approximate optimality for multicasting over general networks. QMF is a simple and robust scheme in which each relay quantizes its received signal and maps it directly to the transmit signal, thereby succinctly summarizing what it received. The relaying strategy is therefore (almost) oblivious to the network topology and channel conditions. The QMF strategy uses the insights from the deterministic framework and is, philosophically, the network coding technique generalized to noisy wireless networks.
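A minimal sketch of a QMF relay's per-block operation might look as follows. The quantizer resolution, block length, and the way the random codebook mapping is emulated (seeding a generator with a stable hash of the quantized sequence) are illustrative choices, not the paper's construction; the sketch only shows the two steps the name describes and the fact that neither step consults the topology or the channel gains.

```python
import hashlib
import numpy as np

def qmf_relay(y, power=1.0):
    """Toy QMF relay: quantize the received block, then map the quantized
    sequence *directly* to a transmit codeword (no binning step)."""
    # 1. Quantize: integer levels, i.e. roughly at the noise resolution.
    q = np.round(y).astype(np.int64)
    # 2. Map-and-forward: the quantized sequence indexes a random Gaussian
    #    codebook. We emulate the one-time-random, then fixed, mapping by
    #    seeding a generator with a stable hash of the quantized bits.
    seed = int.from_bytes(hashlib.sha256(q.tobytes()).digest()[:4], "big")
    codebook = np.random.default_rng(seed)
    return codebook.normal(0.0, np.sqrt(power), size=y.shape)

y = np.array([0.3, -1.2, 2.7, 0.9])   # a received block (toy values)
x = qmf_relay(y)  # the relay needed no knowledge of topology or gains
```

Because the mapping is a fixed deterministic function of the quantized sequence, every destination can treat each relay as a known black box, which is the obliviousness claimed above.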

There are subtle but critical differences between the QMF strategy and the natural extension of compress-forward to networks. The compress-forward scheme quantizes the received signal and then maps a digital bin index onto the transmit sequence, which means choices must be made for the binning rate at each relay node. The QMF scheme, in contrast, maps the quantized sequence directly to the transmit sequence and therefore avoids such choices. In fact this gives the scheme a 'universality' property: the same relay operation works for multiple destinations (multicast) and network situations (compound networks), a property that could fail to hold if specific binning rates were chosen. Moreover, unlike the classical compress-forward scheme, our scheme does not require the destination to reconstruct the quantized values at the relays while it is decoding the transmitted message. These are the essential differences between our scheme and traditional compress-forward, or its natural network generalization.
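The binning-rate issue can be made concrete with a toy sketch. Below, a compress-forward-style relay hashes each quantized block down to a bin index whose size is a rate choice the relay must commit to; by the pigeonhole principle, a coarse choice forces distinct blocks into the same bin, which can be wrong for some destination. A QMF-style relay keeps the entire quantized sequence as the codeword index, so no such choice arises. The hash-based indexing is purely illustrative, not either scheme's actual codebook construction.

```python
import hashlib

def bin_index(q_bytes, bin_rate_bits):
    """Compress-forward style: hash the quantized block down to a bin
    index whose size (the binning rate) must be chosen at the relay."""
    digest = hashlib.sha256(q_bytes).digest()
    return int.from_bytes(digest, "big") % (1 << bin_rate_bits)

# 40 distinct quantized blocks, but only 2^4 = 16 bins: collisions are
# unavoidable, and a destination needing to separate the colliding
# blocks is stuck with the relay's rate choice.
blocks = [bytes([i]) for i in range(40)]
coarse_bins = [bin_index(b, 4) for b in blocks]
print(len(set(coarse_bins)), "distinct bins for", len(blocks), "blocks")

# QMF style: the full quantized sequence itself selects the codeword,
# so distinct blocks stay distinct for every destination.
qmf_indices = [int.from_bytes(hashlib.sha256(b).digest(), "big")
               for b in blocks]
print(len(set(qmf_indices)), "distinct codewords")
```

This is the sense in which fixing a binning rate ties the relay to a particular destination, while the direct map does not.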

We have also extended this approach to multi-source multicast, where multiple receivers each require the messages of all the sources.