Wikipedia:Finite blocklength information theory

Finite blocklength information theory is a branch of information theory that analyzes the maximal channel coding rate achievable at a given finite block length. The Shannon–Hartley theorem was derived under the assumption of infinitely long frames, and approaching the Shannon capacity requires codes with large block lengths. However, in URLLC-enabled wireless communication networks, transmitting information in the infinite-blocklength regime is impracticable. Short-packet data transmission is therefore used to meet both the reliability and latency requirements of wireless communication networks, and it has been studied theoretically using finite blocklength information theory. Furthermore, finite blocklength information theory provides a precise framework for determining the relationship between wireless communication latency and reliability. The maximal achievable channel coding rate $$ \bar{R} $$ at a given block error probability $$ \epsilon $$ and block length $$ n $$ (for binary-input additive white Gaussian noise (AWGN) channels with short block lengths), closely approximated by Polyanskiy, Poor and Verdú (PPV) in 2010, is given by


 * $$ \bar{R} \approx C-\sqrt{\frac{V}{n}}Q^{-1}\left ( \epsilon \right )$$

where $$ Q^{-1}$$ is the inverse of the complementary Gaussian cumulative distribution function, $$ C$$ is the channel capacity and $$ V $$ is a characteristic of the channel referred to as channel dispersion.
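As an illustration, the normal approximation above can be evaluated numerically. The sketch below assumes the real-valued AWGN channel, for which the capacity is $$ C = \tfrac{1}{2}\log_2(1+\mathrm{SNR}) $$ and the dispersion is $$ V = \tfrac{\mathrm{SNR}(\mathrm{SNR}+2)}{2(\mathrm{SNR}+1)^2}\log_2^2 e $$; the binary-input AWGN channel treated by PPV has no such simple closed forms, so these expressions stand in for illustration only. The function name and parameter choices are hypothetical.

```python
from math import e, log2, sqrt
from statistics import NormalDist

def normal_approx_rate(snr: float, n: int, eps: float) -> float:
    """PPV normal approximation R ≈ C - sqrt(V/n) * Q^{-1}(eps).

    Uses the real-valued AWGN capacity and dispersion as an
    illustrative stand-in (assumption; the binary-input AWGN
    quantities lack a simple closed form).
    """
    C = 0.5 * log2(1 + snr)                                  # capacity, bits/channel use
    V = snr * (snr + 2) / (2 * (snr + 1) ** 2) * log2(e) ** 2  # channel dispersion
    q_inv = NormalDist().inv_cdf(1 - eps)                    # Q^{-1}(eps) = Phi^{-1}(1 - eps)
    return C - sqrt(V / n) * q_inv

# Example: SNR = 1 (0 dB), block length n = 500, error probability 1e-3.
# The finite-blocklength rate falls noticeably below the capacity C = 0.5.
rate = normal_approx_rate(snr=1.0, n=500, eps=1e-3)
```

As the block length $$ n $$ grows, the penalty term $$ \sqrt{V/n}\,Q^{-1}(\epsilon) $$ vanishes and the achievable rate approaches the capacity, recovering the infinite-blocklength regime.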