# McMillan’s inequality

Kraft’s inequality states that there is an instantaneous $r$-ary code $C$ with word lengths $l_1...l_q$ if and only if:

$\sum_{i=1}^q \frac{1}{r^{l_i}} \leq 1$
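As a quick sanity check, the sum on the left is easy to compute – a minimal sketch in Python (the function name `kraft_sum` and the example lengths are my own choices, not from the book):

```python
from fractions import Fraction

def kraft_sum(r, lengths):
    """Compute sum of 1/r^l_i over the given code word lengths, exactly."""
    return sum(Fraction(1, r ** l) for l in lengths)

# A binary (r = 2) code with word lengths 1, 2, 3 -- e.g. 0, 10, 110:
print(kraft_sum(2, [1, 2, 3]))  # 7/8, so an instantaneous code exists
```

Using `Fraction` keeps the arithmetic exact, which matters when the sum lands exactly on 1.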

But what of codes that are merely uniquely decodable, and not necessarily instantaneous?

The answer is given to us by McMillan’s inequality, and the surprising thing is that, despite the looser requirement of unique decodability, the bound is exactly the same:

There is a uniquely decodable $r$-ary code $C$ with word lengths $l_1...l_q$ if and only if:

$\sum_{i=1}^q \frac{1}{r^{l_i}} \leq 1$

And here’s a proof (again drawn from *Information and Coding Theory*):

We have a code $C$ with word lengths $l_1 ... l_q$ and we define:

$K = \sum_{i=1}^q \frac{1}{r^{l_i}}$.

Additionally we say that $x = \max(l_1 ... l_q)$ and $y = \min(l_1 ... l_q)$.

Let us now consider $K^n$ or $(\sum_{i=1}^q \frac{1}{r^{l_i}})^n$ where $n \geq 1$.

This is $\sum_{i_1=1}^q\sum_{i_2=1}^q...\sum_{i_n=1}^q \frac{1}{r^{l_{i_1}}r^{l_{i_2}}...r^{l_{i_n}}}$ – the sum, over every choice of indices $i_1 ... i_n$, of products of the form:

$\frac{1}{r^{l_{i_1}}}\times\frac{1}{r^{l_{i_2}}}...\times\frac{1}{r^{l_{i_n}}}$

(I found this a bit difficult to follow at first, but imagine we had $(\frac{1}{A} + \frac{1}{B})(\frac{1}{C} + \frac{1}{D})(\frac{1}{E} + \frac{1}{F})$, then we could expand this as:

$(\frac{1}{A}\times\frac{1}{C} + \frac{1}{A}\times\frac{1}{D} + \frac{1}{B}\times\frac{1}{C} + \frac{1}{B}\times\frac{1}{D})(\frac{1}{E} + \frac{1}{F})$

$\frac{1}{A}\times\frac{1}{C}\times\frac{1}{E} +\frac{1}{A}\times\frac{1}{C}\times\frac{1}{F} + \frac{1}{A}\times\frac{1}{D}\times\frac{1}{E} +\frac{1}{A}\times\frac{1}{D}\times\frac{1}{F} + \frac{1}{B}\times\frac{1}{C} \times\frac{1}{E}+ \frac{1}{B}\times\frac{1}{C}\times\frac{1}{F}+\frac{1}{B}\times\frac{1}{D}\times\frac{1}{E}+\frac{1}{B}\times\frac{1}{D}\times\frac{1}{F}$)
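This expansion can also be checked numerically – a brute-force sketch using `itertools.product` to enumerate every choice of one term per factor (the example code lengths are my own):

```python
from fractions import Fraction
from itertools import product

r, lengths = 2, [1, 2, 3]
K = sum(Fraction(1, r ** l) for l in lengths)

n = 3
# Expand K^n as a sum over all n-tuples of word lengths, one per factor:
expanded = sum(Fraction(1, r ** sum(tup)) for tup in product(lengths, repeat=n))
print(expanded == K ** n)  # True
```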

And we can also write:

$\frac{1}{r^{l_{i_1}}}\times\frac{1}{r^{l_{i_2}}}...\times\frac{1}{r^{l_{i_n}}} = \frac{1}{r^j}$ where $j = l_{i_1} + l_{i_2} +...+ l_{i_n}$.

Now $yn \leq j \leq xn$ since $y \leq l_i \leq x$ for all $i$.

Now, here’s the fancy bit (at least for me!)…

This means we can rewrite $K^n$ as $\sum^{xn}_{j=yn}\frac{N_{j,n}}{r^j}$

The upper and lower bounds for $j$ follow from the inequalities above (we could sum over a wider range, but then the coefficient $N_{j,n}$ would be zero beyond these bounds). And what is this coefficient?

$N_{j,n}$ is the number of products in the expansion that equal $\frac{1}{r^j}$ – or, alternatively, the number of different sequences of $n$ code words whose total lengths sum to $j$.

Let’s take a simple example again – we have a binary code $C$ with code words $0, 01, 011$ (plainly not instantaneous, as $0$ is a prefix of $01$ and $01$ is a prefix of $011$).

Here $K = \sum_{i=1}^q \frac{1}{r^{l_i}} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} = \frac{7}{8}$

But what if we look at $K^2$?

Then we have:

$\frac{1}{2}\times\frac{1}{2} + \frac{1}{2}\times\frac{1}{4} + \frac{1}{2}\times\frac{1}{8} +\frac{1}{4}\times\frac{1}{2} + \frac{1}{4}\times\frac{1}{4} + \frac{1}{4}\times\frac{1}{8} + \frac{1}{8}\times\frac{1}{2} + \frac{1}{8}\times\frac{1}{4} + \frac{1}{8}\times\frac{1}{8}$

Writing this out in the form above we then have – and you can see the coefficients:

$\frac{1}{2^2} + \frac{2}{2^3} + \frac{3}{2^4} + \frac{2}{2^5} + \frac{1}{2^6}$
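Those coefficients can be recovered by enumeration – a short sketch counting, for each total length $j$, the sequences of $n = 2$ code words whose lengths sum to $j$:

```python
from collections import Counter
from itertools import product

lengths = [1, 2, 3]   # lengths of the code words 0, 01, 011
n = 2

# N_{j,n}: number of n-long sequences of code words with total length j
N = Counter(sum(tup) for tup in product(lengths, repeat=n))
print(sorted(N.items()))  # [(2, 1), (3, 2), (4, 3), (5, 2), (6, 1)]
```

The counts 1, 2, 3, 2, 1 match the numerators above.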

The coefficient $N_{j,n}$ can never be greater than $r^j$: because the code is uniquely decodable, each sequence of code words with total length $j$ gives a distinct string of $j$ symbols, and there are only $r^j$ such strings. So $\frac{N_{j,n}}{r^j} \leq 1$.

We also know that $K^n$ is a sum of $xn + 1 - yn$ terms, so:

$K^n \leq(x - y)n + 1$.

As $n$ increases, if $K > 1$ then the left hand side grows exponentially while the right hand side (where $x$ and $y$ are independent of $n$) grows only linearly, so the inequality must eventually fail – hence $K \leq 1$. (The converse direction is immediate: if $K \leq 1$, Kraft’s inequality gives an instantaneous code with these lengths, and every instantaneous code is uniquely decodable.)
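To see the contradiction concretely, here is a sketch with a hypothetical set of binary word lengths for which $K > 1$ (so no uniquely decodable code can have them): the exponential $K^n$ soon overtakes the linear bound $(x - y)n + 1$.

```python
from fractions import Fraction

r = 2
lengths = [1, 1, 2]            # hypothetical lengths with K = 5/4 > 1
K = sum(Fraction(1, r ** l) for l in lengths)
x, y = max(lengths), min(lengths)

for n in (1, 5, 10, 20):
    bound = (x - y) * n + 1
    # K^n <= bound holds for small n, but fails as n grows
    print(n, float(K ** n), bound, K ** n <= bound)
```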

I am sure this will be one of the less read postings on the blog (congratulations if you got this far), but if it helps just one person get a better understanding of McMillan’s inequality then it has been worth it!

# CRCs as matrix maths

I wrote about CRCs – cyclic redundancy checks – about a year ago, following on from what I had read in Andy Tanenbaum’s *Computer Networks* – and because you see CRCs used everywhere.

But now, thanks to another book – Networks on Chips – here’s a slightly more fundamental examination.

CRCs are a form of cyclic code – codes where a rotation (e.g., making bit 0 the highest order bit, the previously highest order bit the second highest order bit, and so on) produces a new codeword.

As before we can think of code words as polynomials in $x$ with coefficients of 0 or 1 – evaluating at $x = 2$ gives the numeric value – e.g.,

$p(x) = c_0x^0 + c_1x^1 + c_2x^2 + c_3x^3 + ... + c_{n - 1}x^{n - 1}$

(And of course any natural number can be represented in this fashion.)

Then a circular shift is the same as multiplying by $x$ and reducing modulo $x^n - 1$ – that is, multiplying by $x$ with the proviso that $c_{n -1} \rightarrow c_0$.
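A sketch of this rotation on the coefficient vector (the low-order-first layout `[c_0, ..., c_{n-1}]` is my own convention):

```python
def cyclic_shift(bits):
    """Rotate one place: each c_i moves to c_{i+1}, and c_{n-1} wraps to c_0."""
    return [bits[-1]] + bits[:-1]

word = [1, 1, 0, 1]        # [c_0, c_1, c_2, c_3]
print(cyclic_shift(word))  # [1, 1, 1, 0]
```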

We can construct a codeword in the form $p(x) = m(x)g(x) + r(x)$ where $m(x)$ is the message, $g(x)$ is the (monic) generator and $r(x)$ a remainder of lower degree than $g(x)$.

But if we stipulate that $g(x)$ is a codeword of the smallest degree, then $r(x)$ must be zero – since the code is linear, $r(x) = p(x) - m(x)g(x)$ would otherwise itself be a codeword of lower degree than $g(x)$.
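The construction $p(x) = m(x)g(x)$ can be sketched with carry-less (GF(2)) polynomial arithmetic, representing each polynomial as an integer bitmask – the generator $x^3 + x + 1$ and the message here are my own illustrative choices:

```python
def gf2_mul(a, b):
    """Multiply two GF(2) polynomials held as integer bitmasks (XOR, no carries)."""
    result = 0
    while b:
        if b & 1:
            result ^= a
        a <<= 1
        b >>= 1
    return result

def gf2_mod(a, g):
    """Remainder of a(x) divided by g(x) over GF(2)."""
    while a.bit_length() >= g.bit_length():
        a ^= g << (a.bit_length() - g.bit_length())
    return a

g = 0b1011             # x^3 + x + 1 (a hypothetical generator)
m = 0b1101             # an arbitrary message polynomial
p = gf2_mul(m, g)      # codeword p(x) = m(x) g(x)
print(gf2_mod(p, g))   # 0 -- the remainder vanishes, as expected
```

This is the same division a CRC performs: a received word is a valid codeword exactly when its remainder modulo $g(x)$ is zero.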

So, what does a generator matrix look like?

If we have a monic generator polynomial $g(x)$ of degree $k$ and a code word length of $n$, then $G$ is the $(n - k) \times n$ matrix whose rows are successive shifts of the coefficients of $g$:

$G = \left( {\begin{array} {cccccccc} g_0 & g_1 & g_2 & ... & g_k & 0 & 0 & 0 \\ 0 & g_0 & g_1 & g_2 & ... & g_k & 0 & 0 \\ ... & ... & ... & ... & ... & ... & ... & ... \\ 0 & 0 & 0 & g_0 & g_1 & ... & g_{k-1} & g_k \end{array} } \right)$
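Building that matrix mechanically – a sketch in which row $i$ holds the coefficients of $x^i g(x)$ (the generator $1 + x + x^3$ and length $n = 7$ are my own example, not from the book):

```python
def generator_matrix(g_coeffs, n):
    """Return the (n - k) x n generator matrix whose rows are shifts of g's coefficients.
    g_coeffs lists [g_0, g_1, ..., g_k] for a generator of degree k."""
    k = len(g_coeffs) - 1
    return [[0] * shift + g_coeffs + [0] * (n - k - 1 - shift)
            for shift in range(n - k)]

# g(x) = 1 + x + x^3, code word length n = 7:
for row in generator_matrix([1, 1, 0, 1], 7):
    print(row)
```

Multiplying a message row vector by $G$ (mod 2) then produces exactly the codewords $m(x)g(x)$ described above.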