*(This is a translation of a 2002 article)*

A colleague asked me, "How can I tell how many bits a number needs without dividing repeatedly by two?", to which I quickly replied with:

ν.N = ⌊log.N / log.2⌋ + 1

and, after staring at it for a while I added, "It's pretty easy to see where this comes from."
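The formula can be checked against the repeated-halving method my colleague had in mind (a small Python sketch; the names `nu` and `nu_by_halving` are mine, and note that the article's quotient of logarithms, `log(N) / log(2)`, can misround in floating point at exact powers of two, so `math.log2` is used instead):

```python
import math

def nu(N):
    """nu.N = floor(log.N / log.2) + 1, the closed formula.

    math.log2 is used rather than math.log(N) / math.log(2): the
    floating-point quotient of logarithms can misround at exact powers
    of two, which is precisely where the floor is most sensitive.
    """
    return math.floor(math.log2(N)) + 1

def nu_by_halving(N):
    """Count bits the colleague's way: divide repeatedly by two."""
    bits = 0
    while N > 0:
        N //= 2
        bits += 1
    return bits

# The two agree on positive integers (Python's int.bit_length agrees too).
assert all(nu(N) == nu_by_halving(N) == N.bit_length()
           for N in range(1, 1 << 12))
```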

Mentally trying to organize the proof, I hit a first hurdle: how to justify the term "+ 1". Since I was committed to a proof outline, I resorted to pulling a rabbit out of the hat.

For starters, we know that the definition of the floor function is:

(0) ⌊x⌋ = N ≡ N ≤ x < N + 1

with `N` an integer; as a consequence, for `N` an integer:

(1) ⌊x + N⌋ = ⌊x⌋ + N

I'm aiming at putting the following definition into symbols: if the number `N` is greater than or equal to a power of two 2^{k} (and less than 2^{k + 1}), then it needs ν.`N` = `k` + 1 bits for storing it; so (and this is the rabbit) I premultiply by two to compensate for the extra bit:

    2^{k} ≤ 2·N < 2^{k + 1}
    ≡      { taking logarithms on both sides }
    k·log.2 ≤ log.2 + log.N < (k + 1)·log.2
    ≡      { dividing both sides by log.2 }
    k ≤ 1 + log.N / log.2 < k + 1
    ≡      { (0) }
    k = ⌊1 + log.N / log.2⌋
    ≡      { (1), commutativity }
    k = ⌊log.N / log.2⌋ + 1
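Numerically, the premultiplied inequality does pin down the bit count (a quick Python check; `int.bit_length` plays the role of ν here, and `math.log2` stands in for the quotient of logarithms to avoid floating-point misrounding at powers of two):

```python
import math

for N in range(1, 1 << 12):
    k = N.bit_length()                  # nu.N, the bits needed to store N
    assert 2**k <= 2 * N < 2**(k + 1)   # the premultiplied starting inequality
    assert k == math.floor(math.log2(N)) + 1  # the derived closed form
```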

The justification for starting with 2·`N` is more heuristic than anything else (meaning, since I know where I want to arrive, I can define where I will start). The key fact doesn't change: 2^{k} needs `k` + 1 bits in its binary representation, one "1" bit followed by `k` "0" bits. Hence, the following alternative proof:

    2^{k - 1} ≤ N < 2^{k}
    ≡      { taking logarithms on both sides }
    (k - 1)·log.2 ≤ log.N < k·log.2
    ≡      { dividing both sides by log.2 }
    k - 1 ≤ log.N / log.2 < k
    ≡      { (0) }
    k - 1 = ⌊log.N / log.2⌋
    ≡      { algebra }
    k = ⌊log.N / log.2⌋ + 1
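The same numeric check works for the alternative starting point, without the doubling (again with `int.bit_length` as ν and `math.log2` for the logarithm quotient):

```python
import math

for N in range(1, 1 << 12):
    k = N.bit_length()              # nu.N
    assert 2**(k - 1) <= N < 2**k   # N needs exactly k bits
    assert k - 1 == math.floor(math.log2(N))
```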

This proof is essentially the same as the former, but it avoids not only the rabbit but also the appeal to (1).
