Shannon Sags

Week in, week out. Got to venture into the park this morning but the indisposition limited my constitutional in magnitude. And the summerishness continues to intensify. Sometimes I am glad to be senior so I can get away from these annoyances sooner.

Speaking of annoyances, I ran across an article [Link] complaining about the inadequacies of Shannon’s information theory. The most intriguing thing about this is that someone thought it news. I can recall reading papers at conferences on how broken Ma Bell’s information theory was fifteen or twenty years ago. Apparently this is one of those cases of non-academics either not being relevant or being too far ahead of the academic herd. I am going to continue to entertain both conjectures absent any stronger evidence than academic ‘bitchin’.

The problem is that Shannon entropy counts only encoded information, and it does that by counting characters. The more letters in your alphabet, the more accessible states and hence the more entropy. So the statements “dogs eat” and “Spot ate” have the same entropy but oh, so different, information content.

If your dog is named Spot, of course. Otherwise it’s the same. 
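A minimal sketch of the point, assuming a uniform memoryless source over a 27-symbol alphabet (26 lowercase letters plus space, a toy model not taken from the article): under Shannon’s measure, a message’s information is just the sum of −log₂ p(symbol) over its symbols, so any two eight-character messages from the same alphabet come out identical, meaning or no meaning.

```python
import math

def message_information(msg, alphabet):
    """Self-information of a message, in bits, under a uniform
    memoryless source model: sum of -log2(p) per symbol."""
    p = 1.0 / len(alphabet)  # every symbol equally likely
    if any(c not in alphabet for c in msg):
        raise ValueError("message uses symbols outside the alphabet")
    return sum(-math.log2(p) for _ in msg)

# toy alphabet: 26 lowercase letters plus the space character
alphabet = set("abcdefghijklmnopqrstuvwxyz ")

a = message_information("dogs eat", alphabet)
b = message_information("spot ate", alphabet)
print(a == b)  # the measure sees symbol counts, not meaning
```

Both messages score 8 × log₂ 27 ≈ 38 bits; the measure cannot tell a report about a specific dog from a generality about all dogs.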

So information is contextual and depends on how it relates to knowledge.

And Shannon information theory doesn’t do that. 

But it does explain a lot about why journalism is so bad.
