11 August 2007

Containing multitudes

I just had a really kooky thought. This was spurred by having my concept of genetic information straightened out.

The upshot is that the technical sense of 'information' is reduction of uncertainty, and this leads to some counterintuitive results. E.g., if you have two notes with the exact same thing written on them, you might be tempted to say that's twice the information, but that's not the technical sense: once you've read a sentence, reading it again tells you nothing new, hence carries no 'information'. That is, 'information' is measured by how much it informs.
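A minimal sketch of this in Python, using Shannon's self-information (the eight-equally-likely-messages prior is just an assumption for illustration): reading the note the first time resolves real uncertainty, while rereading the identical second note resolves none.

```python
import math

def surprisal(p):
    """Self-information, in bits, of observing an event with probability p."""
    return -math.log2(p)

# Before reading, suppose the note's content could be any of 8 equally
# likely messages (a made-up prior for illustration): reading it
# resolves 3 bits of uncertainty.
first_read = surprisal(1 / 8)   # 3.0 bits

# Having read it, the content is now certain (probability 1), so reading
# the identical second note resolves nothing: 0 bits.
second_read = surprisal(1.0)    # 0.0 bits
```

The second note is redundant in exactly the technical sense above: it informs you of nothing you didn't already know.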

When we take this application of information theory to genetics and turn it back on informatics, we get some wonderfully strange results: after all, when you learn something you gain information, yes? Now, in meme-type theories, you tend to think of one mind as one organism, because that's nice and intuitive. But the human mind is able to contain far more uncertainty than any single organism's genes. Wouldn't this make the human mind more like a species?

More astute wackiness ensues when we consider that the information in a genotype typically increases under selection pressure. Information in a mind increases while learning; hence, studying becomes the meme-equivalent of selection pressure.

Yes, you are reading Blogger, home of half the crackpot theories in the universe. But still, it amuses me to think of exams as extinction events.
