Posted on behalf of Retread
People speak of information pretty glibly. Claude Shannon defined it 61 years ago, in terms of bits (binary digits, which can be ones or zeros), in a paper for electronics that built on his classified work during World War II. Neuroscientists speak of information processing by the brain as the way it manipulates its input (a series of action potentials in nerve fibers, which are about as close to Shannon's ones as you can get).
So that’s what information is. But we really don’t understand the entities (electronics, the brain) that actually do the processing terribly well. Consider first solid-state electronics, which carries Shannon’s ones and zeros. Just how well do we understand the solid state? Not very well, according to the Nobel physicist Robert Laughlin in his book “A Different Universe: Reinventing Physics from the Bottom Down”. Quantum mechanics is now introduced to chemists in college, and I assume a course is obligatory in graduate school these days. Laughlin says it doesn’t really matter for understanding the solid state, in the same way that the underlying chemical structure of the zillions of organic compounds that have been crystallized does not in any sense matter in explaining the crystalline state. All that matters is that each molecule adopts the same shape, regardless of what that shape is (this is why proteins are hard to crystallize). The book will make your head swim.
How about the brain? Do we understand it? Ask your friendly neighborhood neuroscientist why we need sleep, or better, exactly how and where in the brain memories are stored. You may hear a few mumbles about reverberating circuits or long-term potentiation, but we really don’t know. Although the brain has 10^10 neurons and probably 10^13 synapses (which are how neurons talk to each other), we can’t use statistical mechanics to understand it; that machinery works only for vastly simpler systems. Even in the case of the monatomic ideal gas, the atoms are assumed not to interact with each other (other than by colliding), and their energies are assumed low enough that electronic excitation isn’t possible. Just as a list of the 10^23 positions and the 10^23 momenta does not explain the pressure of a gas, a list of what the 10^13 synapses are doing every millisecond, in addition to being incomprehensible, would not explain in any sense how and where memories are stored.
Where does chemistry come in? Consider the Chemiotics posts of 9 Feb and 20 Jan. They say something profound about information, and not just in the cell. The information in DNA depends on how it’s read: one way by the ribosome reading mRNA to make a protein, another by the splicing machinery determining which mRNA gets made, and a third by microRNAs determining how long the mRNA hangs around. Only through chemistry can the reader of the information be understood, and I think chemists understand the readers fairly well. I’m not sure Shannon’s concept of information entropy could even be applied to a DNA sequence being read three ways at once by different molecular machines. All the discussions of information I’ve seen pretty much ignore what’s actually doing the reading.
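To make the point concrete, here is a minimal sketch (in Python, with a made-up sequence) of what Shannon's information entropy actually computes for DNA: it treats the sequence as a bare stream of symbols and counts their frequencies. The number that comes out is entirely blind to whether a ribosome, the spliceosome, or a microRNA is doing the reading, which is exactly the limitation described above.

```python
from collections import Counter
from math import log2

def shannon_entropy(seq: str) -> float:
    """Shannon entropy in bits per symbol, treating the sequence
    as a memoryless stream of independent symbols."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A hypothetical DNA fragment, invented for illustration only.
# The result says nothing about what any molecular machine
# would make of this sequence when reading it.
dna = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"
print(f"{shannon_entropy(dna):.3f} bits per base (2.0 is the maximum for 4 bases)")
```

A sequence of all one base gives 0 bits per symbol; equal amounts of all four bases give the maximum of 2 bits. But the same string, and hence the same entropy, is handed to all three readers, so the measure cannot distinguish among them.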
Galileo famously said, “The universe cannot be read until we have learnt the language and become familiar with the characters in which it is written. It is written in mathematical language”. Well, the information we have the best chance of understanding (because we understand the reader) is written in the language of chemistry. Thus do chemists stand astride the Cartesian dualism of materiality and the nonphysicality of information.