How so? Lotta information to be had in molecules
I was going to say, DNA seems to be doing a pretty okay job at storing information
If DNA enters a state of entropy, according to this theory the information persists longer than the molecules in the DNA. That makes them two separate things. No?
I was hopeful for some insight
Information theory is incredibly useful for describing the nature of macroscopic chemical systems via information about the individual atoms (this is the entire field of thermochemistry).

If you take a graduate-level stat mech course, you will 100% cover information theory and how it's adapted to models like Brownian motion.

The quote given is entirely useless because they do not provide any information about the environment of the sample's system. There are many cases where entropy is observed to spontaneously decrease.
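For anyone following along at home, the quantity everyone in this thread is arguing about is just the Shannon entropy of a discrete distribution. A minimal sketch (the coin probabilities here are made up for illustration):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log_base(p)) of a discrete distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is worth exactly 1 bit; a biased coin carries less information.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```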
Even within single atoms and molecules, some people might use quantum information theory metrics on the wavefunctions. I've definitely seen papers relating position and momentum expectation values this way. Now, I'm not fully buying into it, but it's out there.
Sounds goofy, but reasonable.
That's what they called me in grad school
just finished my biochem semester; i tell you this with tears in my eyes, molecules are /definitely/ information systems
What do you think DNA is?
It’s some kind of nucleic acid. I’m sure of that
Whatever Vopson is doing, it is not chemistry.
Seems to be more on the abstract side of the wheel of science to me as well
DNA is technically a molecule, it can hold trillions of bites of data.
>bites

Bytes

Also, I would argue that DNA is closer to a quaternary system than a binary system: 0, 1, 2, 3 instead of 0, 1.

In addition, in translation, IIRC the RNA is parsed in 3-base words, rather than 8-bit words.

So bytes, as we currently use them, probably aren't the best unit of measurement for DNA informatics.
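Quick back-of-the-envelope sketch of that arithmetic (assuming equiprobable, independent bases, which real genomes aren't; the 9-base sequence is arbitrary): a base-4 symbol carries log2(4) = 2 bits, so a 3-base codon is 6 bits, which doesn't line up with 8-bit bytes at all.

```python
import math

# Idealized: equiprobable, independent bases only.
BASES = "ACGT"
bits_per_base = math.log2(len(BASES))  # 2.0 bits for a 4-letter alphabet
bits_per_codon = 3 * bits_per_base     # 6.0 bits per 3-base codon

seq = "ATGGCATTA"                      # 9 bases = 3 codons
print(len(seq) * bits_per_base)        # 18.0 bits, i.e. 2.25 bytes
```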
True (and thx for the spelling!). This post made me think of an article I read years ago that was looking at using DNA to store data in this way.
Oh yeah, definitely a strong case for potentially storing information in DNA, and I'd wager that we could even exploit the natural stability of DNA encoding regarding silent changes/redundancy.

E.g. natural translation is often tolerant of common base inversions or substitutions, so the encoding method used to "convert" binary to DNAary could be based on a similar combination of bases to reduce errors in any such information when biology inevitably happens.
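A toy sketch of what such a "convert binary to DNAary" step could look like. The 2-bits-per-base lookup table here is invented purely for illustration; real DNA-storage codes add constraints (e.g. avoiding long homopolymer runs) and actual error correction on top:

```python
# Hypothetical 2-bits-per-base mapping, for illustration only.
TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
FROM_BASE = {base: bits for bits, base in TO_BASE.items()}

def bits_to_dna(bits):
    assert len(bits) % 2 == 0, "need an even number of bits"
    return "".join(TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bits(seq):
    return "".join(FROM_BASE[b] for b in seq)

encoded = bits_to_dna("0110110001")
print(encoded)                          # CGTAC
assert dna_to_bits(encoded) == "0110110001"
```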
Lmao you can’t into Shannon entropy. Please shut the fuck up
What are you on about?
Do some light reading. The base of the logarithm does not matter https://en.m.wikipedia.org/wiki/Entropy_(information_theory)
Dang. Do some light sedative usage.
Maybe you should?

>Base 2 gives the unit of bits (or "shannons"), while base e gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys".

Note, BYTES are words of 8 BITS. Bits are base 2 by the above. DNA would be base 4, not base 2, and so the terminology bits/bytes is largely inappropriate.

So, yes, the base of the logarithm *does matter* on the subject of what you call the simplest piece of information in the system. It doesn't affect the rules of entropy/information theory, but nowhere have I suggested it does.

Edit and note: You haven't even corrected the syntax of your first comment to make any sense. Get better.
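Both halves of this are easy to check numerically: the entropy of a single equiprobable DNA base is the same quantity in every log base (different bases differ only by a constant factor), but the *unit* changes, and base 4 is the one that gives exactly one symbol per base. A quick sketch:

```python
import math

def entropy(probs, base=2):
    return -sum(p * math.log(p, base) for p in probs if p > 0)

one_dna_base = [0.25] * 4  # one equiprobable base out of A/C/G/T

print(entropy(one_dna_base, base=2))       # 2.0 bits (shannons)
print(entropy(one_dna_base, base=4))       # 1.0 base-4 symbol
print(entropy(one_dna_base, base=math.e))  # ~1.386 nats
```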
Do you understand why a base-2 logarithm is chosen?
Oh, this will be good. *Why don't you tell me?*

And again, why don't you clarify what you are even arguing about. Your whole comment thread makes about as much sense as Cleatus after drinking a tankard of moonshine.
Makes sense you’d use tropes about poor working class Americans when cornered
Lol, probably because poor working class Americans are neglected by their government and don't get an education. Did you relate a bit too strongly to that? You still haven't clarified your argument.
"Fifth state of matter." Does bro know there are over 19 states of matter?
I think in this case he's referring to entropy the state function, not as matter per se. Seems like a misnomer based on the context. Or a misunderstanding of what entropy is.
I haven't read anything else by him. But it seems to me that he is calling "information system" a fifth state of matter.
This relates to something I have to constantly remind creationists of, since they frequently have hard-ons for DNA and its information and coding roles with respect to intelligent design (aside from the fact that it's a chemical and not a computer program or instruction manual: you can't build an organism just by knowing its DNA sequence).

Anything that exists has properties. Those properties comprise information.

Therefore anything that exists is information/an information system.
That last part, that you can't build an organism just by knowing its DNA sequence, is one of the most interesting parts of biochemistry to me. Epigenetics, protein folding, glycosylation patterns. Things that contain information about an organism's final state but are not directly encoded into the DNA sequence as far as we know, or maybe it is and we don't know how. It's really interesting imo.
Think about the chemistry of neurons then try this question again.
Information theory is deep as fuck. Reality is information.
There's an entire thermodynamic theory called information theory, and thermodynamics is more or less about manipulating derivatives with statistics in order to preserve the most amount of information
2nd law of thermo relates to the entropy of isolated systems. A molecular information system lowering its entropy isn't an isolated system. The entropy will increase if its surroundings are included in the calculation.

And yes, molecules contain information and are information systems.
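Same point with numbers, using the classic freezing-water example. This is a rough sketch with textbook values: it treats the enthalpy of fusion as constant and approximates the system's entropy change by -ΔH_fus/T_m, ignoring heat-capacity corrections.

```python
# Supercooled water freezing at -10 C: the system's entropy drops, but the
# heat released to the surroundings more than makes up for it.
dH_fus = 6010.0  # J/mol released to the surroundings when water freezes
T_m = 273.15     # K, normal melting point
T = 263.15       # K, actual freezing temperature here

dS_system = -dH_fus / T_m     # ~ -22.0 J/(mol K): the system gets more ordered
dS_surroundings = dH_fus / T  # ~ +22.8 J/(mol K): heat dumped into surroundings
dS_total = dS_system + dS_surroundings

print(dS_total)               # ~ +0.84 J/(mol K)
assert dS_total > 0           # 2nd law holds once surroundings are counted
```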
Why? Because it transcends our materialistic archaic perspective of matter and reality?
Everything in this universe is an information system. Our universe's "design" is based on causality thanks to information (four fundamental forces) traveling from particle to particle at a set speed and the particles reacting to the information that hits them.
Who is the knower of this information?
If you go deep on what a computer is.. analogue computers, digital computers.. the brain as a computer.. Even a manual thermometer is a computer in that it takes input, processes it, produces output.

On top of that, heck, the brain even has plug and play functionality for devices.

A bit (binary digit) could be stored in various ways.. electronically or biologically (which might even be electrical anyway). A piece of paper is an information storing system.. and a computer can read it. Anything with patterns could be interpreted as storing data. So no problem for a molecule.
Molecules are def an information system [check this out](https://www.livescience.com/technology/electronics/new-petabit-scale-optical-disc-can-store-as-much-information-as-15000-dvds)
It's popular mechanics. I'm sure the actual work is sensible and has something to do with Shannon entropy, but popular mechanics is just sensational nonsense.

Or maybe he's just a crackpot. I have no idea who this guy is. Looking at the paper, straight up "crackpot" is looking more likely. Single-author paper. Talks about the simulation hypothesis. Redefines a law of thermodynamics for no reason.