Comments
efh1 OP t1_j1qge1b wrote
This paper is on information theory and covers entropy as it relates to genetics. The same author has other papers on information theory that are more in the area of mathematics or physics, as well as a paper on experimentally testing the theory that is more in the area of condensed matter physics.
The mass-energy-information equivalence principle: https://aip.scitation.org/doi/full/10.1063/1.5123794
Experimental protocol for testing the mass–energy–information equivalence principle
664C0F7EFEFFE6 t1_j1qh8hn wrote
Fascinating read. Thanks for sharing.
efh1 OP t1_j1qtr5a wrote
He is suggesting that information has a small amount of mass, and he has calculated it. He then proposes an experimental design for detecting it. Perhaps his underlying assumptions are wrong, but it's very interesting and testable, so we should reserve judgement on whether it's right or wrong until there is experimental evidence.
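For a sense of scale: as I understand the 2019 AIP Advances paper linked above, the estimate comes from combining Landauer's bound with E = mc², giving a mass per bit of kB·T·ln(2)/c², roughly 3×10⁻³⁸ kg at room temperature. A minimal sketch of that arithmetic (my numbers, not a quote from the paper):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
c = 299_792_458      # speed of light, m/s
T = 300              # room temperature, K

E_bit = k_B * T * math.log(2)   # Landauer limit: minimum energy to erase one bit, J
m_bit = E_bit / c**2            # mass equivalent via E = mc^2, kg

print(E_bit)   # ~2.87e-21 J
print(m_bit)   # ~3.2e-38 kg per bit at 300 K
```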
tornpentacle t1_j1qy1ch wrote
Right, but isn't this completely ignoring the very obvious fact that the universe is physical, and therefore anything in it is physical? This doesn't actually seem to be saying anything other than "things that exist exist in the same way that everything else that exists exists." It's also obvious that organizing things into a configuration humans can use as a reference (whether it be writing on paper, encoding on a disk, or punching holes in cards) takes energy to do, and, due to entropy, takes energy to maintain because of quantum effects and the degradation of matter into lower-energy configurations...
So given all that, I'm assuming I'm missing something, if this is indeed some kind of revolutionary idea. What am I missing? How is this not simply restating the laws of physics and framing it as some huge paradigmatic shift even though it's just (apparently) saying "the laws of physics haven't broken down yet"?
efh1 OP t1_j1qzjmo wrote
Because it's stating that information itself has mass, and the current classical interpretation doesn't. He then predicts results that wouldn't occur under current interpretations. Adding information to the fundamental framework alongside mass and energy would certainly lead to paradigm shifts. The theory doesn't contradict any other theories, so if it were confirmed it would integrate well. The fact that it's testable means it should be considered even if you disagree with the assumptions, because anything that could potentially move science forward deserves consideration.
FienArgentum t1_j1r4yiu wrote
Really fascinating. I would love to read more.
Tioben t1_j1ro7uq wrote
Cool article! Could someone Explain Like I'm 14 the part about how informational entropy decreasing linearly with time proves mutations are not random?
ory_hara t1_j1sv1ef wrote
First of all, nothing in the paper proves that mutations are not random. Here is what is stated:
>... but it also points to a possible deterministic approach to genetic mutations, currently believed to be just random events.
This is an extremely bold statement to make and is purely speculation at this point (as acknowledged in the paper).
Basically, we already know that mutations happen in nature, but scientists don't seem to agree exactly why and how all mutations happen. We know that radiation can cause mutations because we've seen it happen and seem to understand the mechanism of action. We know that viruses tend to mutate over time too, but speculating that mutations are deterministic based on the available dataset is an outrageous reach.
Since you're like 14, think of the paper as presenting a dual, or inverse, conjecture to the 2nd law of thermodynamics, only with information in place of entropy. The thing most people probably won't pick up right away is that entropy and information are basically the same thing in different-colored clothing (there's a small sketch of this at the end of this comment). So essentially, the paper re-words the second law of thermodynamics to apply to information and makes a general hypothesis that this applies universally. It doesn't actually *prove* anything, and if it did, it would be the first successful proof by example. It does, however, show that the hypothesis seems to hold in two different types of systems and demonstrates the relation between them.
So how does this relate to mutations again? Long story short, it suggests that we can perhaps reduce the search space of likely viral mutations (and perhaps of other systems) by calculating probabilities. That isn't exactly novel, but now that it has been formalized it can be used more effectively.
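And here's the promised sketch (my own, not from the paper) of why entropy and information are "the same thing in different clothing": Shannon's entropy and the thermodynamic (Gibbs) entropy are the same sum over probabilities, differing only by a constant factor of kB·ln 2.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2 p)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs entropy in J/K: S = -k_B * sum(p * ln p)."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

p = [0.5, 0.5]  # one unbiased binary memory cell (or a fair coin)
H = shannon_entropy_bits(p)           # 1.0 bit
S = gibbs_entropy(p)                  # k_B * ln(2) J/K
print(H, S / (k_B * math.log(2)))     # the ratio recovers H exactly
```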
Tioben t1_j1sv98x wrote
Much obliged!
AgentHamster t1_j1t5m9r wrote
Let's imagine what would happen if all mutations were random and every base had the same probability of turning into any other base. What would be the result? The sequence would become entirely randomized, with each nucleotide having a 25% probability of being any one of the four bases. This would be the state with the maximum Shannon entropy, which would be 2N bits, where N is the DNA length in base pairs. The fact that you get a decrease in Shannon's IE suggests that this isn't what's occurring - i.e., the mutations you see don't just scramble the genome into a randomized collection of the 4 bases, which would increase rather than decrease Shannon's IE.
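To make that concrete, here's a quick toy calculation (mine, not the paper's) of per-base Shannon entropy: a fully scrambled sequence sits at the 2-bits-per-base maximum, while any compositional bias pulls it below that.

```python
from collections import Counter
import math
import random

def entropy_bits_per_base(seq):
    """Shannon entropy in bits per base, estimated from base frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
scrambled = "".join(random.choice("ACGT") for _ in range(100_000))
biased = "".join(random.choices("ACGT", weights=[0.4, 0.1, 0.1, 0.4], k=100_000))

print(entropy_bits_per_base(scrambled))  # ~2.0 bits/base, i.e. ~2N bits total
print(entropy_bits_per_base(biased))     # noticeably below 2 bits/base
```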
Edit - I want to point out that the article (from my understanding) is rather confusing, since it uses two distinct arguments in the two sections of the paper. The first part essentially claims that information decreases due to the loss of coherent regions of the film - in other words, regions of the film no longer coherently encode a 0 or 1 and thus cannot be counted as information-encoding regions. This is despite the fact that the Shannon entropy per region should actually increase. The second part of the paper deals with an example where this loss-of-regions argument can no longer be applied.
Tioben t1_j1t9c51 wrote
Thanks! Huh.
But, okay, suppose we had a magnetized array where all the magnets are entangled, so if one is up all are up. The system would only have a single bit of information. No matter which magnet you flip, the whole array can only express a single 1 or a 0.
If you could physically force them to disentangle, my intuition is that this would increase the thermodynamic entropy of the system, because the magnets are now more mixed. But wouldn't it also increase the informational entropy, because now you can express more independent bits?
(I mean, to be honest, not sure if answering this will make me any less confused.)
ubermeisters t1_j1tauzu wrote
Maybe dark matter is just far-away information?
efh1 OP t1_j1tb6qn wrote
This concept does potentially address dark matter and it’s funny to me that people are just beginning to point this out. I shared a video of Vopson explaining his theory and multiple people just commented about it as well.
So, what’s the connection to dark matter? Vopson says, “M.P. Gough published an article in 2008 in which he worked out … the number of bits of information that the visible universe would contain to make up all the missing dark matter. It appears that my estimates of information bit content of the universe are very close to his estimates.”
AgentHamster t1_j1th4l4 wrote
Yes, your intuition is correct, although I will admit the general idea of informational content (as measured by Shannon entropy) increasing with increasing randomness seems a bit counterintuitive. If all you knew was that each magnet had a 50-50 chance of being up or down, Shannon entropy would be maximized, since you would have the maximum number of possible configurations with which to encode your message. For example, there are 6 possible ways to have two ones and two zeros in a sequence of 4 numbers (1100, 1010, 1001, 0110, 0101, 0011). This means you can encode 6 different options under this constraint. In contrast, if you knew all your magnets had to be in one configuration (for example, 1111), you would have far fewer combinations (only one in this case, so zero bits) with which to encode information.
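You can check the counting directly (just standard combinatorics, nothing specific to the paper):

```python
from itertools import product
from math import comb, log2

# Length-4 binary strings with exactly two 1s and two 0s:
half_and_half = ["".join(b) for b in product("01", repeat=4) if b.count("1") == 2]
print(len(half_and_half), half_and_half)   # 6 == comb(4, 2) distinguishable messages
print(comb(4, 2), log2(comb(4, 2)))        # ~2.58 bits of choice

# Forcing every magnet to 1 leaves a single configuration:
print(log2(1))                              # 0 bits -- nothing left to encode
```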
If you take this as is, you might argue that it goes against the general idea of their paper. However, it seems that they define the total informational entropy as Sinf = N*ln(2)*kb*H(x), where H(x) is the Shannon entropy and N is the number of domains encoding the information (?). While H(x) goes up with increasing randomness, the article claims N goes down (because the domains themselves are lost due to decoherence in the individual regions used to store information, if I am understanding correctly). It's honestly a little hard for me to tell, since they don't have an N or Sinf plot for Fig 2.
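To see how those two terms can pull in opposite directions, here's a toy calculation with made-up numbers (the paper gives no N or Sinf plot for Fig 2, so this is only illustrative): total informational entropy, as they define it, falls whenever N shrinks faster than H(x) grows.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def S_inf(N, H):
    """Total informational entropy as defined in the paper: N * k_B * ln(2) * H."""
    return N * k_B * math.log(2) * H

# Hypothetical trajectory: domains decohere (N drops) while each surviving
# domain gets more random (H rises toward 1 bit per domain).
for N, H in [(1000, 0.50), (700, 0.65), (400, 0.85), (150, 0.99)]:
    print(N, H, S_inf(N, H))
# S_inf decreases down the list: the loss of domains outweighs the rise in H.
```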
zenzukai t1_j1wu6s7 wrote
Living organisms also error-correct most single-base changes. And if a mutation is non-viable, that eliminates many types of mutations from being reproduced.
There is no need for a mathematical information theory to rationalize mutation rates, because natural selection already does.
This paper is an extension of Maxwell's Demon thought experiment.