
AgentHamster t1_j1th4l4 wrote

Reply to comment by Tioben in Second law of information dynamics by efh1

Yes, your intuition is correct, although I'll admit the general idea of information content (as measured by Shannon entropy) increasing with randomness seems a bit counterintuitive. If all you knew was that each magnet had a 50-50 chance of being up or down, Shannon entropy would be maximized, since you would have the maximum number of possible configurations with which to encode your message. For example, there are 6 possible ways to have 2 ones and 2 zeros in a sequence of 4 bits (1100, 1010, 1001, 0110, 0101, 0011), so this constraint lets you encode 6 different options. In contrast, if you knew all your magnets had to be in one configuration (say, 1111), you have far fewer combinations available to encode information (only one in this case, so 0 information).
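If it helps, here's a quick Python sketch of that counting argument (a minimal illustration I put together, not anything from the paper):

```python
from itertools import product
from math import log2

# Enumerate all 4-bit sequences with exactly two 1s and two 0s.
sequences = ["".join(bits) for bits in product("01", repeat=4)
             if bits.count("1") == 2]
print(sequences)       # ['0011', '0101', '0110', '1001', '1010', '1100']
print(len(sequences))  # 6 distinct messages can be encoded

# Shannon entropy (bits per symbol) of a binary source with P(1) = p.
def shannon_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -(p * log2(p) + (1 - p) * log2(1 - p))

print(shannon_entropy(0.5))  # 1.0 -> maximal (the 50-50 magnets)
print(shannon_entropy(1.0))  # 0.0 -> all magnets forced into one state
```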

If you take this as is, you might argue that it goes against the general idea of their paper. However, it seems they are defining the total informational entropy as S_inf = N * ln(2) * k_B * H(X), where H(X) is the Shannon entropy and N is the number of domains encoding the information (?). While H(X) goes up with increasing randomness, the article claims N goes down (because the domains themselves are lost to decoherence in the individual regions used to store information, if I'm understanding correctly). It's honestly a little hard for me to tell, since they don't have an N or S_inf plot for Fig 2.
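Here's a minimal sketch of how those two factors trade off under that definition (the formula is my reading of the paper, so treat it as an assumption):

```python
from math import log, log2

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy(p):
    """Shannon entropy (bits) of a binary source with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * log2(p) + (1 - p) * log2(1 - p))

def s_inf(n_domains, p):
    """Total informational entropy, S_inf = N * k_B * ln(2) * H(X),
    as I read the paper's definition (an assumption on my part)."""
    return n_domains * K_B * log(2) * shannon_entropy(p)

# H(X) is already maximal at p = 0.5, so here the only way S_inf can
# fall is if the number of surviving domains N drops, which seems to
# be the paper's claim about decoherence.
print(s_inf(1_000_000, 0.5))  # all domains intact
print(s_inf(400_000, 0.5))    # fewer domains -> lower S_inf
```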
