Submitted by DryEstablishment2 t3_zy4vh0 in explainlikeimfive
Scary-Competition838 t1_j23rehy wrote
Computers do math in binary, so a “big number” is not much more demanding than a small one: the electrons just flow through a circuit that tells them what to do, and even an operation that takes twice or ten times as long usually isn't apparent on a human scale.

To me, it's equally or more interesting how efficient a human brain is at some kinds of calculations, particularly abstract ones once developed, yet it only ever treats a number's difficulty as proportional to the size it represents rather than to the information actually encoded in it, which can be much less.
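For a rough sense of that last point, here's a minimal Python sketch (the example values are just illustrative) showing how little information a "big" decimal number actually carries once it's in binary:

```python
# Bit count grows only with the logarithm of the value,
# so a billion still fits comfortably in 30 bits.
for n in [7, 1_000, 1_000_000, 1_000_000_000]:
    print(f"{n:>13} -> {n.bit_length()} bits: {bin(n)}")
```

A billion feels huge to us, but to the hardware it's just 30 wires carrying a 0 or a 1.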