zethenus t1_j2iw3l3 wrote

Not a Computer Scientist. I don’t have a Computer Science degree. The way I understood it is this.

Computers work on 1s and 0s. So apply that to a light bulb: 1 is on, 0 is off. So 8-bit is 8 light bulbs in a row, 16-bit is 16 light bulbs in a row, 32-bit is … so on and so on.
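A tiny Python sketch of that light-bulb idea (the on/off pattern here is made up, purely for illustration):

```python
# 8 "light bulbs" in a row, each either on (1) or off (0)
bulbs = [1, 0, 1, 1, 1, 0, 0, 1]  # hypothetical pattern

# Read the row of bulbs as an 8-bit pattern
pattern = "".join(str(b) for b in bulbs)
print(pattern)            # 10111001

# With 8 bulbs, there are 2^8 distinct on/off combinations
print(2 ** len(bulbs))    # 256
```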

Each combo of 1s and 0s is used to represent something. Thus 10111001 = something. (I just made that up. No clue what it might represent.)
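For what it's worth, one common way to read a pattern like that is as an unsigned binary number (this is just one possible interpretation, not what the comment had in mind):

```python
# Interpret the 8-bit pattern as an unsigned integer:
# 10111001 = 128 + 32 + 16 + 8 + 1
pattern = "10111001"
value = int(pattern, 2)  # parse the string as base-2
print(value)             # 185
```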

For the majority of computers today, we've managed to arrange millions of those light bulbs in a grid, in layers stacked on top of each other on a single surface, and that is a single computer chip. That's computing today.

Now imagine we managed to do that on a cube, which has 6 surfaces vs. 1 on today's computer chips. Thus one quantum computing chip would essentially be 6 of today's computer chips. How much more data could be processed in a single quantum computing chip vs. a single computer chip today?

That’s how I now understand it. However, I’ve no idea how accurate this is, if at all.
