Submitted by R0oty t3_10ovaci in explainlikeimfive
When I type 4 on my keyboard, how does a computer know its binary representation is 100? Is there software to do so, or how does it do it?
I don't know the exact answer you're looking for, but there is definitely a way to convert decimal to binary.
Use a loop that compares the number with increasingly larger powers of two.
Find the power of two that is just below (or equal)
So for 103, the power of two just below would be 64. Aka 2⁶
So then we know that in binary it is a seven-digit number (one bit each for 2⁶ down to 2⁰).
If the number is equal to or larger than 2⁶, subtract 2⁶ and note a '1'; otherwise note a '0'.
Then repeat in descending order for 2⁵, 2⁴, 2³, 2², 2¹, 2⁰.
>= Means greater than or equal to
!>= Means not greater than or equal to
103 !>= 128 so '0'
103 >= 64 so '1', and 103 - 64= 39
39 >= 32 so '1', and 39 - 32 = 7
7 !>= 16 so '0'
7 !>= 8 so '0'
7 >= 4 so '1' and 7 - 4 = 3
3 >= 2 so '1' and 3 - 2 = 1
1 >= 1 so '1' and we are done.
Unless I made a mistake, 103 in binary should be 01100111; you could also drop the leading 0 and just write 1100111.
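If you want to see it as code, here's a minimal sketch of that loop in C (the names are mine; a real library routine would be more careful about edge cases):

```c
#include <stdio.h>

/* Print a number in binary by comparing it against descending
   powers of two, as described above. */
void print_binary(unsigned int n) {
    unsigned int power = 1;
    while (power <= n / 2)      /* find the largest power of two <= n */
        power *= 2;
    while (power > 0) {
        if (n >= power) {       /* note a '1' and subtract */
            putchar('1');
            n -= power;
        } else {
            putchar('0');       /* note a '0' */
        }
        power /= 2;
    }
    putchar('\n');
}

int main(void) {
    print_binary(103);  /* prints 1100111 */
    return 0;
}
```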
Digits are usually represented by their ASCII code, with '0' having code 48 (binary 00110000) and '9' having code 57 (binary 00111001). So for a single digit the first step is to simply subtract 48. That would convert your '4' to binary 00000100.
For multi-digit whole numbers, the computer works from the first (leftmost) digit. For each ASCII digit it subtracts 48. If there's another digit following, it multiplies its current answer by ten, then continues, adding the next digit to the total.
So, for '1234' (ASCII 49, 50, 51, 52) it goes:
49 - 48 = 1; another digit follows, so 1 × 10 = 10
50 - 48 = 2; 10 + 2 = 12; another digit follows, so 12 × 10 = 120
51 - 48 = 3; 120 + 3 = 123; another digit follows, so 123 × 10 = 1230
52 - 48 = 4; 1230 + 4 = 1234, and we're done.
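A rough sketch of that left-to-right loop in C (the function name is just for illustration; real programs would usually call something like strtol):

```c
#include <stdio.h>

/* Digit-by-digit conversion as described above, working
   left to right through an ASCII string. */
int parse_decimal(const char *s) {
    int value = 0;
    while (*s >= '0' && *s <= '9') {
        value = value * 10 + (*s - '0');  /* '0' is ASCII 48 */
        s++;
    }
    return value;
}

int main(void) {
    printf("%d\n", parse_decimal("1234"));  /* prints 1234 */
    return 0;
}
```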
When you type 4 on your keyboard, the button you pushed presses one of many physical switches, each of which produces a unique number (in binary) called a scan code that tells your computer which of the many keys you pressed. At that point the computer knows which button you pressed, but not what that button represents (you may have seen that keyboards for different languages have different physical layouts).
Then, in the operating system, there is a table which knows the layout and can convert those scan codes into the appropriate binary representation of the number, letter, or symbol that the key you pressed represents.
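As a toy illustration in C, the idea is just a table lookup (the 0x21 value matches the USB HID usage code for the "4" key that another answer cites, but a real OS keymap is far more involved):

```c
#include <stdio.h>

static char us_layout[256];  /* scan code -> character for this layout */

int main(void) {
    us_layout[0x21] = '4';           /* one table entry: code 0x21 means '4' */
    unsigned char scan_code = 0x21;  /* the code the keyboard sent */
    printf("key pressed: %c\n", us_layout[scan_code]);
    return 0;
}
```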
There’s a little computer on a microchip inside your keyboard. That chip has a bunch of pins that are constantly (say 100x per second) checking the state of every single button on your keyboard. The buttons on your keyboard work like light switches; there is power on one side and it either connects the power or disconnects it.
When it sees a button state change, software on the microchip goes and looks in a table to see what code it should send to the computer.
Here the button code is sent to the computer over the wire or wireless connection. (Not ELI5: in this case the button code for the “4” key is the value 0x21 hexadecimal, 33 decimal; you can look them up in ch. 10 here: https://www.usb.org/document-library/hid-usage-tables-14 ).
Then the computer takes the button code and passes the keystroke (still not the value 4, but the character “4”) to the software that is running. That software can interpret it however it wants, basically through another lookup table. In a calculator application it will convert it to the value of 4, in a game it might select weapon slot 4, and in a document it might just output the character “4” directly to the document without converting it.
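Not ELI5, but here's what the firmware part of that might look like as a toy C sketch; the hardware access is faked with plain variables (real firmware reads electrical pins and talks to a USB controller), and the table holds the real HID usage codes for the number keys 1 through 8:

```c
#include <stdbool.h>
#include <stdio.h>

#define NUM_KEYS 8  /* a tiny toy keyboard */

/* Pretend hardware: stand-ins for reading one switch and for
   sending a code to the computer. */
static bool switch_state[NUM_KEYS];
static bool read_switch(int key) { return switch_state[key]; }
static void send_to_host(unsigned char code) { printf("sent code 0x%02X\n", code); }

/* Lookup table: which code to send for each switch. These are the
   USB HID usage codes for the number keys 1 through 8. */
static const unsigned char usage_code[NUM_KEYS] =
    { 0x1E, 0x1F, 0x20, 0x21, 0x22, 0x23, 0x24, 0x25 };

int main(void) {
    bool last_state[NUM_KEYS] = { false };
    switch_state[3] = true;                 /* simulate pressing the '4' key */
    for (int pass = 0; pass < 2; pass++) {  /* two passes of the scan loop */
        for (int key = 0; key < NUM_KEYS; key++) {
            bool pressed = read_switch(key);
            if (pressed && !last_state[key])    /* key just went down */
                send_to_host(usage_code[key]);  /* table lookup, then send */
            last_state[key] = pressed;
        }
    }
    return 0;
}
```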
Every character that you use in the everyday world is associated with a binary number per ASCII standards. The character '4' is not actually associated with the binary equivalent of 4 but rather with the binary equivalent of 52. So, if you input '4', your computer receives a signal of 110100 and proceeds from there.
4 in decimal form doesn't actually exist in the computer. Either the screen draws pixels in the shape of a 4, or you have a 4 printed on the keyboard - those are the only places "4" exists like that. The moment you press the 4 key, it's already in binary form: your keyboard sends the binary scan-code representation of "4" to the computer.
Thanks for the explanation.
To add to that, in most circumstances pressing four on the keyboard will not result in 00000100 binary being stored, but instead 00110100.
Keystrokes normally get stored as characters, not numbers, and this character just happens to be a digit.
The values of the characters '0' to '9' are not the values of the numbers 0 to 9, but those numbers with 48 (00110000) added to them.
To actually treat what you enter as a number rather than a string of characters the computer needs to internally convert them.
And computers do this lightning fast. They can perform several billion instructions per second.
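For a single digit that conversion really is just one subtraction; a minimal C sketch:

```c
#include <stdio.h>

int main(void) {
    char c = '4';       /* stored as 00110100 (decimal 52) */
    int  n = c - 48;    /* subtract 48, i.e. c - '0' */
    printf("%d\n", n);  /* prints 4 */
    return 0;
}
```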
The operating system is told that the "4" key has been pressed. The operating system (keyboard driver) then must determine what to do with the keystroke. If it is part of an operating system meta code (for example ctrl-alt-delete) then the operating system will execute that portion of code, otherwise it will pass the keystroke on to the active program.
Generally the program will receive the keystroke as an ASCII (or Unicode, if you prefer) character. It just so happens that the ASCII code for "4" is 0x34 (or 0b00110100 in binary). If your program wants to treat this as a number, then all it does is subtract 0x30 from 0x34, leaving 0x04 (or 0b100 in binary). If the program receives a 4 and a 1 as keystrokes, then it will perform the following:
(0x34 - 0x30)*0xA + (0x31 - 0x30) = (0x04)*0xA + 0x01 = 0x28 + 0x01 = 0x29 = 41 in decimal
This conversion is frequently done via a library function call. When displaying a number on the screen, a different function call is used to do the reverse and convert your binary number into a sequence of ASCII digits.
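For completeness, here's a rough C sketch of that reverse direction (binary number back to ASCII digits); in practice you'd just call printf or a similar library routine:

```c
#include <stdio.h>

/* Turn a binary number into ASCII digits by repeatedly
   dividing by ten. */
void print_decimal(unsigned int n) {
    char buf[12];
    int i = 0;
    do {
        buf[i++] = '0' + (n % 10);  /* lowest digit -> ASCII, i.e. add 0x30 */
        n /= 10;
    } while (n > 0);
    while (i > 0)
        putchar(buf[--i]);          /* digits come out backwards */
    putchar('\n');
}

int main(void) {
    print_decimal(41);  /* 0x29 -> the characters '4', '1' */
    return 0;
}
```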
TL/DR-
When the computer is doing math, it does so in binary. When you are entering numbers or reading the results, the computer uses ASCII. Simple software algorithms (generally from a prebuilt library) are used to convert between binary and ASCII.
FreeXFall t1_j6h2fno wrote
A computer processor can only give two signals - on (or 1) or off (or 0). Everything is binary on a computer when you remove all the layers.
I don’t know how it works beyond that, but it’s not just decimal numbers - it’s everything. Colors, videos, programs, etc. - everything is just 0s and 1s. The only thing a processor can do is switch on and off: yes/no electricity.