Submitted by R0oty t3_10ovaci in explainlikeimfive
Any-Growth8158 t1_j6jhg4u wrote
The operating system is told that the "4" key has been pressed. The operating system (keyboard driver) then must determine what to do with the keystroke. If it is part of an operating-system shortcut (for example Ctrl-Alt-Delete), the operating system will execute that portion of code; otherwise it will pass the keystroke on to the active program.
Generally the program will receive the keystroke as an ASCII (or Unicode, if you prefer) character. It just so happens that the ASCII code for "4" is 0x34 (or 0b00110100 in binary). If your program wants to treat this as a number, then all it does is subtract 0x30 from 0x34, leaving 0x04 (or 0b100 in binary). If the program receives a 4 and a 1 as keystrokes, then it will perform the following:
(0x34 - 0x30)*0xA + (0x31 - 0x30) = (0x04)*0xA + 0x01 = 0x28 + 0x01 = 0x29 = 41 in decimal
This conversion is frequently done via a library function call. When displaying a number on the screen, a different function call is used to do the reverse and convert your binary number into a sequence of ASCII digits.
TL/DR-
When the computer is doing math, it does so in binary. When you are entering data or reading the results, the computer uses ASCII. Simple software algorithms (generally in a prebuilt library) are used to convert between binary and ASCII.