- How does a computer know the difference between a letter and a number?
- How does a computer know the difference between instructions and data?
- How does the computer know you have written a string?
- Do computers understand letters?
How does a computer know the difference between a letter and a number?
How does the computer know whether the 01000001 in a byte of memory is the number 65 or the letter A? It doesn't, from the byte alone: an application program keeps track of what it put where in memory. MS Word, for example, knows that a given byte in a region where it has stored text holds a number that represents a letter.
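A minimal sketch in Python of the point above: the same 8-bit pattern 01000001 yields 65 or 'A' depending purely on how the program chooses to interpret it.

```python
# The same bit pattern 01000001 stored in a byte of memory; nothing
# in the byte itself records whether it is a number or a letter.
byte_value = 0b01000001

as_number = byte_value       # interpreted as an integer: 65
as_letter = chr(byte_value)  # interpreted as an ASCII/Unicode character: 'A'

print(as_number)  # 65
print(as_letter)  # A
```

The interpretation lives in the code that reads the byte, not in the byte.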
How does a computer know the difference between instructions and data?
Computers can't tell the difference between instructions and data in memory. The only clue is memory-page protection: pages containing instructions have the execute bit set. If you take some data, set the execute bit on its page, and then jump into it, the computer will try to execute it as instructions.
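A toy illustration of this idea (a made-up two-opcode machine, not a real CPU): the very same list of bytes can be run as a program or summed as plain data, and nothing in the bytes themselves decides which.

```python
# Toy opcodes for a tiny stack machine (invented for illustration):
#   1 = push the following byte onto the stack, 0 = halt.
program = [1, 5, 1, 3, 0]

def execute(memory):
    """Treat the bytes as instructions for the toy machine."""
    stack, pc = [], 0
    while memory[pc] != 0:           # opcode 0 halts
        if memory[pc] == 1:          # opcode 1 pushes the next byte
            stack.append(memory[pc + 1])
            pc += 2
    return stack

print(execute(program))  # run as instructions -> [5, 3]
print(sum(program))      # read as data        -> 10
```

The bytes are identical in both cases; only the program's use of them differs.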
How does the computer know you have written a string?
How does a computer know whether something is a number, a string, or a letter? It knows because you tell it. When you write a program, you declare variable types and data structures, and the program's instruction sequence tells the computer how to interpret the bit patterns at different places in memory.
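To make this concrete, here is a small Python sketch using the standard `struct` module: the same four bytes come out as an integer, a floating-point number, or text, depending entirely on the type we declare when reading them.

```python
import struct

raw = bytes([0x41, 0x42, 0x43, 0x44])  # four bytes: 41 42 43 44 (hex)

as_int   = struct.unpack('<I', raw)[0]  # as a little-endian unsigned int
as_float = struct.unpack('<f', raw)[0]  # as a 32-bit float
as_text  = raw.decode('ascii')          # as four ASCII characters

print(as_int)    # 1145258561
print(as_text)   # ABCD
print(as_float)  # roughly 781.035
```

Declared types in a program play exactly this role: they tell the machine which interpretation to apply to a given region of memory.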
Do computers understand letters?
Not in any deep sense. Every letter has a unique Unicode/ASCII value, which is then converted into binary so the computer can store and process it.
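The mapping above can be shown in a few lines of Python: a letter becomes its code point, the code point becomes bits, and the reverse mapping turns the bits back into a letter for display.

```python
letter = 'A'

code = ord(letter)          # Unicode/ASCII code point: 65
bits = format(code, '08b')  # its binary representation: '01000001'

print(code)             # 65
print(bits)             # 01000001
print(chr(0b01000001))  # going the other way: back to 'A'
```

So the computer never handles "A" directly; it handles the number 65, stored as the bit pattern 01000001.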