
In a nutshell

Computers can only understand binary (1s and 0s). ASCII (American Standard Code for Information Interchange) is a character encoding standard that maps a set of 128 characters to 7-bit binary numbers.

ASCII uses seven bits per character, which means it can only represent the numbers 0-127 (1111111 in binary is 127) and therefore a maximum of 128 characters.
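To make the mapping concrete, here's a minimal Python sketch (using only the built-in ord and chr functions) that prints a few characters alongside their ASCII numbers and 7-bit binary forms:

```python
# Map characters to their ASCII numbers and back, using only builtins.
for char in "Hi!":
    code = ord(char)  # character -> number (0-127 for ASCII characters)
    print(char, code, format(code, "07b"))  # e.g. H 72 1001000

print(chr(65))  # number -> character: 65 maps back to 'A'
```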

Here’s the ASCII table, which shows the mapping between the numbers 0-127 and their corresponding characters.

[Image: ASCII table]
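If the table isn't to hand, you can reproduce its printable portion yourself. This short Python sketch prints the codes 32-126 (the printable ASCII range); codes 0-31 and 127 are non-printing control characters:

```python
# Print the printable ASCII range. Codes 0-31 and 127 are control
# characters (e.g. 10 is newline, 127 is DEL), so they're skipped here.
for code in range(32, 127):
    print(f"{code:3d} -> {chr(code)!r}")
```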

The limitations of ASCII

Due to its limited 7-bit capacity, ASCII covers only a tiny fraction of the world's characters. For example, it can't represent the characters of Greek, Hebrew, Arabic, Chinese, Russian, and countless other languages and symbol systems. For this reason, Unicode, whose encodings (UTF-8, UTF-16, and UTF-32) use 8, 16, or 32 bits per code unit, is a far more widely used standard.
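To see the difference in practice, here's a small Python sketch. It encodes a few characters as UTF-8, the most common Unicode encoding, which uses a single byte for ASCII characters and two or more bytes for everything else; it also shows that encoding a non-ASCII character as ASCII fails outright:

```python
# UTF-8 stores ASCII characters in one byte; other characters need more.
for char in ["A", "é", "Ω", "中"]:
    encoded = char.encode("utf-8")
    print(char, len(encoded), "byte(s):", encoded.hex())

# A character outside ASCII's 128-character range can't be ASCII-encoded:
try:
    "é".encode("ascii")
except UnicodeEncodeError as err:
    print(err)
```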

