Computers represent numbers using different bases. Below is a quick explanation of number bases, followed by an interactive decimal converter.
Any non-negative integer \(N\) written in base \(B\) can be expressed as:
\[ N = a_0 + a_1 B^1 + a_2 B^2 + \cdots + a_k B^k \]

where each coefficient satisfies:
\[ 0 \le a_i < B \]

The most common number bases in computing are binary (base 2), octal (base 8), decimal (base 10), and hexadecimal (base 16).
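The positional formula above can be evaluated directly in code. The sketch below (the `from_digits` name is my own, not from the original) sums each digit \(a_i\) times \(B^i\), taking the digits least significant first so the index matches the exponent:

```python
def from_digits(digits, base):
    """Evaluate a number from its digits in the given base.

    `digits` lists the coefficients a_0, a_1, ..., a_k
    (least significant first), each satisfying 0 <= a_i < base.
    """
    value = 0
    for i, a in enumerate(digits):
        assert 0 <= a < base, "each digit must satisfy 0 <= a_i < base"
        value += a * base ** i
    return value

# Binary 1101, written least-significant digit first: [1, 0, 1, 1]
print(from_digits([1, 0, 1, 1], 2))  # 13
```

Writing the digits least significant first keeps the code a literal transcription of the formula, with no need to reverse indices.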
Hexadecimal digits are \(0\)–\(9\) and \(A\)–\(F\), where \(A=10\), \(B=11\), \(C=12\), \(D=13\), \(E=14\), \(F=15\).
For example:

\[ \mathrm{A67B}_{16} = 11 + 7 \times 16^1 + 6 \times 16^2 + 10 \times 16^3 = 42619 \]

Enter a non-negative integer (base 10):
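The hexadecimal expansion above can be checked in a couple of lines: summing each digit of \(A67B\) times its power of 16 (rightmost digit first) matches what Python's built-in `int` with an explicit base returns:

```python
# Map each hex digit of A67B to its value.
hex_value = {"A": 10, "B": 11, "6": 6, "7": 7}

# Sum digit * 16**i, scanning from the rightmost (least significant) digit.
value = sum(hex_value[d] * 16 ** i for i, d in enumerate("A67B"[::-1]))
print(value)            # 42619
print(int("A67B", 16))  # 42619, the built-in conversion agrees
```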
| Base | Representation |
| --- | --- |
| Binary (Base 2) | |
| Octal (Base 8) | |
| Hexadecimal (Base 16) | |
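The converter behind the table can be sketched with the standard repeated-division algorithm: divide by the target base, collect remainders as digits, and reverse them at the end (the `to_base` helper is my own naming, not part of the original page):

```python
DIGITS = "0123456789ABCDEF"

def to_base(n, base):
    """Convert a non-negative integer to a string in the given base
    (2 <= base <= 16) by repeated division, collecting remainders."""
    if n == 0:
        return "0"
    out = []
    while n > 0:
        n, r = divmod(n, base)   # r is the next digit, least significant first
        out.append(DIGITS[r])
    return "".join(reversed(out))

n = 42619
print(to_base(n, 2))   # binary representation
print(to_base(n, 8))   # octal representation
print(to_base(n, 16))  # A67B
```

Each remainder is exactly one coefficient \(a_i\) from the positional formula, which is why the digits come out least significant first and must be reversed.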