Boolean algebra deals with variables that take only two possible values: true or false, often represented as \(1\) and \(0\). This mathematical system is fundamental to modern digital computers, which operate using binary logic.
Consider the statement: “I will buy a car if I get a salary increase or I win the lottery.” The decision to buy a car depends on two propositions, each of which can be either true or false.
George Boole introduced a symbolic way to represent logical relationships: \(1\) for true, \(0\) for false, and the symbol \(+\) for the logical OR. Let \(X\) = “get a salary increase”, \(Y\) = “win the lottery”, and \(F\) = “buy a car”.
The logical relationship can be written as:
\[ F = X + Y \]
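Because each proposition has only two possible values, the OR relationship can be tabulated exhaustively. Below is a minimal Python sketch (the names X, Y, F mirror the example above, with 1 standing for true and 0 for false):

```python
# Truth table for OR: F = X + Y  (1 = true, 0 = false)
for X in (0, 1):
    for Y in (0, 1):
        F = X | Y  # bitwise OR on 0/1 values behaves as logical OR
        print(f"X={X}  Y={Y}  F={F}")
```

F is 0 only when both X and Y are 0; any other combination makes F equal to 1.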
Now consider the statement: “I will be able to read e-books online if I buy a computer and get an internet connection.” Here, the outcome depends on two conditions being true simultaneously.
Using Boolean notation, with \(X\) = “buy a computer”, \(Y\) = “get an internet connection”, and \(F\) = “read e-books”, the AND operation is written as:
\[ F = X \cdot Y \]
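The same kind of table can be generated for AND. This sketch uses the same 0/1 encoding as before:

```python
# Truth table for AND: F = X . Y  (1 = true, 0 = false)
for X in (0, 1):
    for Y in (0, 1):
        F = X & Y  # bitwise AND on 0/1 values behaves as logical AND
        print(f"X={X}  Y={Y}  F={F}")
```

Here F is 1 only in the single case where both X and Y are 1.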
The third fundamental Boolean operator is the NOT operator, which reverses the value of a proposition. If \(X\) is true, then \(X'\) is false, and vice versa.
\[ X' = \text{NOT } X \]
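On the 0/1 encoding, NOT can be computed simply by subtracting from 1, as in this small sketch:

```python
# NOT reverses a 0/1 value: X' = 1 - X
for X in (0, 1):
    print(f"X={X}  X'={1 - X}")
```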
Using these three basic operators, more complex Boolean expressions can be formed. For example:
\[ F = X \cdot (Y + Z) \]
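A short sketch that evaluates this compound expression for every combination of inputs (the helper name evaluate is chosen here purely for illustration):

```python
from itertools import product

def evaluate(X, Y, Z):
    # F = X . (Y + Z): X ANDed with the OR of Y and Z
    return X & (Y | Z)

for X, Y, Z in product((0, 1), repeat=3):
    print(f"X={X} Y={Y} Z={Z}  F={evaluate(X, Y, Z)}")
```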
These operators obey a number of algebraic laws; the most important are listed below.

Commutative Laws
\[ X + Y = Y + X \]
\[ X \cdot Y = Y \cdot X \]
Associative Laws
\[ X + (Y + Z) = (X + Y) + Z \]
\[ X \cdot (Y \cdot Z) = (X \cdot Y) \cdot Z \]
Distributive Law
\[ X \cdot (Y + Z) = X \cdot Y + X \cdot Z \]
De Morgan’s Theorems
\[ (X + Y)' = X' \cdot Y' \]
\[ (X \cdot Y)' = X' + Y' \]
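Since every variable ranges over only two values, each of these identities can be verified by brute force. A minimal Python sketch checking the distributive law and both De Morgan theorems over all 0/1 assignments (the helper NOT is defined here purely for illustration):

```python
from itertools import product

def NOT(a):
    return 1 - a  # X' on 0/1 values

for X, Y, Z in product((0, 1), repeat=3):
    # Distributive law: X . (Y + Z) = X . Y + X . Z
    assert (X & (Y | Z)) == ((X & Y) | (X & Z))
    # De Morgan: (X + Y)' = X' . Y'  and  (X . Y)' = X' + Y'
    assert NOT(X | Y) == (NOT(X) & NOT(Y))
    assert NOT(X & Y) == (NOT(X) | NOT(Y))

print("All identities hold for every 0/1 assignment.")
```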