What is a Bit?
A bit is a discrete "piece" of information. It is a physical entity.
Consider a coin. It has two information states: Heads or Tails.
So a sequence of bits can represent a series of successive coin tosses.
A mechanism that produces a series of bits, like a box that flips coins, is sometimes called an
information source. Even if this information source produces "noise"
(e.g. random bits of information), it is still generating physical
information.
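As a minimal sketch (in Python, with made-up names), such an information source can be modeled as something that emits a stream of random bits:

```python
import random

def coin_flip_source(n):
    """Emit n bits from a fair-coin information source (1 = Heads, 0 = Tails)."""
    return [random.randint(0, 1) for _ in range(n)]

# A sequence of 8 coin tosses represented as 8 bits, e.g. [1, 0, 0, 1, 1, 0, 1, 0]
print(coin_flip_source(8))
```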
Consider a coin that is flipped twice:
What are the possible outcomes?
- HH
- HT
- TH
- TT
Four equally likely outcomes can be represented by two bits, each bit carrying one piece of information (H or T).
For three coin flips there are 8 different equally likely outcomes, and three bits of information are needed to represent those 8 possible outcomes.
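To make the counting concrete, here is a small sketch (assuming Python) that enumerates every outcome of n coin flips and confirms there are 2^n of them:

```python
from itertools import product

for n in (2, 3):
    # Every outcome of n flips is a length-n sequence of H and T.
    outcomes = ["".join(seq) for seq in product("HT", repeat=n)]
    print(f"{n} flips: {outcomes} -> {len(outcomes)} outcomes")

# 2 flips: ['HH', 'HT', 'TH', 'TT'] -> 4 outcomes
# 3 flips: ['HHH', 'HHT', ..., 'TTT'] -> 8 outcomes
```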
So if something has N equally likely outcomes, it will take
b = log₂ N
bits to represent which one occurred.
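As a quick check of the formula (a sketch in Python):

```python
import math

for n_outcomes in (2, 4, 8, 16, 256):
    bits = math.log2(n_outcomes)
    print(f"{n_outcomes} equally likely outcomes need {bits:g} bits")

# Prints 1, 2, 3, 4 and 8 bits respectively.
```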
Powers of 2 table:
- 2^1 = 2
- 2^2 = 4
- 2^3 = 8
- 2^4 = 16
- 2^5 = 32
- 2^6 = 64
- 2^7 = 128
- 2^8 = 256
Now let's go back to the case of 3 bits that represent 8 equally
likely outcomes.
The next step is to have a "protocol" that maps meaning onto the bits. That is,
an assignment like the one in column A makes sense as a protocol, while the one in column B does not. This
is precisely what the ASCII standard does: it represents each character
(in the English language) by a unique pattern of bits.
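As an illustration of what such a protocol might look like for the 3-bit case (a hypothetical mapping, not ASCII itself), each of the 8 possible patterns can be assigned a fixed symbol:

```python
# A toy protocol: each of the 8 possible 3-bit patterns maps to one symbol.
# The symbol choices are hypothetical; the point is that the mapping is fixed and unambiguous.
protocol = {
    "000": "a", "001": "b", "010": "c", "011": "d",
    "100": "e", "101": "f", "110": "g", "111": "h",
}

message = ["010", "000", "001"]
print("".join(protocol[bits] for bits in message))  # prints "cab"
```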
In fact, each ASCII character is a unique pattern of 8 bits, and 8 bits make up a
BYTE. Each character then requires 1 BYTE of
memory to be stored.
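A short sketch (again in Python) showing a character's 8-bit pattern and that each character occupies one byte:

```python
text = "Hi"

for ch in text:
    code = ord(ch)                         # numeric ASCII code of the character
    print(ch, code, format(code, "08b"))   # the character as an 8-bit pattern

print(len(text.encode("ascii")), "bytes")  # one byte per character -> 2 bytes
```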
An arithmetic example:
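For example, the binary pattern 1011 converts to decimal as 1×2^3 + 0×2^2 + 1×2^1 + 1×2^0 = 8 + 0 + 2 + 1 = 11.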
A useful binary to decimal calculator
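If no calculator is at hand, the same conversion can be done in a couple of lines of Python:

```python
print(int("1011", 2))  # binary string -> decimal: 11
print(bin(11))         # decimal -> binary string: '0b1011'
```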