Teach Yourself PIC Microcontrollers | www.electronicspk.com | 14
the suffix D. Commonly, if no suffix is used, the number is assumed to be decimal.
Bit
Theory says a bit is the basic unit of information... Let us set that dry explanation aside for a moment and look at what a bit is in practice. The answer is nothing special: a bit is a binary digit. Just as in the decimal number system, where the digits of a number do not all have the same value (for example, the digits of the number 444 are the same, but carry different weights), the "significance" of a bit depends on its position within the binary number. Therefore, there is no point in talking about units, tens, and so on. Instead, we speak of the zero bit (the rightmost bit), the first bit (the second from the right), and so on. In addition, since the binary system uses only two digits (0 and 1), the value of one bit can be either 0 or 1.
Do not be confused if you come across a bit said to have the value 4, 16 or 64. This simply means that the bit's value is being expressed in the decimal system. We have become so accustomed to using decimal numbers that such expressions are common. Strictly speaking, one should say, for example, "the value of the sixth bit in a binary number is equivalent to the decimal number 64". But we are all only human, and habit takes its course... Besides, how would it sound to say "number: one-one-zero-one-zero..."?
Byte
A byte, or program word, consists of eight bits placed next to each other. If a bit is a digit, it is logical that bytes represent numbers. All mathematical operations can be performed on them, just as on common decimal numbers. As is the case with the digits of any other number, the digits of a byte do not all have the same significance. The left-most bit carries the largest value and is called the most significant bit (MSB). The right-most bit carries the least value and is therefore called the least significant bit (LSB). Since the eight zeros and ones of one byte can be combined in 256 different ways, the largest decimal number that can be represented by one byte is 255 (one combination represents zero).
Concerning terminology used in computer science, the concept of a nibble should also be clarified. Somewhere along the way, this term came to refer to half a byte. Depending on which half of the byte we are talking about (left or right), we speak of the "high" and "low" nibbles.