bit / byte



The word bit is short for binary digit, where a binary digit is a numeric symbol that can take on one of two values, usually 0 or 1. You are probably more familiar with decimal digits, that is, numeric symbols that can take on one of ten values: 0, 1, 2, 3, 4, 5, 6, 7, 8, or 9. The use of decimal digits seems natural to us because we begin our experience of counting with ten fingers. Within a computer, however, the natural system of counting is based on two options: switches that are either on or off. You can think of it as an alphabet with only two characters. The table below indicates how you would count from 0 to 7 using bits.


        
Decimal number:      0     1     2     3     4     5     6     7
Binary number:       0     1    10    11   100   101   110   111
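
To see the same correspondence in code, here is a minimal Python sketch (an illustration added alongside this entry, not part of the original table) that prints each decimal number from 0 to 7 next to its binary form:

```python
# Count from 0 to 7, showing each number in decimal and in binary.
# Python's "b" format specifier renders an integer as a string of bits.
for n in range(8):
    print(f"Decimal: {n}   Binary: {n:>3b}")
```

Running it shows a third bit appearing once the count passes 3, which is exactly the observation made in the next paragraph.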

Notice that the higher you count, the more adjacent bits you need to represent the number. For example, it requires two adjacent bits to count from 0 to 3, and three adjacent bits to count from 0 to 7.

This has important implications for coding and transmitting information. Suppose that you wish to assign a numeric code to each character on a typical computer keyboard and then you wish to transmit this information in bits. A typical keyboard has 101 keys. When you include the option of simultaneously pressing the shift key, a rough estimate tells you that you need to be able to count from 0 to 201. The minimum number of adjacent bits required to count this high is 8: 7 adjacent bits would allow you to count from 0 to 127 only, whereas 8 adjacent bits allow you to count from 0 to 255. A sequence of 8 bits is so important that it has a name of its own: a byte. A handy mnemonic for byte is binary digit eight. Thus, it requires one byte of information (or 8 bits) to transmit one keystroke on a typical keyboard. It requires three bytes of information (or 24 bits) to transmit the three-letter word "the".
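
The arithmetic above can be checked with a short Python sketch. The keyboard figures (202 codes, one byte per keystroke) are taken from the text; the helper name bits_needed is just an illustrative label, not standard terminology:

```python
import math

def bits_needed(n_codes: int) -> int:
    """Smallest number of bits b such that 2**b >= n_codes."""
    return math.ceil(math.log2(n_codes))

# 202 keyboard codes (0..201) need 8 bits; 7 bits only reach 0..127.
print(bits_needed(202))   # 8
print(bits_needed(128))   # 7

# In a one-byte-per-character code, the word "the" occupies 3 bytes = 24 bits.
word_bytes = "the".encode("ascii")
print(len(word_bytes), "bytes,", 8 * len(word_bytes), "bits")
```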
