Sunday, July 10, 2011

ones and zeroes

The ascension of the computer constitutes the crudest proof of the methodology by which the "or" uses the "and" to create complexity. For all computer language, in its most basic form, consists of zeroes and ones. Eight bits (eight zeroes and ones) make one byte, and the number of combinations of 0s and 1s that can make up a byte is two raised to the power of eight, or 256. That is more than enough combinations to represent the letters of the alphabet and the ten digits (0 to 9) upon which the Western numerical system is based. Thus, from these zeroes and ones we can construct letters, and from these letters we can construct words, and from these words we can construct sentences, and so on. All from combinations of zeroes and ones. All from the repetition of zero and one. The "and".
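To make that arithmetic concrete, here is a small Python sketch (my own addition, using the standard ASCII assignment of bit patterns to letters purely as an illustration):

```python
# Eight bits give 2**8 = 256 possible patterns -- plenty for letters and digits.
print(2 ** 8)  # 256

word = "AND"
for ch in word:
    # each letter of the word shown as its eight-bit ASCII pattern
    print(ch, format(ord(ch), "08b"))
# A 01000001
# N 01001110
# D 01000100
```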

Of course, the "and" is given a little boost from the "or" in this process, for we start from two different entities, zero and one. We could conceivably base a language on just one entity and repetition: one "1" could constitute the letter "A", two "1"s the letter "B", and so on. But that scheme would be less efficient, taking up far more space. Something would also be needed to separate each letter from the next, and that separator, it would seem, would have to be a zero. So it is hard to get around the need for at least two entities.
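A rough sketch of that comparison, assuming a hypothetical tally code of my own devising (one "1" for A, two for B, and so on, with a zero as separator) set against an equally hypothetical fixed five-bit code:

```python
def unary(word):
    # tally code: 'A' -> "1", 'B' -> "11", ..., letters separated by a "0"
    return "0".join("1" * (ord(ch) - ord("A") + 1) for ch in word.upper())

def binary(word):
    # fixed-width code: five bits suffice for 26 letters, since 2**5 = 32 >= 26
    return "".join(format(ord(ch) - ord("A"), "05b") for ch in word.upper())

print(len(unary("ZOO")), len(binary("ZOO")))  # 58 symbols versus 15
```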

Of course, even before the advent of computers, the "or" made use of the "and". There are only 26 letters in our alphabet, but by combining them in various ways we have constructed thousands of words, and from them sentences, verse, books. The number of possible combinations is endless. Similarly, with only ten digits we can construct the symbols for an infinite number of numbers.
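A quick illustration of how fast those combinations pile up; this counts every possible letter string of a given length, meaningful or not, which is my own framing rather than a count of actual words:

```python
# number of possible strings of length n drawn from a 26-letter alphabet
for n in range(1, 7):
    print(n, 26 ** n)
# 1 26
# 2 676
# 3 17576
# 4 456976
# 5 11881376
# 6 308915776
```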

A thought: the use of zeroes and ones seems to require a good deal of space, but that is because electrical switches, at least in the non-quantum world, have only two states, on and off. In the quantum world, things do not have such easily definable values. We cannot know both the position and the momentum of an electron exactly at the same time (Heisenberg's uncertainty principle; unfortunately, Heisenberg worked on the Nazis' atomic program). Values, Heisenberg's included, become fuzzier in the quantum world. I don't know how many distinct values a quantum switch could hold. But if there were, say, 26 of them, we could take advantage of that, and we would not need such long strings, and sets of strings, of zeroes and ones. Can we take advantage of the quantum world to create vastly more efficient computerized systems? I'm sure that some are working on it.
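The space argument can be put in ordinary, non-quantum arithmetic: a number written with b symbols needs roughly log-base-b of N digits, so a hypothetical 26-state switch would need much shorter strings than a 2-state one. The sketch below only illustrates that arithmetic; it says nothing about how an actual quantum computer works.

```python
import math

N = 1_000_000
print(math.ceil(math.log(N, 2)))   # about 20 digits needed in base 2
print(math.ceil(math.log(N, 26)))  # about 5 digits needed in base 26
```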
