
Binary logic

senorpogo

I'm reading Dawkins' "The Ancestor's Tale". In the prologue, he notes -

If we had eight (or sixteen) fingers, we'd think naturally in octal (or hexadecimal) arithmetic, binary logic would be easier to understand, and computers might have been invented much earlier.

I'm a little fuzzy on the how and why of his reasoning. Any help?
 
It is trivial to convert between bases that are powers of each other.

Converting between bases 16, 8 and 2 is trivial because they are all powers of two.

For example, 0101b is 5h and 5o. 01010101b is 55h. 010101010101b is 555h - each group of four binary digits maps directly to the corresponding hex digit. For octal it is each group of three binary digits. To go the other way: Ah is 1010b, AAh is 10101010b, etc...

There is no such quick conversion for base 2 to base 10.

5d is 101b, 55d is 110111b, 555d is 1000101011b - as you can see there is no such straightforward digit grouping.
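To make the digit-grouping concrete, here is a quick Python sketch (my own illustration, not part of the original post):

```python
# Digit-grouping conversion: each group of 4 bits is one hex digit,
# each group of 3 bits is one octal digit. No arithmetic on the
# whole number is ever needed.

def group_convert(bits: str, group_size: int) -> str:
    """Convert a binary string to base 2**group_size by grouping digits."""
    digits = "0123456789abcdef"
    pad = (-len(bits)) % group_size        # left-pad to a whole number of groups
    bits = "0" * pad + bits
    return "".join(digits[int(bits[i:i + group_size], 2)]
                   for i in range(0, len(bits), group_size))

print(group_convert("010101010101", 4))  # -> '555'  (hex)
print(group_convert("010101010101", 3))  # -> '2525' (octal)

# Decimal has no such per-group shortcut; you have to do real arithmetic:
print(int("010101010101", 2))            # -> 1365
```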
 

Got it.

Base 8 or 16 easily converts to base 2.
Base 10 does not.
Since computers rely on binary logic, we might have invented them earlier if our arithmetic had been based on a system that converts easily, like base 8 or 16, rather than on awkward base 10.
 
I can go along with Dawkins on the arithmetic radix bit, but I fail to see how any positive integral number of fingers facilitates understanding binary logic. As long as you've got some fingers you can use any one to represent a binary logic state.

His octal assertion probably considers counting on our fingers to have motivated our decimal preference.

His hexadecimal assertion sounds superficially reasonable, but it doesn't address why we never ever (well, hardly ever) used a radix of 20. I'm guessing numbers were in widespread use long before shoes, but 20 is a few too many for various casual mental processes. Since 16 isn't that much smaller, we might have stuck to octal for the same reason.

Of course, one handy thing about hexadecimal is that not only is 16 a power of two, it's a power of a power of two; in fact, it's two to the power of two two times. That just seems two handy two ignore.
 
I can go along with Dawkins on the arithmetic radix bit, but I fail to see how any positive integral number of fingers facilitates understanding binary logic. As long as you've got some fingers you can use any one to represent a binary logic state.

That is true, but since not a single known culture on this planet has ever developed a hand-counting system that treats each finger as a separate power of two, I think the point stands.

Interesting non-10 radix systems include the Egyptian base 12 - counting with the segments of the fingers (3*4) - and the Babylonian base 60. And of course the screwed-up Roman system, which doesn't really have a base.
 
Of course, one handy thing about hexadecimal is that not only is 16 a power of two, it's a power of a power of two; in fact, it's two to the power of two two times. That just seems two handy two ignore.
Definitely. Never underestimate the power of two!
 
Got it.

Base 8 or 16 easily converts to base 2.
Base 10 does not.
Since computers rely on binary logic, we might have invented them earlier if our arithmetic had been based on a system that converts easily, like base 8 or 16, rather than on awkward base 10.

Don't be too hasty to bundle the logic and the arithmetic together. It's not *too* difficult to do decimal arithmetic with binary logic, and some early computers did just that. You can do a lot of good computing -- digital as well as analog -- without any particular reliance on binary. Look up Knuth and Babbage for starters.
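A rough Python sketch of that decimal trick -- binary-coded decimal (BCD), where each decimal digit occupies its own four-bit group -- added here purely for illustration, not as a model of any particular machine:

```python
# BCD addition with binary logic: add two digits as plain binary,
# then correct. If the 4-bit sum exceeds 9, adding 6 skips the six
# unused codes (10-15) and pushes the carry into the next nibble.

def bcd_add_digit(a: int, b: int, carry_in: int = 0):
    s = a + b + carry_in      # ordinary binary addition
    if s > 9:                 # not a valid decimal digit...
        s += 6                # ...apply the decimal-carry correction
    return s & 0xF, s >> 4    # (digit, carry_out)

digit, carry = bcd_add_digit(7, 5)
print(digit, carry)  # -> 2 1, i.e. 7 + 5 = 12 in decimal
```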

Methinks it would be more defensible to say that binary logic is sufficiently powerful for computation and representation of information, and sufficiently straightforward to implement reliably and efficiently in hardware. Together that made it the "logical" choice for computer development.
 
I'd rather we had 12 fingers than 8 or 16.

Counting in base 12 is far more practical than counting in base 10 or a power of two because it is easier to take the most common fractions. If you count in base 12, it is trivial to divide by 2, 3, 4, and 6. Base 10 is easy to divide by 2 and 5, and you can do 4 without too much difficulty. But dividing into thirds comes up far more often in real life than dividing into fifths.

Despite the fact that our number system is base 10, this keeps on getting noticed. That's why our units of time are multiples of 12, angles are measured in multiples of 12, things are often sold by the dozen or the gross, and measuring systems that emerge out of daily life (the Imperial one is the only one that survives) tend to have 12s in lots of useful places.

If we had 12 fingers, then we could have a measuring system that was easy to calculate with (like the metric system) and made it easy to cut quantities into thirds.
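A small Python sketch (my illustration, not Ben's) of why thirds are painless in base 12:

```python
# Expand a fraction digit by digit in a given base. 1/3 repeats
# forever in decimal but terminates immediately in duodecimal.

def expand(num: int, den: int, base: int, places: int = 8) -> str:
    digits = []
    rem = num % den
    for _ in range(places):
        rem *= base
        digits.append("0123456789AB"[rem // den])
        rem %= den
        if rem == 0:
            break
    return "0." + "".join(digits)

print(expand(1, 3, 10))  # -> 0.33333333 (and on forever)
print(expand(1, 3, 12))  # -> 0.4        (exact: 4/12 == 1/3)
```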

See http://www.polar.sunynassau.edu/~dozenal/articles.html for more.

Cheers,
Ben
 
Geez, I had a response 90% complete but inadvertently closed the browser when some yayhoo wandered into my office with the same old song and dance: "Help me with this", "Do your job", "Wake up", "Put your pants back on". Working for such an old-fashioned company can get sooo tiresome.

That is true, but since not a single known culture on this planet has ever developed a hand-counting system that treats each finger as a separate power of two, I think the point stands.

That's not the point I was contesting. Counting and representing numbers is but one application of binary logic. The key feature of binary logic is not the representation of numbers but the representation of a single bit (pun intended) of information as a choice of exactly one of exactly two permitted states. The overused "zero/one" labeling of the states leads too many people to equate binary logic with binary numbers; try changing the names to, say, "mark/space", "dot/dash", "extended/flexed".
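A toy Python sketch (purely illustrative, my addition) of two-state logic that never mentions a number:

```python
# Binary logic without binary numbers: two arbitrary states and a
# NAND gate. Nothing below depends on calling the states 0 and 1.
from enum import Enum

class Line(Enum):
    MARK = "mark"
    SPACE = "space"

def nand(a: Line, b: Line) -> Line:
    # NAND is SPACE only when both inputs are MARK. NAND is universal,
    # so any logic circuit can be built from this one gate.
    return Line.SPACE if (a is Line.MARK and b is Line.MARK) else Line.MARK

def invert(a: Line) -> Line:
    return nand(a, a)        # NOT built from NAND -- still no numbers

print(invert(Line.MARK))   # -> Line.SPACE
print(invert(Line.SPACE))  # -> Line.MARK
```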

Interesting non-10 radix systems include the Egyptian base 12 - counting with the segments of the fingers (3*4) - and the Babylonian base 60. And of course the screwed-up Roman system, which doesn't really have a base.

I had a marvelously eloquent diatribe for this too, but Ben hit most of the high spots:

Counting in base 12 is far more practical than counting in base 10 or a power of two because it is easier to take the most common fractions. If you count in base 12, it is trivial to divide by 2, 3, 4, and 6. Base 10 is easy to divide by 2 and 5, and you can do 4 without too much difficulty. But dividing into thirds comes up far more often in real life than dividing into fifths.

Decimal is no worse than duodecimal (12) or sexagesimal (60) for addition, subtraction and multiplication, but its limited suite of factors makes decimal less handy for many common divisions. That's probably why we cling to sexagesimal and duodecimal for angles and time, both of which we have frequent need to subdivide into smallish integral parts and relatively little need to multiply arbitrarily.
 
I'm not entirely convinced that having a convenient base system has much to do with computing. Computers are based on the Turing machine, not the binary system. They are built on the binary system, but the concepts for inventing a computing machine come from the Turing machine, and I don't get how a convenient base system would hasten the idea of a Turing machine.
 

I'm pretty sure the idea of a Turing machine had nothing at all to do with actually inventing computers.
 
Regarding Dawkins' quote
If we had eight (or sixteen) fingers, we'd think naturally in octal (or hexadecimal) arithmetic, binary logic would be easier to understand, and computers might have been invented much earlier.
The abacus and early computers used the bi-quinary system. The first computer I programmed was the IBM-650, a bi-quinary machine. Very analogous to two hands of five fingers each. Binary machines came later.
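Roughly, bi-quinary splits each decimal digit into a "bi" part worth five and a "quinary" part worth zero to four -- a full hand plus leftover fingers. A Python sketch of the encoding (my illustration; the IBM 650's actual wiring details differ):

```python
# Bi-quinary: each decimal digit d is stored as d = 5*b + q,
# with b in {0, 1} (the "bi" part) and q in {0..4} (the "quinary"
# part) -- like one full hand of five plus the leftover fingers.

def to_biquinary(d: int):
    assert 0 <= d <= 9
    return d // 5, d % 5           # (bi, quinary)

def from_biquinary(b: int, q: int) -> int:
    return 5 * b + q

for d in range(10):                # round-trips for every digit
    assert from_biquinary(*to_biquinary(d)) == d
print(to_biquinary(7))  # -> (1, 2): one full hand plus two fingers
```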

Counting in base 12 is far more practical than counting in base 10 or a power of two because it is easier to take the most common fractions. If you count in base 12, it is trivial to divide by 2, 3, 4, and 6. Base 10 is easy to divide by 2 and 5, and you can do 4 without too much difficulty. But dividing into thirds comes up far more often in real life than dividing into fifths.
Having many divisors is convenient for many purposes. Consider the acre, which equals 43560 square feet, an odd-sounding number. However, 43560 is evenly divisible by 2, 3, 4, 5, 6, 8, 9, 10, 11, and 12, which is quite convenient when dividing land into rectangles having a whole number of feet on each side.

Can't do that with hectares, but with pocket calculators and computers, it's not really inconvenient to use several decimal places in land boundaries any more.
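A quick Python check of that divisibility claim (my addition):

```python
# 43560 square feet per acre: divisible by everything from 2 to 12
# except 7.
print([n for n in range(2, 13) if 43560 % n == 0])
# -> [2, 3, 4, 5, 6, 8, 9, 10, 11, 12]
```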
 
His hexadecimal assertion sounds superficially reasonable, but it doesn't address why we never ever (well, hardly ever) used a radix of 20.

Because your statement is simply not true. Radix-twenty counting systems are quite common -- for example, most of the Mesoamerican languages, such as Maya and Nahuatl, use a base 20 system.

Part of the problem (from my admittedly superficial reading of the subject) is that there appears to be a link between radix choice and climate -- in general, the warmer the climate, the larger the radix. Radix 20 systems are common in tropical or near-tropical conditions, while radix 10 is common in temperate climates, and when you get further north -- for example, Greenlandic -- you get down to radix 5. (Those, by the way, seem to be the three main choices. I've never seen a single radix-17 language and I don't expect to see one.) But since there also seems to be a correlation between climate and technological development (most "civilizations" are temperate-climate), that means that most technological societies developed from a radix-10 substrate.
 
Part of the problem (from my admittedly superficial reading of the subject) is that there appears to be a link between radix choice and climate -- in general, the warmer the climate, the larger the radix. Radix 20 systems are common in tropical or near-tropical conditions, while radix 10 is common in temperate climates
In hot climates they go barefoot so they count on their toes as well...
and when you get further north -- for example, Greenlandic -- you get down to radix 5.
They wear mittens and it's so cold they prefer to keep at least one on (this is getting speculative now).
 
Regarding Dawkins' quote
The abacus and early computers used the bi-quinary system. The first computer I programmed was the IBM-650, a bi-quinary machine. Very analogous to two hands of five fingers each. Binary machines came later.

Something I couldn't help but notice from the Wiki article was that it still used binary: it had a bunch of switches that were either on (1) or off (0) and just accepted input in a slightly different format than a 'straight' binary machine.
 
I'm reading Dawkins' "The Ancestor's Tale". In the prologue, he notes -

If we had eight (or sixteen) fingers, we'd think naturally in octal (or hexadecimal) arithmetic, binary logic would be easier to understand, and computers might have been invented much earlier.

I'm a little fuzzy on the how and why of his reasoning. Any help?
I'm a fan of Richard Dawkins generally, but I'm fairly sure that, in this instance, he is talking male genitalia.

Modern digital computers use base 2 because they are electronic circuits, and with electronic circuits base 2 is a convenient base to use - any number can be represented by a series of ons and offs, where on is circuit closed and off is circuit open.

The idea of using binary or octal or hex is a consequence of the technology we use to implement digital computers, not the other way around.
 

Yes, but the question Dawkins poses is: if we were already used to thinking in octal, would thinking in binary have happened sooner (as the two are so closely related), thus leading us to build computers (which are easier to design when thinking in binary) sooner?
 
I suspect we might be reading too much into a tongue-in-cheek comment on RD's part.
 
