
Binary logic

I can tell there are some folks here who have never encountered binary coded decimal. OK, quick now, what's the 2's complement of 320 in BCD? :D
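For anyone who hasn't met BCD: each decimal digit gets its own four-bit group, and negative numbers on BCD machines were usually handled with ten's complement rather than two's complement - which is the point of the quip. A minimal Python sketch (the function names are my own):

```python
def to_bcd(n, digits):
    """Encode a non-negative integer as BCD: one 4-bit group per decimal digit."""
    return " ".join(format(int(d), "04b") for d in str(n).zfill(digits))

def tens_complement(n, digits):
    """Ten's complement: nine's complement of each digit, then add one."""
    nines = int("".join(str(9 - int(d)) for d in str(n).zfill(digits)))
    return nines + 1

print(to_bcd(320, 3))           # 0011 0010 0000
print(tens_complement(320, 3))  # 680
```

So the "2's complement of 320 in BCD" is a trick question - on a three-digit BCD machine you'd take the ten's complement, 680.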
 
Hex is pretty easy to use. 0-7 count up to 0111; eight is 1000; nine is 1001, eight plus one; and A is 1010 - you can see the mnemonic there, since A is 10 in decimal; B is 1011, also easy. C, D, and E are finger-counting unless you do it all the time; F is 1111.
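For anyone who wants the full crib sheet, the digit-at-a-time correspondence prints out in a couple of lines of Python:

```python
# Each hex digit corresponds to exactly one 4-bit pattern;
# this prints the whole table, 0 -> 0000 through F -> 1111.
for d in "0123456789ABCDEF":
    print(d, format(int(d, 16), "04b"))
```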

I seriously doubt, given the relative complexity, that counting in hex would have made inventing binary computers any easier. I'll also point out that Boole's work is far more important than arithmetic; and counting in hex doesn't help with Boolean algebra.
 
It is trivial to convert between bases that are powers of each other.
Well, easier. Not exactly trivial.

Converting between bases 16, 8 and 2 is trivial because they are all powers of two.
16 isn't a power of 8. At least, not an integer power. Granted, a rational power is easier than an irrational one, but not quite as easy as an integer power.
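To be fair, the conversions that are trivial all go through base 2: group the bits in threes for octal or fours for hex. A quick Python sketch of the hex direction (the function name is mine):

```python
def bin_to_hex(bits):
    """Convert a binary string to hex by mapping each 4-bit group to one digit."""
    # Pad on the left to a multiple of 4 bits, then translate group by group.
    bits = bits.zfill((len(bits) + 3) // 4 * 4)
    return "".join(format(int(bits[i:i + 4], 2), "X")
                   for i in range(0, len(bits), 4))

print(bin_to_hex("101101"))  # 2D
```

No arithmetic on the whole number is ever needed - which is exactly what fails for base 10.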

There is no such quick conversion for base 2 to base 10.
The algorithm isn't too complicated.

1. Set "current digit" to the left most digit.
2. Set "current total" to zero.
3. Add the current digit to the current total.
4. If the current digit is rightmost, end.
5. Set current digit to the next digit to the right.
6. Double the current total.
7. Go to step three.
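The seven steps above, rendered directly in Python (only the function name is my invention):

```python
def binary_to_decimal(bits):
    """Left-to-right doubling: add the current digit to the total,
    and if it isn't the rightmost digit, double the total and move right."""
    total = 0                       # step 2
    for i, bit in enumerate(bits):  # step 1, then step 5 on each pass
        total += int(bit)           # step 3
        if i == len(bits) - 1:      # step 4
            break
        total *= 2                  # step 6
    return total

print(binary_to_decimal("1011"))  # 11
```

This is just Horner's method; the same loop works for any base if you replace the doubling with multiplication by the radix.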

Since computers rely on binary logic, we might have invented them earlier if our arithmetic had been based on a system that converts easily, like base 8 or 16, rather than awkward base 10.
I don't see it. For that to be true, there would have to be someone who was on the cusp of inventing computers but decided not to because he didn't want to learn another base. The base issue is simply a matter of programming; the technology has little to do with it.

I can go along with Dawkins on the arithmetic radix bit, but I fail to see how any positive integral number of fingers facilitates understanding binary logic. As long as you've got some fingers you can use any one to represent a binary logic state.
Not really. Try holding up all your fingers. Now try touching your middle finger to your palm without moving any of your other fingers.

Plus, there's a conceptual leap that is needed. For base ten, in a low technology culture, most numbers will be under ten. Multi-digit numbers will be "large" numbers, and will be easily seen as several large groups. Also, you don't need a positional system; you can give a different symbol to each power of ten. For instance, unit, decade, century, millennium, myriad. Five symbols get us up to 10,000, more than most people will need. Five symbols in binary only gets us to 16. Plus, there would be something a bit perverse about having more symbols than the base.

Geez, I had a response 90% complete but inadvertently closed the browser when some yayhoo wandered into my office with the same old song and dance: "Help me with this", "Do your job", "Wake up", "Put your pants back on". Working for such an old-fashioned company can get sooo tiresome.
Yes, I've tried to make it a habit of doing ctrl-a ctrl-c after typing all of my posts.
 
I'm not convinced that the number of fingers we have has anything to do with our current use of base 10; it is simply coincidence. Many civilisations used base 12 or 16, and I'm fairly sure that these were actually used before base 10. The only reason base 10 is in use is because one of the civilisations that happened to use it became extremely powerful, and once a large portion of people were using it it was easier to go along with it. Same reason that the metric system is taking over now and English is the common international language. It doesn't mean English is better or that there is any particular reason for using it; it is simply that it was used by the British Empire and it is easier to keep using it than to make everyone switch to something else.
 
In hot climates they go barefoot so they count on their toes as well...

They wear mittens and it's so cold they prefer to keep at least one on (this is getting speculative now).


As far as I can tell, that's bang on.
 
I'm not convinced that the number of fingers we have has anything to do with our current use of base 10; it is simply coincidence. Many civilisations used base 12 or 16, and I'm fairly sure that these were actually used before base 10. The only reason base 10 is in use is because one of the civilisations that happened to use it became extremely powerful, and once a large portion of people were using it it was easier to go along with it. Same reason that the metric system is taking over now and English is the common international language. It doesn't mean English is better or that there is any particular reason for using it; it is simply that it was used by the British Empire and it is easier to keep using it than to make everyone switch to something else.
Alternative:
The folks who first thought up positional notation used base 10. Since positional notation is such a tremendously useful thing, everyone took to using it and the base 10 system that came with it.


If you don't think positional notation is a useful thing, try doing long division by hand with Roman numerals. Real mathochists (math masochists) might want to try calculating a square root that way.
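Even just reading a Roman numeral takes a little algorithm, since a smaller symbol placed before a larger one subtracts - a taste of why column-by-column arithmetic never gets off the ground. A sketch (names are mine):

```python
# Roman numerals have no positional weight: a symbol's contribution
# depends on what follows it, not on which column it sits in.
VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(s):
    total = 0
    for i, ch in enumerate(s):
        v = VALUES[ch]
        # A smaller digit before a larger one subtracts (IV = 4, XC = 90).
        if i + 1 < len(s) and VALUES[s[i + 1]] > v:
            total -= v
        else:
            total += v
    return total

print(roman_to_int("XCVII"))  # 97
```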
 
Alternative:
The folks who first thought up positional notation used base 10. Since positional notation is such a tremendously useful thing, everyone took to using it and the base 10 system that came with it.


If you don't think positional notation is a useful thing, try doing long division by hand with Roman numerals. Real mathochists (math masochists) might want to try calculating a square root that way.


Pfft, easy. The square root of IV is II. Duh. If you can't see that, then you probably don't even know the cube root of XCVII.
 
Charles Babbage's engines worked (or would have worked) in Base 10, even though he was a good friend of George Boole. Given that he was working with toothed gears rather than transistors, he could have used any radix, limited only by how finely his engineer Clement could mill the gear teeth.

I think only a complete lack of imagination on behalf of the British Government (plus ça change!) prevented him from getting the funding to do the work (assuming he was held to a contract and a strict timeline - he was a dreadful one for chopping and changing his designs). It's interesting to speculate (as many fiction writers have) about decimal cogwheel computers becoming viable in the Victorian era, especially since Babbage's recreated Difference Engine (now in the Science Museum) can handle artillery ballistic calculations with a high order of precision - the very job ENIAC was first used for in the 1940s.

The proposed Analytical Engine had memory (the store), a processor (the mill), program control (the barrels), and (very) limited conditional logic. However, there's no reason why far more complex logic couldn't have been implemented. As manufacturing procedures improved, the engines would have become cheaper and cheaper. Decimal logic would perhaps have become the worldwide norm. The Engine was punch-card programmable, too.

Perhaps if Babbage engines had become commonplace, in time they would have delayed the rise of electronic computers: ENIAC, Colossus and the early thermionic computers were far more expensive in real terms than Babbage's engines, and bigger, too.

Perhaps if we'd had efficient, powerful, reliable decimal computers in the Victorian age, binary computer logic would have seemed not so obvious under those circumstances.

However, Babbage was largely below the radar to the early modern computer pioneers like Eckert, Mauchly and Turing. Effectively, they started with a blank sheet, and thermionic valves (tubes) seemed the logical way to go, and binary made complete sense in that application. What would have happened if they'd been brought up with the great-great grandsons of Babbage Analytical Engines?
 
Pfft, easy. The square root of IV is II. Duh. If you can't see that, then you probably don't even know the cube root of XCVII.
Right.

Now, without converting to decimal, calculate the square root of VI - and represent your answer in roman numerals.
 
Alternative:
The folks who first thought up positional notation used base 10. Since positional notation is such a tremendously useful thing, everyone took to using it and the base 10 system that came with it.

If you don't think positional notation is a useful thing, try doing long division by hand with Roman numerals. Real mathochists (math masochists) might want to try calculating a square root that way.

That'd work too. Whatever the case, I think any connection between our number of digits and numerical base is simply an unsupported assumption, especially when even just a quick look at the history of maths and numbers will show differently.
 
Perhaps if Babbage engines had become commonplace, in time they would have delayed the rise of electronic computers: ENIAC, Colossus and the early thermionic computers were far more expensive in real terms than Babbage's engines, and bigger, too.

I would go with this, but my hobby and interest is mechanical television (1925-1936). This was the initial developmental stage of television, using a mechanical means of scanning the image (Nipkow disc, mirror drum, etc.) and sending it to a set with a modulated light source and a synched scanning drum or disc. Images were at most 4 to 6 inches diagonal, but sets were easy for hobbyists to make, and inexpensive. They sparked the interest in television while the kinks were worked out of all-electronic broadcasting and receivers (which had better resolution). So while the mechanical computers may have become the standard workhorse for a while, the need for speed (and fewer moving parts) would have meant only a small delay for ENIAC/UNIVAC. :)
Check out http://www.earlytelevision.org for more info.

basilio
 
Right.

Now, without converting to decimal, calculate the square root of VI - and represent your answer in roman numerals.

Absolutely, MortFurd. Without a radix and positional notation, it's really hard if not impossible to do mathematics, or even arithmetic.

It's not just a matter of simple familiarity with the symbols - you can use the rules of long multiplication or division in binary, hex, octal, duodecimal - anything, as long as you know the radix and the numerical order of the symbols. A zero is also essential, which the Romans and Greeks lacked.
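That radix-independence is easy to demonstrate: positional notation in any base b is just repeated division by b. A quick sketch (function name is mine):

```python
def to_base(n, b, symbols="0123456789ABCDEF"):
    """Write a non-negative integer in radix b (2 <= b <= 16):
    repeated division by the base, least significant digit first."""
    out = ""
    while n:
        out = symbols[n % b] + out
        n //= b
    return out or "0"

print(to_base(100, 2))   # 1100100
print(to_base(100, 12))  # 84
```

Once you have the digits, the familiar column rules for addition and long multiplication carry over unchanged - only the "carry at ten" threshold moves.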

The ancient Greeks had some brilliant thinkers, like Pythagoras, but even he didn't do algebra. He considered all such problems in terms of geometry - the "square on the hypotenuse" to him was a real square, not the mathematical product of two identical numbers.

Such brilliant people didn't fail to discover maths because they were stupid - they just didn't have the tools.

Arabic numerals have spread all over the world, not because they look pretty but because, with a radix and a zero, they make real mathematics possible.

Otherwise, why aren't we still using Roman numerals?
 
I would go with this, but my hobby and interest is mechanical television (1925-1936). This was the initial developmental stage of television, using a mechanical means of scanning the image (Nipkow disc, mirror drum, etc.) and sending it to a set with a modulated light source and a synched scanning drum or disc. Images were at most 4 to 6 inches diagonal, but sets were easy for hobbyists to make, and inexpensive. They sparked the interest in television while the kinks were worked out of all-electronic broadcasting and receivers (which had better resolution). So while the mechanical computers may have become the standard workhorse for a while, the need for speed (and fewer moving parts) would have meant only a small delay for ENIAC/UNIVAC. :)
Check out http://www.earlytelevision.org for more info.

basilio

I'm quite up on Logie Baird's mechanical TV with the Nipkow disc - I think it's a marvellous, madcap example of lateral thinking. However, remember that Baird and Farnsworth were coeval - they weren't three generations apart.

Farnsworth wanted to invent television, as did Baird. However, if Baird had come along in 1822 with his idea for television, he might have had no competitors. Over the decades, there would have been at least some improvements in mechanical TV, and it might have become relatively practical. It would be "part of the furniture".

Would Philo Farnsworth then have had the same drive to invent electronic TV?

Do you know how the Japanese got the reputation for making smaller, cheaper electronic gizmos? Simply because the American industries gave them the transistor because they thought it would go nowhere. It took America a while to catch on!

Also, think of how long the slide rule and the abacus lasted, even after cheap, readily obtainable electronic calculators appeared. They had several centuries' head start. I was taught to use a slide rule at school, even though CHEAPER pocket calculators were widespread by then.

Sorry if I'm in bludgeon mode, basilio - I don't mean to hammer you. Peace, brother! :)
 
However, if Baird had come along in 1822 with his idea for television, he might have had no competitors.
Maybe, but he also wouldn't have had a way to make it work, as television requires some sort of photosensitive element. The precursor of the fax also might have benefitted from such an element, but physics simply hadn't reached the point where it could have been made.
 
Maybe, but he also wouldn't have had a way to make it work, as television requires some sort of photosensitive element. The precursor of the fax also might have benefitted from such an element, but physics simply hadn't reached the point where it could have been made.

Agreed, Earthborn - that was only for the sake of argument. However, my point is that Charles Babbage could well have come up with a programmable digital computer in 1850. As it was, that just didn't happen until the 1940s, when thermionic valves had been in existence for a long time.

We now know that Babbage's Difference Engine Mark 2 would have worked (I must go and see the working replica!), and engineers have found nothing major wrong with his concept and drawings of the Analytical Engine.

Like I said, the slide rule lasted long after it effectively became obsolete, even though it's only good for log-based functions like multiplication. It was the de facto engineer's tool well into the 70s, when I was at primary school. Even when I was at secondary school, I was not allowed to use my fully-featured Casio electronic pocket calculator until I had mastered the use of function tables and the slide rule. It took no special brain power to use either of these tools, but the teachers obviously couldn't foresee a time when they'd become obsolete.

Babbage's computer would have been infinitely more useful than the humble slide rule. How long might it have clung on as the mathematical prime mover?
 
However, remember that Baird and Farnsworth were coeval - they weren't three generations apart....

Also, think of how long the slide rule and the abacus lasted, even after cheap, readily obtainable electronic calculators appeared. They had several centuries' head start. I was taught to use a slide rule at school, even though CHEAPER pocket calculators were widespread by then.

Sorry if I'm in bludgeon mode, basilio - I don't mean to hammer you. Peace, brother! :)

No bludgeon apology needed:)

Well, Nipkow and contemporaries tried to develop a sort of television, but there were no sufficiently sensitive light pick-ups, amplifier circuits, or neon tubes (1880s?). Rather like James Burke's "Connections", it had to wait for photocells and amplifiers for Baird, Jenkins, Ives, etc., to start off on the mechanical track. Farnsworth was a true genius and in the same time period had the all-electronic system come to him in one fell swoop. (Baird was a genius too, and came to the electronic system, and made improvements too, i.e. 600-or-so-line colour television during the war years.)

But I digress, and may be veering off topic. I think the message to take from the mechanical/electronic situation is that people would buy and use the more expensive TV (technology) if it offered better resolution and reception. As to slide rules, I will always say that mine would never have a battery die in the middle of an exam, but calculators are so damn fast, easy and sexy. I've really forgotten how to work one now (I was part of the overlap generation and got my first calculator as a senior in high school).

I will agree, though, that if certain technologies had developed earlier (where they were not dependent on specific previous developments) they would likely have spurred earlier growth in their industries, but if you were offered a Model T Ford and a '57 Chevy, most would take the latter.

basilio
 
But if you were offered the choice of a '57 Chevy or a Nissan Micra...? :)
 
Not really. Try holding up all your fingers. Now try touching your middle finger to your palm without moving any of your other fingers.

Then don't do it that way. That's like saying a pencil's not helpful for working sums 'cuz I can't keep the columns straight when I write on my backside. So I quit using it that way. For arithmetic, anyway.
 
