I'm with Zep here. I was taught to use one at school. Invaluable practice, both in keeping track of what you were doing (right order of magnitude) and in not reporting silly numbers of "significant" figures.
But you can't add up columns of piglets born to a shed full of sows on one. When I was a student, the one girl in our animal husbandry class who had a calculator was the envy of us all (her father was a professor in the medical school).
Then two years later I started an advanced biochemistry course. I'd been doing medicine for three years and had forgotten how to do logarithms - so they stuck me in remedial maths till I remembered. But at the same time I realised I really did need a calculator, and that I could actually afford one (I was a qualified vet by this time and had actually earned some real money). I was in love. I wore out its first battery the first evening.
My lab partner, though, had paid twice as much a year previously for something which worked on "reverse Polish notation" (is that the real term?) and she kept getting the wrong answers. As they always made us BOTH repeat the sums when we didn't agree, I was less than enchanted (my new toy had a floating decimal point).
I managed to get through my PhD mostly with a scientific calculator, making as little use as I could get away with of a tape-drive computer that honestly wasn't much more than a programmable calculator. Ever worked out an RIA curve fit to a sigmoid curve on paper?
When I got a credit-card-sized calculator free with a packet of Betamax videotapes, I knew the future had arrived! But lookee here. Squillion mega-something Pentium 4, smaller than the desk calculator in my 1976 biochemistry class - and I still count on my fingers.
Rolfe (in nostalgia mode).