Positional notation is the key to the whole thing, and positional notation happened to have been invented by folks who used base ten. The notational form caught on for its practicality, and the folks who adopted it took base ten along with it, simply because the two arrived together.
Here's the catch to positional notation:
10 can mean different things depending on the base.
10 base two is two.
10 base ten is ten.
10 base sixteen (hexadecimal) is sixteen.
10 base twelve is twelve.
It doesn't really matter what base you use. You just need the same number of symbols as your base.
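If you want to see that for yourself, here's a quick sketch in Python (using the built-in int(), which accepts an explicit base argument):

```python
# "10" means different things depending on the base you read it in.
for base in (2, 10, 12, 16):
    print(f'"10" in base {base} is {int("10", base)}')

# "10" in base 2 is 2
# "10" in base 10 is 10
# "10" in base 12 is 12
# "10" in base 16 is 16
```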
However, the lower the base, the greater the number of digits in a given quantity. Imagine your new friend's phone number is 164-355-41709. In binary, that would be 10100100-101100011-1010001011101101.
A little hard to scribble on the back of your hand, and very easy to make a mistake! Also, imagine reading it out to someone!
The higher the base, the more symbols you need. But it certainly trims down the length of numbers (in hex, the above example is A4-163-A2ED), and the shorter form is easier to memorise correctly.
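For the curious, here's a small Python sketch to verify those conversions (the b and X format specifiers give binary and upper-case hex):

```python
# The phone-number groups from the example above, rendered in three bases.
groups = (164, 355, 41709)

print("-".join(f"{g:b}" for g in groups))  # 10100100-101100011-1010001011101101
print("-".join(str(g)   for g in groups))  # 164-355-41709
print("-".join(f"{g:X}" for g in groups))  # A4-163-A2ED
```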
However, ten is a handily sized base for the sorts of quantities we're familiar with in everyday life. (Well, OK, I'm an electronics engineer, so I use exponential notation a lot and hex frequently, but that's not everyday life for most people.)
I think that's why decimal notation has caught on so well. It's not uniquely useful (duodecimal and hexadecimal would be easy for kids to learn), but I do feel it's more useful than binary in everyday life.
If we dealt routinely in quintillions rather than tens, hundreds and thousands, decimal would be cumbersome and another radix, say base 1000, would be more useful.
Binary's useful if you deal in 2's, like computers do. But we don't as a matter of course, and for humans the number lengths are cumbersome and prone to copying errors. That's exactly why computer programmers adopted octal and hex rather than working in straight binary.
A computer doesn't see FE; it sees 11111110. Programmers added that extra layer of notation purely to make the numbers manageable for themselves.
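To illustrate, a small Python sketch showing that FE and 11111110 are the same bit pattern, and that each hex digit corresponds to exactly four bits (which is why hex maps so cleanly onto binary):

```python
# 0xFE and 0b11111110 are the same value; hex is just shorthand for the bits.
value = 0xFE
print(value == 0b11111110)   # True
print(f"{value:08b}")        # 11111110
print(f"{value:02X}")        # FE

# Split the byte into two nibbles: each one is a single hex digit.
print(f"{value >> 4:04b}", f"{value & 0xF:04b}")  # 1111 1110
```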