In the thread discussing the 1 = 0.999... topic, many posters seem to think that the symbols 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, the minus sign, and the decimal point actually are numbers. This seems wrong to me. Base-10 decimal notation, using Arabic numerals, is just one clumsy way of representing numbers; Roman numerals are another, even clumsier, way. The posters in the other thread are really arguing about the inadequacies of the representational system, rather than about the numbers themselves.
So does a better system exist, or can you invent one? By a better system, I mean one that can represent numbers at least as well as our current system, but without the ambiguity of being able to write some numbers down in two different ways.
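To make the ambiguity concrete: for the rationals, at least, a system with unique representations already exists, namely fractions reduced to lowest terms. Here's a small sketch using Python's standard `fractions` module, which normalizes automatically, so every rational number has exactly one spelling (contrast this with decimals, where 0.999... and 1 are two spellings of the same number):

```python
from fractions import Fraction

# In decimal notation, 1/3 + 2/3 looks like "0.333... + 0.666... = 0.999...",
# inviting the very two-spellings debate from the thread.
a = Fraction(1, 3) + Fraction(2, 3)

# As a reduced fraction there is no ambiguity: the sum normalizes to 1.
print(a == Fraction(1, 1))  # True
print(a)                    # 1

# Any fraction is reduced to its single canonical form in lowest terms.
print(Fraction(2, 6))       # 1/3
```

Of course, this only covers the rationals; representing irrational numbers uniquely (and still being able to compute with them) is where notation systems really start to struggle.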