I've heard about these strange threads popping up all around the internet, with people claiming the same (false) idea about infinite decimal representations of numbers. Allow me to offer another explanation that will (hopefully) clear up where the confusion sets in for the layman.

Let me first say that .999... is not a decimal representation of a number at all. The hint as to why not? Those dots: "..." They're not digits! A decimal representation of a number has a finite number of digits chosen from the set {0, 1, 2, 3, 4, 5, 6, 7, 8, 9}, with the addition of a single "." somewhere in the number (the "." is implied at the end if unwritten).

So, the question is, who started perpetrating these lies and deceits on our children? (Is our children learning?) Well, the answer is simple: mathematicians lie. All the time. Well, almost always. They tell lies because normal people (much less children) do not have the years of conditioning needed to understand the true reasons behind whatever "cool" fact they're trying to talk about. Some of the lying even extends up to the collegiate level of mathematics education! Usually, once you hit the last year of your undergraduate mathematics degree, your professors stop lying about most of the things they say in the classroom.

So, anyway, we can't compare ".999..." with "1" visually at all, because we aren't using the same kind of representation for the two numbers. One is a decimal representation (1) and the other is a lie that mathematicians tell the public to keep the truth at bay! I'm glad there were some on this forum to set the facts straight, entroper.
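
P.S. For anyone who wants the non-lie version, here is a minimal sketch of what the "..." notation is usually taken to abbreviate (the standard limit-of-partial-sums reading, which isn't spelled out above):

\[ 0.999\ldots \;:=\; \sum_{n=1}^{\infty} \frac{9}{10^n} \;=\; \lim_{N\to\infty}\Bigl(1 - \frac{1}{10^N}\Bigr) \;=\; 1 \]

The partial sums are 0.9, 0.99, 0.999, ..., which by the usual geometric series formula equal 1 - 10^{-N}. In other words, ".999..." is not a string of digits at all; it's a name for a limit, and that limit is exactly the number whose decimal representation is "1".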







