XP Math - Forums - View Single Post - Infinite 0.9 Not Equal To 1
Old 06-12-2008   #15


Another Explanation:

Perhaps the most common development of decimal expansions is to define them as sums of infinite series. In general:

$$b_0.b_1b_2b_3b_4\ldots = b_0 + \frac{b_1}{10} + \frac{b_2}{10^2} + \frac{b_3}{10^3} + \frac{b_4}{10^4} + \cdots$$

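As an illustrative aside (added here, not part of the original explanation), the digit-by-digit series can be evaluated numerically; `expansion_value` is a hypothetical helper name chosen for this sketch:

```python
# Evaluate a truncated decimal expansion b0.b1b2b3... as the series
# b0 + b1/10 + b2/10**2 + b3/10**3 + ... (finitely many terms).
def expansion_value(b0, digits):
    """Sum b0 plus each digit d_k divided by 10**k."""
    return b0 + sum(d / 10**k for k, d in enumerate(digits, start=1))

# The digits 2 and 5 after the point give 0.25:
print(expansion_value(0, [2, 5]))
# Ten 9s give a truncation of 0.999..., already very close to 1:
print(expansion_value(0, [9] * 10))
```

Each extra 9 contributes another term $9/10^k$, so the truncations climb toward 1 without reaching it at any finite stage; only the infinite sum equals 1.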
For 0.999… one can apply the powerful convergence theorem concerning geometric series:

If $|r| < 1$, then $ar + ar^2 + ar^3 + \cdots = \dfrac{ar}{1-r}$.

Since 0.999… is such a sum with a common ratio $r = \tfrac{1}{10}$, the theorem makes short work of the question:

$$0.999\ldots = 9\left(\tfrac{1}{10}\right) + 9\left(\tfrac{1}{10}\right)^2 + 9\left(\tfrac{1}{10}\right)^3 + \cdots = \frac{9\left(\tfrac{1}{10}\right)}{1 - \tfrac{1}{10}} = 1.$$

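As a numerical aside (an illustration added to this explanation, not part of Euler's argument), one can watch the partial sums with $a = 9$ and $r = \tfrac{1}{10}$ approach the theorem's closed form $\frac{ar}{1-r}$:

```python
# Geometric series a*r + a*r**2 + a*r**3 + ..., which the theorem
# sums to a*r / (1 - r) whenever |r| < 1.
a, r = 9, 1 / 10

closed_form = a * r / (1 - r)   # the theorem's value; 1 for these a, r

# Partial sums 0.9, 0.99, 0.99999, ... creep up toward that value:
partial = [sum(a * r**k for k in range(1, n + 1)) for n in (1, 2, 5, 20)]
print(closed_form, partial)
```

The partial sums are strictly increasing and bounded above by the closed form, which is exactly what convergence of the series means.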
This proof (actually, that 10 equals 9.999…) appears as early as 1770 in Leonhard Euler's Elements of Algebra.

The sum of a geometric series is itself a result even older than Euler. A typical 18th-century derivation used a term-by-term manipulation similar to the algebra proof given above, and as late as 1811, Bonnycastle's textbook An Introduction to Algebra uses such an argument for geometric series to justify the same maneuver on 0.999…. A 19th-century reaction against such liberal summation methods resulted in the definition that still dominates today: the sum of a series is defined to be the limit of the sequence of its partial sums. A corresponding proof of the theorem explicitly computes that sequence; it can be found in any proof-based introduction to calculus or analysis.

A sequence $(x_0, x_1, x_2, \ldots)$ has a limit $x$ if the distance $|x - x_n|$ becomes arbitrarily small as $n$ increases. The statement that 0.999… = 1 can itself be interpreted and proven as a limit:

$$0.999\ldots = \lim_{n\to\infty} 0.\underbrace{99\ldots9}_{n} = \lim_{n\to\infty}\left(1 - \frac{1}{10^n}\right) = 1 - \lim_{n\to\infty}\frac{1}{10^n} = 1 - 0 = 1.$$

The last step, that $\lim_{n\to\infty}\frac{1}{10^n} = 0$, is often justified by the axiom that the real numbers have the Archimedean property. This limit-based attitude towards 0.999… is often put in more evocative but less precise terms. For example, the 1846 textbook The University Arithmetic explains, ".999 +, continued to infinity = 1, because every annexation of a 9 brings the value closer to 1"; the 1895 Arithmetic for Schools says, "…when a large number of 9s is taken, the difference between 1 and .99999… becomes inconceivably small". Such heuristics are often interpreted by students as implying that 0.999… itself is less than 1.
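To make "the distance becomes arbitrarily small" concrete, here is a small sketch (added for illustration) using exact rational arithmetic so that floating-point fuzz does not obscure the point:

```python
from fractions import Fraction

# Partial sums x_n = 0.99...9 (n nines), computed exactly as 1 - 1/10**n.
# The distance |1 - x_n| = 1/10**n shrinks below any positive bound.
distances = []
for n in (1, 3, 6, 12):
    x_n = 1 - Fraction(1, 10**n)
    distances.append(1 - x_n)
    print(n, float(1 - x_n))
```

Every term of the sequence is strictly less than 1, yet the limit of the sequence is exactly 1; conflating the two is precisely the student misreading described above.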