dsgrue3 said:
Jay520 said:
dsgrue3 said:
You went from 10x = 9.999... and by subtracting x somehow arrived at 9x = 9; this is wrong. You should have arrived at 9.000...1x = 9.
The series will converge to 1, but never arrive at it. It's like saying you can divide 1 in half repeatedly and eventually get to 0. You can't. You can approach 0, but never reach it.
As for 1 + 1 = 3 for sufficiently large values of 1, it's just a joke. 1.49 + 1.49 = 2.98.
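(The halving claim above is easy to check mechanically. A minimal sketch in Python, using the standard-library `fractions` module for exact arithmetic rather than floats; the step count of 50 is arbitrary:)

```python
from fractions import Fraction

x = Fraction(1)          # start with exactly 1
for _ in range(50):
    x /= 2               # halve the exact value each step
    assert x > 0         # never reaches 0, only approaches it

print(x)                 # prints 1/1125899906842624 (i.e. 1/2**50)
```

Every partial result stays strictly positive, which is the point being made: the sequence approaches 0 without ever reaching it.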
|
No. x = 0.999.... So when I subtract x, I can subtract either x or 0.999...; it wouldn't make a difference.
Btw, 9.000...1x does not make sense. You can't just put a 1 after an infinite number of zeros.
|
You must have a 1 there. Otherwise you're using circular logic.
I'm not disputing 1 = 0.999... though; Pezus's example made it very clear.
|
Uh... that's not circular logic. There is an infinite number of zeros. Putting a 1 at the end would mean there's a finite number of zeros, which is wrong.
You seem to think I'm subtracting 0.999...x from 10x. I'm not. On the left, I'm subtracting x from 10x, which gives 9x. On the right side, I'm subtracting 0.999... from 9.999..., which gives 9. In both cases, I'm subtracting the same value (since x = 0.999...), so no rule is broken.
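(For clarity, the subtraction argument being defended here can be written out in full. This is the standard algebraic presentation, not anything beyond what the post already describes:)

```latex
\begin{align*}
x       &= 0.999\ldots \\
10x     &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots \\
9x      &= 9 \\
x       &= 1
\end{align*}
```

The key step is the third line: the same quantity (x, i.e. 0.999...) is subtracted from both sides, so no 9.000...1 coefficient ever arises.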