
Forums - General Discussion - Another math question

Guys,

Read up on the concept of "limits" (fundamental to how differential calculus & integration work).

Limit of SUM(9/10^x, x = 1..n) as n --> infinity = 1 (*)

(i.e. 9/10 + 9/100 + 9/1000 + ... = 1)

Anything other than the limit is NOT 1 (every finite partial sum is strictly less than 1).

(* - a common way of describing this is the recurring-decimal "dot" notation - the repeater - it's basically the same thing as writing 0.999...).
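To make the limit explicit, here is the standard geometric-series calculation behind that statement (a sketch - the only ingredients are the ratio 1/10 and the finite partial-sum formula):

\[
S_n = \sum_{x=1}^{n} \frac{9}{10^x} = 1 - \frac{1}{10^n},
\qquad
\lim_{n \to \infty} S_n = 1 - \lim_{n \to \infty} \frac{1}{10^n} = 1 .
\]

Every partial sum S_n falls short of 1 by exactly 10^-n, which is why no finite string of 9s equals 1, while the limit - the value the infinite decimal 0.999... denotes - is exactly 1.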

....

Since we are talking about funky maths, here is a cute one I "proved" many years back (it doesn't actually work - the question is why).

When you have two straight lines on a graph that are perpendicular, the product of their gradients is ALWAYS -1, i.e.:

m1 x m2 = -1 (where m1, m2 are the gradients of each line, in the form y = mx + b)

Try applying this to the x/y axis of the graph!

Gradient x-axis = 0
Gradient y-axis = 1/0 (infinite)

Therefore, 0 x 1/0 = -1 (!!)

(or 1 = -1 !!)
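For anyone who wants a starting point on the puzzle, here is the usual derivation of the perpendicular-gradient rule, writing each gradient as the tangent of the line's angle to the x-axis (just a sketch):

\[
m_1 = \tan\theta, \qquad
m_2 = \tan(\theta + 90^\circ) = -\cot\theta = -\frac{1}{\tan\theta}
\quad\Longrightarrow\quad
m_1 m_2 = -1 .
\]

The step that quietly requires tan(theta) to exist and be nonzero is a good place to start looking for the flaw.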

 



Gesta Non Verba

Nocturnal is helping companies get cheaper game ratings in Australia:

Game Assessment website

Wii code: 2263 4706 2910 1099


Man, I never thought so many people would have trouble comprehending that 0.999... with infinite 9's goes on forever - you can't just cut it off at some point. Both basic demonstrations have already been done in this topic:

1. Let x = 0.99999...; then 10x = 9.99999..., so 10x - x = 9.99999... - 0.99999... = 9, i.e. 9x = 9, so x = 1.

2. 0.99999... = 0.33333... + 0.33333... + 0.33333... = 1/3 + 1/3 + 1/3 = 1

Anyway, you don't have to believe me, this is a link to Doctor Math: http://mathforum.org/dr.math/faq/faq.0.9999.html
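For anyone worried that the "10x trick" above is hand-waving, the same steps can be written with the series that the infinite decimal stands for (a sketch):

\[
x = \sum_{k=1}^{\infty} \frac{9}{10^k}
\;\Longrightarrow\;
10x = 9 + \sum_{k=1}^{\infty} \frac{9}{10^k} = 9 + x
\;\Longrightarrow\;
9x = 9
\;\Longrightarrow\;
x = 1 .
\]

Multiplying by 10 shifts every term of the series one place, which is exactly what "10 * 0.999... = 9.999..." means.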



Well, I didn't read all this, but: 1 = .999~. Why? Because for 2 numbers to be different there has to be at least one number in between them, and in this case there isn't.

Also, 1/3 + 1/3 + 1/3 = 1, right? So .333~ + .333~ + .333~ = .999~ = 1.
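A quick numerical way to see both points is to keep only finitely many digits and watch the gap to 1 shrink. A minimal sketch in Python (exact fractions, so no floating-point rounding; the variable names are mine):

from fractions import Fraction

# Truncate 1/3 to n decimal digits, triple it, and measure how far the
# result falls short of 1.  The gap is exactly 1/10^n, so it shrinks to 0
# as more digits of the repeating decimal are kept.
for n in (1, 2, 5, 10, 20):
    third_truncated = Fraction(10**n // 3, 10**n)   # 0.33...3 with n digits
    tripled = 3 * third_truncated                   # 0.99...9 with n digits
    gap = 1 - tripled                               # exactly 1/10^n
    print(n, tripled, gap)

With exact rationals the "1/3 argument" is one line: 3 * Fraction(1, 3) == 1 holds exactly.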



Yes they are the same. There are two representations for 1 in the decimal number system. Sorry.



The .3333... argument is the easiest to understand. The real "discrete math" proof does say that between any two distinct real numbers there are uncountably infinitely many real numbers. So you would have to find a number that is between .9999... and 1.

jlauro's argument (.9999... + 1)/2 doesn't work, because it assumes that .9999... and 1 are not the same number. You can't assume the thing you are trying to prove! If they are equal, then the average of the two numbers is also equal to both of them, and therefore not between them.

Happy, .9999... is a rational number. It can be expressed as the ratio between 1 and 1. See the 1/3 argument above.
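The density fact being leaned on here can be written in one line (a quick sketch):

\[
a < b \;\Longrightarrow\; a < \frac{a+b}{2} < b ,
\]

so if nothing lies strictly between .9999... and 1, the two cannot be distinct.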




People, people.

Let's stop arguing over this...the answer's pretty simple, anyway.

Here's a YouTube video that explains it perfectly:
http://www.youtube.com/watch?v=7sK3AqFYAWQ&mode=related&search=



LEFT4DEAD411.COM
Bet with disolitude: Left4Dead will have a higher Metacritic rating than Project Origin, 3 months after the second game's release.  (hasn't been 3 months but it looks like I won :-p )

Entroper said:
The .3333... argument is the easiest to understand. The real "discrete math" proof does say that between any two distinct real numbers there are uncountably infinitely many real numbers. So you would have to find a number that is between .9999... and 1.

jlauro's argument (.9999... + 1)/2 doesn't work, because it assumes that .9999... and 1 are not the same number. You can't assume the thing you are trying to prove! If they are equal, then the average of the two numbers is also equal to both of them, and therefore not between them.

Happy, .9999... is a rational number. It can be expressed as the ratio between 1 and 1. See the 1/3 argument above.

Actually, the (a + b)/2 step does work. If it didn't, it wouldn't be math.

The problem is that, by itself, it doesn't help prove whether .999... and 1 are equal or not. It would just be one step. You would then have to show that the result is equal to at least one of the other two numbers.

 

Things can be proven by assuming what you are trying to prove, but it's generally much harder to do correctly that way. Generally you disprove stuff by assuming what you want to disprove and then finding the inconsistencies.

 

So if you want to prove that 1 is equal to .999..., then you want to disprove that 1 is not equal to .999.... The correct way to start the proof is to assume that 1 is not equal to .999..., and so (a + b)/2 is clearly what you want to use if you were formally proving it from scratch.



You can't prove something by assuming that it's true. You can disprove something by assuming that it's true and then reaching a contradiction - that's a proof by contradiction.

If you're doing a proof by contradiction, you have to reach a contradiction.

In this example, if you assume that .9999... and 1 are not the same number, then you can use (.9999... + 1)/2 to show that there exists a number between .9999... and 1, but this is not a contradiction, so it neither proves nor disproves anything.

I'm just trying to state this clearly so that it's understandable, not trying to say "YOU'RE WRONG!"  When I said that the average "doesn't work" I meant that it doesn't prove or disprove that .9999... = 1, not that it doesn't give you a number.
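For what it's worth, the contradiction route can be completed; here is a sketch, using only the fact that .9999... is at least as large as each of its finite truncations:

\[
\text{Assume } 0.999\ldots < 1 \text{ and set } \varepsilon = 1 - 0.999\ldots > 0 .
\]
\[
0.999\ldots \;\ge\; \underbrace{0.9\ldots9}_{n \text{ nines}} \;=\; 1 - 10^{-n}
\;\Longrightarrow\;
\varepsilon \;\le\; 10^{-n} \text{ for every } n .
\]

No positive number is below 10^-n for every n, so epsilon would have to be 0 - the contradiction that closes the proof that .9999... = 1.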



Any positive number below 1 multiplied by itself will be lower than the original number, and thus less than 1. 1 multiplied by itself is always 1.



I've heard about these strange threads popping up all around the internet with people claiming the same (false) idea about infinite decimal representations of numbers. Allow me to use another explanation that will (hopefully) clear up where the confusion sets in for the laymen.

Let me first say that .999... is not a decimal representation of a number at all. The hint as to why it's not? Those dots: "..." They're not numbers! A decimal representation of a number has a finite number of digits chosen from the set {0 1 2 3 4 5 6 7 8 9}, with the addition of a single "." somewhere in the number (the "." is implied at the end if unwritten).

So, the question is, who started to perpetrate these lies and deceits to our children? (Is our children learning?) Well, the answer is simple: mathematicians lie. All the time. Well, almost always. They tell lies because normal people (much less children) do not have the years of conditioning to understand the true reasons behind whatever "cool" fact they're trying to talk about. Some of the lying even extends up into collegiate levels of mathematics learning! Usually, once you hit the last year of your undergraduate mathematics degree, your professors stop lying about most of the things they say in the classroom.

So, anyway, we're not able to compare ".999..." with "1" visually at all, because we aren't using the same representations for the numbers. One is using a decimal representation (1) and the other is using a lie that mathematicians tell the public to keep the truth at bay! I'm glad there were some on this forum to set the facts straight, Entroper.