In reply to the discussion: If you're having math problems, I feel bad for you, son...
napoleon_in_rags

First things first.
If you are sincere, read Howard Eves' Foundations and Fundamental Concepts of Mathematics.
I'm always looking for new truth and sincerely appreciate your attempts to share it with me; it's going on my Amazon wish list now.
That said, let's start at the end.
It is a VERY unambiguous field, especially at this level.
There's nothing that can create more confusion than when people believe meanings are not ambiguous when they actually are. How much conflict is rooted in failures to communicate? Math is, after all, a language.
Let's keep it simple. I'm a working man, a simple man. And I have my own down home definition of the decimal numbers, a vernacular if you will, which I assure you most common folks like me share. Let's say you have a number n =
3.xxx....
where the x's are unknown, and the ... means the digits keep going forever. What do I know about this number? Well, in the common down home sense of decimal numbers, I can tell you that n is greater than or equal to 3, but less than 4 (equal to 3 if all the x's are zeros). So the first digit pins down a certain interval: 3, as well as every number up to but not including 4, is in that interval. If next we find out n = 3.1xxx..., that specifies a narrower interval: it includes 3.1 and all the numbers between 3.1 and 3.2, but not 3.2 itself.
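To make that down-home interval reading concrete, here's a little Python sketch of my own (the function name digit_interval is just something I made up for illustration), using exact fractions so floating point fuzz doesn't muddy the water:

```python
from fractions import Fraction

def digit_interval(prefix):
    """Half-open interval [low, high) containing every decimal number
    whose expansion starts with the given digit prefix, e.g. "3.1"."""
    # how many digits are known after the decimal point
    places = len(prefix.split(".")[1]) if "." in prefix else 0
    low = Fraction(prefix)              # the prefix itself is the lower end
    high = low + Fraction(1, 10 ** places)  # one unit of the last known place
    return low, high

print(digit_interval("3"))    # knowing "3"   pins n into [3, 4)
print(digit_interval("3.1"))  # knowing "3.1" pins n into [3.1, 3.2)
```

Each extra known digit shrinks the interval by a factor of ten, which is exactly the "first digit reveals an interval" picture above.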
So based on this common interpretation of the decimal numbers, here is what somebody like me hears when one of these pointy hat people comes down from on high and tells us that 0.999... = 1: what they are in effect saying is that there exists a number, in an interval defined as containing only numbers less than 1, which is equal to 1. And that is just plain false on its surface. It's like saying there is a cat that is a dog; they clearly have a different definition of the word "cat" than we do, one which includes chihuahuas or something.
So what is this different definition of the decimal numbers where 0.999... can equal 1? Thinking about it and turning it over in my head, I see that not only must 0.999... = 1, but 1.999... must equal 2, and 3.1999... must equal 3.2, etc. In fact, every decimal number with a finite number of digits must have an alternative form with its last nonzero digit decremented and trailed by infinite 9's. Okay. So it's a system where any number representable with finite digits has two possible representations, whereas in our common down-home system, each number has only one representation. Okay. It's seemingly consistent, and interesting. We'll call it the elite decimal system, where ours is called the common decimal system.
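To see where that "elite" reading comes from, here's a quick numerical sketch of my own (not part of the original argument): the partial sums 0.9, 0.99, 0.999, ... each fall short of 1 by exactly one unit in the last place, a gap that shrinks below any positive number you care to name.

```python
from fractions import Fraction

# Partial sums of 0.9 + 0.09 + 0.009 + ..., computed exactly.
s = Fraction(0)
for k in range(1, 8):
    s += Fraction(9, 10 ** k)
    # the gap 1 - s is exactly 1/10^k at step k
    print(k, s, 1 - s)
```

Every partial sum sits strictly below 1, just as the common system says; the "elite" claim is only about the limit of the whole infinite tail.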
But something strikes me as odd in having our common down home decimal system dismissed in this way. Specifically, it's the idea that even if it's discarded, I can create a new one. Consider two numbers, say e and pi. I can define an interval which includes all real numbers that are greater than or equal to e but less than pi; I can write it [e, pi). Then, because the real numbers are closed under addition and division, there is a midpoint c halfway between, defining a new interval [c, pi). I can refer to e as [e, pi).0 and this new midpoint as [e, pi).1. Then I can define a new midpoint for that new upper half and designate it [e, pi).11, and a new midpoint in that even newer interval becomes [e, pi).111, etc. (binary numbering, basically). Eventually, I see I can keep this process going, Zeno's paradox style, until I get [e, pi).111... (repeating). Now we got us a funny number, by definition less than pi, but infinitely close to pi. And what my intuition tells me (and my intuition is pretty durn good) is that all the arguments for 0.999... = 1 apply equally to the idea that this new number [e, pi).111... = pi, when by definition it does not... And furthermore these arguments can be extended, and reduced to the argument that any interval on the real line defined as containing all numbers less than c also contains c, and that is what we call in these parts, a problem.
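Here's a quick numerical sketch of that halving construction (mine, not from the post): start at e and repeatedly jump to the midpoint between where you are and pi. Every stop along the way stays strictly below pi, while the remaining gap halves at each step.

```python
import math

# Midpoint construction between e and pi: each step halves the gap to pi.
c = math.e
for k in range(10):
    c = (c + math.pi) / 2  # move to the midpoint of [c, pi)
# after 10 halvings the gap is (pi - e) / 2**10, up to float rounding
print(c, math.pi - c)
```

Each finite stage [e, pi).111...1 is a genuine number below pi; the dispute in the post is over what the infinite string [e, pi).111... should be taken to name.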
So, sorry to be Bukowskiesque, but I see a real cultural problem. When 99% of people see the decimal system the way I do, where the decimal number 3.x is less than 4, but an elite pointy hat crowd comes down with a new definition of the decimals, where it may also equal 4, without explaining their assumptions, the concept of math as a cultural phenomenon to enlighten us all is lost. We have to at least recognize that there ARE differences between the 99% concept of the decimals and the 1% concept (and thus between 0.999... and 1, such as it is), and use that as a springboard to spread the joy of mathematics, making sure it's not just locked up for a few.
Peace