I have been teaching high school and university mathematics for around 30 years, and I still find new things to wonder at within the material I teach on a regular basis. Many times, this has to do with infinity. I'll start with something many people have learned, and feel familiar with: The repeating decimal.

**When Fractions Convert to Repeating Decimals**

You have probably been taught that some fractions, like one-half, convert to a decimal that terminates, while others do not. The common example of this is one-third, which converts to what is called a *repeating decimal*:

1/3 = 0.33333...

The way I've written it here, the three dots mean that the 3's "keep on going". Another way to say that is that there are an "infinite" number of 3's after the decimal point. Students learn this in elementary school, and are usually taught to put a dot or a line on top of the repeating digit (or digits):

1/3 = 0.3̅

This is certainly nice and tidy. But ... those numbers *go on forever*. Think about what that means! It means that if you actually tried to write one-third as a decimal, you would never be able to stop writing those 3's. Like never. Because the moment you get tired and stop, you are wrong. None of these are true:

1/3 = 0.3, 1/3 = 0.33, 1/3 = 0.333, 1/3 = 0.3333, ...
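One way to see this concretely is to measure how far off each truncation is. Here is a small sketch of my own (not from the original post), using Python's exact `Fraction` type so no floating-point rounding sneaks in:

```python
from fractions import Fraction

one_third = Fraction(1, 3)

# Truncate 1/3 after n threes and measure how far off we are.
# The gap shrinks by a factor of 10 each time, but it is never zero.
for n in range(1, 6):
    truncated = Fraction("0." + "3" * n)  # 0.3, 0.33, 0.333, ...
    gap = one_third - truncated
    print(f"0.{'3' * n}  is off by  {gap}")
```

No matter how large `n` gets, `gap` stays strictly positive - which is exactly the "the moment you stop, you are wrong" point.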

Of course, the more 3's you write, the closer your decimal value gets to what one-third means, but unless you write an infinite number of 3's, you will never have a decimal value that means what one-third means. Think of those words: *Infinite. Never.* These are scary words. When I learned this idea in elementary school I recall being quite upset. I asked the teacher about it. My question was something along the lines of

"So if the 3's go on forever then how can we actually write it?"

and her response was to say

"That's what the dot on top is for. It means the 3's go on forever."

I didn't have the maturity or words at the time to tell her that my difficulty was that the dot doesn't save us - it just hides what we are claiming, which is that we have successfully completed writing an infinite number of 3's.

**Mathematics is Not False**

This notion stayed in my mind for years, and every now and then I would spend time thinking about how this can work. Then one day I saw this deduction, which pops up regularly in math forums and math classes, and is intended to prove that "math is false":

x = 0.99999...
10x = 9.99999...
10x - x = 9.99999... - 0.99999...
9x = 9
x = 1, so 0.99999... = 1

This, many claim, is absurd and therefore math is false. How can something equal 1 if it is less than 1? 0.9 is less than 1, after all. And 0.99 is also less than 1. So it doesn't matter how many nines there are, it will always be less than one!

But it is not absurd. It is actually true! It just means that we really have to think about what those "infinite" 9's mean, and that using a word like infinity requires more care and thought than we would perhaps like to believe. You see, before you can declare that the number on the right is less than one, you have to count the nines. Read that again and think about it. The only reason we know that 0.9 is less than 1 is because we can subtract. Consider these examples:

1 - 0.9 = 0.1
1 - 0.99 = 0.01
1 - 0.999 = 0.001
1 - 0.9999 = 0.0001

Since each result is greater than zero, we know these numbers are all less than 1. And it's true - no matter how many nines you write, the answer to the subtraction will always be greater than zero. In fact, we can see that the number of nines we write determines the number of zeros we write after the decimal point before the 1. Specifically, the number of zeros is always one less than the number of nines.
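That pattern is easy to check mechanically. Here is a sketch of my own (again, not from the original post), using Python's exact `Decimal` type:

```python
from decimal import Decimal

# For n nines after the decimal point, 1 - 0.99...9 leaves (n - 1) zeros
# after the decimal point, followed by a single 1.
for n in range(1, 6):
    nines = Decimal("0." + "9" * n)
    diff = Decimal(1) - nines
    print(f"1 - {nines} = {diff}")
```

`Decimal` keeps the trailing digits exactly, so you can see the "one fewer zero than nines" rule directly in the printed results.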

But what happens when there are an infinite number of nines?

1 - 0.99999... = 0.00000...1?

See? That's the crazy thing about infinity. You would sort of have to somehow write an *infinite number* of zeros *and then* a one. But how can there be an "and then" if the event has to happen after you've completed an infinitely long task? Seriously. Good luck with that.

**How the Mathematicians Look at the Infinite Nines**

Hopefully you are beginning to feel a little uncomfortable about this idea, because it is an uncomfortable idea! Now, here is the mathematical explanation for why

0.99999... = 1

So we can't actually write an infinite number of nines. In fact, the idea of it is misunderstood. No mathematician expects that there are an infinite number of nines, because the phrase "infinite number" is actually an oxymoron. Infinity is *not a number!* It is a concept. If it were a number it would be the largest number there is. But then what would happen if we added one to it? Or multiplied it by two? Or squared it? Seriously. It's not a number. So you can't really say "an infinite number".

Instead, what mathematicians do is interpret the infinite nines as a sort of game of "close to the target". The target is the number one. The game is that someone tells you how close to the target you need to get, and the way you play is that you write nines until you are at least as close to the target as that person told you to get - but you are not allowed to go over.

So for example, suppose someone said you have to get within 0.003. Now to play, you write nines until the decimal you've written is within 0.003 of 1. Another way to think of that is that the number you write has to be equal to or greater than 0.997. Well that's easy! Just write 0.999 and you win. If you want, you can keep going and write 0.9999, since that is also within 0.003 of 1, but the point is that it can be done if the tolerance they give you is 0.003.

Now the question is, is there any tolerance that someone could give you that would not work? Like, is there a number that is small enough that no matter how many nines you write, you could not get that close to one? The answer is no. No matter how close to one someone wants you to get, you can eventually write enough nines that you would get that close or even closer. Because each nine you write gets you closer to one, and there is *no limit* on the number of nines you can write.
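The "close to the target" game can even be played by a short program. This is my own sketch (the name `nines_needed` is made up for illustration): hand it any tolerance, and it counts how many nines it takes to get at least that close to 1.

```python
from decimal import Decimal

def nines_needed(tolerance: Decimal) -> int:
    """Count how many 9's it takes for 0.99...9 to land within `tolerance` of 1."""
    n = 0
    value = Decimal(0)
    while Decimal(1) - value > tolerance:
        n += 1
        value = Decimal("0." + "9" * n)
    return n

# No matter how tight the tolerance, some finite number of nines wins the game.
print(nines_needed(Decimal("0.003")))      # the example above: 3 nines suffice
print(nines_needed(Decimal("0.0000001")))
```

The loop always terminates, because each extra nine shrinks the remaining gap by a factor of 10 - which is exactly the "you can always write enough nines" claim, stated as code.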

The moral is, there is always more to be seen, even in things we already think we understand. If you enjoyed this and would like to read more interesting things that infinity can do, leave a comment.

Thanks for reading!

Rich
