I was reading online about a game I bought while in the US, which I've only just had time to take a peek at. Someone made a point about many parts of the game being based on the fact that a right triangle with legs of 7 and 4 units has a hypotenuse with length very close to 8.

Well, 7² + 4² = 8² + 1, so the hypotenuse is close to 8, as suggested: √(7² + 4²) ≅ 8.

In fact, I knew √65 to be very close to 8 + 1/16 (if *x* is not too small, √(*x*² + 1) ≅ *x* + 1/(2*x*)).

(Note that (*x* + 1/(2*x*))² = *x*² + 1 + 1/(4*x*²), and if *x* >> 1, the final term is quite small.)

So approximating √65 by 8 gives an error of around 1/128, or about 0.8%; pretty good, since the game aims for much less accuracy than that in general.

But then I thought about the fact that 16 in the denominator was a bit too small, and I wondered by how much. I realized straight away that it was in fact about a sixteenth too small. That is, it occurred to me that √65 is very close to 8 + 1/(16 + 1/16).
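A quick Python check shows how much that extra sixteenth buys:

```python
import math

rough = 8 + 1 / 16              # first approximation
better = 8 + 1 / (16 + 1 / 16)  # refined denominator
exact = math.sqrt(65)
print(rough - exact)            # about 2.4e-4
print(better - exact)           # about -9.4e-7, hundreds of times closer
```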

A little light went off in my head, so I hauled out my calculator.

Try this with me, if you have a calculator handy:

Take the square root of 65. (You should see 8.06225...)

Now subtract 8 (the bit we know).

Take the reciprocal (1/*x*). You get 16 and a bit.

Subtract 16 and take the reciprocal. Looks like you get the same number back...

What is this number? A tiny bit of algebra shows it's 8 + √65.
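If you'd rather not push calculator buttons, the same game takes a few lines of Python; the loop just repeats the subtract-and-reciprocate step:

```python
import math

x = math.sqrt(65) - 8     # the fractional part, 0.06225...
for _ in range(3):
    x = 1 / x             # "16 and a bit" each time
    print(x)
    x -= 16               # drop the 16 and go again

print(8 + math.sqrt(65))  # the number that keeps coming back
```

The loop prints the same value every time, because √65 - 8 = 1/(√65 + 8), so reciprocating the fractional part always lands back on 8 + √65.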

So far, that may seem like a trivial curiosity. But this happens all over.

For example, you get the same thing with any positive integer *x*: √(*x*² + 1) + *x* is a number like that "16 and a bit", where you can keep subtracting the integer part and taking the reciprocal.

That is, expressions like 8 + 1/(16 + 1/(16 + ...)) come up lots of times (and recognizing that I'd hit one of these objects was what made the light go off).

Take √10 for example - it's 3 + 1/(6 + 1/(6 + ...)). And you don't just get it with roots of 1 more than a perfect square. As I said before, it happens *all over*.

We've hit continued fractions. They come up a fair bit in mathematics, and they appear in numerous places where rational approximation comes in - I remember playing with them when dealing with asymptotic approximations in statistics, for example. There's a much nicer notation (see the Wikipedia article), so if you're playing with them you're not stuck with endless layers of fractions running down the page.

So, for example, the sequence 8, 8 + 1/16, 8 + 1/(16 + 1/16), ..., 8 + 1/(16 + 1/(16 + ...)) would be rendered as:

8, [8; 16], [8; 16, 16], ... [8; 16, 16, ...]

Similarly, √10 is [3; 6, 6, 6, ...].

The well known continued fraction for √2 falls into this class: [1; 2, 2, 2...].

Compute a few terms in that sequence with me:

1, 1.5, 1.4, 1 + 5/12 = 1.416666..., ...

already we're quite close - and it continues to jump about either side of √2, getting closer and closer.
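Those convergents are easy to generate exactly; here's a sketch using Python's `fractions` module, evaluating [1; 2, 2, ..., 2] from the bottom up (the function name is mine):

```python
from fractions import Fraction

def sqrt2_convergent(n):
    # Evaluate [1; 2, 2, ..., 2] with n twos, working bottom-up.
    tail = Fraction(0)
    for _ in range(n):
        tail = 1 / (2 + tail)
    return 1 + tail

for n in range(5):
    print(sqrt2_convergent(n))  # 1, 3/2, 7/5, 17/12, 41/29
```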

For larger numbers, the convergence is much faster. The general continued fraction for √(*x*² + 1) is [*x*; 2*x*, 2*x*, 2*x*, ...].
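The same bottom-up evaluation handles the general pattern; a sketch (the helper name is mine) checking [x; 2x, 2x, ...] against √(x² + 1):

```python
import math
from fractions import Fraction

def convergent(x, n):
    # Evaluate [x; 2x, 2x, ..., 2x] with n partial denominators, bottom-up.
    tail = Fraction(0)
    for _ in range(n):
        tail = 1 / (2 * x + tail)
    return x + tail

# Six terms already pin down several decimal places,
# and more of them the larger x is.
for x in (1, 3, 8):
    approx = convergent(x, 6)
    print(x, float(approx), math.sqrt(x * x + 1))
```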

Try seeing if you can work out what is going on with square roots with different offsets from a perfect square.

So anyway, not only is there a handy way of computing square roots of numbers close to perfect squares, there's also a handy way to improve the calculation if it wasn't as accurate as you needed.

There are many beautiful things related to continued fractions. Take a look over at MathWorld if you've a mind for some boggling factoids.

What fun.

(Two posts in one day! OMFFSM)