Thrilling Tilings at Tiring Timings
(Etienne here.)
The date was 24 October 2021. It was late at night, and I was just about to tuck myself in. After all, the next day was the A-Level Physics practical! I needed to be well-rested (spoiler alert - I actually didn't). I was lying comfortably and about to bid a temporary farewell to the waking world, when suddenly, out of nowhere, I was attacked by a thought: for which $r$ could I tile a square with rectangles whose sides have ratio $r$?
This was not a random thought. Earlier in the day, I had somehow encountered this other problem, from the 2018 Simon Marais Mathematical Competition:
For which integers is it possible to tile a square with rectangles whose sides have a ratio of ?
An excellent problem, and I invite the reader to try it in their spare time. The question that had come to me is then a natural generalisation of the above.
There is also another related classical problem, the answer to which is well-known, and which can be thought of as the "inverse": for which $x$ can a rectangle with sides $x$ and $1$ be tiled with squares? This problem has some neat proofs and definitely deserves an article of its own, but suffice it to say, the answer is that it is possible only when $x$ is a rational number. So, then, one might reasonably expect that the answer to my question would also be the same. Right?
Investigations
Firstly, it's not hard to see that rational numbers work. I tried $\sqrt{2}$ for a while, but it didn't seem to work. Let's try constructing a "bad" (that is, irrational) $r$ that does work. When we put 2 rectangles with sides $1$ and $r$ together, the resulting rectangle has ratio $2r$ or $\frac{r}{2}$. More generally, lining up $a$ rectangles with sides $1$ and $r$ and $c$ rectangles with sides $1$ and $\frac{1}{r}$ along their unit sides produces a rectangle with sides $1$ and $ar + \frac{c}{r}$. If we can make $ar + \frac{c}{r} = b$ for some positive integers $a$, $b$ and $c$, then we are done, because we can then use $b$ copies of this rational rectangle to make a square. But this is just a quadratic equation! So we have the following:
$r$ works if $r$ is a positive real solution to some equation $ax^2 - bx + c = 0$, where $a$, $b$, $c$ are positive integers.
This suggests that if $r$ works, then any other positive real solution to its minimal polynomial (that is, the rational polynomial of minimum degree that has $r$ as a root) should also work. In fact, in this case, the condition is equivalent to:
$r$ works if the degree of its minimal polynomial is 2 and both of its roots are real and positive.
Exercise: Prove that the two conditions are equivalent.
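To make this concrete, here is a small example of my own (not one from the competition). Take $r = 2 + \sqrt{2}$, whose minimal polynomial is $x^2 - 4x + 2$; its roots $2 \pm \sqrt{2}$ are both real and positive. Rearranging $r^2 - 4r + 2 = 0$ gives
$$r + \frac{2}{r} = 4,$$
so one rectangle with sides $1$ and $r$, together with two rectangles with sides $1$ and $\frac{1}{r}$, lines up into a $1 \times 4$ strip, and four such strips stack into a $4 \times 4$ square.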
What about the other quadratic numbers, like $\sqrt{2}$? Well, let's move on to cubics first :)
Let's apply the same wishful thinking: let's suppose that the signs of the coefficients alternate, say $ar^3 - br^2 + cr - d = 0$ with $a$, $b$, $c$, $d$ positive integers, and that we could separate the terms into two groups that we know how to build.
So we construct on the one hand rectangles realising one group of terms, and on the other rectangles realising the rest, hoping that we can make a square. But if we try applying the condition directly, we end up with three different expressions in $r$ that would all have to agree.
If the third expression is equal to either of the first two, we get a polynomial with smaller degree, contradicting the minimality of the original polynomial!
I was stumped. Perhaps we ought to think about it in another way:
Say we use our similar rectangles to construct a rectangle whose sides have a ratio of $s$. What can we do with this new rectangle? Let's suppose that the set of all ratios of sides we can form is $S$. We can make a few deductions:
- $r \in S$.
- If $s \in S$, then $ns \in S$ and $\frac{s}{n} \in S$, for any positive integer $n$.
- If $s \in S$, then $\frac{1}{s} \in S$.
- If $s, t \in S$, then $s + t \in S$.
All we need to prove now is that $1$ is in $S$, since a rectangle with side ratio $1$ is a square. With this, the problem transitions from a geometric one to a fully algebraic one.
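To get a feel for these rules, here is a little chain of my own: starting from $r \in S$ we may pass to $3r$, then to $\frac{1}{3r}$, then to $2r + \frac{1}{3r}$, then to the reciprocal of that, and so on. In particular, the second rule already gives $cs \in S$ for every positive rational $c$ whenever $s \in S$, and more generally any expression built out of $r$ using positive rational scalings, reciprocals and sums lies in $S$.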
Even so, I struggled to make any significant progress after that. Prudent readers might recall the setting of our story - yes, 'twas the night before the Physics practical, and by the time I had reached this point, it was already 5 am! Alas, I tried to make peace with the problem so that I might once more enter the world of dreams, but no! It gnashed and gnawed at me, not unlike the comedic trope in which a dog chases a butcher holding a string of sausages. Faced with this conundrum, I decided to google the answer, which brought me to a 1994 paper by Freiling and Rinne.
Learning the Truth
As it turns out, the answer is that rectangles whose sides have ratio $r$ can tile a square if and only if $r$ is algebraic and the roots of its minimal polynomial all have positive real part. In the proof, the paper immediately presents this result without much explanation, attributing it merely to "Wall":
Let the polynomial $p(x) = a_n x^n + a_{n-1} x^{n-1} + \cdots + a_0$ have real coefficients, and let $p_{\mathrm{even}}$ and $p_{\mathrm{odd}}$ be the sums of its even-degree and odd-degree terms respectively. Then all the zeroes of $p$ have negative real parts iff
$$\frac{p_{\mathrm{even}}(x)}{p_{\mathrm{odd}}(x)} = c_1 x + \cfrac{1}{c_2 x + \cfrac{1}{c_3 x + \cfrac{1}{\ddots + \cfrac{1}{c_n x}}}}$$
(taking whichever of $p_{\mathrm{even}}$ and $p_{\mathrm{odd}}$ has the larger degree as the numerator) for some positive real numbers $c_1, c_2, \dots, c_n$,
which left me rather flabbergasted. Why would the position of the roots affect this continued fraction expansion? As it turns out, it takes some effort to prove...
Let us unpack the theorem first. Both the numerator and the denominator in the fraction consist of only even or only odd powers of $x$. In fact, for the rest of this article I'll just assume that the degree $n$ of $p$ is even, because the argument for the odd case is exactly the same and also because I'm feeling a bit lazy. Then the numerator has only even powers and the denominator has only odd powers, and the degree of the numerator is one more than that of the denominator. So we can perform "long division":
$$\frac{p_{\mathrm{even}}(x)}{p_{\mathrm{odd}}(x)} = c_1 x + \frac{R(x)}{p_{\mathrm{odd}}(x)} = c_1 x + \cfrac{1}{\;\frac{p_{\mathrm{odd}}(x)}{R(x)}\;},$$
where $c_1 x$ and $R(x)$ are what you think they are (the quotient and the remainder), and now the degree of $R$ is $n - 2$, which is one less than the degree of $p_{\mathrm{odd}}$. So we can represent $\frac{p_{\mathrm{odd}}(x)}{R(x)}$ in exactly the same way, and repeat this process. In this way we get the continued fraction in the theorem above.
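As a small sanity check (my example, not one from the paper), take $p(x) = (x+1)^4 = x^4 + 4x^3 + 6x^2 + 4x + 1$, all of whose roots are $-1$. The first division step gives $c_1 = \frac{1}{4}$ and remainder $R(x) = 5x^2 + 1$, and continuing the process yields
$$\frac{x^4 + 6x^2 + 1}{4x^3 + 4x} = \frac{x}{4} + \cfrac{1}{\dfrac{4x}{5} + \cfrac{1}{\dfrac{25x}{16} + \cfrac{1}{\dfrac{16x}{5}}}},$$
with all four coefficients positive, exactly as the theorem predicts.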
That's nice and all, but how does this help us? Well, let's suppose that our ratio $r$ has a minimal polynomial $p$ whose roots all have positive real part.
Exercise: Prove that if $p$ has real coefficients and all its roots have positive real part, then its coefficients must alternate in sign.
Now if we let $q(x) = p(-x)$, we have that all of the roots of $q$ have negative real part, and also that $q(-r) = p(r) = 0$, so $q_{\mathrm{even}}(-r) = -q_{\mathrm{odd}}(-r)$. Applying the theorem to $q$ and using the fact that the continued fraction is an odd function of $x$, we find that $1$ may be expressed as the continued fraction up there, evaluated at $x = r$:
$$1 = c_1 r + \cfrac{1}{c_2 r + \cfrac{1}{c_3 r + \cdots}},$$
where the $c_i$ are positive and rational. If you, dear reader, remember our discussion about the set $S$ which consists of the ratios of sides of attainable rectangles, you will realise that $1$ can be found to be in this set by inductively building up the fraction.
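Spelled out, the build-up runs from the innermost term outwards:
$$c_n r \in S \;\Longrightarrow\; \frac{1}{c_n r} \in S \;\Longrightarrow\; c_{n-1} r + \frac{1}{c_n r} \in S \;\Longrightarrow\; \cdots \;\Longrightarrow\; c_1 r + \cfrac{1}{c_2 r + \cfrac{1}{\ddots}} = 1 \in S,$$
using at each step that $cr \in S$ for positive rational $c$, that $S$ is closed under reciprocals, and that $S$ is closed under sums.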
So the problem is solved! As long as we can prove the theorem above, that is.
Since all polynomials can be written in the manner of the continued fraction, we should focus on what is special in this case, which is that the coefficients $c_i$ are all positive. Firstly, note that $c_1$ is simply $\frac{a_n}{a_{n-1}}$, the ratio of the two leading coefficients, and the rest of the coefficients can be defined inductively on their respective polynomials further down the chain.
In particular, we get that $c_1 > 0$ precisely when $a_n$ and $a_{n-1}$ have the same sign.
Secondly, we've gone from $p$ itself to the ratio $\frac{p_{\mathrm{even}}}{p_{\mathrm{odd}}}$, but what I'd really like is something that characterises $p$ itself. So we add 1 to all our fractions in the chain:
$$1 + \frac{p_{\mathrm{even}}(x)}{p_{\mathrm{odd}}(x)}, \qquad 1 + \frac{p_{\mathrm{odd}}(x)}{R(x)}, \qquad \dots
$$
Of course, we know what the first of these is: it is simply $\frac{p_{\mathrm{even}}(x) + p_{\mathrm{odd}}(x)}{p_{\mathrm{odd}}(x)} = \frac{p(x)}{p_{\mathrm{odd}}(x)}$. So we can substitute in the value, and the second term becomes
$$1 + \frac{p_{\mathrm{odd}}(x)}{R(x)} = \frac{p_{\mathrm{odd}}(x) + R(x)}{R(x)}.$$
Now for a key observation: the denominator consists of all of the even-degree terms of the numerator!
That is, if we let the numerator be $g(x) = p_{\mathrm{odd}}(x) + R(x)$, the denominator is simply $g_{\mathrm{even}}(x)$, so the fraction $\frac{g(x)}{g_{\mathrm{even}}(x)}$ is of the same form as the first fraction $\frac{p(x)}{p_{\mathrm{odd}}(x)}$ (with the roles of even and odd swapped, since the degree has dropped by one). This suggests that if we can prove that $p$ and $g$ share some property, then by an inductive argument this property is preserved down the chain of polynomials.
For our case, the property we want to pass down is not just that the corresponding coefficient $c_i$ is positive at each step; in fact, we can prove that the stronger condition that all the roots have negative real part is preserved.
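To see one step of this concretely, take my earlier example $p(x) = (x+1)^4$ again: there $c_1 = \frac{1}{4}$ and $R(x) = 5x^2 + 1$, so
$$g(x) = p_{\mathrm{odd}}(x) + R(x) = 4x^3 + 5x^2 + 4x + 1 = p(x) - \tfrac{1}{4}\,x\,p_{\mathrm{odd}}(x),$$
and one can check that all the roots of this $g$ again have negative real part.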
Further Down the Polynomial Path
(Routh-Hurwitz) Call a polynomial stable if all its roots have negative real part. A polynomial $p$ with real coefficients is stable if and only if $g$ (constructed from it as above) is stable, and $a_n$ and $a_{n-1}$ have the same sign.
This is apparently a famous theorem from control theory, which is where the name "stable" comes from. (Hint: it does not refer to a building in which horses are kept.)
There are various proofs of this; I will present an elementary one due to Gjerrit Meinsma of the University of Newcastle.
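As an aside, here is a quick computational sketch of my own (not from Meinsma's paper) of how the criterion turns into a recursive test; the function name and layout are just for illustration.

```python
from fractions import Fraction

def is_stable(coeffs):
    """True iff every root of coeffs[0]*x^n + ... + coeffs[-1] has negative
    real part, using the reduction p -> g = p - (a_n/a_{n-1}) x p_odd from
    the theorem above. Assumes coeffs[0] != 0."""
    coeffs = [Fraction(c) for c in coeffs]
    if len(coeffs) == 1:
        return True                               # a nonzero constant has no roots
    if coeffs[1] == 0 or coeffs[0] * coeffs[1] < 0:
        return False                              # first two coefficients must share a sign
    c1 = coeffs[0] / coeffs[1]
    g = list(coeffs[1:])                          # g has degree one less than p
    for j in range(1, len(g), 2):                 # subtract c1 * x * p_odd
        if j + 2 < len(coeffs):
            g[j] = coeffs[j + 1] - c1 * coeffs[j + 2]
    return is_stable(g)

print(is_stable([1, 4, 6, 4, 1]))   # (x+1)^4: True
print(is_stable([1, -1, 1]))        # x^2 - x + 1, roots with positive real part: False
print(is_stable([1, 0, 1]))         # x^2 + 1, roots on the imaginary axis: False
```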
Before anything else, I will once again assume the degree is even (out of laziness). We will define the polynomial
$$p_\lambda(x) = p(x) - \lambda\,\frac{a_n}{a_{n-1}}\,x\,p_{\mathrm{odd}}(x),$$
where $\lambda$ is some real number between $0$ and $1$. Note that $p_0 = p$, while $p_1 = p - c_1 x\, p_{\mathrm{odd}} = p_{\mathrm{odd}} + R = g$.
The main ingredient of the proof is showing that the polynomials in the family $p_\lambda$ all have exactly the same roots on the imaginary axis (that is, roots of the form $bi$, where $b$ is real and $i = \sqrt{-1}$).
To do this, write $p_\lambda$ so that it is the sum of an odd and an even polynomial: its odd part is just $p_{\mathrm{odd}}$, and its even part is $p_{\mathrm{even}} - \lambda\frac{a_n}{a_{n-1}}x\,p_{\mathrm{odd}}$. For any purely imaginary number, the even part of $p_\lambda$ takes only real values and the odd part takes only imaginary values. Thus, any imaginary root of $p_\lambda$ must also be a root of both parts, which is the case iff it is also a root of both $p_{\mathrm{even}}$ and $p_{\mathrm{odd}}$ (with the same multiplicity, which can be shown by differentiating).
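Concretely (writing out the step in my own notation): for a purely imaginary number $x = bi$,
$$p_\lambda(bi) = \underbrace{\Bigl(p_{\mathrm{even}}(bi) - \lambda\,\tfrac{a_n}{a_{n-1}}\,(bi)\,p_{\mathrm{odd}}(bi)\Bigr)}_{\text{real}} + \underbrace{p_{\mathrm{odd}}(bi)}_{\text{purely imaginary}},$$
so $bi$ is a root of $p_\lambda$ exactly when $p_{\mathrm{odd}}(bi) = 0$ and $p_{\mathrm{even}}(bi) = 0$, a condition which does not involve $\lambda$ at all.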
Therefore, as $\lambda$ varies from $0$ and approaches $1$, the roots of $p_\lambda$ vary continuously also. If at any point the number of roots in the left half-plane were different, a root would have to cross over the imaginary axis, which is impossible, since the imaginary-axis roots are pinned down. So the number of roots in the left half-plane remains constant. The only time it may change is when $p_\lambda$ changes degree, which happens exactly when $\lambda = 1$.
But note that as $\lambda$ approaches $1$, the leading coefficient $(1 - \lambda)a_n$ of $p_\lambda$ approaches $0$, and $n - 1$ of the roots of $p_\lambda$ approach those of $g$, during which none cross the imaginary axis. From Vieta's we have that the sum of the roots must be $-\frac{a_{n-1}}{(1-\lambda)a_n}$, which goes to either positive or negative infinity depending on the sign of $\frac{a_{n-1}}{a_n}$, and so the last root also tends to either positive or negative infinity. If $a_n$ and $a_{n-1}$ have the same sign, it goes to negative infinity, and therefore $p$ is stable iff $g$ is stable and the first two coefficients of $p$ have the same sign. (Note: the condition that the signs agree is needed only when going back up from $g$ to $p$. We know that it is superfluous when going down, since a stable polynomial has all of its coefficients of the same sign.)
Sanity Check
To recap, we have shown that if the minimal polynomial of $r$ has all of its roots in the complex right half-plane, then $1$ may be expressed as a continued fraction in $r$, obtained by expanding the even part of the polynomial over the odd part (or vice versa, depending on the parity of the degree), whose coefficients are all positive. We can also clearly reason that the coefficients are rational, since they are ratios of coefficients of polynomials with rational coefficients. And this leads directly to the desired tiling.
Perhaps some readers are still puzzled as to how we obtain a tiling: I will demonstrate. Take for example the ratio $r = 2 + \sqrt[5]{2}$. Clearly the minimal polynomial for $r$ is $(x - 2)^5 - 2$.
The roots of this are $2 + \sqrt[5]{2}\,\omega^k$, where $\omega$ is a fifth root of unity and $k$ varies from $0$ to $4$, all of which have positive real part because $\sqrt[5]{2}$ is small (in particular, smaller than $2$).
Expanding, we have that $(x-2)^5 - 2 = x^5 - 10x^4 + 40x^3 - 80x^2 + 80x - 34$. Since $p(r) = 0$, we have $p_{\mathrm{odd}}(r) = -p_{\mathrm{even}}(r)$, and carrying out the long division on $\frac{p_{\mathrm{odd}}(x)}{-p_{\mathrm{even}}(x)}$ gives
$$1 = \frac{r}{10} + \cfrac{1}{\dfrac{5r}{16} + \cfrac{1}{\dfrac{512\,r}{897} + \cfrac{1}{\dfrac{4023045\,r}{4104176} + \cfrac{1}{\dfrac{256511\,r}{152490}}}}}.$$
Hopefully I did the calculations right. Anyway, we construct starting from the bottom of the fraction and work our way up. So we start with a rectangle of ratio $\frac{256511}{152490}\,r$ (the innermost term), which we know we can create. Then we adjoin to the side corresponding to $1$ a rectangle with ratio $\frac{4023045}{4104176}\,r$, where the side corresponding to $1$ in the latter is the one touching. In this way, we get the sum $\frac{4023045}{4104176}\,r + \cfrac{1}{\frac{256511}{152490}\,r}$. Eventually, we get something like this:
Here the rectangles are colour-coded in the order of the rainbow, starting from red and going upward (orange is the second rectangle we adjoin, and so on).
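For readers who would like to double-check those coefficients (and the claim that the continued fraction really does evaluate to $1$ at $r$), here is a small computational sketch of mine; the helper name wall_coefficients is made up for this post.

```python
from fractions import Fraction

def wall_coefficients(coeffs):
    """Continued-fraction coefficients c_1, c_2, ... obtained by repeatedly
    long-dividing one parity part of a polynomial by the other.
    coeffs are given in descending order of degree."""
    coeffs = [Fraction(c) for c in coeffs]
    num, den = coeffs[0::2], coeffs[1::2]   # part with the leading term, then the rest
    cs = []
    while den:
        c = num[0] / den[0]
        cs.append(c)
        rem = []                            # remainder of num - c*x*den
        for i in range(1, len(num)):
            rem.append(num[i] - c * den[i] if i < len(den) else num[i])
        num, den = den, rem
    return cs

# Use q(x) = -p(-x) for p(x) = (x-2)^5 - 2, so that every coefficient comes out positive.
cs = wall_coefficients([1, 10, 40, 80, 80, 34])
print(cs)                                   # the five positive rationals displayed above

r = 2 + 2 ** 0.2                            # the ratio 2 + 2^(1/5)
value = 0.0
for c in reversed(cs):                      # evaluate the continued fraction bottom-up
    value = float(c) * r + (1.0 / value if value else 0.0)
print(value)                                # approximately 1.0
```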
Diligent readers might realise that we are actually only half-done with the problem! We have proven that algebraic numbers whose conjugates all have positive real part work, but we didn't prove that these are the only numbers which work. Unfortunately, proving the opposite direction requires more theory than I'm willing to go into in this article. It is, however, similar in spirit to the proof of the classical problem of tiling rectangles with squares.
Ending Thoughts
I'm not really sure what the key takeaway from this story is supposed to be. Maybe it's "Try not to have late-night thoughts, lest they haunt you in your sleep." For those wondering, I did indeed do quite well for my practical exam.
Maybe there doesn't need to be a takeaway, and we can simply appreciate the proof presented here: the way in which it takes us on a journey from combinatorics and geometry, through continued fractions, to polynomials. I personally found the continuity argument used to prove the Routh-Hurwitz criterion quite enlightening. A good proof should be like an artwork framed in a museum - evoking awe and emotion through its beauty.