Arbitrary decisions? Asserting Order? Exploiting symmetry?

(Wee Kean here.)

Problem:

(2022 C4) Let $n > 3$ be a positive integer. Suppose that $n$ children are arranged in a circle, and $n$ coins are distributed between them (some children may have no coins). At every step, a child with at least 2 coins may give 1 coin to each of their immediate neighbours on the right and left. Determine all initial distributions of coins from which it is possible that, after a finite number of steps, each child has exactly one coin.

Warning: spoilers ahead.

The invariant for this question should be rather natural; it's a classic. But I'm not here to talk about monovariants or invariants. Instead, I wanna explore an idea I had recently (possibly from competitive programming). The idea is pretty vague, but it's an amalgamation of making arbitrary decisions, asserting order and exploiting symmetry.

The setup is similar across the following questions. There is some process or operation which acts on identical objects. During this process or operation, an arbitrary decision is made, and the idea is that we can make this decision less arbitrary by imposing some rule or condition (usually by distinguishing these identical objects). Of course, the resulting end state will still be the same, but now we can gather information about the process by observing the behaviour of these objects.

So, on to this question. Some first thoughts:

  • Determining if it's possible to go from one distribution to another -> probably an invariant. In this case, labelling the positions around the circle modulo $n$ works; the sum of coin positions modulo $n$ is invariant (a quick computational check follows this list).
  • This reduces the question to: for any initial distribution whose sum of coin positions is $\frac{n(n-1)}{2}$ modulo $n$, is it always possible to obtain the distribution where each child has exactly one coin?
  • What's unique about the distribution where each child has exactly one coin? For any arbitrary distribution, is it possible to attain it?
  • We denote this coin redistribution step as an operation on a child for simplicity.
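If you want to sanity-check the invariant computationally, here is a minimal Python sketch (the helper names `position_sum` and `operate` are my own): it repeatedly operates on a random eligible child and verifies that the sum of coin positions modulo $n$ never changes.

```python
import random

def position_sum(dist):
    """Sum of coin positions modulo n, where dist[i] = number of coins child i holds."""
    n = len(dist)
    return sum(i * c for i, c in enumerate(dist)) % n

def operate(dist, i):
    """Child i (holding at least 2 coins) gives one coin to each neighbour on the circle."""
    n = len(dist)
    assert dist[i] >= 2
    dist[i] -= 2
    dist[(i - 1) % n] += 1
    dist[(i + 1) % n] += 1

n = 7
dist = [n] + [0] * (n - 1)      # all n coins start on child 0, say
inv = position_sum(dist)
for _ in range(1000):
    eligible = [i for i in range(n) if dist[i] >= 2]
    if not eligible:
        break
    operate(dist, random.choice(eligible))
    assert position_sum(dist) == inv   # the invariant never changes
print(dist, position_sum(dist))
```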

Playing with the question a bit, you might stumble across the fact that for most distributions (with $n$ coins initially), we may keep applying this coin redistribution operation indefinitely. So, taking a slight detour, we can wonder for which distributions (possibly with fewer than $n$ coins) the operation must eventually halt.
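One way to get a feel for this detour is to brute-force small cases. The sketch below (all names mine) plays random operations with $k$ coins on $n$ children and reports whether the process got stuck within a fixed step budget; it is suggestive, not a proof.

```python
import random

def step(dist):
    """Operate on a random eligible child in place; return False if no child has 2+ coins."""
    n = len(dist)
    eligible = [i for i in range(n) if dist[i] >= 2]
    if not eligible:
        return False
    i = random.choice(eligible)
    dist[i] -= 2
    dist[(i - 1) % n] += 1
    dist[(i + 1) % n] += 1
    return True

def halts_within(dist, budget=10**5):
    """True if random play got stuck within the step budget."""
    dist = dist[:]
    for _ in range(budget):
        if not step(dist):
            return True
    return False

n = 6
for k in range(1, n + 2):                       # k coins, all starting on child 0
    print(k, halts_within([k] + [0] * (n - 1)))
```

For this particular setup it should print `True` for $k \le 5$ and `False` for larger $k$, which hints at the answer to the detour.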

Now back to the main theme of the post. If a child has at least two coins, how do we decide which coins to pass to the left and right? Well, for this question, we don't care! After all, all coins are identical and the state of the problem is decided by how many coins each child has, not which coins they hold.

But as with the theme of the post, we can exploit this symmetry. For example, suppose we denote some arbitrary coin a "left" coin. Whenever a child is holding on to the "left" coin and we operate on them, they will pass this coin to the child on their left. Firstly, convince yourself that this doesn't change the problem and, furthermore, that at any point we can ensure that the "left" coin will be passed to the child on the left. Now, if for some distribution we may apply operations indefinitely, the "left" coin will cycle through the circle of children indefinitely as well (prove this!). This is not very helpful, but it is pretty funny. At this point, try imposing your own conditions and see if you can get any interesting behaviours.
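If you want to watch the marked coin move, here is one way to set it up (a rough sketch; `operate_tracked` and the coin-as-position representation are my own choices): coins are tracked individually by position, and whenever the operated child holds the marked coin, that coin is the one sent left.

```python
import random

def operate_tracked(coins, n, child, marked=0):
    """coins[j] = position (child index) of coin j on a circle of n children.
    Operate on `child`, who must hold at least 2 coins: one coin goes left and
    one goes right. If `child` holds the marked coin, the marked coin goes left."""
    holders = [j for j, pos in enumerate(coins) if pos == child]
    assert len(holders) >= 2
    left = marked if marked in holders else holders[0]
    right = next(j for j in holders if j != left)
    coins[left] = (child - 1) % n
    coins[right] = (child + 1) % n

n = 7
coins = [0] * n                 # all n coins start on child 0; coin 0 is the "left" coin
visited = {coins[0]}
for _ in range(200):
    counts = [coins.count(i) for i in range(n)]
    eligible = [i for i in range(n) if counts[i] >= 2]
    if not eligible:
        break
    operate_tracked(coins, n, random.choice(eligible))
    visited.add(coins[0])
print(sorted(visited))          # children the "left" coin has visited so far
```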

Warning: More spoilers ahead

If you've managed to show that the "left" coin must eventually cycle through every child indefinitely, you probably would have shown that if a child is operated on finitely many times, then their neighbours are also operated on finitely many times (or something to that effect). With this idea in mind, the hope is to control / limit where each coin can go.

Can we confine a coin to a pair of neighbouring children? Arbitrarily choose a coin, say coin X, which is initially on child A, who is next to child B. If we operate on either of these children while they are holding coin X, they will pass it to the other child in this pair (i.e. A -> B, B -> A). We have successfully trapped this coin between these two children!
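Here is a sketch of that trapping rule in code (again, all names are mine): for one fixed pair of neighbours and one fixed coin, whenever the operated child is in the pair and holds the coin, the coin is sent to the other child of the pair; the assertion checks that the coin never escapes.

```python
import random

def operate_trapping(coins, n, pair, coin_x):
    """coins[j] = position of coin j; pair = (A, B) are neighbouring children.
    Operate on a random eligible child. If that child is in the pair and holds
    coin_x, coin_x is sent to the other child of the pair and the second coin
    goes the other way. Returns False if no child can be operated on."""
    counts = [sum(1 for p in coins if p == i) for i in range(n)]
    eligible = [i for i in range(n) if counts[i] >= 2]
    if not eligible:
        return False
    child = random.choice(eligible)
    holders = [j for j, p in enumerate(coins) if p == child]
    left_dest, right_dest = (child - 1) % n, (child + 1) % n
    if child in pair and coin_x in holders:
        other = pair[1] if child == pair[0] else pair[0]
        second = next(j for j in holders if j != coin_x)
        coins[coin_x] = other
        coins[second] = left_dest if other == right_dest else right_dest
    else:
        coins[holders[0]], coins[holders[1]] = left_dest, right_dest
    return True

n = 6
coins = [0] * n                  # all coins start on child 0
A, B = 0, 1                      # neighbouring pair; coin 0 starts on A
for _ in range(500):
    if not operate_trapping(coins, n, (A, B), coin_x=0):
        break
    assert coins[0] in (A, B)    # coin 0 never leaves the pair {A, B}
print("coin 0 ended on child", coins[0])
```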

The idea now is to trap a coin between every pair of neighbouring children. This is slightly trickier than the argument above, since we assumed that the coin was already initially held by one of the children in the pair. Try to show that if each child is operated on infinitely many times, then between every pair of neighbouring children we may eventually trap some coin.

Then, if there exists an infinite sequence of operations, the number of coins is at least the number of pairs of neighbouring children, which is $n$.

Wait but so what?

If you've made it this far and you haven't figured out how this may help solve the problem, don't give up, the solution is close! Here's some motivation:

[image]

Ok, we have seemingly taken a big detour from our problem. So, where's the finish? Well, we now know that with $n-1$ coins, eventually, we will be unable to operate on any child (i.e. the $n-1$ coins will be spread out with each child holding at most one, so exactly one child is empty). So what happens if we remove a coin, randomly operate on the children and then place the coin back?
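If you want to experiment with that last question before reading on, here is a rough sketch (it reuses the `step` helper from the earlier sketch; everything else is made up for illustration): set one coin aside, operate at random until the process gets stuck, and look at the resulting distribution before putting the coin back.

```python
import random

# Assumes `step` from the earlier sketch: operate on a random eligible child,
# returning False once no child holds 2 or more coins.

n = 7
dist = [0] * n
dist[3] = n                      # any starting distribution with n coins

donor = next(i for i in range(n) if dist[i] > 0)
dist[donor] -= 1                 # set one coin aside

while step(dist):                # with n - 1 coins, this must eventually get stuck
    pass

print(dist)   # each child holds at most one coin, so exactly one child is empty
```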

Below are a few problems which I think fit this theme. I would include 2023 ELMO C3 as well but I think the idea is a bit different.

(Chain Reaction) Let $m,n$ be positive integers. Suppose that there is a child in each cell of an $m$ by $n$ rectangular board and $k$ coins are distributed between them (some children may have no coins). Two children are considered neighbouring if the cells they are in share an edge. At every step, a child with at least as many coins as their number of neighbouring children may give 1 coin to each of their neighbours.

  • Determine the smallest $k$ such that regardless of the initial distribution of coins, there exists an infinite sequence of steps.
  • Determine the largest $k$ such that regardless of the initial distribution of coins, there does not exist an infinite sequence of steps.

This is some old SIMO question from memory; I can't find any specific source, but it's based off the game Chain Reaction. The idea is exactly the same.

(2018 C3) Let $n$ be a given positive integer. Sisyphus performs a sequence of turns on a board consisting of $n + 1$ squares in a row, numbered $0$ to $n$ from left to right. Initially, $n$ stones are put into square $0$, and the other squares are empty. At every turn, Sisyphus chooses any nonempty square, say with $k$ stones, takes one of those stones and moves it to the right by at most $k$ squares (the stone should stay within the board). Sisyphus' aim is to move all $n$ stones to square $n$. Prove that Sisyphus cannot reach the aim in less than $$\left\lceil \frac{n}{1} \right\rceil + \left\lceil \frac{n}{2} \right\rceil + \dots + \left\lceil \frac{n}{n} \right\rceil$$ turns. (As usual, $\lceil x \rceil$ stands for the least integer not smaller than $x$.)
