Probability and Spirographs


It would be great if I wrote a post actually combining probability and spirographs, but that’s not what this is. This is two completely different topics, joined together by the fact that they both elicited conversation during or after dinner last night.

The Probability Problem:
Suppose your school collects Box Tops, and to encourage you to turn them in, for each 10 you turn in each month you’re entered into a drawing for a Webkinz (so if you turn in 20, you’re entered twice).  The least well formed question is:  if you were able to generate at least one group of 10 per month, is it better to enter them once a month, or to save them all until the end in the hopes of maximizing the opportunity for that single month?  Feel free to put answers in the comments!

I have an idea as to the answer to this in simplest terms, and also an idea as to the answer in practical terms.  In reality, we work on the system of turning them in whenever we remember, which is something between the two extremes.

The Spirograph Site:
I was surfing the web, and found a site called Spirograph Math (which is actually part of a larger site, but this is the game that occupied me).  You have one circle going around the other, and you can trace the design like a spirograph.  You can also pause, change the color, etc.  It draws pretty pictures like this:

I think you could use math as an excuse for playing with it. For example, can you predict how many times the second circle will go around the first before repeating, or how many lines of symmetry the final figure will have? (One thing to note: the Pen Position is set relative to the center of Radius B. If the Pen Position matches Radius B, the pen is on the edge of the second circle; if it’s larger, the pen is outside that circle, and if it’s smaller, inside.)
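If the two radii are whole numbers, both questions come down to their greatest common divisor. Here’s a sketch in Python (the function name is mine, and I’m assuming the usual spirograph setup, where circle B rolls around circle A):

```python
from math import gcd

def spiro_counts(radius_a, radius_b):
    """Predict a spirograph figure's structure from the two radii.

    Returns (revolutions, lobes): how many trips circle B makes
    around circle A before the curve closes up, and how many
    rotationally symmetric lobes the finished figure has.
    """
    g = gcd(radius_a, radius_b)
    return radius_b // g, radius_a // g

# Radius A = 96, Radius B = 36: gcd is 12, so the curve closes
# after 3 revolutions and the figure has 8 lobes.
print(spiro_counts(96, 36))  # → (3, 8)
```

Note that the Pen Position doesn’t appear anywhere: it changes the shape of each lobe, not how many there are.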

6 Responses to “Probability and Spirographs”

  1. Sue Says:

    On first thought (which is not reliable with probability problems, I know), I figure you’re competing against yourself when you enter twice in one month, so better to turn them in on different months.

  2. Michael Lugo Says:

    I thought the same thing as Sue. But I’m currently reading The Monty Hall Problem, Jason Rosenhouse’s new book which explores many variations of the Monty Hall problem, so I’m especially skeptical of my probabilistic intuition right now.

    To formalize the question, let’s say you’re able to make two entries, either one in the first month and one in the second month, or both in the second month. (You’re not allowed to make both entries in the first month because you have to collect the box tops first.) Let’s say that other than you, x entries will be made in the first month, and y in the second month. Finally, let’s say that the thing we want to maximize is your expected number of wins.

    Then under the first strategy, your expected number of wins is 1/(x+1) + 1/(y+1); under the second strategy, it’s 2/(y+2).
    If we assume x = y, then the first of these is 2/(y+1) and the second is 2/(y+2), so spreading out your bets is better, but not by much. Without that assumption, the first strategy is better when

    1/(x+1) + 1/(y+1) > 2/(y+2)

    which, after some algebra, is true when

    y^2 + 2y + 2 - xy > 0

    or, equivalently (dividing both sides by y, which is positive) when

    y + 2 + 2/y > x.

    and moving things around, that’s

    2 + 2/y > x-y.

    Assuming y ≥ 3 (so that 0 < 2/y < 1), that’s equivalent to y ≥ x-2: you should spread out your bets whenever the number of entrants in the second month is at least the number of entrants in the first month, minus two. The basic form of this isn’t surprising (you want to enter in months with fewer entrants, if possible), but the “up to two fewer” isn’t something that’s easily guessed. (It’s also useless, since you don’t know the number of entrants ahead of time.)
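    A brute-force check of that algebra (a Python sketch; exact fractions avoid any floating-point doubt):

```python
from fractions import Fraction

def expected_wins_split(x, y):
    # one entry in each month's drawing (x, then y, rival entries)
    return Fraction(1, x + 1) + Fraction(1, y + 1)

def expected_wins_saved(y):
    # both entries in the second month's drawing
    return Fraction(2, y + 2)

# the derived condition: splitting wins exactly when y^2 + 2y + 2 > x*y
for x in range(1, 40):
    for y in range(1, 40):
        better_split = expected_wins_split(x, y) > expected_wins_saved(y)
        assert better_split == (y * y + 2 * y + 2 > x * y)
```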

    Also, what sort of figures do you get if you just randomly move around the sliders?

  3. Ξ Says:

    Sue, that was TwoPi’s gut feeling as well, though mine had been to save them up.

    Michael, that’s interesting about the 2. I would have guessed 1, but it must come from the fact that your own entries change the totals in each drawing. So now, although I think the answer to the question doesn’t change, I’m wondering how the parameters change if you have something like 3 months.

    (My practical answer was to wait until June: it’s a shorter school month and full of distractions, so there’s probably less competition. Of course, if everyone did that it wouldn’t matter — this strategy would only work if you’re the only one using it.)

    As for the sliders, the symmetries have to do with the relationship between the two radii (after factoring out the gcd); moving the pen changes the figure (e.g. loops or dents), but not the symmetry.
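    That invariance is easy to check numerically. A sketch assuming the standard hypotrochoid parametrization (the site’s exact equations may differ): rotating the curve by one symmetry step should map it onto itself for any pen offset d.

```python
import cmath
import math

def spiro_point(R, r, d, t):
    # hypotrochoid: pen at distance d from the center of a circle of
    # radius r rolling inside a fixed circle of radius R
    return (R - r) * cmath.exp(1j * t) + d * cmath.exp(-1j * (R - r) / r * t)

R, r = 96, 36
order = R // math.gcd(R, r)      # 8 rotational symmetries in total
alpha = 2 * math.pi * r / R      # one symmetry rotation of the curve
for d in (10, 36, 50):           # pen inside, on, and outside circle B
    for t in (0.0, 0.7, 2.3):
        rotated = cmath.exp(1j * alpha) * spiro_point(R, r, d, t)
        assert abs(spiro_point(R, r, d, t + alpha) - rotated) < 1e-9
```

The assertions pass for every d, which is the claim: the pen offset changes the lobes’ shape but not the rotational symmetry.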

  4. Chris Wellons Says:

    I simulated the problem with the Monte Carlo method. Source code (emacs lisp) here:

    It runs both strategies side by side and shows the average total number of wins for each.

    With at least 1 entry per month (occasionally more) against about 100 opposing entries per month, the simulation slightly prefers playing every month. Over 10000 separate runs, playing every month won, on average, a total of 0.1466 times vs. 0.1319 wins for playing it all at once at the end.

    As I decrease the number of opposing entries, playing every month becomes an even better option. Increasing the number of box tops the player gets also increasingly favors playing every month.

    Changing the total number of months doesn’t seem to make much of a difference.

    Playing all at once at the end becomes slightly favorable if there is a lot of competition. That is, if you are competing against more than 1000 times as many other entries each month as you can put in.
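    Since the elisp source isn’t shown above, here’s a minimal Python re-sketch of that kind of simulation (the entry counts and trial counts are my assumptions, not Chris’s exact parameters):

```python
import random

def simulate(strategy, mine_per_month=1, rivals_per_month=100,
             months=2, trials=200_000):
    """Estimate expected wins. 'monthly' enters every month;
    'save' holds every entry for the final month's drawing."""
    wins = 0
    for _ in range(trials):
        if strategy == "monthly":
            for _ in range(months):
                # each month, mine_per_month of the entries are mine
                if random.random() < mine_per_month / (mine_per_month + rivals_per_month):
                    wins += 1
        else:  # save: all my entries compete in one drawing
            total = mine_per_month * months
            if random.random() < total / (total + rivals_per_month):
                wins += 1
    return wins / trials

random.seed(1)
# both estimates land near the analytic values 2/101 and 2/102
print(simulate("monthly"), simulate("save"))
```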

    All of this assumes everyone else is choosing the strategy of playing each month. There can be game theory behind this to make it a lot more complicated than that: do we assume all of the other players are rational agents? Do they assume everyone else is a rational agent? Do they assume that everyone else assumes that everyone else is a rational agent? Might some players collaborate somehow?

    Players might observe other players’ strategies and adjust accordingly.

  5. Ξ Says:

    Chris, thanks for the code! I’m having trouble understanding why the situation *changes* if there’s a lot of competition, though. It makes sense that the difference in strategy doesn’t matter much (you probably won’t win one no matter what), but why would the strategy change?

    [I also realized that one way in which I was unclear was that I didn’t specify if I was trying to maximize the expected number of Webkinz, or to maximize the odds of getting at least one. I don’t think those give the same answer; indeed, a strategy might maximize one at the expense of the other.]
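    For what it’s worth, the two objectives can genuinely disagree. A small check in exact arithmetic, using Michael’s setup (x = 12 rival entries in month one and y = 10 in month two are just illustrative values):

```python
from fractions import Fraction

def stats_split(x, y):
    # one entry in each month's drawing
    expected = Fraction(1, x + 1) + Fraction(1, y + 1)
    at_least_one = 1 - Fraction(x, x + 1) * Fraction(y, y + 1)
    return expected, at_least_one

def stats_saved(y):
    # two entries in one drawing: at most one win is possible,
    # so the two objectives coincide for this strategy
    p = Fraction(2, y + 2)
    return p, p

e_split, p_split = stats_split(12, 10)
e_saved, p_saved = stats_saved(10)
assert e_split > e_saved   # splitting maximizes expected Webkinz here...
assert p_split < p_saved   # ...but saving maximizes P(at least one win)
```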

  6. Carnival of Mathematics #53 is up! « 360 Says:

    […] 360 12 tables, 24 chairs, and plenty of chalk « Probability and Spirographs […]
