$\begingroup$

I am not sure whether this game has an established name; I do not know of one.

There is a two-player game I once played, and I am wondering whether it has a winning strategy. The game is as follows:

Two players each get 6 rocks, numbered 1-6.
Each rock's number corresponds to its weight (6 is heavier than 5, which is heavier than 4, etc.).
Each rock's number also corresponds to its point value (rock 6 = 6 points).
The game is played in rounds, 6 in total.

Each round, a random rock from the referee's (or computer's) pile [which also contains rocks 1-6, like the players' piles] is placed in the middle of the lever by the ref.
Each player then chooses one of their rocks to place onto the lever. (The players do not show their choices until they are revealed on the lever.)

The heavier of the two rocks played wins the round, and the winning player receives the points for all the rocks on the lever (their own number + the random ref number + the opponent's number).

Rocks used in a round cannot be used in later rounds. This includes the referee's rocks that go into the middle (so if the ref uses rock 6, then the next round's middle rock is drawn at random from 1-5).

If there is a tie, the rocks remain on the lever for the next round.
The player with the most points at the end of the game (6 rounds) wins!
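For concreteness, here is a minimal Python sketch of the round mechanics described above (the function name and the strategy-function signature are my own; each strategy sees its own remaining rocks, the opponent's remaining rocks, the upturned ref rock, and any carried-over pot):

```python
def play_game(ref_order, strat_a, strat_b):
    """Play one game. ref_order is the order the referee's rocks are revealed
    (a permutation of 1..n). Each strategy is a function
    (my_rocks, opp_rocks, ref_rock, pot) -> chosen rock.
    Returns (score_a, score_b). A tied round leaves the pot on the lever;
    a tie in the final round means nobody collects it."""
    n = len(ref_order)
    rocks_a, rocks_b = list(range(1, n + 1)), list(range(1, n + 1))
    score_a = score_b = 0
    pot = 0  # points carried over from tied rounds
    for ref_rock in ref_order:
        a = strat_a(rocks_a, rocks_b, ref_rock, pot)
        b = strat_b(rocks_b, rocks_a, ref_rock, pot)
        rocks_a.remove(a)
        rocks_b.remove(b)
        pot += ref_rock + a + b
        if a > b:
            score_a, pot = score_a + pot, 0
        elif b > a:
            score_b, pot = score_b + pot, 0
        # on a tie the pot simply stays on the lever
    return score_a, score_b
```

For example, "always play your lowest" against "always play your highest" with ref order 1..6 comes out 36-27 in favor of the ascending player, and two identical deterministic strategies tie every round, so neither scores.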

My question is:

Is there a possible winning strategy to this game? [Ties will be counted as wins]

Alternatively, is there a strategy which can give you the highest chance of winning?

Assume the strategy may be needed in either situation:

  • Both players are using the optimal strategy.
  • Your opponent is not following the optimal strategy.

If you were both given more or fewer rocks (e.g. 1-10, or 1-4), would this change the situation?

$\endgroup$
  • $\begingroup$ If one person were required to place his rocks before the other, and the other knew what the first person placed, the latter person could always win. As such, there is no optimal strategy against an arbitrary opponent. The only meaningful questions are whether there is any partially-random strategy against which no alternative strategy would have either a greater-than-50% probability of winning, or a higher expected value, and I don't know how to answer that. $\endgroup$
    – supercat
    Commented Jun 8, 2015 at 17:57
  • $\begingroup$ You don't say what happens if there's a tie in the last round. Also, there's no way a "winning" strategy can exist in a symmetrical game. I'd suggest that you make the game asymmetrical such that on each turn player #1 must announce a procedure for selecting a rock based upon some dice rolls or other probability-generating mechanism, then player #2 picks his rock, and then a rock is chosen using player #1's announced procedure. The question would then be whether player #1 could formulate a non-losing strategy subject to those constraints, regardless of what player #2 does. $\endgroup$
    – supercat
    Commented Jun 8, 2015 at 18:27
  • $\begingroup$ @supercat A tie on the last round leaves the rocks on the lever, so no one gets those points. $\endgroup$
    – Mark N
    Commented Jun 8, 2015 at 18:28
  • $\begingroup$ That rule adds the interesting wrinkle that if two players each choose numbers randomly, their expected value will be less than what could be achieved by two players working cooperatively. Still, I think the problem would be made more tractable by making the players' choices asymmetric and, at least initially, having the referee's rocks played in a fixed order. The question "Is there any strategy by #1 which #2 couldn't beat even if #2 knew the procedure for picking rocks" is tractable without having to worry about whether #2's strategy will affect #1's. $\endgroup$
    – supercat
    Commented Jun 8, 2015 at 18:38
  • $\begingroup$ This sounds kind of like a Blotto game, played iteratively. $\endgroup$ Commented Jun 9, 2015 at 2:36

4 Answers

$\begingroup$

Did a little bit of simulation for this, using 8 players:

  • stratrand: chooses randomly every time
  • stratav-high: chooses the lowest rock that is greater than the average of the other player's remaining rocks. If no such rock exists, chooses its highest rock.
  • stratav-low: chooses the lowest rock that is greater than the average of the other player's remaining rocks. If no such rock exists, chooses its lowest rock (to try to keep high rocks for later rounds).
  • stratmatch: matches the upturned rock
  • stratmax: always chooses the maximum rock available (plays 6 -> 1)
  • stratmin: always chooses the minimum rock available (plays 1 -> 6)
  • stratnext-high: chooses the lowest rock that is higher than the upturned rock. If none is available, chooses its highest rock.
  • stratnext-low: chooses the lowest rock that is higher than the upturned rock. If none is available, chooses its lowest rock.
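A few of these reactive strategies can be sketched as Python functions (the names and the signature are my own; each receives the player's remaining rocks, the opponent's remaining rocks, the upturned rock, and the carried pot, and returns the rock to play):

```python
def strat_match(mine, opp, ref_rock, pot):
    """'match': play the rock equal to the upturned rock (falling back to the
    closest remaining rock if the exact match was already spent)."""
    return min(mine, key=lambda r: abs(r - ref_rock))

def strat_next_high(mine, opp, ref_rock, pot):
    """'next-high': lowest rock strictly above the upturned one, else highest."""
    higher = [r for r in mine if r > ref_rock]
    return min(higher) if higher else max(mine)

def strat_next_low(mine, opp, ref_rock, pot):
    """'next-low': lowest rock strictly above the upturned one, else lowest."""
    higher = [r for r in mine if r > ref_rock]
    return min(higher) if higher else min(mine)
```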

As all of these are purely reactive, we can evaluate them over every possible ordering of upturned rocks. For interactions with the random player, I used a Monte Carlo simulation. For each "battle", we can create a results space:

[Image: results space for each battle, plotted by AverageOpen vs. AverageClose]

"AverageClose" is the average of the last two rocks upturned. "AverageOpen" is the average of the first two stones upturned. A game that started with two high rocks and ended with two low rocks would be on the bottom right of this graph. Games with equivalent coordinates (i.e 1,6,2,5,3,4 and 6,1,5,2,4,3) were averaged.

[Image: grid of outcome heatmaps, one per strategy pairing]

Each tile represents a different outcome, with green meaning that the x-axis player won and red meaning the y-axis player did. Gold means either that every outcome was a tie, or that there was an equal proportion of wins and losses across the "battles" at that coordinate.

Some interesting observations (although by no means exhaustive):

  • Almost every strategy beats the random one over time, although max/min suffer if the opening rocks are low/high respectively.
  • Next looks like the best strategy against the field presented (I'm sure there is a strategy that can beat it). I think this is mainly because it forces conservation of your high rocks, paced by the upturned rocks.
  • The match strategy looks fairly good, although it is beaten (by design) by the "next" strategies.
  • Min/max counter each other, depending on the pattern of upturned rocks.
  • There is no real difference between the average strategies, and neither seems that great.
$\endgroup$
  • $\begingroup$ Thanks for taking the time to put together this answer. The data you provided really helps illustrate some of the features in this problem! $\endgroup$
    – Mark N
    Commented Jun 10, 2015 at 11:18
$\begingroup$

Thanks to Meelo I was able to find the game of which this is a variant (and to arrive at this answer).

That game is called Goofspiel (considered a Game of Pure Strategy [GOPS]), in which two or more players each receive one suit from a deck of cards, and a 'prize' suit serves as the intermediary. The prize suit is shuffled and one of its cards is placed between the players; each player then plays one card from their own suit. The highest card wins the 'prize' card in the middle, and the cards played are discarded. This repeats until all the cards have been used. Points are given by card values (Ace = 1, King = 13). Variants differ in how draws are handled (splitting, ignoring, 'rolling over', etc.).

According to the Wiki page, "Any pure strategy in this game has a simple counter-strategy", which demonstrates that:

A) The number of initial options each player has (>1) does not affect the existence of such counter-strategies (e.g. 6 rocks or 13 cards).

B) Any 'optimal' strategy can be countered if the opponent knows your strategy. If the opponent does not know your strategy (or psychology), then the outcome is determined by the differences between your strategies as well as the randomness of the 'prize' card (or referee rock) [assuming both players stick to their initial algorithms/strategies and that each algorithm has finitely many states].
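The counter-strategy claim can be made concrete in the rock version. Suppose your opponent is known to always match the upturned rock: then you can beat them by playing one higher, conceding only the round where the maximum rock is upturned (a sketch; the function names are my own):

```python
def counter_to_match(mine, ref_rock):
    """Counter to a known 'always match' opponent: beat their rock (which we
    know equals ref_rock) by one, and dump our lowest rock on the one round
    we concede (when the maximum rock is upturned). Each round u consumes
    rock u+1, and the conceded round consumes rock 1, so the needed rock is
    always still in hand regardless of the ref order."""
    if ref_rock + 1 in mine:
        return ref_rock + 1
    return min(mine)

def scores_vs_match(ref_order):
    """Walk through a 6-rock game of counter_to_match vs. the matcher,
    using the rock game's scoring (winner takes all rocks on the lever)."""
    mine, theirs = list(range(1, 7)), list(range(1, 7))
    me = them = 0
    for ref in ref_order:
        a = counter_to_match(mine, ref)
        b = ref  # the matcher always plays the upturned rock
        mine.remove(a)
        theirs.remove(b)
        points = ref + a + b
        if a > b:
            me += points
        else:
            them += points
    return me, them
```

No round is ever tied, and the counter collects every round except the one where the 6 is upturned, so it wins 50-13 whatever order the referee's rocks appear in.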

In summary, the only way a player can have a 'better' strategy is to know his opponent's strategy. This is why GOPS can be considered to rest more on the psychology of your opponent than on pure tactics and predefined decisions.

Similar to JonTheMon's answer, the mathematician Sheldon Ross considered the case where one player plays his cards at random, in order to determine the best strategy for the other player. Using induction on the number of cards, Ross showed that the optimal strategy for the non-randomizing player is to match the upturned card, i.e. if the upturned card is the Jack, he should play his Jack, etc. In this case, the expected final score is 59½ - 31½, a 28-point win.
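Ross's matching result can be checked numerically. The sketch below is my own code, assuming the convention that a tied prize is split (which makes the two expected scores sum to 91); note that in Goofspiel only the prize card's value is scored, not the cards played:

```python
import random

def goofspiel_match_vs_random(games=20000, n=13, seed=1):
    """Monte Carlo: 'match the upturned card' vs. a uniformly random player.
    The round's winner scores the prize card's value only; a tied prize is
    split (an assumption -- Goofspiel variants handle ties differently).
    Returns the two players' average scores per game."""
    rng = random.Random(seed)
    total_match = total_rand = 0.0
    for _ in range(games):
        prizes = list(range(1, n + 1))
        rng.shuffle(prizes)
        rand_hand = list(range(1, n + 1))
        for prize in prizes:
            mine = prize                    # the matcher's card is always available
            theirs = rng.choice(rand_hand)  # the random opponent's card
            rand_hand.remove(theirs)
            if mine > theirs:
                total_match += prize
            elif theirs > mine:
                total_rand += prize
            else:                           # tie: split the prize
                total_match += prize / 2
                total_rand += prize / 2
    return total_match / games, total_rand / games
```

Under this tie convention a short calculation confirms the figure exactly: the random opponent's card is uniform on 1..13 each round, so the matcher expects $\frac{1}{13}\sum_{p=1}^{13} p\,(p-\tfrac12) = 59\tfrac12$, and the simulation lands close to 59½ - 31½.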

Source: http://en.wikipedia.org/wiki/Goofspiel

Interesting side note:

GOPS can be related to the micro-trading algorithms used in today's stock markets! These algorithms act on assumptions about other traders/algorithms to try to benefit themselves. In this analogy the price of option X is the 'referee' in the middle; here, however, the price can also be influenced by the players!

$\endgroup$
  • $\begingroup$ Perhaps you meant to thank Meelo (who linked the article on Blotto) rather than me $\endgroup$ Commented Jun 9, 2015 at 14:36
  • $\begingroup$ @JulianRosen Thanks, Fixed! Too many comments in the main section....I got them mixed up $\endgroup$
    – Mark N
    Commented Jun 9, 2015 at 14:42
$\begingroup$

Normally, you'd want to play a rock around the value of the ref rock; you don't want to waste a 6 on a 1-point ref rock.

However, things do get interesting if there is a collision. Say the ref rock is 5 and you both play your 5. With the tie, 15 points carry over to the next round. Naturally, you'd both collide again with 6. Then 4. Then 3. Then 2. Then 1. The whole game is a tie.

Say instead that, when the opponent plays their 5 on the 5 ref rock, you take the round with your 6: you get 16 points (5+5+6) and still hold your 5. Presumably the opponent will then take the 6 ref rock while you throw in a 1 (13 points to them). Now you hold 5,4,3,2 against their 4,3,2,1, so you're likely to win. By covering their expected play, you turned a 6-for-1 trade in your favor.

Let's take another important ref rock, the 6. Most of the time you'd use your own 6 to cover it, but then the opponent could use a 1 to gain future position. You could instead use a 2 to maintain your own position, and so on and so forth. This likely ends as a 50-50 proposition.

$\endgroup$
  • $\begingroup$ It seems that some of these situations can reduce into "What if they think that I think that they think....etc" $\endgroup$
    – Mark N
    Commented Jun 8, 2015 at 18:14
  • $\begingroup$ Yeah, pretty much. It would be interesting if there were some sort of "you go first". Maybe A, B, B, A, A, B(win ties). Or even alternating winning ties? $\endgroup$
    – JonTheMon
    Commented Jun 8, 2015 at 18:18
  • $\begingroup$ I am interested in what happens if you throw some probability and randomness into the stew that is possible solutions? $\endgroup$
    – Mark N
    Commented Jun 8, 2015 at 19:04
$\begingroup$

I would place them in order 1, 2, 3, 4, 5, 6. You will lose some rounds, but you are guaranteed at least 3 (counting one as a tie).

$\endgroup$
  • $\begingroup$ There is never a guaranteed amount of round wins unless you and your opponent have predefined moves such that they let you win x amount of times. $\endgroup$
    – Mark N
    Commented Jun 9, 2015 at 15:21
  • $\begingroup$ @Mark N ah I may have misunderstood. I was thinking you do: 1, 2, 3, 4, 5, 6 and your opponent does the opposite 6, 5, 4, 3, 2, 1. That is the best way to get at least two (I thought...) You would tie one and win two, at least. Ex: 1, 2, 3, 4, 5, 6 and 4, 2, 1, 6, 3, 5. One tie, Three win, Two loss... $\endgroup$ Commented Jun 9, 2015 at 15:42
  • $\begingroup$ What if your opponent plays 234561? $\endgroup$
    – Deusovi
    Commented Jul 15, 2015 at 19:39
