Agile Games – Battleships

No plan of operations extends with certainty beyond the first encounter with the enemy’s main strength – Generalfeldmarschall Helmuth von Moltke

This is a game I use to introduce people to iterative development. The aim is to help players understand that large up-front plans become pointless almost the minute they are created. Helmuth (quoted above) was, for me, one of the first proponents of an iterative approach. He didn’t create grand plans, but nor did he disregard planning altogether: he was smart enough to plan just enough and then adjust that plan to meet the changing reality. In my experience of projects with large up-front plans, the opposite often happens: the owner of the plan tries to adjust reality to fit the plan.

So how do you play it?

Simple: one player (Player A) is given 40–45 pegs and must place their ships and all of their planned attacks up front. The second player (Player B) places their ships at the same time. Player A then reads out all the pre-planned attacks in one go and is told the hits and misses. Player B then plays each of their attacks (up to the same 40–45) separately, getting feedback on hits and misses after every shot.

It’s pretty obvious what will happen: most of the time Player B’s turn-based (iterative) play allows them to adjust and change their plans as they score hits. Player A (large plan) will score hits, but is far less likely to sink all of the opponent’s ships. Player B, being limited to the same number of moves, may or may not sink the whole fleet either, but will typically score more hits than Player A.
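You can check this dynamic with a quick simulation. The sketch below is my own addition, not part of the original game write-up: it assumes a standard 10×10 board, the classic five-ship fleet, 40 shots per player, and gives the iterative player a simple “hunt and target” rule (shoot randomly until a hit, then try neighbouring cells). The exact strategy and numbers are assumptions for illustration.

```python
import random

SHIP_SIZES = [5, 4, 3, 3, 2]  # classic Battleship fleet (17 cells)
BOARD = 10
SHOTS = 40

def place_ships(rng):
    """Randomly place the fleet; return the set of occupied cells."""
    occupied = set()
    for size in SHIP_SIZES:
        while True:
            horiz = rng.random() < 0.5
            r = rng.randrange(BOARD if horiz else BOARD - size + 1)
            c = rng.randrange(BOARD - size + 1 if horiz else BOARD)
            cells = {(r, c + i) if horiz else (r + i, c) for i in range(size)}
            if not cells & occupied:  # retry on overlap
                occupied |= cells
                break
    return occupied

def planned_player(ships, rng):
    """Player A: all shots chosen up front, no feedback."""
    shots = rng.sample([(r, c) for r in range(BOARD) for c in range(BOARD)], SHOTS)
    return sum(1 for s in shots if s in ships)

def iterative_player(ships, rng):
    """Player B: hunt randomly; after a hit, target neighbouring cells."""
    untried = {(r, c) for r in range(BOARD) for c in range(BOARD)}
    targets = []  # cells adjacent to known hits, tried first
    hits = 0
    for _ in range(SHOTS):
        shot = targets.pop() if targets else rng.choice(sorted(untried))
        untried.discard(shot)
        if shot in ships:
            hits += 1
            r, c = shot
            for n in [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]:
                if n in untried and n not in targets:
                    targets.append(n)
    return hits

if __name__ == "__main__":
    rng = random.Random(42)
    trials, planned_total, iterative_total = 200, 0, 0
    for _ in range(trials):
        ships = place_ships(rng)
        planned_total += planned_player(ships, rng)
        iterative_total += iterative_player(ships, rng)
    print(f"avg hits, planned up front: {planned_total / trials:.1f}")
    print(f"avg hits, iterative:        {iterative_total / trials:.1f}")
```

Over a few hundred games the iterative player reliably averages more hits than the up-front planner; the exact margin depends on the targeting rule, but the feedback advantage shows up every time.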

What does this tell us?

Predictive planning is unreliable; it is akin to gazing into crystal balls, reading tea leaves or any other clairvoyant technique you care to mention. Ultimately you are predicting the future across a vast number of possible outcomes, effects and variables. The iterative approach is empirical: each time Player B hits one of his opponent’s ships, he can instantly change plans and target nearby locations to sink it.

There are a few other parallels you can draw from it. Player A will often score hits without sinking anything, which is akin to features that were developed but not tested before time or money ran out. Player B may not hit all the ships, but is much more likely to sink the ones he does hit, making it a metaphor for “potentially shippable product increments” (no pun intended).

Occasionally Player A wins. This is a bit like the fortune teller coincidentally getting something right: it’s often more luck than judgement.

9 thoughts on “You sunk my methodology”

  1. Love this game. I’ve used it on several occasions to demonstrate the consequences of a lack of feedback. You can also play in “iterations” without any feedback, which I often see companies trying to do. It becomes very obvious that playing with “iterations” but without feedback is the same as a large, up-front plan. I think this game is especially helpful when working with product owners.

  2. A product owner friend just pointed me to this blog entry… and I guess great minds think alike! I’ve never seen your post, but I wrote a similar exercise contrasting automated testing and manual exploratory testing that I ran at the Agile2013 conference. I love your application of it to product. 🙂

  3. @Mark Yeah, I’ve played around with batches instead of shot-by-shot; it works fine like that too, and you can vary the batch size from game to game to show the feedback cycle. Longer batches (iterations) give poorer results.

    @ken I love the value aspect of the shots. That’s a great adaptation. Perhaps there could be an ROI figure for hits?

  4. James, we did this exercise with a large group last Friday and it worked great! We created a board with half the squares, which kept it moving quickly. We gave a budget of $25K with each bomb costing $1K. This enhanced the metaphor, and we got the expected results. Thanks for a great game!

  5. Maybe, to strengthen the analogy with a sprint, let the ‘iterative shooter’ shoot two (or three) pegs at a time. This way a peg maps better onto a user story, but we still show the power of Agility.

  6. Hi Don

    Much of the play testing I did with colleagues never seemed to last much longer than 30 minutes. Most of the remaining time is taken up with explaining the rules and reflecting on the outcomes.

    One of the reasons we limited the number of playable pegs was to minimize the time it takes to run a session.

    I’d be interested to see what the community can do to shorten it.

    James

  7. Thanks James, this seems very interesting.
    My only concern is that it could take a long time to play. Have you ever implemented a smaller, quicker version of the game?

Comments are closed.