
Posts by David:

    CardUnit: A Unit Testing Simulation

    September 4th, 2017

    It can be difficult for developers to find the motivation or support to write automated unit tests. This simulation aims to demonstrate the value of automated unit tests in identifying and localizing defects. In the simulation, participants play the roles of programs and tests using playing cards.

    Timing: About 30 minutes.
    Materials:

    • 3 decks of standard playing cards for every 8 groups of participants. Each group should have at least 4 participants, and at most 7. 5 or 6 participants per group works best.
    • 2 small envelopes per group
    • 1 large envelope or other container per group to hold the materials.
    • Handouts, described in the attached facilitation guide
    Setup: The attached facilitation guide describes the details of the setup. The materials for each group include a specific collection of cards and handouts to match that collection. Some of the handouts are part of the initial setup, and others are handed out between rounds.
    How to run the game: The simulation proceeds through several rounds. Start by getting two volunteers from each group: one plays the computer, and the other plays the system test. The computer runs by turning over the cards one at a time, and the deck is reshuffled before each run. The person in the computer role is given instructions for resetting the deck between rounds. In each round, the program may be correct or may contain an error. Hold a discussion with the participants after each round.
    Round 1: Assess the program (i.e., the deck of cards) using just a system test. The person in the system test role is given a sheet listing all of the cards expected to be in the deck, and can ask the computer to run more than once if they are not confident in the assessment. Key discussion topics: Was the system testing easy? How many times did the program need to run? Was there strong confidence in the assessment?
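    To ground the analogy for software audiences: a system test checks the entire output at once, so a failure says the program is wrong without saying where. Below is a minimal sketch using Python's unittest, with a hypothetical build_deck() standing in for the deck/program (the real card collection comes from the facilitation guide):

        import unittest
        from collections import Counter

        RANKS = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
        SUITS = ["clubs", "diamonds", "hearts", "spades"]

        def build_deck():
            # Hypothetical "program" under test: the full list of cards the
            # computer turns over. A defect would be a wrong entry here.
            return [(rank, suit) for suit in SUITS for rank in RANKS]

        class SystemTest(unittest.TestCase):
            def test_whole_deck(self):
                # One assertion over the whole output, like the system-test
                # sheet listing every expected card: a failure proves the
                # deck is wrong but gives little help localizing the defect.
                expected = Counter((r, s) for s in SUITS for r in RANKS)
                self.assertEqual(Counter(build_deck()), expected)

        if __name__ == "__main__":
            unittest.main()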
    Round 2: Unit test sheets are handed out, and the rest of the participants in each group create unit tests of 3 or 4 cards. This time the program is assessed using the unit tests. Key discussion topics: How did this round compare to the first round? How did participants test for the cards that appear in triplicate? (Many groups end up creating unit tests that are not independent.)
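    In code, this round's tests correspond to several small, independent checks, each covering only a few cards, so the name of a failing test points at the defective slice. A sketch under the same hypothetical build_deck() assumption as above:

        import unittest
        from collections import Counter

        def build_deck():
            # Same hypothetical stand-in for the deck/program as above.
            ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
            suits = ["clubs", "diamonds", "hearts", "spades"]
            return [(r, s) for s in suits for r in ranks]

        class DeckUnitTests(unittest.TestCase):
            def setUp(self):
                self.counts = Counter(build_deck())

            def test_aces(self):
                # A 4-card unit test: exactly one ace per suit. It passes or
                # fails on its own, independent of the other tests.
                for suit in ("clubs", "diamonds", "hearts", "spades"):
                    self.assertEqual(self.counts[("A", suit)], 1)

            def test_spade_face_cards(self):
                # A 3-card unit test covering a different slice of the deck.
                for rank in ("J", "Q", "K"):
                    self.assertEqual(self.counts[(rank, "spades")], 1)

        if __name__ == "__main__":
            unittest.main()

    A failure in test_aces narrows the defect to the aces without running anything else, which is the localization benefit this round is designed to surface.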
    Round 3: Same rules as the second round, but participants get a chance to update their tests based on the second round’s discussion. Key discussion topics: What changes did participants make to their tests and why?
    Round 4: Additional instructions are added that invoke an expensive operation (represented by turning over an additional set of cards). Key discussion topics: What kinds of real-world software testing did this change in the rules resemble?
    Round 5: Additional instructions are introduced to illustrate the concept of mocking the expensive operations. Key discussion topics: How did the introduction of the mocks change the test experience?
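    For facilitators who want a software-side illustration of rounds 4 and 5: the sketch below (all names hypothetical) replaces an expensive fetch_extra_cards() operation with a canned result using Python's unittest.mock, so the code under test is still exercised without paying the cost:

        import time
        import unittest
        from unittest import mock

        def fetch_extra_cards():
            # Hypothetical expensive operation, playing the part of the extra
            # set of cards from round 4 (think slow network or database call).
            time.sleep(5)
            return [("Joker", "red"), ("Joker", "black")]

        def deal(deck):
            # Code under test: combines the deck with the expensive extras.
            return deck + fetch_extra_cards()

        class DealTest(unittest.TestCase):
            @mock.patch(__name__ + ".fetch_extra_cards",
                        return_value=[("Joker", "red"), ("Joker", "black")])
            def test_deal_skips_the_expense(self, fake_fetch):
                # The mock answers instantly with a canned result, so the test
                # still covers deal() but avoids the five-second cost.
                result = deal([("A", "spades")])
                self.assertIn(("Joker", "red"), result)
                fake_fetch.assert_called_once()

        if __name__ == "__main__":
            unittest.main()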
    Acknowledgements: Developed by David Kane (@ADavidKane) and George Paci (gpaci@tiac.net). Thanks to LitheSpeed; the initial concept of this game was created at one of their Game Day events. Thanks too to Agile DC and Cincinnati Day of Agile for the opportunities to share the game.
    Materials: CardUnit Materials 170904


    “Hitting the Target” — Business Value in Mission-Focused Organizations

    January 29th, 2016

    In the simplest terms, software development decisions for commercial organizations can be reduced to a calculation of whether the cost of developing the software will be outweighed by the revenue it generates or the costs it saves.

    However, what does this mean for government and other non-commercial organizations, for whom the impact of software isn’t primarily measured in terms of revenue? How should organizations prioritize work in the face of conflicting goals and metrics? Help more people? Minimize delays? Prosecute more crimes? Lower costs? In this game, participants experience a dice-based simulation created to explore these questions by examining the impact of prioritization decisions on the performance of organizations in changing environments.

    Timing: The game takes 45 to 75 minutes, depending on whether you run two or three rounds and how long you run the debriefs.

    Materials: For each group of 5 to 7 participants:

    • Instruction Sheet
    • Basic Rules Sheet
    • Organization Profile
    • Goal Sheet
    • Recorder Rules Sheet
    • A Blank Chart Sheet
    • A lot of dice (4 per person)
    • 3 pens of different colors
    • New Goal Sheet

    It is best to have each of the groups around a table.

    Instructions (three-round version; instructions for the two-round version are included in the attachments):

    Each group represents the organization described on your Organization Profile card. Most of the participants in each group will roll dice to represent the work of the organization. The recorder in each group will capture how the organization performs. The baseline for how your organization performs is described on the Basic Rules card.

    • The group completes ten turns of work. In each turn, each of the non-recorder participants rolls the dice, notes how many of the organization’s goals were met, and shares the results with the recorder (see the sketch after this list).
    • The recorder will plot the data on the chart sheet after each round.
    • At the end of the round, discuss your observations about the round and any parallels you might observe between the game and your actual work.
    • At the start of the 2nd and 3rd rounds, select ONE of the candidate New Rule cards for your team, i.e. an improvement to how your organization can perform. Note on which turn the new rule will take effect.
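    For facilitators who want to preview the dynamics before a session, here is a minimal simulation of one round in Python. The real scoring rules live on the handout sheets, so the worker count, goal threshold, and per-turn mechanics below are illustrative assumptions only:

        import random

        # Assumed mechanics: 4 non-recorder participants each roll 4 dice per
        # turn, and a die showing 5 or 6 counts as one goal met.
        WORKERS, DICE_PER_WORKER, TURNS = 4, 4, 10
        GOAL_THRESHOLD = 5

        def run_round(seed=None):
            rng = random.Random(seed)
            history = []  # what the recorder would plot on the chart sheet
            for turn in range(1, TURNS + 1):
                goals_met = sum(
                    1
                    for _ in range(WORKERS * DICE_PER_WORKER)
                    if rng.randint(1, 6) >= GOAL_THRESHOLD
                )
                history.append(goals_met)
                print(f"Turn {turn:2d}: {goals_met} goals met")
            return history

        if __name__ == "__main__":
            run_round(seed=42)

    A New Rule card could then be modeled as changing one of these constants partway through the turns, a quick way to sanity-check how visible an improvement will be on the recorder's chart.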

    How to run the game: Set up all of the materials except the New Goal Sheet at each table. Have a short discussion after each round. In the third round, after each table has selected its new rule and started the round, hand out the New Goal Sheet and have the groups apply it instead.

    Learning Points:

    • What “success” and value mean within a government or other non-commercial context.
    • How value can drive prioritization decisions.
    • How the inspect-and-adapt cycle can drive success within these organizations.
    • The tradeoffs involved in prioritizing across multiple mission metrics.

    Discussion and facilitation guidance:

    • The first round helps establish a baseline. Ask questions about similarities to what the participants experience in their day jobs. Also ask questions about whether they are concerned about the relative progress against their three metrics.
    • The second round discussion can address both the process by which each group chose its improvement and the outcome. Did groups do any estimation or ranking of the different options? Did groups value their three metrics equally? Often participants haven’t done anything formal, but the conversation can prompt discussion of different models. Ask the groups how much they factored in the time it would take to get the new capability.
    • The third round discussion focuses on the impact of the change in goals. If there are multiple groups, see whether they picked different capabilities and whether the new goal affected different groups differently. Ask whether participants’ strategies would have changed had they known the goals could change, and whether groups changed their approach to selecting a new capability based on the discussion after the second round.

    Acknowledgements: Developed by David Kane (@ADavidKane) and Deepak Srinivasan (@deesrinivasan). Thanks to the Games for Agility, Learning and Engagement Meetup for the forum to initially workshop the game. Thanks too to Southern Fried Agile and the DC Enterprise Agilists Meetup for the opportunities to share the game.

    Files:

