Identifying Sprint Review Anti-patterns
We’re coming to the end of the year, which means many of us are attending town halls and high-profile meetings showcasing the year’s achievements. Working in a large organisation, I’m attending a few of them myself. It’s nice to reflect on recent accomplishments. Less so to focus on any failures.
A more regular Scrum ceremony focused on showcasing sprint achievements and contributions is the friendly sprint review. Recently we have faced considerable challenges with this ceremony, slightly different from those I have reflected on before. Here I discuss some of the recent review anti-patterns I have encountered, and explain how to address these early warning signs.
A Horse With No Name
For those unfamiliar with the ceremony, let’s review the Sprint Review definition as per the shiny new 2020 Scrum Guide:
The purpose of the Sprint Review is to inspect the outcome of the Sprint and determine future adaptations. The Scrum Team presents the results of their work to key stakeholders and progress toward the Product Goal is discussed.
Make Your Move
Moving the review out by a few days or hours is a symptom that often originates from good intentions. It is normally an attempt at kindness to accommodate others. Yet it is akin to that age-old analogy of moving the goalposts halfway through the game.
There are numerous examples of these innocent rescheduling attempts that I’ve encountered over the years:
- Issues with non-prod environment unavailability.
- A key stakeholder or team member out on vacation.
- “It’s almost finished. I just need X more days to complete the item.”
On the first point, there is a legitimate concern about incomplete features being showcased. However, in larger organisations with poor support for fast pipelines to production, this can be an allowance teams make in their Definition of Done. With the others, it’s best to proceed at the original slot rather than break the cadence: rescheduling can interfere with stakeholders’ calendars and leave them questioning whether the review is happening at all.
Don’t Stop Me Now
An issue related to the aforementioned movement of a review is cancelling the session altogether. It is a common anti-pattern I observe when a sprint doesn’t quite go to plan. I’ll hold my hands up and admit that I have been guilty of this previously as an engineer. On reflection, I see it was the wrong thing to do. Each time a review was cancelled, I was projecting an illusion that we were perfect achievers. In reality we were not.
This situation can be identified by these common phrases:
- We don’t have anything to show.
- We are not ready to show feature X to users yet.
- It’s not done yet.
- The team is not prepared.
- Users are not interested in the hygiene work delivered this sprint.
- There’s not enough features delivered to warrant updating stakeholders.
By cancelling the sprint review, irrespective of success or failure, you remove the ability to inspect the outcome of the Sprint and determine future adaptations. Cancelling a review even once removes accountability from the squad. Let’s face it: if you cancel the session once, there’s nothing stopping you repeating that pattern the next time you’re not comfortable facing stakeholders.
Can we honestly say we’re working in a partnership to achieve a common Product Goal if we are not comfortable showcasing failures and challenges as well as successes? I would suggest not.
It’s (Not) A Demo
Remember school maths and set theory? Did you ever think it would connect to agility? Well, you are in luck! It’s time to commit the following relations to memory:
Sprint Review ≠ Demo
Sprint Review ∋ Demo
To reiterate: a sprint review is not a demo. Yes, a sprint review can contain a demo. However, many of us incorrectly treat the terms as equivalent.
I can’t take credit for this insight. Many amazing people have pointed this out, including the insightful Ryan Ripley & Todd Miller in Fixing Your Scrum. But this piece wouldn’t be complete without noting my recent experience that many still consider a review to be only a demo. Attending a recent guild meeting, I was very aware that many of us were using the term demo throughout, even after learning they are not the same. It’s been hardwired into our brains.
Now is the time to break the habit. Part of the problem may be that the Scrum Guide doesn’t prescribe a clear agenda. However, as we can see from the following quote, it does impart some advice on what the intent should be:
During the event, the Scrum Team and stakeholders review what was accomplished in the Sprint and what has changed in their environment. Based on this information, attendees collaborate on what to do next. The Product Backlog may also be adjusted to meet new opportunities. The Sprint Review is a working session and the Scrum Team should avoid limiting it to a presentation.
From the above, a sample structure that I’ve been working to recently consists of the following steps:
- Welcome everyone! You’d be surprised how many times meetings don’t include a simple hello.
- Reiterate the product goals.
- Highlight progress towards those goals, including the results of any Key Performance Indicators (KPIs) or other metrics that demonstrate movement towards the product goals.
- Highlight key achievements and issues from the sprint. Yes, we want to showcase success and failure.
- Demo these achievements. Ideally this should be working software; however, any artefacts that demonstrate progress towards the sprint and product goals can be showcased, including (but not limited to) analysis results, mock-ups, wireframes and architecture diagrams.
- Obtain feedback on these items.
- Review next steps for the team, highlighting any risks or concerns you see. This is the perfect time to highlight any help or support you may need from stakeholders.
- Add anything else the team feels is appropriate to cover in the review.
- Close out and thank everyone for their time, as well as the team for their contributions.
There are many other resources available on conducting effective sprint reviews; a simple online search will surface many useful posts. The aforementioned Fixing Your Scrum is also worth a read.
Recently I have seen that lack of preparation is a key problem impacting a sprint review. It can be easy to stick our heads in the sand, focus on the work, and not consider how to demonstrate progress, especially if the squad has exhibited the aforementioned moving or cancelling of sprint reviews. That sets the expectation that the review might not happen, so why prepare for it? This in turn leads to further moving or cancelling, as the team is not ready to showcase work and be accountable to stakeholders.
When I think back over my Agile journey to date, I know I have been guilty of inadequate preparation. Back when I dipped my toes into the fast-paced world of front-end web development, I was often the individual conducting the demo. It was very easy to have a local version running on my machine and quickly click through how the shiny new feature worked on screen. Yes, I intentionally use the word demo here, as at the time that’s what we were doing rather than a review. Even your friendly neighbourhood agile advocate has been guilty of past Scrum indiscretions!
The reality is that my prior style presented some challenges. Simply showing the feature working doesn’t highlight what problem the new addition solves. It limits feedback to a simple question-and-answer format rather than encouraging meaningful conversation on the item. Finally, it doesn’t make clear how the feature fits with the product goals.
I’m not suggesting you need to practise for weeks on end to refine your review. I save that level of diligence for my talks! Nevertheless, I have found that taking time to determine what I want to say helps me focus on the true journey, and preparation should apply to the entire review. We recently added a quick 15-minute team prep call before each sprint review; it is too early to evaluate its benefit, but I remain optimistic it will help the team be more prepared.
Lead Me On
Many of the anti-patterns discussed here can be symptoms of a lack of education within teams. They can also be a symptom of management interference. Traditional technology-business relationships are generally treated as a service function, and showing any weakness or failure in such situations is a big faux pas.
Management have grown up within cultures where these relationships are commonplace. As idealistic as it may sound to some, my perspective is that these relationships should be fostered as more of a partnership. This can be very challenging indeed if you have to change from a servant mindset.
Leadership within both IT and business functions may think it right to influence the timing or content of a review to increase attendance or convey a particular success. Doing so undermines team autonomy. It’s important for leaders to support the team by attending the review, rather than eroding the squad’s ability to conduct it regularly.
There are many more pitfalls to watch out for when it comes to sprint reviews. It’s worth investing time in learning about these other anti-patterns and seeking ways to avoid them. I would strongly recommend Fixing Your Scrum by Ryan Ripley & Todd Miller, particularly the dedicated Sprint Review chapter. It has been a great resource for me over the last year in so many ways.
Press ahead with any of these symptoms at your peril. The five faux pas discussed today point to a massive problem with accountability and partnership, plaguing not only the team but also management and stakeholder circles. Outside coaching and influence is imperative to fix such anti-patterns and guide all levels towards frequent and successful sprint reviews.
Thanks for reading!