System Review

System Review — the meeting at which all teams present an overview of the new features they delivered during the last sprint. System Review = (Value + Velocity + Progress) * All The Teams.

It is held at the end of each sprint, with every team taking part.

Goals

  • transparency of the delivery process for Product, Business, and Tech representatives and other interested parties;
  • feedback on the delivered value and adjustment of further plans;
  • identifying points of interaction between teams;
  • healthy rivalry within product teams — competition + collaboration;
  • data on each team's pace of work, which helps focus teams on delivery speed and the quality of the delivered value;
  • sign-off on the results by the teams themselves.

Difference from Team Sprint Review

  • the System Review is attended by participants who do not have time to attend individual team reviews;
  • a sprint review focuses on sprint goals and Stories, while the System Review focuses on OKRs and features;
  • demonstrations are strictly time-boxed — 10 minutes per team (versus 60 for a team sprint review);
  • when a large feature spans several teams, those teams demonstrate it together as a whole.

Attendees

  • product teams;
  • Engineering Managers (EMs), Product Managers (PMs);
  • Cluster Head of Engineering (HoE), Cluster Head of Product (HoP);
  • Stakeholders (Business Developers, Marketing, Digital, etc.).

Format

Each team presents for 10-12 minutes:

  • briefly — progress towards quarterly goals;
  • briefly — the sprint goal and how it relates to OKRs;
  • briefly — a description of the feature before its demonstration;
  • demonstration of the feature in the staging environment (end-to-end scenario);
  • briefly — team performance metrics (velocity, TTM, diagrams);
  • plans for the next sprint;
  • feedback and questions from stakeholders and adjacent teams.

Process Metrics

The process metrics to be reviewed at the meeting:

  • Scope Drop — displayed in the dashboard for both the current and previous sprint, using the formula: (planned SP - implemented from planned SP) / planned SP;
  • Wasted Time — the average time a feature remains in development without active work, for the current and last sprint, visualized in the dashboard;
  • Delivery Time — the average time spent on feature development, presented in the dashboard for the current and previous sprint;
  • Quality Rate — an indicator of the quality of features released to production, shown on the dashboard;
  • Velocity Chart — illustrates the team's planned versus actual work completed (in Story Points), as shown in the dashboard;
  • Sprint Report — this chart tracks the remaining work over time during a sprint, offering insights into progress towards sprint goals. Available for viewing in Jira.
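
For illustration, here is a minimal sketch of the Scope Drop calculation as defined above. The function name and the example numbers are assumptions for demonstration, not part of any actual dashboard code:

```python
def scope_drop(planned_sp: float, implemented_from_planned_sp: float) -> float:
    """Scope Drop = (planned SP - implemented from planned SP) / planned SP.

    Returns a fraction between 0 and 1: 0.0 means the whole planned scope
    was delivered, 1.0 means none of the planned scope was delivered.
    """
    if planned_sp <= 0:
        raise ValueError("planned SP must be positive")
    return (planned_sp - implemented_from_planned_sp) / planned_sp


# Hypothetical example: a team planned 40 SP and delivered 30 of them.
print(f"Scope Drop: {scope_drop(40, 30):.0%}")  # -> Scope Drop: 25%
```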

Tips

  • book a time slot in advance;
  • features should be demonstrated by the people who actually did the work, not by the leads;
  • demonstrate the features on the staging environment;
  • keep the slide part to a minimum and do not waste time on it; ideally, do without slides altogether;
  • if the feature is done by several teams, then they show it together;
  • if there is nothing to show, the team should explain why;
  • if 10 minutes is not enough for a live demonstration, showing features as video recordings is allowed, so as not to break the timing.