Standard Definition of Done

  1. Passed Acceptance Criteria
  2. Unit tests to reflect the code
  3. Peer Code Reviewed
  4. Tested
  5. On Production
  6. PM informed
  7. Documented, documentation updated (including the Iteration Review page)

DoD for LiveDefects

  1. Standard DoD
  2. Root Cause Analysis
  3. Incident log

DoD for Investigation tickets

  1. New ticket(s) for proposed solution where appropriate
  2. Document the findings and communicate them in the relevant Refinement session (on the follow-up ticket), or in a Team/Tech Sharing

DoD for Retro actions

  1. Action is carried out in the coming iteration
  2. For actions that are not finite (e.g. adopting a newly stated best practice), the DoD is to add the decision to adopt this best practice to the Decisions log page.

Definition of Value (DoV)

In our last Retro, the team identified the need to define 'values' that help drive our delivery to our end users. Defining and agreeing on value within a team not only enhances alignment, clarity, and motivation but also improves efficiency, fosters innovation, and ultimately leads to greater success in achieving shared objectives.

The workshop presentation was conducted with the team, and we concluded that due to the structure of our organization and the complexity of our products, defining value could be challenging. This challenge often arises from the need to balance diverse perspectives and priorities, including those of users, stakeholders, and the development team. However, we recognized that it's crucial for managing expectations effectively and ensuring both user satisfaction and team productivity.

During the workshop, we identified a lack of visibility into the value that we deliver to our users. Adopting metrics and establishing a clear process for defining 'value' would therefore be beneficial.

Key points:

Related Jira ticket: TIS21-5903 (System Jira)

Team descriptions/views of value (DoV) when it comes to allocating 70% of effort to routine tasks and 30% to improvement:

• Bugs are tricky: some are "edge cases no one thought of" and feel like improvements to the system born of user research; others are "we messed up / something broke" and are more like "this should always have been working this way", so they don't really represent an increment in value.

• I'm still mulling things over, so these aren't my final thoughts, but I think I'm settling on something like: the deliverable value should in some way be determined by user needs - things we've obtained from user research and can present in review as an improvement - and this should ideally be about 70% of our effort in an iteration. Tech improvements and bug fixes, I'd argue, come under "expected" work that we should ideally be committing 30% of our time to, but wouldn't necessarily be presented to users as delivered value (even though it is valuable in some sense). Maybe that 70/30 split isn't always possible, and maybe we sometimes need to be working 70% on bugs because a bunch of stuff has broken. A "successful" iteration means that the 70/30 split is adhered to (and that the amount of work delivered is non-zero).
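
As a purely illustrative sketch of the success criterion described above (no tooling, data source, or metric has been agreed; the ticket keys, categories, and story points below are hypothetical assumptions, not real data), this is one way the 70/30 check could be expressed:

```python
# Hypothetical sketch of the proposed 70/30 success check for an iteration.
# Ticket keys, categories, and story points are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Ticket:
    key: str
    category: str      # "user_value" (user-research driven) or "expected" (tech improvement / bug fix)
    story_points: int

def iteration_is_successful(tickets, target=0.7, tolerance=0.05):
    """True if delivered work is non-zero and roughly 70% of effort went to user-value work."""
    total = sum(t.story_points for t in tickets)
    if total == 0:
        return False  # nothing delivered -> not a successful iteration
    user_value = sum(t.story_points for t in tickets if t.category == "user_value")
    return abs(user_value / total - target) <= tolerance

# Example: 7 points of user-value work vs 3 points of "expected" work -> meets the split
iteration = [
    Ticket("EX-1", "user_value", 5),
    Ticket("EX-2", "user_value", 2),
    Ticket("EX-3", "expected", 3),
]
print(iteration_is_successful(iteration))  # True
```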

A need was identified for a further session on value and how it is measured.