...
| Standard Definition of Done | DoD for LiveDefects | DoD for Investigation tickets | DoD for Retro actions |
|---|---|---|---|

| Definition of Value (DoV) |  |  |  |
|---|---|---|---|
| In our last Retro, the team identified the need to define 'values' that help drive our delivery to our end users. Defining and agreeing on value within a team not only enhances alignment, clarity, and motivation but also improves efficiency, fosters innovation, and ultimately leads to greater success in achieving shared objectives. | The workshop was conducted with the team, and we concluded that, due to the structure of our organisation and the complexity of our products, defining value can be challenging. This challenge often arises from the need to balance diverse perspectives and priorities, including those of users, stakeholders, and the development team. However, we recognised that defining value is crucial for managing expectations effectively and ensuring both user satisfaction and team productivity. During the workshop, we identified a lack of visibility of the value that we deliver to our users; therefore, adopting metrics and establishing a clear process for defining 'value' would be beneficial. Key points: |  |  |
| Team | Description/Views of Value when it comes to splitting effort between user-driven improvement (ideally 70%) and routine/expected work (ideally 30%). | • Bugs are tricky: some are "edge cases no one thought of" and feel like improvements to the system born of user research, while others are "we messed up / something broke" and are more like "this should always have been working this way", so they don't really represent an increment in value. • These aren't my final thoughts, but I'm settling on something like: deliverable value should in some way be determined by user needs, i.e. things we've obtained from user research and can present in review as an improvement; this should ideally be about 70% of our effort in an iteration. • Tech improvement and bug fixes arguably come under "expected" work that we should ideally commit about 30% of our time to, but wouldn't necessarily be presented to users as delivered value (even though it is valuable in some sense). • That 70/30 split isn't always possible; we may need to spend 70% of our time on bugs because a bunch of stuff has broken. • A "successful" iteration means the 70/30 split is adhered to and the amount of work delivered is non-zero. | A further session is needed on Value and how it is measured. |