
Acceptance Criteria vs Definition of Done

Definition of Done (DoD) is a list of requirements that a user story must satisfy for the team to call it complete.

Acceptance Criteria, on the other hand, are the set of test scenarios a specific user story must pass to confirm that the software is working as expected.

Acceptance criteria:

Acceptance criteria, or acceptance scenarios, are an integral part of a user story.

The acceptance tests define what actually has to be built to implement the story.

There is no allocated responsibility for writing acceptance criteria.

While it is usually the product owner or product manager who defines the functionality, just about anyone on the team can write acceptance criteria for user stories.

The writer must be careful to ensure that the acceptance criteria are written from the perspective of the end user, and for this reason the product owner (who is considered to be the voice of the customer) often undertakes this task. 

Acceptance criteria are typically written in the Given - When - Then format:

Given - some initial context

When - an event occurs

Then - ensure some outcomes

An example of acceptance criteria is given below for the user story "Customer withdraws cash from an ATM":

User story:

As a customer

I want to withdraw cash from an ATM,

So that I don't have to wait at the bank.

Acceptance criteria for the above user story:

Positive scenario:

Given the account is in credit

and the card is valid

and the dispenser contains cash

When the customer requests cash

Then ensure the account is debited

and ensure cash is dispensed

and ensure the card is returned

Negative scenario:

Given the account is overdrawn

and the card is valid

When the customer requests cash

Then ensure a rejection message is displayed

and ensure cash is not dispensed

and ensure the card is returned
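
These Given - When - Then scenarios map naturally onto automated acceptance tests: each Given becomes the test setup, the When becomes the action, and the Then clauses become assertions. Below is a minimal, runnable sketch of that mapping using pytest; the Atm, Account and Dispenser classes are hypothetical stand-ins invented for illustration, not part of any real banking system.

```python
# A minimal sketch of how the two scenarios above could be automated with pytest.
# Atm, Account and Dispenser are hypothetical stand-ins, not a real banking API.
from dataclasses import dataclass


@dataclass
class Account:
    balance: int  # a positive balance means the account is in credit


@dataclass
class Dispenser:
    cash_available: int


class Atm:
    def __init__(self, account: Account, dispenser: Dispenser):
        self.account = account
        self.dispenser = dispenser
        self.card_returned = False
        self.message = ""

    def request_cash(self, amount: int) -> bool:
        """When the customer requests cash (card validity is assumed here)."""
        self.card_returned = True                      # the card is always returned
        if self.account.balance >= amount and self.dispenser.cash_available >= amount:
            self.account.balance -= amount             # Then the account is debited
            self.dispenser.cash_available -= amount    # and cash is dispensed
            return True
        self.message = "Insufficient funds"            # Then a rejection message is shown
        return False                                   # and cash is not dispensed


def test_withdraw_cash_when_account_in_credit():
    # Given the account is in credit and the dispenser contains cash
    atm = Atm(Account(balance=100), Dispenser(cash_available=500))
    # When the customer requests cash
    dispensed = atm.request_cash(40)
    # Then the account is debited, cash is dispensed and the card is returned
    assert dispensed
    assert atm.account.balance == 60
    assert atm.card_returned


def test_reject_withdrawal_when_account_overdrawn():
    # Given the account is overdrawn and the card is valid
    atm = Atm(Account(balance=-20), Dispenser(cash_available=500))
    # When the customer requests cash
    dispensed = atm.request_cash(40)
    # Then a rejection message is displayed, cash is not dispensed and the card is returned
    assert not dispensed
    assert atm.message == "Insufficient funds"
    assert atm.card_returned
```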

Definition of Done:

The Definition of Done is structured as a list of items, each used to validate a story, and it exists to ensure that the Development Team agrees on the quality of work it is attempting to produce.

It serves as a checklist that is used to check each Product Backlog Item or User Story for completeness. Items in the Definition of Done are intended to be applicable to all items in the Product Backlog, not just to a single user story.

For example, in the software industry, teams may need to ask some of the following questions to come up with their DoD:

  • Code completed?
  • Code peer reviewed?
  • Code checked-in?
  • Unit tests passed?
  • Functional tests passed?
  • Acceptance tests completed?
  • Product Owner reviewed and accepted?
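
Because the Definition of Done is the same checklist for every backlog item, one way to picture it is as a simple, reusable checklist that a story either fully satisfies or does not. Below is a minimal Python sketch of that idea; the check names mirror the questions above, and the is_done helper and story data are purely illustrative assumptions, not a standard tool.

```python
# A minimal sketch: the Definition of Done as a shared checklist applied to one story.
# Check names and story data are illustrative only.
DEFINITION_OF_DONE = [
    "code_completed",
    "code_peer_reviewed",
    "code_checked_in",
    "unit_tests_passed",
    "functional_tests_passed",
    "acceptance_tests_completed",
    "product_owner_accepted",
]


def is_done(story_checks: dict[str, bool]) -> bool:
    """A story is 'Done' only when every item on the shared checklist holds."""
    return all(story_checks.get(item, False) for item in DEFINITION_OF_DONE)


story = {
    "code_completed": True,
    "code_peer_reviewed": True,
    "code_checked_in": True,
    "unit_tests_passed": True,
    "functional_tests_passed": True,
    "acceptance_tests_completed": True,
    "product_owner_accepted": False,  # still awaiting Product Owner sign-off
}

print(is_done(story))  # False -> the story cannot be called complete yet
```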
