

Writing User Stories and Acceptance Criteria

7 min read · Last reviewed: March 2026

User stories are the bridge between requirements and development work. They turn an abstract requirement like "the system should allow users to search" into specific, testable work that developers can build and testers can verify.

The User Story Format

The standard format: "As a [user type], I want to [do something] so that [I achieve this outcome]."

Good examples:

  • "As a sales rep, I want to search for customers by name or company so that I can quickly find someone who calls in."
  • "As an admin, I want to view a report of all orders from the past 30 days so that I can see sales trends."
  • "As a user, I want to reset my password via email so that I can regain access if I forget it."

Bad examples:

  • "As a user, I want a search function." (Doesn't say why, what they're searching for, or what they'll do with results)
  • "As a developer, I want to implement JWT authentication." (Wrong perspective: stories describe user needs, not developer tasks)
  • "As the system, I want to validate input." (Systems don't want things; users do)

Why This Format Works

The format forces you to think from the user's perspective instead of jumping straight to technology. The "so that" clause is critical: it explains the business value. If you can't complete the "so that" sentence, you might not need the feature.

The format also scales. You can have large user stories (epics) and small ones (tasks), and you use the same format for all.

Epics vs. Stories vs. Tasks

Epics are very large feature areas. "As a user, I want to manage my account settings so that I can control my profile, security, and preferences." This is too big to build in one sprint. It gets broken down into smaller stories.

Stories are implementable in 1-3 days (roughly 2-8 story points). "As a user, I want to change my email address so that my account is associated with the correct email." This is something a developer can build and test in a day or two.

Tasks are implementation details, not user-facing. "Set up database migration for user profile changes." "Update the email change endpoint to verify the new email." Tasks support stories.

Not every project needs to separate these rigidly. But it's useful to know the difference so you don't try to estimate an epic as if it's a story.

Acceptance Criteria

Acceptance criteria are the specific conditions that must be met for a story to be considered complete. They're testable and unambiguous.

Story: As a user, I want to search for customers by name so that I can quickly find someone.

Bad acceptance criteria: "Search works well." (Not testable—what does "works well" mean?)

Good acceptance criteria:

  • User can enter a customer name in a search box
  • Results show matching customers (first name, last name, or company name contains the search term)
  • Search is case-insensitive
  • Results load in under 1 second for up to 1,000 customers
  • When there are no matches, the message "No customers found" is displayed
  • User can click a customer to view their full profile

Each criterion is testable. A QA person can verify that each one is true. A developer can read them and know exactly what to build.
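Criteria this concrete translate almost directly into automated checks. As a minimal sketch, assuming a hypothetical `search_customers` function and an in-memory customer list (neither is from a real project):

```python
def search_customers(customers, term):
    """Return customers whose first name, last name, or company
    contains the search term (case-insensitive)."""
    term = term.lower()
    return [
        c for c in customers
        if term in c["first"].lower()
        or term in c["last"].lower()
        or term in c["company"].lower()
    ]

customers = [
    {"first": "Ana", "last": "Silva", "company": "Acme"},
    {"first": "Ben", "last": "Okafor", "company": "Globex"},
]

# Criterion: match on first name, last name, or company
assert search_customers(customers, "silva") == [customers[0]]
# Criterion: search is case-insensitive
assert search_customers(customers, "GLOB") == [customers[1]]
# Criterion: no matches -> empty result (UI shows "No customers found")
assert search_customers(customers, "zzz") == []
```

Each assertion corresponds to one criterion, which is exactly the property that makes the criteria testable.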

Writing Good Acceptance Criteria

Be specific, not vague. "The system should be fast" is not a criterion. "Search results load in under 1 second" is testable.

Cover the happy path and edge cases. What happens when the user enters a search term that matches nothing? When they leave the field empty? When they enter special characters?

Don't include implementation details. "Use PostgreSQL full-text search" is not an acceptance criterion—that's how the developer implements it. The criterion is what the user observes: "Search returns relevant results quickly."

Use "given-when-then" format for complex criteria. "Given a user is logged in, when they click the delete button, then a confirmation dialog appears before deletion." This is clearer than prose.
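A given-when-then criterion maps naturally onto a test. The sketch below is a deliberately simplified model: a bare `delete_item` function stands in for real UI behavior, with the "confirmation dialog" reduced to a `confirmed` flag.

```python
def delete_item(items, item_id, *, logged_in, confirmed):
    """Delete an item only when the user is logged in and has confirmed."""
    if not logged_in:
        raise PermissionError("login required")
    if not confirmed:
        return items  # Then: nothing is deleted until the user confirms
    return [i for i in items if i != item_id]

# Given a logged-in user, when they click delete without confirming,
# then nothing is deleted
assert delete_item([1, 2], 1, logged_in=True, confirmed=False) == [1, 2]
# ...and once they confirm, the item is removed
assert delete_item([1, 2], 1, logged_in=True, confirmed=True) == [2]
```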

Aim for 3-8 acceptance criteria per story. If you have more than 10, the story is probably too big.

Testing Your Criteria
A good test for acceptance criteria: can a QA person verify each one without asking "but what if...?" If the answer is no, the criterion is too vague.

Story Points and Estimation

Story points are a relative sizing technique. You don't estimate "this will take 5 hours." You estimate "this is about the same size as the story we built last week" or "this is 3x as big as that story."

The typical scale: 1, 2, 3, 5, 8, 13, 21 (roughly Fibonacci numbers). The idea is that bigger stories are proportionally less predictable, so the gaps get larger.

1 point: Trivial. A single simple change. Shouldn't come up often in real projects.

2-3 points: Small. A straightforward story without surprises. A developer can build and test it in a day.

5 points: Medium. A story with some complexity or integration points. 1-2 days to build.

8 points: Large. Multiple components or complex logic. 2-3 days to build.

13+ points: Too big to estimate reliably. Break it down into smaller stories.

The power of story points is that once a team has built stories of different sizes, you can estimate velocity: "we build about 40 story points per sprint." Then you can predict how many sprints a release will take.
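The arithmetic is simple. A back-of-the-envelope sketch with made-up numbers:

```python
import math

# Points completed in the last few sprints (illustrative figures)
past_sprints = [38, 42, 40]
velocity = sum(past_sprints) / len(past_sprints)  # average: 40.0 points

backlog_points = 250  # remaining estimated work for the release
sprints_needed = math.ceil(backlog_points / velocity)
print(sprints_needed)  # 7 sprints at ~40 points per sprint
```

Averaging over several sprints smooths out one-off spikes; a single unusually fast or slow sprint shouldn't drive the forecast.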

Definition of Done

Before a story is really "done," there are usually things beyond just writing code:

  • Code is written and follows project standards
  • Code has been reviewed by another developer
  • Acceptance criteria have been tested
  • Unit tests have been written
  • The change has been deployed to staging
  • A QA person has verified it on staging
  • Documentation has been updated
  • Related tests (integration, performance) pass

Your team should agree on a definition of done and apply it consistently. It prevents surprises where something is "coded" but not actually ready for production.

Common Mistakes

Writing stories from the developer's perspective. "As a developer, I want to refactor the authentication module." Refactoring is important, but it doesn't belong in user stories; it belongs in technical tasks or the technical-debt backlog.

Stories that are too big. If a story would take a week to build, break it down. Big stories are hard to estimate, hard to test, and hide surprises.

Acceptance criteria that are too vague. "The system should work correctly" is not a criterion. What does "correctly" mean?

Acceptance criteria that are implementation details. "Use GraphQL to fetch data" is not a criterion—it's how you implement the story. The criterion is "The user sees data on the screen in under 1 second."

Not including acceptance criteria for failure cases. What happens when the network is down? When the user has no permissions? What error messages show?

Story Workflow

A typical story lifecycle:

  1. Backlog: Story is written but not yet prioritized
  2. Refined: Acceptance criteria are clear, team has estimated it
  3. In Progress: Developer is actively working on it
  4. In Review: Code review is in progress
  5. In Testing: QA is verifying acceptance criteria
  6. Done: Acceptance criteria are all met, code is in production or staging

Some teams use different workflows, but the idea is the same: stories move through stages until they're complete.
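The lifecycle above can be modeled as a simple state machine. This sketch hard-codes one linear path and ignores backwards moves (e.g. a failed review sending a story back to In Progress), which real boards usually allow:

```python
from enum import Enum

class Status(Enum):
    BACKLOG = "Backlog"
    REFINED = "Refined"
    IN_PROGRESS = "In Progress"
    IN_REVIEW = "In Review"
    IN_TESTING = "In Testing"
    DONE = "Done"

# Each stage advances to exactly one next stage in this simplified model
NEXT = {
    Status.BACKLOG: Status.REFINED,
    Status.REFINED: Status.IN_PROGRESS,
    Status.IN_PROGRESS: Status.IN_REVIEW,
    Status.IN_REVIEW: Status.IN_TESTING,
    Status.IN_TESTING: Status.DONE,
}

def advance(status):
    if status is Status.DONE:
        raise ValueError("story is already done")
    return NEXT[status]

assert advance(Status.BACKLOG) is Status.REFINED
```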

Tools for Story Management
Stories are tracked in tools like Jira, Linear, GitHub Issues, or Azure DevOps. Pick one that your team will actually use. The tool doesn't matter—consistent use of it matters.