What a Product Backlog Item (PBI) Is
A Product Backlog Item (PBI) is a placeholder for work that could improve the product. It can represent a user-facing feature, a bug fix, a technical improvement, a research spike, a compliance task, or an operational need. A PBI is not a promise to deliver; it is an option the team can choose when it becomes valuable and understandable enough.
Think of the Product Backlog as a funnel: ideas enter in many shapes, and PBIs are the shaped pieces that can be discussed, compared, and eventually selected. A PBI should be small enough to talk about in a concrete way, but it does not need to be fully specified far in advance.
Examples of PBIs
- Feature: “Allow users to export invoices as PDF.”
- Bug: “Fix rounding error in tax calculation for CHF currency.”
- Technical: “Introduce caching for product catalog API responses.”
- Research: “Spike: evaluate payment provider X for recurring billing.”
Characteristics of an Actionable Item
An actionable PBI is one that can be meaningfully discussed and, if chosen, can be turned into a plan for implementation. “Actionable” does not mean “perfectly detailed.” It means the team can understand what problem is being solved, what outcome is expected, and what constraints matter.
Signals that a PBI is actionable
- Clear intent: Why it matters and what change is desired.
- Concrete outcome: What will be different for a user or system.
- Boundaries: What is in scope and what is explicitly out of scope (even if briefly).
- Testability: A way to tell whether it works (often via acceptance criteria).
- Right-sized: Not obviously too large to complete within a sprint once selected.
- Dependencies surfaced: Known external needs are visible (other teams, vendors, data access).
Practical shaping pattern: from idea to actionable PBI
- Capture the idea in one sentence. Example: “Customers want to download invoices.”
- State the user or stakeholder and the outcome. “As an account admin, I can download an invoice PDF so I can file it for accounting.”
- Add a short context note. “Needed for enterprise customers; support tickets mention this weekly.”
- Define a first slice. “PDF for paid invoices only; no bulk export.”
- Add acceptance criteria at a basic level. Enough to verify the slice works.
- Flag open questions. “Do we need branding? Which languages?”
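The shaped item produced by the steps above can be captured as a simple structured record. This is an illustrative sketch only; the field names are assumptions, not a prescribed schema or tool format.

```python
# Illustrative sketch: one shaped PBI as a plain dictionary.
# Field names are assumptions, not a prescribed schema.
pbi = {
    "title": "Download invoice PDF",
    "story": "As an account admin, I can download an invoice PDF "
             "so I can file it for accounting.",
    "context": "Needed for enterprise customers; support tickets mention this weekly.",
    "first_slice": "PDF for paid invoices only; no bulk export.",
    "acceptance_criteria": [
        "PDF downloadable from the invoice detail page",
        "Only paid invoices show the download option",
    ],
    "open_questions": ["Do we need branding?", "Which languages?"],
}

# A PBI shaped this way makes its remaining gaps visible at a glance.
print(f"{pbi['title']}: {len(pbi['open_questions'])} open question(s)")
```

Keeping open questions inside the item itself (rather than in people's heads) is what makes the later refinement steps cheap.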
Ordering vs. “Prioritization” Language
In Scrum, the Product Backlog is ordered. Ordering means arranging items so the most valuable, urgent, and sensible work appears closer to the top. People often say “prioritization,” but that word can imply a fixed ranking based on a single factor (usually value) or a one-time decision. Ordering is more practical: it reflects multiple considerations and is updated as learning happens.
What influences ordering (typical factors)
- Value: Revenue impact, customer satisfaction, risk reduction, strategic alignment.
- Urgency/time sensitivity: Deadlines, seasonal needs, contractual commitments.
- Risk and uncertainty: Items that reduce unknowns may move up to enable better decisions.
- Dependencies: Some items must precede others (or unblock them).
- Effort/cost: Not to optimize for “easy wins” only, but to make trade-offs explicit.
Practical tip: explain ordering with a short rationale
For the top items, add a one-line “ordering note” such as: “Ordered high because it unblocks the onboarding flow experiment” or “Ordered high due to a regulatory deadline.” This reduces debate about hidden assumptions.
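One lightweight way to make the multi-factor trade-off explicit is a simple score, for example a WSJF-style ratio of value, urgency, and risk reduction over effort. The factor names, 1–10 scales, and example items below are illustrative assumptions, not a mandated formula; the score is a conversation starter, not a verdict.

```python
# Minimal sketch of multi-factor ordering (WSJF-style ratio).
# Factor names, 1-10 scales, and example items are illustrative assumptions.
def ordering_score(value, urgency, risk_reduction, effort):
    """Higher score suggests the item belongs closer to the top."""
    return (value + urgency + risk_reduction) / effort

backlog = [
    ("Export invoices as PDF",    ordering_score(value=8, urgency=6, risk_reduction=2, effort=3)),
    ("Fix CHF rounding bug",      ordering_score(value=5, urgency=9, risk_reduction=4, effort=2)),
    ("Spike: payment provider X", ordering_score(value=3, urgency=4, risk_reduction=8, effort=1)),
]

# Print high-to-low; pair each score with a one-line ordering note in practice.
for title, score in sorted(backlog, key=lambda item: item[1], reverse=True):
    print(f"{score:5.2f}  {title}")
```

Note how the cheap, uncertainty-reducing spike floats to the top here, matching the “risk and uncertainty” factor above.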
How Refinement Happens (and Who Participates)
Refinement is the ongoing activity of improving PBIs so they are clearer, smaller, and better ordered. It is not a single meeting and not a phase. It happens whenever the team needs to reduce uncertainty for upcoming work.
Who participates
- Product Owner: Brings goals, context, and ordering decisions; ensures items express desired outcomes.
- Developers: Explore feasibility, suggest slicing, identify risks and dependencies, propose technical approaches.
- Others as needed: UX, analytics, security, legal, support, operations, or domain experts join when their input changes understanding.
Common refinement activities
- Clarify intent: What problem are we solving? For whom?
- Split items: Turn a large PBI into smaller slices that still deliver value.
- Add acceptance criteria: Define observable behavior and constraints.
- Estimate or size (if your team uses it): Enough to compare and plan, not to predict perfectly.
- Identify dependencies and risks: External approvals, data availability, performance constraints.
- Re-order: Adjust based on new information or changed priorities.
Step-by-step: a lightweight refinement flow for one PBI
- Read the PBI out loud. Ensure everyone is discussing the same thing.
- Ask “What’s the user-visible outcome?” If unclear, rewrite the PBI title/description.
- List assumptions and questions. Keep them in the PBI so they are not lost.
- Propose a smallest valuable slice. Define what can be delivered first without building everything.
- Draft acceptance criteria. Focus on behavior and boundaries.
- Check size and risks. If too big or risky, split or add a spike.
- Confirm next action. Example: “Need UX mock,” “Need API contract,” or “Ready to discuss for selection.”
Acceptance Criteria Basics (Enough to Verify the Work)
Acceptance criteria describe the conditions that must be met for the PBI to be considered done from a functional or behavioral perspective. They help align expectations, reduce rework, and make testing straightforward. Acceptance criteria are not the same as tasks; they describe outcomes, not the internal steps to achieve them.
Good acceptance criteria tend to be
- Specific: Avoid vague words like “fast” or “user-friendly” without measurable meaning.
- Testable: Someone can verify them through observation or checks.
- Focused on behavior: What the system does, not how it is implemented.
- Bounded: Include what is excluded to prevent scope creep.
Two practical formats
Checklist style (simple and quick):
- Invoice PDF can be downloaded from the invoice detail page.
- PDF includes invoice number, date, line items, totals, and tax breakdown.
- Only paid invoices show the download option.
- Unauthorized users cannot access another account’s invoices.
Given/When/Then (useful for complex rules):
- Given an account admin viewing a paid invoice, When they click “Download PDF,” Then a PDF is generated and downloaded within 5 seconds.
- Given a user without invoice permission, When they open an invoice URL, Then they see an access denied message and no PDF is generated.

Practical tip: keep criteria at the right level
If criteria start describing internal design (e.g., “use library X” or “store in table Y”), move that into technical notes or tasks. Keep acceptance criteria about what must be true for the user or system behavior.
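Because Given/When/Then criteria describe observable behavior, they map naturally onto automated checks. The sketch below uses a hypothetical `InvoiceService` stand-in (not a real API) to show how the two invoice scenarios above could become executable assertions.

```python
# Sketch: Given/When/Then criteria expressed as executable checks.
# InvoiceService is a hypothetical stand-in, not a real library or API.
class InvoiceService:
    def __init__(self):
        # One paid invoice owned by the "admin" role.
        self._invoices = {"INV-1": {"status": "paid", "owner": "admin"}}

    def download_pdf(self, invoice_id, user_role):
        invoice = self._invoices.get(invoice_id)
        if invoice is None or user_role != invoice["owner"]:
            return {"ok": False, "error": "access denied"}
        if invoice["status"] != "paid":
            return {"ok": False, "error": "not available"}
        return {"ok": True, "pdf": b"%PDF-1.7 ..."}

svc = InvoiceService()

# Given an account admin viewing a paid invoice,
# When they download, Then a PDF is returned.
assert svc.download_pdf("INV-1", user_role="admin")["ok"] is True

# Given a user without invoice permission,
# When they open an invoice URL, Then access is denied and no PDF is generated.
assert svc.download_pdf("INV-1", user_role="guest")["error"] == "access denied"
```

Notice that the checks assert on behavior (a PDF is or is not returned), not on internal design, matching the tip above.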
Spotting Unclear or Oversized Items
Many backlog problems show up as “mystery PBIs” (unclear) or “monster PBIs” (too large). Catching these early makes selection and planning smoother.
Red flags for unclear PBIs
- Ambiguous nouns: “Improve onboarding,” “Enhance performance,” “Fix reporting.”
- No user or stakeholder: It is unclear who benefits.
- No success signal: No way to tell if it worked.
- Hidden policy decisions: Pricing rules, permissions, or compliance requirements not stated.
- Too many interpretations: Different team members describe different solutions.
Red flags for oversized PBIs
- Contains multiple outcomes: “Redesign billing and add new plans and migrate customers.”
- Touches many systems: UI + backend + data migration + analytics + emails all at once.
- Long list of acceptance criteria: Especially if they represent separate features.
- Hard to estimate: “We can’t even estimate it” is often a sign the item needs splitting or discovery.
Practical splitting techniques
- Split by workflow step: Create invoice PDF first, then add email sending later.
- Split by user segment: Admins first, then standard users, then partners.
- Split by scope boundary: Single invoice download first, then bulk export.
- Split by rule complexity: Basic tax rules first, then edge cases.
- Split by interface: API capability first, then UI integration (or vice versa) if it still delivers value.
A Practical Checklist for “Ready Enough to Discuss” (Without a Strict Definition of Ready)
Teams often want a strict “Definition of Ready” to prevent messy work. The risk is turning it into a gate that blocks learning or creates bureaucracy. Instead, use a flexible checklist to decide whether a PBI is ready enough to discuss for selection. The goal is to reduce avoidable surprises while keeping room for discovery.
Use this checklist as a conversation tool
| Check | What “good enough” looks like | If missing, try this |
|---|---|---|
| Intent is clear | One or two sentences explain the problem and desired outcome | Rewrite the title and add a short “why” note |
| Scope is bounded | At least one explicit in-scope and one out-of-scope statement | Add “Not included” bullets to prevent assumptions |
| Acceptance criteria exist | A few testable bullets or Given/When/Then for key behavior | Draft 3–5 criteria that define success |
| Open questions are visible | Unknowns are listed, not hidden in people’s heads | Add an “Open questions” section to the PBI |
| Dependencies are surfaced | External needs are noted (approvals, data, other teams) | Add a dependency note and an owner for follow-up |
| Size seems plausible | Team believes it can be completed within a sprint if selected | Split into smaller slices or add a spike to learn |
| Value is explainable | There is a reason it belongs near the top | Add a one-line ordering rationale |
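The checklist above can also be run as a quick lint pass over a PBI record before a refinement conversation. The field names and thresholds in this sketch are illustrative assumptions; the point is to surface gaps, not to gate work.

```python
# Sketch: the "ready enough to discuss" checklist as a quick lint pass.
# PBI field names and thresholds are illustrative assumptions, not a standard.
def readiness_gaps(pbi):
    """Return the checklist items this PBI is still missing."""
    gaps = []
    if not pbi.get("why"):
        gaps.append("Intent is clear")
    if not pbi.get("out_of_scope"):
        gaps.append("Scope is bounded")
    if len(pbi.get("acceptance_criteria", [])) < 3:
        gaps.append("Acceptance criteria exist")
    if "open_questions" not in pbi:
        gaps.append("Open questions are visible")
    if not pbi.get("ordering_note"):
        gaps.append("Value is explainable")
    return gaps

vague = {"title": "Improve onboarding"}
shaped = {
    "title": "Add progress indicator to onboarding checklist",
    "why": "Users drop off after step 2.",
    "out_of_scope": ["No redesign of onboarding content"],
    "acceptance_criteria": ["Indicator on all steps",
                            "Updates on completion",
                            "Returning users see progress"],
    "open_questions": ["Analytics events for step views?"],
    "ordering_note": "Unblocks the onboarding drop-off experiment.",
}

print(readiness_gaps(vague))   # every check flags a gap
print(readiness_gaps(shaped))  # no gaps left to discuss
```

Used as a conversation tool, the output is a to-do list for shaping, not a pass/fail gate.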
How to apply the checklist in practice
- Pick the next few items near the top. Don’t refine the entire backlog.
- Run the checklist quickly. Aim for minutes, not hours, per item.
- Fix only what blocks understanding. Add minimal notes, criteria, or a split.
- Leave discovery where appropriate. If uncertainty is the point, capture it and plan a spike or experiment.
- Re-order if new information changes the trade-offs. Treat ordering as a living decision.
Example: turning a vague item into “ready enough to discuss”
Before: “Improve onboarding.”
After (ready enough to discuss):
- Title: “Add progress indicator to onboarding checklist”
- Why: “Users drop off after step 2; support reports confusion about remaining steps.”
- In scope: Show step count and current step on onboarding pages.
- Out of scope: No redesign of onboarding content; no new steps.
- Acceptance criteria:
- Progress indicator appears on all onboarding steps.
- Indicator updates when the user completes a step.
- Returning users see their current progress.
- Open questions: “Do we need analytics events for step views/completions?”