Excel for Analysts: Modeling, Scenario Testing, and Audit-Friendly Spreadsheets

Assumptions Tables and Controlled Inputs

Chapter 2

Estimated reading time: 12 minutes

Why assumptions tables matter in analytical models

Most spreadsheet models fail for one of two reasons: inputs are scattered across many sheets, or inputs are easy to change in ways that are hard to detect. Assumptions tables solve both problems by centralizing model drivers (prices, volumes, growth rates, conversion rates, tax rates, discount rates, timing, etc.) into a structured, readable, and auditable area. Controlled inputs add a second layer: they restrict what can be entered, document what is allowed, and make changes traceable.

An assumptions table is not just a list of numbers. It is a small system with rules: consistent units, clear time alignment, explicit scenario selection, and guardrails that prevent accidental or invalid entries. When built well, it becomes the single source of truth for the model’s drivers, and it makes scenario testing faster because you can change a few cells (or a scenario selector) and see the model update predictably.

What qualifies as an “assumption”

In practice, assumptions are values that are not calculated from other parts of the model, but instead represent external facts, management decisions, or analyst judgments. Typical categories include:

  • Market and demand drivers: units sold, growth rates, churn, conversion rates, seasonality indices.
  • Pricing and revenue drivers: price per unit, discount rate, mix percentages, take rate.
  • Cost drivers: COGS per unit, variable cost rates, headcount plan, salary bands, inflation.
  • Financial and accounting drivers: tax rate, depreciation lives, working capital days, payment terms.
  • Timing and one-off events: launch month, ramp periods, step changes, capex schedule.

Anything that a user might reasonably want to change during scenario testing belongs in the assumptions area. Anything that is derived from other values should generally not be typed in as an assumption, because that creates hidden circular logic and breaks auditability.

Designing an assumptions table that scales

Principles: clarity, consistency, and minimal friction

A good assumptions table is optimized for reading and changing, not for compactness. Aim for:

  • One place to look: avoid duplicating the same assumption in multiple locations.
  • Consistent time structure: if the model is monthly, assumptions should be monthly (or clearly mapped from annual to monthly).
  • Explicit units: show whether a value is %, currency, units, or days.
  • Stable layout: avoid inserting rows/columns that break references; use Excel Tables where appropriate.
  • Separation of input vs. calculated: inputs should be visually distinct and ideally located in a dedicated block.

Common table layouts

There are two common patterns, and you can choose based on the model’s needs:

  • Wide time-series table: rows are assumptions, columns are periods (e.g., Jan–Dec). This is ideal for time-dependent drivers like price changes or seasonality.
  • Long “database” table: columns like AssumptionName, Period, Value, Scenario. This is powerful for advanced filtering and pivoting, but requires more formula sophistication.

For most analyst models, a wide time-series table is the fastest to implement and easiest to audit.

Step-by-step: build a wide assumptions table with Excel Table features

This walkthrough creates a structured assumptions area that is easy to reference and hard to break.

Step 1: Create a dedicated assumptions block

  • Choose a sheet area reserved for inputs (for example, top-left of an “Inputs” sheet).
  • Add columns: Assumption, Unit, Notes, then period columns (e.g., 2026-01, 2026-02, …).
  • Keep the first three columns non-time metadata so users can understand what they are editing.

Step 2: Convert to an Excel Table

  • Select the range and use Insert > Table.
  • Ensure “My table has headers” is checked.
  • Name the table (Table Design > Table Name) something like tblAssumptions.

Why this helps: Tables expand automatically, structured references are more readable, and you reduce the chance of missing new rows in formulas.
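For example, a lookup against the table reads almost like a sentence (this sketch assumes the table is named tblAssumptions with an Assumption column and a period column headed 2026-01, and that XLOOKUP is available, i.e., Excel 365 or Excel 2021):

=XLOOKUP("Tax rate", tblAssumptions[Assumption], tblAssumptions[2026-01])

Compare this with a cell-range equivalent like VLOOKUP over $A$2:$N$40: the structured version remains correct when rows are added and is self-documenting in an audit.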

Step 3: Standardize units and formats

  • Use consistent number formats: percentages as % with a fixed number of decimals; currency with a consistent symbol; days as integers.
  • In the Unit column, use a controlled list (e.g., %, $, units, days) to avoid variations like “USD” vs “$”.

Step 4: Add a “Notes” discipline

  • In the Notes column, capture the source or logic: “Contracted price through Q2”, “Management target”, “Based on last 12 months”.
  • Keep notes short but specific enough that another analyst can understand the intent.

Controlled inputs: preventing bad data from entering the model

Controlled inputs are techniques that restrict or validate what users can enter. The goal is not to make the spreadsheet annoying to use; it is to prevent silent errors like typing 15 instead of 15%, entering a date outside the modeled range, or selecting a scenario name that does not exist.

Data Validation as the first line of defense

Excel’s Data Validation can enforce rules at the cell level. Use it for:

  • Lists: scenario names, product names, regions, cost categories.
  • Numeric bounds: tax rate between 0% and 40%, discount rate between 0% and 30%.
  • Date bounds: start date must be within the model horizon.
  • Custom formulas: more complex checks like “must be a multiple of 5” or “must not be blank if another cell is filled”.
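As a sketch of the custom-formula rules mentioned above (assuming the validated input cell is A1 and the related cell is B1; adjust references to your layout), the first enforces "must be a multiple of 5" and the second blocks a blank A1 whenever B1 is filled:

=MOD(A1,5)=0

=NOT(AND(ISBLANK(A1), NOT(ISBLANK(B1))))

Enter these under Data Validation > Allow: Custom; the relative reference adjusts automatically across the selected range.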

Step-by-step: create a scenario selector with a validated dropdown

Step 1: Create a scenario list

  • In a small range, list scenario names vertically: Base, Upside, Downside.
  • Convert that range to a Table or define a named range (e.g., rngScenarios).

Step 2: Create the selector cell

  • Choose a single cell (e.g., B2) labeled “Scenario”.
  • Apply Data > Data Validation, allow List, source = your scenario list range.

Step 3: Add an input message and error alert

  • Input message: “Select a scenario; do not type.”
  • Error alert (Stop): “Invalid scenario. Choose from the list.”

Result: scenario selection becomes consistent, and formulas that depend on scenario names won’t break due to typos.

Step-by-step: validate numeric assumptions with bounds and units

Suppose you have an assumption row for “Tax rate” and you want to prevent entries like 2.5 (meaning 250%) or -5%.

  • Select the tax rate input cells across periods.
  • Data Validation: Allow = Decimal, Data = between, Minimum = 0, Maximum = 0.4.
  • Set the format to Percentage so users see 25% rather than 0.25.

For values that must be integers (e.g., “Days sales outstanding”), use Allow = Whole number and set reasonable bounds (e.g., 0 to 180).

Scenario-ready assumptions: Base/Upside/Downside without chaos

Scenario testing becomes messy when you copy entire sheets or maintain separate workbooks. A cleaner approach is to store scenario-specific assumptions in a structured way and use a single selector to pull the correct values.

Pattern A: separate tables per scenario (simple and readable)

Create three tables with identical structure: tblAssump_Base, tblAssump_Upside, tblAssump_Downside. Each table has the same assumption names and period columns. Then use a selector to choose which table feeds the model.

Pros: easy to audit; you can compare scenarios side-by-side. Cons: more space; must keep structures synchronized.

Pattern B: one table with a Scenario column (compact and scalable)

Use a long table with columns: Scenario, Assumption, Period, Value, Unit, Notes. Then retrieve values using lookup logic based on Scenario + Assumption + Period.

Pros: scales to many scenarios; easy to filter. Cons: requires more complex formulas and careful performance management.
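A minimal retrieval sketch for this pattern (assuming a long table named tblAssumpLong, the scenario selector in $B$2, the assumption label in $A5, and the period header in D$1 — all names here are illustrative):

=SUMIFS(tblAssumpLong[Value], tblAssumpLong[Scenario], $B$2, tblAssumpLong[Assumption], $A5, tblAssumpLong[Period], D$1)

Note that SUMIFS silently returns 0 when no row matches, so pair it with an integrity check such as =COUNTIFS(tblAssumpLong[Scenario], $B$2, tblAssumpLong[Assumption], $A5, tblAssumpLong[Period], D$1)=1 in a flag row.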

Step-by-step: implement Pattern A with a selector-driven retrieval

Assume you have a model calculation area that needs “Price per unit” for each month. You want the model to pull the price row from the selected scenario table.

Step 1: Ensure identical assumption labels

  • In each scenario table, the Assumption column must match exactly (e.g., “Price per unit”).
  • Period headers must match exactly (e.g., 2026-01, 2026-02).

Step 2: Create a mapping cell for the active table

  • Use the scenario selector cell (e.g., B2) to determine which table to reference.
  • Because structured references cannot easily switch table names dynamically, use a controlled approach: create an “Active Assumptions” block that is populated based on scenario, or use CHOOSE with fixed ranges.

Step 3: Use CHOOSE to switch among scenario ranges

Example approach (conceptual): assign an index to scenarios (Base=1, Upside=2, Downside=3). Then use CHOOSE to select the appropriate range for lookups.

=LET(
    scn, $B$2,
    idx, XMATCH(scn, {"Base","Upside","Downside"}),
    rng, CHOOSE(idx,
        tblAssump_Base[[2026-01]:[2026-12]],
        tblAssump_Upside[[2026-01]:[2026-12]],
        tblAssump_Downside[[2026-01]:[2026-12]]),
    rowIdx, XMATCH("Price per unit", CHOOSE(idx,
        tblAssump_Base[Assumption],
        tblAssump_Upside[Assumption],
        tblAssump_Downside[Assumption])),
    INDEX(rng, rowIdx, 1))

This example returns the first month’s price; you can adapt the column index to match the period needed. The key idea is controlled switching: only predefined scenario tables are allowed, so you avoid arbitrary references.

Practical tip: if you find CHOOSE formulas too complex, a robust alternative is to create a single “Active” table that is filled via Power Query or a simple copy/paste macro, but only from controlled sources. The main objective remains: one selector, one active set of inputs.
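A commonly used, if less robust, alternative builds the table reference from the selector text with INDIRECT (this sketch assumes the Pattern A naming convention tblAssump_Base / tblAssump_Upside / tblAssump_Downside and the selector in $B$2):

=XLOOKUP("Price per unit", INDIRECT("tblAssump_"&$B$2&"[Assumption]"), INDIRECT("tblAssump_"&$B$2&"[2026-01]"))

Two caveats: INDIRECT is volatile (it recalculates on every workbook change, which can slow large models), and it breaks silently if a table is renamed, so keep the selector validated against the exact table-name suffixes.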

Making assumptions auditable: transparency without clutter

Use explicit labels and stable keys

Assumption names should be stable identifiers, not prose. Prefer “COGS per unit” over “Cost of goods sold per unit (estimated)”. Put the extra commentary in Notes. Stable keys make lookups reliable and reduce the risk of broken references when wording changes.

Separate “raw inputs” from “derived inputs”

Sometimes you need to input an annual number but the model runs monthly. Instead of typing monthly values manually, store the annual assumption as the raw input and calculate the monthly series in a derived block.

Example: Annual inflation rate is an input; monthly inflation factor is derived. This reduces manual work and keeps the logic consistent.

=LET(annual_rate, $C$10, monthly_rate, (1+annual_rate)^(1/12)-1, monthly_rate)

Then apply the monthly rate across periods in a calculated row, clearly labeled as derived.
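For instance, assuming the derived monthly rate sits in $C$11 and the period index (1, 2, 3, …) runs along row 1 of the derived block (both references are illustrative), a cumulative inflation index for each period is:

=(1+$C$11)^D$1

Copied across the row, this produces the full monthly series from a single raw input, so extending the timeline never requires retyping values.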

Document sources in a structured way

Beyond Notes, consider adding a Source column with controlled values such as “Contract”, “Historical”, “Management”, “Benchmark”, “Analyst estimate”. This makes reviews faster: reviewers can filter to “Analyst estimate” and focus their attention where judgment is highest.

Input controls beyond Data Validation

Conditional formatting to highlight editable cells and detect issues

Conditional formatting can serve two purposes: usability (show users where to type) and error detection (flag out-of-range values).

  • Editable cell shading: apply a light fill color to input cells only. Keep calculated cells unshaded.
  • Out-of-range flags: apply a rule that highlights values outside expected bounds.

Example: highlight any percentage assumption greater than 1 (100%) if your model expects rates in decimal form.

=A1>1

Apply this rule to the relevant input range (adjust the formula to the top-left cell of the applied range).

Preventing blanks where blanks are dangerous

Blanks can be valid (unknown) or dangerous (accidentally cleared). For assumptions that must always be present (e.g., tax rate, discount rate), enforce non-blank entries with Data Validation (Custom):

=NOT(ISBLANK(A1))

Pair this with an error alert that explains what to do: “This assumption cannot be blank. Enter a value or use 0 if not applicable.”

Controlled input “switches” (toggles) for model behavior

Many models need binary choices: include/exclude a cost, choose between two methods, turn on a feature. Implement these as controlled switches rather than ad hoc text.

  • Use a dropdown with values {0,1} and label it clearly (0=Off, 1=On).
  • Or use {“Off”, “On”} but ensure formulas map it to numeric logic consistently.

Example: a switch that turns marketing spend on/off.

=IF($B$5=1, MarketingSpend, 0)

Then validate B5 as a list {0,1} and add an input message explaining the meaning.

Time alignment: controlling how assumptions map to periods

Even with a clean table, many errors come from misaligned time: an annual rate applied monthly, a start date that shifts one column, or a step change applied in the wrong month. Controlled inputs should include controls for timing.

Step-by-step: use an effective date to drive a step change

Suppose price increases from $10 to $11 starting in July 2026. Instead of manually typing $11 from July onward (which is error-prone if the timeline changes), store:

  • Old price
  • New price
  • Effective date (month)

Then calculate the monthly price series using the model’s period headers.

=LET(old, $C$2, new, $C$3, eff, $C$4, period, D$1, IF(period<eff, old, new))

This approach is controlled: the only manual entries are the two prices and the effective date, and the series updates automatically if you extend the timeline.

Practical example: a compact assumptions table for a revenue model

Below is an example of assumptions you might include for a simple subscription revenue model. The key is not the specific drivers, but the structure and controls.

  • New customers (units) by month (validated as whole number, min 0)
  • Conversion rate (%) (validated between 0 and 1, formatted as %)
  • Monthly churn rate (%) (validated between 0 and 0.2, formatted as %)
  • ARPU ($/month) (validated as decimal, min 0)
  • Discount rate (%) (validated between 0 and 0.5)
  • Tax rate (%) (validated between 0 and 0.4)
  • Scenario selector (dropdown list)

To make it audit-friendly, add Notes and Source columns, and apply conditional formatting to highlight any entry that violates expected ranges even if it technically passes validation (for example, churn of 19% might be allowed but should be visually flagged for review).

Common pitfalls and how to avoid them

Mixing units (percent vs. basis points vs. decimals)

Decide one representation for rates (typically decimals with % formatting) and enforce it with validation and formatting. If you must accept basis points, add a separate unit field and convert in a derived row rather than mixing formats in the same input range.
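A sketch of the conversion, assuming the unit field is in $B5 and the entered value in $C5 (hypothetical references): the derived row normalizes everything to a decimal rate.

=IF($B5="bps", $C5/10000, $C5)

Validate $B5 as a list (e.g., "decimal", "bps") so the conversion branch can never receive an unexpected unit label.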

Hardcoding values inside formulas

When a value is likely to change (tax rate, inflation, growth), do not embed it in formulas. Put it in the assumptions table and reference it. This makes scenario testing possible and reduces hidden logic.

Over-validating and blocking legitimate cases

Validation bounds should reflect realistic ranges, but avoid making them so tight that users cannot model edge cases. If you need strict controls for production use, consider a “Review mode” where validation is strict, and a “Sandbox mode” where bounds are wider, but still present.

Allowing free-text categories that drive calculations

If a text value affects logic (e.g., “Method = Straight-line vs Accelerated”), do not allow free typing. Use a dropdown list and map each option to a defined calculation path.

Now answer the exercise about the content:

Which approach best improves auditability and reduces silent errors when building scenario-ready Excel models?

Answer: An assumptions table centralizes key drivers, while controlled inputs (e.g., validated lists, numeric bounds, explicit units) prevent invalid entries and make scenario changes predictable and auditable.

Next chapter

Structured Calculations and Transparent Logic
