CareerClimb
March 29, 2026 · 7 min read

How to Frame Impact vs Output in a Promotion Case


You shipped more features than anyone on your team last year. You reviewed hundreds of PRs. You were on-call for twelve straight weeks while half the team was on vacation. And when the promotion decisions came back, someone who shipped half as much got the nod instead of you.

The feedback from your manager was some version of: "Your work is solid, but the committee wanted to see more impact."

You're staring at the word "impact" like it's a riddle. You had plenty of output. What else is there?

That gap between output and impact is where most promotion cases die. Not because the engineer didn't do the work, but because the case document described activity instead of outcomes.

What output looks like in a promotion case

Output is a list of things you did. It's the default way engineers write about their work because it's concrete and easy to recall.

  • "Led the migration to the new auth framework"
  • "Reviewed 200+ PRs this quarter"
  • "Shipped the payments redesign on time"
  • "Mentored three junior engineers"
  • "Participated in 12 on-call rotations"

Every item on that list is true. Every item describes real work. And every item tells a promotion committee exactly nothing about whether you should be at the next level.

The problem isn't that these accomplishments are weak. The problem is that they're incomplete. A junior engineer can review 200 PRs. An L4 can ship a feature on time. The committee is reading your case looking for one thing: what changed because you did this work? If the answer isn't in the document, the committee fills in the blank with "nothing unusual."

What impact looks like

Impact is the answer to "so what?" It's what changed in the business because your work existed.

Take the same list and push each item one step further:

  • Output: Led the auth framework migration → Impact: Reduced login latency from 800ms to 200ms, eliminating the #1 customer-reported friction point
  • Output: Reviewed 200+ PRs → Impact: Caught a recurring data consistency bug during reviews that led to a team-wide schema validation initiative, cutting production incidents by 30%
  • Output: Shipped payments redesign on time → Impact: Checkout abandonment dropped 18% in the first month after launch
  • Output: Mentored three junior engineers → Impact: Two of them independently led their first design reviews by end of quarter
  • Output: 12 on-call rotations → Impact: Wrote the runbook overhaul that reduced mean time to resolve from 45 minutes to 12 minutes for the three most common alert types

Same work. Completely different story. The output column says "I was busy." The impact column says "my work moved numbers that the business cares about."

As Gergely Orosz writes, software engineers are in the business of solving problems, not in the business of shipping features. The promotion committee already assumes you ship features. That's your current-level job. The case for the next level needs to show what those features solved.

Why committees respond to impact over output

Promotion committees at most big tech companies review dozens of candidates in a single session. At Google, a calibration meeting might cover 15-30 people in a few hours. At Amazon, your promo doc passes through multiple layers of review. Nobody is reading six pages of activity logs carefully.

When your manager stands up to pitch your case, they get roughly five minutes. In those five minutes, they need to make the argument that you belong at the next level. "She shipped a lot of features" is not an argument. "She reduced checkout abandonment by 18% through a payments redesign she scoped and drove cross-team" is an argument. Writing a promotion case document forces this translation — you can't fill out the evidence section with activity logs and expect it to hold up in calibration.

Output lists blur together because every engineer ships features. Impact statements stand out because they answer the question the committee is actually asking: did this person's work matter beyond their immediate task list?

Will Larson's guide on promotion packets puts it directly: if a promotion packet is all about the features someone has done or the technology they used, it will fail the impact test. The committee needs to see that your work expanded the definition of "contribution" beyond individual output.

How to reframe output as impact

The conversion from output to impact isn't creative writing. It's a mechanical process you can run on every line in your promotion case.

The "which led to" chain

Start with the output statement. Then keep asking "which led to..." until you hit something the business measures.

"I built the internal deployment dashboard" → which led to → "deploy time dropped from 45 minutes to 8 minutes across 12 teams" → which led to → "roughly 30 engineer-hours saved per week, and two teams that had been batching deploys weekly started shipping daily."

You don't need to reach the final link in the chain every time. But you need to get past the first one. "I built the dashboard" is output. "Deploy time dropped from 45 to 8 minutes" is impact. "Teams changed their shipping behavior as a result" is organizational impact, which is what senior and staff-level cases need.

The three-sentence pattern

For each accomplishment in your case, write exactly three sentences:

  1. What you did (the action, specific enough that someone could find the PR or design doc)
  2. What changed (the measurable outcome)
  3. Where the proof lives (link to the dashboard, postmortem, or doc that validates the claim)

That's it. Not a paragraph of context. Not a story about how hard it was. Three sentences that make the committee's job easy.

Here's what that looks like in practice:

"Identified that our alerting configuration was generating 200+ false-positive pages per month, causing on-call engineers to ignore real incidents. Redesigned the alerting thresholds and runbooks for the three highest-noise services. Mean time to resolve dropped from 45 minutes to 12 minutes over the following quarter. [Postmortem and dashboard link]."

A committee member reads that in fifteen seconds and understands exactly what happened, why it mattered, and where to verify it.

When you genuinely don't have a number

Not every piece of work has a clean metric. Infrastructure work, developer tooling, and platform improvements often don't map to a revenue number. That's fine. Impact doesn't have to be a percentage. Understanding what counts as a win when your job is shipping code helps here — adoption, unblocking, risk reduction, and velocity changes are all valid forms of impact. And when the work genuinely has no obvious number attached, there are concrete techniques for building one from first principles: before/after snapshots, time-saved calculations, frequency counts, and counterfactual framing.
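To make the time-saved technique concrete, here is a back-of-envelope calculation for the deploy-dashboard example from earlier. The before/after deploy times and team count come from that example; the deploy cadence is a hypothetical assumption, not data from any real case:

```python
# Back-of-envelope "time saved" estimate for the deploy-dashboard example.
# The deploys-per-week cadence is an assumed number for illustration.
minutes_saved_per_deploy = 45 - 8    # deploy time before vs after the dashboard
deploys_per_team_per_week = 4        # assumed cadence per team
teams = 12

weekly_minutes_saved = minutes_saved_per_deploy * deploys_per_team_per_week * teams
weekly_hours_saved = weekly_minutes_saved / 60   # 37 * 4 * 12 / 60 = 29.6
```

A calculation like this, with its assumptions stated, is exactly the kind of "roughly 30 engineer-hours saved per week" claim a committee can sanity-check rather than dismiss.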

Other forms of impact that committees recognize:

  • "The migration framework I built was adopted by 8 teams in the first quarter" (adoption is impact)
  • "The API redesign unblocked three teams that had been waiting on a shared dependency for two months" (unblocking is impact)
  • "The chaos testing framework I introduced caught two critical failure modes before they hit production" (risk reduction is impact)
  • "After the CI pipeline overhaul, median build times dropped from 20 minutes to 4 minutes. The team went from shipping twice a week to daily deploys." (velocity change is impact)

The common thread: something changed in the world because your work existed. If you can't name what changed, either the impact hasn't happened yet (and you should wait) or you need to dig deeper to find it.

The mistake that costs the most time

Engineers who focus on output tend to write their promotion case at the end of the year, looking backward. They reconstruct a list of projects from memory, describe what they did, and submit it.

Engineers who focus on impact tend to log wins as they happen, noting the outcome alongside the work. When it's time to write the case, they already have the impact framing because they captured it in the moment, when the metrics were fresh and the context was clear. That same discipline is what determines what survives your promo packet and what quietly weakens it — volume metrics without outcomes are the first thing committees discount.

The difference isn't talent or political savvy. It's a documentation habit. And the longer you wait to capture the impact of your work, the harder it gets. Three months after shipping that critical project, you'll remember you did it. You won't remember the specific latency numbers or the adoption curve or the quote from the PM who said it unblocked their roadmap.

A quick test for your promotion case

Read through every line in your document and ask two questions:

  1. Could a competent engineer at my current level have written this same line? If yes, it's describing current-level output. Cut it or reframe it.
  2. Does this line contain a verb like "led," "built," or "shipped" without a follow-up clause about what changed? If yes, you've stopped at output. Add the "which led to" chain.

If more than half your promotion case survives those two filters without needing changes, you're already framing for impact. If most of it fails, you have the exact edit list you need.
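The second filter is mechanical enough that you can run a rough first pass over your draft automatically. This is a toy heuristic, not a substitute for a careful read, and both word lists are illustrative rather than exhaustive:

```python
import re

# Toy check for filter #2: flag lines that use an "output" verb
# without any follow-up clause about what changed.
# Both word lists are illustrative, not exhaustive.
OUTPUT_VERBS = re.compile(r"\b(led|built|shipped|reviewed|mentored|drove)\b", re.I)
OUTCOME_CUES = re.compile(
    r"\b(which led to|reduc\w*|cutting|dropped|improved|unblocked|adopted|saving)\b",
    re.I,
)

def stops_at_output(line: str) -> bool:
    """True when a line names an action but never says what changed."""
    return bool(OUTPUT_VERBS.search(line)) and not OUTCOME_CUES.search(line)

flagged = [
    line
    for line in [
        "Led the migration to the new auth framework",
        "Shipped the payments redesign, cutting checkout abandonment 18%",
    ]
    if stops_at_output(line)
]
# The first line is flagged; the second already carries its outcome clause.
```

Anything the check flags is a candidate for the "which led to" treatment; anything it passes still deserves a human read for whether the outcome is one the business actually measures.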

The promotion committee doesn't need to know how hard you worked. They need to know what changed because you were there. Give them that, and the five-minute pitch your manager makes on your behalf becomes a lot easier to win.
