CareerClimb
Stripe
Performance Review
Promotion
Big Tech
Company Guides
March 13, 2026 · 8 min read

Stripe's Performance Review Process: What Actually Happens

If you joined Stripe expecting a meritocracy (do great work, get promoted), you've probably already noticed the gap between the handbook version of the process and what colleagues who've been through it tell you privately.

Stripe's performance review system has real structure: career ladders, formal promotion cycles, calibration committees. But as at most tech companies, the formal structure is only part of the story. The engineers who understand how the process actually works (who gets nominated, when, and why) are the ones who build the cases that survive calibration.

This guide covers how Stripe's review cycle works from the inside, what the 36-month clock actually means, and what you need to document and demonstrate to build a case that lands.

Stripe's engineering levels at a glance

Before talking about performance reviews, it helps to understand the level structure. Stripe has levels from L1 to L7, but two things catch people off guard.

First, there's no "Senior" title at Stripe. Externally, you're either a Software Engineer or a Staff Software Engineer. Internally, L3 is roughly equivalent to Senior Engineer at most large tech companies, and L4 maps to Staff.

Second, levels have sub-bands. L3 and L4 each split into a and b sub-bands (L3a, L3b, L4a, L4b), which gives managers more granularity when leveling hires and evaluating performance without forcing a full promotion conversation.

Stripe Level | Approximate Equivalent      | Typical Tenure Range
L1–L2        | Entry / Junior              | 0–3 years total experience
L3 (a/b)     | Senior Engineer             | 3–7 years, ~18–36 months at Stripe
L4 (a/b)     | Staff / Principal           | 7+ years, cross-org scope
L5–L7        | Senior Staff, Distinguished | Leadership-scale impact

How Stripe's performance review cycle works

The timing: annual plus mid-year

Stripe runs one primary annual promotion cycle plus a shorter mid-year cycle where uplevels are possible. The mid-year cycle is more limited; not every engineering org participates equally. But it exists, and strong cases can move then.

What this means practically: you have two windows per year to be considered for uplevel. The annual cycle is where most promotions happen and where you should be building toward.

Self-assessment

The formal review process starts with a self-assessment: your written summary of what you accomplished, how your work connected to business outcomes, and where you operated at or above your current level.

Stripe has a strong written culture. The self-assessment isn't just a formality. Your manager uses it as ammunition in calibration. If you can't articulate the scope and impact of your work clearly in writing, your manager has less to work with in the room where decisions actually get made.

The engineers who do this well treat the self-assessment as the final step of a year-long documentation practice, not a one-week sprint before the review window closes. See our software engineer self-review guide for how to structure it.

Peer feedback

After self-assessment, you nominate peers to give feedback on your work. Stripe expects you to include peers from outside your immediate team, and this is taken seriously. An engineer who can only produce strong peer endorsements from within their own team raises questions about cross-org impact.

One pattern that shows up repeatedly among Stripe engineers on Team Blind: the peer feedback step matters more than people expect. A lukewarm review from outside your team can complicate your case in calibration, even if your manager is supportive.

Calibration: where promotions actually get decided

After managers collect self-assessments and peer reviews, they assign an initial performance designation. Then comes calibration.

Calibration at Stripe involves managers presenting their direct reports and defending ratings and promotion recommendations against their peers. The goal is consistency: everyone evaluated against the same bar. In practice, Stripe employees describe calibration as opaque and subject to org-level dynamics. For a deeper look at how calibration works in big tech, see how performance review calibration works.

Two structural realities about calibration matter here:

Your manager's reputation in the room. Stripe engineers are direct about this on Blind: "if your manager has a good reputation you have a chance, otherwise it could be stalled for multiple years." Your work doesn't walk into calibration alone. Your manager walks it in. This manager-dependency is consistent across big tech. Microsoft's Connects system shows the same dynamic in a different structure.

Promotion quotas exist. Stripe limits the number of uplevels per review cycle per org. This isn't publicly acknowledged, but multiple engineers have noted it: "there is a limited number of promotions each cycle per org, so it might not happen even if you are ready." Strong case, wrong quarter, wrong org, and you wait another cycle. Uber's performance review process operates with the same quota constraints.

The 36-month clock

Stripe's promotion expectation is that engineers uplevel within 36 months. The 18–24 month average you'll hear cited is the target; the outer bound is three years.

If you're two years in and haven't had a concrete uplevel conversation with your manager, that's worth acting on. Not a crisis. But worth understanding where you stand and what's blocking the case.

The good news: engineers coming from Google and Amazon describe Stripe's bureaucratic overhead as noticeably lower. One engineer who switched from Google and got promoted in under two cycles described the process as having about ten times less bureaucratic overhead, saying they only spent a few hours on perf during the promo cycle itself. The actual promotion cycle paperwork is relatively light once your impact doc is in shape. The work happens throughout the year.

What gets engineers promoted at Stripe

The official answer is impact at the next level, sustained over time. Here's the more useful breakdown.

An impact doc you've been maintaining all year

The engineers who do well in review season at Stripe aren't writing their impact narrative from scratch in November. They keep a running impact document throughout the year: wins, technical decisions, scope of projects, how their work affected teams outside their own.

This doc feeds your self-assessment and gives your manager specifics to bring into calibration. Vague impact statements lose to specific ones in that room. If your work sits on platform or infrastructure rather than user-facing features, frameworks for quantifying engineering impact without obvious metrics cover exactly this problem.

What to track in your impact doc:

  • Projects you drove end-to-end and their outcomes, in concrete terms
  • Technical decisions and architectural tradeoffs you documented that others referenced
  • Incident post-mortems you wrote that became the org's source of truth
  • Cross-team collaboration where your contribution was visible to the other team's leadership

Cross-team visibility: start earlier than you think

For L4 (Staff) promotions especially, cross-org visibility is the critical factor. Stripe evaluates Staff engineers on breadth and depth of impact: not just what you built, but the scope of who you influenced.

A pattern that shows up from engineers who've made the L3-to-L4 jump: showing up consistently to cross-team syncs, not just to participate, but to connect dots. Pointing out that another team already solved a problem you're facing, or linking teams working on adjacent things. That connective work is visible to the right people without requiring extra headcount or scope.

At the L3 level, the visibility bar is lower. But the muscle is worth building early.

Peer feedback from people outside your manager's org

When you nominate peers for your review, include people from other teams who can speak to your cross-org impact. Peer feedback from within your immediate team is expected. Peer feedback from a tech lead in another org who references specific moments adds credibility in a way that internal endorsements can't.

Work backward from this: in the six months before review season, have you worked with people outside your team in ways they could articulate clearly in writing?

Manager alignment: have the conversation early

Engineers often leave this one until last. Have an explicit conversation with your manager about uplevel expectations before review season, not during it.

That conversation needs to establish three things: whether you're in consideration this cycle or the next, what specific evidence would strengthen your case in calibration, and whether there are org-level quota constraints to be aware of.

If you get vague answers, ask again with more specificity. The engineers who feel blindsided at Stripe are almost always the ones who avoided this conversation until it was too late. If you've been passed over despite a strong case, how to recover from a failed promotion attempt covers the next steps.

Common mistakes at Stripe

Assuming impact is self-evident. Stripe has a strong engineering culture; the default assumption is that your work is high quality. But quality isn't the same as documented impact. If your manager has to reconstruct your year from memory in a calibration session, you've already lost ground to the person whose manager walked in with a tight impact narrative.

Relying only on your immediate team for peer feedback. Cross-org visibility matters at Stripe, and your peer reviewer list reflects that. An all-within-team feedback set reads as limited scope, even if the work was genuinely impactful.

The process-level mistakes are just as costly, and they're more avoidable.

Letting Objectives and Key Results (OKRs) define your entire contribution. Multiple Stripe employees have noted that team OKRs are sometimes set at intentionally high targets. If your impact narrative ties entirely to OKR completion, it's fragile. Document the work that didn't fit neatly into OKRs: incident response, design doc contributions, the technical mentoring that helped another engineer ship faster.

Treating promotion as something your manager handles. Manager advocacy matters at Stripe. But advocacy works when there's a strong underlying case: specific, documented impact your manager can present and defend. The manager walks your case in; you built it.

One last mistake, and it's probably the most fixable:

Not raising the conversation early enough. One engineer who spent time at Stripe noted that raises had minimal correlation to actual output, but strong correlation to the manager relationship. Whether or not that holds across every team, it points to something real: your manager knowing your career goals, well before review season, is a lever you can pull now.

Building your case between now and the review window

If your review cycle is six months out, here's where to focus.

Open your impact doc today if you don't have one. Write down every project you've owned or significantly contributed to this year, the outcome, and who outside your team was affected by the result.

Identify two or three people in other teams who've seen your work. Not to ask for a favor. Just make sure those working relationships are solid enough that those engineers could say something specific about your contributions.

Have the explicit uplevel conversation with your manager. This cycle or next? What would strengthen the case? Are there quota constraints to know about?

Then document your next three months with calibration in mind. The decisions and contributions that demonstrate scope beyond your current level: capture them as they happen, not three weeks before the review window closes.

The case you build between now and the review window is the one your manager presents. Make it easy for them.


Build your case with CareerClimb

Stripe's review process rewards the engineers who document their impact throughout the year, not the ones who reconstruct it in a week before the self-assessment deadline.

CareerClimb is an AI coaching app that helps you log wins as they happen, map your contributions to your company's level expectations, and walk into every review cycle with your evidence already organized. Your AI coach Summit knows your situation, tracks your progress all year, and helps you build a case that's ready when the review window opens.

Download CareerClimb
