CareerClimb
March 1, 2026 · 11 min read

Meta performance review process for software engineers

You've been at Meta for a year. Your tech lead says your work is strong. Your manager gives you positive signals in your 1:1s. Then Performance Summary Cycle (PSC) results come back, and you get Meets Most (MM).

Now you're on a 6-month check-in clock, and one more MM means you're automatically on a Performance Improvement Plan (PIP).

This is the Meta performance review experience for a meaningful number of engineers who work hard but don't understand how the system actually operates. The PSC rewards specific behaviors, and positive feedback from your manager during the cycle doesn't guarantee a strong outcome if you haven't structured your evidence the right way.

This guide covers how Meta's review process actually works: the cycle structure, the ratings, how calibration operates, and what engineers who consistently get Exceeds or Greatly Exceeds do differently.

Meta's PSC structure at a glance

Meta runs one formal PSC per year, supplemented by a 6-month check-in that applies to engineers rated below expectations. Meta has recently rebranded aspects of the PSC as "performance @," and as of 2026 the mid-year Checkpoint check-in program is transitioning into a second full review, moving Meta toward two full review cycles per year.

| Component | Timing | Who participates |
| --- | --- | --- |
| Formal PSC | Annual (results typically January) | All engineers |
| 6-month check-in | Mid-year | Engineers rated Meets Most or below |
| Calibration window | After submissions close (~2 weeks) | Managers only |
| Promotion window | Tied to PSC; mid-year possible with strong case | Manager nomination required |

The review involves four inputs: your self-review, peer feedback from 3-5 nominators you choose, your manager's assessment, and upward feedback you write about your manager.

Meta's leveling structure

Understanding Meta's levels is essential to understanding what the review system measures you against. Each calibration session is level-specific: your E4 work is compared against other E4 engineers, not against the org overall.

| Level | Title | Typical tenure | Promotion pressure |
| --- | --- | --- | --- |
| E3 | Software Engineer II | 0-2 years | Must reach E4 within 24 months |
| E4 | Software Engineer III | 2-4 years | Must reach E5 within 33 months from E4 |
| E5 | Senior Software Engineer | 4-6 years | Terminal level for most; no forced timeline |
| E6 | Staff Software Engineer | 6-10 years | No formal timeline; requires broad scope |
| E7+ | Senior Staff and above | 10+ years | Org-level impact required |

Meta has an explicit up-or-out policy at E3 and E4. If you don't reach E4 within 24 months, or E5 within 33 months of reaching E4, you will be exited. This timeline pressure makes the PSC higher-stakes for early-career engineers than at most other tech companies.

The ratings scale: what each one actually means

Meta uses a 7-tier ratings scale. Based on data from TeamRora, which has supported 65+ Meta employees through performance reviews and PIPs, the approximate distribution is:

| Rating | Approx. distribution | Bonus multiplier | Implications |
| --- | --- | --- | --- |
| Too New to Evaluate | N/A (new hires) | Equivalent to Meets All | First review only |
| Meets Some (MS) | ~2% | 0x | Automatic PIP |
| Meets Most (MM) | ~8% | 0.65x | 6-month check-in; 2 consecutive = PIP |
| Meets All (MA) | ~45% | 1.0x | Passing grade; standard outcome |
| Exceeds (EE) | ~35% | 1.25x | Above bar; standard promo-track signal |
| Greatly Exceeds (GE) | ~7% | 1.65x | Strong promo case; significant comp upside |
| Redefines Expectations (RE) | ~3% | 2.5x | Exceptional; rare at most levels |
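To make the multipliers concrete, here is a quick sketch of how a rating multiplier typically flows through to a bonus payout. The base salary and 15% target-bonus figures are illustrative assumptions, not from this article, and the exact mechanics of Meta's bonus formula may differ:

```python
def psc_bonus(base_salary: float, target_bonus_pct: float, multiplier: float) -> float:
    """Annual bonus = base salary x target bonus % x PSC rating multiplier.

    Assumes the common structure where the multiplier scales a target bonus;
    all inputs here are hypothetical, not Meta-confirmed figures.
    """
    return base_salary * target_bonus_pct * multiplier

# Hypothetical engineer with a $200,000 base and a 15% target bonus:
for rating, mult in [("Meets Most", 0.65), ("Meets All", 1.0),
                     ("Exceeds", 1.25), ("Greatly Exceeds", 1.65)]:
    print(f"{rating}: ${psc_bonus(200_000, 0.15, mult):,.0f}")
# Meets Most: $19,500 / Meets All: $30,000 / Exceeds: $37,500 / Greatly Exceeds: $49,500
```

Under these assumptions, the gap between Meets Most and Greatly Exceeds on the same base salary is roughly $30,000 in a single cycle, which is why calibration outcomes matter well beyond the rating label itself.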

A few things worth understanding about these numbers.

Meta does not enforce a forced distribution at the team level. Your manager isn't required to give a certain percentage of Meets All or Exceeds. However, at the org level, leadership reviews distributions and will push back on calibration results that look significantly off from these ranges.

Meets All is not a warning sign. It's the expected outcome for engineers performing solidly at their level. But it also doesn't move a promotion case. If you're aiming to advance, Exceeds is the target, and Greatly Exceeds is what creates a compelling promotion argument.

The Meets Most trap is real. Getting MM once starts a 6-month check-in process. Getting it twice triggers a PIP. Engineers near the MM/MA boundary carry more uncertainty than those clearly above or clearly below. If you receive a rating below expectations, what to do after a bad performance review covers how to respond and rebuild your standing before the next cycle.

How the PSC actually works

Phase 1: Feedback collection

You write a self-review (approximately 1,000 words), nominate 3-5 peers to write feedback about you, write feedback for others who nominate you, and write upward feedback for your manager. Your manager then reviews all inputs and drafts their own assessment.

Your self-review is structured around Meta's four performance dimensions:

  • Project Impact: what you shipped and what changed because of it
  • Direction: how you influenced the roadmap and priorities, not just executed on them
  • Engineering Excellence: code quality, reliability, scalability, maintainability, technical leadership
  • People: mentorship, cross-team collaboration, raising the team's overall quality bar

Covering all four dimensions matters. A self-review that addresses only Project Impact and Engineering Excellence leaves Direction and People under-evidenced, and calibration reviewers will notice.

One Meta-specific factor that doesn't exist at most other companies: your Workplace posts. Meta's internal social network is treated as evidence. Engineers who answer questions publicly, share project updates, and document decisions create a trail of artifacts that feed into peer feedback and self-review content. Engineers who do strong work quietly have less material to draw from.

Phase 2: Calibration

After submissions close, your manager takes your package (self-review, peer feedback, and their own assessment) into a calibration session with other managers at your level. The goal is to normalize ratings across the org so that Exceeds at one team means the same thing as Exceeds at another.

Calibration at Meta is competitive. The number of Greatly Exceeds and Redefines Expectations slots per level is limited by org-level distribution expectations. Your manager has to argue that your Exceeds or Greatly Exceeds is more defensible than someone else's. The managers who win those arguments have specific evidence to point to. The managers who struggle are saying "they're really great" without context.

This manager-dependency is one of the most consistent patterns reported by engineers navigating Meta's PSC. From multiple sources that have supported Meta engineers through performance reviews: "Since your manager is the one presenting your package at calibration and advocating on your behalf, they are for all intents and purposes the gatekeeper for your promotion."

This is why working with your manager isn't just good career advice at Meta. It's how the process actually functions. An engineer who hasn't kept their manager closely informed will see that gap reflected in the calibration packet.

Phase 3: Results and follow-up

Results come through the review tool, followed by a 1:1 with your manager. Calibration conversations are confidential, so you won't know what was said in the room.

If your rating is below Meets All, you'll receive a 6-month check-in date. If you received Meets Most in the prior cycle, you're already on your second-strike clock.

What a strong Meta self-review looks like

Given the 1,000-word limit and four performance dimensions, most effective self-reviews at Meta are structured, not narrative. Here's a practical approach:

Project Impact is the highest-weight dimension: lead with it. Focus on your most significant contribution: what you shipped, what moved because of it, and how it connected to team or company priorities. Quantify where possible. "Shipped the redesigned checkout flow that reduced payment drop-off from 12% to 7%" is more useful than "improved the checkout experience."

Direction is where engineers operating at E5+ need to be explicit. Did you influence what got built, not just how it was built? Did you push back on scope that would have created technical debt? Did you identify the problem before the solution was defined? These contributions are often invisible in a task-focused self-review.

Engineering Excellence requires specifics. "Maintained high code quality" doesn't survive calibration. "Led the team's migration to the new testing framework, reducing flaky test rate from 18% to 3%" gives your manager a line they can read out loud in the calibration room.

For the People dimension, document actual instances: the engineer whose architecture improved after a review you gave, the cross-team collaboration you drove, the question you answered on Workplace that saved another team two days of debugging. Vague "collaborated effectively" language helps no one.

For more on writing self-reviews that work in calibration, read the complete guide to writing your software engineer self-review. For a foundational understanding of what calibration is and why it matters, see how performance review calibration works. When nominating peers, the guide on how to write peer feedback that survives calibration also applies in reverse: it describes exactly what your reviewers should be submitting on your behalf.

Common mistakes Meta engineers make

  • Not documenting during the cycle. Engineers who scramble during PSC season are working from memory. The ones who write strong self-reviews were tracking wins in real time.

  • Writing for four dimensions but going shallow on all of them. A self-review that covers everything superficially is weaker than one that goes deep on two or three dimensions with real evidence.

  • Treating peer nominations as a social obligation. At Meta, who writes your peer reviews matters for calibration. Choose people who saw specific work, not people who will write generically positive reviews.

  • Missing the Direction dimension. Most engineers focus on what they built. Calibration reviewers specifically look for whether you showed judgment about what should be built. If you never address that question, the answer defaults to "no."

  • Assuming manager alignment means calibration alignment. Your manager may genuinely believe you deserve Exceeds or Greatly Exceeds. But if they can't defend that position against competing claims in the calibration room, the rating can change.

  • Underestimating how much Workplace posts matter. If you're not documenting your thinking publicly inside Meta, you're leaving peer feedback and self-review material on the table.

How Meta's review connects to promotion

Promotions at Meta require your manager to build and present a promotion case: a packet that goes beyond your PSC review to include a narrative about your scope, impact, and readiness for the next level. This is separate from the calibration process, though a strong calibration result (Exceeds or above) makes the promotion argument substantially easier.

Mid-year promotions are possible even under the annual cycle, but require your manager to make a case outside the normal window. This typically happens for engineers who have been demonstrating next-level scope for a full cycle and have strong PSC results to point to.

One thing that separates Meta's process from Google's review system: manager advocacy and timing can be influenced by leverage. Sources that have supported 65+ Meta engineers through this process have noted that holding an offer from another team or company has changed outcomes for engineers delayed for reasons unrelated to performance. This is a real dynamic, not one most engineers want to use, but worth understanding as part of how the system operates.

What engineers who got promoted at Meta consistently did

Across accounts from engineers who advanced through the levels at Meta, a few patterns repeat.

They treated the four performance dimensions as a checklist, not an afterthought. Before starting each new cycle, they identified which projects would generate evidence for each dimension.

They kept their managers informed in real time. Not just in 1:1s, but through Workplace posts and direct updates that created a paper trail. When calibration came, their managers had recent, specific, documented evidence to draw from.

They built peer relationships that generated meaningful feedback. Not shallow visibility, but actual collaboration with engineers who could describe specific decisions, specific outcomes, specific problems solved together.

They understood what operating at the next level actually required. They sought out work that demonstrated it. Getting promoted to E5 means showing E5 scope, not just excellent E4 execution.


Meta's PSC happens once a year, which means every cycle matters. The engineers who get strong results aren't the ones who work hardest in the three weeks before submissions open. They're the ones who documented their impact all year. CareerClimb helps you track wins across all four Meta performance dimensions as they happen, so your self-review is backed by evidence, not memory. Download CareerClimb
