Key Takeaways

  • Criterion 9 is the most mathematically verifiable of the ten criteria — but the comparison group construction is where most petitions go wrong.
  • The peer group must be specific: same seniority, same specialty, same geographic market. Comparing to national "software developer" averages when you are a Staff ML Engineer in San Francisco is not an apples-to-apples comparison.
  • Total compensation matters in tech: base salary in isolation dramatically understates the compensation premium for senior roles at top-tier companies where equity dominates total compensation.
  • Document every component: offer letters, vesting schedules, bonus confirmations, and RSU grant values all form part of the exhibit.
  • Data sources matter: BLS Occupational Employment data is the most commonly cited authoritative source; HR consultant analyses with documented methodology are accepted for roles where published data is thin.

The salary criterion is elegant in concept and deceptively technical in execution. The regulation asks for evidence that you command a high salary or remuneration for services in relation to others in the field. The challenge is not proving what you earn — offer letters and pay stubs do that — but proving that what you earn is genuinely high relative to a correctly defined peer group.

The Peer Group Construction Problem

The most common Criterion 9 failure is comparing a senior professional's compensation to the wrong peer group. A Staff Software Engineer at a major tech company in San Francisco earning $450,000 in total compensation cannot fairly be compared to the Bureau of Labor Statistics' national mean wage for "Software Developers," which was approximately $124,000 per year as of 2024 BLS data. The underlying figures are accurate, but the comparison is misleading: it mixes different roles, different seniority levels, and different labor markets.

A properly constructed comparison uses Staff or Principal Engineers at major technology companies in the San Francisco Bay Area. That peer group, built from the BLS metro-area breakdown, technology industry compensation surveys, and published data from comparable companies, will show a significantly narrower gap. An applicant whose compensation is legitimately high will still demonstrate a meaningful premium over even this correctly defined peer group.
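The difference between the two comparisons can be made concrete with simple arithmetic. In the sketch below, the $450,000 and $124,000 figures come from the discussion above; the $380,000 Bay Area Staff Engineer median is a hypothetical placeholder, and a real exhibit would substitute an actual survey benchmark.

```python
# Sketch: quantifying a compensation premium against two peer groups.
# The Bay Area Staff Engineer median below is a HYPOTHETICAL placeholder;
# use documented survey data in an actual exhibit.

def premium(applicant_total_comp: float, peer_benchmark: float) -> float:
    """Return the applicant's premium over a benchmark, as a percentage."""
    return (applicant_total_comp - peer_benchmark) / peer_benchmark * 100

applicant = 450_000
bls_national_mean = 124_000      # BLS national mean, "Software Developers"
bay_area_staff_median = 380_000  # hypothetical specialty + geography benchmark

print(f"vs. national mean:   {premium(applicant, bls_national_mean):.0f}% premium")
print(f"vs. correct peers:   {premium(applicant, bay_area_staff_median):.0f}% premium")
```

Both numbers favor the applicant, but only the second survives scrutiny: a 263% premium over the wrong peer group invites an adjudicator to discount the comparison, while a documented premium over the correctly defined group is defensible.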

★ Information Gain

USCIS officers reviewing salary evidence often have access to the same publicly available BLS data you do — but not to specialized compensation surveys. An exhibit that presents the BLS general category average alongside a properly constructed specialty-specific and geography-specific comparison, with sources cited and methodology explained, does the adjudicator's analytical work for them. Petitions that present only raw salary data alongside the BLS national average are asking the adjudicator to make the comparison — and they may not make it in your favor.

Total Compensation: The Technology Industry Complication

In the technology sector, base salary systematically understates total compensation for senior roles. A Staff Engineer at Google, Meta, or Apple might have a base salary of $220,000 and total annual compensation — including RSUs, sign-on bonuses, and cash bonuses — of $450,000 or more. Filing a Criterion 9 exhibit that references only base salary when total compensation is the relevant market metric produces a misleading comparison that understates your actual compensation premium.

Document total compensation completely: the offer letter establishing base salary, the RSU grant documentation (shares granted, vesting schedule, grant-date price), any publicly available or attorney-confirmed information on the company's stock valuation, and bonus confirmation letters or history. Frame the compensation comparison using total annual compensation as the unit, and use comparison data sources that also report on total compensation rather than only base salary.
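The components listed above reduce to straightforward arithmetic. This sketch mirrors the Staff Engineer example; the grant size, vesting schedule, share price, and bonus figure are hypothetical illustrations, not data from any real offer.

```python
# Sketch: assembling total annual compensation from documented components.
# All specific figures are hypothetical illustrations.

base_salary = 220_000
annual_bonus = 30_000        # hypothetical target bonus, confirmed in writing

rsu_shares_granted = 4_000   # hypothetical RSU grant
vesting_years = 4            # even vesting over four years
grant_date_price = 180.00    # hypothetical grant-date share price

# Annual RSU value: shares vesting per year times grant-date price.
annual_rsu_value = rsu_shares_granted / vesting_years * grant_date_price

total_comp = base_salary + annual_bonus + annual_rsu_value
print(f"Annual RSU value:           ${annual_rsu_value:,.0f}")
print(f"Total annual compensation:  ${total_comp:,.0f}")
```

Each input in the calculation should map to a document in the exhibit: the offer letter for base salary, the grant notice for shares and price, and the vesting schedule for the annualization.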

Founders and Equity-Heavy Roles

For founders, compensation evidence is more complex. Cash salary at an early-stage startup may be intentionally low; equity ownership may be substantial. A purely salary-based comparison will not serve a founder well. The most defensible approach is a combination strategy: document the equity stake, the company's most recent valuation (from a funding round or third-party assessment), and the implied economic value — then compare to the total economic compensation of comparable executives at funded companies of similar stage and sector. An attorney with tech startup experience can help structure this argument correctly.
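One way to sketch the combination strategy is to annualize the implied value of the equity stake alongside cash salary. Every figure below is hypothetical, and the annualization horizon is an assumption; a real exhibit would use the actual cap table, the most recent priced-round valuation, and a method agreed with counsel.

```python
# Sketch: a founder's implied annual economic compensation.
# ALL figures and the annualization method are hypothetical assumptions.

cash_salary = 90_000
equity_stake = 0.18                # 18% founder ownership (hypothetical)
post_money_valuation = 40_000_000  # most recent funding round (hypothetical)
years_to_attribute = 4             # hypothetical annualization horizon

implied_equity_value = equity_stake * post_money_valuation
annualized_equity = implied_equity_value / years_to_attribute
total_economic_comp = cash_salary + annualized_equity

print(f"Implied equity value:   ${implied_equity_value:,.0f}")
print(f"Annualized equity:      ${annualized_equity:,.0f}")
print(f"Total economic comp:    ${total_economic_comp:,.0f}")
```

The point of the sketch is the framing, not the numbers: a founder's comparison group should be compensated on the same total-economic basis, such as executives at funded companies of similar stage and sector.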

Data Sources That Work

Bureau of Labor Statistics (BLS) Occupational Employment and Wage Statistics: The most commonly cited authoritative source. Use the metro-area data, not national averages, and use the most specific occupational code that matches your role.

Radford / Aon compensation surveys: Industry-standard HR compensation data, widely used by major employers and accepted by USCIS. Requires a company or HR consultant subscription to access formally, but your employer's HR department may have access.

Levels.fyi: Useful as supporting context for tech industry total compensation comparisons, but crowdsourced — supplement with more authoritative sources rather than relying on it alone.

Custom HR consultant analysis: For niche roles or unusually structured compensation, a brief analysis from a qualified HR consultant documenting their methodology, data sources, and comparison framework can be compelling, and it carries credibility beyond generic published data.