
User Access Reviews for SOC 2: What Auditors Want to See

Lorikeet Security Team February 26, 2026 9 min read

User access reviews are one of the most commonly failed controls in SOC 2 audits. Not because they are technically difficult, but because organizations treat them as a checkbox exercise rather than a meaningful security control. The result is incomplete evidence, inconsistent execution, and audit exceptions that could have been avoided.

The frustrating part is that the requirements are not complex. Auditors want to see that you know who has access to what, that access is appropriate for each person's role, that you review this regularly, and that you act on what you find. The organizations that struggle are the ones that try to do this manually at the last minute before an audit, rather than building it into their operational rhythm.

This guide covers exactly what SOC 2 auditors expect, the quarterly review process that satisfies them, the evidence you need to collect, and the common failures that trip organizations up.


Why Access Reviews Matter for SOC 2

SOC 2's Common Criteria (specifically CC6.1, CC6.2, and CC6.3) require organizations to manage logical access to their systems. This includes provisioning access based on roles, restricting access to authorized users, and periodically reviewing access to ensure it remains appropriate. The access review is the control that validates everything else in your access management program is working.

Without regular access reviews, several things go wrong over time: terminated employees and contractors retain active accounts, permissions accumulate as people change roles, shared and generic accounts proliferate with no accountable owner, and service accounts and API keys persist with no documented purpose. Access reviews catch all of these, provided they are done properly.


The Quarterly Review Process

Here is the access review process that consistently satisfies SOC 2 auditors and actually improves your security posture:

Step 1: Define scope

Identify all systems that are in scope for your SOC 2 audit. This typically includes your production environment (cloud infrastructure, databases, application servers), source code repositories, CI/CD pipelines, customer data stores, identity provider (IdP) and SSO, monitoring and logging systems, and any SaaS tools that access or process customer data.

For each system, identify the access levels or roles that exist. Not just "user" and "admin" but the full role taxonomy. This becomes your review matrix.
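
A review matrix can start as something as simple as a mapping from each in-scope system to its full role taxonomy. A minimal sketch (the system and role names below are illustrative, not prescriptive):

```python
# Sketch of a review matrix: each in-scope system mapped to its full role
# taxonomy. System and role names here are examples only.
REVIEW_MATRIX = {
    "aws-production": ["admin", "power-user", "read-only", "billing"],
    "github": ["org-owner", "maintainer", "write", "read"],
    "identity-provider": ["super-admin", "group-admin", "user"],
    "customer-db": ["dba", "read-write", "read-only"],
}

def roles_for(system: str) -> list[str]:
    """Return the role taxonomy for a system, or fail loudly if it is
    not in scope -- an unknown system should never be silently skipped."""
    try:
        return REVIEW_MATRIX[system]
    except KeyError:
        raise ValueError(f"{system!r} is not in the SOC 2 review scope")
```

Failing loudly on unknown systems is deliberate: it turns "we forgot to scope that tool" into an error you see immediately, rather than a coverage gap an auditor finds.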

Step 2: Extract current access lists

Pull the current list of users and their roles/permissions from each in-scope system. For cloud providers, this means IAM users, roles, and policies. For SaaS tools, it means user accounts and their permission levels. For source code repos, it means collaborators and their access levels.

Automate this extraction wherever possible. If you are pulling these lists manually every quarter, the process is fragile and error-prone. Most cloud providers and SaaS tools have APIs that can export user lists programmatically.
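
Because every system exports users in its own shape, it helps to normalize each export into one common record format before review. A sketch of that normalization step, assuming a simple CSV export with `user` and `role` columns (real extraction would call each provider's API instead):

```python
import csv
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessRecord:
    """One user's access on one system, in a shape every review step
    can consume regardless of which system the data came from."""
    system: str
    user: str   # email or username as reported by the system
    role: str

def parse_export(system: str, csv_lines) -> list[AccessRecord]:
    """Normalize one system's user export (columns: user,role) into
    common AccessRecord rows. Lowercasing usernames makes later
    cross-referencing against the HR roster case-insensitive."""
    reader = csv.DictReader(csv_lines)
    return [
        AccessRecord(system, row["user"].strip().lower(), row["role"].strip())
        for row in reader
    ]
```

The column names are an assumption for illustration; the point is that once every export lands in the same record type, the comparison and review steps only have to be written once.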

Step 3: Compare against HR and role data

Cross-reference the access lists against your current employee roster, contractor list, and role definitions. Flag any accounts that do not match a current employee or contractor (potential orphaned accounts), any users with access levels that do not match their current role, any shared or generic accounts, and any service accounts or API keys that do not have a documented purpose and owner.
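
The cross-reference itself is mostly set logic. A minimal sketch, assuming you have the access list, the current roster, and expected roles as simple dictionaries and sets:

```python
def flag_access(access, roster, expected_roles):
    """Compare one system's extracted access against HR data.

    access:         {user: actual role on the system}
    roster:         set of current employee/contractor identifiers
    expected_roles: {user: role expected for that person's job}

    Returns (orphaned, mismatched) lists for reviewer attention.
    A user in the roster but absent from expected_roles is also
    flagged as mismatched, since their access has no documented basis.
    """
    orphaned = sorted(u for u in access if u not in roster)
    mismatched = sorted(
        u for u in access
        if u in roster and access[u] != expected_roles.get(u)
    )
    return orphaned, mismatched
```

This catches the first two flag categories (orphaned accounts and role mismatches); shared accounts and undocumented service accounts usually need a maintained allowlist with a named owner per entry, since they will never match a person in the roster.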

Step 4: Review with system owners

Send the flagged items and the full access list to each system owner (typically a manager or technical lead) for review. The reviewer should confirm that each user's access is still appropriate, identify any access that should be modified or revoked, and approve the current state of access for their system. Document the reviewer's name, the date of their review, and their specific decisions.
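
Reviewers respond much faster to a pre-populated sheet than to a raw export. A sketch of generating that sheet, with a decision column for the system owner to fill in (the column layout is an assumption, not a required format):

```python
import csv
import io

def reviewer_packet(system: str, access: dict) -> str:
    """Build the pre-populated review sheet a system owner confirms:
    one row per user, with empty decision/notes columns to fill in."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(
        ["system", "user", "current_role", "decision (keep/modify/revoke)", "notes"]
    )
    for user, role in sorted(access.items()):
        writer.writerow([system, user, role, "", ""])
    return buf.getvalue()
```

Capturing the decision in the same artifact the reviewer signs off on means the "who approved what, and when" evidence exists the moment the review finishes.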

Step 5: Remediate and document

For any access that needs to change, create tickets, execute the changes, and document them. This includes revoking access for terminated users, reducing permissions for users with excessive access, removing or reassigning shared accounts, and updating service account permissions. Keep evidence of both the decision (the reviewer said "revoke this access") and the execution (the access was actually revoked, with a timestamp).
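
The decision and the execution are two separate pieces of evidence, and both need timestamps. A sketch of a record that keeps them paired (field names are illustrative):

```python
from datetime import datetime, timezone

def remediation_record(user: str, system: str, decision: str, ticket: str) -> dict:
    """Capture the reviewer's decision with a timestamp; the execution
    half stays empty until the change is actually verified done."""
    return {
        "user": user,
        "system": system,
        "decision": decision,
        "ticket": ticket,
        "decided_at": datetime.now(timezone.utc).isoformat(),
        "executed_at": None,  # set only when the change is confirmed
    }

def mark_executed(record: dict) -> dict:
    """Close the loop: record when the access change was carried out."""
    record["executed_at"] = datetime.now(timezone.utc).isoformat()
    return record
```

Any record where `executed_at` is still empty at quarter end is an open remediation item, which is exactly the list you want in front of you before the auditor asks for it.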

Step 6: Generate the review artifact

Produce a dated document or report that summarizes the review. This becomes your audit evidence. It should include the date of the review, the systems reviewed, the reviewer for each system, a summary of findings (how many accounts reviewed, how many changes made), specific changes made with evidence, and reviewer sign-off.
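
The artifact itself can be generated directly from the review data. A sketch producing one dated JSON summary per system (the field names and example storage path are assumptions):

```python
import json
from datetime import date

def review_artifact(system: str, reviewer: str, accounts_reviewed: int,
                    changes: list) -> dict:
    """Produce the dated, audit-ready summary for one system's review.
    `changes` is a list of dicts like {"user", "action", "ticket"}."""
    return {
        "review_date": date.today().isoformat(),
        "system": system,
        "reviewer": reviewer,
        "accounts_reviewed": accounts_reviewed,
        "changes_made": len(changes),
        "changes": changes,
        "sign_off": f"Reviewed and approved by {reviewer}",
    }

# One file per system per quarter, e.g. evidence/2026-Q1/github.json
artifact = review_artifact(
    "github", "jane.doe", 42,
    [{"user": "eve", "action": "revoke", "ticket": "SEC-101"}],
)
print(json.dumps(artifact, indent=2))
```

Generating the artifact from the same data the review ran on removes the temptation to reconstruct evidence after the fact, which, as noted below, auditors treat the same as having no evidence at all.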


What Evidence Auditors Expect

When an auditor asks for access review evidence, they are looking for specific artifacts. Here is what to have ready:

| Evidence Type | What It Demonstrates | Format |
| --- | --- | --- |
| Access Lists | Complete inventory of users and their permissions per system | Exported user lists with roles, timestamps |
| Review Records | Someone evaluated each user's access appropriateness | Signed review sheets, email approvals, tool records |
| Change Evidence | Inappropriate access was actually modified or revoked | Tickets showing changes, before/after screenshots |
| Cadence Evidence | Reviews happen consistently on schedule | Dated artifacts for each quarter in the audit period |
| Termination Evidence | Access is revoked when employees leave | Access revocation records correlated with termination dates |
| Policy Document | Formal access review policy exists and is followed | Written policy with defined frequency, scope, and responsibility |

Auditor perspective: The most common question auditors ask during access review testing is "show me the evidence for Q3." If you cannot produce a dated access review artifact for every quarter in the audit period, you will get an exception. This is true even if you did the review but did not document it. In audit, undocumented means it did not happen.


Common Access Review Failures

These are the patterns we see most frequently when organizations fail the access review portion of their SOC 2 audit:

Terminated employees with active accounts

This is the single most common finding. An employee leaves the company, but their accounts on various systems remain active for weeks or months. The root cause is usually a disconnected offboarding process where HR, IT, and system owners do not communicate effectively. The fix is automated deprovisioning tied to your HR system, backed up by quarterly access reviews that catch anything the automation misses.
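
The quarterly backstop for this failure is a date comparison: any account still active after its owner's termination date is a finding. A minimal sketch, assuming you can produce the set of active users per system and a map of termination dates from HR:

```python
from datetime import date

def stale_after_termination(active_users, terminations, as_of):
    """Flag accounts still active past their owner's termination date.

    active_users: set of users currently active on a system
    terminations: {user: termination_date} from the HR system
    as_of:        the date the check is run

    Returns sorted (user, days_stale) pairs -- the age of each gap is
    useful evidence of how long the offboarding miss went unnoticed.
    """
    return sorted(
        (u, (as_of - terminations[u]).days)
        for u in active_users
        if u in terminations and as_of > terminations[u]
    )
```

Run per system, this catches exactly the accounts the automated deprovisioning missed, and the `days_stale` figure tells you whether the miss was a one-day lag or a three-month exposure.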

Missing review periods

Organizations complete three out of four quarterly reviews and assume the auditor will not notice the gap. They will. Every quarter in the audit period must have documented evidence. If you missed a quarter, it is better to acknowledge the gap and show your corrective action than to try to backdate a review.

Rubber-stamping reviews

When reviewers approve every single user's access without making any changes quarter after quarter, auditors become skeptical. A healthy access review should produce some changes over time: permission adjustments as roles evolve, removal of access for completed projects, and cleanup of unused accounts. If your reviews never result in changes, either the reviewer is not actually looking, or your provisioning process is unrealistically perfect.

Incomplete system coverage

Some organizations review their primary systems but miss less obvious ones: monitoring tools, logging platforms, CI/CD systems, or third-party SaaS tools that access customer data. If a system is in scope for SOC 2 and handles or accesses customer data, it needs to be included in the access review.

No review of privileged access

Auditors pay special attention to privileged accounts (admin, root, database admin). If your review treats all users the same regardless of privilege level, you are missing a critical element. Privileged accounts should receive more scrutiny, and any changes to privileged access should require additional approval.
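
One way to operationalize this is to partition each access list by privilege before the review starts, so privileged accounts can be routed to a stricter workflow with a second approver. A sketch (the role names in the privileged set are illustrative and should match your own role taxonomy):

```python
# Illustrative set of roles that warrant extra scrutiny; in practice this
# should be derived from your own review matrix, not hardcoded.
PRIVILEGED_ROLES = {"admin", "root", "dba", "org-owner", "super-admin"}

def split_by_privilege(access: dict) -> tuple[dict, dict]:
    """Partition {user: role} into (privileged, standard) so privileged
    accounts can be sent through a deeper review with dual approval."""
    privileged = {u: r for u, r in access.items() if r in PRIVILEGED_ROLES}
    standard = {u: r for u, r in access.items() if r not in PRIVILEGED_ROLES}
    return privileged, standard
```

Even this small split gives the auditor a concrete answer to "how do you treat privileged access differently," which is a question most organizations otherwise answer with hand-waving.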


Tooling Recommendations

The right tooling makes access reviews sustainable rather than painful. Here is what to consider at different organization sizes:

Early-stage startups (under 50 employees)

A well-structured spreadsheet works fine at this scale. Export user lists from each system quarterly, review them in a shared spreadsheet, and store the signed-off artifacts in your compliance documentation. The overhead is manageable and the process is straightforward.

Growth-stage companies (50-250 employees)

At this scale, manual spreadsheets become unsustainable. Consider tools like Vanta, Drata, or JumpCloud that can automatically pull access lists from integrated systems, track review status, and generate audit-ready evidence. These tools integrate with your IdP, cloud providers, and common SaaS tools to automate much of the extraction and tracking work.

Larger organizations (250+ employees)

Dedicated Identity Governance and Administration (IGA) tools like Okta Identity Governance, SailPoint, or Saviynt provide automated access certification workflows, role-based access modeling, and compliance reporting. These tools handle the complexity of large user populations across many systems.

Regardless of tooling, the fundamentals remain the same: complete coverage, documented reviews, actionable findings, and consistent cadence.


How Access Review Gaps Lead to Pentest Findings

Access control weaknesses are consistently among the top findings in our penetration testing engagements. The gaps that access reviews are designed to catch are the same gaps that pentesters exploit: orphaned accounts that still authenticate, users whose permissions outlived their roles, shared credentials with no accountable owner, and forgotten service accounts with standing access to production data.

Organizations that conduct regular, thorough access reviews find and close these gaps before a pentester or attacker can exploit them. The intersection of SOC 2 compliance and penetration testing is where access reviews deliver the most value: satisfying auditors while simultaneously reducing real-world attack surface.


Building a Sustainable Access Review Program

The goal is not to pass one audit. It is to build a process that runs consistently with minimal friction. Here is how:

  1. Automate extraction. Script the process of pulling user lists from each in-scope system. Run it quarterly on a schedule so the data is ready before the review starts.
  2. Assign clear ownership. Each system needs a designated reviewer who is responsible for evaluating access quarterly. This should be someone who understands the system and the roles that require access to it.
  3. Set calendar reminders. Schedule the review for the same week each quarter. Do not let it slide. Consistency is what auditors care about most.
  4. Make it easy for reviewers. Send reviewers a pre-populated form with the current access list and ask them to confirm or flag each entry. Do not ask them to build the list themselves.
  5. Track remediation. When a review identifies access that needs to change, track the remediation as a ticket and close the loop. The review is not complete until changes are executed and documented.
  6. Store everything centrally. Keep all access review artifacts in a single, organized location that your auditor can access. Date everything clearly.

The bottom line: Access reviews are not hard. They are tedious, and that is why they get skipped. Build automation around the tedious parts, assign clear ownership, and treat the review as a real security control rather than an audit checkbox. The organizations that do this right pass their audits without exceptions and have fewer access-related vulnerabilities for pentesters to find.

Want to find access control gaps before your auditor does?

Our penetration testing identifies the same access control weaknesses that cause SOC 2 audit failures. Test your defenses with a team that knows what auditors and attackers both look for.


Lorikeet Security Team

Penetration Testing & Cybersecurity Consulting

We've completed 170+ security engagements across web apps, APIs, cloud infrastructure, and AI-generated codebases. Everything we publish here comes from patterns we see in real client work.
