A readiness assessment is the structured gap analysis that determines what is working, what is missing, and what needs to be remediated before the formal SOC 2 audit begins. It is not a requirement — organizations can go directly from “no compliance program” to “Type I audit engagement” — but organizations that skip readiness assessments consistently experience longer audits, more exceptions, and higher total costs.
The readiness assessment serves three purposes: it identifies control gaps that would generate audit exceptions if not remediated before the observation period begins; it inventories existing evidence and reveals where documentation is insufficient; and it provides a realistic estimate of the timeline to audit-readiness, enabling accurate planning for the observation period start date and Type II report delivery.
Readiness Assessment Methodology
| Phase | Activities | Output |
|---|---|---|
| Scoping | Confirm system scope; identify in-scope Trust Services Criteria; map in-scope infrastructure, personnel, and processes | Confirmed scope document; system component inventory; TSC selection rationale |
| Control mapping | Map existing controls to each Trust Services Criteria requirement; identify controls that exist but are not documented vs. controls that do not exist | Control matrix: TSC requirement → control description → implementation status → gap rating |
| Evidence inventory | Review existing documentation: policies, procedures, configurations, training records, access review logs, vendor assessments | Evidence inventory: what exists, where it lives, how current it is, what’s missing |
| Control testing | Test a sample of controls to verify they operate as described; identify controls that exist on paper but not in practice | Control testing results: design gaps vs. operating gaps; priority remediation items |
| Gap analysis report | Synthesize findings into a prioritized remediation plan with estimated effort and timeline for each gap | Readiness report: gap matrix, priority ranking, estimated remediation effort, recommended timeline |
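The control matrix produced in the mapping phase is essentially a small dataset: one row per Trust Services Criteria requirement, with implementation status and a gap rating. A minimal sketch of that structure, with illustrative field names and example entries that are assumptions rather than a prescribed SOC 2 schema:

```python
from dataclasses import dataclass

# Illustrative control-matrix row; field names and example values are
# assumptions for this sketch, not a standard SOC 2 format.
@dataclass
class ControlMapping:
    tsc_requirement: str      # e.g. "CC6.2"
    control_description: str  # how the control is (or would be) implemented
    status: str               # "operating" | "documented_only" | "missing"
    gap_rating: str           # "high" | "medium" | "low"

matrix = [
    ControlMapping("CC6.2", "MFA enforced for production access", "missing", "high"),
    ControlMapping("CC6.3", "Quarterly access reviews performed", "documented_only", "high"),
    ControlMapping("CC7.2", "Centralized log monitoring with alerting", "operating", "low"),
]

# Readiness output: controls that exist only on paper, or not at all,
# become remediation items in the gap analysis report.
gaps = [m for m in matrix if m.status != "operating"]
for m in gaps:
    print(f"{m.tsc_requirement}: {m.control_description} [{m.status}, {m.gap_rating}]")
```

Keeping the matrix as structured data rather than prose makes the later phases mechanical: evidence inventory and control testing update the `status` field, and the gap analysis report is a filtered view of the same rows.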
The Gap Matrix: How to Prioritize Remediation
Not all gaps are equal. A missing quarterly access review is a more critical gap than the absence of a vendor questionnaire template — because the access review gap will generate an exception in almost every Type II audit, while the vendor questionnaire gap may be addressable through alternative evidence. The gap matrix prioritizes remediation by the likelihood that a gap will generate an audit exception and the effort required to close it.
| KEY IDEA | High-priority gaps (those likely to generate exceptions) that are also quick to close should be remediated first. The classic example: enforcing MFA for all production systems. This is a CC6.2 requirement, the most scrutinized access control, and typically achievable in days through an identity provider policy change. Start with high-impact, low-effort remediations before addressing structural gaps that require policy development and evidence collection. |
| Gap Category | Typical Remediation Effort | Approach |
|---|---|---|
| Technology control gaps (MFA not enforced, encryption not enabled) | Hours to days | Configuration change in cloud console or identity provider; implement immediately before observation period begins |
| Policy gaps (policy exists but doesn’t match practice, or policy doesn’t exist) | Days to weeks | Draft or revise policy; have leadership approve; distribute to employees and collect acknowledgments |
| Evidence cadence gaps (access reviews not performed quarterly, training not tracked) | Weeks to months (must be sustained over observation period) | Implement repeating calendar reminders and templates; track completion; build into operational calendar |
| Process gaps (no formal change management process, no incident response procedure) | Weeks | Define process; document in policy; train relevant personnel; begin running the process to accumulate evidence |
| Vendor management gaps (no vendor inventory, no data processing agreement (DPA) with critical vendors) | Weeks to months | Build vendor inventory; tier vendors by risk; send DPAs to high-risk vendors; allow time for vendor legal review and execution |
| Structural gaps (no risk assessment, no documented business continuity plan (BCP)) | 1–3 months | Facilitated risk assessment workshop; BCP documentation project; requires dedicated time and subject-matter expertise |
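The prioritization rule from the gap matrix, remediate high-exception-likelihood gaps first and break ties by effort, can be expressed as a simple sort. The records and scores below are hypothetical illustrations, not values from the source:

```python
# Hypothetical gap records; likelihood scores (1-3) and effort estimates
# are illustrative assumptions for this sketch.
gaps = [
    {"gap": "MFA not enforced",           "exception_likelihood": 3, "effort_days": 2},
    {"gap": "No vendor questionnaire",    "exception_likelihood": 1, "effort_days": 5},
    {"gap": "No quarterly access review", "exception_likelihood": 3, "effort_days": 30},
    {"gap": "No documented BCP",          "exception_likelihood": 2, "effort_days": 60},
]

# Highest exception likelihood first; among equal likelihood, lowest effort first.
ordered = sorted(gaps, key=lambda g: (-g["exception_likelihood"], g["effort_days"]))
for g in ordered:
    print(g["gap"])
```

Run against these sample scores, MFA enforcement sorts to the top (high likelihood, low effort) and the vendor questionnaire sorts last, matching the prioritization logic described above.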
Setting the Observation Period Start Date
One of the most important outputs of the readiness assessment is the recommended observation period start date. This is the date from which a Type II audit’s evidence collection will begin — and all controls must be operating from that date forward. Setting the observation period start date too early — before critical gaps are closed — means the Type II audit will test a period during which some controls were not yet in place, generating exceptions.
| IMPORTANT | The observation period start date should be set after high-priority gaps are closed and key controls are operating. For most organizations completing a thorough readiness assessment and remediation program, this means the observation period begins 6–12 weeks after the readiness assessment concludes. The temptation to start the observation period immediately to accelerate timeline should be resisted — a clean 12-month Type II report is more valuable than a faster but exception-heavy one. |
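The date arithmetic above is simple but worth making explicit, since the observation period start anchors the entire Type II timeline. A minimal sketch, assuming the 6–12 week remediation window from the text and an arbitrary 8-week midpoint as the default:

```python
from datetime import date, timedelta

def recommended_observation_start(readiness_end: date, remediation_weeks: int = 8) -> date:
    """Observation period begins only after remediation completes.

    The 6-12 week window comes from the readiness guidance above;
    the 8-week default is an arbitrary midpoint chosen for this sketch.
    """
    if not 6 <= remediation_weeks <= 12:
        raise ValueError("remediation window outside the typical 6-12 week range")
    return readiness_end + timedelta(weeks=remediation_weeks)

# Example: readiness assessment concludes March 1; 8 weeks of remediation
# puts the observation period start in late April, and a 12-month
# observation window sets the earliest Type II report period end.
start = recommended_observation_start(date(2025, 3, 1), remediation_weeks=8)
period_end = start + timedelta(days=365)
print(start, period_end)  # 2025-04-26 2026-04-26
```

Planning backward from a target report delivery date works the same way in reverse: subtract the observation window and the remediation weeks to find the latest viable readiness assessment date.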