Controls
Control creation, ownership, mapping, exceptions, testing, and the workflow that makes a control library usable.
Scope
Controls are where policy intent becomes operational reality. This page covers the public-safe guidance for creating, mapping, testing, and managing controls; private product navigation and API details have been removed.
Creating Controls
Controls are organizational practices that address compliance requirements. Each control belongs to a compliance program.
Steps
- Go to Controls in the sidebar and pick the target program from the program selector at the top of the page.
- Click New Control.
- Fill in the required fields:
- Control Ref: A unique identifier within the program (e.g., “CTRL-001”, “AC-01”).
- Title: A descriptive name for the control.
- Optionally fill in:
- Description: What the control does.
- Implementation Notes: How the control is implemented.
- Owner: User or team responsible for the control.
- Fulfiller: Who operates the control (Internal, MSP, or Vendor).
- Frequency: How often the control operates (Continuous, Daily, Weekly, Monthly, Quarterly, Annual, Ad Hoc).
- Effective Date: When the control became or becomes active.
- Optionally map the control to framework requirements inline on the same form. Add one or more requirements with full or partial coverage; the mappings are saved in a single follow-up request after the control itself is persisted. Mappings can also be added later from the control detail page.
Controls are always created in “Not Implemented” status.
Naming Conventions
Use a consistent naming scheme for control references. Common patterns:
- Sequential: CTRL-001, CTRL-002, etc.
- Category-based: AC-01 (access control), CM-01 (change management), etc.
- Framework-aligned: CC1.1-01 (matching the requirement it addresses).
The control_ref must be unique within each program but can be reused across different programs.
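The uniqueness rule can be sketched as a tiny registry. This is a hypothetical in-memory model for illustration, not the product's actual storage or API:

```python
class ControlRegistry:
    """Illustrative model of the control_ref uniqueness rule: refs must be
    unique within a program, but the same ref may be reused across programs."""

    def __init__(self):
        # (program_id, control_ref) pairs already taken
        self._taken = set()

    def create_control(self, program_id: str, control_ref: str) -> None:
        key = (program_id, control_ref)
        if key in self._taken:
            raise ValueError(
                f"{control_ref!r} already exists in program {program_id!r}"
            )
        self._taken.add(key)


registry = ControlRegistry()
registry.create_control("soc2", "AC-01")
registry.create_control("iso27001", "AC-01")  # OK: different program
```

Attempting `registry.create_control("soc2", "AC-01")` again raises `ValueError`, mirroring the per-program uniqueness constraint.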
Managing Control Status
Controls follow a lifecycle state machine. You can transition controls through these statuses:
- Not Implemented: The control exists as a plan but is not yet active.
- Partially Implemented: The control is in progress or only partially operational.
- Implemented: The control is fully operational.
- Not Applicable: The control has been scoped out.
To change a control’s status:
- Open the control detail view.
- Use the status transition action.
- Select the target status from the available transitions.
Not all transitions are allowed. For example, you cannot jump directly from “Implemented” to “Not Implemented” — you must go through “Partially Implemented” first to signal a regression.
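The lifecycle can be sketched as a transition table. Only the rule that "Implemented" cannot jump directly to "Not Implemented" is documented above; the rest of the table is an illustrative assumption:

```python
# Hypothetical transition table for the control status state machine.
# Only the implemented -> not_implemented restriction is documented;
# the other entries are illustrative assumptions.
ALLOWED = {
    "not_implemented":       {"partially_implemented", "implemented", "not_applicable"},
    "partially_implemented": {"not_implemented", "implemented", "not_applicable"},
    "implemented":           {"partially_implemented", "not_applicable"},  # no direct regression
    "not_applicable":        {"not_implemented"},
}


def transition(current: str, target: str) -> str:
    if target not in ALLOWED[current]:
        raise ValueError(f"cannot transition {current} -> {target}")
    return target


# A regression must pass through the partial state to signal it:
status = transition("implemented", "partially_implemented")
status = transition(status, "not_implemented")
```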
Mapping Controls to Requirements
Controls address framework requirements. Mapping a control to a requirement records that relationship and contributes to gap analysis coverage.
Mapping a Control
- Open a control’s detail view.
- Go to the “Requirements” tab.
- Click “Add Mapping”.
- Search for and select one or more requirements.
- Set the coverage level:
- Full: The control fully satisfies the requirement by itself.
- Partial: The control only partially addresses the requirement.
- Optionally add notes about how the control addresses the requirement.
You can map a single control to requirements across multiple frameworks. This is common when one control satisfies criteria in both SOC 2 and ISO 27001, for example.
Removing a Mapping
- Open the control’s Requirements tab.
- Click the remove icon next to the mapping you want to delete.
Tips
- Map controls to requirements early to track gap analysis progress.
- Use “partial” coverage when a requirement needs multiple controls working together.
- Bulk mapping (selecting multiple requirements at once) is supported and saves time during initial setup.
Reading Gap Analysis
Gap analysis shows how well your compliance program covers framework requirements.
Accessing Gap Analysis
- Navigate to a compliance program.
- Click “Gap Analysis” in the program navigation.
Understanding the View
The gap analysis presents a per-framework breakdown with color-coded status for each requirement:
- Covered (green): At least one control with full coverage is mapped.
- Partially Covered (yellow): Controls are mapped but only with partial coverage.
- Not Covered (red): No controls are mapped — this is a gap.
- Excepted (gray): An approved exception exists for this requirement.
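The color rules above can be sketched as a small function. This is illustrative: the precedence between a full-coverage mapping and an approved exception is an assumption, not something the page specifies:

```python
# Illustrative derivation of a requirement's gap-analysis status from its
# mapped coverage levels and exceptions. Precedence (full coverage wins
# over an approved exception) is an assumption.
def coverage_status(mapping_levels: list, has_approved_exception: bool) -> str:
    """mapping_levels: coverage levels ('full' or 'partial') of mapped controls."""
    if "full" in mapping_levels:
        return "covered"             # green
    if has_approved_exception:
        return "excepted"            # gray
    if mapping_levels:
        return "partially_covered"   # yellow
    return "not_covered"             # red
```

For example, a requirement with only partial-coverage controls and no exception lands in the yellow "Partially Covered" bucket.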
Summary Metrics
At the top of the gap analysis view, you see aggregate metrics:
- Total Requirements: Total applicable requirements across all frameworks.
- Covered: Requirements with at least one full-coverage control.
- Partially Covered: Requirements with only partial-coverage controls.
- Gaps: Requirements with no controls mapped.
- Excepted: Requirements with approved exceptions.
- Coverage %: Percentage of requirements with full coverage.
Filtering
Use the filters to focus on specific areas:
- Framework: Show only one framework’s requirements.
- Category: Show only requirements in a specific category (e.g., “CC1” for SOC 2 Common Criteria 1).
- Coverage Status: Show only requirements matching a specific status (covered, partially_covered, not_covered, excepted).
Using Gap Analysis for Audit Readiness
Gap analysis answers “are we ready for audit?”:
1. Target zero red (not_covered) items for the frameworks in scope.
2. Review yellow (partially_covered) items and decide if additional controls are needed.
3. Document gray (excepted) items with clear justifications and compensating controls.
4. Track coverage_pct over time to measure progress toward audit readiness.
Managing Exceptions
Exceptions document when a control cannot be implemented or a requirement does not apply.
Creating an Exception
- Open a control’s detail view.
- Go to the “Exceptions” tab.
- Click “Add Exception”.
- Fill in:
- Reason: Why the exception is needed (required).
- Requirement: Optionally target a specific requirement.
- Compensating Control: Optionally reference another control that mitigates the gap.
- Expires At: Optionally set an expiration date for time-boxed exceptions.
Exceptions are created in “Draft” status.
Approving an Exception
- Open the exception (from the control’s Exceptions tab or the exception list).
- Click “Approve”.
- The system records who approved it and when.
Only approved exceptions affect gap analysis. Draft exceptions do not change requirement coverage status.
Exception Expiry
- Manual expiry: An approved exception can be manually transitioned to “Expired”.
- Auto-expiry: Exceptions with an expiration date are automatically expired when the date passes. This happens when the exception list is viewed — no background job needed.
- Revocation: An approved exception can be “Revoked” if it should no longer apply regardless of expiration date.
Both expired and revoked are terminal states — they cannot be changed further. To reinstate coverage, create a new exception.
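The lazy auto-expiry rule can be sketched as a pure function applied whenever the exception list is read. Field and status names here are illustrative, not the product's actual schema:

```python
from datetime import datetime, timezone

# Hypothetical sketch of lazy auto-expiry: approved exceptions whose
# expires_at has passed read as "expired" whenever the list is viewed,
# so no background job is needed.
def effective_status(status: str, expires_at, now=None) -> str:
    now = now or datetime.now(timezone.utc)
    if status == "approved" and expires_at is not None and expires_at <= now:
        return "expired"  # terminal; create a new exception to reinstate
    return status
```

Draft exceptions are untouched by expiry in this sketch, matching the rule that only approved exceptions affect coverage.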
Best Practices
- Always document a clear reason for each exception.
- Set expiration dates for temporary exceptions (e.g., “we’ll implement this control by Q3”).
- Reference compensating controls when possible — auditors want to see risk mitigation.
- Review exceptions before each audit to ensure they are still valid.
Adding Test Definitions
Tests define how controls are verified for effectiveness.
Creating a Test
- Open a control’s detail view.
- Go to the “Tests” tab.
- Click “Add Test”.
- Fill in:
- Name: Descriptive test name (e.g., “Verify quarterly access review completed”).
- Description: What the test checks.
- Test Type: Manual, Automated, or Hybrid.
- Frequency: How often the test should be run.
- Expected Result: What a passing result looks like.
Test Types
- Manual: Requires human execution and judgment (e.g., reviewing access logs).
- Automated: Runs automatically via a connector (WS-04/WS-05 will enable this).
- Hybrid: Has both manual and automated components.
Recording a test execution (CW-23)
Every time you actually run one of your control tests — a quarterly access review, a monthly firewall rule check, a daily log sampling — record it so your audit package has the proof.
How:
- Open ControlDetail (e.g. CC6.1) → Tests tab.
- Click Record on the row for the test you just ran (or open TestDetail and click Record execution in the top right).
- Fill in the modal:
- Executed at — when you actually ran it (default is now).
- Result — pick one: pass, fail, partial, error, not_run.
- Sample size / description — how many items you looked at and how you picked them (e.g. “10 admins pulled from Okta, all checked for MFA enrollment”).
- Notes — anything an auditor will want to know.
- Evidence — attach existing files via the picker (scoped to your program) OR click Upload new to upload fresh ones inline. Mark one as primary if there’s a main artifact; the rest are supporting.
- Click Record execution.
The execution becomes part of the test’s history and shows up in the Test executions during this cycle section of every future audit cycle that covers its date range. It is immutable — corrections are recorded as a new execution, and the old one can be marked superseded.
If your control test carries a plain-English hint (every SOC 2 test does), it shows above the execution history on TestDetail. It’s “Example guidance — confirm with your auditor” copy, not audit gospel, but it’s usually enough to know what to put in the notes and what to attach.
How to tell if a test is overdue
Every test row on the ControlDetail Tests tab carries a cadence badge. Read it like this:
- Green (Current) — you’re inside the first 80% of the frequency window since the last run. Nothing to do yet.
- Yellow (Due soon) — you’re in the last 20% of the window. Put it on this week’s list.
- Red (Overdue) — the window closed. Record an execution now, and (if it’s been a while) think about whether the underlying control was actually operating during the gap.
- Gray (Never run) — you’ve defined the test but never executed it. Either run it or delete it — a test definition with no runs is audit-bait.
Hover the badge for the next expected date. Continuous tests are special: >1h without a run is due_soon, >24h is overdue. Ad-hoc tests never go overdue — they stay Current after the first run.
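The badge rules can be sketched as follows. The per-frequency window lengths are assumptions; only the 80%/20% split and the continuous and ad-hoc special cases come from the text above:

```python
from datetime import datetime, timedelta, timezone

# Assumed window lengths per frequency (illustrative).
WINDOWS = {
    "daily": timedelta(days=1),
    "weekly": timedelta(weeks=1),
    "monthly": timedelta(days=30),
    "quarterly": timedelta(days=91),
    "annual": timedelta(days=365),
}


def badge(frequency: str, last_run, now: datetime) -> str:
    if last_run is None:
        return "never_run"                    # gray
    elapsed = now - last_run
    if frequency == "ad_hoc":
        return "current"                      # never goes overdue after first run
    if frequency == "continuous":
        if elapsed > timedelta(hours=24):
            return "overdue"
        if elapsed > timedelta(hours=1):
            return "due_soon"
        return "current"
    window = WINDOWS[frequency]
    if elapsed > window:
        return "overdue"                      # red: window closed
    if elapsed > 0.8 * window:
        return "due_soon"                     # yellow: last 20% of the window
    return "current"                          # green: first 80% of the window
```

For a weekly test, the badge turns yellow about 5.6 days after the last run and red after 7.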
To see every overdue test across the whole program at once, open Program → Tests Due (the widget on the program overview or the full page at the relevant workflow). Filter by status, sort, and click Record on any row to open the same modal without navigating away.
How to give an auditor your evidence package
When you’re ready to hand the fieldwork to an auditor, generate an evidence package:
- Open the audit cycle on AuditDetail.
- In the Evidence Package panel, click Generate package. The backend walks the whole cycle — controls, reviews, tests, executions, evidence, findings — and builds a deterministic ZIP with a SHA-256 manifest_hash. Generation is synchronous; large cycles can take a minute.
- Click Preview manifest to expand the tree view and verify exactly what the auditor will see before sharing. Everything in the preview is everything they’ll get.
- Click Issue share link. The modal defaults to a 7-day expiration, hard-capped at 30 days. Long expirations (>14 days) get a yellow warning because long-lived share tokens widen the breach window.
- Copy the URL immediately. It is shown exactly once. The raw token is never stored server-side — only its SHA-256 hash is. If you close the modal without copying, the only fix is to revoke and issue a new link.
- Send the URL to the auditor through whatever channel you use. They click it, no login required; the backend streams the ZIP and logs audit.package.token.use with the auditor’s IP and user agent.
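The show-once token scheme can be sketched like this. It is a minimal illustration of the idea (only a hash is persisted, so the raw token cannot be recovered later), not the product's implementation:

```python
import hashlib
import secrets

# Sketch: the server persists only the SHA-256 digest of the share token.
# Once the modal closes, the raw token is unrecoverable by design.
def issue_token():
    raw = secrets.token_urlsafe(32)                    # shown to the user exactly once
    digest = hashlib.sha256(raw.encode()).hexdigest()  # the only thing stored
    return raw, digest


def verify(presented: str, stored_digest: str) -> bool:
    candidate = hashlib.sha256(presented.encode()).hexdigest()
    # constant-time comparison avoids timing side channels
    return secrets.compare_digest(candidate, stored_digest)
```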
Back on AuditDetail you’ll see the issued tokens in a table with status badges:
- Not yet used — issued but the auditor hasn’t opened it yet.
- Downloaded recently — opened in the last 24 hours.
- Last used {date} — older downloads.
Revoke any token before its expiration if you need to cut off access. Revokes are idempotent; you can’t unrevoke — just issue a fresh link if you still need to share.
If you change data in the cycle (new evidence, new executions, new findings) between generations, re-generating the package produces a new row with a different manifest_hash. Re-generating with no changes returns the existing package unchanged — this is the “here’s the exact package I showed the auditor last Tuesday” guarantee, backed by SHA-256 over the canonical manifest bytes. Auditors can reference the hash in their workpaper.
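The determinism guarantee can be sketched as hashing canonical manifest bytes. This illustrates the idea only; the real manifest format is not documented here:

```python
import hashlib
import json

# Sketch: serializing the manifest canonically (sorted keys, fixed
# separators) means identical contents always hash to the same value,
# and any change produces a different manifest_hash.
def manifest_hash(manifest: dict) -> str:
    canonical = json.dumps(manifest, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.sha256(canonical).hexdigest()
```

Two manifests with the same contents in a different key order hash identically, which is what lets an auditor pin a workpaper to one exact package.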
Gate: cycle closure with missing test executions
When you try to mark an audit cycle complete, the backend checks that every test on every control has at least one execution during the fieldwork window. If any don’t, you get a modal listing the missing tests with a free-text override reason box:
- Minimum 20 characters after trim. Empty, short, and whitespace-only reasons are rejected.
- The counter shows {count}/20 minimum; the Submit button is disabled until you’re over.
- On submit, the cycle closes and an audit_cycle.closure_override audit log entry captures the reason plus a full snapshot of the missing tests at that moment. Auditors (and future you) can see exactly what was skipped.
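The reason check itself is trivial and can be sketched as a trim-then-count rule:

```python
# Sketch of the closure-override reason gate: at least 20 characters
# after trimming, so whitespace padding cannot satisfy it.
MIN_REASON_LEN = 20


def validate_override_reason(reason: str) -> bool:
    return len(reason.strip()) >= MIN_REASON_LEN
```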
Use the override when the coverage gap is genuine (connector was down, test was being rewritten) and either reaffirm the control with compensating evidence or flag it as a finding. Don’t use the override to hide gaps — the audit log is forever.