# Automated Compliance Check (ACC)

## Run Automated Compliance Checks (ACC)

Automated Compliance Checks (ACC) score a model document against an internal control set or a regulation (e.g. SR 11-7) and return **met / partially met / not met** per control, with concrete recommendations.

The document to assess can be imported in Vectice; it doesn’t need to be authored in Vectice.

***

### Prerequisites

* **Control set to enforce**
  * e.g. **SR 11-7**, an **internal model policy**, or a **control set created from your internal templates**
  * Defined in **Admin → Manage Control Sets**
  * Vectice provides starter control sets for SR 11-7, OSFI E-23, PRA SS 1/23, NIST AI RMF, and the EU AI Act. You can clone and customize them.
* **Document to assess**
  * An existing Vectice document (e.g. MDD/MVD generated from AutoDoc)
  * **or** an imported document (DOCX/PDF/Markdown)
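
Conceptually, a control set is just a list of controls, each with an ID, the obligation it encodes, and an optional evidence code. A minimal sketch in Python (the field names are illustrative only, not Vectice's actual import schema):

```python
# Illustrative only: what a control set conceptually contains.
# Field names are hypothetical, not Vectice's actual schema.
sr_11_7_controls = [
    {
        "control_id": "SR11-7.3.1",    # ID shown in the assessment table
        "description": "Model purpose and business use are documented.",
        "evidence_code": "EV-PURPOSE",  # optional evidence tag
    },
    {
        "control_id": "SR11-7.4.2",
        "description": "Back-testing and scenario results cover varied conditions.",
        "evidence_code": "EV-BACKTEST",
    },
]
```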

{% hint style="info" %}
Only Admins can create, import, or edit control sets and publish them for users. If you are a user and need a new set of controls, contact your administrator.
{% endhint %}

***

### Step 1 - Generate the assessment report

1. Go to *Your Project* **→ Reports**
2. (Optional) **Import** the document if it was created outside Vectice, then open it to confirm the layout is correct.
3. Click **Generate report → Assessment**, then select the **control set** (e.g. *SR 11-7*).
4. Select the report you want to assess.
5. Click **Start generating**.

<figure><img src="https://content.gitbook.com/content/sMRjTa7AGX49De62RAyl/blobs/saAH4h74XdlDSj6Ey5fW/image.png" alt="" width="563"><figcaption></figcaption></figure>

The assessment report is now being generated. When it's done, open it.

***

### Step 2 — Review the Assessment report

The report has three main parts:

#### 1. Overall summary

* Highlights **strengths** (what is clearly documented)
* Lists **areas for improvement** (what is present but incomplete)
* Lists **gaps** (what is missing and must be added)
* Shows the overall tally, e.g. “9 of 20 obligations not met, 6 partially met, 5 met.”
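
The tally is simply a count of per-control statuses. A minimal sketch of that arithmetic (the `tally` helper is illustrative, not part of Vectice):

```python
from collections import Counter

def tally(statuses):
    """Count per-control statuses into the overall summary figures."""
    counts = Counter(statuses)
    return {s: counts.get(s, 0) for s in ("met", "partially met", "not met")}

# Example: 20 controls with the statuses from the sample tally above
statuses = ["met"] * 5 + ["partially met"] * 6 + ["not met"] * 9
print(tally(statuses))  # {'met': 5, 'partially met': 6, 'not met': 9}
```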

#### 2. Recommendations to reach full compliance

* Ordered list of actions such as:
  * “Add explicit model equations and stepwise processing flow”
  * “Provide scenario/back-test results for varied conditions”
  * “Add independence attestation signed by IMV head”
  * “Provide monitoring dashboard / KPI log”

Each recommendation maps directly to the control that failed.

#### 3. Controls Assessment Table

For **each** control, ACC shows:

* **Control ID**, as defined in your control set (if present)
* **Evidence code**, as defined in your control set (if present)
* **Status**: ✅ Met / ⚠️ Partially met / ❌ Not met
* **Key evidence** found in the document, with a link to the relevant section.
* **Comment / suggestion** explaining the rationale behind the status and how to close the gap.

<figure><img src="https://content.gitbook.com/content/sMRjTa7AGX49De62RAyl/blobs/0x1cuHstGqgOflCrb6Su/image.png" alt="" width="563"><figcaption></figcaption></figure>

***

### Step 3 — Apply recommendations and re-run the assessment

As part of the assessment, the original document is annotated with comments and recommendations.

To address the gaps in the document, you have two options:

1. **Edit outside Vectice:**
   * Export the assessed document, with ACC comments attached, using the **Export** button
   * Share/edit externally
   * Re-import the updated version and re-run ACC
2. **Edit in Vectice:**
   * Open the annotated document directly in Vectice.
   * Add missing evidence (stress tests, monitoring results, attestations, etc.). You can use Vectice [AskAI](https://docs.vectice.com/25.4/introduction/readme/askai) to go faster.
   * Click **Run Automated Compliance Check** again.

<figure><img src="https://content.gitbook.com/content/sMRjTa7AGX49De62RAyl/blobs/fVlwWiX65kYbeHKLBggD/unknown.png" alt=""><figcaption></figcaption></figure>

ACC is designed to be **iterative**: fix → re-run → close remaining gaps.

***

### What ACC typically checks

* Purpose, business alignment, and methodology sections
* Data suitability: representativeness, transformations, logs
* Testing and boundaries: assumptions, scenarios/back-tests, extreme-input stress tests
* Benchmarking vs baseline/challenger models
* Monitoring & conservatism: KPI logs, overlays, rationale
* Validation governance: plan, scope, final report, independence attestation
* Documentation quality: complete package, completeness checklist, sign-offs

***

### Good usage patterns

* **Before handing off to the quality or validation team**: run ACC on the development or deployment document to spot missing content
* **Before audit / model committee**: export the assessment report and attach to the approval packet
* **During review:** import the provided document and ensure the documentation is complete before starting the review process.
* **After validation:** confirm that each step of the testing plan was followed

{% hint style="info" %}
You can maintain several control set variants (by regulation, by use case, or by model type) depending on how specific and granular you want to be, e.g. *SR 11-7 – development*, *SR 11-7 – monitoring*, *Internal – GenAI*.
{% endhint %}
