ISO 27001 Annex A 5.12 Audit Checklist

Auditing ISO 27001 Annex A 5.12 validates the classification scheme applied to information assets, confirming that each asset is protected in line with its sensitivity. The primary implementation requirement is that data is labelled, handled, and protected according to its value and criticality; the business benefit is the prevention of unauthorised disclosure and a reduced risk of data leakage.

Use this pass/fail checklist to strictly validate compliance with ISO 27001 Annex A 5.12 (Classification of information). For a detailed methodology on how to conduct the interviews and system tests required to generate this evidence, refer to our Annex A 5.12 Audit Guide.

1. Classification Scheme Formally Defined

  • Verification Criteria: A formal policy document exists that explicitly defines the organisation’s classification levels (e.g., Public, Internal, Confidential) and is approved by management.
  • Required Evidence: The “Information Classification Policy” (Version Control Table showing approval within the last 12 months).
  • Pass/Fail Test: If the scheme relies on ad-hoc terms or unwritten “common knowledge” rather than a documented hierarchy, mark as Non-Compliant.
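A documented hierarchy, as opposed to ad-hoc terms, can be made machine-checkable. The sketch below is a minimal illustration (the level names are the examples from the criteria above, not any specific organisation’s policy): an ordered enum gives the levels a strict ranking, so “does this label meet the required level?” becomes a comparison rather than common knowledge.

```python
from enum import IntEnum

# Hypothetical hierarchy mirroring a documented Information Classification
# Policy. IntEnum gives the levels a strict ordering, so comparisons like
# "is this label high enough?" are unambiguous.
class Classification(IntEnum):
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2

def meets_minimum(actual: Classification, minimum: Classification) -> bool:
    """Return True if the asset's label meets or exceeds the required level."""
    return actual >= minimum
```

For example, `meets_minimum(Classification.PUBLIC, Classification.CONFIDENTIAL)` returns `False`, because “Public” sits below “Confidential” in the documented hierarchy.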

2. Handling Guidelines Explicitly Documented

  • Verification Criteria: Each defined classification level has clear, written handling instructions covering storage, transmission, and destruction.
  • Required Evidence: A “Handling Matrix” or “Data Handling Procedure” document that maps levels (e.g., “Confidential”) to technical requirements (e.g., “AES-256 Encryption required”).
  • Pass/Fail Test: If the policy defines “Top Secret” but fails to specify how to transmit it (e.g., “Do not email”), mark as Non-Compliant.
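The pass/fail test above is a completeness check on the Handling Matrix: every level must define all three columns (storage, transmission, destruction). A minimal sketch, with illustrative level names and rules that are assumptions rather than real policy text:

```python
# Hypothetical handling matrix: every classification level must define
# storage, transmission, and destruction rules. Level names and rule
# text are illustrative only.
HANDLING_MATRIX = {
    "Public": {
        "storage": "No restrictions",
        "transmission": "No restrictions",
        "destruction": "Standard deletion",
    },
    "Confidential": {
        "storage": "AES-256 encryption at rest",
        "transmission": "TLS 1.2+ only; never plain email",
        "destruction": "Cryptographic erasure, logged",
    },
}

REQUIRED_COLUMNS = {"storage", "transmission", "destruction"}

def incomplete_levels(matrix: dict) -> list[str]:
    """Return levels missing any required handling instruction —
    each one is a Non-Compliant finding under the pass/fail test."""
    return [level for level, rules in matrix.items()
            if REQUIRED_COLUMNS - rules.keys()]
```

A level defined as `{"Top Secret": {"storage": "Vault only"}}` would be returned by `incomplete_levels`, because its transmission and destruction instructions are missing.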

3. Asset Register Classification Integration Verified

  • Verification Criteria: The Information Asset Inventory (Annex A 5.9) includes a mandatory column for “Classification Level” that is populated for all critical assets.
  • Required Evidence: A sample from the Master Asset Register showing the “Classification” field populated for 5 randomly selected information assets.
  • Pass/Fail Test: If the Asset Register lists hardware (laptops) but does not classify the data stored on them, mark as Non-Compliant.
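Sampling the register for blank classification fields is easy to automate if the register exports to CSV. A minimal sketch, assuming hypothetical column names (“Asset”, “Classification”) that will differ from any real register:

```python
import csv
import io

def unclassified_assets(register_csv: str) -> list[str]:
    """Return asset names whose 'Classification' field is blank —
    a populated field for every asset is the pass condition."""
    reader = csv.DictReader(io.StringIO(register_csv))
    return [row["Asset"] for row in reader
            if not row.get("Classification", "").strip()]

# Illustrative register extract; field names and rows are assumptions.
SAMPLE_REGISTER = """Asset,Owner,Classification
HR Database,HR Director,Confidential
Marketing Wiki,CMO,Internal
Laptop Fleet Data,IT Manager,
"""
```

Run against the sample, the check flags “Laptop Fleet Data”: the hardware is listed, but the data stored on it carries no classification.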

4. Visual Labelling Implementation Verified

  • Verification Criteria: Digital and physical assets carry visual indicators of their classification status in headers, footers, or metadata.
  • Required Evidence: Screenshots of a “Confidential” Word document showing the watermark/header, or a photo of a physical “Restricted” file folder.
  • Pass/Fail Test: If a document is listed as “Confidential” in the registry but prints without a visual warning label, mark as Non-Compliant.

5. Data Loss Prevention (DLP) Rules Active

  • Verification Criteria: Technical controls are active that detect and block the unauthorised transfer of classified information based on its label.
  • Required Evidence: System logs from the DLP or Email Gateway showing a blocked attempt to send “Internal” data to an external domain.
  • Pass/Fail Test: If you can successfully email a file tagged “Restricted” to a personal Gmail account without a block or alert, mark as Non-Compliant.
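The rule the auditor is probing can be sketched as a toy gateway decision: block any restricted label leaving the organisation’s domains. The internal domain and label set below are assumptions for illustration, not a real DLP or Email Gateway configuration:

```python
# Hypothetical policy inputs — substitute the organisation's own
# domains and classification labels.
INTERNAL_DOMAINS = {"example.com"}
BLOCKED_EXTERNAL_LABELS = {"Internal", "Confidential", "Restricted"}

def dlp_verdict(label: str, recipient: str) -> str:
    """Return 'BLOCK' or 'ALLOW' for a labelled attachment sent by email."""
    domain = recipient.rsplit("@", 1)[-1].lower()
    if domain not in INTERNAL_DOMAINS and label in BLOCKED_EXTERNAL_LABELS:
        return "BLOCK"
    return "ALLOW"
```

Under this rule, a “Restricted” file addressed to a personal Gmail account returns `BLOCK`; the same file to an internal address returns `ALLOW`. The audit test is simply whether the live gateway matches that expected behaviour and logs the blocked attempt.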

6. Third-Party Mapping Agreements Validated

  • Verification Criteria: Contracts with suppliers receiving sensitive data include a mapping of your classification levels to theirs.
  • Required Evidence: A signed Data Processing Agreement (DPA) or Supplier Contract referencing the handling requirements for “Confidential” data.
  • Pass/Fail Test: If a supplier contract states “protect data” but does not define what specific security level applies to your transmitted data, mark as Non-Compliant.

7. User Awareness of Classification Confirmed

  • Verification Criteria: Employees can correctly identify the classification level of a sample document and describe the handling procedure without prompting.
  • Required Evidence: Interview notes or a scored “Phishing/Handling Simulation” test result showing >90% pass rate.
  • Pass/Fail Test: If a user handles a “Confidential” document correctly but calls it “Private” (using incorrect terminology not in the policy), mark as Non-Compliant.

8. De-classification Process Evidence Present

  • Verification Criteria: A process exists and is used to downgrade data classification when sensitivity decreases (e.g., after public release).
  • Required Evidence: An audit trail or log showing a document’s status changing from “Confidential” to “Public” authorised by the owner.
  • Pass/Fail Test: If 5-year-old marketing drafts are still classified as “Confidential” with no review date, mark as Non-Compliant (Over-classification).
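The over-classification condition above reduces to a review-date sweep: flag every “Confidential” record whose review date is missing or in the past. A minimal sketch, with record fields and sample data that are assumptions, not real register entries:

```python
from datetime import date

def overdue_reviews(records: list[dict], today: date) -> list[str]:
    """Flag confidential documents with a missing or past review date —
    the over-classification condition in the pass/fail test."""
    flagged = []
    for rec in records:
        if rec["classification"] != "Confidential":
            continue
        review = rec.get("review_date")
        if review is None or review < today:
            flagged.append(rec["name"])
    return flagged

# Illustrative records; names, labels, and dates are assumptions.
RECORDS = [
    {"name": "2019 campaign draft", "classification": "Confidential",
     "review_date": None},
    {"name": "Board pack Q3", "classification": "Confidential",
     "review_date": date(2026, 3, 31)},
]
```

The old draft with no review date is flagged; the board pack with a future review date passes.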

9. Legal & Regulatory Alignment Verified

  • Verification Criteria: Data subject to specific laws (GDPR, HIPAA) is automatically assigned a classification level that satisfies those legal obligations.
  • Required Evidence: The Legal Register (Annex A 5.31) mapped against the Classification Policy (e.g., “PII = Confidential”).
  • Pass/Fail Test: If PII is found in a system classified as “Public” or “General,” mark as Non-Compliant immediately.
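The “PII in a Public system” test can be approximated with a pattern scan. The patterns below are deliberately crude placeholders; a real check would be driven by the organisation’s Legal Register (Annex A 5.31), not two regexes:

```python
import re

# Crude illustrative PII patterns — assumptions, not a production ruleset.
PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # US SSN-style number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email address
]

def pii_in_public_asset(classification: str, text: str) -> bool:
    """Immediate Non-Compliant finding: PII found inside an asset
    labelled 'Public' or 'General'."""
    if classification not in {"Public", "General"}:
        return False
    return any(p.search(text) for p in PII_PATTERNS)
```

A hit in a “Public” asset is the immediate fail; the same content in a “Confidential” asset is handled by the handling-matrix controls instead.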

10. Owner-Driven Classification Confirmed

  • Verification Criteria: The Information Owner (business side) determined the classification, not the System Administrator (IT side).
  • Required Evidence: Email confirmation, meeting minutes, or workflow logs showing the Information Owner approving the classification level.
  • Pass/Fail Test: If IT sets the classification defaults globally without input from the specific data owners, mark as Non-Compliant.

ISO 27001 Annex A 5.12 SaaS / GRC Platform Failure Checklist

Automated Labelling
  • The “Checkbox Compliance” Trap: Tool claims “AI-powered auto-classification” exists.
  • The Reality Check: Auditor must upload a dummy PII file and time how long it takes to tag. Many tools only scan once every 24 hours.

Visual Marking
  • The “Checkbox Compliance” Trap: Tool shows a coloured tag inside the web dashboard.
  • The Reality Check: Auditor must export the file to PDF. If the PDF lacks the “Confidential” watermark, the tool fails the “labelling” requirement for data egress.

Granularity
  • The “Checkbox Compliance” Trap: Tool allows classifying a “Project” or “Folder”.
  • The Reality Check: Auditor must attempt to classify a single file differently from its parent folder. If the tool forces inheritance, it fails granularity checks.

Search Indexing
  • The “Checkbox Compliance” Trap: Permissions prevent users from opening the file.
  • The Reality Check: Auditor must search for sensitive terms (e.g., “Salary”) as a low-level user. If the search preview shows snippet text, the index is leaking data.

Mobile Parity
  • The “Checkbox Compliance” Trap: Desktop UI shows clear “Restricted” banners.
  • The Reality Check: Auditor must open the app on a smartphone. If the banners are hidden to save screen space, the tool is non-compliant on mobile.

Default Settings
  • The “Checkbox Compliance” Trap: Admin can manually set folders to “Private”.
  • The Reality Check: Auditor must create a new workspace. If it defaults to “Public” (Open), the tool violates Privacy by Design principles.

API Security
  • The “Checkbox Compliance” Trap: UI prevents export of data.
  • The Reality Check: Auditor must query the API directly. If the JSON response includes the sensitive data without the classification metadata, it’s a fail.
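The API Security check above is mechanical: pull the export payload and flag any record carrying sensitive data without its classification metadata. A minimal sketch, assuming a hypothetical JSON export format and field names:

```python
import json

def missing_classification(api_response: str) -> list[str]:
    """Return IDs of records returned by a (hypothetical) export API
    that lack a 'classification' field — the failure condition in the
    API Security check."""
    records = json.loads(api_response)
    return [r["id"] for r in records if "classification" not in r]

# Illustrative payload; record structure and field names are assumptions.
RESPONSE = json.dumps([
    {"id": "doc-1", "salary": 85000, "classification": "Confidential"},
    {"id": "doc-2", "salary": 92000},
])
```

Here the second record exposes salary data with no classification metadata, so the platform fails the check even though its UI blocks export.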

About the author

Stuart Barker

Stuart Barker is a veteran practitioner with over 30 years of experience in systems security and risk management. Holding an MSc in Software and Systems Security, he combines academic rigor with extensive operational experience, including a decade leading Data Governance for General Electric (GE).

As a qualified ISO 27001 Lead Auditor, Stuart possesses distinct insight into the specific evidence standards required by certification bodies. His toolkits represent an auditor-verified methodology designed to minimise operational friction while guaranteeing compliance.
