ISO 27001 Annex A 5.13 Audit Checklist

Auditing ISO 27001 Annex A 5.13 (Information labelling) involves verifying that an appropriate set of procedures is implemented to label information in accordance with the organisation’s information classification scheme. The audit validates the primary implementation requirement: applying visible and metadata-based labels that communicate the value and sensitivity of data. The business benefit is consistent data handling, prevention of accidental leakage, and automation of protection mechanisms such as DLP based on clear classification tags.

This technical verification tool is designed for lead auditors to confirm the rigour of organisational data labelling and protection protocols. Use this checklist to validate compliance with ISO 27001 Annex A 5.13 (Information labelling) by ensuring that all information assets are identifiable and handled according to their specific classification levels.

1. Information Labelling Procedure Formalisation Verified

Verification Criteria: A documented procedure exists that defines the specific methods for labelling information in all formats (physical and electronic) based on the classification scheme.

Required Evidence: Approved Information Labelling and Handling Procedure, integrated with the wider Classification Policy.

Pass/Fail Test: If the organisation has a classification policy but no documented instructions on how or where to apply the labels, mark as Non-Compliant.

2. Physical Media Labelling Accuracy Confirmed

Verification Criteria: Physical assets, including removable media, printed reports, and backup tapes, bear visible classification labels.

Required Evidence: Physical inspection of a sample of 5-10 items (e.g., printed board packs, encrypted USB drives) for correct classification stickers or markings.

Pass/Fail Test: If a physical document containing sensitive PII or financial data is found without a classification marking, mark as Non-Compliant.

3. Digital Document Metadata Labelling Validated

Verification Criteria: Electronic documents (PDFs, Word documents, spreadsheets) contain internal classification labels within the metadata or as visible headers/footers.

Required Evidence: A sample of 10 documents from internal repositories (SharePoint/Google Drive) showing consistent use of labelling tools (e.g., Microsoft Purview/Sensitivity Labels).

Pass/Fail Test: If high-classification digital files lack corresponding metadata tags that trigger DLP (Data Loss Prevention) rules, mark as Non-Compliant.
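An auditor can spot-check item 3 programmatically rather than opening every file by hand. The sketch below, using only the Python standard library, inspects a `.docx` for an embedded sensitivity label; it assumes the common Microsoft Purview convention of storing labels as custom document properties named `MSIP_Label_<GUID>_Name`, and builds a minimal in-memory stand-in file for demonstration. Treat it as an illustrative starting point, not a complete Purview integration.

```python
import io
import re
import zipfile
import xml.etree.ElementTree as ET

NS = {
    "cp": "http://schemas.openxmlformats.org/officeDocument/2006/custom-properties",
    "vt": "http://schemas.openxmlformats.org/officeDocument/2006/docPropsVTypes",
}

def extract_msip_label(docx_bytes: bytes):
    """Return the sensitivity label name embedded in a .docx, or None if unlabelled."""
    with zipfile.ZipFile(io.BytesIO(docx_bytes)) as zf:
        if "docProps/custom.xml" not in zf.namelist():
            return None
        root = ET.fromstring(zf.read("docProps/custom.xml"))
    for prop in root.findall("cp:property", NS):
        # Assumed convention: Purview persists the label name as a custom
        # property called "MSIP_Label_<label GUID>_Name".
        if re.fullmatch(r"MSIP_Label_[0-9a-fA-F-]+_Name", prop.get("name", "")):
            value = prop.find("vt:lpwstr", NS)
            if value is not None:
                return value.text
    return None

# Build a minimal in-memory .docx stand-in carrying a "Confidential" label,
# so the checker can be demonstrated without a real Office file.
custom_xml = (
    '<Properties xmlns="http://schemas.openxmlformats.org/officeDocument/2006/custom-properties" '
    'xmlns:vt="http://schemas.openxmlformats.org/officeDocument/2006/docPropsVTypes">'
    '<property fmtid="{D5CDD505-2E9C-101B-9397-08002B2CF9AE}" pid="2" '
    'name="MSIP_Label_0000aaaa-11bb-22cc-33dd-444444eeeeee_Name">'
    '<vt:lpwstr>Confidential</vt:lpwstr></property></Properties>'
)
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("docProps/custom.xml", custom_xml)
sample_docx = buf.getvalue()
```

Running `extract_msip_label` over a sample of 10 repository files quickly surfaces any high-classification documents that lack the metadata tag the DLP rules depend on.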

4. Automated Labelling Integration in SaaS/Cloud Verified

Verification Criteria: Cloud environments and SaaS applications are configured to automatically apply labels to information created within the platform based on content scanning.

Required Evidence: Configuration screenshots of the auto-labelling policy settings in the cloud tenant (e.g., M365 Sensitivity Label Auto-labeling rules).

Pass/Fail Test: If labelling relies entirely on manual user selection for sensitive data without any automated “recommended” labelling backup, mark as Non-Compliant.
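The auto-labelling policies referenced above amount to content scanning rules that map detected patterns to a recommended label. The following is a minimal sketch of that logic (the patterns and label names are illustrative assumptions, not your tenant’s actual policy), useful for reasoning about what evidence the cloud configuration should produce.

```python
import re

# Illustrative rules: pattern detected in content -> recommended label.
# Real M365 auto-labelling uses sensitive information types, not these regexes.
RULES = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "Confidential"),   # US SSN-style number
    (re.compile(r"\b\d{16}\b"), "Confidential"),              # bare 16-digit card number
    (re.compile(r"(?i)\binternal use only\b"), "Internal"),
]

def recommend_label(text: str, default: str = "General") -> str:
    """Return the first matching label recommendation, else the default."""
    for pattern, label in RULES:
        if pattern.search(text):
            return label
    return default
```

The audit point is that a rule set like this runs regardless of whether the user remembers to pick a label, which is exactly the “recommended labelling backup” the Pass/Fail test looks for.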

5. Email Communication Labelling Consistency Confirmed

Verification Criteria: Outgoing and internal emails containing classified information are marked appropriately in the subject line or via header tags.

Required Evidence: Review of a sample of “Confidential” tagged emails to verify that the labelling remains intact during transmission.

Pass/Fail Test: If the organisation’s policy requires subject line marking (e.g., [PROTECTED]) but recent sensitive communications lack this, mark as Non-Compliant.
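Checking a sample of exported message subjects against the policy’s required marking can be automated. This sketch assumes a policy requiring a bracketed prefix such as `[PROTECTED]` or `[CONFIDENTIAL]` (substitute your organisation’s actual markings) and flags the subjects that lack it.

```python
import re

# Assumed policy: sensitive mail subjects must start with "[PROTECTED] "
# or "[CONFIDENTIAL] ". Adjust the pattern to the organisation's scheme.
REQUIRED_PREFIX = re.compile(r"^\[(PROTECTED|CONFIDENTIAL)\]\s")

def subject_is_marked(subject: str) -> bool:
    return bool(REQUIRED_PREFIX.match(subject))

def unmarked(subjects):
    """Return the subjects from a sample that fail the marking check."""
    return [s for s in subjects if not subject_is_marked(s)]
```

Any non-empty result from `unmarked()` over a sample of known-sensitive communications is direct evidence for the Non-Compliant finding described above.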

6. Labelling Exceptions for Public Information Validated

Verification Criteria: Information intended for public release is either explicitly labelled “Public” or is exempt from labelling via a documented exception within the procedure.

Required Evidence: Public-facing marketing materials or website content reviewed against the “Labelling Exemptions” list in the procedure.

Pass/Fail Test: If public information is mixed with unlabelled internal-only information due to a lack of clear exclusion criteria, mark as Non-Compliant.

7. Labelling Integrity in System Outputs Verified

Verification Criteria: Reports generated from core databases or ERP systems automatically include the correct classification label in the output format.

Required Evidence: Sample of generated reports (PDF/CSV) from a production system showing a pre-configured classification footer.

Pass/Fail Test: If system-generated sensitive reports require manual labelling after export, mark as Non-Compliant.

8. Employee Competence in Labelling Tools Confirmed

Verification Criteria: Personnel demonstrate an understanding of how to use the organisation’s specific labelling tools and why different labels are applied.

Required Evidence: Training records specifically covering the “Labelling and Handling” module or interview notes with a random staff sample.

Pass/Fail Test: If staff members are unaware of the existence of labelling tools or cannot explain the difference between two internal labels, mark as Non-Compliant.

9. Alignment Between Inventory and Labelling Present

Verification Criteria: Classification labels found on sampled assets match the classification level recorded in the central Asset Register (Annex A 5.9).

Required Evidence: Cross-reference of 5 physical or digital assets against their corresponding entry in the Asset Register.

Pass/Fail Test: If an asset is marked as “Confidential” in the register but bears a “General” label in practice, mark as Non-Compliant.
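The register-versus-reality comparison in item 9 is a simple join. The sketch below assumes the Asset Register can be exported as CSV with `asset_id` and `classification` columns (column names are an assumption) and that the auditor records the labels actually observed on each sampled asset.

```python
import csv
import io

# Illustrative register export and field notes; real data replaces these.
register_csv = """asset_id,classification
FS-001,Confidential
FS-002,General
"""
observed_labels = {"FS-001": "General", "FS-002": "General"}

def find_mismatches(register_text: str, observed: dict):
    """Return (asset_id, registered, observed) for every label discrepancy."""
    mismatches = []
    for row in csv.DictReader(io.StringIO(register_text)):
        seen = observed.get(row["asset_id"])
        if seen is not None and seen != row["classification"]:
            mismatches.append((row["asset_id"], row["classification"], seen))
    return mismatches
```

Each tuple returned is one instance of the “Confidential in the register, General in practice” failure the Pass/Fail test describes.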

10. Labelling Review and Update Records Identified

Verification Criteria: Labels are reviewed and updated if the classification of the information changes over time (re-classification).

Required Evidence: Document version history or audit logs in the labelling tool showing a change of label based on a change in data sensitivity.

Pass/Fail Test: If highly sensitive historical data remains under an obsolete or “Legacy” labelling scheme without a migration plan, mark as Non-Compliant.
ISO 27001 Annex A 5.13 SaaS / GRC Platform Failure Checklist

| Control Requirement | The ‘Checkbox Compliance’ Trap | The Reality Check |
| --- | --- | --- |
| Handling Procedures | GRC tool identifies that a “Classification Policy” exists. | Verify that the policy includes *specific* handling rules (e.g., “Confidential must be encrypted”) for each label. |
| Labelling Implementation | SaaS dashboard shows “Purview Enabled” for the tenant. | Inspect the actual file repository; verify whether users are actually *applying* the labels or just ignoring the prompt. |
| Metadata Integrity | Tool checks if a “Sensitivity” column exists in SharePoint. | Download a file; check if the label persists when the file is moved out of the SharePoint environment. |
| Physical Labelling | GRC platform ignores physical media entirely. | Real auditors must physically see labels on backup drives or paper archives; software cannot verify physical stickers. |
| Training Evidence | Platform logs “Security Awareness Course” as complete. | Check the course content; ensure it specifically covers *how* to use the organisation’s classification software. |
| System Outputs | SaaS tool assumes generic cloud outputs are compliant. | Verify whether custom internal apps or databases have been configured to print labels on reports. |
| Asset Alignment | Tool verifies that the Asset Register is filled in. | Verify that the classification *logic* in the register actually matches the labels seen in the wild. |

About the author

Stuart Barker

ISO 27001 Ninja

Stuart Barker is a veteran practitioner with over 30 years of experience in systems security and risk management. Holding an MSc in Software and Systems Security, he combines academic rigour with extensive operational experience, including a decade leading Data Governance for General Electric (GE).

As a qualified ISO 27001 Lead Auditor, Stuart possesses distinct insight into the specific evidence standards required by certification bodies. His toolkits represent an auditor-verified methodology designed to minimise operational friction while guaranteeing compliance.
