ISO 27001 Annex A 5.28 is a security control that requires formal procedures for the identification, collection, acquisition, and preservation of digital evidence. The primary implementation requirement is a documented, forensically sound process with a clear chain of custody, providing the business benefits of legal admissibility and protection against intellectual property theft.
In the fast-paced world of artificial intelligence, the primary focus is on innovation—building breakthrough models, securing new funding, and capturing market share. However, this focus on growth can obscure a critical vulnerability: a single information security incident can trigger significant legal, financial, and disciplinary consequences that threaten the entire business. When an incident occurs, your ability to respond effectively depends entirely on the quality of your evidence.
This is the core of ISO 27001 Annex A 5.28 Collection of evidence, a control that requires your organization to “identify, collect, acquire and preserve evidence related to information security incidents.” Without a formal, repeatable process for this, you are left defenceless when you need to support legal proceedings or internal disciplinary actions.
Table of contents
- The “No-BS” Translation: Decoding the Requirement
- The Business Case: Why This Actually Matters for AI Companies
- DORA, NIS2 and AI Regulation: Prove It
- ISO 27001 Toolkit vs SaaS Platforms: The Evidence Trap
- The Unique Risks: Applying Annex A 5.28 to Your AI Workflows
- Your Compliance Blueprint: Four Steps to Mastering Evidence Collection
- The Evidence Locker: What the Auditor Needs to See
- Common Pitfalls & Auditor Traps
- Handling Exceptions: The “Break Glass” Protocol
- The Process Layer: “The Standard Operating Procedure (SOP)”
The “No-BS” Translation: Decoding the Requirement
Let’s strip away the consultant-speak. Annex A 5.28 is about CSI: Cyber. It demands that when you catch a hacker (or a rogue employee), you don’t accidentally destroy the fingerprints.
| The Auditor’s View (ISO 27001) | The AI Company View (Reality) |
|---|---|
| “The organisation shall define and apply procedures for the identification, collection, acquisition and preservation of evidence related to information security events.” | Don’t touch the crime scene. If a developer steals your code and quits, don’t just format their laptop. You need to image the drive forensically so you can prove the theft in court. |
| “Procedures… shall take into account legal requirements.” | Chain of Custody is King. If you hand the laptop to IT, then to HR, then to the CEO, and nobody signs for it, the evidence is worthless in court. You need a log: “Person A gave Item X to Person B at [Time].” |
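To make the “Person A gave Item X to Person B at [Time]” log concrete, here is a minimal chain-of-custody record sketched in Python. The field names and the custody_log.jsonl file are illustrative assumptions, not anything prescribed by the standard; the signed paper form remains the primary record, and a digital log like this simply makes handovers easier to reconstruct.

```python
# Minimal chain-of-custody record (illustrative; field names are assumptions).
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class CustodyTransfer:
    item_id: str         # asset tag or serial number of the seized item
    description: str     # what the item is
    released_by: str     # "Person A" handing the item over
    received_by: str     # "Person B" taking possession
    transferred_at: str  # ISO 8601 timestamp of the handover
    purpose: str         # why the handover happened

def log_transfer(record: CustodyTransfer, path: str = "custody_log.jsonl") -> None:
    """Append the handover to an append-only log; print and sign a paper copy as well."""
    with open(path, "a") as f:
        f.write(json.dumps(asdict(record)) + "\n")

log_transfer(CustodyTransfer(
    item_id="LAPTOP-123",
    description="Developer laptop, serial C02XXXXX",
    released_by="IT (Person A)",
    received_by="External forensics firm (Person B)",
    transferred_at=datetime.now(timezone.utc).isoformat(),
    purpose="Forensic imaging",
))
```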
The Business Case: Why This Actually Matters for AI Companies
Why should a founder care about “evidence handling”? Because intellectual property theft is one of the most common and most damaging incidents an AI company will face, and you can’t sue without proof.
The Sales Angle
Enterprise clients will ask: “How do you investigate internal misconduct?” If your answer is “We look at logs,” that’s weak. If your answer is “We follow a forensically sound evidence collection procedure compliant with ISO 27037 to ensure legal admissibility,” you sound like a serious partner who can protect their data.
The Risk Angle
The “Departing Co-Founder” Lawsuit: A co-founder leaves and starts a competitor using your exact model architecture. You want to sue. But because you let them keep their laptop “for personal photos” before wiping it, you destroyed the evidence of the exfiltration. A 5.28 process would have seized and imaged that device immediately.
DORA, NIS2 and AI Regulation: Prove It
Regulators don’t just want to know that you were breached; they want to know how.
- DORA (Article 11): Financial entities must have procedures to preserve evidence for forensic analysis. If a bank’s data is stolen from your AI platform, you must preserve the logs so the bank can investigate.
- NIS2 Directive: Requires incident handling that includes “evidence gathering.” You must be able to show the root cause. If you rotate your logs too fast (overwrite evidence), you are non-compliant.
- GDPR (Accountability): If you suffer a data breach, the ICO/DPA will ask for evidence of the scope. “How many records were taken?” If you can’t prove it was only 100 records, they might fine you as if it were 1,000,000. Evidence limits liability.
ISO 27001 Toolkit vs SaaS Platforms: The Evidence Trap
SaaS platforms collect logs, but they don’t collect evidence. There is a legal difference. Here is why the ISO 27001 Toolkit is superior.
| Feature | ISO 27001 Toolkit (Hightable.io) | Online SaaS Platform |
|---|---|---|
| The Process | Forensic Policy. A document that tells you how to seize a laptop without corrupting the timestamps. | Log Aggregation. Platforms collect logs, but they don’t tell you how to physically handle a compromised device. |
| Chain of Custody | Templates. A “Chain of Custody” form you can print and sign. Essential for physical evidence (laptops, USBs). | Digital Only. SaaS tools can’t track physical assets moving between people. They fail when the evidence is hardware. |
| Ownership | Your Strategy. You decide retention periods based on legal advice (e.g., 7 years). | Retention Limits. SaaS platforms often delete logs after 30 or 90 days to save costs. If you need evidence from 6 months ago, it’s gone. |
| Cost | One-off fee. Pay once. Keep your evidence forever. | Storage Costs. Keeping forensic-grade logs in a SaaS tool gets incredibly expensive as data volume grows. |
The Unique Risks: Applying Annex A 5.28 to Your AI Workflows
For an AI business, “evidence” extends to the core of your operations: data, code, and infrastructure.
- Proprietary Models: Your model weights are among your most valuable assets; immutable access logs are the evidence you need to demonstrate unauthorized access to them (see the evidence-manifest sketch after this list).
- MLOps Pipelines: Logs from your MLOps pipelines are critical evidence to prove or disprove a model poisoning attack.
- Algorithmic Investigations: Code commits and model behaviour logs are essential for investigations into algorithmic bias.
- Data Provenance: Verifiable records proving the licensing of training data are critical in IP disputes.
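As noted in the first item, a simple way to make these artifacts defensible as evidence is to fingerprint them at the point of collection. The sketch below hashes collected files with SHA-256 and records the digests in a manifest so any later modification is detectable; the file paths and manifest name are assumptions for illustration.

```python
# Fingerprint collected artifacts with SHA-256 so later tampering is detectable.
# File paths and the manifest name are illustrative assumptions.
import hashlib
import json
from datetime import datetime, timezone

def sha256_file(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

artifacts = [
    "models/prod/weights-v42.safetensors",     # proprietary model weights
    "logs/mlops/pipeline-2024-05-01.jsonl",    # MLOps pipeline logs
    "datasets/train/snapshot-2024-04-30.tar",  # training set snapshot
]

manifest = {
    "collected_at": datetime.now(timezone.utc).isoformat(),
    "collected_by": "incident-commander",
    "files": [{"path": p, "sha256": sha256_file(p)} for p in artifacts],
}

with open("evidence_manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)
```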
Your Compliance Blueprint: Four Steps to Mastering Evidence Collection
Achieving compliance with Annex A 5.28 is about engineering a resilient, legally defensible process.
Understand the Legal and Regulatory Landscape
Understand the laws that apply to you. Digital evidence often spans national boundaries. What is admissible in the US might not be in the EU.
Document Your Collection of Evidence Policy
A documented policy is the singular source of truth. It guides your team on how to identify, gather, and preserve evidence. To an auditor, an undocumented process is an uncontrolled one.
Establish Your Process (and Get Professional Help)
Detail specific procedures. Best practice is to have a procedure that calls in professionals for forensic work. Untrained staff often corrupt evidence.
Integrate with Your Incident Management Framework
Evidence collection must be part of your Incident Management framework (A 5.24). Trigger the evidence procedure as soon as an incident looks legal/disciplinary.
The Evidence Locker: What the Auditor Needs to See
When the audit comes, prepare these artifacts:
- Evidence Collection Policy (PDF): The signed document defining your rules.
- Chain of Custody Forms (Template): A blank copy of the form you would use.
- Asset Retention Log (Spreadsheet): A list of devices currently held as evidence (e.g., “Laptop #123 held in safe”).
- Log Retention Settings (Screenshot): Evidence that AWS CloudTrail/Okta logs are set to retain for 1 year (or whatever your policy says).
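What that screenshot looks like depends on your stack. As one illustration, if application logs live in AWS CloudWatch Logs, a one-year retention policy can be applied and then read back for the audit pack with boto3; the log group name and the 365-day figure below are assumptions chosen to match a one-year policy.

```python
# Apply a one-year retention policy to a CloudWatch Logs group, then read it back
# as audit evidence. The log group name and 365 days are assumptions.
import boto3

logs = boto3.client("logs")

logs.put_retention_policy(
    logGroupName="/ai-platform/api-gateway",  # hypothetical log group
    retentionInDays=365,                      # must be one of the values CloudWatch accepts
)

response = logs.describe_log_groups(logGroupNamePrefix="/ai-platform/api-gateway")
for group in response["logGroups"]:
    print(group["logGroupName"], group.get("retentionInDays"))
```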
Common Pitfalls & Auditor Traps
Here are the top 3 ways AI companies fail this control:
- The “Wipe and Reissue” Reflex: IT’s instinct is to wipe a virus-infected laptop and give it back to the user to maintain productivity. This destroys the evidence of how the virus got there (e.g., phishing email). You must image it first.
- The “Admin Login” Error: During an investigation, the admin logs in to the compromised user’s account to “look around.” This changes the “Last Login” timestamp and contaminates the evidence. Use read-only logs.
- The “Screenshots” Fallacy: Taking a screenshot of a log isn’t forensic evidence (it can be Photoshopped). You need the raw log file with a checksum (hash) to prove it hasn’t been altered.
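To illustrate that last point, a digest recorded at collection time lets you prove later that the raw log has not changed. A minimal verification sketch, assuming the digest was stored in a manifest like the one shown earlier:

```python
# Verify a raw log file against the SHA-256 digest recorded at collection time.
# The digest value and file path are illustrative assumptions.
import hashlib

def sha256_file(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

recorded = "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"  # from the manifest
if sha256_file("logs/mlops/pipeline-2024-05-01.jsonl") == recorded:
    print("Integrity verified: the log matches its recorded fingerprint.")
else:
    print("WARNING: the log differs from the fingerprint taken at collection.")
```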
Handling Exceptions: The “Break Glass” Protocol
Sometimes you can’t preserve everything (e.g., disk is full, or privacy laws prevent full capture).
The “Limited Collection” Workflow:
- Trigger: Privacy constraints (e.g., employee personal data on device) prevent full imaging.
- Action: Collect targeted logs only (e.g., firewall logs, file access logs) rather than full disk image.
- Approval: Legal Counsel approves the scope of collection to ensure privacy compliance.
- Documentation: Record why full evidence wasn’t collected.
The Process Layer: “The Standard Operating Procedure (SOP)”
How to operationalise A 5.28 using your existing stack (Linear, Google Drive).
- Step 1: Identification (Manual). Incident Commander flags an incident as “Legal Hold Required.”
- Step 2: Collection (Automated/Manual). IT runs a script to snapshot the AWS EBS volume (Automated; see the sketch after this SOP). IT seizes the physical laptop (Manual).
- Step 3: Documentation (Manual). IT fills out the “Chain of Custody” form (High Table Template) detailing serial numbers and times.
- Step 4: Preservation (Manual). Physical evidence goes into a locked safe. Digital snapshots are locked with “WORM” (Write Once Read Many) policies in AWS S3 (Object Lock).
- Step 5: Handover (Manual). Evidence is transferred to external forensics firm via secure courier or encrypted drive. Transfer logged.
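As referenced in Step 2, the digital half of this SOP can be scripted. The sketch below, assuming a hypothetical volume ID, incident reference, bucket name, and retain-until date, snapshots the suspect EBS volume (Step 2) and preserves a record of it in an S3 bucket created with Object Lock enabled (Step 4).

```python
# Digital steps of the SOP: snapshot the volume (Step 2), then preserve a record of it
# under a WORM retention policy (Step 4). Volume ID, incident reference, bucket name
# and retain-until date are illustrative assumptions.
import json
from datetime import datetime, timezone
import boto3

ec2 = boto3.client("ec2")
s3 = boto3.client("s3")

# Step 2: snapshot the suspect EBS volume so the original can be left untouched.
snapshot = ec2.create_snapshot(
    VolumeId="vol-0123456789abcdef0",
    Description="Evidence snapshot for incident INC-2024-017",
)

# Step 4: write a preservation record to a bucket created with Object Lock enabled;
# in COMPLIANCE mode the object cannot be deleted or overwritten before the date below.
record = {
    "incident": "INC-2024-017",
    "snapshot_id": snapshot["SnapshotId"],
    "created_at": datetime.now(timezone.utc).isoformat(),
}
s3.put_object(
    Bucket="evidence-locker-worm",
    Key="INC-2024-017/snapshot-record.json",
    Body=json.dumps(record).encode(),
    ObjectLockMode="COMPLIANCE",
    ObjectLockRetainUntilDate=datetime(2031, 1, 1, tzinfo=timezone.utc),  # e.g. a 7-year hold
)
```

COMPLIANCE mode is the stricter of the two Object Lock modes: the retention period cannot be shortened or removed by any user, including the account root, which is exactly the property you want for evidence.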
By implementing a formal process based on professionally developed policies – like those included in the High Table ISO 27001 Toolkit – you can build a defensible and audit-ready position. This allows you to protect your organization and maintain your focus on driving innovation with confidence.
ISO 27001 Annex A 5.28 for AI Companies FAQ
What is ISO 27001 Annex A 5.28 for AI companies?
ISO 27001 Annex A 5.28 requires AI companies to establish procedures for the identification, collection, acquisition, and preservation of digital evidence related to security incidents. For AI firms, this ensures that forensic data, from LLM prompt logs to model weights, is handled in a manner that maintains its integrity and legal admissibility.
What constitutes digital evidence for an AI security incident?
In the context of Annex A 5.28, digital evidence for AI companies typically includes the following high-density data points:
- API Gateway Logs: Timestamps and IP addresses associated with suspected prompt injection attacks.
- Inference Metadata: Parameters used during a specific model output that resulted in a security violation.
- Training Set Snapshots: Version-controlled datasets used at the time a model poisoning event occurred.
- System Audit Trails: Records of administrative access to GPU clusters or vector databases.
How do AI firms preserve the integrity of evidence for Annex A 5.28?
To preserve evidence integrity, AI firms must use cryptographic hashing (e.g., SHA-256) at the point of collection. This creates a “digital fingerprint” of logs or model files, ensuring that any subsequent modification is detectable. This process is vital for meeting the “Chain of Custody” requirements during a formal ISO 27001 audit or legal proceeding.
Is forensic readiness required for ISO 27001?
Yes, forensic readiness is a core component of Annex A 5.28. AI companies must proactively configure their infrastructure so that logs are retained for the period defined in their policy and legal advice (a year or more is common) and are stored in a “Write Once, Read Many” (WORM) format to prevent attackers from deleting evidence of their intrusion.
How does ISO 27001 Annex A 5.28 support legal compliance for AI?
Annex A 5.28 provides the technical foundation for meeting the reporting obligations of the EU AI Act and GDPR. By maintaining a structured evidence collection process, AI firms can provide regulators with verifiable proof of the incident’s scope, reducing the risk of “failure to report” fines which can reach up to 7% of global annual turnover under certain AI regulations.