ISO 27001 Annex A 5.21, Managing information security in the ICT supply chain, is a security control that requires organisations to define and implement processes for managing supply chain risks. For AI companies, this control is essential for securing upstream dependencies such as open-source libraries, cloud GPU providers, and data annotation services, ensuring that third-party vulnerabilities do not compromise model integrity or proprietary data.
For a modern AI business, the pace of innovation is relentless. To stay competitive, you rely on a complex ecosystem of third-party products and services, from cloud computing platforms to specialised data providers. While this strategy accelerates development, it also introduces significant, often hidden, security risks within your supply chain. Managing these dependencies is no longer just an IT issue; it is a strategic imperative for protecting your intellectual property and maintaining operational resilience.
The international standard for information security, ISO 27001, directly addresses this challenge through ISO 27001 Annex A 5.21, Managing information security in the ICT supply chain. The core requirement of this control is for your organisation to define and implement processes and procedures “to manage the information security risks associated with the ICT products and services supply chain.”
The purpose of this control is fundamentally preventative. It is designed to help you maintain an agreed level of information security in supplier relationships, ensuring that vulnerabilities from your partners do not become your own security incidents. For a business built on data and algorithms, understanding and implementing this control is a critical step in securing the very foundation of your innovation.
Table of contents
- The “No-BS” Translation: Decoding the Requirement
- The Business Case: Why This Actually Matters for AI Companies
- DORA, NIS2 and AI Regulation: Secure the Chain
- ISO 27001 Toolkit vs SaaS Platforms: The Supply Chain Trap
- Why Your AI Supply Chain Presents Unique Security Challenges
- Your Action Plan for Securing the AI Supply Chain
- The Evidence Locker: What the Auditor Needs to See
- Common Pitfalls & Auditor Traps
- Handling Exceptions: The “Break Glass” Protocol
- The Process Layer: “The Standard Operating Procedure (SOP)”
The “No-BS” Translation: Decoding the Requirement
Let’s strip away the consultant-speak. Annex A 5.21 is about understanding that your “product” is actually 90% other people’s code and infrastructure. It demands you stop trusting and start verifying.
| The Auditor’s View (ISO 27001) | The AI Company View (Reality) |
|---|---|
| “Processes and procedures shall be defined and implemented to manage the information security risks associated with the ICT products and services supply chain.” | Know what you are importing. 1. Don’t just pip install a random library for your production model without checking if it sends your env variables to Russia. 2. If AWS us-east-1 goes down, do you have a plan? If OpenAI changes their API terms, do you know about it? |
| “Requirements for information security… shall be defined and communicated to suppliers.” | Set the standard. Tell your data centre: “If you swap out a hard drive, you must crush the old one.” Tell your labeling service: “You cannot use Google Translate on our data.” |
The Business Case: Why This Actually Matters for AI Companies
Why should a founder care about “ICT Supply Chain”? Because you are building a house on rented land.
The Sales Angle
Enterprise clients will ask: “What is your Software Bill of Materials (SBOM)?” and “How do you secure your CI/CD pipeline?”. If your answer is “We don’t track that,” you are a high-risk vendor. If your answer is “We maintain a live SBOM, scan all dependencies for CVEs daily, and contractually mandate security patches from our cloud providers,” you win the deal. Annex A 5.21 is your proof of engineering maturity.
The Risk Angle
The “Upstream” Attack: Attackers love to poison open-source libraries (e.g., PyTorch dependencies). If you blindly pull the latest version, you might import a backdoor that exfiltrates your model weights. Annex A 5.21 forces you to pin versions and scan dependencies.
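As a minimal sketch of what “pin versions” means in practice, the check below flags any requirement that floats rather than locks to an exact release. The `reqs` content is illustrative; a real pipeline would add `--hash` integrity checks (pip’s hash-checking mode) or a lockfile tool such as pip-tools.

```python
def find_unpinned(requirements_text: str) -> list[str]:
    """Return requirement lines not pinned to an exact version."""
    unpinned = []
    for raw in requirements_text.splitlines():
        line = raw.split("#")[0].strip()  # drop comments and whitespace
        if not line:
            continue
        # Pinned lines use '==' (ideally paired with a --hash for integrity).
        if "==" not in line:
            unpinned.append(line)
    return unpinned

reqs = """\
torch==2.3.1
numpy>=1.24          # floating version: risky
some-random-lib      # no version at all
pandas==2.2.2 --hash=sha256:deadbeef
"""
print(find_unpinned(reqs))  # -> ['numpy>=1.24', 'some-random-lib']
```

Run this in CI and fail the build on any hit: a floating specifier is exactly the gap a poisoned upstream release walks through.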
DORA, NIS2 and AI Regulation: Secure the Chain
Regulators are targeting the “weakest link.”
- DORA (Article 28): Mandates management of “ICT Third-Party Risk.” You must assess the concentration risk. If you and all your competitors rely on the same Azure region, that is a systemic risk you must document.
- NIS2 Directive: Explicitly focuses on “security of the supply chain.” You are responsible for the security of the software you buy and the cloud services you use.
- EU AI Act: High-risk AI providers must ensure transparency. You need to know the provenance of the pre-trained models you use (e.g., Llama, Mistral). If you can’t trace the origin of your base model, you can’t comply.
ISO 27001 Toolkit vs SaaS Platforms: The Supply Chain Trap
SaaS platforms scan your cloud, but they can’t negotiate with your cloud provider. Here is why the ISO 27001 Toolkit is the smarter play.
| Feature | ISO 27001 Toolkit (Hightable.io) | Online SaaS Platform |
|---|---|---|
| Depth | Hardware & Software. Templates cover data centres, laptops, SaaS, and code libraries. | Cloud Only. Most platforms only look at AWS/Azure. They miss the physical router vendor or the bespoke code house. |
| Ownership | You Own the Risk Assessment. The Supply Chain Risk Register is yours. | Rented Logic. The platform gives you a “score” based on a proprietary algorithm. If you leave, you can’t explain your risk posture to an auditor. |
| Simplicity | Standard Clauses. We give you the text to put in your contracts regarding ICT security. | No Legal Help. The platform tells you “Supplier is critical” but doesn’t give you the legal language to fix the risk. |
| Cost | One-off fee. Pay once. Secure your entire stack. | Add-on fees. Supply chain risk management is often a premium add-on costing thousands extra per year. |
Why Your AI Supply Chain Presents Unique Security Challenges
An AI company’s supply chain is more complex than a standard SaaS business’s, involving specialised suppliers for data, compute, and models.
Data Sourcing and Annotation
Partners providing datasets or data labelling services are part of your ICT supply chain. The risk is amplified because annotation partners often handle raw, unsanitised data. Their security posture is a direct extension of yours. If they have malware, you have malware.
Model Development and Training
This includes cloud providers offering specialised GPU resources and vendors of pre-trained foundation models. A key risk is model poisoning: tampered pre-trained models can subtly corrupt your results or exfiltrate proprietary data through hidden channels.
Deployment and Operations
Once deployed, you rely on hosting services. This introduces fourth-party risk: your cloud provider (third party) may rely on a specific hardware vendor (fourth party) for its GPUs. A firmware vulnerability in that hardware impacts you.
Your Action Plan for Securing the AI Supply Chain
Compliance with Annex A 5.21 is the operational blueprint for de-risking your innovation.
Establish Your Governance Framework
Draft a comprehensive set of information security standards for ICT suppliers. This isn’t just about “uptime”; it’s about patch management, encryption standards, and SBOM delivery.
Scrutinise Your AI Partners and Tools
- Demand Transparency: Mandate a Software Bill of Materials (SBOM) for any third-party model or library.
- Identify Critical Components: Map out which libraries are “load bearing.” If pandas breaks, does your product break?
- Verify Secure Implementation: Don’t just accept a product; verify its secure configuration. Require hardening guides for APIs or platforms.
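To make “demand transparency” concrete, here is a hedged sketch of generating a minimal, CycloneDX-flavoured SBOM of a Python environment using only the standard library. Production teams would normally reach for a dedicated generator such as cyclonedx-bom or syft; this just shows the shape of the data you should be able to hand an auditor or a client.

```python
import json
from importlib import metadata

def build_sbom() -> dict:
    """Minimal CycloneDX-style inventory of the current environment."""
    components = sorted(
        ({"type": "library",
          "name": dist.metadata["Name"],
          "version": dist.version}
         for dist in metadata.distributions()),
        key=lambda c: (c["name"] or "").lower(),
    )
    return {"bomFormat": "CycloneDX", "specVersion": "1.5",
            "components": components}

sbom = build_sbom()
print(f"{len(sbom['components'])} components inventoried")
print(json.dumps(sbom["components"][:2], indent=2))  # preview first two
```

Commit the JSON output per release and you have a versioned answer to “what exactly ships in your training pipeline?”.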
Enforce Security Through Agreements and Monitoring
Include specific clauses in your agreements. Require suppliers to propagate your security standards to their subcontractors (your fourth parties).
The Evidence Locker: What the Auditor Needs to See
Auditors want proof that you are managing the tech stack, not just buying it. Prepare these artifacts:
- ICT Supply Chain Risk Assessment (Excel): A specific risk assessment focusing on technical vendors (AWS, GitHub, Slack).
- Approved Technology List (PDF): A list of “Greenlit” software and hardware. “If it’s not on the list, you can’t install it.”
- SLA Monitoring Logs (Screenshots): Evidence that you check if AWS hit their 99.9% uptime target.
- Patch Compliance Reports: Proof that you ensure your managed service providers are patching their servers.
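The Approved Technology List only enforces itself once it is machine-checkable. A toy sketch (the dependency and allowlist names are made up) of the “if it’s not on the list, you can’t install it” rule:

```python
def find_unapproved(dependencies: list[str], approved: set[str]) -> list[str]:
    """Names in `dependencies` missing from the approved set (case-insensitive)."""
    approved_lower = {a.lower() for a in approved}
    return sorted(d for d in dependencies if d.lower() not in approved_lower)

# Hypothetical project dependencies vs. a hypothetical greenlit list.
deps = ["torch", "numpy", "shiny-new-llm-client"]
approved = {"torch", "numpy", "pandas"}

print(find_unapproved(deps, approved))  # -> ['shiny-new-llm-client']
```

Anything printed is either a policy violation to remove or a supplier to formally onboard; either way, the auditor sees a managed process rather than shadow tooling.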
Common Pitfalls & Auditor Traps
Here are the top 3 ways AI companies fail this control:
- The “Open Source” Blind Spot: You treat paid vendors as suppliers but ignore open-source libraries. Open source is a supplier. You need a process to vet libraries (e.g., using Snyk or Dependabot).
- The “Terms of Service” Cop-out: You say “We use Google, we can’t change their contract.” The auditor accepts this only if you have assessed the risk of accepting their standard terms. If you haven’t documented that risk acceptance, you fail.
- The “Shadow AI” Tool: Developers are using a new code generation tool that isn’t on the Approved Technology List. This is an unmanaged ICT supplier.
Handling Exceptions: The “Break Glass” Protocol
Sometimes a library is deprecated, and you have to switch to an unverified fork to keep production running.
The “Unverified Code” Workflow:
- Trigger: Critical dependency failure requiring immediate replacement.
- Action: Engineering Lead approves use of unvetted library.
- Constraint: Code is sandboxed. Network access is restricted.
- Review: Full code audit scheduled within 7 days.
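As an illustration of the “sandboxed, network restricted” constraint, the sketch below blocks outbound sockets in-process before unvetted code is imported. This is defence-in-depth only, not real isolation; proper sandboxing belongs at the container or VM layer (no-network namespaces, egress firewalls).

```python
import socket

class NetworkBlocked(RuntimeError):
    """Raised when quarantined code attempts an outbound connection."""

def block_network() -> None:
    """Crudely disable new sockets for the rest of this process."""
    def _deny(*args, **kwargs):
        raise NetworkBlocked("outbound network disabled during review window")
    socket.socket = _deny             # new sockets fail immediately
    socket.create_connection = _deny  # convenience helper fails too

block_network()
# ...import the unvetted fork after this point...

try:
    socket.create_connection(("example.com", 443))
except NetworkBlocked as exc:
    print(f"blocked: {exc}")
```

A library that was quietly phoning home now fails loudly instead, buying you the seven days until the full code audit.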
The Process Layer: “The Standard Operating Procedure (SOP)”
How to operationalise Annex A 5.21 using your existing stack (GitHub, Linear, Notion).
- Step 1: Request (Manual). Dev requests new library/tool via Linear.
- Step 2: Automated Scan (Automated). GitHub Action/Snyk scans the library for known CVEs.
- Step 3: Approval (Manual). Tech Lead reviews scan results. If clean, approves merge.
- Step 4: Register (Manual). Tool is added to the “Approved Tech Stack” page in Notion.
- Step 5: Monitor (Automated). Automated dependency bot alerts if a new vulnerability is discovered in the future.
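Step 2 can be sketched against the public OSV.dev vulnerability database, a free, scriptable alternative to a commercial scanner. The endpoint and payload shape below are the real OSV v1 query API; the package name is just an example, and the network call is left to the caller so the check can run in a gated CI job.

```python
import json
import urllib.request

OSV_URL = "https://api.osv.dev/v1/query"

def build_osv_query(name: str, version: str) -> dict:
    """Payload shape expected by the OSV v1 query endpoint (PyPI ecosystem)."""
    return {"version": version,
            "package": {"name": name, "ecosystem": "PyPI"}}

def known_vulns(name: str, version: str) -> list[str]:
    """Return OSV IDs of known vulnerabilities for a PyPI release."""
    payload = json.dumps(build_osv_query(name, version)).encode()
    req = urllib.request.Request(
        OSV_URL, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = json.load(resp)
    return [v["id"] for v in body.get("vulns", [])]

# In CI you would call, e.g.:
#   ids = known_vulns("torch", "2.3.1")
#   if ids: fail the check and open a review ticket in Linear
print(build_osv_query("torch", "2.3.1"))
```

A non-empty ID list fails the approval gate at Step 3; an empty list is the scan evidence you file for the audit trail.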
By implementing a systematic approach to ICT supply chain security—from establishing a robust governance policy to enforcing standards through monitoring—you can protect your core assets. The High Table ISO 27001 Toolkit provides the templates you need to document this without reinventing the wheel.
ISO 27001 Annex A 5.21 for AI Companies FAQ
What is ISO 27001 Annex A 5.21 for AI companies?
ISO 27001 Annex A 5.21 requires AI companies to manage security throughout the ICT supply chain. For AI firms, this involves ensuring that all software and hardware dependencies, such as GPU clusters, LLM APIs, and open-source libraries, are secure to prevent supply chain attacks and model compromises.
Why is ICT supply chain management critical for AI startups?
It is critical because AI models rely on complex stacks of external infrastructure and code. With 62% of system intrusions originating in the supply chain, AI startups must monitor vendors like cloud providers and API-based LLMs to ensure that vulnerabilities do not compromise model weights or proprietary datasets.
How should AI firms monitor ICT suppliers for compliance?
AI firms should implement a multi-layered monitoring strategy for their tech stack to satisfy Annex A 5.21. Recommended steps include:
- Software Bill of Materials (SBOM): Maintaining a complete list of open-source components used in the AI training pipeline.
- API Security Reviews: Conducting monthly audits of tokens and endpoints used to connect with LLM providers like OpenAI or Anthropic.
- Hardware Verification: Ensuring GPU and server providers offer cryptographic assurance of hardware integrity and firmware security.
- SLA Enforcement: Monitoring uptime and security patch compliance for all MLOps platforms and cloud-based compute environments.
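The SLA-enforcement step reduces to simple arithmetic: convert recorded downtime into an uptime percentage and compare it against the contractual target. The figures below are illustrative.

```python
def uptime_percent(downtime_minutes: float, days_in_month: int = 30) -> float:
    """Uptime as a percentage of total minutes in the billing month."""
    total = days_in_month * 24 * 60
    return 100.0 * (total - downtime_minutes) / total

target = 99.9  # e.g. a "three nines" commitment from a compute provider
observed = uptime_percent(downtime_minutes=50)
print(f"{observed:.3f}% (breach: {observed < target})")
# -> 99.884% (breach: True)
```

Logging this comparison each month is exactly the “SLA monitoring” evidence an auditor asks for, and it tells you when a service credit is owed.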
How does Annex A 5.21 relate to the EU AI Act?
Annex A 5.21 aligns with the EU AI Act’s focus on transparency and safety in high-risk AI systems. By managing the ICT supply chain, firms ensure their upstream components meet the rigorous safety standards required for certification, reducing the risk of global fines that can reach €35 million.
What evidence is needed for an Annex A 5.21 audit?
Auditors require documented proof of ICT supply chain oversight. Essential evidence includes an ICT Supplier Register, recent SBOM reports for AI software, third-party security audit reports (e.g., SOC 2 Type 2), and evidence of periodic risk assessments for critical infrastructure providers like AWS, Azure, or GCP.