The convergence of the Digital Operational Resilience Act (DORA) and the explosive adoption of Generative AI (GenAI) represents a singular event in the history of financial technology regulation. As the January 17, 2025, compliance deadline passes, the European financial sector faces a stark reality: the architectural paradigms currently driving GenAI innovation are fundamentally incompatible with the operational resilience mandates of DORA. While financial institutions have spent the last two years aggressively deploying Large Language Models (LLMs) to gain competitive advantages in customer service, fraud detection, and code generation, the regulatory framework governing these deployments has shifted from a focus on data privacy (GDPR) to a relentless focus on supply chain resilience, substitutability, and deterministic control.
The central thesis of this report is that up to 90% of current GenAI pilots — specifically those relying on opaque, multi-layered API dependencies and proprietary foundational models — will fail to meet the audit standards set forth in DORA Article 28.1 This failure will not stem from a lack of cybersecurity intent but from a structural inability to verify the “Nth-party” supply chain, guarantee “functional equivalence” during vendor exit, and maintain the granular “Register of Information” required for all critical ICT services.3
DORA effectively reclassifies AI not as a magical innovation, but as a standard ICT service subject to rigorous third-party risk management (TPRM). This reclassification exposes the fragility of the current “wrapper” architecture, where financial applications are often thin layers of logic atop a deep, invisible stack of vector databases, orchestration libraries, and cloud inference endpoints managed by US-based hyperscalers. The regulation demands what GenAI struggles to provide: traceability by design, total visibility into sub-outsourcing, and the proven ability to decouple from a specific vendor without operational degradation.5
This document serves as an exhaustive breakdown of the collision between DORA Article 28 and GenAI architectures. It details the specific audit traps inherent in modern AI stacks, the legal impossibility of “functional equivalence” in probabilistic systems, and the technical remediation strategies required to survive the 2025 audit cycle.
Part I: The Regulatory Architecture of DORA
To understand the magnitude of the compliance gap, one must first dissect the regulatory architecture of DORA, moving beyond the high-level pillars to the granular requirements of the Implementing Technical Standards (ITS) and Regulatory Technical Standards (RTS). Unlike previous guidelines which were often “comply or explain,” DORA is a Regulation, directly applicable and enforceable across all EU member states, carrying the weight of significant financial penalties and, more critically, the power of regulators to order the cessation of non-compliant critical functions.
1.1 The Shift from Security to Resilience
Historically, financial regulation focused on information security — confidentiality, integrity, and availability. DORA shifts the paradigm to operational resilience. The distinction is profound. Security asks, “Can you stop the attacker?” Resilience asks, “When the attacker succeeds, or the vendor fails, or the cable is cut, can you continue to deliver your critical economic functions?”.7
For GenAI, this means the auditor is less interested in the firewall protecting the model and more interested in the continuity of the service the model provides. If a bank uses a GenAI chatbot for first-line customer support (a critical function), and the model provider (e.g., OpenAI, Anthropic) suffers an outage or a regulatory ban, the bank must demonstrate it can sustain that customer support function. The “move fast and break things” ethos of AI development is diametrically opposed to the “sustain and endure” mandate of resilience.9
1.2 Article 28: The Core of the Conflict
Article 28, “General principles for the management of ICT third-party risk,” is the weaponization of supply chain governance. It mandates that financial entities must control their ICT providers as rigorously as they control their internal departments.
The requirements of Article 28 can be categorized into four primary vectors of conflict for GenAI:
- The Register of Information (RoI): The absolute accounting of every node in the supply chain.4
- Sub-outsourcing Visibility: The requirement to know and approve the vendor’s vendors.6
- Location & Sovereignty: The requirement to identify exactly where data is processed.10
- Exit Strategy & Portability: The requirement to be able to leave a vendor without losing functionality.3
Each of these vectors presents a unique challenge to the current way GenAI systems are architected and sold. The market is dominated by “black box” APIs where the provider abstracts away the complexity — precisely the complexity DORA demands be made visible.
1.3 The Concept of Critical ICT Third-Party Providers (CTPPs)
A pivot point in DORA is the designation of Critical ICT Third-Party Providers (CTPPs). The European Supervisory Authorities (ESAs) have the power to designate specific providers as “critical” based on their systemic importance to the financial sector.8
Table 1: Criteria for CTPP Designation and Relevance to GenAI
| CTPP Designation Criteria | Relevance to GenAI Ecosystem |
| Systemic Impact | If a foundational model (e.g., GPT-4) fails, does it impact a significant number of financial entities? Given the concentration of the market, the answer is increasingly yes. |
| Substitutability | Is it difficult to switch to another provider? GenAI models have low substitutability due to proprietary prompt engineering and non-standard APIs.3 |
| Reliance on Infrastructure | Does the provider rely on critical infrastructure? Most AI providers rely heavily on specific GPU clusters within hyperscalers (Azure/AWS), creating a concentrated dependency risk. |
If a bank’s GenAI pilot relies on a provider designated as a CTPP, the bank faces heightened scrutiny. However, even if the provider is not a CTPP, the bank must treat them with high rigor if the function they support is critical to the bank. The dangerous misunderstanding in the market is that “experimental” pilots are exempt. Once a pilot touches real customer data or influences a financial decision (credit, claims, code deployment), it falls under DORA’s scope.9
Part II: The Anatomy of the Compliance Failure
The projection that 90% of pilots will fail audits is derived from an analysis of the specific data fields and contractual clauses required by Article 28 versus the reality of current GenAI contracts and architectures.
2.1 The Register of Information (RoI) Gap
Article 28(3) mandates the maintenance of a Register of Information. This is not a simple vendor list; it is a complex, relational database of contracts, functions, and supply chains. The EBA and ESMA have released templates for this register which require granular data that most AI project teams simply do not possess.12
2.1.1 The LEI Challenge
The RoI requires the Legal Entity Identifier (LEI) for all ICT third-party service providers. In the convoluted corporate structures of AI startups and major tech firms, identifying the specific legal entity providing the service can be difficult. Is the contract with “Microsoft Ireland Operations Ltd” or “Microsoft Corporation (USA)”? For a startup wrapping an LLM, does the bank have the LEI of the startup and the LEI of the underlying model provider (the sub-outsourcing chain)?.14
2.1.2 Function Criticality and Mapping
The register requires mapping each ICT service to the specific “Critical or Important Function” (CIF) it supports.
- The Trap: Banks often classify GenAI as “General IT Support” to avoid scrutiny.
- The Audit: If an auditor sees an AI tool summarizing “suspicious activity reports” (AML function) but classified as “General IT,” this is a finding of misclassification and a failure of governance.15
- Data Fields: The ITS templates require fields such as “Data Location,” “Governing Law,” and “Notice Period.” For API-based GenAI services, “Data Location” is often dynamic (e.g., “Any available region in EU”). DORA auditors, however, expect static, deterministic answers. If the contract says “EU or US based on availability,” this flag raises a sovereignty risk that must be mitigated.13
2.2 The Sub-Outsourcing Black Hole
Article 28(4) is perhaps the most lethal provision for GenAI compliance. It states that contractual arrangements must specify whether sub-outsourcing of a critical function is permitted and the conditions that apply.6
2.2.1 The Nth-Party Dependency Chain
Modern GenAI applications are built on a “stack of cards” dependency model:
- The Application: A “Compliance Chatbot” built by a Fintech.
- Orchestration: Uses LangChain or Semantic Kernel (software dependency).
- Memory: Uses Pinecone or Weaviate (SaaS vector database) for RAG.
- Intelligence: Uses OpenAI or Anthropic (Model API) for inference.
- Compute: The Model API runs on Azure or AWS GPU clusters.
Under DORA, if the Compliance Chatbot supports a critical function, the Bank must have visibility into this entire chain. The Bank must ensure that the Fintech (1st party) monitors Pinecone (2nd party) and OpenAI (2nd party).
- The Failure Point: Most SaaS contracts for GenAI tools have broad sub-outsourcing clauses allowing the vendor to “use commercially reasonable efforts” to select sub-processors. DORA requires the Bank to have the right to object or terminate if a sub-processor changes materially.17
- Scenario: The Fintech switches its vector database from an EU-hosted instance to a cheaper US-hosted instance. Under DORA, the Bank must be notified. In reality, this happens silently in the backend. This silence is a compliance breach.18
2.3 The “Right to Audit” Fiction
Article 28(6) grants financial entities the right to audit their providers. In the context of GenAI, this creates a practical absurdity that regulators are only beginning to address.
- The Impossibility: A mid-sized European bank cannot physically audit a hyperscaler’s data center or demand to inspect the weights of GPT-4.
- The Pooled Audit: DORA suggests “pooled audits” where multiple banks join forces. While valid for cloud infrastructure, how does one audit a probabilistic model? An audit of an LLM requires testing for bias, safety, and data leakage.
- The Evidence Gap: When the DORA auditor asks, “Show me the evidence that your AI provider deletes your data after inference,” the bank will likely produce a Terms of Service URL. The auditor will reject this. They require a SOC 2 Type II report specifically covering the AI service’s data retention controls, or a specific audit certification. Most GenAI providers are still catching up on these specific assurance reports.19
Part III: The Exit Strategy Paradox and Functional Equivalence
The most technically damning aspect of DORA for GenAI is Article 28(8), which mandates a comprehensive exit strategy ensuring “functional equivalence”.3
3.1 Defining Functional Equivalence
“Functional equivalence” means that if the bank is forced to exit a contract (e.g., Vendor A goes bankrupt or is sanctioned), it can move the service to Vendor B or in-house, and the system will perform “materially the same”.21
- Deterministic Systems: In a SQL database migration, functional equivalence is proven by hashing the data and running test queries. If the results match, you have equivalence.
- Probabilistic Systems: In GenAI, functional equivalence is mathematically impossible to guarantee between different model families. A prompt that generates a perfect summary on GPT-4 may generate a hallucination on Claude 3 or Llama 3. The “latent space” of these models is different.
3.2 The Vendor Lock-in Trap
GenAI development relies heavily on “Prompt Engineering” — the art of crafting inputs to get the desired output from a specific model. This creates massive technical debt and vendor lock-in.
- The Audit Trap: The auditor asks, “What is your exit plan for your Credit Analysis Bot?”
- The Bank’s Response: “We will switch to an open-source model like Llama 3.”
- The Audit Test: The auditor requires proof. “Show me the test results where you ran your credit analysis prompts through Llama 3. Did the risk scores match? Did the summaries contain the same key facts?”.3
- The Reality: They will not match. To achieve functional equivalence, the bank would need to have essentially rebuilt the application for the second model in parallel. Without this “Multi-Model Architecture” from day one, the exit strategy is a lie, and the audit is failed.22
3.3 Data Portability and “The Knowledge”
DORA requires data portability.23 In GenAI, “data” is not just the documents; it is the embeddings (vector representations) of those documents.
- Incompatibility: Vector embeddings are model-specific. An embedding generated by OpenAI’s
text-embedding-3-smallis a vector of 1536 dimensions. It is mathematically meaningless to a system using Cohere’s embedding model. - The Migration Nightmare: To exit a vendor, the bank cannot just “move” the data. It must re-process and re-embed its entire knowledge base. This takes time and compute power. If the exit plan says “transition in 48 hours” but the re-embedding process takes 2 weeks, the plan is non-compliant.22
Part IV: Traceability by Design and the Black Box
DORA places immense weight on ICT risk management (Article 6) and the ability to reconstruct events (Article 12). For AI, this translates to “Traceability by Design”.5
4.1 The ISO 8000 Connection
While DORA doesn’t explicitly name ISO 8000, it references data integrity and quality standards. ISO 8000 (Data Quality) principles are the likely benchmark auditors will use. This requires that data is “provenance-aware.”
- The Gap: Most GenAI pilots log the user prompt and the final answer.
- The Requirement: To meet professional audit standards, the system must log:
- The Prompt: Exact text sent to the model.
- The Context: The specific chunks of data retrieved from the RAG system (and their source document IDs).
- The Parameters: Temperature, Top-P, System Prompt version.
- The Model: Specific version hash (e.g.,
gpt-4-0613not justgpt-4). - The Output: The raw generation.
Without this metadata, a forensic investigation into an AI error (e.g., a hallucinated interest rate) is impossible. DORA views untraceable systems as “unmanaged risk”.26
4.2 OpenTelemetry as the Compliance Artifact
The technical solution to this — and the expectation of technically savvy auditors — is the implementation of observability standards like OpenTelemetry.28
- Implementation: Banks must instrument their AI stacks to emit “spans” for every step of the AI chain (retrieval -> ranking -> inference -> guardrail check).
- The Audit Artifact: These traces form the “Audit Trail.” Instead of screenshots or logs, the bank provides a queryable dataset of AI behavior. “Show me every time the AI refused to answer due to safety guardrails in Q3.” If the bank cannot answer this query, they lack the “continuous monitoring” required by DORA.30
4.3 The “Black Box” Liability
The “Black Box” nature of LLMs creates a liability shield problem. If a bank relies on an AI for a decision, and that AI is a black box, the bank is accepting unlimited liability for a process it cannot explain.31
- Legal Opinion: Banks may need to obtain legal opinions confirming that their use of “black box” AI does not violate their fiduciary duties or consumer protection laws, which often require “explainability” of decisions. DORA reinforces this by demanding that ICT risks (including the risk of inexplicability) be “minimized.” Relying on a model where the vendor refuses to disclose training data or weights arguably fails the minimization test.33
Part V: Data Sovereignty and the Geopolitical Trap
DORA acts as a geopolitical filter. While it focuses on resilience, its requirements on data location and control effectively mandate a degree of “Digital Sovereignty” that clashes with the US-centric AI ecosystem.35
5.1 The US Cloud Act vs. DORA
DORA requires financial entities to identify the location of data processing.10 The complexity arises from the US Cloud Act, which allows US authorities to subpoena data held by US companies (like Microsoft, Amazon, Google) regardless of where that data is physically stored (even in the EU).
- The Conflict: If a European bank uses Azure OpenAI (hosted in Amsterdam), is the data sovereign? Technically, yes. Legally, it is subject to US extraterritorial reach.
- DORA’s View: While DORA doesn’t explicitly ban US providers, it classifies this extraterritorial reach as a “Risk.” The bank must assess this risk. If the function is critical (e.g., central banking operations, sovereign debt analysis), the risk of US interference might be deemed “unacceptable” by the regulator, forcing a move to a purely European provider (e.g., Mistral on OVHcloud).35
5.2 Sovereign AI Deployments
To mitigate this, sophisticated institutions are moving towards “Sovereign AI” architectures.
- On-Premise: Running open-weights models (Llama 3, Mistral) on internal hardware.
- Private Cloud: Using “Dedicated Instances” where the model weights are frozen and the hardware is isolated.
- Compliance Benefit: This eliminates the “Sub-outsourcing” risk of the public API (where the vendor changes models) and solves the Data Residency issue completely.9
Part VI: The Audit Simulation (2025)
To visualize the failure, let us simulate a DORA audit for a hypothetical “GenAI Credit Assistant” in mid-2025.
Auditor: “Please provide the Register of Information entry for your Credit Assistant.”
Bank: “Here it is. Provider: AI-Startup-X. Function: Credit Support.”
Auditor: “I see AI-Startup-X uses OpenAI. Who is the sub-processor for the vector database?”
Bank: “We believe it is Pinecone, but it’s managed by the startup.”
Auditor: “Do you have the contract clause where AI-Startup-X is required to notify you before changing that vector database provider?”
Bank: “No, that’s their internal architecture.”
Finding 1: Failure to monitor critical sub-outsourcing (Article 28(4)).
Auditor: “This assistant summarizes borrower financials. What happens if OpenAI is down?”
Bank: “We wait for it to come back up.”
Auditor: “This is a critical function supporting loan approval. Waiting is not a resilience strategy. Show me your exit plan.”
Bank: “We would switch to Gemini.”
Auditor: “Show me the test report proving functional equivalence between the OpenAI summaries and Gemini summaries.”
Bank: “We haven’t run that test yet.”
Finding 2: Lack of tested exit strategy and functional equivalence (Article 28(8)).
Auditor: “How do you trace why the AI recommended ‘Reject’ on Loan #12345?”
Bank: “We have the chat log.”
Auditor: “I need the system prompt active at that moment and the specific RAG documents retrieved. Was the borrower’s ‘Income Statement 2024’ retrieved, or the ‘Draft 2023’?”
Bank: “Our logs don’t show the retrieval chunks.”
Finding 3: Insufficient traceability and data integrity controls (ICT Risk Management Framework).
Result: Major Non-Compliance. The system must be shut down or remediated immediately.
Part VII: Remediation and Strategy – The Path to “Compliance by Design”
The 90% failure rate is not inevitable. It is a function of current architectural choices. To survive 2025, financial institutions must adopt a “Compliance by Design” approach for GenAI.
7.1 The “GenAI Gateway” Pattern
Institutions must stop connecting applications directly to Model APIs. Instead, they must deploy a “GenAI Gateway” or “Model Garden” infrastructure.37
- Centralized Control: All AI traffic flows through this gateway.
- Standardized Logging: The gateway handles OpenTelemetry logging, ensuring every request is traced.30
- Model Abstraction: The gateway exposes a generic API (e.g.,
generate_summary()) to internal developers, while the gateway handles the specific API calls to OpenAI, Anthropic, or Mistral. - Failover Logic: The gateway can detect if Provider A is down and automatically route to Provider B, maintaining resilience. This fulfills the “Exit Strategy” and “Continuity” requirements technically.
7.2 The “Model Agnostic” Development Mandate
Development teams must be mandated to write prompts and code that are “Model Agnostic.”
- The Test: Every feature must be tested against at least two distinct model families (e.g., GPT and Llama) before deployment.
- Evidence: These test reports serve as the “Proof of Substitutability” for the DORA audit.
7.3 Renegotiating Contracts (The “DORA Addendum”)
Procurement teams must enforce a “DORA Addendum” for all AI vendors.
- Sub-outsourcing Veto: The bank must have the right to terminate if the vendor changes critical sub-processors.
- Assistance on Exit: The vendor must be contractually obligated to assist in exporting data (including embeddings in a standard format) upon termination.3
- Insurance: Vendors must carry liability insurance commensurate with the risks of their “black box” failing.
7.4 Threat-Led Penetration Testing (TLPT) for AI
Resilience testing (Article 26) must evolve. Standard pen-tests are insufficient.
- Red Teaming: Institutions must employ “AI Red Teams” to attempt prompt injection, data extraction, and denial-of-wallet attacks (driving up API costs to exhaust budgets).
- The Artifact: The TLPT report must specifically address AI-specific vulnerabilities. An audit report showing “No SQL Injection found” is irrelevant to an LLM; the report must say “No Prompt Leakage found”.39
Conclusion
The year 2025 will be a watershed moment for AI in finance. DORA effectively ends the “wild west” era of GenAI pilots. The regulation imposes a maturity model on the industry that currently exceeds the capabilities of most vendors and internal teams.
The 90% of pilots that will fail are those that function as “wrappers” around third-party black boxes, with no visibility, no portability, and no resilience. The 10% that succeed will be those that treat GenAI as a critical component of their infrastructure — architecting for sovereignty, building abstraction layers for portability, and logging every probabilistic event with deterministic rigour.
DORA does not ban GenAI. It bans unaccountable GenAI. The friction between the fluid, opaque nature of neural networks and the rigid, transparent demands of financial regulation is the defining challenge of the next decade. Financial institutions must choose: innovate within the constraints of resilience, or face the “compliance cliff” where their most promising technologies are deemed illegal to operate.
Appendix A: DORA Audit Readiness Checklist for GenAI
Table 2: High-Priority Audit Checks for GenAI Systems
| Audit Domain | Specific Requirement | Evidence Required | Failure Indicator |
| Inventory | Article 28(3): Register of Information | Excel/XML file matching EBA ITS standards, including LEIs for all model providers. | “Shadow AI” spending found in accounts payable but not in RoI. |
| Sub-outsourcing | Article 28(4): Monitoring of Nth parties | Contract clauses allowing veto/termination on sub-processor changes. Risk assessments of the vendor’s cloud provider. | Vendor contract says “We may use sub-processors at our discretion.” |
| Exit Strategy | Article 28(8): Portability & Transition | A documented, tested plan to switch models within a defined RTO (Recovery Time Objective). | Plan relies on “retraining from scratch” but has no GPU capacity reserved. |
| Traceability | ICT Risk Framework: Data Integrity | Logs linking Input -> RAG Context -> System Prompt -> Model Version -> Output. | Logs only show “User said X, Bot said Y” without metadata. |
| Resilience | Article 24: Business Continuity | Proof of failover capability (e.g., auto-switching to backup model during outage). | System goes offline when OpenAI API has latency. |
| Sovereignty | Data Localization | Data flow diagrams showing processing locations. Legal assessment of US Cloud Act risks. | “Region: Global” or “US-East” in configuration files for EU customer data. |
Works cited:
- Radovan’s Blog · Identity Wizard, accessed December 16, 2025, https://dracones.ideosystem.com/blog/
- What is Third-Party Risk Management (TPRM)? – Panorays, accessed December 16, 2025, https://panorays.com/blog/third-party-risk-management/
- Exit Strategy Requirements: EU DORA Compliance for Data Platforms – Airbyte, accessed December 16, 2025, https://airbyte.com/data-engineering-resources/exit-strategy-eu-dora-compliance
- DORA: Reporting DORA registers of information in April 2025 | De Nederlandsche Bank, accessed December 16, 2025, https://www.dnb.nl/en/sector-news/supervision-2025/dora-reporting-dora-registers-of-information-in-april-2025/
- FullStackAI Platform How – Iguana Solutions, accessed December 16, 2025, https://www.ig1.com/fullstackai-platform-how/
- Key Themes of Resiliency, Outsourcing, and Third-Party Risk Management Regimes, Journal of Securities Operations & Custody – Morgan Lewis, accessed December 16, 2025, https://www.morganlewis.com/-/media/files/publication/outside-publication/article/2025/key-themes-of-resiliency-outsourcing-and-third-party-risk-management-regimes.pdf?rev=-1&hash=5F542572D948FC93FDB7849C9E5CF28B
- Managing Digital Operational Resilience Act (DORA) Compliance – Informatica, accessed December 16, 2025, https://www.informatica.com/resources/articles/digital-operational-resilience-act-compliance.html
- What Is the Digital Operational Resilience Act (DORA)? | Proofpoint US, accessed December 16, 2025, https://www.proofpoint.com/us/threat-reference/digital-operational-resilience-act-dora
- Deploying Generative AI Under DORA: Ensuring Compliance and Resilience with Kosmoy, accessed December 16, 2025, https://www.kosmoy.com/post/deploying-generative-ai-under-dora-ensuring-compliance-and-resilience-with-kosmoy
- DORA Contract Compliance Guide for Financial Firms – 3rdRisk, accessed December 16, 2025, https://www.3rdrisk.com/blog/dora-compliant-contracts
- DORA FAQs – Cloudflare, accessed December 16, 2025, https://www.cloudflare.com/trust-hub/compliance-resources/dora/
- Preparations for reporting of DORA registers of information – European Banking Authority, accessed December 16, 2025, https://www.eba.europa.eu/activities/direct-supervision-and-oversight/digital-operational-resilience-act/preparation-dora-application
- Understanding the DORA Register of Information: A Complete Beginner’s Guide, accessed December 16, 2025, https://www.portotheme.com/understanding-the-dora-register-of-information-a-complete-beginners-guide/
- CSSF guide concerning the submission of DORA Register of information, accessed December 16, 2025, https://www.cssf.lu/wp-content/uploads/CSSF-guide-for-submission-of-RoI.pdf
- DORA outsourcing: Key rules and best practices – Copla, accessed December 16, 2025, https://copla.com/blog/compliance-regulations/navigating-dora-outsourcing-requirements-regulations-guidelines-and-best-practices-for-critical-and-cloud-outsourcing/
- DORA Monitor – Annerton, accessed December 16, 2025, https://annerton.com/wp-content/uploads/250801-Annerton-Dora-Monitor-EN.pdf
- Guidelines – | European Securities and Markets Authority, accessed December 16, 2025, https://www.esma.europa.eu/sites/default/files/2025-09/ESMA65-294529287-4737_Guidelines_on_outsourcing_to_cloud_service_providers.pdf
- Planning Priorities 2024: Investment Management and Private Equity | Deloitte UK, accessed December 16, 2025, https://www.deloitte.com/uk/en/services/consulting-risk/perspectives/investment-management-and-private-equity.html
- ECB consults on guide for outsourcing of cloud services – Moody’s, accessed December 16, 2025, https://www.moodys.com/web/en/us/insights/regulatory-news/ecb-consults-on-guide-for-outsourcing-of-cloud-services.html
- From Gaps to Compliance: A Practical Guide to Readiness For The Digital Operational Resilience Act (DORA) – Quod Orbis, accessed December 16, 2025, https://www.quodorbis.com/from-gaps-to-compliance-a-practical-guide-to-readiness-for-the-digital-operational-resilience-act-dora/
- EN EN PROVISIONAL AGREEMENT RESULTING FROM INTERINSTITUTIONAL NEGOTIATIONS – European Parliament, accessed December 16, 2025, https://www.europarl.europa.eu/RegData/commissions/itre/inag/2023/07-14/ITRE_AG(2023)751822_EN.pdf
- EU DORA Regulations: Granular Exit Plans for Data Integration Platforms – Airbyte, accessed December 16, 2025, https://airbyte.com/data-engineering-resources/eu-dora-regulations-granular-exit-plans
- EU Data Act operational impacts: Compliance and technical considerations of cloud switching | IAPP, accessed December 16, 2025, https://iapp.org/news/a/eu-data-act-operational-impacts-compliance-and-technical-considerations-of-cloud-switching
- EU Data Act + DORA: Cloud Exit & Portability for Financial Services – A-Team Insight, accessed December 16, 2025, https://a-teaminsight.com/blog/eu-data-act-dora-cloud-exit-portability-for-financial-services/
- Can Financial Institutions Modernise Without Losing Control …, accessed December 16, 2025, https://nexusfrontier.tech/can-financial-institutions-modernise-without-losing-control/
- India’s AI Governance Guidelines 2025 – AIGN, accessed December 16, 2025, https://aign.global/ai-governance-insights/aign-global/indias-ai-governance-guidelines-2025/
- ISO 8000 and Data Extraction – Building Reliable Automation With Data Quality Standards – Parseur, accessed December 16, 2025, https://parseur.com/blog/iso-8000
- Best practices for storing OpenTelemetry Collector data | Engineering | ClickHouse Resource Hub, accessed December 16, 2025, https://clickhouse.com/resources/engineering/best-resources-storing-opentelemetry-collector-data
- The AI Audit Trail: How to Ensure Compliance and Transparency with LLM Observability | by Kuldeep Paul | Oct, 2025 | Medium, accessed December 16, 2025, https://medium.com/@kuldeep.paul08/the-ai-audit-trail-how-to-ensure-compliance-and-transparency-with-llm-observability-74fd5f1968ef
- The rise of agentic AI part 7: introducing data governance and audit trails for AI services – Dynatrace, accessed December 16, 2025, https://www.dynatrace.com/news/blog/the-rise-of-agentic-ai-part-7-introducing-data-governance-and-audit-trails-for-ai-services/
- Protecting the Blueprint of Life: Navigating the Cybersecurity and Privacy Frontier of Genomic Data – Compliance Hub Wiki, accessed December 16, 2025, https://www.compliancehub.wiki/protecting-the-blueprint-of-life-navigating-the-cybersecurity-and-privacy-frontier-of-genomic-data/
- Blog | Magistral | Operations Outsourcing for Global Financial Services’ Industry, accessed December 16, 2025, https://magistralconsulting.com/blog/
- Derivatives 2025 – Katten Muchin Rosenman LLP, accessed December 16, 2025, https://katten.com/files/2167224_derivatives_2025.pdf
- Gibson Dunn | Europe | Data Protection – April 2025, accessed December 16, 2025, https://www.gibsondunn.com/gibson-dunn-europe-data-protection-april-2025/
- Data Sovereignty Drives Enterprise IT Decisions – Nutanix, accessed December 16, 2025, https://www.nutanix.com/theforecastbynutanix/business/data-sovereignty-drives-enterprise-it-decisions
- AI, Data Sovereignty, and Compliance – ServiceNow Blog, accessed December 16, 2025, https://www.servicenow.com/uk/blogs/2025/ai-data-sovereignty-compliance
- Streamline AI operations with the Multi-Provider Generative AI Gateway reference architecture | Artificial Intelligence – AWS, accessed December 16, 2025, https://aws.amazon.com/blogs/machine-learning/streamline-ai-operations-with-the-multi-provider-generative-ai-gateway-reference-architecture/
- From Noise to Insight: Building a GenAI Fabric (DGF™) for Finance & Risk – Dawgen Global, accessed December 16, 2025, https://www.dawgen.global/from-noise-to-insight-building-a-genai-fabric-dgf-for-finance-risk/
- “Implementing DORA” – Remarks by Gerry Cross, Director of Financial Regulation – Policy and Risk – Central Bank of Ireland, accessed December 16, 2025, https://www.centralbank.ie/news/article/speech-implementing-dora-speech-by-gerry-cross–director-of-financial-regulation-policy-and-risk-23-nov-2023
Leave a Reply