How the EU AI Act will impact AI development and adoption within the UK’s Financial Services Industry

What is the EU AI Act: the key takeaways

The EU AI Act, politically agreed in December 2023, is the world’s first comprehensive legal framework for AI. It aims to promote trustworthy AI by addressing key risks, ensuring fairness, and protecting fundamental rights.

While the UK is no longer part of the EU (shh, don’t mention the B word), the Act will still have a big impact on the development and implementation of AI in the UK financial services sector. Here’s why:

Key elements of the AI Act that you need to know:

  • Risk-based approach: The Act categorises AI systems based on their potential risk. High-risk systems, such as those used for credit scoring or for risk assessment and pricing in life and health insurance, are subject to stricter requirements.
  • Transparency and explainability: Developers must explain how high-risk systems reach their decisions, so those decisions can be understood and, where necessary, challenged.
  • Generative AI requirements: Providers must design models to prevent them from generating illegal content, and must publish summaries of the copyrighted data used for training.
  • Prohibition on certain AI uses: Practices such as social scoring (e.g. classifying people based on behaviour, socio-economic status or personal characteristics) and cognitive behavioural manipulation (e.g. voice-activated toys that encourage dangerous behaviour in children) are strictly forbidden.
  • Human oversight: Humans must be ultimately responsible for the development and use of high-risk AI systems.

How this affects AI adoption for UK financial services:

  • Compliance considerations: UK financial firms doing business in the EU or offering services to EU citizens will need to comply with the Act. This adds extra steps and potentially higher costs, but consistent compliance across jurisdictions is essential.
  • Global standards in the making? The Act might indirectly influence UK regulations by setting a global standard for what “good” AI looks like. This could actually make things easier in the long run for companies operating in both the EU and UK, as they wouldn’t have to navigate two separate sets of rules.
  • Responsible AI in finance: The Act’s principles should encourage the UK financial sector to be more thoughtful when developing and using AI. This means putting fairness, transparency, and risk minimisation at the forefront.
  • Innovation and responsibility: Stricter regulations might make it slightly trickier to develop cutting-edge AI applications in finance. However, they could also push companies to be more responsible in their development, ultimately minimising potential risks.

Uncertainties and future developments

The EU AI Act will undoubtedly impact the UK’s financial sector despite not directly applying within its borders. The UK’s response remains a question mark: will it mirror the EU’s approach, creating a unified system, or forge a separate path with potentially diverging regulations? This decision sits at the intersection of the UK’s recent commitment to responsible AI development, as evidenced by its own regulatory advancements, and the evolving landscape of AI itself. 

With the potential for future revisions to the EU Act adding another layer of complexity, the coming years will be crucial.

Nicola Wee

CMO at Aveni | AI built for FS | Board Trustee
