Is Risk & Compliance switching on to Artificial Intelligence?


It’s inescapable that over the last few years the power and promise of Artificial Intelligence (AI) has permeated many aspects of our working lives, in ways both visible and less obvious. It’s only comparatively recently that the Financial Services (FS) sector has begun to tune into the potential it offers: AI can deliver huge operational and cost benefits and augment less efficient processes to produce better outcomes for customers and clients.

 

Until recently, AI has been held back by a lack of understanding, by outdated technology, by a shortage of good-quality data and data capture mechanisms, and even by corporate cultures that have stood in the way of its development. But FS firms’ attitudes are changing. Back in 2019, PwC surveyed global FS executives on their adoption of AI: 56% expected to take up the technology over the following two years, a figure that rose dramatically to 85% when looking five years ahead.

 

A new toolkit

Going back a step, what do we mean when we talk about AI in these organisations? For many institutions, such as banks, this is Machine Learning (ML), which is already embedded in many operations and processes, such as trade matching or algorithmic trading. For the Risk and Compliance (R&C) departments that oversee a myriad of operations, it can be a game-changer. Uses of AI include Anti-Money Laundering (AML) and transaction monitoring, alert scoring and fraud detection, and it is increasingly being applied to support functions such as regulatory reporting.

 

For many R&C functions, it is also worth noting that use of the technology has been catalysed by the previous financial crisis, with team sizes growing appreciably over the last decade, driven in particular by the high pace of regulatory change. For these teams, having new tools in their armoury is proving invaluable. Robotic process automation (RPA), for example, is speeding up routine tasks while minimising human error, freeing those same humans to spend more time on higher-quality work.

 

Mind your language

One of the biggest areas for R&C to date is text analytics and insight, including Natural Language Processing (NLP) to support interactions with customers and clients. A loan application, for example, can be transacted more quickly and with a better customer experience, with less chance of fraud and fewer false-positive flags.

 

So how does NLP make these improvements? The difference is that NLP looks closely at context. Instead of flagging a conversation simply because it contains certain keywords, it considers how those words are used within whole sentences, a much more sophisticated method of analysis, especially when you consider that some keywords can carry more than one meaning. Research into these techniques is showing reductions in false positives of anywhere between 50% and 70%.
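To make the contrast concrete, here is a minimal, illustrative sketch, not Aveni’s implementation: it compares a naive keyword flag with a context-aware score from an off-the-shelf zero-shot classifier in the Hugging Face transformers library. The keyword list, candidate labels and threshold are all hypothetical.

```python
# Sketch only: contrasts keyword matching with context-aware NLP scoring.
# Assumes the Hugging Face `transformers` library is installed.
from transformers import pipeline

SUSPICIOUS_KEYWORDS = {"cash", "offshore", "urgent"}  # hypothetical keyword list

def keyword_flag(utterance: str) -> bool:
    """Naive approach: flag the conversation if any keyword appears, regardless of context."""
    return bool(set(utterance.lower().split()) & SUSPICIOUS_KEYWORDS)

# Context-aware approach: a pre-trained model scores the whole sentence
# against compliance-relevant labels rather than matching isolated words.
classifier = pipeline("zero-shot-classification")

def nlp_flag(utterance: str, threshold: float = 0.7) -> bool:
    result = classifier(
        utterance,
        candidate_labels=["potential money laundering", "routine banking request"],
    )
    scores = dict(zip(result["labels"], result["scores"]))
    return scores["potential money laundering"] >= threshold

text = "I'd like to withdraw some cash for my holiday next week."
print(keyword_flag(text))  # True: the keyword 'cash' triggers a false positive
print(nlp_flag(text))      # Likely False: in context this reads as a routine request
```

The keyword rule fires on the word alone; the sentence-level model weighs the surrounding context, which is where the reduction in false positives comes from.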

 

Going back to the example of a loan being issued: in a traditional compliance process, often only around 10% of cases are reviewed by an employee to ensure all is in order. AI-driven processes can review around 90% of the data with much greater accuracy, and this, coupled with the speed at which vast amounts of data can be analysed, is certainly compelling.

 

Another, even newer, AI technology with great promise for compliance departments is Natural Language Generation (NLG). In effect, this is software that can write a whole range of compliance content automatically by analysing structured data and rendering it in natural language. There is still a long way to go in this emerging field, as language is complex and full of multiple meanings and subtleties to interpret, but the ability to prepare complicated and often repetitive compliance documents at speed is attracting growing investment in the technology.
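As a simple illustration of the idea, the sketch below turns a structured alert record into a sentence of report text using templates. This is not any particular vendor’s NLG engine, and every field name, threshold and phrasing here is hypothetical; production NLG systems are considerably more sophisticated.

```python
# Minimal template-based NLG sketch: structured data in, natural language out.
from dataclasses import dataclass

@dataclass
class AlertRecord:
    customer_id: str
    alert_type: str
    amount_gbp: float
    risk_score: float  # 0.0-1.0, higher means riskier (hypothetical scale)

def generate_summary(record: AlertRecord) -> str:
    """Render one structured record as a natural-language line for a compliance report."""
    severity = (
        "high" if record.risk_score >= 0.8
        else "moderate" if record.risk_score >= 0.5
        else "low"
    )
    return (
        f"Customer {record.customer_id} triggered a {record.alert_type} alert on a "
        f"transaction of £{record.amount_gbp:,.2f}; the model assigned a {severity} "
        f"risk score of {record.risk_score:.2f} and the case has been queued for review."
    )

print(generate_summary(AlertRecord("C-10482", "transaction monitoring", 25000.0, 0.86)))
```

Even this toy version shows why the approach appeals to compliance teams: repetitive, data-driven narrative can be produced consistently and at speed, leaving the nuanced wording to humans.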

 

High-level discussions 

But what about the regulators themselves and their approach to AI? In the UK, the Bank of England and the Financial Conduct Authority (FCA) have announced a forum for AI/ML, and this is expected to develop as time goes on. It is expected that when regulations appear they will take a technology-neutral stance, which should aid adoption, while further afield the EU is also expected to put similar rules in place.

 

One other area that affects both R&C departments and wider leadership teams alike is the principle of accountability: who is ultimately responsible for the implementation and oversight of new technologies such as AI? The Prudential Regulation Authority has made exactly this point, while the FCA has called for senior accountability, particularly from senior managers at Board level, who will need to explain and evidence how AI integrates with a company’s systems. The earlier PwC report suggests only 34% of UK FS firms have a senior executive responsible for this area, so there is clearly some way to go.

 

There is also an ongoing debate around ‘explainability’, a concept permeating AI adoption: the ability to understand exactly how AI systems reach their conclusions. Critics have argued that there is a risk of putting in place ‘black box’ technology without a true understanding of the underlying methodologies, which could result in poor internal or external outcomes. The FCA has made it clear that companies should provide sufficient oversight here.
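One simple form that explainability can take is inspecting which inputs drive a model’s output. The sketch below, which is neither the FCA’s prescription nor any specific firm’s method, fits an interpretable alert-scoring model on toy data and reports how each (hypothetical) feature pushes the score up or down.

```python
# Minimal explainability sketch: an interpretable model whose coefficients
# show how each input feature influences the alert score. Toy data only.
import numpy as np
from sklearn.linear_model import LogisticRegression

FEATURES = ["transaction_amount", "country_risk", "customer_tenure_years", "prior_alerts"]

# Randomly generated data standing in for historical, labelled alerts.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, len(FEATURES)))
y = (X[:, 0] + 2 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# With a linear model the coefficients are directly interpretable: the sign and
# magnitude show how each feature pushes the alert score up or down.
for name, coef in sorted(zip(FEATURES, model.coef_[0]), key=lambda p: -abs(p[1])):
    print(f"{name:>24}: {coef:+.2f}")
```

More complex ‘black box’ models need dedicated explanation techniques layered on top, which is precisely where the oversight debate sits.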

 

Computer says ‘yes’

Of course, there is also a broader argument that the emergence of AI will do away with many jobs, but that is very likely too simplistic. R&C departments will clearly need more technology-literate employees, alongside those who support the technologies, such as data scientists and data analysts. It is highly unlikely that many aspects of compliance can be fully automated; instead, the emphasis will shift to professionals focusing on higher-quality or more complex tasks. From this standpoint, jobs are not so much being taken away as changing in nature.

 

If R&C is to fully reap the rewards of AI, it’s essential it continues to embrace these innovations to help drive the business forward. We’ve all seen how, in the course of a year, the pandemic has fundamentally changed our world through technology, and those changes, together with the impetus behind these innovations, will continue to shape our future. For R&C, it’s absolutely key that boards and senior management develop a true understanding of AI and the benefits it enables. It’s time to move from a ‘computer says no’ culture to one where the computer says ‘yes, and I’ll tell you why!’.

 

Written by Martin Roe

 

Learn more about Aveni. Find us on LinkedIn and Twitter
