Who’s Watching the Machines? Ethics of AI in Financial Services

Chartered Insurance Institute

6 min read Partner content

At Conservative Party Conference, the Chartered Body Alliance hosted a packed fringe event discussing the consequences of the technological revolution in financial services.


Monday afternoon at the Conservative Party Conference saw Chancellor Philip Hammond address the main hall on the future of the UK economy. At the heart of that vision was the fourth industrial revolution.

“Technological change is transforming not only our economy, but our society and our politics at a rate that none of us have seen in our lifetimes”, said Mr Hammond.

Just down the hall from him was a packed fringe event discussing the consequences of this technological revolution, entitled: “Who’s Watching the Machines? Ethics of AI in Financial Services”.

Hosted by the Chartered Body Alliance - comprising the Chartered Insurance Institute, the Chartered Banker Institute and the Chartered Institute for Securities & Investment - the panel event was chaired by the former Treasury Minister and current chair of the Office of Tax Simplification, Angela Knight. She was joined by Dr Adrian Weller, AI Programme Director at the Alan Turing Institute; Raman Bhatia, Head of HSBC Digital Bank UK; Simon Culhane, CEO of the Chartered Institute for Securities & Investment; Simon Thompson, CEO of the Chartered Banker Institute; and Lee Rowley MP, Co-Chair of the Cross-Party Parliamentary Commission on Technology Ethics.

AI systems are rapidly replacing traditional systems, remarked Ms Knight, with the main difference being that artificial intelligence learns from its customer base in a way that traditional systems do not.

The panel agreed that this change had been largely beneficial, bringing consumers greater choice as well as better fraud protection, smoother trade processing, and improved regulatory compliance.

However, as the technology advances, concerns are growing over the ethical use of AI and over those who use it.

For Mr Culhane, AI has created an imbalance in society: “We have seen a great rise in inequality, a huge power shift to a few firms and individuals - the rise of the Facebooks and the Amazons.”

The power of these firms, and what they are doing with AI, has left consumers wary - especially when it comes to data gathering.

"Customer consent in a world where AI is using our data information is more pronounced and even more critical," said Bhatia.

GDPR rules aimed at controlling how data is handled have been welcomed by many as a good step towards greater transparency. However, Lee Rowley MP was unsure whether they were the best way forward, saying:

"There is an underlying question as to where we are going with privacy in society, and we are currently at a crossroads. It is either going to go down the GDPR line where everyone owns their own data and everyone thinks GDPR is great - I’m a bit sceptical - or it is going to go down the road where people start pulling back in certain areas about the kind of data they share."

To address those concerns, Mr Bhatia suggested that companies should pay close attention to the explainability of complex AI models, which can operate as black boxes.

"If you are using AI for the risk assessment of a customer, banks have to be able to explain how a decision was made. Can you always do that? No you can’t, so that is a key concern."

This issue is particularly prominent in the insurance industry, as noted by Lee Rowley, who said: “Insurance is built on the idea of pooled risk, but when you understand exactly what the profile is of the person coming to ask you for insurance, that is a really challenging place to go.

“There are lots of opportunities but also a lot of ethical questions to answer in terms of who you insure, how you insure, and the premium you insure them with.”

Simon Thompson said the focus needs to be on openness.

"It should not be black boxes working away unseen in the background, they should be glass boxes; transparent technology with clear and transparent accountability.

“Whether that is us as users, customers, or policymakers putting the frameworks around it - we need to be sure we can monitor, understand, and explain what that technology is doing and how it actually works, how they reach those decisions, and whether those decisions are actually in line with expectations when there are unexpected consequences."

Mr Culhane added that beyond transparency, consumers were most concerned with fairness.

"Because what do consumers want? They want to be treated as an individual, to be engaged not exploited."

Raman Bhatia said that surveys show that, for many consumers, trust in the use of AI in financial services is low, revealing that people are more likely to trust a robot to perform open heart surgery than to offer financial advice.

“This is a real failing,” he remarked.

To combat this, Adrian Weller said financial institutions need to ensure that new technologies are focused primarily on fulfilling customer needs, rather than chasing quick profits.

“The nudging of customers is an important issue”, he said.

“When AI is providing advice we need to be sure that this kind of nudging will guide people towards decisions that are in their best long-term interests and not just to create short-term profits in companies.”

For Lee Rowley the questions around the use of AI go further than financial services.

He said: “There are wider issues of trust, security, privacy, and slightly more existential ones, like: ‘What is it as a society we want our machines to do and how do we want them to do it?’

“For instance, do we want machines to identify when they are interacting with us? At the moment we know it is a machine answering service when we call, but in the future as the tech advances that may not be the case. And a situation may arise where robots look like humans and you can’t tell if they are human or not. It’s a whole continuum we have to talk about because there may or may not be regulatory interventions needed, but there are definitely societal and cultural questions we need to ask both within this industry and outside.”

Simon Thompson said it was essential for companies, regulators, and politicians to understand how the technology works. He said:

“When we lack curiosity, fail to understand what’s underneath technology and just expect it to ‘do its thing’ and give us the results we want, perhaps we shouldn’t be surprised when it doesn’t meet our expectations.”
