Algorithmic pricing in insurance: mining patterns of social inequalities

Maria Busca | Dods Monitoring

@MariaBuscaMB

5 min read

Dods Monitoring's Maria Busca examines the nature of algorithms and discrimination in the insurance industry.


Recent research by the BBC Radio 4 programme You and Yours suggested some striking results: people with names common among ethnic minorities may be quoted higher prices for car insurance than those with traditional English names. The Sun also ran a similar story recently involving Admiral and M&S.

Probed on this by the Treasury Committee, Christopher Woolard of the Financial Conduct Authority (FCA) said the watchdog was aware of serious concerns being raised, particularly regarding race and ethnicity. The FCA’s October 2018 review of pricing practices in household insurance found that firms were using data sets which “could implicitly or potentially explicitly relate to race or ethnicity”. Some insurers claimed that using these data sets was lawful because it represented “a proportionate means of achieving a legitimate aim”, and was therefore compliant with the Equality Act.

The challenge around algorithmic pricing is not specific to insurance. Even when protected characteristics are removed from data sets, algorithms can find proxies for them, leading to discriminatory outcomes, as the sketch below illustrates. Essentially, algorithms are likely to reinforce existing inequalities even when the data is compliant with the Equality Act 2010.
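The proxy problem can be made concrete in a few lines of code. The sketch below uses entirely synthetic data (the postcode bands, group labels and premiums are invented for illustration): the pricing function never sees the protected characteristic, yet average quotes still differ by group because postcode correlates with it.

```python
# A minimal sketch of proxy discrimination, on synthetic data.
# All names and figures (postcode bands, groups, premiums) are hypothetical.
import random
import statistics

random.seed(0)

def simulate_person():
    # Protected characteristic; never shown to the pricing model.
    group = random.choice(["A", "B"])
    # Postcode band correlates with group (e.g. via housing patterns),
    # so it acts as a proxy even after the protected field is removed.
    weights = [5, 3, 2] if group == "A" else [2, 3, 5]
    postcode_band = random.choices([1, 2, 3], weights=weights)[0]
    return group, postcode_band

def quote(postcode_band):
    # The pricing model sees only the postcode band.
    return 400 + 75 * (postcode_band - 1)

people = [simulate_person() for _ in range(10_000)]
for g in ("A", "B"):
    quotes = [quote(band) for grp, band in people if grp == g]
    print(g, round(statistics.mean(quotes), 2))
# Group B receives higher average quotes even though the model never
# uses the protected characteristic directly.
```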

In the context of political pressure and regulatory developments addressing vulnerable consumers in finance, and with the CMA seeking stronger powers to better protect consumers, the question is where the policy solution will come from.

Auditing and certifications

The opaque nature of algorithms is the core of the problem. Some algorithms are black boxes, leaving no record of the decision-making process. Techniques aimed at introducing transparency to machine learning models may offer a fairly straightforward solution in the future. MPs have already called for a number of measures to address unfair algorithmic decision-making, including setting principles and ‘codes’, establishing audits and certifications for algorithms, and making ethics boards responsible for oversight of algorithmic decisions.
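To give a flavour of what an audit might check, one deliberately simple possibility is an outcome audit that compares average quotes across demographic groups and flags large gaps. The function below is a sketch, not any regulator’s prescribed methodology, and the 5 per cent tolerance is an arbitrary illustrative threshold.

```python
# One possible outcome audit (a sketch, not a prescribed standard):
# compare mean quoted premiums across groups and flag ratios that
# exceed an illustrative tolerance band.
from statistics import mean

def outcome_ratio_audit(quotes_by_group, tolerance=0.05):
    """quotes_by_group: dict mapping group label -> list of quotes."""
    means = {g: mean(qs) for g, qs in quotes_by_group.items()}
    baseline = min(means.values())
    report = {}
    for group, m in means.items():
        ratio = m / baseline
        report[group] = {
            "mean_quote": round(m, 2),
            "ratio_to_lowest": round(ratio, 3),
            "flagged": ratio > 1 + tolerance,
        }
    return report

print(outcome_ratio_audit({"A": [400, 420, 410], "B": [480, 500, 490]}))
```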

An interesting approach proposed by Alan Turing Institute researchers is counterfactual fairness – a framework which aims to eliminate bias by explicitly modelling protected characteristics and compensating for their influence. The idea is that a decision is fair towards an individual if the outcome is the same in the actual world as it would be in a ‘counterfactual’ world in which the individual belongs to a different demographic.
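A stylised version of the counterfactual test can be sketched assuming a toy linear structural model (the published framework builds a full causal model of the data; the coefficients here are invented). The idea is to recover the individual’s own circumstances from the observed data, flip the protected attribute, and check whether the quote changes.

```python
# A toy sketch of the counterfactual fairness test, assuming a simple
# linear structural model with invented coefficients.
def proxy(protected, noise):
    # Structural equation: the observed feature depends on the
    # protected attribute plus the individual's own noise term.
    return 2.0 * protected + noise

def price(feature):
    # The pricing model sees only the feature, never the attribute.
    return 400 + 50 * feature

def counterfactual_price(protected, observed_feature, flipped):
    # Abduction: recover the individual's noise from what was observed.
    noise = observed_feature - 2.0 * protected
    # Action and prediction: recompute the feature in the counterfactual
    # world where the protected attribute takes the other value.
    return price(proxy(flipped, noise))

protected, noise = 1, 0.3
feature = proxy(protected, noise)
actual = price(feature)
counterfactual = counterfactual_price(protected, feature, flipped=0)
print(round(actual, 2), round(counterfactual, 2))
# Counterfactual fairness requires the two quotes to match; here the
# gap (2.0 * 50 = 100) measures how much the quote depends on the
# protected attribute through the proxy.
```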

On Wednesday 20th March 2019, the Centre for Data Ethics and Innovation published its work plan on algorithmic bias, committing to look at financial services and to publish an interim report in summer 2019. The Centre has the power to make recommendations to the Government, regulators, and creators and users of data-driven technology. The Government has committed to putting the Centre on a statutory footing in the future. However, it remains unclear what statutory powers and capacity the Centre would have to create an enforceable framework of principles and auditing.

Price intervention – price capping or banning price differentiation

During a session of the APPG for Insurance and Financial Services, Citizens Advice said the discussion on dual pricing should begin with the assumption that pricing differences are unfair. Dual pricing – charging loyal customers more at renewal than equivalent new customers – is another challenge currently confronting the industry, and one the CMA believes disproportionately affects vulnerable consumers.

Price differentiation based on granular risk assessment has become ingrained in the insurance business model, and the industry is therefore likely to fight strongly against a ban. Moreover, reducing price differentiation would increase prices for those presenting low risks, which would not be any fairer. A price collar (a relative price cap) would be a compromise, but both suggestions could also render some consumers uninsurable – with algorithms pricing them out of the market.
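For illustration, a price collar is simple to express: the quote is clamped to within a fixed band around a reference price. The sketch below assumes a renewal-versus-new-customer comparison and a 10 per cent band, both of which are purely illustrative figures rather than any proposed rule.

```python
# A minimal sketch of a price collar (relative price cap): the renewal
# quote is clamped to within a fixed percentage of the equivalent
# new-customer price. The 10% band is illustrative only.
def apply_price_collar(renewal_quote, new_customer_price, band=0.10):
    lower = new_customer_price * (1 - band)
    upper = new_customer_price * (1 + band)
    return max(lower, min(renewal_quote, upper))

print(apply_price_collar(620.0, 500.0))  # capped down to 550.0
print(apply_price_collar(480.0, 500.0))  # within the band: 480.0
```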

Discrimination impact assessment

An interesting proposal for tackling this was put forward by the Villani report commissioned by the French Government. It suggested introducing a discrimination impact assessment, similar to the privacy impact assessment, requiring AI developers to consider the social consequences of the algorithms they produce. The main benefit of this approach is that it would sidestep the never-ending struggle of regulatory catch-up, in which rules are likely to always lag behind machine learning.

This would undoubtedly add a further layer of regulation, potentially burdening small, innovative businesses. Designing the framework for introducing such assessments would also be a challenge. It remains to be seen what statutory powers the Centre for Data Ethics and Innovation will receive and whether it could enforce such impact assessments.

Another option is for the FCA to lead the way by introducing a responsibility to ensure non-discriminatory consumer outcomes through a new duty of care for financial services, should the regulator decide this spring that one is needed. (The FCA is currently reviewing responses to its July 2018 discussion paper and is due to decide whether a new duty of care for financial services is needed.) The key challenge would be around liability for insurers who do not own their own algorithms.

The focus on outcomes puts the onus on businesses to use algorithms only after ensuring that they do not reinforce existing inequalities. Whatever the next regulatory step, price differentiation and algorithmic bias are bound to receive increasing attention. The CMA, FCA and the Treasury Select Committee are all focussed on consumer outcomes. Moreover, the regulatory framework around consumer protection is only expected to be further reinforced, prompted by the CMA’s recent package of reforms and the findings of the National Audit Office on regulators’ performance in protecting consumers.
