AI Use Should Go Hand In Hand With More Transparency To Consumers And Regulators

February 24, 2022
Despite progress made by artificial intelligence (AI) in recent years, the lack of transparency in explaining the outcomes of current models, as well as problems with bias, can create a challenging scenario for firms.

The report released by the Financial Conduct Authority (FCA) and the Bank of England (BOE), explored in a recent VIXIO article, notes the varied nature of the data used by firms, including data sourced from third parties. This can create problems that are compounded when multiple models interact, which the report concludes could bring instability to the financial system.

The report also detailed a number of technological and compliance traps that firms would be wise to avoid. These include denying customers access, potentially missing fraud, data protection risks, and the liability, sanctions and reputational damage that follow from them.

However, a major challenge for firms using AI to determine the level of service they offer customers, such as credit decisions and risk-based pricing, is explaining the seemingly discriminatory outcomes the data can produce.

Rock and hard place

The lack of transparency in AI can produce conclusions that seem wildly out of proportion. Take the widely reported example of Apple co-founder Steve Wozniak receiving a higher credit limit on the Apple Card than his wife, despite the couple sharing all their assets.

This led to Apple being investigated by the New York State Department of Financial Services (NYDFS), with a spokesperson saying at the time: "Any algorithm that intentionally or not results in discriminatory treatment of women or any other protected class violates New York law."

Although Wozniak stated the algorithm was inherently biased against women, the NYDFS said its investigation “did not produce evidence of deliberate or disparate impact discrimination but showed deficiencies in customer service and transparency”, which Apple rectified.

Apple also produced a page telling consumers how to improve their chances of being approved for an Apple Card.

Lack of bias

The reason the NYDFS did not find evidence of discrimination was that the algorithm had been designed to be gender and colour blind, using inputs such as “credit score, indebtedness, income, credit utilisation, missed payments, and other credit history elements”.

However, despite the notion that spouses who share assets should have the same access to financial products, factors such as not having one's name on the title deed of the house or holding different lines of credit can have a significant effect on that access.

This does not get firms out of hot water with consumers, however. People expect the same outcome as their spouse and, when challenged, customer service staff were unable to explain the reasons behind the decision beyond first principles, as happened with Apple.

Firms can learn from these episodes by being more transparent when they decide to use AI in their products. Firms may also want to test their algorithms for historic or demographic bias, as well as continuing to seek higher quality datasets, such as transactional history obtained via open banking.
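
As a rough illustration of what such a bias test might look like in practice (the column names, sample figures and 0.8 threshold below are hypothetical, not drawn from the FCA/BOE report or the Apple case), a simple disparate impact check compares approval rates across demographic groups:

```python
import pandas as pd

# Hypothetical dataset of credit decisions: one row per applicant, with the
# model's decision (1 = approved, 0 = declined) and a demographic attribute
# used only to test fairness, not as a model input.
decisions = pd.DataFrame({
    "approved": [1, 0, 1, 1, 0, 1, 0, 1],
    "gender":   ["F", "F", "M", "M", "F", "M", "F", "M"],
})

# Approval rate for each group.
rates = decisions.groupby("gender")["approved"].mean()

# Disparate impact ratio: lowest group approval rate divided by the highest.
# The commonly cited "four-fifths rule" flags ratios below 0.8 for review.
ratio = rates.min() / rates.max()

print(rates)
print(f"Disparate impact ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Potential adverse impact - investigate further.")
```

A check like this does not prove or disprove discrimination on its own, but it gives compliance teams a concrete, repeatable metric to monitor and to explain to regulators and customers.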

Regulators may also intervene to address social and demographic bias, whether to compensate for historic data gaps or to ensure more equitable access to finance. This is something the FCA/BOE report alluded to, stating that “protected characteristics may be needed to measure or establish the fairness of AI predictions”.
