Building fairer models for finance
Fairness and bias are rightly hot topics in ML because of their impact on people’s lives. Fairness is particularly important in finance, where bad decisions can limit someone’s ability to participate in society. We look at how fairness can be defined, and how the law defines it. We explore how to detect bias, the trade-off between bias and performance, and how to build fairer models that still perform well.
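As a flavour of what bias detection can look like, here is a minimal sketch (not Aire’s actual method; the data, function name, and group encoding are invented for illustration) of one common fairness metric, the demographic parity difference: the gap in approval rates between two groups.

```python
def demographic_parity_difference(approvals, groups):
    """Absolute difference in approval rates between groups 0 and 1.

    approvals: list of decisions (1 = approved, 0 = declined)
    groups: list of protected-attribute values (0 or 1), same length
    """
    rate = {}
    for g in (0, 1):
        decisions = [a for a, grp in zip(approvals, groups) if grp == g]
        rate[g] = sum(decisions) / len(decisions)
    return abs(rate[0] - rate[1])

# Toy credit decisions for eight applicants, four in each group
approvals = [1, 1, 0, 1, 0, 1, 1, 0]
groups    = [0, 0, 0, 0, 1, 1, 1, 1]
print(demographic_parity_difference(approvals, groups))  # 0.25
```

A value of zero would mean both groups are approved at the same rate; in practice, detecting bias this way is only a starting point, since equal approval rates say nothing about whether individual decisions were accurate.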
I’m a product-focused data scientist at Aire, a fintech startup and credit reference agency. I’ve been part of the startup scene in London for nearly a decade.
At Aire I’ve been responsible for shaping products for the UK and US markets from an early stage, building credit risk models and other insights, and working closely on our governance framework and fairness processes.
I try to spend as much time as I can on understanding the problem. It’s usually a lot messier and more complex than I realise, but it’s key to building solutions that have a real impact!