AI has exacerbated racial prejudice in housing. Could it help eliminate it instead?
We start by helping businesses understand the history of housing and finance in the United States and how all of our housing and finance policies have been imposed through a racial lens. You cannot develop a system from scratch and assume that the system will be fair. You have to develop it in a way that uses anti-racist technologies and methodologies.
McIlwain: Can we still realistically make a dent in this problem using the technological tools at our disposal? If so, where do you start?
Rice: Yes. When the 2008 financial crisis subsided for a bit and we looked up, it was as if the technology had passed us by. And so we decided, if we can't beat them, maybe we'll join them. We've spent a lot of time learning how algorithmic systems work and how AI works, and we've gotten to the point where we think we can now use technology to help decrease discriminatory outcomes.
If we understand how these systems exhibit biases, we can hopefully get into their guts, debias those systems, and build new systems that infuse debiasing techniques within them.
But when you think about how far behind we are, it's really intimidating to consider all the work that needs to be done and all the research that needs to be done. We need more of the world's Bobbys. But we also need all the training to be done so that data scientists understand these issues.
Rice: We try to get regulators to understand how systems exhibit biases. You know, we don't really have a body of examiners in regulatory agencies who understand how to review a lending institution to find out whether its systems — its automated underwriting system, its marketing system, its servicing system — are biased. But the institutions themselves can develop organizational policies that help.
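As one concrete illustration of the kind of review Rice describes, here is a minimal sketch of a common disparate-impact screen, the "four-fifths rule," applied to an underwriting system's approval counts. All names and figures are hypothetical; a real fair-lending examination would involve far more than this single metric.

```python
def adverse_impact_ratio(approvals_a, total_a, approvals_b, total_b):
    """Ratio of approval rates between two groups.

    Under the four-fifths rule of thumb, a ratio below 0.8 flags
    potential disparate impact and warrants closer review.
    """
    rate_a = approvals_a / total_a  # approval rate, protected group
    rate_b = approvals_b / total_b  # approval rate, reference group
    return rate_a / rate_b

# Hypothetical counts from an automated underwriting system's decisions.
ratio = adverse_impact_ratio(approvals_a=120, total_a=400,
                             approvals_b=300, total_b=600)
print(f"approval-rate ratio: {ratio:.2f}")  # 0.30 / 0.50 = 0.60, below 0.8
```

A check like this only surfaces outcome disparities; it says nothing about why they arise, which is where the deeper audit of the system's inputs and design comes in.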
The other thing we need to do is really increase the diversity in the tech space. We need to bring more students from diverse backgrounds into STEM fields and the tech space to help implement the change. I can think of a number of examples where just having a person of color on the team made a huge difference in terms of increasing the fairness of the technology being developed.
McIlwain: What role does politics play? I feel like just as civil rights organizations were behind the industry in terms of understanding how algorithmic systems work, many of our policymakers are lagging behind. I don't know how much confidence I would place in their ability to realistically serve as an effective check on the existing systems, or on the new AI systems that are quickly making their way into the mortgage arena.
McIlwain: I remain skeptical. For now, for me, the scale of the problem still far exceeds our collective human will and the capabilities of our technology. Bobby, do you think technology can ever help?
Bartlett: I have to answer that with the lawyer's answer: "It depends." What we're seeing, at least in the context of lending, is that you can eliminate the source of bias and discrimination observed in face-to-face interactions through some sort of algorithmic decision-making. The flip side is that, if implemented poorly, you could end up with a decision-making apparatus as bad as a redlining regime. So it really depends on the execution, the type of technology, and the care with which it is deployed. But a fair lending system operationalized through automated decision-making? I think that's a really difficult proposition. And I think the jury is still out.