Gender Bias Complaints Against Apple Card Signal A Dark Side To



The possibility that Apple Card applicants were subject to gender bias opens a new frontier for the financial services sector, one in which regulators are largely absent.

In late August, the Apple Card debuted with a minimalist look and a completely “no fee” model, creating a frenzy of anticipation. Millions signed up to be notified of its release. Designed to boost traffic to the slow-to-be-adopted Apple Pay system and deepen consumer dependency on iPhones, the Apple Card marked another significant innovation in access to financial services.

Fast forward two months, and the Apple Card may now find its place in history for a less positive reason—the dark side of the technological revolution rearing its ugly head. Last week, Danish programmer David Heinemeier Hansson tweeted that after he and his wife Jamie both applied for the Apple Card with largely the same shared financial information, he was astonished to receive a credit limit 20 times higher than hers, despite his wife’s higher credit score.

Cue the viral tweet storm that followed, rife with accusations of bias in Goldman Sachs’s underwriting model. (Goldman developed and issued the card.) Adding fuel to the fire, Apple co-founder Steve Wozniak shared that the same thing had happened to him and his wife.


Officials from the New York Department of Financial Services quickly chimed in, assuring the Twittersphere that they would investigate.

Technology is undeniably transforming the financial services industry. Fintechs, Big Tech, and banks are using increasing volumes of data, artificial intelligence, and machine learning to build new algorithms to determine creditworthiness. The lending process, which was historically plagued by frictions, is becoming potentially more accurate, efficient, and cost effective.

For small-business lending, technology is changing the game, providing access to capital for more small businesses that need it to grow and succeed.


But when lending relies on algorithms to make loan and underwriting decisions, as the Apple Card situation illustrates, the potential for discrimination grows.

Should the customers be able to see what pieces of data may have led to a loan rejection or a lower credit limit? Should regulators have access to the algorithms and test them for the impact they have on underserved or protected classes?
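One way regulators could test an algorithm for impact on protected classes is a disparate-impact check. Below is a minimal sketch in Python of the "four-fifths rule" commonly used in US fair-lending and employment analysis, which flags a model when one group's approval rate falls below 80% of the most favored group's rate. The data and group labels here are entirely hypothetical, and this is an illustration of the general technique, not of Goldman Sachs's actual underwriting model or any regulator's procedure.

```python
# Hypothetical disparate-impact check on loan approval outcomes.
# Under the "four-fifths rule", a model is flagged when the lower
# group's approval rate is below 80% of the higher group's rate.

def approval_rate(decisions):
    """Fraction of applications approved (decisions are booleans)."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower approval rate to the higher one (0 to 1)."""
    ra, rb = approval_rate(group_a), approval_rate(group_b)
    return min(ra, rb) / max(ra, rb)

# Hypothetical outcomes: True = approved, False = rejected.
men = [True, True, True, True, False]      # 80% approved
women = [True, True, False, False, False]  # 40% approved

ratio = disparate_impact_ratio(men, women)
print(f"Disparate impact ratio: {ratio:.2f}")        # 0.50
print("Flagged under four-fifths rule:", ratio < 0.8)  # True
```

A real audit would go further—controlling for legitimate underwriting factors and examining credit limits as well as approvals—but even this simple outcome test requires the demographic data and model access that regulators currently lack.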

The Apple Card situation has raised these questions in a visible way, and the public engagement has been strong and immediate. Clearly, this is a new frontier for the financial services sector—and the industry’s regulators are also operating without a roadmap. We need to stop arguing about more versus less financial regulation and begin the hard work of creating smart regulation. This would include at least three parts, all of which are hard to accomplish:

1. Disclosure rules on who gets to see what is in the credit algorithms.
2. Increased expertise at the regulatory agencies.
3. Data collection to know who is getting loans and where the gaps are occurring.

The Apple Card fiasco is not going to be an isolated incident—it’s the canary in the coal mine for a financial services industry and its regulators playing catch-up to the implications of the fintech revolution. For all the promise that comes with the Apple Card and other innovations for deploying capital, if creditworthy customers are being shut out, that’s a problem. Even worse, if we don’t understand why, we can’t fix it.
