Goldman Sachs Responds to Allegations of Apple Card Gender Bias

Originally published at: https://tidbits.com/2019/11/12/goldman-sachs-responds-to-allegations-of-apple-card-gender-bias/

Developer and entrepreneur David Heinemeier Hansson’s tweet about his wife’s paltry Apple Card credit limit has sparked a government investigation and a fierce debate about sexism in the financial industry.

My credit limit on my Apple Card is $250. Crazy - I mean, when I was a young guy working for minimum wage with zero credit history back in 1980, my credit card had a $650 limit. When I called about it, the Goldman Sachs rep agreed it seemed crazy low given my top-tier credit score and extensive credit history, but said I could only appeal in 6 months and that no adjustments would be made until then.

Sounds like you might want to give them another call now.

Unless they’re willing to essentially admit they only budge due to social media hysteria, they should now be more than willing to re-evaluate your case. :slight_smile:

I asked for a review; the GS rep I texted with was courteous and to the point, and said the review would take about 30 days. It isn’t a big issue, since we have good credit (well above 800) and pay off all our card balances each month. But even though we carry no debt other than our mortgage and one vehicle loan, the APR GS assigned us seemed much higher than our credit would suggest, given the low-to-high range it advertises.

I’ve been reading a couple of discussions on this topic elsewhere, and something I keep seeing is a statement along the lines of “it’s based on an algorithmic check, there’s no human decision, so there can’t be a bias.” WTH? [searching for smacks-head-against-brick-wall emoji]

Yeah, algorithmic bias is a serious problem, and one that people need to keep in mind. Sometimes it’s poor programming in the algorithm itself, sometimes the training set is biased, and sometimes the algorithm learns “badly” from the way biased people interact with it.
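
To make the biased-training-data case concrete, here’s a toy sketch in Python (mine, not anything Apple or Goldman Sachs has published; the feature and weights are invented). A score that never sees gender can still penalize one spouse if a feature like “years as a primary accountholder” reflects decades of couples opening every card in the husband’s name:

```python
# Toy illustration of proxy bias: the scoring rule has no gender input,
# but one of its features was shaped by biased history, so the bias
# leaks through anyway. All numbers are invented.

applicants = [
    # Historically, many couples put every card in the husband's name,
    # leaving the wife as an authorized user with no "primary" history.
    {"name": "spouse_a", "income": 90_000, "years_as_primary": 25},
    {"name": "spouse_b", "income": 90_000, "years_as_primary": 0},
]

def score(applicant):
    # Gender never appears in the formula, yet the outcomes differ.
    return 0.5 * (applicant["income"] / 1_000) + 2 * applicant["years_as_primary"]

for a in applicants:
    print(a["name"], score(a))
# spouse_a 95.0
# spouse_b 45.0
```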

Yeah, algorithmic bias is a serious problem

But is this really an algorithmic bias problem? People were quick to blame the computers here, but from the statements I’ve seen from Goldman Sachs (grain of salt), it sounds like the issue is that they only evaluate each person individually, never couples as a unit. And because some couples combine their finances, one person in the couple may not have the credit history or income of the other, depending on the situation.

If this is true, then we should see situations where the female breadwinner of a couple gets better terms than her husband, but of course that news doesn’t go viral.

Ideally, Goldman’s planned support for shared cards would solve this, but we’ll have to wait and see how long that takes to arrive.

Indeed, I don’t know whether this actually was a case of algorithmic bias. I was just supporting @Simon’s irritation with the erroneous claim that because the process was algorithmic, it couldn’t be biased.

The article that Josh linked to sort of suggested that the problem was related to an algorithm, and it must be at some level, given that the approval process is happening automatically, without human intervention. It may just be a poorly designed algorithm, in the sense of evaluating every individual separately rather than taking into account family financial situations.

if the Apple Card had simply offered joint accounts from the beginning, like nearly every other credit card

I’m not sure this is entirely accurate: Chase is the US’s largest credit card issuer, and it stopped offering joint accounts in 2013. Of the 7 largest issuers, only Bank of America and Wells Fargo offer true joint accounts, and together they make up only 16.8% of the US credit card market. All the other large issuers offer only “authorized user” cards in addition to the primary accountholder’s card. Goldman Sachs’s primary-accountholder-only approach is pretty much industry-standard practice at this point.

1 Like

David might be on to something here. The algorithm might be assuming that the “additional user” has a lower credit score. The Woz got caught by this: his wife was given a lower credit line than he was, even though her credit score is HIGHER than his! Of course, if they both applied individually as primary users, then something is definitely wrong in the state of Denmark!

Interesting! When Josh wrote “joint accounts,” I didn’t really think about the difference between a true joint account and one with additional authorized user cards. For Apple Card, I’m not sure there’s any real difference, though—the point is that a couple can easily get a pair of cards on the same account.

If Apple Card had allowed authorized users like nearly every other credit card, I don’t think the gender bias question would ever have come up, would it?

I don’t know that the news reports give enough detail about the cases of apparent bias to say confidently whether there was bias or not.

What if the algorithm considers a person’s address, their claimed assets, their income, and so on? No mention of gender there. Suppose it grants the person $X of credit based on the claimed assets and $Y of credit based on income (and other factors).

Then along comes another application: same address, similar (or identical) assets, some income (maybe more, maybe less than on the previous application), and so on. Suppose the algorithm decides the similar claimed assets in the two applications are in fact the same assets, but that the income on the second application is separate. What should it do? It has already extended the first applicant $X of credit against those assets. Should it extend the second applicant another $X against the same assets? Presumably not! So it ignores the assets on the second application and grants credit based on income (and other factors, if any) alone. That could make the credit lines of the two applicants very different.
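
To make that concrete, here is a minimal sketch in Python. It is entirely hypothetical: the fields, the $1,000 similarity threshold, and the weights are my inventions, not anything Goldman Sachs has disclosed. It just shows how a rule that refuses to count the same household assets twice could hand two otherwise identical applicants very different limits, based purely on who applied first:

```python
# Hypothetical sketch of the "shared assets counted only once" theory.
# Nothing here reflects Goldman Sachs's actual underwriting; the factors,
# weights, and threshold are invented purely for illustration.

seen_assets = {}  # address -> assets already credited to an earlier applicant

def credit_limit(application):
    """Set a limit from income plus any assets not already counted
    for a previous applicant at the same address."""
    address = application["address"]
    assets = application["assets"]
    income = application["income"]

    if address in seen_assets and abs(seen_assets[address] - assets) < 1_000:
        # A prior application at this address claimed (roughly) the same
        # assets; treat them as shared and don't count them again.
        assets = 0
    else:
        seen_assets[address] = assets

    # Toy formula: no gender input anywhere, yet application order alone
    # changes the outcome dramatically.
    return 0.10 * assets + 0.20 * income

# First spouse applies, then the second with identical household data:
first = {"address": "1 Main St", "assets": 500_000, "income": 80_000}
second = {"address": "1 Main St", "assets": 500_000, "income": 80_000}

print(credit_limit(first))   # 66000.0 -- assets counted
print(credit_limit(second))  # 16000.0 -- same data, assets ignored
```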

Does anyone have any reason to believe that this is not happening?

We have the opposite issue: I have a $10K limit and she has a $15K one. I applied first, and she applied a few days later with the same information. We are both retired.
David