The deeper issue behind Apple Card's alleged sexism
Priya Wadhwa
10X Technology

A blame game over the black box.

Apple Card, one of the most successful launches for Apple and Goldman Sachs as they entered new realms of finance, is now facing trouble over allegations of sexism in the credit limits assigned to individual customers.

Following a viral tweet by David Heinemeier Hansson, the Danish programmer known for creating the popular Ruby on Rails web development framework, a number of people, including Apple co-founder Steve Wozniak, have come forward to say that they received credit limits many times higher than those given to their spouses.

This was reportedly the case even when applicants did not provide income-related data, and even when the couples shared financial accounts and the women held better credit scores.

The New York Department of Financial Services has since launched an investigation into the bank’s practices to determine whether they violate New York law.

A spokesman for Linda Lacewell, the Superintendent of the New York Department of Financial Services, said in a statement: “The department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex. Any algorithm that, intentionally or not, results in discriminatory treatment of women or any other protected class of people violates New York law.”

However, the waters are murky: the credit limit given to each person is determined by a “black box” algorithm, which was developed by Apple, while the credit decisions themselves are the responsibility of Goldman Sachs.

The larger issue is algorithmic bias, which is usually not the fault of the algorithm’s code itself, but of the data fed into the system and where that data came from. The data might not have come from Goldman Sachs’ own records, as Apple Card is the bank’s first venture into retail banking.

The question, then, is whether the gender discrimination reflects a genuine country-wide pattern of lower credit limits for women, or simply a training data set that misrepresents society. For example, if the data set over-represents women with lower incomes and worse credit scores, a model trained on it is more likely to assign lower credit limits to women.

The algorithm’s bias based on gender, if that is truly the case, might be a symptom of a data set that does not account for enough data points, or one that is itself biased, even if unintentionally so.
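To make that concrete, here is a minimal sketch in Python of how a model fitted to historically skewed lending records reproduces the skew in its own decisions. Everything here is hypothetical: the data, the features, and the 20% historical gap are invented for illustration, and nothing reflects Apple’s or Goldman Sachs’ actual model.

```python
# A minimal sketch (entirely made-up data) of how a model trained on
# historically biased credit records reproduces that bias in its outputs.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Applicants: identical credit-score distribution for men and women.
credit_score = rng.normal(700, 50, n)
is_female = rng.integers(0, 2, n)  # 0 = male, 1 = female

# Historical limits: same scores, but past decisions gave women ~20% less.
historical_limit = 40 * (credit_score - 500) * np.where(is_female, 0.8, 1.0)

# "Train" a plain least-squares model on those historical decisions.
X = np.column_stack([credit_score, is_female, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, historical_limit, rcond=None)

# Ask the model about two applicants with the same 720 credit score.
man = np.array([720, 0, 1]) @ coef
woman = np.array([720, 1, 1]) @ coef
print(f"man: ${man:,.0f}   woman: ${woman:,.0f}")  # the gap is learned, not coded
```

Note that simply dropping the gender column would not by itself fix this: any feature correlated with gender can act as a proxy and let the model learn the same gap, which is part of what makes auditing a black-box system so difficult.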

This is also not the first time the New York Department of Financial Services has launched an investigation into algorithmic bias. Bloomberg reported that this is the second such case, the first being an investigation into UnitedHealth Group Inc. “after a study found an algorithm favored white patients over black patients.”

Artificial intelligence and algorithms have the power to remove discrimination from the system, but this can only happen when the data set reflects a system or society that is itself unbiased. This is the biggest challenge that AI faces today, as humans inherently, even if only subconsciously, carry bias.