Algorithms may seem abstract, but they have long permeated every corner of human society. When our lives can no longer be decoupled from algorithms, what should a society that relies so heavily on them do if those algorithms turn out to be neither objective nor fair?

Facebook's parent company Meta recently faced four complaints from European human rights groups alleging that the algorithm running its recruitment ads is sexist. Ironically, Meta had already paid $5 million over alleged discrimination in housing and credit ads on its platform.

Meta, which had promised to eliminate algorithmic discrimination, is once again at the center of controversy, and people cannot help but wonder: how can we enjoy the benefits of algorithms while keeping gender discrimination from seeping in?


Photo by cottonbro studio on Pexels

Meta accused of reproducing gender stereotypes through its algorithm

To find out whether sexism lurks in Facebook's algorithm, the international nonprofit Global Witness ran a series of recruitment ads in France and the Netherlands from February to April, without specifying the gender of the target audience.

Global Witness asked only that clicks on the ad links be maximized, leaving it to Facebook's algorithm to decide who would ultimately see the ads.

From data obtained through Facebook's Ads Manager platform, Global Witness found that the ads Facebook recommended to different users were riddled with gender stereotypes. In the Netherlands, for example, 85% of the Facebook users shown ads for teaching vacancies were women, as were 96% of those shown ads for waitstaff vacancies, yet women made up only 4% of those shown pilot vacancies.

The French study produced similar results: women accounted for 93% of those shown teacher vacancies, 86% of those shown psychologist vacancies, and only 6% of those shown pilot vacancies.
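The metric behind these figures is simply the share of women among the users each ad reached. As a rough illustration only (this is not Global Witness's actual tooling, and the reach counts below are hypothetical, chosen to reproduce the reported percentages), the calculation looks like this:

```python
# Hypothetical sketch of the skew metric used in the study:
# the percentage of an ad's delivered audience that was women,
# computed from reach counts reported by an ads dashboard.

def female_share(women_reached: int, total_reached: int) -> float:
    """Return the percentage of an ad's audience that was women."""
    return round(100 * women_reached / total_reached, 1)

# Made-up reach counts matching the published Netherlands figures
print(female_share(850, 1000))  # teacher ads -> 85.0
print(female_share(40, 1000))   # pilot ads  -> 4.0
```

If delivery were gender-neutral, these shares would hover near the platform's overall gender split rather than swinging between 4% and 96% by job title.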


Photo by Pixabay on Pexels

With similar findings in Europe, Asia, and elsewhere, Global Witness cannot help but suspect that Facebook's algorithmic bias has become a global problem.

To prevent this bias from exacerbating workplace inequality and pay gaps, three nonprofit organizations, Global Witness, Bureau Clara Wichmann, and Fondation des Femmes, have filed complaints with human rights and data protection authorities over the gender stereotypes in Meta's algorithm. Meta could ultimately face fines or pressure to change the algorithm.


Meta has introduced new algorithms, but can they reduce bias?

Pat de Brún, head of Amnesty International's Big Tech accountability team, also voiced concern about algorithmic bias, arguing that Facebook, as a forum for public discourse, should not allow algorithms to reproduce and amplify injustices that already exist in society.

Low-income households in particular often rely on Facebook to find jobs; if the algorithm is biased, those who are already marginalized will bear the brunt.

Meta spokesperson Ashley Settle said that advertisers are not allowed to select an audience by gender when posting employment, housing, and credit ads on Meta, and that these audience-targeting restrictions have been implemented in more than 40 countries in Europe, as well as in the United States and Canada.

Settle added that Meta will continue to work with academics, human rights groups, and other experts to find the best solutions for algorithmic fairness, and that earlier this year Facebook also launched new machine learning technology designed to reduce bias in ad delivery.


The question of how to enjoy the benefits of algorithms without recreating gender stereotypes remains unsolved. Still, this controversy can remind us to pay more conscious attention to the biases hidden in everyday life, rather than remaining passive recipients of information.