A company develops recruitment software to discover suitable talent - but the algorithm has a catch: it screens out women. One of the biggest tech companies in the world made headlines with exactly this scandal in 2018. Amazon shut the software down as a result, yet the issue remains explosive.

Facebook algorithm keeps out female lorry drivers

Women face technological discrimination - and it begins even before they apply. An experiment by AlgorithmWatch shows that discriminatory "gender targeting" starts with the job ad itself. The NGO used Facebook to place job advertisements for lorry drivers and educators, deliberately wording both postings in masculine terms and narrowing the target group down solely by geographical criteria. Despite this, Facebook delivered the ads along gender-stereotypical lines: the audience that saw the lorry driver ad was only 7 percent female, while the audience for the educator ad was 96 percent female.

Angela Müller heads the policy team at AlgorithmWatch, which regularly conducts such research projects. In doing so, the NGO takes on the tech giants - and has repeatedly come under pressure to cancel research projects. "I ask myself what it means for a society when a large part of recruitment processes take place via such platforms and we are not even allowed to know how they actually work," says Angela Müller.

Pregnancy becomes a knock-out criterion

Automation has permeated almost every step of the recruitment process in recent years - whether it is checking documents and qualifications, shortlisting applicants, or analysing interviews. According to a new study by Harvard Business School, the global market for recruitment technology will grow to 3.1 billion US dollars by 2025 - double its 2017 size. The exploding supply of digital recruitment aids is driven above all by large companies and their push for greater efficiency.

HR software meets this need only superficially: the supposed efficiency comes at a price. Recruitment tools do allow employers to quickly filter and weed out applicants based on certain criteria. However, Harvard Business School estimates that these machines have screened out over 27 million qualified workers in the US alone - and women are particularly affected. A majority of the companies surveyed sort out CVs with gaps of more than six months. Since the algorithms are not interested in the reason for these gaps, longer maternity absences or breaks for care work become a knock-out criterion, and women end up among the groups that algorithms particularly often overlook as qualified workers.
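To make the mechanism concrete, here is a minimal sketch of such a knock-out filter, assuming the six-month threshold the study describes; the candidate records, field names, and helper function are purely illustrative:

```python
from datetime import date

# Hypothetical candidate records; each employment entry is a (start, end) pair.
candidates = [
    {"name": "A", "jobs": [(date(2015, 1, 1), date(2019, 6, 30)),
                           (date(2021, 2, 1), date(2023, 12, 31))]},  # ~19-month gap (e.g. parental leave)
    {"name": "B", "jobs": [(date(2016, 3, 1), date(2020, 2, 28)),
                           (date(2020, 4, 1), date(2023, 12, 31))]},  # ~1-month gap
]

MAX_GAP_DAYS = 6 * 30  # the six-month knock-out threshold described in the article

def longest_gap_days(jobs):
    """Return the longest gap in days between consecutive jobs."""
    jobs = sorted(jobs)
    return max(((start - prev_end).days
                for (_, prev_end), (start, _) in zip(jobs, jobs[1:])),
               default=0)

# The filter never asks *why* a gap exists - maternity leave and care work
# are rejected just like any other absence.
shortlist = [c["name"] for c in candidates
             if longest_gap_days(c["jobs"]) <= MAX_GAP_DAYS]
print(shortlist)  # ['B'] - candidate A is screened out regardless of qualification
```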

The transparent applicant

In Switzerland, too, automated tools are increasingly being used to recruit employees - yet applicants are often not aware of this, says Angela Müller. This carries an enormous risk of widening the power imbalance between employers and employees: job seekers who do not know such tools are in use cannot defend themselves against them.

Angela Müller
Systems do not create prejudices, but reproduce those of our society.

The expert emphasises that she does not fundamentally advise against HR tech: "Systems do not create prejudices, but adopt and reproduce those of our society and our data." Accordingly, she does not see the solution to the problem at the technological level. Rather, the first step is to create transparency. She demands: "Companies must really understand their systems. It is also their responsibility to disclose where they are used." That means companies have to hire people with the necessary expertise or train existing employees accordingly.

Diverse teams for inclusive technologies

"Diversity brings experience and different perspectives," says Nadia Fischer. She has already worked for a wide variety of tech companies - but the teams were anything but diverse. The employees were mostly male, white, and in their early thirties. "Such teams develop digital solutions that determine almost all of our lives. I find that very problematic," she says.

Nadia Fischer uses the example of HireVue to show how problematic homogeneous development teams in tech companies can be. The tool analyses the facial expressions, gestures, and speech of job applicants during interviews for companies such as Hilton and Unilever. The problem: the software cannot correctly interpret the expressions of women or non-white people and therefore rates them worse. The tech expert is convinced that more diverse development teams could reduce such biases.
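A bias of this kind would surface in a simple disparity audit of the tool's output. The scores below are invented purely to illustrate the check, not real HireVue data:

```python
from statistics import mean

# Hypothetical interview scores produced by a video-analysis tool,
# grouped by demographic - invented numbers, for illustration only.
scores = {
    "white men":        [0.82, 0.79, 0.88, 0.75],
    "women":            [0.61, 0.58, 0.70, 0.64],
    "non-white people": [0.63, 0.66, 0.59, 0.60],
}

baseline = mean(scores["white men"])
for group, vals in scores.items():
    gap = mean(vals) - baseline
    print(f"{group}: mean score {mean(vals):.2f} ({gap:+.2f} vs. baseline)")
```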

Based on this conviction, Nadia Fischer founded her own company, Witty Works, three years ago. The Zurich-based start-up supports companies in making their communication more inclusive - especially when it comes to job advertisements, which are often unconsciously male-oriented. "Companies often emphasise their competitive and performance-oriented environment," says the founder, "something that doesn't appeal to women at all because they have been socialised to cooperate." Witty Works' tool recognises such unconscious stereotypes in language and suggests neutral alternatives.
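In principle, such a wording check can start out as little more than a lexicon lookup. The sketch below is a deliberately tiny illustration of the idea; the coded terms and suggested alternatives are made-up examples, not Witty Works' actual rule set:

```python
import re

# Illustrative (hypothetical) lexicon of masculine-coded terms with
# more neutral alternatives; a real tool relies on a research-based
# rule set far larger than this.
CODED_TERMS = {
    "competitive": "collaborative",
    "dominant": "leading",
    "rockstar": "skilled specialist",
    "aggressive": "ambitious",
}

def check_ad(text):
    """Flag masculine-coded words in a job ad and suggest alternatives."""
    findings = []
    for term, alternative in CODED_TERMS.items():
        if re.search(rf"\b{term}\b", text, flags=re.IGNORECASE):
            findings.append((term, alternative))
    return findings

ad = "We offer a competitive, performance-driven environment for a true rockstar."
for term, alt in check_ad(ad):
    print(f"'{term}' is masculine-coded - consider '{alt}'")
```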

Nadia Fischer
If the algorithm is to learn on its own, it has to build on a data basis free of bias

Bad data, stupid tools

Witty Works' software now also works with artificial intelligence. What is unusual is that the Witty Works team developed the data basis themselves: drawing on studies of inclusive language, they defined rules for the algorithm. Only afterward, once all these rules had been created and tested, was the tool strengthened with machine learning. Nadia Fischer emphasises: "If the algorithm is to learn on its own, it has to build on a data basis free of bias."
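A minimal sketch of this rules-first workflow, under the assumption that hand-written rules label a small curated corpus before any model is trained; the phrases, the toy rule, and the model choice are illustrative, not Witty Works' actual pipeline:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Step 1: hand-written, research-derived rules label the corpus first
# (reduced to a single toy rule here; a real rule set is far larger
# and is tested before any model training happens).
CODED = {"competitive", "dominant", "aggressive", "rockstar"}

def rule_label(phrase):
    """1 = contains masculine-coded wording, 0 = neutral."""
    return int(any(word in CODED for word in phrase.lower().split()))

phrases = [
    "competitive high-pressure environment",
    "supportive team that values cooperation",
    "dominant market player seeks aggressive closers",
    "flexible hours and a collaborative culture",
]
labels = [rule_label(p) for p in phrases]  # curated labels, not raw web data

# Step 2: only after the rules are validated is machine learning layered
# on top, so the model learns from a bias-checked data basis.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(phrases)
model = LogisticRegression().fit(X, labels)

# The trained model can now flag unseen phrasing similar to the rules.
new = vectorizer.transform(["join our competitive sales floor"])
print(model.predict(new))  # expected: [1] - flagged as masculine-coded
```

The point of the ordering is that the model only ever sees labels that humans produced and checked, rather than raw web text with its inherited biases.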

The Witty Works example illustrates that algorithms do not discriminate per se. Rather, the people behind them, as well as the data, are decisive. If systems are fed biased data, they reinforce and multiply discrimination in the real world.