Source: The UNESCO Courier. 2020, no. 3. © UNESCO 2020. ISSN 2220-2285 • e-ISSN 2220-2293. Periodical available in Open Access under the Attribution-ShareAlike 3.0 IGO (CC-BY-SA 3.0 IGO) licence.
_____________________________________________________________________________
“We must educate algorithms”
Flora Vincent: Scientific teams lack diversity – the phenomenon is well known. What is less well known is that this has consequences for how research is conducted and which subjects are given priority. The American science historian Londa Schiebinger has been working on this topic recently. She shows that the more women there are on a team, the more likely it is that gender will be taken into account in the study itself.
There are many examples of this discrimination in research. One example: drugs are tested more often on male rats, because they have fewer hormonal fluctuations, which is considered to make side effects easier to measure. Another example: crash tests use standard dummies, 1.70 metres tall and seventy kilograms, modelled on the average size and build of a man. As a result, the seatbelt does not take certain situations into account – that of pregnant women, for example.
Has computer science been a male-dominated discipline from the outset?
Aude Bernheim: No, that was not always the case. In the early twentieth century, computer science was a discipline that required a lot of rather tedious calculations. At the time, these were often done by women. When the first computers came along, women led the way. The work was not seen as prestigious at the time. As recently as 1984, thirty-seven per cent of those employed in the computer industry in the United States were women. By comparison, in France in 2018, only ten per cent of students in computer science courses were women; it is estimated that only twelve per cent of students in the AI sector are women. In fact, a significant change took place in the 1980s, with the emergence of the personal computer. From then on, computer technology acquired unprecedented economic importance. The recreational dimension of computers also emerged in those years, developing a very masculine cultural imagery around the figure of the geek. This dual trend was accompanied by the marginalization of women. It shows that boys’ affinity for computers is not natural, but, above all, cultural and constructed.
One might think that algorithms are neutral by nature. To what extent do they contribute to reproducing gender bias?
Bernheim: Some whistleblowers realized quite quickly that algorithms were biased. They found, for example, that translation software [into French, which has masculine and feminine nouns] tended to assign professions a gender, translating the English “the doctor” as “le docteur” (masculine) and “the nurse” as “l’infirmière” (feminine). When voice assistants appeared – whether Alexa, Siri, or Cortana – they all had feminine names and responded to orders in a rather submissive manner, even when they were insulted (see box).
In 2016, Joy Buolamwini, an African-American researcher at the Massachusetts Institute of Technology (MIT), became interested in facial recognition algorithms. Her work showed that these systems were trained on databases consisting mostly of photos of white men. As a result, they were much less effective at recognizing black women or Asian men than white men. You can imagine that if she had been part of the team developing these algorithms, the situation would have been different.
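The kind of audit Buolamwini carried out comes down to reporting a model’s accuracy separately for each demographic group, rather than as a single overall figure. Here is a minimal sketch in Python of that idea; the records, group labels and field names are invented for illustration, not drawn from her study.

```python
# Minimal sketch of a disaggregated evaluation: instead of one overall
# accuracy figure, compute accuracy per demographic subgroup.
# All records below are hypothetical placeholders.
from collections import defaultdict

def accuracy_by_group(records):
    """records: dicts with 'group', 'predicted' and 'actual' keys."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for r in records:
        total[r["group"]] += 1
        correct[r["group"]] += int(r["predicted"] == r["actual"])
    return {group: correct[group] / total[group] for group in total}

records = [
    {"group": "lighter-skinned men", "predicted": "male", "actual": "male"},
    {"group": "lighter-skinned men", "predicted": "male", "actual": "male"},
    {"group": "darker-skinned women", "predicted": "male", "actual": "female"},
    {"group": "darker-skinned women", "predicted": "female", "actual": "female"},
]
print(accuracy_by_group(records))
# A wide gap between the groups' scores is exactly the bias described above.
```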
Vincent: Coding an algorithm is like writing a text. There’s a certain amount of subjectivity that manifests itself in the choice of words and the turns of phrase – even if we have the impression that we are writing a very factual text. To identify the biases, our approach consisted of dissecting the different stages of what we call “sexist contagion”. That’s because there isn’t a single cause that creates a biased algorithm; rather, it is the result of a chain of causes intervening at the different stages of its construction.
In effect, if the people who code, test, control and use an algorithm are not aware of these potential biases, they reproduce them. In the vast majority of cases, there is no deliberate intention to discriminate. More often than not, we simply reproduce unconscious stereotypes forged over the course of our lives and education.
Is there an awareness of the bias in certain AI products today?
Bernheim: AI is a field where everything is evolving very quickly – the technology itself, but also the thinking about its use. Compared to other disciplines, the problem of discrimination emerged very early on. Barely three years after the onset of algorithm fever, whistleblowers started drawing attention to the differentiated treatment produced by certain algorithms. This is already a subject in its own right in the scientific community. It fuels many debates and has led to research work on the detection of bias and the implications of algorithms from an ethical, mathematical and computer science point of view. This awareness has also recently been reflected in the mainstream media. Not all the problems have been solved, but they have been identified – and once they have been, solutions can be implemented.
_____________________________________________________
Voice assistants: Apps that reinforce gender bias
“I’d blush if I could”: for years, this is how Siri, Apple’s voice-activated assistant, reacted when a gendered insult was hurled at her. This incongruous response inspired the title of I’d Blush if I Could, a UNESCO publication that examines the impact of gender bias on the most common artificial intelligence (AI) applications, like voice assistants.
Most voice assistants, like Siri, Amazon’s Alexa, and Microsoft’s Cortana, have women’s names and voices, and a docile “personality”. These machines that have invaded our daily lives express a submissive style that illustrates the gender bias in some AI applications.
This is not surprising, given that the tech teams developing these cutting-edge technologies are made up mainly of men. Globally, only twelve per cent of AI researchers today are women. They make up only six per cent of software developers and file thirteen times fewer patents in information and communication technologies (ICTs) than their male colleagues. To overcome these prejudices, the UNESCO publication makes a series of recommendations. In particular, it recommends ending the practice of giving voice assistants a female voice by default, and programming them to discourage sexist insults. The publication also stresses the need to give girls and women the technical skills to develop new technologies on an equal footing with men.
In this area, statistics sometimes defy conventional wisdom. The countries closest to achieving gender equality, particularly in Europe, have the lowest rates of women employed in the technology sector. In contrast, some countries with low levels of gender equality have high percentages of women graduates in new technologies.
In Belgium, for example, only six per cent of graduates in ICTs are women, while in the United Arab Emirates, fifty-eight per cent are women. Hence the need – insist the authors of the publication – to adopt measures to encourage the presence of women in digital education everywhere.
Launched in May 2019, I’d Blush if I Could was produced in collaboration with Germany’s Federal Ministry for Economic Cooperation and Development and the EQUALS Skills Coalition, a global partnership of governments and organizations that promotes gender balance in the technology sector.
__________________________________________________________
How can algorithms be made more egalitarian?
Bernheim: To begin with, we must act at the level of databases, so that they are representative of the population in all its diversity. Some companies are already doing this and are working on databases that take into account differences in gender, nationality or morphology. As a result of work published on the shortcomings of facial recognition software, some companies have retrained their algorithms to be more inclusive. Companies have also emerged that specialize in developing tools to evaluate algorithms and determine whether they are biased.
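One simple measure such evaluation tools can compute is the gap in positive-decision rates between two groups, sometimes called the demographic-parity gap. The sketch below is a toy illustration of that single metric, assuming binary decisions; the data and the 0.1 alert threshold are invented for the example.

```python
# Toy bias audit: compare the rate of positive decisions
# (e.g. "application approved" = 1) an algorithm produces for two groups.
def positive_rate(decisions):
    return sum(decisions) / len(decisions)

def demographic_parity_gap(group_a, group_b):
    """Absolute difference in positive-decision rates between two groups."""
    return abs(positive_rate(group_a) - positive_rate(group_b))

decisions_for_men = [1, 1, 0, 1, 1, 0, 1, 1]    # hypothetical outputs
decisions_for_women = [1, 0, 0, 1, 0, 0, 1, 0]  # hypothetical outputs

gap = demographic_parity_gap(decisions_for_men, decisions_for_women)
print(f"parity gap: {gap:.2f}")  # 0.00 would mean identical rates
if gap > 0.1:  # alert threshold chosen arbitrarily for the example
    print("warning: the algorithm treats the two groups very differently")
```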
Vincent: At the same time, in the scientific and research community, there has been reflection on how to implement more independent evaluation, and on the need for algorithmic transparency. Some experts, such as Buolamwini, advocate the development and generalization of inclusive code, just as there is inclusive writing.
Among existing initiatives, we should also mention the work done by the collective Data for Good, which is thinking about ways to make algorithms serve the general interest. This collective has drafted an ethical charter called the Hippocratic Oath for Data Scientists, establishing a list of very concrete parameters to be checked before implementing an algorithm, to ensure it isn’t discriminatory. It is important to support this type of initiative.
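In the same spirit, a charter of this kind can be turned into an automated check that runs before deployment. The sketch below only illustrates the principle – the checklist items are invented and are not the actual text of the Data for Good oath.

```python
# Hypothetical pre-deployment checklist in the spirit of an ethical
# charter: every item must pass before the algorithm goes live.
# These items are invented examples, not the oath's actual wording.
CHECKLIST = {
    "training data audited for representativeness": True,
    "error rates compared across demographic groups": True,
    "decisions can be explained to the people affected": False,
    "human review available for contested decisions": True,
}

def ready_to_deploy(checklist):
    failed = [item for item, passed in checklist.items() if not passed]
    for item in failed:
        print(f"FAILED: {item}")
    return not failed

if not ready_to_deploy(CHECKLIST):
    print("Deployment blocked until every check passes.")
```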
Could AI eventually become an example of how biases can be combated?
Bernheim: In a sense, yes, to the extent that we became aware fairly quickly of the biases these new technologies could induce. AI is in the process of revolutionizing our societies, so it can also make things evolve in a positive way. AI makes it possible to manage and analyze very large amounts of data. It enabled Google, in particular, to create an algorithm in 2016 to quantify the speaking time of women in major American film productions and show their under-representation. At the same time, the teams developing algorithms also need to become more gender-balanced. Today, however, for a number of reasons – including girls’ self-censorship when it comes to scientific fields, and the sexism that reigns in high-tech companies – very few women study computer science. It will take time to reverse this trend.
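The measurement Bernheim describes reduces, once each stretch of dialogue has been labelled, to a simple aggregation. A minimal sketch with entirely made-up segment data (the actual 2016 study relied on automated audio and video analysis at scale):

```python
# Sketch: total up speaking time per gender from labelled dialogue
# segments and print each gender's share. The data is invented.
segments = [
    ("female", 12.5), ("male", 48.0), ("male", 30.5),
    ("female", 9.0), ("male", 55.0),
]  # (gender label, seconds of speech)

totals = {}
for gender, seconds in segments:
    totals[gender] = totals.get(gender, 0.0) + seconds

overall = sum(totals.values())
for gender, seconds in sorted(totals.items()):
    print(f"{gender}: {seconds:.1f}s ({100 * seconds / overall:.0f}%)")
```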
Vincent: Of course, the algorithms need to be educated, but changing a few lines of code will not be enough to solve the problems. We must bear in mind that there will be no willingness to code for equality if the teams involved do not include women.
______________________________________________________