As I mentioned in my last post, I would love to buy an iPhone, iPad, iWhatever, but Apple’s human rights record in China is making it difficult.
Now it seems that Apple’s record on women is adding to the complications. Megan Carpentier reported a gender-biased, political bent in Siri, Apple’s artificial intelligence software designed to serve as a personal assistant. When asked for places to find an abortion, Carpentier notes, Siri points users not to Planned Parenthood but to anti-abortion centers, and, most surprisingly, in New York, Siri comes up with no help at all:
“Ask in New York City, and Siri will tell you ‘I didn’t find any abortion clinics.’
It’s an experience that’s being replicated by women around the country: despite plentiful online information about actual places to get an abortion, Siri doesn’t seem to provide it. It’s a similar experience for women seeking emergency contraception: in New York City, Siri doesn’t know what Plan B is and, asked for emergency contraception, offers up a Google results page of definitions.” (Megan Carpentier, “10 things the iPhone Siri will help you get instead of an abortion,” The Raw Story, November 29, 2011)
As Carpentier points out, Siri is full of knowledge about strip clubs and Viagra, which led me to wonder about the male-bias in Siri’s programming.
How many women are there at the top of Apple? And what percentage of its engineers are female, relative to the country as a whole?
No women are at the top at Apple. None. Zip, zero, zilch. Now, granted, the competition is not doing much better.
David Zielenziger of the International Business Times reports that the numbers for women in senior management are 10:12 at IBM and 3:12 at Texas Instruments (David Zielenziger, “Tech CEOs in 2011: Where Are the Women?,” International Business Times, September 2, 2011).
But, nevertheless, that is some female representation–at least a few voices, not speaking in isolation, who might be able to raise the question of why software designed to serve as a personal assistant should give answers as neutral as possible regarding abortion information, or why the same software should not be over-programmed to supply information about strip clubs and escort services.
I have been unable to find the percentage of Apple engineers who are women, but I am mighty curious.
Christopher Drew reported that women earn only 17 to 18 percent of bachelor’s degrees in engineering and computer science, and about 40 percent in math, whereas “about 58 percent of all bachelor’s, master’s and doctorates in biology are awarded to women” (Christopher Drew, “Where the Women Are: Biology,” NYTimes.com, November 4, 2011).
I could go on about a lot of topics here (how Apple is dominated by male engineers, how there aren’t enough women coming through the educational pipeline, how gender bias is still reflected in the “hard” sciences and what this means for the study of “empirical” knowledge and orientations to fact, and knowing across the board not only in science but in other humanities disciplines all the way to preschool and educational pedagogy).
But I’ll just quickly conclude by drawing on my friend Lyotard. Lyotard addresses the question of institutions in relation to language games and technology. [Jean-François Lyotard, The Postmodern Condition: A Report on Knowledge, trans. Geoff Bennington and Brian Massumi (Minneapolis: University of Minnesota Press, 1984). Originally published in French as La Condition postmoderne: rapport sur le savoir (1979).]
Lyotard argues against the idea of legitimation by consensus through dialogue (discourse) put forward by Habermas, because this concept of dialogue presumes a universal subject that desires the same thing and speaks a universal language.
Instead, Lyotard argues that language games (here he draws on the work of the philosopher Wittgenstein) are the key to social legitimation. Wittgenstein argues that the rules of language games are always changing. From here, Lyotard argues that access to data and information storage is a particularly postmodern problem.
He argues that master narratives where life, knowledge, and truth are united (such as Hegel’s “Life of Spirit”) have been declining since World War II as a result of technology, and in their place are smaller narratives, particularly in the sciences, where the university produces new skills and knowledge for consumption in the market.
For Lyotard, writing, storage (in libraries, museums, and universities), and microstorage (in today’s terms, say, on a personal computer, in the cloud, on a USB stick, or on an iPhone–or, micro-micro storage, what information Siri gives you at what point in time) are all questions of proportion (xii). Similarly, issues of ownership and control over data and storage are political issues.
As Lyotard points out, I think correctly, technology rather than industry will provide the leading questions of access, data, politics, and class. (Just look at the News of the World hacking scandal, or how information released through WikiLeaks fed into the Arab Spring, or who owns your data on Facebook, or who tracks your movements based on cell phone usage and who owns that technology, or how politicians are using data for campaigns…or…or…or…)
Who has the rights to this information? Who is legitimizing its use? Who has the technical know-how to distribute the information?
As Lyotard wrote decades ago, the questions that technology raises are questions of users, control, and access: “Who will have access to them? Who will determine which channels or data are forbidden? The State? Or will the State simply be one user among others?” (6). And what Lyotard refers to as a “transformation in the nature of knowledge” changes relationships among public power, civil society, world markets, and economic competition (6).
These changed relationships among power and technology are ultimately problems of legitimation (or, what society will accept as a norm) regarding the status of knowledge.
And this is precisely why I am worried over Siri’s clearly biased answers. Are we going to accept this as a norm?
What worries me about Apple’s dominance, and yes, monopolization of the market, is that technology is becoming a norm, but the modes of access and the definitions of those norms are increasingly set by a small few–and, as Siri’s badness shows, these norms are interested and biased.
Tim Cook! You, again! I’m talking to you!