Caroline Criado Perez is a social activist and journalist who successfully campaigned for British banknotes to feature the image of Jane Austen (the new £10 note entered circulation in 2017), after the Bank of England said it would be phasing out Elizabeth Fry’s portrait in favor of Winston Churchill. Criado Perez has also been a vocal critic of Twitter’s policies around abusive tweets, since she herself has been the target of severe Twitter harassment. And her Women’s Room database of female experts tries to ensure that more women are tapped as sources in the media.
In her new book Invisible Women, Criado Perez examines different elements of the modern world that appear to be designed with less consideration for women: Transportation systems, medical devices and treatments, tax structures, consumer products, even the smartphones and voice-recognition technologies we use every day. The 321-page book is a rapid-fire delivery of data sets, making it more of an academic tome than a light and hopeful read to take with you on summer vacation. But despite the occasional meandering, Invisible Women often arrives right back at the same seemingly inevitable conclusion: There exists a real gender data gap that is “both a cause and a consequence of the type of unthinking that conceives of humanity as almost exclusively male.”
Criado Perez spoke to WIRED about the book. The conversation has been edited for length and clarity and includes information provided in follow-up emails.
Lauren Goode: My first question is this: What was the moment for you that made you think, OK, this is the time for me to write this book? You’ve been observing and covering these issues for a very long time, but I’m wondering if there was something in particular that made you want to publish this book at this moment.
Caroline Criado Perez: I first came across the gender data gap in the world of medicine in 2014, when I was writing my first book. I was just so shocked that this was an issue in the 21st century, that doctors were misdiagnosing women because the symptoms of our heart attacks don’t conform to those of men. And that women were more likely to die and more likely to be misdiagnosed. Around that same time I also found out that we don’t tend to involve female humans or animals or cells in medical trials, and the result of that is women have less effective treatment and more side effects.
That was just really gobsmacking. So really it was that, and me not being able to get it out of my head. And because I knew it was happening there, I realized it was happening in other places. Since I’d studied behavioral and feminist economics at the London School of Economics, I already knew about the default male in that area, but I started discovering all of these other areas where it was popping up. The more I found out, the more I learned about data gaps in technology, and car safety design … and even data gaps in refugee policy. And so eventually it was just that I had so much information that the only way to cover it was to write a book.
LG: Can you talk specifically about the technology devices you highlight in the book, and how biased data sets have informed biased design? I always think about giant smartphones, because as a reviewer I often note that they just don’t fit in my hands all that well. But then in marketing, the companies might use professional athletes with giant hands holding the phones, so of course it seems small in comparison.
CCP: The category of smartphones is a massive bugbear of mine because I actually got RSI [repetitive strain injury] from an iPhone 6. And now I’m stuck with an iPhone SE, which I can’t upgrade. The only small phone they had, they discontinued, and it’s the only one that fits my hand. It’s incredibly frustrating. And then later when [Apple] introduced Siri, you could use it to find a viagra supplier but not an abortion clinic. So there’s all sorts of examples like that, where there’s not as much thought being put into, you know—female customers exist. Another example is VR headsets being too big.
But to me the most worrying examples are about algorithms rather than hardware. Because with hardware, it’s kind of easy to see how it is affecting us or not fitting us, and so it’s relatively easy to fix. What’s more concerning to me are algorithms being trained on highly biased male data sets, and the way these algorithms are being introduced in all sorts of areas of our lives. There doesn’t seem to be much understanding amongst the people who are coding these algorithms about the issues with the data they are training them on. That goes from voice recognition systems that don’t recognize female voices, to online dictionaries, to algorithms deciding whether a certain CV will ever reach human eyes.
And this is often proprietary software, so we don’t always get to see whether gender bias is being accounted for. So we’re outsourcing the future to private companies that are using biased data sets, and there’s no way of knowing what’s going on there.
LG: Transportation, and really more broadly city planning, is something else you cover quite a bit in the book. You point out that in some societies, women walk more than men, and that the way they lump trips and errands together—referred to as trip-chaining—and even their safety isn’t really considered. How do you fix something like that when the transportation systems are so firmly embedded?
CCP: There are a number of things that can be done. The obvious one is to move bus routes because, as you say, things like subways are fixed and it’s much more expensive to change them. When new lines are added and new stations are added, absolutely those things should be taken into consideration. But bus routes are very easy to change, and the thing about buses is that, in some places, women are much more likely to use them. That’s one easy way of addressing the male bias in transport infrastructure in relatively short order.
More long term, it really is about the design of cities themselves and looking again at zoning laws. One of the big problems with the way we’ve laid out cities is that they’ve been laid out in such a way to serve the needs of this mythical male breadwinner who has a wife home in the suburbs. This man drives to work and conceives of home as a place of leisure, so you don’t have as many services; you can just have a residential area. It’s this idea that you just go home and you sleep. And it’s completely untrue to how women and people live their lives. They’ve got to take kids to the doctor, to school, get groceries, check in on a relative … all the things we are doing on a daily basis require a lot of complicated logistics.
In some societies women are also less likely to have access to a car than a man; if a household has one car, men dominate access to it. So women use public transport, but the public transport hasn’t been designed for unpaid care work. The ridiculous thing about this as well is that by making it difficult for women to complete their unpaid care work, it makes it much harder for them to engage in their paid work. In the US, for example, female labor participation has been dropping behind other developed countries, and there’s a need in America for women to engage more in the paid labor force. But nothing is being done to help them do that in really very simple ways, enabling them to do the unpaid work that has to get done.
LG: When I think about bias in transport design, I think about this breastfeeding pod I saw last year in an airport. It’s this Zappos-sponsored pod in the middle of the airport terminal walkway for women to nurse in. The person I was traveling with at the time said something like, “Isn’t that an interesting idea that there are these pop-up mother’s rooms?” And my thought was, “Isn’t it terrible that adequate family rooms weren’t designed in the airport back when it was originally built?”
CCP: I sort of take it one step further and wonder why we have to lock women up in pods to feed their children. It seems bizarre. I’m not sure I see that as progress in any way, shape, or form. I can’t think of the word. I’m quite horrified by it … And I know obviously some women would want to use them, but also, if a woman wants to put a muslin over her baby that should be enough.
LG: In the book when you refer to your campaign to get the Bank of England to put a woman on its banknotes, you wrote something that comes up often in the book. You wrote, “No one meant to deliberately exclude women. It’s just what may seem objective is actually highly male-biased.” At what point though—especially now that we have access to more data sets—at what point does the ignorance of data become deliberate?
CCP: That’s a very good question, and it reminds me of a quotation someone sent to me on Twitter the other day. It was something about how ignorance or a refusal to know is an epistemological political project. This is something [feminist scholar] Nancy Tuana argues. I think that that’s such an interesting way of framing it. That’s not the way that I frame it exactly, because I do think that even when … how should I say this? So, I think there are two things.
First of all, a lot of the male bias we come across shows they just forgot to factor women in because it was a male-biased team and they just sort of forgot we exist. It happens all the time by accident. And then there is simply just not knowing what women’s needs are.
For example, I always think of Sheryl Sandberg going in to ask the head of Google to put in pregnancy parking, and he said, I never thought about it, of course. And she says he feels bad for never having thought about it. But that highlights the need for diversity. Because it’s perfectly normal for a guy who has never been pregnant, or indeed a woman who has never been pregnant, not to think about that. Of course, they could have been collecting data on the needs of women employees. But nevertheless, it wasn’t an act of malice.
The point where I start thinking about this as a political project is when you start getting to the excuses. One of the things I’m asked most about the book is, “What is the example that made you the most angry?” And I can’t really choose one. But the thing that does really make me angry, and never ceases to, is the excuses. At that point it’s not forgetting. It’s about excluding. For example, with car manufacturers, the decision was made in the EU to finally introduce a female crash-test dummy, and it’s just a scaled-down male dummy, and it’s only used in certain tests and in the passenger seat. How did that decision happen? That’s not forgetting; that’s a deliberate act.
LG: Do you see a world in which technology can actually help solve some of these problems?
CCP: Maybe. I think that certainly technology has historically helped women. It has lessened the amount of time that women have to spend doing certain things. One of the examples I talk about in the book is stoves. Most women in low-income countries still cook using the three-stone stove, which gives off incredibly toxic fumes. So the stoves we have in modern homes are absolutely incredible in terms of both the health burden and the time burden they reduce for women.
There is hope, though I don’t know what that technology will be because I’m not an inventor. But I suppose the answer is: it depends on who is going to be allowed to do the inventing. The large majority of VCs are men, and they are just much more likely to give funding to male entrepreneurs. And male entrepreneurs are much more likely to develop technology that helps men.
And that, again, is not a conspiracy. That’s just because you’re more likely to develop something that fixes a need you yourself have. Female entrepreneurs are more likely to develop tech that helps women. And that’s great, but they’re not getting the funding. And that goes back to the data gap. It’s just this Catch-22. And that’s where the concern is: because we don’t have the data, and because the mostly-male VCs don’t recognize the need, will technology be able to solve these problems? Will we give women the money and resources to do it?