Shauna Dillavou “The internet and tech are infrastructure created _for_ a certain kind of people, _by_ a certain kind of person.”

Shauna is the Co-Founder and Executive Director of CommunityRED, which works to empower vulnerable communities through secure technologies and practices. She spoke to us about CommunityRED’s research into the adoption of secure tools and security-positive behaviors, their use of human-centered and trauma-sensitive design, and how these approaches and findings are informing the design of their training programs. With a background in global security analysis, Shauna has explored the intersections of social media, politics, and transnational crime in Mexico and Latin America; trained law enforcement in social media exploitation; and researched political tolerance in Beijing. Her work in security began with a National Security Education Program Boren Fellowship and continues through a Truman National Security Project fellowship.

Evidence

Shauna’s Story

Can you tell me a bit about your work?

CommunityRED works to create a secure world for all people and to provide avenues for freedom of expression. The groups we serve include political activists working in oppressive regimes, journalists, civil society organizations that support alternative voices, women, and LGBT and trans communities. We work all over the world, including domestically — in North America, serving people who are attacked online or whose expression is curtailed because of massive surveillance followed by thuggish security agencies, trolls, or dox or swat attacks. [“Doxing” is a form of revenge where the attacker publicly identifies or publishes private information. To “swat” is to falsely report a dangerous situation that provokes a highly militaristic police response.]

We’re agnostic about who we serve. We work across the spectrum. But we only work by invitation, because that is more likely to result in the adoption of secure tools. The biggest barrier to adoption is a lack of will and desire. It is possible to create a need, but we don’t have time for that. We look at behavior change and transformation. So we only work with folks who want to work with us. Typically the invitation comes from a forward-thinking person in an organization — an internal champion. Then the rest of the organization comes along. Same with broader networks of organizations and individuals. Each group decides on their security standards. We train them to assess their own risks and vulnerabilities and what they can do to counteract that. The worst offenders tend to be those with the most power (the CEO, COO, etc.). A company is only as strong as the weakest password. Often people in positions of power can’t be bothered.

What would be an example of a success for you?

There is a lack of research on the adoption of secure tools and security-positive behaviors. In April 2016 we conducted the second trial of a study. We were training two groups of 10 journalists. We trained both groups the same way — we even wore the same clothes and served the same meals. The only thing we changed was the presence or absence of the instructors during the training on three separate tools. For one group the instructors stayed in the room and for the other group they left. The pre- and post-tests revealed that the group that was left by themselves did 20% better. But they were also angry about being left alone. The group where the instructor — the “expert” — stayed in the room reported higher levels of comfort and security, but they did not do as well in terms of learning the tool. This is a significant finding. It’s huge. This success was a hydra that sprouted many more heads. Now we have more questions.

All of our work is trauma sensitive. We have a psychologist on staff leading research. We ask about feelings of comfort and security, but never ask directly. We call this “circling the tiger”. This is also why we use human-centered design — to flush out information that cannot be articulated verbally. Most of the security community operates on the assumption that they know what is best. But we operate under the assumption that people know what they need — because they live with it every day. Trauma affects security decisions. We have to test our assumptions. It took time to narrow it down to this one question: Do people learn better if they are supported by a community or individual expert? Turns out it is the community. The results of our research are pushing our work in a different direction.

Another assumption is that training should start by assessing security risks and concerns. But focusing on threats can be so overwhelming that people are not able to take in anything new. They’re completely focused on staying safe or keeping their family safe. It’s consuming. So we focus on feelings of security. Bruce Schneier has a TED Talk, The Security Mirage, where he explains how we hear and react to things that are not actually threats to us. In the past, when one villager heard from another that there was a tiger roaming around, that threat was probably real. But information and communication technologies have increased feelings of fear and insecurity about things that are not a real threat to us. The technology has evolved faster than our brains. We think a lot about the perception of fear and what that does to the human brain — shutting it down. How do we work with that? We don’t want to shut down the ability to learn new information. So instead we draw maps of who we communicate with, and explore different levels of sensitivity. We’ve also built games. We do focus groups and interviews to learn more about what makes people feel comfortable. We ask what they are doing when they feel comfortable, what that’s like. Then we ask how they feel when using security tools. We want to connect those two things — feeling good and using secure tools.

How would you describe the open internet? Why is it important to you?

The open internet is something that does not exist yet. The internet and tech are infrastructure created for a certain kind of people by a certain kind of person. Built for non-targets by non-targets. They are the only ones served well by it. A lot of the internet is unsafe for other populations. The structure exploits those who are different. A woman recently suggested that Twitter should deal with their harassment problem before tackling whether or not to increase their character limit to 10k. She has since had to block 850,000 abusive accounts. That’s more than the population of the District of Columbia. Twitter is not a safe place.

We have few choices. Another example: Either you’re on Facebook or you’re not. In some countries, you can only choose between Facebook or no internet. That is not a choice. We don’t get to choose things that reflect our values. We don’t get to determine who we want to be and how we should act. An open internet would be one where there would be choices and options and the ability to express ourselves how and when we like.

What CommunityRED does is a drop in the bucket. A tiny band-aid on a gushing wound. We only reach the most urgent cases. We try to help them before someone shows up at their front door. We create alleys — but what is needed are broad avenues.

How about an example of a challenge?

A big challenge is having my vision narrowed by funders. This is why we do consulting work. Our consulting income provides freedom and choice. But the biggest challenge is telling the stories — explaining what is happening and the implications for people here in the United States and in other parts of the world. Those stories need to be connected back to people here — to companies and to legislators. One challenge is that we have to be careful to not put people at risk or expose our organization when we tell those stories. To begin to address this I’ve started working with reporters. I’m a Truman fellow, so I’ve also been talking to folks there. We’ve started developing briefs. I have not figured out how to tackle this yet. We need to make space in our schedules.

Can you tell me about how you got involved with Mozilla? What has that been like?

Gemma Barrett, a Ford-Mozilla Open Web Fellow at the New America Foundation’s Open Technology Institute, put me in touch with Mozilla because I wanted a fellow. We had been working on building tools together. She is awesome. I thought “How can I find someone like this?” I have not been more involved than that. Although I did go to Mozfest a few years back and liked it. I had a panel accepted the following year but they were not offering travel stipends. I don’t know what they are doing around privacy these days. I’d like to know their thoughts about protecting users. Where do they stand on user security and safety? Gamma International created a surveillance tool called FinFisher that pretends it is Firefox. Mozilla sent them a cease and desist letter but I don’t know if they followed up. I would like to see more of an effort and engagement all around.

At Mozfest I met Harlo Holmes (I already knew of her) and we worked on a steganography tool. But we could not sustain the project without funding. There were three or four of us working on it. I also ran a workshop about envisioning tools, which went well. Overall, the privacy track at Mozfest was disappointing because it was basically a bunch of dudes telling you that you were doing it wrong. Also it was held in a huge open room. There was no help getting quiet space to hear one another. I would like to have seen a broader approach to privacy. If it had been a stellar experience I would have found the money to attend the following year. I’ve taken a step away from their privacy stuff.

What stories would be helpful for you?

It would be nice to talk to others who are working to have impact and doing that work from an authentic place. So many people in the security field have a hero complex. They want to be a savior. They talk over you. They don’t let you ask questions. I’d like to speak with people with aligned values and who also work internally — in themselves and in their companies — to reflect those values. We work hard to maintain this internally. We leave space for each other. We are careful not to interrupt each other. Cutting people off curtails their expression.

I would also like a space to discuss making room for vision. There are weird pressures to work all the time. Because working means you are doing something. We try not to do that because that is the death of inspiration and mission. I’d love to be connected to the people who share that value. Creating space for vision, innovation, inspiration. So much of innovation is focused on Silicon Valley, San Francisco, and privileged white men. I’m curious to see how people are creating avenues for innovation and success for any kind of person. I am tired of a story about one angel investing firm for women. I want to hear about the glut in the angel investment for women market and what that is doing. I’m tired of seeing the one woman and one person of color on an event panel. I don’t want to hear stories about gender parity at an event — it should be the norm. Instead I’d like to know specifically how others are creating avenues to make that the norm. I’m interested in working to get at what is at the heart of that, systemically. Not in band-aid actions. For example, how are we getting under-represented people into the pipeline earlier on? How are we re-tooling workforces after a certain age? What are we doing to act on inclusivity? In my work prior to CommunityRED I hit a glass ceiling. It is not an accident that we are completely female led. I love dudes. I am married to one. We hire them. But it is not about leveling the playing field… there simply is no playing field for us. We look for opportunities to channel power to the people we serve.