Ron Deibert “The internet is the repository of our knowledge and the means by which we exchange and share ideas — it has to be protected and preserved and shepherded.”

Ron Deibert is a Professor of Political Science, and Director of the Citizen Lab at the Munk School of Global Affairs, University of Toronto. The Citizen Lab is an interdisciplinary research and development laboratory working at the intersection of digital technologies, global security, and human rights. He was a co-founder and a principal investigator of the OpenNet Initiative (2003-2014) and Information Warfare Monitor (2003-2012) projects.

Evidence

Ron’s Story

Start by giving me a broad overview of your work and then some specific projects that you’re currently working on.

I’m a political scientist and the Director of the Citizen Lab, a research lab at the University of Toronto. The lab comprises anywhere between about 12 to 20 staff depending on the time of year. We have a combination of full-time staff, visiting fellows, postdoctoral fellows, and affiliates. While some of us are political scientists, staff also come from different disciplines such as computer science, engineering science, law, and area studies.

The signature of the Citizen Lab is our combination of methods; our research is focused on digital security issues that arise out of human rights concerns — worldwide internet censorship, surveillance, targeted digital attacks, and trying to better understand the security and privacy issues around popular mobile applications. We are primarily interested in taking techniques and methods that are used in computer science and engineering science and applying them to policy and rights questions.

A good example is the work that we’ve done over the last decade around documenting internet censorship. We have developed a suite of methods — technical interrogation methods and network measurement methods — that we use to document patterns of internet filtering worldwide, including in countries such as China, Saudi Arabia, Iran, and others. A byproduct of this research is that we are often able to identify the vendors of the technology that is sold to countries and their ISPs that engage in internet censorship — companies like Blue Coat Inc. or Netsweeper.
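To make the network-measurement approach a bit more concrete, here is a minimal sketch in Python of one basic technique: resolving and fetching a URL from the local vantage point and comparing the answer against a control gathered from an uncensored network. The test URL and control IP are hypothetical placeholders, and this is not the Citizen Lab's actual tooling, only an illustration of the general idea.

```python
# A minimal, hypothetical sketch of one network-measurement idea: resolve and
# fetch a URL locally and compare the answers against a control vantage point.
# The test list and control values below are placeholders for illustration.

import socket
import urllib.error
import urllib.request
from urllib.parse import urlparse

TEST_URLS = ["http://example.org/"]                # hypothetical test list
CONTROL_IPS = {"example.org": {"93.184.216.34"}}   # answers seen from an uncensored network (placeholder)

def measure(url):
    """Record the local DNS answer and HTTP status for one URL."""
    host = urlparse(url).hostname
    result = {"url": url, "dns": None, "http_status": None, "error": None}
    try:
        # A DNS answer that differs from the control set can indicate DNS tampering.
        result["dns"] = socket.gethostbyname(host)
    except OSError as exc:
        result["error"] = "dns: %s" % exc
        return result
    try:
        # Resets, timeouts, or unexpected block pages on fetch suggest filtering.
        with urllib.request.urlopen(url, timeout=10) as resp:
            result["http_status"] = resp.status
    except (urllib.error.URLError, OSError) as exc:
        result["error"] = "http: %s" % exc
    return result

if __name__ == "__main__":
    for url in TEST_URLS:
        m = measure(url)
        control = CONTROL_IPS.get(urlparse(url).hostname, set())
        suspicious = m["dns"] is not None and m["dns"] not in control
        print(m, "<- possible DNS tampering" if suspicious else "")
```

Real measurement platforms such as OONI layer many more tests and safeguards on top of this basic compare-against-control pattern; the sketch is only meant to show the shape of the technique.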

One of our reports, in 2011, fingerprinted Blue Coat deep packet inspection technology in Syria, which in turn contributed to an investigation by the United States government and, eventually, a fine levied against a reseller of Blue Coat.

We’ve also done reports on a Canadian company, Netsweeper, which sells internet filtering technology to schools and libraries — but also to countries that use it to block access to political speech or speech related to human rights issues — countries like Pakistan and Bahrain.

Thinking about your work, can you hone in on a specific example where you felt a sense of success?

We’ve been very fortunate to have many of our reports covered extensively in the world media. At least 14 of our reports have received exclusive front-page coverage in The New York Times, The Washington Post, or The Globe and Mail in the last eight years. That’s one measure of success. If you’re reporting on something and the work you’re doing is attracting global interest, that’s important because it puts you on many people’s radars.

Singling out a recent success: we recently discovered that a United Arab Emirates human rights defender, named Ahmed Mansoor, had his iPhone targeted by an Israeli cyber warfare company, called the NSO Group, that was apparently contracted by the UAE government. We analyzed the SMS messages Mansoor shared with us, and after an intensive investigation done in collaboration with Lookout Inc, we determined that the attack involved NSO Group technology that exploited several then-unpatched vulnerabilities in the iPhone. We made a responsible disclosure to Apple that resulted in patches to iOS, OS X, and Safari for all Apple users in September 2016. This was a pretty big impact for our research, since it affected about a billion Apple users worldwide and helped disarm a cyber mercenary that was selling its technology to a regime that was abusing human rights.

That’s a great example. How about an example of a challenge? I’m sure you’ve had many, but one that is top of mind.

We’ve done about seven reports on Netsweeper, a Canadian company that sells internet filtering services, in which we identified its technology being used in country contexts where human rights violations are occurring. Typically, as part of that research, we send a letter to Netsweeper prior to the publication of our report, asking whether the company has any corporate social responsibility practices or other due diligence processes concerning potential abuses of its products, and offering to publish its response in full.

In all of the reports we’ve done on Netsweeper, the company has never once replied to us, and it has typically not replied in any depth to the journalists who followed up after the reports were published. However, last year we received a defamation lawsuit from Netsweeper. The University of Toronto and I were named as defendants in the $3.5 million defamation suit.

We spent about six months preparing our legal defense, and then Netsweeper discontinued the lawsuit in its entirety — that was obviously a big challenge. It is, however, also a mark of success: if the work that we’re doing is upsetting people, then we’re doing our job well. Still, to wake up and see yourself named as a defendant in a $3.5 million lawsuit is a bit unsettling.

Do you think they intended to follow through with it, or just to put you through the expense of preparing a defense?

I can’t get inside their heads and see what’s driving them — all I can say is that they filed the defamation suit. We were preparing for a lengthy and vigorous defense, and then they withdrew their lawsuit in its entirety.

I am, however, very proud that the University of Toronto stood by my researchers and me throughout the whole process. Had the case gone forward, the process of discovery would have been quite interesting for us, as we would certainly have requested to see detailed particulars of their business.

The discovery process would have worked for you in that way. You’ve already touched on this but my next question is, how do you approach addressing this challenge? Are you thinking of using different tactics or just continuing what you’ve been doing?

We had been preparing for such an eventuality a few years beforehand. The nature of the work we do is such that there are companies and governments that are going to get upset — we’re lifting a lid on things they might not want exposed. We have, for many years now, taken both digital security and physical security very seriously. We have on our staff an attorney whose job is to help with legal research and to think about these types of questions in terms of protection for the Citizen Lab’s research activities. The best defense is to be prepared for any physical, digital security, or legal problem that might occur, so we spend a lot of time thinking about and preparing for these.

Turning now to the broadest issue in the Mozilla universe — internet health. What, for you, is a healthy internet?

I look upon the internet in broad historical terms. I came to this subject not with a technical perspective, but from a historical and political one. The way I see it, we’ve got an opportunity to create a global communication system that is essential for the long-term survival of our species. The original operating principles of the internet — an open, distributed, non-hierarchical mode of communication — fit well with principles I care about and think are essential for humanity: principles around human rights and liberal democracy.

However, what we’re documenting in our research is that these principles are very much under threat. There’s growing internet censorship worldwide, surveillance, targeted digital attacks, militarization, and weaponization of digital technologies. The internet is under systematic and wholesale threat today. We need to work collectively, whether as companies or advocacy groups or academic researchers or NGOs or policymakers, to try to protect and preserve the internet as a secure and open forum — the global public space of communication.

The internet is the repository of our knowledge and the means by which we exchange and share ideas — it has to be protected and preserved and shepherded in that way. What we’re doing is one element of stewardship over the internet.

Shifting now to that concept of working open — what does that mean for you?

People think of working open in different ways, but when I take into consideration the academic way of thinking and the origins of the internet in the university system, working in an open fashion means being transparent and being able to put forward what it is you’re doing in a way that’s reproducible by others.

I think of it in scientific terms, and at the core of the academic world there are times when you’re doing research and you need to protect the identity of people, or follow certain ethical protocols that require you to ensure data is secured in certain ways. Openness is not the be-all and end-all for the internet. It is an important principle, but like all principles there are important caveats as well, and I think we need to reinforce that in various ways when it comes to certain projects on the internet.

WikiLeaks, for example, which is ostensibly about openness and transparency, recently dumped documents that included personal information about LGBTQ communities in Saudi Arabia, as well as the addresses and phone numbers of women in Turkey drawn from voting records. None of that type of personal information is in the public interest, and it really should not have been posted on the internet. This is a good example of how we need to strike a balance between openness and privacy.

Has there been a time where working open has had an impact in your work?

Oh, for sure. We try as much as possible to carefully document all the methods we use and the data we collect so that others can duplicate these methods and build upon them. When I started doing research documenting internet censorship worldwide, we developed certain technical means and started producing our data in ways that others could build upon. Eventually, the project that organized this research, called the OpenNet Initiative, concluded. Yet the activities around it didn’t end, because some of the methods and data we employed were open and are now being adopted and improved upon by a whole community of researchers — some of them organized around OONI (the Open Observatory of Network Interference) and the Tor community.

Getting more specific about Mozilla, how did you get involved with them, and what has that been like for you?

With Mozilla, my principal interaction has been around the Open Web Fellows program. Etienne Maynier, one of this year’s Open Web Fellows, is working with Citizen Lab — we’re a host organization. Of course, I have known about Mozilla and Firefox going back to Netscape times, but this is the first real formal interaction.

Do you have any impressions, feedback?

I can tell you about the Open Web Fellows program particularly. I think it’s extraordinarily well-run. There are other opportunities and fellowship programs that we’re involved in at the university, sometimes paid for by organizations like Mozilla, and in comparison most are not as well organized from the get-go.

The people that we’ve interacted with at Mozilla have been very helpful — the requirements for what we need to do as a host organization have been very well laid out. A lot of effort has been put into building the community around the Open Web Fellows program — the host organizations interacting with their fellows and with each other, and the fellows interacting among themselves.

What impact, if any, have you seen from hosting an Open Web Fellow?

Etienne has integrated into our research and assisted with a core area that we work on — targeted malware attacks on human rights organizations. It’s been helpful for us to have additional support and an infusion of his talent into this very demanding area of our work. We couldn’t normally afford somebody like Etienne, who could command a much larger salary in the private sector than we could offer him — he came to us from the French aerospace industry.

If you had access to 10 more skilled volunteer collaborators/contributors, what would you ask them to do?

We have a large suite of projects across many areas. There’s so much work to do — additional resources would mean we could do more work documenting some of these human rights concerns that arise in the digital space. Because our work rests on open, reproducible methods, it’s the type of thing people can step into and start working on if they have the right skills, background, and training. It would mean building up what we do and collaborating more with other people.

Do you have your projects set up on GitHub, or something similar, where you have an established and growing pool of volunteers, or where it is easy for people to step in?

Yes and no. We track a lot of our technical software work on GitHub, and most of our projects sync up there. However, we don’t take on volunteers per se because of the ethical requirements we have to follow as a university research group — some of our studies are very much like medical studies, involving strict privacy protections for study subjects. We can’t have somebody coming in and randomly working on a project — we have a responsibility to protect the rights of the people with whom we work.