Technology-facilitated violence, abuse, and harassment against women and girls: A 21st Century Challenge

In October 2012, 15-year-old Amanda Todd of Port Coquitlam, British Columbia, died by suicide after being manipulated into exposing her breasts via webcam by an online predator and enduring months of torment and blackmail. Between January 2015 and August 2016, Mexican television personality and investigative journalist Carmen Aristegui, along with her son Emilio, was repeatedly targeted with surveillance spyware as a result of her reporting on government corruption. In 2015, two prominent lawyers and human rights defenders representing the families of three slain Mexican women were also targeted with spyware infection attempts. Beginning in 2010, a Manitoba judge endured a years-long disciplinary hearing, and ultimately retired early, after nude photos taken by her husband and posted online without her knowledge or permission became public.

These are just a few examples of a widespread phenomenon, variously labelled as cyber-misogyny, online violence against women, or technology-facilitated violence, abuse, and harassment against women and girls.

In her report to the Human Rights Council last year, the Special Rapporteur on violence against women, its causes and consequences, Ms. Dubravka Šimonović, addressed the challenges posed by online violence against women and the barriers to prevention, protection, prosecution and redress for such acts. Ms. Šimonović noted that while the use of information and communications technology has contributed to the empowerment of women and girls and to a fuller realization of their human rights, there is a need to examine this recent phenomenon and the applicability of national laws to it. She intends to make recommendations for states and non-state actors to fight online violence against women and girls while respecting freedom of expression and the prohibition of incitement to violence and hatred, in accordance with Article 20 of the International Covenant on Civil and Political Rights. In order to inform her work on the topic, the Special Rapporteur issued a call for submissions on online violence against women. Through these submissions, Ms. Šimonović hopes to collect input and views from different stakeholders, including states, national human rights institutions, non-governmental organizations, and members of academia, on existing legislative models, policies, and jurisprudence related to online and technology-facilitated violence against women. The Citizen Lab, an interdisciplinary laboratory based at the Munk School of Global Affairs, University of Toronto, responded to this call on November 3 of this year.

As an International Human Rights Program (IHRP) clinic student, I assisted with the preparation of the Citizen Lab’s submission in response to Ms. Šimonović’s call. With the guidance and supervision of Samer Muscati, the IHRP Clinic Director, and Lex Gill, a Research Fellow at the Lab and my Project Supervisor, I undertook research on various topics related to technology-facilitated violence, abuse, and harassment against women and girls. My first assignment involved researching the nature of technology-facilitated violence, developing a workable taxonomy of its different forms, and exploring the harms experienced by victims and survivors. After reviewing academic literature, publications from advocacy organizations, and reports from international bodies, I had a good grasp of the most common forms of technology-facilitated violence and the various harms that flow from it.

Forms of technology-facilitated violence, abuse, and harassment

Cyberstalking

Non-consensual sharing or distribution of intimate photos and videos (“revenge porn”)

Denial-of-service (DoS) attacks

Use of gender-based slurs

Publication of private and identifiable personal information (“doxing”)

Rape and death threats

Electronically enabled trafficking

Sexual exploitation or luring of minors

Harms resulting from online and technology-facilitated violence, abuse, and harassment may be physical (e.g., stress-related illness, injury, and physical trauma), psychological or emotional (e.g., experiences of shame, stress, and fear; loss of dignity; costs to social standing), and/or financial (e.g., costs related to legal support, online protection services, missed wages, and professional consequences). Online and technology-facilitated violence, abuse, and harassment can also have an adverse impact more broadly by increasing needs for health care, judicial, and social services; impeding the exercise of free expression and other human rights; and disturbing the sense of peace and security required to fully participate in economic, social, and democratic life.

I also drafted some substantive sections of the Submission. This included drafting part of a section on the Necessary and Proportionate Principles, a set of principles that provide civil society groups, states, the courts, legislative and regulatory bodies, industry, and others with a framework to evaluate whether current or proposed electronic surveillance laws and practices are compatible with human rights. I also drafted a section on the newly proposed Bill C-51, which, if passed, would revise the “rape shield” provisions in the Criminal Code to include communications of a sexual nature or for a sexual purpose (e.g., text messages, emails, video recordings) within the definition of the complainant’s prior sexual history. The rape shield provisions provide that evidence of a complainant’s prior sexual history cannot be used to support an inference that the complainant was more likely to have consented to the sexual activity at issue, or that the complainant is less worthy of belief. Finally, I drafted a section on education, training, and capacity-building for legal professionals, law enforcement, frontline anti-violence workers, and the information and communication technology (ICT) sector.

Taking responsibility for the section on education, training, and capacity-building was a daunting but ultimately very rewarding experience. In the Submission, the Citizen Lab calls for evidence-based, rights-protective, proportionate, and targeted measures to address online and technology-facilitated violence, taking care to respect both women’s safety and the full panoply of human rights implicated in the digital sphere. An emphasis on education, training, and capacity-building challenges the presumption in favour of potentially overbroad and disproportionate new powers to surveil, de-anonymize, police, and censor in the digital sphere. For instance, rather than granting law enforcement broader generalized powers, it may be more effective to train officers on the search powers and investigative grounds they already possess, and on how to use those powers to identify legal wrongs in a technologically mediated context. Likewise, digital literacy training for prosecutors and judges could help these professionals understand both the ways in which new technologies can be misused by stalkers, abusers, and other violent perpetrators, and the ways in which those technologies can create investigative and evidentiary barriers to holding perpetrators accountable for their crimes. In many cases, there will be no need for Parliament to create new offences or civil wrongs; training on how women’s experiences of violence, abuse, and harassment online map onto existing categories of unlawful wrongdoing would be sufficient. In some cases, minor amendments to existing provisions may be needed to fill gaps created by technological change; the proposed Bill C-51 is one example of this kind of amendment.

My contributions comprise only a fraction of the Citizen Lab’s submission, which makes sixteen recommendations in total. Most of our recommendations are directed to states: in Recommendation 12, for example, we recommend that states hold manufacturers of commercial spyware accountable and adopt legal measures to ensure that these tools are not abused to facilitate surveillance of women and human rights defenders. Other recommendations are directed to intermediaries: in Recommendation 10, we recommend that intermediaries engaged in the moderation of online content adopt transparency reporting mechanisms, publish clear and comprehensive content moderation policies, and develop explicit review and appeal processes. Still others are directed to the Special Rapporteur herself: in Recommendation 1, we advise her to collaborate with the Special Rapporteur on the promotion and protection of the right to freedom of expression and the Special Rapporteur on the right to privacy when formulating policy responses to technology-facilitated violence against women.

Working on this project with the Citizen Lab has taught me so much about the intersections between violence against women, digital security, and freedom of expression, and I’ve had the enormous benefit of learning from researchers and advocates from vastly different fields and areas of study. Of course, the potential that our work might have a far-reaching impact on the development of international human rights norms is a huge bonus: the cherry on top of an immensely rewarding and educational experience!

The Special Rapporteur will present her report to the Human Rights Council in June 2018.

Link to submission: