Check out the video from my talk at “The Conference” in Malmö, Sweden, September 4, 2017. In this talk, I focus on how designing small barriers in apps and platforms for content distribution might help us do a better job of respecting each other’s privacy.
One of the main arguments for decriminalizing consensual teen sexting (with age spans) is that it would prevent victims from being charged. District attorneys and others who oppose this change often claim that law enforcement would never do such a thing, and that therefore no legal reform is needed.
This 2016 report on “sextortion” from the Crimes Against Children Research Center provides new evidence that teenage victims of privacy violations (or threats, or other related harassment) are indeed sometimes threatened with prosecution under child pornography laws:
When victims were minors, perpetrators were often breaking criminal laws about the production or distribution of child pornography, but respondents feared they were vulnerable to criminal charges also. Some respondents [victims of “sextortion”] who described incidents that occurred when they were minors had been threatened with charges or blamed. So in many cases described in the survey, perpetrators were shielded from criminal consequences and respondents had little support from authorities. (p. 55)
Some examples from this survey:
“I was the one who ended up getting in legal trouble since I was the one who sent it.” Female, 16, f2f
“I was told I could be held responsible for making and distributing child pornography.” Female, 14, f2f
“The police threatened to bring me up on charges of distribution of child pornography.” Female, 17, online
“My boyfriend sent my whole family and his friends and my friends the photo. [My family and I] tried to press charges [and get a restraining order against him]. Him and I both looked at jail time, fines, and having to register as a sex offender for ‘child pornography’ since we were both under 18. Luckily, the state [did not press charges].” Female, 15, f2f
“I feel really intensely angry that you can get in legal trouble for sending naked pictures of YOURSELF when under 18. You literally can be charged as a sex offender for it, which is so incredibly wrong because I was the victim. All that law does is protect abusers…” Female, 17, online (p. 52)
The report makes this important recommendation for law enforcement:
[A]s with other sexual assault victims, police need to be trained to focus on perpetrator behavior to avoid exacerbating the sense of shame and self-blame that many victims feel.
In addition, law enforcement agencies need to review policies that lead them to charge young victims of sextortion with child pornography offenses or threaten to do so. Such policies, or victims’ fears of such policies, appeared to deter police reporting of perpetrators who victimized minors and increase the distress of victims who felt they could not get justice. (p. 63)
Non-consensually recorded and non-consensually posted pictures and video of people in sexual situations may frequently be called “revenge porn,” but they are very different from the way the actual porn industry operates. I perform in commercial porn with high production values, porn in which stacks of paperwork — including model releases and 2257 compliance documentation — confirm the age, identity, and legal consent of the performers to both the recording and distribution of the resulting product.
Professional adult entertainment, though often maligned and defined by its worst iterations — like the dramatized biographies based on the stories of Linda Lovelace and Traci Lords — is largely an industry where consent is absolutely necessary. This is not to say that it is a utopia full of sunshine and vulva daisies — it isn’t — but it most certainly requires consent, consent that may be given based on a variety of reasons, from the desire to indulge an exhibitionistic streak to calculations that balance the pressures of economic necessity against willingness to work in a stigmatized and sometimes risky field.
I like her idea to hold websites that host nonconsensual sexual images to the same standard that the law holds for consensual, legal pornography:
It’s terrible to see women who don’t wish to be seen naked in public forced into navigating the stigma associated with visible, public record of their sexuality. And I’m happy to see Twitter and Reddit finally taking steps to curtail this violation of privacy. But I think executives at these companies can do a little better than just allowing users to report violations of their updated terms of service: They should require proof of consent before a nude image is posted, period.
Our current system of data privacy is based on a fundamental flaw. We are all supposed to be solely responsible for our personal information, but at the same time we are all part of a social network of family, friends and services with whom we are expected to share.
Our data systems ask us to be individually responsible but fail to account for how and why we share data with each other. They assume our data is personal, when in reality it is interpersonal. We are caught between opting out entirely and managing an impossible number of changing services with finesse. We do all this with our most important relationships at stake.