A court case now making its way through the Irish courts could lead to significant changes in how social media platforms deal with abusive images.
Every day, approximately 1.8 billion photos are uploaded to the internet. Microsoft estimates some 720,000 of those pictures depict child sexual abuse. Every single day.
If the exact same image is uploaded to sites multiple times, it can be blocked. But what if the image is altered – cropped, say, or overlaid with text? Then it’s a different file, and an exact match no longer catches it.
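To see why exact matching breaks down, here is a purely illustrative sketch (the file contents below are made up): a conventional byte-for-byte hash of an image file changes completely after even a trivial edit, so a blocklist of exact files can’t recognize an altered copy.

```python
import hashlib

# Toy illustration: a byte-exact hash changes completely after a tiny edit,
# which is why a simple "block this exact file" approach fails once an image
# is cropped or captioned. (The byte strings here are stand-ins, not real images.)
original = b"...pretend these bytes are the original photo file..."
edited   = b"...pretend these bytes are the original photo file..." + b"added caption"

print(hashlib.sha256(original).hexdigest())  # one fingerprint
print(hashlib.sha256(edited).hexdigest())    # a completely different fingerprint
```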
That’s the reason Microsoft, working with Dartmouth College on a project for the National Center for Missing & Exploited Children, created PhotoDNA.
PhotoDNA is a free service that scans pictures and can block child porn from hitting the internet. It uses what its inventors call “hash technology”: the image is converted into a grid and each tiny square is assigned a number value. Together, those values form a “hash.” You might call it the picture’s DNA, hence the name. The program doesn’t actually look at the pictures; it matches that DNA against an ever-growing database of known illegal images.
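PhotoDNA’s actual algorithm isn’t public, so the following is only a minimal sketch of the grid-and-values idea described above, assuming a grayscale image represented as rows of pixel brightness values. Each grid cell contributes one bit of the “hash,” and near-identical images end up with near-identical bit strings even after small edits.

```python
# Toy sketch of a grid-based image hash. This is NOT PhotoDNA's proprietary
# algorithm, only an illustration of reducing a picture to numbers that
# survive small edits which would defeat exact file matching.
from statistics import mean

def grid_hash(pixels, grid=8):
    """Reduce a grayscale image (list of rows of 0-255 values) to a bit list:
    one bit per grid cell, set when the cell is brighter than the image average."""
    h, w = len(pixels), len(pixels[0])
    overall = mean(v for row in pixels for v in row)
    bits = []
    for gy in range(grid):
        for gx in range(grid):
            cell = [pixels[y][x]
                    for y in range(gy * h // grid, (gy + 1) * h // grid)
                    for x in range(gx * w // grid, (gx + 1) * w // grid)]
            bits.append(1 if mean(cell) > overall else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance suggests the same picture."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 64x64 "image" and a lightly edited copy (a few pixels brightened).
image = [[(x * y) % 256 for x in range(64)] for y in range(64)]
edited = [row[:] for row in image]
edited[10][10] = edited[20][20] = 255

print(hamming(grid_hash(image), grid_hash(edited)))  # 0 or near 0: still a match
```

In a real system, the hash of every upload would be compared against a database of hashes of known illegal images, flagging anything within a small distance threshold.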
“Certainly, it’s important from a victims’ rights perspective; these are crime scene photographs,” says John Shehan, vice president of NCMEC’s Exploited Child Division.
Facebook, Twitter, and Flipboard use PhotoDNA to detect child porn.
Why isn’t Facebook using similar technology to scan for other problem images?
That question may be central in a court case in Ireland involving a 14-year-old girl who was the victim of “revenge porn.” A naked photo of the teen was posted on Facebook on a so-called “shaming page.”
Facebook argued in court that it had removed the picture when asked and had done so on more than one occasion. Such photos have to be manually reported and reviewed before they are taken down unless they are caught by programs such as PhotoDNA.
That raises the question, though: isn’t a nude photo of a 14-year-old considered child porn? If so, shouldn’t PhotoDNA have caught it? And if not the first time, wouldn’t it certainly be flagged and added to the hash database so it couldn’t be reposted?
The girl is suing Facebook, claiming misuse of private information, negligence, and breach of the Data Protection Act.
The court case has the potential to mean dramatic changes for social media. Facebook asked to have the case thrown out, but its request was rejected.
“A case like this risks opening the floodgates for other civil cases to be taken against Facebook and other social media sites. We’ve already seen an increase in the number of people calling to find out more. I can see it being a very real problem for all the social media sites going forward.” – Paul Tweed, Media Lawyer/Senior Partner, Johnson via news.com.au
The Guardian, in a recent article, proposes there may be another reason Facebook doesn’t remove images before they are reported. Under current EU law, social media sites are immune from liability as long as they react to complaints and remove offensive material.
“One reason they could be reluctant to proactively search for all potentially abusive images is that, ironically, by assuming some level of editorial responsibility, in theory they could be held liable for the abuse they miss.” – The Guardian
It’s not the only “revenge porn” case making news in social media circles
Tiziana Cantone, a 31-year-old Italian woman, won a “right to be forgotten” ruling from an Italian court, which ordered that a sex video she appeared in be removed from search engines and social media sites, including Facebook.
One video gained particular traction, with a remark she made in the film — “You’re filming? Bravo” — used as marketing catchphrases, t-shirt slogans and memes all over Italy – NY Daily News
The video, which she admits she sent to some friends, was posted online on various sites and viewed almost a million times. The woman quit her job, moved to a new town, and tried to change her name. Recently, she killed herself.
Policing the internet is no easy task
The sheer volume of photos posted makes it virtually impossible to review everything. Even software-based solutions won’t be perfect… and humans won’t always make the right decisions either.
Just look at the recent controversy over the posting (should it be censored or not?) of an iconic Pulitzer Prize-winning 1972 photograph showing children, including a naked 9-year-old girl, running from a napalm attack. Facebook deleted the photo, citing nudity and child protection, but later allowed it after being accused of censoring history.
Such a case shows how difficult these decisions can be, whether they are automated or not.
“These are difficult decisions and we don’t always get it right,” said Sheryl Sandberg, Facebook’s COO, in Quartz, reacting to the controversy over the “Napalm” photo. “Even with clear standards, screening millions of posts on a case-by-case basis every week is challenging.”
As to the 14-year old girl’s lawsuit, a Facebook spokeswoman told The Guardian there is “no place for this kind of content on Facebook and we remove it when it’s reported to us. As outlined in our community standards, nudity and sexual exploitation are not allowed.”
European courts are going to take a look at the issue and decide whether there is liability. The ruling may change how social media has to deal with such cases in the future.