Increased Use Of Machine Learning, Facial Recognition Outs Sex Workers’ Real Names

If you operate a video-sharing site with millions of user-uploaded clips, it sounds like a great idea to use software smart enough to identify some of the faces in those videos. The clips would be indexed more accurately, and you might more readily identify copyrighted content. But you could also be risking the privacy — and maybe the physical well-being — of the people the software identifies.

Hire a robot

PornHub — do not search for or visit that one from work — is exactly what it sounds like. The site hosts around 5 million decidedly explicit videos that adult viewers can enjoy at their leisure.

But as with any other kind of entertainment (perhaps even more so), viewers often approach the site with some kind of specific content preference. They may have a certain performer, performer gender, and/or performer quantity in mind, for example, or perhaps are looking for some particular kind of setting or action.

So the site uses a tagging system to categorize all its content. Each video is labeled with a set of tags saying what’s in it, so a viewer can pick and choose based on their mood. Users can add tags to videos to keep content organized.
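
Under the hood, a tagging system like that usually boils down to an inverted index mapping each tag to the set of videos carrying it. Here’s a minimal sketch of the idea in Python; the names and schema are our own illustration, not PornHub’s actual implementation.

```python
# Minimal sketch of tag-based lookup: an inverted index from
# tag -> set of video IDs. Purely illustrative.
from collections import defaultdict

index: dict[str, set[str]] = defaultdict(set)

def add_tags(video_id: str, tags: list[str]) -> None:
    """Record each tag for a video so viewers can find it later."""
    for tag in tags:
        index[tag.lower()].add(video_id)

def search(*tags: str) -> set[str]:
    """Return the videos carrying every requested tag."""
    sets = [index[t.lower()] for t in tags]
    return set.intersection(*sets) if sets else set()

add_tags("vid001", ["outdoor", "amateur"])
add_tags("vid002", ["outdoor", "couple"])
print(search("outdoor", "amateur"))  # {'vid001'}
```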

But uploads already outpace humans’ ability to tag everything, so the site is turning to software for help.

The company announced [content is safe for work; URL is not] this week that it’s adding an AI model to help it categorize content. The robot overlord will start by identifying certain stars, using facial recognition tech not unlike what Facebook, Amazon, and other tech companies deploy, and will branch out into more content categories next year.

“Artificial intelligence has quickly reached a fever pitch, with many companies incorporating its capabilities to considerably expedite antiquated processes. And that’s exactly what we’re doing with the introduction of our AI model, which quickly scans videos using computer vision to instantaneously identify pornstars,” company VP Corey Price said in a statement.

Price added, “Now, users can search for a specific pornstar they have an affinity for and we will be able to retrieve more precise results. Our model will undoubtedly play a pivotal role moving forward too, especially considering that over 10,000 videos per day are added to the site. In fact, over the course of the past month alone, while we tested the model in beta, it was able to scan through 50,000 videos that have had pornstars added or removed on the video tags.”
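
The company hasn’t published technical details, but face identification systems of this kind generally map each detected face to an embedding vector and compare it against reference embeddings for known people. Here’s a hedged sketch of that matching step, assuming the embeddings come from some trained face-recognition model; none of this reflects PornHub’s actual pipeline.

```python
# Illustrative gallery matching: compare a face embedding against
# reference embeddings for known performers and keep the best match
# above a similarity threshold. All names are hypothetical.
import numpy as np

GALLERY: dict[str, np.ndarray] = {}  # performer name -> reference embedding

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(face_embedding: np.ndarray, threshold: float = 0.8) -> str | None:
    """Return the best-matching performer, or None if nothing is close enough."""
    best_name, best_score = None, threshold
    for name, ref in GALLERY.items():
        score = cosine_similarity(face_embedding, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

The threshold is doing the heavy lifting there: set it too low and strangers start matching people in the gallery, a tradeoff that comes up again below.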

We know who you are

There’s nothing illegal about making money from pornography, as long as everyone involved is of age and consenting, and the material is recorded and distributed within the boundaries of federal, state, and local law.

In most places, though, it’s still not a line of work one really discusses in depth with the neighbors. Performers, especially amateurs, may well prefer to keep their public, working persona separate from the name and identity they use in private life. But that gets ever harder in the era of algorithmic recognition and big data.

Motherboard observes that in many ways, this particular use of facial recognition is a privacy disaster in the making.

For one thing, porn piracy is definitely a thing that exists. A video that has been uploaded and tagged on PornHub won’t necessarily stay on the service; pirated copies travel the internet — and bring the performer’s auto-tagged name along with them.

There’s also the challenge of revenge porn: sexually explicit content featuring someone who did not authorize its sharing, often posted by a jilted ex out of spite. Although the site has tried to make it easier for victims to report such content and have it removed, it stays up until or unless someone flags it.

For now, PornHub says its AI will only be used to match the 10,000 “stars” in its database, but any technology that exists can be expanded — intentionally or not. And sweeping up amateurs and unwitting participants into its collection of named and tagged faces could have serious negative effects on people’s lives.
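
To see why that’s more than hypothetical, consider what a naive matcher does with a face that isn’t in its gallery at all. A toy illustration with made-up two-dimensional vectors: unless the system is allowed to answer “unknown,” it pins some known name on every face it sees.

```python
# Toy example: nearest-neighbor matching over a fixed gallery
# always returns *somebody*, even for a face that belongs to
# no one in the database. Vectors are fabricated for illustration.
import numpy as np

gallery = {"star_a": np.array([1.0, 0.0]), "star_b": np.array([0.0, 1.0])}
amateur_face = np.array([0.6, 0.55])  # someone who is in no database

def nearest(face: np.ndarray) -> str:
    return min(gallery, key=lambda name: np.linalg.norm(gallery[name] - face))

print(nearest(amateur_face))  # prints "star_a": a confident false match
```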

Plausible, not just paranoia

Data that gets collected gets sold, shared, and matched across companies — a practice that seems likely only to increase, not decrease, as time goes on.

Gizmodo recently reported on a case where Facebook matched one sex worker’s two completely separate and disconnected identities to each other.

Her public, real identity is like virtually anyone else’s, Gizmodo reports: She lives in California, discusses politics, and has a work-related email address.

But her private, working persona is, very deliberately, not on Facebook at all. Professionally, she uses a different name, different email address, and different phone number. Like Bruce Wayne and Batman, she keeps the two halves of her life discrete from each other: never the twain shall meet.

And yet she found Facebook was filling its “People You May Know” recommendations for her daytime persona with a list of her evening clientele. That, in and of itself, is bad enough — but, she reasoned, if they were being suggested to her, then it was highly possible that she was also being suggested to them, despite going to lengths to keep her real name and personal details out of clients’ hands.

It’s not just the workers who go to such lengths to maintain their privacy, she told Gizmodo: “The people who hire sex workers are also very concerned with anonymity so they’re using alternative emails and alternative names. And sometimes they have phones that they only use for this, for hiring women. You have two ends of people using heightened security, because neither end wants their identity being revealed. And they’re having their real names connected on Facebook.”

She’s not alone, Gizmodo reports; other sex workers have privately discussed the same happening to them.

“I don’t want my 15-year-old cousin to discover I’m a porn star because my account gets recommended to them on Facebook,” another person in the industry told Gizmodo.

“We’re living in an age where you can weaponize personal information against people,” she added. “Facebook isn’t a luxury; it’s a utility in our lives. For something that big to be so secretive and powerful in how it accumulates your information is unnerving.”

Unfortunately for all of us, the math Facebook uses to determine who adds up with whom remains internal, proprietary, and secret. When a Gizmodo reporter had their own unnerving experience with Facebook seemingly knowing too much earlier in the year, the social network said only that “more than 100 signals” go into the recommendations. Facebook, however, lists only five of those potential signals on its help page.
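
For illustration only, you can think of a recommendation engine like that as computing a weighted score over whatever signals fire for a pair of accounts: mutual friends, a shared phone number in someone’s uploaded contacts, and so on. Every signal name and weight below is invented for the sketch; Facebook’s real ones are, as noted, secret.

```python
# Purely illustrative signal scoring for "People You May Know".
# Signals and weights are invented; Facebook's are not public.
SIGNAL_WEIGHTS = {
    "mutual_friends": 1.0,
    "shared_uploaded_contact": 3.0,  # e.g., both appear in a third party's address book
    "same_network_or_group": 0.5,
}

def suggestion_score(signals: dict[str, float]) -> float:
    """Combine whatever signals fire for a pair of accounts."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * value
               for name, value in signals.items())

# One high-weight signal, like a shared phone number in an uploaded
# contact list, can outweigh having no visible connection at all:
print(suggestion_score({"shared_uploaded_contact": 1.0}))  # 3.0
print(suggestion_score({"mutual_friends": 2.0}))           # 2.0
```

In a model like that, the very steps sex workers and their clients take to stay anonymous (burner phones, alternate emails) stop helping the moment a single shared identifier gets uploaded in somebody’s address book.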
