Facebook Now Has An Internal Panel Reviewing Research On You To See If It’s Ethical
Odds are very, very good that you’ve been part of a scientific research experiment in the past few years. Probably more than 70% likely if you’re on the internet at all, and approaching 100% if you’re under 30. Why? Because those are the percentages of Americans who use Facebook… which is constantly conducting some of the largest-scale behavioral research ever done.
Facebook Research is far from secret, although it’s true that the big blue everything platform doesn’t talk it up all that often. The last time it made big headlines was in 2014, when researchers published a paper based on an experiment involving 700,000 people who had never granted permission or, in fact, even been told they were part of it.
That got Facebook a lot of negative attention, since the U.S. (and dozens of other nations) has really rigorous guidelines about participation in human experiments, and disclosure is right near the top of the list. But two years ago, at least, no review framework was actually required for large-scale population manipulation over the internet. Basically, it was your classic “nobody said we couldn’t” sort of situation.
But in the wake of all that bad press, Facebook promised to create an internal review process for its research. And 18 months later, as the Wall Street Journal reports, they’re finally sharing some of the details.
Facebook has now created an internal review board, the WSJ reports. The five-employee panel includes experts in various fields, including law and ethics, and has the authority to bring in outside experts for assistance as needed.
Not every study is escalated to the panel, however. If a research proposal is comparatively innocuous, a manager has the discretion to give it the green light. If it deals with a “sensitive topic” (the WSJ’s example is mental health), then it gets kicked up to the board, which reviews the risks and benefits of the proposed study, as well as whether it’s within the realm of things users might expect their data to be used for.
The review group, the WSJ reports, is modeled on the institutional review boards (IRBs) that researchers at every university have to go through to do any research with human subjects. And in fact, Facebook hired a former IRB manager from Stanford to head up the process.
As the WSJ points out, Facebook isn’t the only company grappling with the ethics of an extraordinarily broad swath of data. They are the biggest, with the widest reach and the deepest pool, but they aren’t alone. A senior Microsoft manager told the WSJ that big data research ethics is “definitely an emerging field that everyone in the industry is struggling with.”
Facebook Offers Details on How It Handles Research [Wall Street Journal]