We still have two months to go before the US presidential election, but Facebook is already readying a report about the impact of Facebook and Instagram on the electorate in 2020.
The social network wants to study “how people interact with our products, including content shared in News Feed and across Instagram, and the role of features like content ranking systems,” Nick Clegg, VP of Global Affairs and Communications, and Chaya Nayak, Head of Facebook’s Open Research and Transparency Team, wrote in a blog post.
It will do so with the help of two dozen Facebook researchers and independent external academics, who will be free to publish their findings without pushback from Facebook higher-ups.
“The external researchers won’t be paid by Facebook and they won’t answer to Facebook either. Neither the questions they’ve asked nor the conclusions they draw will be restricted by Facebook,” Clegg and Nayak wrote.
Who will participate in this study? You, possibly.
“Some potential participants will see a notice in Facebook or Instagram inviting them to take part in the study,” according to Clegg and Nayak, who say they expect between 200,000 and 400,000 US adults to participate.
If you opt in, you could be asked to take part in surveys or agree to “targeted changes” to your Facebook or Instagram experience. “For example, participants could see more or fewer ads in specific categories such as retail, entertainment or politics, or see more or fewer posts in News Feed related to specific topics,” according to a FAQ. “Other participants may be asked to stop using Facebook or Instagram for a period of time.”
Others might be asked to “install an app on their devices — with their permission — that will log other digital media that they consume. This will allow researchers to understand more comprehensively the information environment that people experience.”
We won’t have results before Nov. 3, however. Facebook expects the study to run through December, with any findings published in mid-2021 at the earliest.
The news comes as Facebook, as well as Twitter and YouTube, reckon with how best to police out-of-control conspiracy theories, recommendation engines that send people down rabbit holes of misinformation, and coordinated disinformation campaigns from the likes of Russia, China, and Iran.
Chloe Albanesius contributed to this story.