By Naomi Nix
For years Meta's role as an arbiter of speech has placed it in a political hot seat: Conservatives charge it with politically motivated censorship, while liberals contend leaving harmful content online fuels its spread.
But with the launch of its buzzy new Twitter alternative, Threads, Meta sees an opportunity to extricate itself from the debate by putting the onus of policing its fledgling platform on users.
As it builds out Threads, Meta will probably offer users control over what kind of content they see – including the diciest and most controversial posts – rather than the company making those decisions on its own, Meta Global Affairs President Nick Clegg said. That's a strategy that Meta has already embraced on Facebook, where the company has increasingly given users more ways to shape what appears in their news feeds.
“I hope over time we'll have less of a discussion about what our big, crude algorithmic choices are and more about whether you guys feel that the individual controls we're giving you on Threads feel meaningful to you,” he said.
Though the approach diverges from those of rivals like TikTok, it comes amid a throng of upstart social media platforms, like Mastodon and Bluesky, which offer users more control over their experience – and as decisions about content moderation are becoming legally risky.
Earlier this month, a federal judge blocked key Biden administration agencies and officials from communicating with social media companies, charging that the White House was engaging in a “massive effort… to suppress speech” on the internet. Experts have said the July 4 injunction could make it difficult for social media companies to fight election interference and other forms of problematic content, because such tasks often involve communicating with government officials about the threats they are seeing.
Meta has already hinted that it hopes Threads can avoid the quagmires that have led to high-profile congressional hearings with CEOs, lawsuits and a mounting list of technology-focused regulations around the world.
Last week, Instagram head Adam Mosseri wrote that the company would not “encourage” politics and “hard news” on the platform. Upticks in engaged readership are “not at all worth the scrutiny, negativity (let's be honest), or integrity risks that come along with them”, he added.
But politics has already arrived on Threads, which has attracted more than 100 million users. An array of news organisations have started posting about everything from former president Donald Trump's super PAC to Russia's detention of a “Wall Street Journal” reporter. Meanwhile, politicians such as Republican presidential hopeful Mike Pence have joined the platform.
“They're not gonna get rid of news and politics,” said Graham Brookie, senior director of the Atlantic Council's Digital Forensic Research Lab. “It's a text-based platform, and it's likely that news and politics will be a major component of what's being discussed there because of the medium.”
And the public scrutiny of how Threads moderates political content has already begun, as civil society groups eye the potential ramifications of the app's popularity during the 2024 presidential election. On Thursday, two dozen civil rights and digital advocacy groups urged Meta to publicise its trust and safety plans to protect users on Threads.
“For the good of its more than 100 million users, Threads must implement guardrails that curtail extremism, hate and anti-democratic lies on its network,” Nora Benavidez, senior counsel and director of digital justice at the civil rights group Free Press, said. “Meta must implement basic moderation safeguards on Threads now or the platform will become as toxic as Twitter.”
Clegg said Mosseri's comments didn't mean the company planned to block or suppress the distribution of political content.
“Are we going to suppress and censor anyone who wants to talk about politics and current affairs? Of course not,” Clegg said. “That would be absurd.” But he added that the company probably wouldn't go out of its way to “massively boost” news on Threads or create a special tab for it in the app.
Meta has said it will apply the same guidelines to Threads as those it enforces on Instagram, where hate speech, harassment and content that degrades or shames private individuals are prohibited. Users who are barred from having accounts on Instagram are also barred from creating profiles on Threads.
At present, Meta is not including posts on Threads in its third-party fact-checking program. However, posts on Facebook or Instagram that are rated as false by fact-checking partners will also be labelled as such on Threads, according to the company. Meta spokesman Dave Arnold said the company's “industry leading integrity enforcement tools and human review are wired into Threads”.
More choice
But Meta and other social media companies have already started to give users more choice in what they see. For instance, Meta recently introduced a handful of new Facebook settings allowing users to change the frequency of sensitive, controversial and conspiratorial content in their news feeds. Under the new effort, users can opt out of Meta's policy of reducing the distribution of content that independent, third-party fact-checkers have rated as false. The new settings don't yet apply to Instagram or Threads.
“We feel we've moved quite dramatically in favour of giving users greater control over even quite controversial sensitive content,” Clegg said. “That's the kind of spirit in which we're going to be building Threads.”
With the launch of Threads, Meta is also joining the decentralised social networking movement. The company has said it plans to ensure Threads supports ActivityPub – the open, decentralised social networking protocol that powers Mastodon and other social media platforms.
On Mastodon, content moderation isn't controlled by a single company or person. Instead, thousands of sites – called instances or servers – often run by volunteers set their own rules for the type of content that's allowed. Users can see posts on other servers and interact with members of those communities. That means if users don't like the rules on one server, they can hop over to a different instance.
Bluesky, a social media company founded by former Twitter CEO Jack Dorsey, is also building a tool that would allow individuals, businesses and organisations to host their own sites that will be able to communicate with one another. As part of the project, users will be able to transport their accounts from one provider or server to the next and have some control over the algorithms that determine what they see.
“In a federated model, each server has discretion over what they choose to serve and who they choose to connect to,” the company recently wrote in a blog post. “If you disagree with a server's moderation policies, you can take your account, social connections, and data to another service.”
That new approach diverges from the early versions of social media sites like Facebook and Twitter, which gave users very little choice over the underlying algorithms delivering them content every day.
Now, the social media world appears headed in a different direction, said Jeff Jarvis, a professor at City University of New York who is writing a book about the internet.
“I think I see a glimpse of a different future, with the idea of ‘pick your own algorithm’,” said Jarvis. “I see a new model in which we can each have a different world view on social media.”
Washington Post