
How Trump controversies plunged social media into constant battle on free speech


By PA News


Donald Trump had to use an adviser’s Twitter account to promise an orderly transition of power to Joe Biden after he was locked out of his own social media accounts.

Social media platforms have evolved rapidly in recent years, moving from a hands-off approach in their early days to stricter sanctions on users who break an ever-growing list of rules.

Platforms have faced the moral dilemma of balancing free speech against hate speech, and no world leader has tested the line more than Mr Trump.

Here, the PA news agency looks at how social network policies have been shaped by the prolific tweeter during his tenure, leading up to his eventual suspension from the major platforms.

– Twitter

Mr Trump had already posted a number of controversial comments on Twitter before his rise to power and within his first year as president, generating debate about whether they should be censored.

A tweet in January 2018 in particular caused alarm, warning North Korean leader Kim Jong Un that Mr Trump had a “much bigger” and “more powerful” nuclear button.

It was left up by Twitter, prompting the firm to clarify its stance on world leaders.

At the time, the social network said: “Blocking a world leader from Twitter or removing their controversial tweets would hide important information people should be able to see and debate.

“It would also not silence that leader, but it would certainly hamper necessary discussion around their words and actions.”

A wider debate about disinformation was taking hold, leading Twitter to introduce fact-checking labels, notifying users about unverified claims.

In May last year, the firm placed such labels on tweets from his personal account for the first time, after he said postal votes would be “forged” and would create a “rigged election”.

– Facebook

For many years after Mr Trump emerged as a political figure, Facebook said it would not take action against content from political leaders – even if it contained false claims and would otherwise break Facebook rules – because the public deserved to hear unfiltered statements from politicians.

Chief executive Mark Zuckerberg repeatedly said it was not Facebook’s role to be the “arbiter of truth”.

This was despite a significant backlash, and ultimately several public apologies from Facebook, after the 2016 US presidential election, when waves of disinformation were allowed to spread on the platform, often by Trump-supporting internet users who were embraced by the president and who pushed a number of conspiracy theories.

Mark Zuckerberg repeatedly said it was not Facebook’s role to be the ‘arbiter of truth’ (Niall Carson/PA)

One theory was the pizzagate conspiracy, which centred around fictitious claims of a child sex abuse ring based at a Washington DC pizzeria, falsely linked to high-ranking Democrats including Mr Trump’s then presidential rival Hillary Clinton.

The unfounded claims remained in circulation on Facebook and other online discussion boards such as 4chan and 8chan in various forms all the way through to the 2020 election.

It was not until 2020 that Facebook finally changed its policy on political leaders, including Mr Trump, when it began adding warning labels to posts which violated its policies – long after other major platforms such as Twitter had started doing so.

– YouTube

YouTube has faced its own issues with disinformation and hate speech, though far fewer have been a direct result of content shared by Mr Trump.

However, there was the problem of the QAnon conspiracy theory shared by other users, which claims Mr Trump is fighting a secret war against “deep-state enemies” and a cabal of child sex traffickers.

YouTube also had to contend with the QAnon conspiracy theory (Nick Ansell/PA)

YouTube was forced to take action on the baseless claims, saying in October that it had removed tens of thousands of QAnon videos and terminated hundreds of channels under its existing content rules.

The Google-owned firm went a step further, prohibiting content that “targets an individual or group with conspiracy theories that have been used to justify real-world violence”.



