Facebook, Google, Microsoft, LinkedIn, Reddit, and Twitter are joining forces to fight coronavirus misinformation.
  • Facebook, Google, YouTube, Microsoft, LinkedIn, Reddit, and Twitter said they're working with each other and government health agencies to ensure people see accurate information about the novel coronavirus and Covid-19.
  • The companies are hoping to combat fraudulent and harmful content on their platforms, according to a joint statement published on Facebook's website Monday.
  • The coronavirus pandemic has caused a spike in fake news and profiteering that's testing the industry's ability to crack down on harmful content.

Several of the world's largest social media companies announced that they're working together to fight misinformation surrounding the coronavirus pandemic and the Covid-19 disease, according to a joint statement published to Facebook's website on Monday.

Facebook, Google and its subsidiary YouTube, Microsoft and its subsidiary LinkedIn, Reddit, and Twitter all co-signed the statement.

"We're helping millions of people stay connected while also jointly combating fraud and misinformation about the virus, elevating authoritative content on our platforms, and sharing critical updates in coordination with government healthcare agencies around the world," the statement read. "We invite other companies to join us as we work to keep our communities healthy and safe."


The statement comes as social media companies are under immense pressure to crack down on rampant fake coronavirus cures, false testing methods, and other inaccurate or misleading claims that have spread across their platforms.

Facebook and Twitter have taken steps to ban coronavirus content that could cause harm, and both platforms said they'll highlight government agency information in searches for coronavirus-related terms.

Google recently announced a 24-hour coronavirus incident response team and said it will work to remove misinformation from search results and YouTube, while also promoting accurate information from health agencies. On Sunday, Google sister company Verily released an apparently half-finished website meant to direct Americans to testing locations, after President Donald Trump announced it prematurely.

But the sheer volume of fake news, which the World Health Organization has called an "infodemic," is testing whether the industry is actually capable of effectively limiting the spread of misinformation.

NewsGuard, which ranks websites by trustworthiness, said in early March that "health care hoax sites" had received more than 142 times as much social media engagement in the preceding 90 days as the websites of the Centers for Disease Control and Prevention and the World Health Organization combined.

Even before the Covid-19 outbreak, Facebook, Google, Twitter, and others were already under fire from lawmakers and other critics who claim the companies aren't doing enough to stamp out harmful and misleading content in other contexts, such as violent extremism, cyberstalking, and political ads.

On Monday, Facebook CEO Mark Zuckerberg said it's easier for Facebook to "take a much harder line" in cases like a global health emergency, while Sundar Pichai, CEO of Google parent company Alphabet, told employees in a memo that this is a pivotal moment for the company, according to Bloomberg. It remains to be seen how much the companies' aggressive efforts will make a difference in halting the spread of harmful coronavirus content.
