After Parler Ban, Rein in Big Tech Now or Cease Being Free Citizens | Opinion


In the wake of the protests and tragic violence at the United States Capitol last Wednesday, Parler, the popular alternative to Twitter, is facing an unprecedented crackdown from its competitors. In the span of 48 hours, both Apple and Google announced they would be removing the app from their smartphone app stores. Shortly thereafter, Amazon Web Services announced it would stop hosting Parler, thus also wiping out its web component.

Signaling his thanks, Twitter CEO Jack Dorsey—who calls his platform one that stands for “free expression” and “empowering dialogue”—tweeted out a heart emoji when Parler no longer showed up on Apple’s list of popular apps.

The rationale given by all these Big Tech behemoths is that Parler doesn’t do enough to moderate the violent threats its users make on its platform. This is rich coming from companies that host and distribute Facebook and Twitter, platforms where violent threats proliferate daily. Twitter has even gone to court, on free speech grounds, to protect the use of its site for organizing protests, even ones where conduct is disorderly.

Over the summer, many Black Lives Matter protests were organized on social media. Many of those protests later turned violent. All told, this summer’s riots, which spanned 140 cities, caused more than $2 billion in damage and resulted in at least 25 deaths. Has anyone undertaken an investigation into the links between those riots and social media?

Moreover, when it comes to their own platforms, these companies deny that any link could possibly exist between the content they circulate and offline harm. Last year, their representatives sat on stage at a Department of Justice workshop and insisted that what is said or circulated on social media isn’t their fault, even when their algorithms amplify its reach. Streaming a murder, for example, isn’t at all the same as committing it, they asserted. They have testified before Congress that their platforms should not be held in any way responsible for a single image of a child’s sexual abuse circulating more than 160,000 times. Law enforcement should just do more, they’ve argued.

By Rachel Bovard

Read Full Article on Newsweek
