The Supreme Court may be about to rule on a topic that, even more than other high-profile matters within its jurisdiction, touches almost every citizen almost every day: the Internet. In doing so, the justices have an opportunity to make a murky area of governance clearer. They also have a chance of doing a lot of harm along the way.
A split panel of the United States Court of Appeals for the 5th Circuit last week upheld a Texas law that prohibits online platforms from removing user-generated material from their sites based on the viewpoint of a user or the viewpoint expressed in a post. Earlier this year, a unanimous panel of the United States Court of Appeals for the 11th Circuit ruled that a Florida law imposing similar restrictions on tech companies violated the First Amendment. Now Florida has asked the Supreme Court to weigh in. If the court agrees to take the case, it will face questions about the government's ability to regulate speech in the digital age, questions that both sides have so far treated as all-or-nothing but that really require nuance and care.
These two qualities were sorely lacking in Judge Andrew Oldham's majority opinion in NetChoice v. Paxton, the 5th Circuit case, which denies any First Amendment protection for what most people call content moderation by platforms but what its author insists on calling censorship. This conflicts with ample precedent on the right of corporations to decide what kind of speech they will host. Most alarming, though, are the opinion's blatant mischaracterizations of how social media sites actually work, which it uses to justify this position. The claim that the hosting of neo-Nazi and terrorist material is a far-fetched hypothetical ignores the documented and ongoing game of whack-a-mole that platforms play with exactly this kind of hate. The claim that the sites "exercise virtually no editorial control or judgment" ignores the millions of pieces of content they review daily, as well as the many more that algorithmic filters prevent from ever showing up.
This last point is supposed to prove that the government can classify the platforms as "common carriers," in the same way as railroads or telephone operators, and require them not to discriminate. Those on the other side of this debate argue that this is a bad analogy, and it is. But the alternative they offer is just as flimsy: They say these platforms are more like newspapers or broadcasters. The truth lies somewhere in between. Social media sites act as a kind of public utility; they also exercise editorial control and judgment that are essential to the value they provide. They exist in a class of their own, and no court has yet determined what standard should apply to them, or what kinds of speech regulations, from the extreme restrictions in Texas and Florida to the more moderate rules considered elsewhere to nothing at all, the Constitution permits.
The Supreme Court seems more likely than ever to take up this question in the near future. If it does, the justices should resist the temptation of seemingly easy answers that miss the thornier realities of the digital age.
The Post's View | About the Editorial Board
Editorials represent the opinions of The Washington Post as an institution, as determined by debate among members of the Editorial Board, based in the Opinions section and separate from the newsroom.
Members of the Editorial Board and their areas of focus: Karen Tumulty, Deputy Editorial Page Editor; Ruth Marcus, Deputy Editorial Page Editor; Jo-Ann Armao, Associate Editorial Page Editor (education, D.C. affairs); Jonathan Capehart (national politics); Lee Hockstader (immigration; issues affecting Virginia and Maryland); David E. Hoffman (global public health); Charles Lane (foreign affairs, national security, international economics); Heather Long (economics); Molly Roberts (technology and society); and Stephen Stromberg (elections, the White House, Congress, legal affairs, energy, the environment, health care).