The Supreme Court May Let TikTok Go Dark

A video streamer outside the Supreme Court on Friday. Michael Brochstein/Zuma


The Supreme Court on Friday heard oral arguments on the future of TikTok: whether to let the platform go dark on January 19, as required by a bipartisan law passed by Congress, or to intervene and spare it.

The case pits the First Amendment free speech rights of TikTok and its users against the government’s assertions that the platform poses a national security risk. With bipartisan support, Congress passed a law that will essentially ban TikTok in the United States on January 19 unless ByteDance, the China-based company that owns TikTok, divests the platform.

Genuflecting to national security over fundamental rights has led to some of the court’s most regrettable decisions.

The Supreme Court does not generally like to second-guess the federal government when it comes to national security concerns, and is therefore likely to ultimately uphold the law. While the justices did express doubts about some of the government’s national security rationale, it’s unclear whether those doubts are strong enough to delay the law from taking effect, or to overturn it as an unconstitutional infringement on the right to free speech.

The government’s national security arguments are twofold. First, that TikTok vacuums up user data that it then sends to its corporate owner, the China-based ByteDance, where the Chinese government can access it. The People’s Republic of China has been designated a foreign adversary with a documented strategy of gathering vast quantities of data on Americans.

The justices seemed genuinely concerned about this national security risk. Justice Brett Kavanaugh raised the fear that China would gather data on teenagers and people in their twenties “that they would use that information over time to develop spies, to turn people, to blackmail people, people who, a generation from now, will be working in the FBI or the CIA or in the State Department.” Even the lawyers for TikTok and content creators challenging the law acknowledged the threat. But, they said, it was not enough to make the law constitutional.

The government’s second national security argument, which the justices were more skeptical of, is that China can use TikTok to covertly manipulate its 170 million American users. Multiple justices had problems with this rationale. After Solicitor General Elizabeth Prelogar, arguing for the government, suggested that China might benefit from fomenting arguments between Americans, Chief Justice John Roberts sensed an opportunity for a joke. “Did I understand you to say a few minutes ago that one problem is ByteDance might be, through TikTok, trying to get Americans to argue with each other?” he asked, then answered with the punchline. “If they do, I say they’re winning.”

The courts use various levels of scrutiny to determine whether a law is constitutional. If a law abridges the right to free speech, for example, the courts subject it to a higher level of scrutiny, requiring the government to prove it had a compelling interest in abridging that right. The level of scrutiny can also turn on whether the government is restricting certain viewpoints. In this case, TikTok and the content creators fighting the law claim that Congress passed a content-based free speech restriction. The government denies this. The law, it says, is content neutral; it does not ban any particular speech, but rather the manipulation of that speech for geopolitical gain. “TikTok, if it were able to do so, could use precisely the same algorithm to display the same content by the same users,” Prelogar explained. “All the act is doing is trying to surgically remove the ability of a foreign adversary nation to get our data and to be able to exercise control over the platform.”

But the justices seemed to raise an eyebrow at the government’s defense here. Justice Elena Kagan, in particular, pushed back against the idea that a ban on manipulation is content neutral because, ultimately, it does affect what content is shown. “Content manipulation is a content-based rationale,” Kagan said.

Moreover, the justices seemed dismissive of the idea that covert algorithmic manipulation is an actual national security problem. Kagan drew laughs from the courtroom when she stressed that, at this point, everyone knows China is behind TikTok. “It’s just because people don’t know that China is pulling the strings? That’s what ‘covert’ means?” Kagan asked. “Everybody now knows that China is behind it.”

Prelogar attempted to push back on this. “The problem with just saying, as a general matter, China has this capability and might at some point be able to exercise it and manipulate the platform is it doesn’t put anyone on notice of when that influence operation is actually happening, and, therefore, it doesn’t guard against the national security harm from the operation itself.”

Automatically genuflecting to government assertions of national security peril, especially when fundamental rights are at stake, is a habit that has led to some of the Supreme Court’s most regrettable decisions, including Korematsu, when it upheld the use of detention camps for United States citizens of Japanese descent during World War II. As Jeffrey Fisher, a Stanford Law professor who represented TikTok users, put it on Friday, “The government just doesn’t get to say ‘national security’ and the case is over.”

But the justices’ downplaying of the risk of covert manipulation also ignored the extraordinary power of social media and the difficulty of detecting and counteracting propaganda, misinformation, and narratives intended to weaken the United States or harm its citizens.

During the 2016 election, Russia used social media, including dozens of accounts on Instagram, to dissuade Black people from voting. Often, the Kremlin-backed effort would create accounts with an apolitical focus, then shift them to politics once it had gathered an audience. With TikTok, China’s ability to manipulate is far greater. Instead of working to gather an audience through popular content, it could simply use algorithmic manipulation, powered by its vast data trove, to show certain voters information that would dissuade them from voting. They could use the algorithm to threaten public health by increasing fear of vaccines. The scenarios go on and on.

The justices seemed to dismiss the idea that covert algorithmic manipulation is an actual security problem.

Algorithms are a potent tool. Because fear grabs users’ attention, algorithms have long prioritized scary and sensationalist material. It’s one reason that Facebook and YouTube radicalized an untold number of people to fear vaccines during the Covid-19 pandemic and helped spread conspiracy theories like QAnon. And those algorithmic decisions were motivated by profit, not designed to hand geopolitical dominance to an enemy nation.

It’s quite possible that a majority of the justices could deem TikTok’s algorithm protected speech but also determine that the government’s national security interest is strong enough to curtail that right. It’s also possible that the justices could decide that algorithmic manipulation is a protected right of TikTok, a US-based company, but not of ByteDance, a foreign company. That may be the government’s argument: TikTok is free to use whatever algorithm it wants, but ByteDance, and through it, the Chinese government, does not have a similar right.

One wildcard is that the law is set to take effect the day before Donald Trump’s inauguration. Trump has asked the court to halt implementation of the law on the premise that he alone can reach a better resolution, claiming that “President Trump alone possesses the consummate dealmaking expertise, the electoral mandate, and the political will to negotiate a resolution to save the platform while addressing the national security concerns expressed by the Government.” During oral argument, Noel Francisco, the lawyer representing TikTok, likewise argued that the court should halt implementation of the law until the next administration.

Justice Samuel Alito, who suspiciously spoke by phone with Trump on Tuesday, raised the possibility of an administrative stay, a maneuver by which a court hits the pause button on a new law or regulation to give itself extra time to assess the situation. This may sound like a compromise solution, but stopping a law passed by Congress from taking effect so that a new administration can ignore it is a significant exercise of judicial authority.

The justices don’t appear happy about any of their options. But they have nine days to do something.
