Views on First

S1 E5: The 1A A-Bomb

Knight First Amendment Institute at Columbia University Season 1 Episode 5

Social media platforms have evaded heavy regulation of their content moderation practices so far, but the jig may very well be up. Many U.S. states are considering enacting laws to rein platforms in. To date, two states—Florida and Texas—have passed laws that significantly limit social media companies' ability to moderate their platforms, and the measures look very likely to be up for Supreme Court review soon. Guests Alex Abdo—litigation director of the Knight Institute—and Brian Willen—one of the lawyers representing platforms challenging these laws—discuss these immensely important Supreme Court cases. The debate over whether and how to regulate social media grows increasingly fierce, and the future of the internet hangs in the balance.

Views on First is brought to you by the Knight First Amendment Institute at Columbia University. Please subscribe and leave a review. We’d love to know what you think.

To learn more about the Knight Institute, visit our website, knightcolumbia.org, and follow us on social media.

Jameel Jaffer:

Most of the big principles that established the First Amendment as we think of it today were established in the 1960s and '70s, long before the internet, social media, smartphones, all of the challenges that we're thinking about today.

Katie Fallow:

A lot of times when new forms of technology come up like video games or like social media, there is a tendency by some people, particularly government officials, to say, "These old structures under the First Amendment don't apply."

Alex Stamos:

All these decisions are generally just based upon what the companies think are both right for their platform and their users, and right for society.

Nicole Wong:

Who gets to be the decider, and why should we trust that person? Why should it be one person?

Prof. Noah Feldman:

There's a very likely scenario to me coming, where a war is going to happen.

Prof. Genevieve Lakier:

This is a very exciting time to be thinking about freedom of speech and social media, because the story is not yet written.

Evelyn Douek:

Good day and welcome back to Views on First. I'm Evelyn Douek. This is a podcast about the First Amendment in the digital age from the Knight First Amendment Institute at Columbia University. So far, over the course of the previous four episodes of this season, we've gone from the courtroom to the history books to Big Tech headquarters in Silicon Valley, to explore how the digital age has destabilized how law, society, and industry think about free speech. As we've heard, the rise of digital technologies has raised difficult new questions about the meaning of the First Amendment's guarantee of freedom of speech. Now, in this final episode of the season, we're going to explore the pending Supreme Court litigation that could answer some of these questions, but also maybe upend much of what we know about regulating online speech. The cases we're going to talk about could well be the most important First Amendment cases in a generation.

Alex Abdo:

This would be like an atom bomb going off in the middle of First Amendment doctrine. It would be as radical a departure as you can imagine from existing precedent.

Evelyn Douek:

That's Alex Abdo, the litigation director of the Knight Institute. We'll be hearing a lot from him later, but I want him to just sit tight for a minute as I explain how we got to this point. Let's cast our mind back all the way to episode one, where we started this season. We started with Knight v. Trump and the narrow question of whether government officials could block people on social media.

Jameel Jaffer:

Just imagine what the world would look like if you didn't have the rule that we've established in this case, the rule that government officials who use their social media accounts for official purposes can't discriminate on the basis of viewpoint. Government officials would suppress dissent from these forums. Anybody who cares about the integrity of public discourse should see the problem with that right away.

Evelyn Douek:

But in episode two, Professor Noah Feldman worried that arguing that even some part of social media was a public forum, even just the replies section of an official government account, might lead to arguments that all of social media was a public forum in the First Amendment sense, and that courts would therefore say that it was unconstitutional for social media platforms, not just government actors, to block or take down people's accounts.

Prof. Noah Feldman:

The path that the Knight Institute is pushing for, in my view, is the path where it will no longer be available to private companies to make their own decisions about this.

Evelyn Douek:

In the years since we have, in fact, seen those arguments being made. This episode is about some of the laws enacted by states that reflect that kind of thinking. In some cases, the people bringing those arguments and the courts that have been receptive to them have cited the ruling in Knight v. Trump, to support their conclusion. This is not what Knight was arguing for or what the case says, but it is how the case is being cited nonetheless. I asked Alex if he or Knight regretted bringing the suit arguing that part of Twitter was a public forum, given it's being used in this way.

Alex Abdo:

No, I really don't think there's any connection. The idea that the government is restricted in what it can censor in public forums and the idea that the government is restricted in how it can regulate the editorial decision-making of media platforms, those rules have coexisted without the bleed over that some worried about when we were litigating the Knight v. Trump case. I don't actually lose any sleep over the claim that the Knight Institute has somehow shepherded in this radical legal theory.

Evelyn Douek:

Okay, so how did we get here? In episode three, we heard about rumblings in the First Amendment firmament that threatened to disrupt decades of doctrine, and about how there are odd political realignments happening in First Amendment land where conservatives are now making arguments for reining in the corporate power that they have usually been quick to protect.

Prof. Genevieve Lakier:

What is remarkable about what's happening now is that with the rise of the social media platforms, conservatives, particularly conservative politicians, but also judges and justices, have started to talk about the real problem with private corporate power abusing speech.

Evelyn Douek:

A lot of this comes back to the moment when the mainstream platforms booted Donald Trump off their sites.

News4JAX Newscaster:

Governor DeSantis taking aim at Big Tech. Today he signed a bill into law barring social media companies, including Facebook and Twitter, from banning political candidates.

12NewsNow Newscaster:

Don't mess with Texas. That's the message tonight from governor Greg Abbott. He has introduced a new social media censorship bill.

Evelyn Douek:

This kicked the culture wars over content moderation into overdrive, and platform regulation became a central part of politicians' political platforms. The governor of Florida, Ron DeSantis, and the Attorney General of Texas, Ken Paxton, seized the day. They championed, and their states passed, the laws that are the focus of this episode. Texas' and Florida's social media laws share a lot of similarities. They're both high-profile political statements motivated by fears of left-wing political bias in Silicon Valley, and they both take a similar kind of regulatory approach. We're going to talk about both sets of laws together. Let's go back to Alex to tell us what the laws do.

Alex Abdo:

Texas' and Florida's laws, at a very high level of generality, really do two things. First, they both require the platforms to continue to carry certain kinds of speech and certain kinds of speakers that the platforms would prefer not to continue to carry. Those have colloquially been referred to as the must-carry provisions. The second high level bucket are a set of provisions in each law, requiring the platforms to be transparent about how they engage in content moderation, what exactly their policies are, how often they remove certain kinds of content, et cetera. The laws differ in their particulars, but they both generally have a must-carry provision and a set of transparency requirements.

Evelyn Douek:

The tech companies, in response, well, they followed the time-honored American tradition. They sued. We spoke to one of the lawyers who has been working on the challenge to Florida's law.

Brian Willen:

My name is Brian Willen. I'm a partner at Wilson Sonsini Goodrich & Rosati.

Evelyn Douek:

Brian has been working on this case for a while.

Brian Willen:

I've basically been involved since the Florida law was enacted. We were involved in drafting the initial complaint, challenging the law, filing the preliminary injunction motion, and then I argued the case in the district court. I've been involved with it ever since.

Evelyn Douek:

Now, throughout this episode, you're going to hear me and the guests refer to the litigation around these laws as the NetChoice cases, shorthand taken from the name of the industry body challenging the laws on behalf of its members. Those members are tech platforms like Meta, Twitter, Google, and Pinterest. It isn't subtle why Florida and Texas want to regulate these platforms in particular.

Brian Willen:

It was very clear that the point of the law fundamentally was to punish out-of-state Silicon Valley companies for what the governor and his political allies believe to be their speech, their political and ideological views. When the Florida law was enacted, the governor, the lieutenant governor and various other supporters of the law told us exactly why they had enacted the law. I can read a couple of quotes. This is what Governor DeSantis said. He said that, quote, "If Big Tech censors enforce rules inconsistently to discriminate in favor of the dominant Silicon Valley ideology, they will now be held accountable." The lieutenant governor said that, "Thankfully, in Florida, we have a governor that fights against Big Tech oligarchs that contrive, manipulate and censor if you voice views that run contrary to their radical leftist narrative."

Evelyn Douek:

In case you're still not sure if these laws were specifically targeted at companies that Texas and Florida didn't like, well ...

Brian Willen:

Initially, the Florida law had an exception for a social media company that was owned by an entity that owned and operated a theme park in the state of Florida, which was a pretty blatant political carve out for Disney.

Evelyn Douek:

Sure. Who wants to be the guy that regulates the happiest place on earth?

Brian Willen:

That got eliminated after Disney criticized the governor for enacting some other legislation. They got rid of that, essentially, to punish Disney for speaking out against the governor.

Evelyn Douek:

Okay, nevermind.

Brian Willen:

I won't say that the laws are purely symbolic. They have real force and they do real things, but it's also very clear that the laws were designed to send a particular political message.

Evelyn Douek:

The states are like the dog that caught the car. They passed these laws, in large part, to make a political point, and because the politics was the point, they said some pretty blatantly non-First Amendmenty things while they were doing it, about how these laws were targeted at certain groups of companies because of their politics.

Brian Willen:

If there's one thing the First Amendment really doesn't typically allow, it's the government making decisions about private speech that are designed to punish private entities for expressing their political views.

Evelyn Douek:

But now that the laws have passed, they have to defend them in court.

Judge (U.S. Court of Appeals Fifth Circuit):

Alrighty, next case in the morning, number 21-51178, NetChoice v. Paxton. We will hear first from Mr. Bash.

Lawyer in NetChoice v. Paxton Oral Argument:

These social media platforms control the modern day public square, but they abusively suppress speech in that square.

Evelyn Douek:

Even though these laws are, in large part, political grandstanding, they also raise some pretty simple and fundamental legal questions. At their core, these laws raise the questions that we've been wrestling with all season. What is Twitter or any other social media platform as a matter of constitutional law, and who should decide who gets to speak there and what they can say?

Alex Abdo:

The NetChoice cases ultimately are about whether the government has any power to regulate the social media platforms. At the highest level, those are the stakes of the case.

Evelyn Douek:

Oh, I see. The stakes are just what power the government has to regulate, perhaps, the most expansive speech platforms in history. No biggie. Now, Alex is a nice guy with a great podcast voice, who is really well-placed to explain all of this to us, but that's not the only reason I asked him to be on this episode. The Knight Institute also wants a piece of the action.

Alex Abdo:

We have filed amicus briefs in each of the appellate court cases that have gone to decision, one in the 11th Circuit in Florida, and one in the Fifth Circuit in Texas.

Evelyn Douek:

Amicus briefs, also known as friend of the court briefs, are briefs filed by parties who are not directly involved in the case, but who just want to be friendly and help the court by giving their 2 cents on the issues. I asked Alex why the Knight Institute wanted to weigh in.

Alex Abdo:

The answer to the question of whether the government has a role in regulating social media is enormously important, and the parties were putting forward two diametrically opposed answers to that question. The states, for their part, were arguing that the platforms can be subject to any regulation whatsoever by the government because the First Amendment doesn't apply to them and has nothing to say about the social media platform's content moderation decisions.

Evelyn Douek:

Okay, so the states are arguing that the First Amendment gives a big green light to government regulation of social media.

Alex Abdo:

The platforms, for their part, were arguing that the First Amendment provided them near absolute protection from any kind of regulation by the states.

Evelyn Douek:

The platforms are arguing that the First Amendment says a big fat, "Hell no."

Alex Abdo:

To the Knight Institute, neither of those answers was appealing. We wanted to chart a middle course, one that we thought was faithful to the values that the First Amendment and free speech doctrine are meant to protect.

Evelyn Douek:

That's the dispute at a high level, but let's get a little bit more specific. As Alex said before, there are two main buckets of issues. In the first bucket are a bunch of what are being called must-carry provisions, so called because the laws would require platforms to keep up, that is, carry, certain content they would normally take down under their rules. In Florida, platforms couldn't take down posts from political candidates in the run-up to an election, and in Texas they couldn't take down posts based on the viewpoint that they express, which doesn't sound so dramatic until you realize that things like, "Nazis are great," and, "Pedophiles are really onto something," are viewpoints. The second bucket of provisions imposes a bunch of transparency and reporting mandates on platforms, and we'll come back to those. The must-carry provisions have generally gotten the most attention, so that's where we'll start. I asked Brian to lay out the platforms' argument, that is, his clients' argument, that these are unconstitutional.

Brian Willen:

The basic argument is that choices that platforms make about what content to remove, what content to highlight, what content to display in particular ways are protected by the First Amendment. The reason they're protected by the First Amendment, on this argument, is because those choices ... which we use the term editorial choices, editorial judgments often to speak about ... are highly expressive in that they are reflective of the norms and values and priorities of a given website or a given online service. For example, some online services choose to allow pornography. Some don't. But that's a choice that reflects some judgment about the nature of the service that you want to operate. Our argument is that that fundamental choice is expressive of a particular set of values and, often, ideas and therefore, like any other form of speech, is protected by the First Amendment.

Evelyn Douek:

This might be a little confusing at first, because Brian is arguing that the choice by platforms to take speech down is itself a form of speech. It goes back to this drum we've been beating the whole way through this season, about how the First Amendment prevents the government from regulating speech, but not private parties.

Brian Willen:

The fundamental problem with these laws is that they amount to government censorship of this form of speech by private entities. That's the fundamental thing that the First Amendment prohibits. If you are Twitter or YouTube or Google or Etsy ... or you can name whatever online service you want that's privately owned and privately operated ... the First Amendment doesn't limit their ability to make choices about how their property and their services should be used. In fact, it protects them from what, in our view, is government overreach.

Evelyn Douek:

We couldn't get a lawyer representing Texas or Florida to come on and give us their rebuttal, so I asked Alex to stand in.

Alex Abdo:

I think the best version of the state's argument is that no one person, no one company, no small handful of people should have complete power to decide what can be said on the forums that have become the places where people talk, and engage, and meet one another, and exchange the news.

Evelyn Douek:

This is not an argument out of nowhere. We have similar rules for telephone companies and cable companies. Verizon can't just cut your phone line because you vote for ... I don't know ... the Muppet party or something. We would find it really worrying if the companies that provided such important means of communication could just discriminate willy-nilly. Here's the lawyer arguing on behalf of Texas in the Fifth Circuit Court of Appeals.

Lawyer in NetChoice v. Paxton Oral Argument:

The Supreme Court's never recognized an editorial discretion right that goes that far, and I would submit, Your Honor, that the telephone companies could operate this way as well. They could screen calls and say, "We don't like what this caller is saying. We're going to drop their service." No one has ever thought that they have the constitutional right to behave that way.

Evelyn Douek:

Part of what makes this scary is the sheer size of some of the biggest platforms and just how much influence they have on public discourse.

Alex Abdo:

That's a compelling argument. To an extent, I agree. I don't think society should tolerate there being so few people with so much power over the media environment. The problem with what Florida and Texas have done though is to answer the problem of concentration of economic power and concentration of power over speech, by trying to seize that power for the government itself, which I think is the wrong answer to the problem.

Evelyn Douek:

In a way, this case is a battle of analogies. The platforms are arguing that they're really more like newspapers, which act like editors when they choose what to publish and where on the page to put it, whereas the states are arguing that the platforms are the modern public forum, more like telephone companies, and so they should be made to stay open to everyone.

Lawyer in NetChoice v. Paxton Oral Argument:

With telephones, telegraphs, and the platforms, speech is traveling in multiple directions. People go to these places to have a conversation with each other. They're interacting with each other.

Evelyn Douek:

But some of the things the platforms say about themselves do sound like they agree with the states that they're open public forums.

Elon Musk:

Twitter has become the de facto town square.

Jack Dorsey:

And they often have the same expectations that they would have of any public space.

Evelyn Douek:

So I asked Brian about the statements that platform CEOs, his clients, like to make about how they're the modern public square and how to square that with the argument that they're really more like newspapers.

Brian Willen:

The choice about how open and how democratic or how neutral a service wants to be when it comes to choices about what content is and is not going to be allowed is, itself, part of the value judgment that I'm saying is protected by the First Amendment. But even within that framework, all of these services, I think, for a very long time, and certainly as time has gone on, have been very clear that while they do want to be open generally, while they do want a large number of users and a large number of voices, they have very clear rules, or at least rules, about what can and can't be said.

Evelyn Douek:

What is Alex's view on this?

Alex Abdo:

I think the platforms are right to an extent, that some of the things that they do look a little bit like what newspapers traditionally have done. The platforms do make editorial decisions when they say, for example, "We don't want certain kinds of speech on the platform and we don't want certain kinds of speakers on the platform," but they look very different than the New York Times. They don't engage in the kind of close editorial control of everything that is said on their platform. They don't put out a single coherent piece of expression, and there are a variety of other differences in how they operate.

Evelyn Douek:

The Knight amicus brief pointed to differences between platforms and newspapers, like the fact that newspapers publish mostly their own content while platforms publish their users' content; the sheer scale of platforms, millions of posts a day; and the use of algorithmic versus human decision-making.

Alex Abdo:

None of those differences, I think, remove the platforms from the First Amendment's protection, but they're relevant, I think, to understanding both the platforms' own interests in controlling the speech that takes place on their services and also in understanding why the state or why the public has some interest in at least some forms of regulation of the platforms.

Evelyn Douek:

For Alex, when it comes to analogies ...

Alex Abdo:

They're helpful to a point, but they don't answer any of the hard questions.

Evelyn Douek:

It turns out that platforms look a little like newspapers and a little like telephone companies, so if you squint a little, you can make them look like either. That's what happened in the courts of appeals, where the 11th Circuit struck down Florida's must-carry provisions, but the Fifth Circuit upheld Texas'.

Brian Willen:

The 11th Circuit basically adopted the argument that I've just been describing, which is to say the argument that the editorial choices that private internet platforms make are fundamentally protected by the First Amendment.

Evelyn Douek:

While in the Fifth Circuit ...

Brian Willen:

The Fifth Circuit saw this issue very differently.

Alex Abdo:

They started from a very different premise. The 11th Circuit started from the premise that most First Amendment scholars start from, which is that the platforms do engage in some kind of expressive activity, something that we would call speech within the meaning of the First Amendment. And the Fifth Circuit rejected that entirely.

Brian Willen:

Essentially, what the Fifth Circuit said is that what we were describing as content moderation, as editorial judgment, it just saw as, quote-unquote, censorship. In a way that was, in my view, misguided but nevertheless rhetorically harsh, the Fifth Circuit basically held that what it was characterizing as censorship is not protected by the First Amendment, and it therefore fundamentally rejected the argument that we were making.

Evelyn Douek:

Now, Brian said that the Fifth Circuit was rhetorically harsh. You know what? It's true. Judge Oldham, the judge who wrote the opinion, was clearly worked up about it. He says four times ... four ... that the Texas law doesn't chill speech, it chills censorship, and the platforms can't just shout editorial discretion and declare victory. Now, his opinion is 90 pages, and we can't get into all the doctrinal nitty-gritty here, but the main takeaway is that when it comes to the must-carry parts of the laws, the Fifth Circuit sided with the state of Texas, and the 11th Circuit sided with the platforms, and there's no real way to reconcile their decisions.

Brian Willen:

I think they're pretty much diametrically opposed to one another.

Alex Abdo:

You couldn't come up with more conflicting decisions than these two.

Evelyn Douek:

This is not just a little awkward disagreement.

Brian Willen:

This is what we lawyers call a circuit split, which basically means you have two different appellate courts that have decided the same issue differently. That is normally the situation where the Supreme Court would get involved to resolve the split.

Evelyn Douek:

This is not the only reason that people are confidently predicting the Supreme Court will sooner or later have to weigh in on these issues.

Alex Abdo:

The cases are just enormously important. The answers that the courts provide to these questions will decide what social media and online spaces look like for a generation.

Brian Willen:

The reason that people on both sides of these issues feel so strongly about them is that these platforms, these services, are incredibly important aspects of American life and culture and society. So the stakes are quite high on both sides.

Evelyn Douek:

Saying the stakes are quite high is a First Amendment understatement that's equivalent to saying ... I don't know ... "Taylor Swift is quite a popular singer."

Alex Abdo:

If the states win, then the likely result is that every state in the country and the federal government will have the power and then the interest in deciding what can be said online and how online platforms structure their services.

Evelyn Douek:

It's hard to overstate what an earthquake the NetChoice cases could be.

Alex Abdo:

This would be like an atom bomb going off in the middle of First Amendment doctrine. It would be as radical a departure as you can imagine from existing precedent. For the better part of half a century, First Amendment scholars and advocates, if they've understood one thing about the First Amendment, it is that it prevents the government from telling speakers what they can and cannot say when it comes to political discourse. Texas' and Florida's laws aim to do exactly that. If the Supreme Court were to say, "That's okay," we'd have to go back and reevaluate just about every single major First Amendment opinion issued by the Supreme Court, because I think every one would, at that point, be potentially up for grabs.

Evelyn Douek:

Okay, so that all sounds very dramatic, and it is. That's why the must-carry provisions get most of the attention, because they threaten to prevent platforms from doing certain kinds of content moderation, from exercising a right that the First Amendment has been widely understood to protect for decades. Alex and Brian mostly agree about this. They might differ a little around the very edges about some of the laws that might be constitutional, but they agree that what Texas and Florida have done is like taking a sledgehammer to settled First Amendment law. Where they disagree, though, is on the much less sexy-sounding transparency provisions. The transparency provisions require platforms to do things like disclose what their content moderation rules are, issue transparency reports about their enforcement practices, provide users explanations for content moderation decisions, and allow those users to appeal decisions that they don't like. That all sounds pretty bureaucratic, but, to my mind, these provisions could be the real dark horse of the NetChoice cases, the blockbuster ruling that people aren't expecting as much because they're focused on the must-carry provisions. When I asked Alex if he was nervous about the possible outcome of a Supreme Court ruling in these cases, he said ...

Alex Abdo:

To be honest, I'm more nervous about how the court will analyze the transparency provisions of the laws, because it seems very clear to me that the court is going to strike down the must-carry provisions, but I don't know how the court is going to analyze the transparency provisions. For that reason, it seems to me as though more is at stake. What the court says here will decide whether any of the proposed regulations in Congress ... some of which are a lot more thoughtful than Texas' and Florida's laws ... have a chance at being upheld by the courts.

Evelyn Douek:

Now, one thing that both sides of the political aisle agree on, one of the few things, is that platforms are not to be trusted, and we need to know what on earth is going on behind their closed doors. Transparency is something that a lot of people can get behind, but the platforms have argued that these requirements are unconstitutional, because they are thinly veiled attempts to try and stop platforms from doing any content moderation at all. They argue that forcing platforms to explain themselves will chill their decision-making and make it overly burdensome.

Brian Willen:

You can look at the 11th Circuit decision in the Florida case, at the provision that we talked about earlier, which is a provision that says essentially any time any user content is not just removed, but deprioritized or made harder to find, you have an obligation to provide a detailed explanation to the user. If you took that seriously, that would apply to decisions that are made billions of times a day.

Evelyn Douek:

When Brian says billions, you might think he's just being extravagant, but it's not really an exaggeration. For some of the largest platforms, they're making millions of decisions every hour.

Brian Willen:

The idea of actually providing a detailed explanation every time YouTube removes a comment from a video or every time Facebook makes a certain post harder to find algorithmically without actually removing it, it is so massively burdensome and complicated to do that that a provision like that can really only be understood not as actually mandating editorial transparency, but as creating so much of a burden on any exercise of these underlying editorial judgments that it operates to effectively deter services from making those judgments.

Evelyn Douek:

The platforms' argument is that the states have made it pretty clear that they don't like the content moderation that platforms do, and so these transparency provisions are just an indirect way of disincentivizing them. Again, the states didn't really help themselves here by saying all the non-First Amendmenty stuff that they did when they were passing the laws. Texas and Florida absolutely made it clear that one of their prime objectives was to stop platforms doing so much content moderation. But these cases aren't just about these laws, and the decision that the Supreme Court makes here has the potential to shape the constitutional landscape for generations. We don't want to throw the transparency baby out with the Texas and Florida bathwater.

Alex Abdo:

There's a saying in the law that hard cases make bad law, but I think in this case, really easy cases have the potential to make bad law, because Florida's and Texas' laws should be really easy from a First Amendment perspective. But I think for that reason, there's a risk that the court rules more broadly than it should in striking the laws down, making it harder than it should be for states to pass thoughtful regulations of the platforms, including transparency requirements. I think it's possible to craft a rule that enables narrowly crafted, precise government regulation that doesn't run the risk of allowing the Texases and Floridas of the country to pass laws that are really thinly veiled retribution for political decisions that the platforms made. I think you have to account for the substantial arguments made on the other side, the substantial arguments made for transparency. I don't think the rule that we come up with in the end can be an absolutist one, because that will force us into a world where we have to accept that the platforms get to operate entirely unaccountably to the public interest. I don't think the First Amendment condemns us to that world.

Evelyn Douek:

He doesn't think that it does, but he's nervous that it will. There are many people arguing that it should, that the kind of editorial decisions that platforms make are so sensitive and so important that allowing the government to probe them at all, to ask questions about them, is a violation of the First Amendment, because it will change how those companies make those decisions, and that's a thing that the government shouldn't be allowed to do. But taken to that extreme, this argument would mean that some of the most powerful corporations in the world can't be made to give us any information at all about how they govern some of the most important spaces for free expression ever.

EWTN Newscaster:

The new face of Twitter is vowing a crackdown on both child pornography and trafficking on the platform.

Scripps News Narrator:

What remains to be seen is how Musk intends to deal with the problem after gutting teams dedicated specifically to dealing with this content. Bloomberg reported that Musk slashed about half of the specialists who review reports of child porn showing up on the platform.

Evelyn Douek:

Few corporations wield so much power with so little accountability. I asked Brian if this gave him pause. His clients are immensely powerful. Does arguing for the Supreme Court to adopt such a deregulatory approach towards them worry him at all?

Brian Willen:

I am more worried about the power of government to take away the rights of private people, private businesses, and private companies than I am about the power of private platforms in this context. That's not to say that there shouldn't be a healthy conversation, whether it's a conversation about how to enforce antitrust laws or a conversation about how to think about the regulation of the size of businesses across the telecommunications and other industries. I think those are legitimate conversations. But I do think, when it comes to choices that private entities are making about speech, I am more likely to trust the decisions that those private entities make than I am to trust the government to dictate to those companies what decisions they can and can't make.

Evelyn Douek:

Alex understands the fear of government that Brian's talking about.

Alex Abdo:

I think that anxiety has been a major part of the development of a lot of First Amendment doctrine, and the First Amendment, I think, reflects the view that centralized control over speech generally is a bad thing, because centralized control can be abused. To my mind, though, it should be understood to reflect a discomfort both with centralized control within the hands of government, but also centralized control within the hands of private corporations, of the mediums of communication that a society relies on.

Evelyn Douek:

Right now, that good old American ethos of distrusting the government may feel especially appropriate.

Alex Abdo:

I think what's distinct about this moment is it feels that so many of our democratic institutions are failing us. That's a problem that's a lot bigger than the First Amendment. So I understand the impulse to double-lock the door and put a chair up underneath the doorknob.

Evelyn Douek:

But crafting constitutional rules is about playing the long game.

Alex Abdo:

I think the rule that we're fighting for now has to be one that serves society not just over the next few months, but over the next few decades.

Evelyn Douek:

That's the task the Supreme Court will confront when the NetChoice cases, or whatever other similar case it is, end up in front of the justices. As we've heard throughout this series, it's going to be hard. Platforms are complicated, technical, ever-changing beasts, and their role in politics is both profound and volatile.

Young Turks Host:

Twitter has decided to permanently ban Donald Trump.

CBS New York Newscaster:

Former President Trump's Twitter account is back. Elon Musk restored it last night.

Brian Willen:

The fact that Elon Musk has different views than the people that operated Twitter before and therefore the policies might change, is a perfect illustration of that. The state, in response to our arguments, wants to say, "This has nothing to do with values. It has nothing to do with expression. It has nothing to do with your speech," and I think what we're seeing is that it does.

Evelyn Douek:

We're fighting for free speech on lots of different fronts. The fight for free speech that dominates the headlines is a lot about corporations and billionaires. What kind of public sphere will our tech overlords give us today?

Brian Willen:

The whole point of a First Amendment that allows private entities and private parties, within certain limitations, to speak or make editorial choices the way that they want is, of course, we're not always going to like those choices. Those choices can be criticized and, in some instances, perhaps should be criticized, but they are reflective of a larger freedom that we, at least in most instances, afford to private individuals, private companies, private businesses.

Evelyn Douek:

But it's important to remember that that freedom is a product of a particular understanding of the First Amendment that was not inevitable and is not particularly stable. This is a rare area where the constitutional stakes are very high and very unpredictable. Both the left and the right wing of the court may be in play, and what will come out of the court when the NetChoice cases go in is really up for grabs. That legal battle, the push and pull of public and private power online, is another front in our free speech wars. But there's also another way in which what free speech means today is decided.

Alex Abdo:

I think that in the end, the cultural notion of free speech may end up being more important to what society looks like, to what our democratic institutions look like, than the First Amendment itself, for the reason that the cultural notion of free speech is effectively what decides what people say and what they don't say.

Evelyn Douek:

It's like there's the First Amendment as it lives in the case book, and then there's the First Amendment that lives in our ... hearts? Maybe the most important free speech lessons are the arguments we had along the way.

Alex Abdo:

I think it's a debate that every generation is meant to have, to decide for itself, for the very short period of time that it is on earth, what its relationship will be to free speech. I think we need to have that debate over and over, and I think it's important to have it over and over. I think that's part of what it means for people to coexist in a society, to continually reexamine the rules of engagement. I think that's what free speech is. This cultural notion of free speech may, in the end, be more important to what public discourse looks like, to what our democratic institutions look like, than the precise boundaries that the Supreme Court articulates for the First Amendment itself.

Evelyn Douek:

Which may be a weird thing for a First Amendment Institute to say, but it may also be true. The First Amendment is important, and it is important to fight for the vision of it that you believe in.

Alex Abdo:

The Knight Institute will absolutely file an amicus brief if and when one of these cases makes it to the Supreme Court.

Evelyn Douek:

But fighting for free speech is not just the job of lawyers like Brian and Alex or, heaven forbid, law professors like me. It's something all of us can and should do every day when we tweet or toot or dance on TikTok. It's those actions that ultimately make up our free speech culture in the public sphere. We're a long way now from the case about @realdonaldtrump and whether a government official can block people on Twitter, but the question is the same. How can we best realize the goals of the First Amendment in this public sphere profoundly transformed by technology?

This episode of Views on First is produced and edited by Merrin Lazyan and written and hosted by me, Evelyn Douek, with production assistance and fact-checking from Kushal Dev. Candace White is our executive producer. Audio and production services are provided by Ultraviolet Audio, with production and scoring by Merrin Lazyan, and mixing and sound design by David Bowman. Views on First is brought to you by the Knight First Amendment Institute at Columbia University, and is available wherever you get your podcasts. Please exercise those First Amendment rights by subscribing and leaving a review. We would love to know what you think. To learn more about the Knight Institute, visit knightcolumbia.org or follow the Institute on Twitter at @knightcolumbia, or on Mastodon by the same handle. I'm Evelyn Douek. Thanks for listening.

Could just discriminate willy-nilly. Willy-nilly. Oh, no. I can tell already you just wanted lots of tape of me saying willy-nilly. I've been suckered in. Yeah, there's the Easter egg.