Standing up to Dictators… and Facebook… to Save Democracy

By Gwyn Lurie   |   May 24, 2023
Maria Ressa will be speaking on May 18th at Campbell Hall

I don’t have many heroes. Maybe because I’m too easily disappointed. Or because just beneath my optimistic surface lives a somewhat jaded self. Or perhaps it’s simply that it’s hard to find heroes these days who stand up to the test of time, not to mention under the harsh glare of modern-day journalism. But when the folks at UCSB’s Arts & Lectures arranged for me to interview Maria Ressa – the fearless Filipina journalist, former CNN correspondent, co-founder of Rappler, and 2021 Nobel Peace Prize recipient for holding the line in the existential battle for truth – in anticipation of her May 18th talk at Campbell Hall, I must admit I got a little weak in the knees.

Even before Ressa became the first Filipino ever to be awarded the Nobel Peace Prize, this beyond-brave journalist was already a strong candidate for my personal, very short hero list. And by the time our hour-long, mind-blowing, and a little terrifying conversation ended, Ressa was solidly on it. Malala, Gloria Steinem, Frida Kahlo: you have company.

Okay, I get that our country is deeply divided. As is our world. Seemingly more so every day. But why? This extreme division has never made sense to me, since I know that most of the issues over which we spar are complicated, and that “truth,” in most things, is usually found somewhere in the mushy middle. And facts, by definition, can be proven. So why then are so many of us clinging to the extremes? Or, as Ressa argues in her must-read tome, How to Stand Up to a Dictator, are we, in reality, not so divided – and is the idea that we are a fiction being orchestrated by people and platforms that are profiting grotesquely from that notion?

Ressa writes, “The very platforms that now deliver the news to you are biased against facts, biased against journalists. They are, by design, dividing us and radicalizing us – because spreading anger and hatred is better for Facebook’s business… This is anger and hatred that coalesces into moral outrage that then turns into mob rule.”

In no uncertain terms, Ressa names the evil protagonist starring in this real-life thriller. “Facebook represents one of the greatest threats to democracies around the world, and I am amazed that we have allowed our freedoms to be taken away by technology companies’ greed for growth and revenues.”

In How to Stand Up to a Dictator, Ressa argues that there are three assumptions implicit in everything Facebook says and does: “First, that more information is better; second, that faster information is better; third, that bad information – lies, hate speech, conspiracy theories, disinformation, targeted attacks, information operations – should be tolerated in service of Facebook’s larger goals. All three are great for Facebook because they mean that the company makes more money, but none of them is better for users in the public sphere.”

And by the public sphere, she means us. Everyday people who get their news from social media and are constantly being riled up by the political indignance of our “friends.”

But, as you’ll read in this interview, Ressa is not without hope, or solutions, sounding a clarion call for legislation to hold technology companies accountable, for more investment in investigative journalism, and for more collaboration between news organizations and those who care about democracy and facts.

But the part Ressa rests squarely on our shoulders, yours and mine, is the imperative to stand up to bullies. Because, as she repeatedly reminds us, “silence is complicity.” And as human rights attorney Amal Clooney writes in the foreword to Ressa’s book: “If you, a Nobel Peace laureate, can be locked up for nothing more than doing your work, what chance is there for others?”

I hope you will attend the May 18th Arts & Lectures event at Campbell Hall and that you will read this book – and when you do, that you will begin to see how we are being used and manipulated, not only to the advantage of a few masters of the metaverse but, more importantly, to the grave disadvantage of ourselves.

I also hope that you will sign up to attend the Arts & Lectures conversation I will be moderating with Jonathan Greenblatt, CEO of the Anti-Defamation League (ADL), at Campbell Hall on May 22nd. The team at Arts & Lectures has done an amazing job of curating their speakers from a list of key players in the fight for democracy.

Gwyn Lurie (GL): I’m very inspired by your journey. And I’m so happy that you’ve gotten the recognition you have, not only because I think what you have to say is so important, but just for your safety.

Maria Ressa (MR): Actually, that’s one of the first things that did change. I had no approval to travel. They denied travel four times, including the time when my mom was getting an operation; and post-Nobel, that was the first time they gave me the right to travel again. I still have to put in requests to travel, and they go all the way to the Supreme Court.

That one algorithm is a recommendation engine for growth for all the social media platforms. What they didn’t realize was friends of friends pulled society apart, pulled apart the public spheres so that you can no longer have a functioning democracy.

GL: Can you explain why you’ve been targeted?

MR: …journalism is under attack. We don’t have a business model like the tech companies, which have not only poisoned or corrupted our information ecosystem, they’re also microtargeting. News organizations don’t do microtargeting in the same way that social media companies do… we don’t have a business model.

American media is zooming in on Ukraine, but have you heard anything about Myanmar or about Pakistan? The chaos in Pakistan earlier this week – I had friends there who were saying they needed to leak information to me. The world is on fire, news organizations have fewer resources, and I worry that our younger journalists coming in want to be influencers rather than journalists. If you want to be an influencer, you shouldn’t be a journalist.

GL: In your book you write about the difference between “objectivity” and “good journalism” and how objectivity is really a myth. But hasn’t that always been true, that by virtue of what you report on and what you don’t report on, it’s subjective?

MR: Correct. Always. But the difference between when news groups had that gatekeeping power, and now, is that you could hold someone accountable. The biggest difference in our information ecosystem today is its impunity. We now have evidence of Russian disinformation attacking America at the cellular level. And yet, neither the companies that enabled it, nor the country itself, have been held to account. I think that that’s part of it. And as this continues, our values are eroded. These are all cascading failures. In 2018, MIT came out with a study showing how lies spread six times faster than facts. You lace that with anger, and hard data shows the effect: in 2017, women in the Philippines were attacked at least 10 times more than men.

The reason why the world is upside down, and why on the Doomsday clock for democracy we’re in the last two minutes, is because that is the incentive structure that has been created by the new gatekeepers. In my Nobel lecture, I called it a behavior modification system. It really is. We’ve seen this in the Philippines.

GL: I’m curious, did you ever consider going into politics instead of journalism?

The fearless Filipina journalist, former CNN correspondent, co-founder of Rappler, and 2021 Nobel Peace Prize recipient, Maria Ressa

MR: Oh my god. I would never go into politics. No, I’m a journalist. The best part about journalism is I fell into it. In 1986, I was pre-med at Princeton when I got the Fulbright Scholarship to go from the U.S. to the Philippines, largely in search of my roots.

I had deferred admission to law school. I had deferred admission to medical school… I had had corporate job offers and I didn’t know what I wanted to do, but I knew that I never felt completely American. So, I thought if I go do the Fulbright in the Philippines, that would give me a year to understand where I came from and to figure out who I am.

I’m doing these commencement speeches now. I just did Vanderbilt and I’m doing one on Sunday. And it’s such a different world from the world I grew up in, from the America I grew up in. Anyway, I fell into journalism at an incredible moment. It was 1986, the People Power Revolution sent an electric charge through society…

[The People Power Revolution was a series of popular demonstrations in the Philippines that saw the country transition from authoritarian rule under Marcos to democracy. It was during this period that Ressa embarked on a career in journalism. A time when, as Ressa writes: “…in the 1980s, another agreed-upon fact, a foundation of our shared reality, was that without good journalism, without the sound production of facts and information, there would be no democracy. Journalism was a calling.”]

When I was in my twenties, it was either go back to the United States or stay [in the Philippines] and do this startup. Rappler was much, much later, but do this company called Probe. And it was incredible. It’s the best way I could have learned television because we did it the way we thought it should be done, instead of following what others had done. If I had been in the U.S., it would be very, very different than what I wanted to do. And journalism was the best. I loved it.

The reason why I can’t see going into politics is that what I loved about journalism is that it continued the ideals I grew up with. And this is my worry now for this next generation: What are the ideals? That in the end truth wins; that in the end there’s a meritocracy. In the book I wrote about this – the empty mirror: you search for knowledge, you search for a view of the world, and you have to have enough confidence to do that, but not so much confidence that it becomes an arrogance that simply fills the mirror with your image. When you take yourself out of that picture, your image no longer obscures the objective truth that lies behind it. So, it’s like the Buddhist version of Plato’s myth of the cave.

Once I fell into journalism, I loved that I could ask the head of state: What are the principles? And that once you get beyond the ceremonial pro forma questions you need to ask, that powerful figure and I are getting used to each other as people. How incredible to have somebody who has to make these really tough decisions, and you can ask them any question you want. I mean, for television there’s a form, and I wrote about this – it’s the most unnatural way of being natural. But it was such an opportunity to continue to learn, to continue to understand why things happen. That’s why I fell into journalism.

Polarization is another way to describe this. You pull them out, you don’t hear. Democracy is all about listening to all sides and then making up your mind independently.

GL: Social media was something that, from early on in your career, you invested in heavily, right? You saw it as an opportunity for citizen empowerment. But then, as you say in your book, it ended up “tearing down everything you hold dear.” Do you think that your strong embrace of Facebook in building Rappler added to Facebook’s power?

MR: Definitely. And that’s why it felt like a betrayal. But when I look back over it, it’s the same mistake lawmakers make. News organizations have a set of standards and ethics and we self-regulate. Why would we expect tech companies – whose primary motivation is profit, who have no standards or ethics manual, who don’t actually embrace putting guardrails on, like protecting the public sphere – to behave like news organizations? In retrospect, I should have known that. But I guess I thought they were like CNN, because I was there in the beginning days of CNN; I was there during a tailwind. I was there when we were making mistakes because we were growing so fast.

I set up the Manila bureau and the Jakarta bureau and my team in Jakarta was one of 12 teams that tested out new technology and new equipment. So, I gave Facebook that same courtesy. When you’re a fast-growing organization, you make mistakes. But if you are guided by the right principles, you come out of it and you fix it. This is why I realized that news organizations, journalists, are different.

Every decision we made for the public put us at risk every step of the way. When, in the time of Duterte, we published the three-part weaponization-of-the-internet series, I had an idea that it could be dangerous. I rolled that past our board; I secured board approval because I thought the series might actually threaten the business. Now, if they had said no, I would’ve fought it. But ultimately our shareholders believe in journalists and give them the power. Because I’ve worked in organizations where the businesspeople win over the editorial all the time. We created an organization that isn’t like that… But we were the first ones attacked in the courts for it, and hopefully we’re winning.

On that part, I can’t really talk about it because I’m kind of under a gag order, but… I had 10 criminal warrants of arrest starting in 2019, and eight of them came within three months of each other. I just kept getting arrested and then posting bail. But all of them came out of the Philippine depository receipts – five tax evasion charges from that event in November 2015. There were the civil cases and the criminal cases.

GL: So, in 2017, you met with Mark Zuckerberg [Facebook CEO] and that’s when you said “97% of the Philippines are on Facebook” and he said, “Maria, where’s the other 3%?” Were you shocked that he said that, or did you already start to know by then that he was a big part of the problem?

MR: I was still working with Facebook at that point, and I laughed when he said that. I was still hoping that they would do the right thing, and I thought they just didn’t understand it. I don’t know why I gave Facebook the benefit of the doubt. I didn’t know until post-Cambridge Analytica, when we became fact-checking partners of Facebook in 2018. What happened in January 2018 was that they pivoted, and Mark Zuckerberg began saying that they were all about family and friends. So they started throttling news. A news organization in Slovakia dropped 60% of its page views. We tracked this on a daily basis with Rappler. We dropped maybe 15% based on their choices.

Then when Cambridge Analytica happened, that was March 2018 … we tracked the networks for disinformation. And the way we were able to take apart the tactics was because we were plugged into the APIs, which they shuttered as soon as Cambridge Analytica happened. And ostensibly it was because of data privacy issues. But that’s not true. I think it was because they didn’t want anyone else exposing exactly what was happening. And 2019 was when I really went frontal, and I started demanding answers.

When you look at how long it took me, it’s because I’m old power. I grew up at a time when if there is an issue, you sort it out, you pound it out behind the scenes. And in that sense, I wasn’t working like a journalist, I was working as the head of an organization, and it was a tough thing. So, I guess it was 2019 when I started getting arrested. In fact, the first time I got arrested, there were Facebook people in the office, and I thought that would send ripples, but it sent no ripples.

GL: Can you explain what astroturfing is, or the fake bandwagon effect?

MR: Sure. AstroTurf is fake grass, right? So, what we did was we started looking, we started pulling the data and one of the things I think people didn’t realize was that it’s not just the post itself. Let’s say ABC News posts something, and then what happens is in the comment section – this is where the propaganda comes in – lies come in, and they come in quick and fast. There’ll be 20,000 comments… [but they’re not real] and it makes it look like it’s a grassroots statement: We hate what ABC is saying. But in the end, it’s all been done by the same small group. It is insidious manipulation. And what it makes people feel is that, oh, if this many different people are saying this and they believe this, it must be right.

We applied what used to happen with traditional media to this technology platform – which allows exponential lies and in fact spreads lies faster, distributing the lies more. So that was one of the first things they did, astroturfing. And how did I discover this? Because I saw it on my own feed. I saw suddenly so many people turn against me. I guess I still do, but I had a great reputation in the Philippines. The reason the largest network asked me to come back home and run the largest news group is that I was reforming so many of the things that they were working on. But I watched people switch overnight from the long track record I had to “She’s lying, she’s a criminal” – and I was like, this is not normal. And that’s what made me begin to pull the data thread and see where it would lead. So, astroturfing is like fake grass. It is fake. It’s trying to create a groundswell that is manufactured and controlled.

GL: So you have this theory about friends, and then friends of friends, and then friends of friends of friends. Do you think that the original business plan of Facebook always intended to lead to this place? It started out as a way to connect with our friends. But ultimately that became a vehicle for gossip or for influencing people. And now we’re more influenced by our friends than we are by ads or by authority. Do I have that right?

Jonathan Greenblatt, CEO of the Anti-Defamation League (ADL), will be at Campbell Hall on May 22nd with the MJ’s Gwyn Lurie as moderator

MR: Yes. So a book called Socialnomics, which came out very early on, kind of laid out that you listen more to your family and friends. If a family member tells you to go have hummus, you would be more prone to go get hummus than if there was a blinking ad or you watched it on CNN, right? So that was already kind of established as human behavior. Your question is, did they mean for this to go there? No, I don’t believe that they meant for that to happen. I don’t think they’re evil. Although when they see it, when they see genocide and they don’t do anything, I think they cross the line. But in general, was this their vision? No, they didn’t know. What they did have is iterative A/B testing. They tested different ideas in a way that journalists don’t. Because we don’t test things.

What we do, I guess the closest we come, is we (reporters) learn how to tell stories better, but it’s about the story. It isn’t about manipulating the audience; it is about capturing their attention. So, I think what they did is they just wanted to make more money. It’s a profit motive, and the way you make more money is by keeping people scrolling on your site. It’s about that metric. It’s time on site, and the way that happens is that you recommend things. Where they went crazy is they never thought about the harm that A/B testing could do. So, in the book I actually take what every social media organization uses, which is friends of friends. It’s chapter seven, “How friends of friends broke Democracy.” That one algorithm is a recommendation engine for growth for all the social media platforms. What they didn’t realize was friends of friends pulled society apart, pulled apart the public spheres so that you can no longer have a functioning democracy. That’s the gap that opens. Tech people have a name for this – Eli Pariser calls it echo chambers.

Polarization is another way to describe this. You pull them out, you don’t hear. Democracy is all about listening to all sides and then making up your mind independently. What these algorithms have done is splintered our public sphere. It’s like taking one editor and replicating that editor a million times. The last part is an algorithm that makes you stay scrolling longer, and it radicalizes. We’re splintering this way and then we’re radicalized downward. This is the most interesting thing about America today. They think the problems are out there, but they don’t realize, this is radicalization. It’s radicalization, but it’s not in security matters, it’s in politics and it’s an extreme radicalization. And part of the reason I think I saw it earlier is because I studied this. This is what my first book was about.

GL: You refer to the Philippines as ground zero for the terrible effect that social media can have on a nation’s institutions and its culture and the minds of its people. Can you tell me why, in your opinion, does the rest of the world need to pay attention to what has happened in the Philippines?

MR: I think the time to have paid attention was in 2016. I was in Mountain View, and I pointed out to the Google News Initiative that all the data we had indicated this was happening. I think people thought I was crazy, and I was like, this is coming for you. And when did it hit America? When did you finally have evidence of it? January 6th… and we haven’t solved any of these problems.

I mean, let me move it forward. AI is machine learning. This is the first time that humanity was subjected to common-use AI, which in social media is a curation and growth model. In December last year when generative AI was rolled out, again, we didn’t learn a lesson, governments didn’t learn the lesson that you cannot test these things in public.

It’s like giving a drug company carte blanche to test all their drugs in the town square. And then if half the town square dies, “Oh, I’m sorry you died. This is important for us to keep testing the drugs.” So, what they’ve done now with generative AI – we’ve let it out of the gate, it’s testing AI in public, and you are expecting the people who will be affected by this to test it for these large American companies.

And the harm be damned, because here’s the reality… the basic tenet of the first-generation AI was to personalize your feed. I also thought that was insane because I was like, wait, that will create huge problems – if everyone has their own personal feed, how do we have a public sphere? When you each have your own version of reality, that’s called an insane asylum. Think about it; it doesn’t make sense. Their basic premise doesn’t work, and they keep rolling it out.

I just gave a commencement speech, and the hard part is: How do you tell kids to live by values that are being thrown out? Because at the very base level, information is corrupted from the beginning.

GL: Not only do they keep rolling it out, but people keep investing in it. And politicians understand that this is a way to control the airwaves. They understand that this is a way to discredit their opponents and to keep control of the narrative. But we keep using Facebook. So, what do we do? Would you advise people to get off Facebook?

MR: No, no, you can’t. We can’t. A news group can’t, because that is now the primary distribution system – especially if you’re a digital-first and digital-only news site, which Rappler is. There was no way that we could do that. Okay, what do we do? Well, let me give you one stat. In August 2022, a few months before ChatGPT was rolled out, there was a survey of about 800 folks from Silicon Valley who were working on AI. Fifty percent of those surveyed said that if you roll this out today, there would be a 10% or greater chance that it would lead to an extinction event. Extinction, as in a human extinction event.

So, Tristan Harris and I were talking, and he said, “Maria, if you are told that there’s a 10% or greater chance that the plane would crash, would you board the plane?” So, this is what we’re doing. And generative AI is significantly different because of the parameters that they use. What makes generative AI different is that it doesn’t do phrases anymore. In the past it used to be phrases, chunks. Now they do every word trying to replicate the way a human brain thinks, and the growth is off the scale. In the past, to invest – Silicon Valley would want hockey stick growth. But this is exponential. 

GPT-2 was 1.5 billion parameters for every word you would go through. GPT-3, which we used to create about 50,000 pages for our May 2022 elections, was 175 billion. It went from 1.5 billion to 175 billion. GPT-4, which they just released, is 1 trillion parameters. And then they’re going to be releasing GPT-5 at the end of this year. Developers have said that the reason ChatGPT hallucinates is that they put so many variables in place that it’s impossible for them to understand. But what GPT-4 can do is code itself. It is growing on its own. They cannot control it. That’s why these engineers thought that releasing it in the wild could lead to an extinction event. We don’t know what it’s going to become. But all you see coming out of Silicon Valley is, “Oh, this is incredible. It can write all your drudgery notes.” This is part of what I talked about at Vanderbilt. Good God, if you are outsourcing your writing, how will you learn to write?

If you don’t learn the drudgery of writing, how are you actually going to write novels? How are you going to write stories if you don’t do this every day?

So what needs to happen is something very simple: accountability. You have to stop the impunity. They must be accountable for every single harm that comes along the way. And we have enough existing laws to do that. If lawmakers had the political will, they would revoke Section 230 of the 1996 Communications Decency Act. Because Section 230 pretends these companies do not have editorial control.

GL: But the minute they can take anything they want off their site, they have editorial control.

MR: Yes. That’s right.

GL: What has come out of your Real Facebook Oversight Board?

MR: We’re still together. The first thing is we had three demands and within 24 hours Facebook met them. We did this largely because I’m worried about the 2024 elections. Because as I said, the doomsday clock is ticking, and there’s the cascading impact. People are being manipulated on three basic levels.

The first is personal. So now, after January 6th, you’re running after the people who were there, who committed violence that day.

The second layer is sociological. Groups. We know from studying terrorists that groups behave differently. Individuals alone wouldn’t do some of the things groups would do. This is why a mob forms. And these are studies from all the way back. There are the conformity studies, and the Milgram experiments on obedience to authority. The Milgram experiments showed that if you give someone the authority to administer electric shocks to another person – even if that person is hurt, or you hear them screaming for you to stop – because you have been given authority, you can potentially kill a person. And these were good people who were tested.

So, you have these studies, but now we have social media. Sociologically, we behave differently in groups, and that last part is communal violence. We’ve seen it in India and in Pakistan; you’ve seen genocide in Myanmar. And I knew this from when we started Rappler: anything in the virtual world spreads at least four times faster. This was at the beginning, in 2012. So, the last part – we haven’t done enough studies about emergent human behavior.

Astroturfing is like fake grass. It is fake. It’s trying to create a groundswell that is manufactured and controlled.

If you’re doing genetic research, you use vesicular fruit flies. You see emergent behavior from a whole, and you cannot predict what will happen to the species from the individual parts. That is what emergent behavior is. Essentially, it’s evolutionary because we are changing the plasticity of our brains. We are pumping toxic sludge through our information ecosystem. We’re changing our attention spans. It’s changing emotions and that changes the way we look at the world and the way we act, the way we vote.

So those are the three layers. But then the other part is, we are democratically electing these illiberal leaders; Rodrigo Duterte was elected democratically. He was the first of the political dominoes. Brexit happened about a month later. And then you had all the elections. Trump in November…

This is not in the book, but V-Dem – from the University of Gothenburg in Sweden – uses a much more sophisticated way of looking at how democracy has regressed. In 2022, V-Dem said that 60% of the world was under autocratic rule. The next year, in 2023, that number went up to 72%. In 2022, I didn’t pay that much attention because I thought, well, that includes India, and that’s a big country, and China. By 2023, when it goes to 72%, you’ve got to know that the world is tilting. So, the tipping point for democracy is 2024, and between January this year and then there will be 90 elections. Turkey is having its elections May 14th. So that’s critical, and Europe is right next door. And if nothing significant has changed… three key elections will be Indonesia, the world’s largest Muslim population; India, the world’s largest democracy, where the path is already kind of clear; and then the United States. If those three show regression, then that’s the tipping point.

GL: Do you have hope?

MR: Always. How can you not have hope? I just gave a commencement speech, and the hard part is: How do you tell kids to live by values that are being thrown out? Because at the very base level, information is corrupted from the beginning. I always use Stranger Things as an analogy – everyone watches it in the Philippines – we’re in the Upside Down. I still have hope. We will turn it right side up, but the time is now, the window is closing.  

Editor’s Note: This interview was edited for space and clarity.

 
