Nobel Prize Summit 2023 Transcript

Kelly Stoetzel (15:02):

Welcome to the National Academy of Sciences here in the heart of Washington, DC. And, for the next three days, the home of the Nobel Prize Summit.

Sumi Somaskanda (15:23):

The topic is truth, trust and hope. Today, on the first day of the summit, policy makers, scientists, artists, and thought leaders will share their views on mis- and disinformation, how it changes our society, and what we can do to fight it.

Kelly Stoetzel (15:49):

And we’re here live all day from the Kavli Auditorium and from our broadcast studio just down the hall. Welcome.

Sumi Somaskanda (16:01):

Another very warm welcome to the first day of our Nobel Prize Summit. Warm welcome to everyone joining us on our live stream around the world. Good morning, good afternoon, good evening to you, wherever you’re joining us from today. I’m Sumi Somaskanda, as you saw on that video. It is my pleasure to host this part of our program here in the Digital Studio.

(16:21)
Now, misinformation is eroding trust and cohesion, and it runs a risk of becoming one of the greatest threats facing our society today. So what can we do together to build trust in truth, in facts, in evidence, to build a more shared present and a more hopeful future? That’s what we want to talk about with you, of course, all of you joining us for this summit.

(16:46)
Now, it’s being hosted by the Nobel Foundation and the National Academy of Sciences, which is where we find ourselves. You got a taste of this beautiful space during that opening video, and we’re going to show you a little bit more of it through the course of the summit. The main program on stage starts in about 30 minutes, but we’re going to start with a discussion to get us kicked off. And I have two guests joining us, representing the hosts of this summit: Marcia McNutt, the president of the National Academy of Sciences, and Vidar Helgesen, executive director of the Nobel Foundation. Good morning to both of you. An exciting day.

(17:20)
Vidar, tell us more about why you wanted to hold this summit on this topic.

Vidar Helgesen (17:26):

Well, we see across the world that there are very systematic efforts at undermining science, undermining truth, really tearing apart much of the social fabric of society because it undermines the level of trust in society. So that’s a problem, and that’s being reported on in the media, it’s being discussed in academic papers and at conferences. So what we wanted to do is to put it on the agenda, but with a different twist. Hence the title, Truth, Trust, and Hope. We would like to invite people to come forward with what works. How can we actually build trust? How can we make more people believe in facts and truth? And that’s what we’re hoping to get out of this and that’s the rationale for organizing this.

Sumi Somaskanda (18:20):

Marcia, what are you hoping to achieve with the summit over the next three days?

Marcia McNutt (18:25):

So we are attempting to bring together a group of odd bedfellows, people that don’t normally interact on this topic, but when we look at the impact of misinformation and disinformation, it’s not limited to science. We see it in politics, we see it in health, we see it all through society. We are hoping by bringing this group together, people will realize the shared commitment we have to truth, that we will uncover ways to build and disseminate trust. And if we don’t have hope, then what do we have? If we don’t have hope, then people are discouraged and they believe that there is no solution to the problems. We have to instill hope. And I personally believe that science, technology, and innovation are sources of hope.

Sumi Somaskanda (19:25):

So you said odd bedfellows. What are you most looking forward to hearing and seeing in this program?

Marcia McNutt (19:30):

So for example, we’ve got people from the business community, where business has often been pinpointed as a source of disinformation, for example, prominently in climate change. Yet there are other aspects of the business community where they know that disinformation is undermining their ability to provide services to the public. So by bringing together business, politics, science, bringing together information technology, these are the roots of the solutions to these problems.

Sumi Somaskanda (20:13):

Vidar, what are you looking forward to seeing?

Vidar Helgesen (20:16):

Well, I think the most exciting aspect of this will be exactly to listen and learn what is happening out there, what initiatives are going on and emerging to tackle this challenge. We know a lot about how the problem is materializing in many societies across the world. I think we and the world know less about all those good initiatives that are underway. And I really hope and expect that we’ll learn more about that and that this will play a role in a much needed mobilization of all those strange bedfellows in the quest for truth, in the quest for trust and the shared hope that we need.

Sumi Somaskanda (21:00):

Last quick question to you both, starting with you, Vidar. We have this very big digital audience over the course of the summit. How do you think that enhances the conversation?

Vidar Helgesen (21:08):

Well, this is a global issue, and while we have a physical location here, this global issue merits international attention, merits a global audience, and not least we should all ask ourselves what can we do? So those out there watching this throughout these three days shouldn’t only be listening, but also asking themselves, what can I do?

Sumi Somaskanda (21:30):

And Marcia?

Marcia McNutt (21:31):

I think it’s so important for our digital audience globally to feel like they have shared in understanding the nature of the problem and in its solutions. How can you build trust if you don’t feel you understand where people are coming from, where they’re going, and you share their goals? So by bringing in this digital audience, I think that we can make everyone feel that they understand the nature of the problem and that they’re committed to the solutions.

Sumi Somaskanda (22:03):

Marcia, Vidar, thank you for getting us kicked off. We hope you enjoy the summit, as I’m sure you will.

Marcia McNutt (22:08):

Thank you so much, Sumi.

Sumi Somaskanda (22:09):

All right.

Marcia McNutt (22:09):

It’s been a pleasure talking to you.

Sumi Somaskanda (22:11):

Likewise. Well, you just heard it there: please stay with us, our digital audience, and do feel that you can participate, and that you should. We want to hear your comments and your thoughts. As you can see from the agenda on the website, there are a number of speakers who have very interesting things to share with you today. We have prominent speakers from all over the world, including 11 Nobel Prize laureates, and of course organizations like the United Nations. Achim Steiner is the administrator of UNDP, the United Nations Development Programme, and he has this message.

Achim Steiner (22:44):

It is a privilege to share some brief remarks on our global community’s efforts to tackle disinformation on the occasion of this year’s Nobel Prize Summit. As Max Eisen, a Holocaust survivor, stated so eloquently, it starts with words. With this, he evoked the undeniable role of human communication in enabling a path towards hatred, conflict, and even genocide. The use of words to deceive, confuse and incite is not new. What is new in today’s information ecosystem is the speed and vigor with which information, both positive and negative, can now spread. Consider, for instance, vaccine hesitancy, fueled by disinformation, which had deadly consequences for countless people, or look to algorithms that are [inaudible 00:23:41] engagements over our health, wellbeing, human rights, and gender equality. Over time, these doubts and divisions undermine trust, hope, and ultimately the truth. In countries suffering from fragility or crisis, disinformation can swing the pendulum in a sometimes devastating direction.

(24:05)
UNDP is today supporting dozens of countries to harness the positive benefits of digitalization, helping to advance progress across the global goals. That means ensuring that it is inclusive, does not damage social cohesion, and does not exacerbate inequalities, including gender or racial bias. That support includes UNDP’s iVerify system. This AI-enhanced fact-checking and monitoring tool has helped to mitigate the spread of false narratives during elections in Honduras, Kenya, Liberia, Sierra Leone, and Zambia. It has been added to the Digital Public Goods Registry as the first ever digital public good on fighting mis- and disinformation. UNDP and the Digital Public Goods Alliance are also crowdsourcing the very best open source solutions and concepts to challenge disinformation, which will in fact be showcased at this Nobel Prize Summit. This is a challenge that requires a human-centered response. UNDP is working to support locally led solutions, everything from fact checking and media campaigns to equipping young people with digital literacy skills.

(25:34)
Nobel laureates, excellencies, ladies and gentlemen, the United Nations has a clear vision of the future in which societies use science, information and technology to unleash creativity, expand human rights, and advance the health and wellbeing of people and planet. That will include a new code of conduct on the information integrity of digital platforms. Indeed, disinformation directly threatens democracy and peace, but only if we let it. At a time of profound global challenges, Nobel laureates remind us of the infinite possibilities that lie at our fingertips to shape a better world. While transformations like the fourth industrial revolution have become inevitable, shaping them in ways that will benefit our global family of nations, communities, and people is truly both a science and an art. Thank you.

Sumi Somaskanda (26:41):

All right, now it’s time to welcome our first two speakers. They’re heading to the main stage in less than 30 minutes, and they’re here with us in our digital studio. First, I’ll welcome them here. Alberto Ibarguen is the CEO of the John S. and James L. Knight Foundation, and Rachel Kuo is an assistant professor at the University of Illinois at Urbana-Champaign.

(27:02)
Good morning to both of you. You’ve both given a lot of thought to how misinformation impacts us historically, currently, and as we look to the future. So we wanted to get your takes on this. Let’s start with you, Rachel. What is mis- and disinformation? How would you describe it?

Rachel Kuo (27:19):

Yeah, I would say a lot of people from the start think of mis- and disinformation as a question of true or false, as struggles over truth. But really, what we’re also thinking about is the ways that people are thinking about information and power. Are they trying to maintain and preserve power they already have? Are they trying to disrupt power? And so when we think about how we discern misinformation, for many people it’s actually about our political commitments, the kind of commitments to how we’re thinking about climate, about justice, about equality, and that very much undergirds the narrative. And with mis- and disinformation, something really important to think about is that these are not new social problems; they build on preexisting histories. And the reason that some of these harmful narratives can be really compelling to people is that they build on age-old tropes. So I think it becomes really important to think about this moment in historical ways.

Alberto Ibarguen (28:18):

And I think you might add that mis- and disinformation are not the same thing. It seems to me the implication of misinformation is that somebody made a mistake, or that something is misstated or is simply not close enough to an actual fact, whereas disinformation is where you specifically intend to mislead. They’re so often used together, I think that distinction’s important to make.

(28:51)
Our focus at Knight is really less on the kind of global issues that you’ve raised and more on finding the common ground in local communities. So our emphasis is on not-for-profit local news, on the theory, which isn’t original with us, that you cannot have a functioning democracy or republic, or a combination of both, unless you have an adequately informed citizenry. And that citizenry has to be sufficiently informed to be able to make useful decisions on behalf of the whole. In other words, the phrase “so that the people can determine their true interests,” which comes from our founder, Jack Knight, implies that all of the people participating have been sufficiently informed that they can make an intelligent decision in the best interest of the whole, not just their slice or their perspective.

Sumi Somaskanda (30:03):

That’s the idea of a shared present that we started this summit with today. Rachel, who is responsible for misinformation? I mean, where does it come from?

Rachel Kuo (30:12):

Yeah, that’s a big question, because I think as Alberto also pointed out, disinformation is also about people with certain intentions. They have a certain goal, and there are different scales. You can think about certain nationalistic or authoritarian governments wanting to maintain particular regimes. You can think about specific political parties that have different interests that are also trying to sow information into different spaces. And then that gets taken up, and I think about this in really personal ways, by aunties and uncles. I think actually when many people think about misinformation, they’re thinking about their family WhatsApp groups, their text threads, and the kind of memes that circulate. But that didn’t begin just with the auntie and uncle at home; it also began in other places. And so I think the scope of who is responsible becomes really big, because these information networks are also transnational. It can start somewhere and end up in somebody’s YouTube at home somewhere else. So there’s a mix of players in this space: the different actors sowing information, the people receiving it, and also the platforms and different mediums on which this is circulating.

Alberto Ibarguen (31:26):

You have the… I’m sorry.

Sumi Somaskanda (31:28):

Go for it.

Alberto Ibarguen (31:29):

You have the same kinds of issues in commercial information. You can have mis- or disinformation. Sometimes if you go to a website and you find all those things at the bottom, I think that’s a field day of mis- and disinformation. When you start looking at all of the potential diseases you can get from breathing, if you go down that rabbit hole, you can easily get misled into clicking and buying a product that really isn’t going to do very much for you. So it’s a simple kind of explanation of how it can be used. And then the same thing, maybe more complex because it’s not as obvious, happens with the political, the national or the partisan kind of mis- and disinformation.

Sumi Somaskanda (32:28):

What are the consequences to society if misinformation and disinformation are not addressed?

Alberto Ibarguen (32:33):

I think the consequence is what we have now. The consequence is that the polling that we have funded at Gallup is beginning to show us that people, and this is in the United States, not only are divided, but that the signs are beginning to say: if you don’t agree with me, you have evil motive. That’s a terrible thing. It’s one thing for you and I to disagree. It’s another thing for me to think that because you don’t agree with me, you wish ill for the nation, you wish ill for the United States. We’ve moved sufficiently far apart and divided to the point where, at least on the extremes, and I don’t want to exaggerate, I think it was about 9%, but 9% is a lot of people, people think that if you don’t agree with me you have evil motive. That’s a real problem.

(33:30)
We have to have, it seems to me, a culture that has basic agreements about our essential beliefs. In a pluralistic society, you’re always going to have divisions, you’re always going to have different points of view. Diversity for us, frankly, is easy. It’s inclusion and equity that really get to be difficult because when you have the diversity and you have all these different points of view, now

Alberto Ibarguen (34:00):

Now, what do you do? We might not agree, or we agree on some things and disagree on others, but because we disagree doesn’t mean that we’re evil or that we have mal-intent.

Sumi Somaskanda (34:10):

Rachel, what is the role of diverse voices and communities in combating mis- and disinformation?

Rachel Kuo (34:15):

Yeah. I think it’s really important, because as Alberto mentioned, it’s really on-the-ground community spaces that are also trying to disrupt some of these old narratives. We can think about the different ways that community organizing and social movements are trying to reshape some of the common narratives that we do have around power. We’re in a moment where diversity, equity, and inclusion are still being struggled over. And I think you were also asking about the stakes of this. One of the things that this conversation about misinformation and disinformation has really brought to the fore is that we have to reckon with the histories of white supremacy, of old hierarchies that exist in our society and that have never really been healed or repaired, that have gone unaddressed.

(35:04)
And I think that’s something that we’re seeing in this space from diverse voices saying, “Wait, democracy has never been something that has been equally accessed by all people.” People who are marginalized sometimes get pushed out of different spaces, institutions, and channels. What are the ways that we can bring those voices to the fore? And in these different struggles, when we start thinking about common truth and narrative, what can we do to actually redress these different roots of inequality?

Alberto Ibarguen (35:38):

I think you’re right. And change and inclusion are never harder than for the person who has power, whether it’s race or whether it’s religion, or whether it’s inside of a business. In the business I used to be in, the newspaper business, my goodness, people were rewarded and praised and given positions of power and responsibility, and then all of a sudden comes digital and everybody says, “Oh, now change.” Well, guess what? Resistance. Resistance all the way down the pike. I think it’s a natural thing that we simply have to acknowledge. And then everything else follows: change begins with a recognition of reality, as opposed to what you wish people actually did or were thinking. Let’s figure out what’s actually going on. Then you’ve got the vision, then the courage to put it out, and the tenacity to follow through. This lady has all the tenacity necessary.

Sumi Somaskanda (36:35):

Very quick last question, because we’re just running out of time and we have to let you go. What’s your biggest expectation for the summit? Just 10 seconds if you can.

Alberto Ibarguen (36:43):

New ideas, things I had never thought about.

Sumi Somaskanda (36:46):

Yes.

Rachel Kuo (36:47):

Really excited just to hear from different voices and perspectives on the topic.

Sumi Somaskanda (36:50):

Well, fascinating conversation with you both. Clearly, it’s going to be a great session on the stage. Looking forward to seeing both of you there. Thank you Rachel, and thank you, Alberto.

Alberto Ibarguen (36:59):

Thanks.

Rachel Kuo (36:59):

Thank you so much.

Sumi Somaskanda (37:01):

Well, to all of you on our digital stream, our live stream, we do want to hear from you as well. We want to know your comments and your questions. So put them directly into the chat there, and we’ll bring them into our conversations through the course of the day. Please be active and share your thoughts with us, and we’ll make sure to address them. For those of you who are watching, the chat is right below the stream. You’ll see it just on your window there, just under the stream. That’s where you can share your thoughts. Why does mis- and disinformation thrive? That’s a question we’ve just been discussing here. What causes us to fall for it? Here’s a brief look at a conversation between Asa Wikforss and Peter Pomerantsev. Take a look.

Asa Wikforss (37:44):

In general, the reason science works, I think, is that we have precisely this open marketplace of ideas where we question each other and we have institutionalized peer review and all that stuff. And the conference is all about trying to find fault with the arguments provided in a nice and non-personal way. I think there is a context where it works, but in the public arena, in the public sphere, of course, there’s going to be all sorts of other things. The conditions won’t be there.

Peter Pomerantsev (38:14):

And also, everything that we see and know about the way people choose information products suggests it’s not necessarily by weighing their credibility. It’s got to do with confirmation bias, or how it fits their identity, or how it makes them feel. There are all these other factors in play, and also very self-destructive urges as well. I don’t think that theory of choice is right: here’s a piece of disinformation, here’s a piece of information, you weigh them up in a certain way and make a rational choice about them. That’s not how it works. Even people who are incredibly educated, academics, when you put them into the real world, will make all sorts of crazy choices.

(39:04)
But also, the marketplace in a public square, that metaphor envisions a place where it’s one person, one voice, and we don’t live in that world. If you own a TV station, you have a much bigger loudspeaker than someone who doesn’t. And if you are an army of bots on the public square, and there’s a hell of a lot of bots out there, and bots are not real people, how does that distort the public square? There are so many things wrong with that metaphor.

Asa Wikforss (39:32):

But I think what’s interesting is the cognitive biases that we have, confirmation bias among them. That’s something scientists have too, but precisely because they question each other, you can’t get stuck in your own confirmation bias. That’s sort of why it works in a social context. When you put a researcher alone, they will do all sorts of crazy things, because no one will question them. So you need to have this. But one thing that’s striking about… Of course, we’re talking a lot about the new digital information landscape, and one thing that’s striking about it, that I’ve been thinking about, is precisely how it interacts with our biases in a much stronger way than the earlier media landscape.

(40:08)
Take confirmation bias. One way to describe the new information landscape is to describe it as a high-choice information landscape. We can just go out and pick and choose. And of course, when we can do that, and the choice is in principle almost unlimited, we’ll go for the stuff that fits what we already believe, or what makes us feel good, or what gets us riled up in a way that feels good. That is, this high-choice information landscape puts the choice increasingly on the individual, in a way that the earlier, more limited-choice information landscape did not. And we as individuals are not very well suited to make those choices. I think that’s one aspect of this.

Sumi Somaskanda (40:48):

Fascinating. A longer part of that conversation will be streamed right here tomorrow afternoon at 1:10 PM. We mentioned earlier that part of the summit is bringing laureates into conversation. There are 11 laureates who are taking part in the summit over the course of the three days, and I am very pleased to have two of them with us here in the digital studio. I’ll introduce them. Saul Perlmutter is a Nobel Prize laureate in Physics from 2011, and Richard Roberts is the Chief Scientific Officer at New England Biolabs in Ipswich, Massachusetts, and also a Nobel Prize laureate in Medicine from 1993. Good morning, it’s wonderful to have you both here. You’ll be on the stage a little bit later and you’ll be taking part in day two of the summit.

(41:31)
Before the stage program gets started, we wanted to get your take on the topics that we’re going to be discussing here and that we’ve been talking about already with our guests, mis- and disinformation. Richard, I would start with you. Why do you think a summit like this is needed to tackle this subject, mis- and disinformation?

Richard Roberts (41:49):

Well, I think you only have to look around you at what is being covered by the media at the moment. There is so much misinformation going around that is causing rifts in society. It is a major problem. And when scientists are accused of spreading misinformation, I think that gets particularly bad, and that I don’t like at all; we have to do something about it. And I’m hoping by the end of this meeting we will have some ideas for actions, of what we might do that will stop the spread of misinformation.

Sumi Somaskanda (42:22):

Saul, what’s your thought on this?

Saul Perlmutter (42:24):

I think we were all really struck during the pandemic by the fact that even when people’s lives are personally at stake, they can still be drawn into these misinformation bubbles. And it struck me that this was one of those moments where we really wanted to ask, what can science do to help?

Sumi Somaskanda (42:44):

Yeah, you are both laureates. Starting with you, Saul: how does mis- and disinformation affect the work that you do?

Saul Perlmutter (42:51):

Well, the whole construct of how science approaches the world is affected when people are not working to build a trusting environment. You can’t actually do science unless people have a sense that they can rely on people trying to get at the underlying reality of what the world is offering us. This is really very much getting at the underpinnings of what it is that we do.

Sumi Somaskanda (43:18):

How about for you, Richard?

Richard Roberts (43:19):

Well, I feel the same way that it is very difficult to be a scientist these days when you have members of the government attacking you. One of the great things I’ve always found about science is that we can disagree vehemently about things, but we don’t find it necessary to shoot one another in order to settle our differences. We can have a reasoned debate, and in the end, the facts that come from science will win and we know what’s what. And to have politicians deny the facts of science, that is something that is very disturbing to me in particular.

Sumi Somaskanda (43:53):

Within your field of science, what are the tools that you use to combat mis- and disinformation to forge a common understanding of what truth and facts are? Because maybe that’s something we can all learn from.

Richard Roberts (44:03):

Well, I do quite a lot of public speaking. And information about genetically modified organisms, for instance, is something that I’ve been going around the world talking about, because here is an area where there is tremendous misinformation, and I try to make sure that I put out material that accurately reflects what we know, which says GMOs are safe because the science says they’re safe. And I get very disappointed when I continue to hear that the non-GMO people like Greenpeace are trying to undermine the science and deny the science. And for me, this is terrible. This is just awful.

Sumi Somaskanda (44:45):

Saul, how about for you? Having a conversation where there’s a common understanding of truth and shared facts, what are the tools that you rely on?

Saul Perlmutter (44:52):

Well, one of the things that I’ve been particularly working on, actually with the Nobel Foundation, is a way to educate younger students. The idea is to go back to high schools and earlier and find ways to teach how it is that science actually does what it does. And I think people should understand that we use an adversarial process, where people are trying to figure out what’s wrong with the current ideas; we are mostly trying to figure out how we are making mistakes ourselves. There’s this very strong tradition of looking for where it is you’re going wrong yourself, which we don’t see understood in the public in that same way.

(45:27)
We also ended up teaching something which we’re actually trying out in this particular summit, which is the idea of what it takes to get a good deliberation going in the public. And that includes both an understanding of the facts that are in play, but it also takes a representative sample of the public and gives them a chance to use their values and their fears and their goals as part of the conversation. At the summit, we’re trying out this idea of demonstrating this with what’s called a deliberative poll. And in fact, I think there’s still room for people who’d be interested in signing up for it. It happens tomorrow in the afternoon for the digital audience.

Sumi Somaskanda (46:08):

Right. Well, that’s an interactive way again to include the digital audience. Just in the last minute that we have, we’ll start with you, Saul: what are your expectations for this summit?

Saul Perlmutter (46:18):

For me, if it’s at all possible to focus a group of people who come from all different parts of the story together on solutions, then that would be the outcome that I’d be looking for. I think that when we see a problem, we’re much better at solving it. And I think we’ve now seen it in the last few years, and now is really the time to turn to ask, what are the different approaches that we can use to deal with the situation?

Sumi Somaskanda (46:41):

And Richard, how about for you?

Richard Roberts (46:42):

Yes, I’m really looking to see if there are some new ways in which we can engage social media, the TV, regular media in order to get over the misinformation gap that is out there. Maybe there’s something new we could do that I’ve not thought of yet. I’m hopeful something will show up.

Sumi Somaskanda (47:02):

Maybe we’ll find that out together. Okay. Thank you both so much, Richard and Saul. Great to talk to you and we’ll see you a little bit later in the summit. And with that, we’ll hand you over to the Kavli Auditorium and to my lovely colleague Kelly, who will be taking it away with the program on the main stage. Enjoy.

Video Regina (47:16):

Were the eyes or ears or nose or memory playing tricks on them? Did they witness something out of the ordinary course of nature? Shall they keep quiet about it or shall they tell? Shall they tell or shall they keep quiet? They could see the sound. They could hear the light. Plainly, the world held wonders of a kind. A light, a light, a light flickered ahead. Did they see something that wasn’t there? What was this light? A flashing light. What are they to believe? A light, a craze, a life. At first, no one seemed to notice. There were no reports.

(48:53)
But suddenly astrology, the Bermuda Triangle, Bigfoot, the Loch Ness Monster, ghosts, the evil eye, multicolored halo-like auras surrounding the heads of everyone, extrasensory perception, telepathy, precognition, telekinesis, remote viewing of distant places, the belief that 13, 13, 13 is an unlucky number. Many no-nonsense office buildings and hotels in America pass directly from the 12th to the 14th floor. Why, why, why take chances? Bleeding statues, carrying the severed foot of a rabbit brings good luck. Razor blades stay sharper in magnetic fields. Phone calls from the dead. All of them collect prophecies of Nostradamus. Untrained flatworms learn a task by eating the ground-up remains of other better-educated flatworms.

(49:54)
More crimes are committed when the moon is full. Palmistry, numerology, polygraphy, comets, the menace of the universe, loose-leaf tea leaves, the shapes of flames, shapes of shadow, shapes of excrement. And even for a brief period, a Russian elephant speaks. Oh, the lost continent of Atlantis will rise. Prophets sleeping and awake. Diet quackery. 10,000 steps a day. Eight glasses of water. Faith healers, Ouija boards, the hundredth monkey confusion, spontaneous human combustion, perpetual motion machines, unlimited supplies of energy. The world would end in 1917. The world would end in 2020. The world would end in 2022. A small brontosaurus crashing through the rainforests of the Congo Republic.

(51:05)
This light was all noise. The evidence was always crummy. There is no laboratory of the paranormal. And now how have the lights from that night warped after 25 years? What is going on with the light, the light in their hands? When they talk about lights, they are talking quite specifically about every image that shimmers around the edges. A flash of light, a blinding light, an image taken without form and face, input 10,000 images, output data as infrastructure. Collect, collect, collect everything, extract the mouth, extract the eyes, extract the movements, calculate spending, calculate emotions, calculate location, exponential growth. An eye in the sky. An eye in the sky. Eye in the sky. Mobilize the arsenal of data. Look at the anomalies in the data, transform the patterns of the past into predictions of the future. What was to become of this light? Disinformation’s web, tangled threads of falsehood spread. Truth obscured, misled. How could a tone become a picture and light become a noise? All of this noise seems like the end. The end. Oh, no. Is it the end? The end. If it ended, it would start one May morning as tulips, ripe seeds, and narcissus break the vow of spring silence. An experiment, a discovery, a theory. The theory of relativity. If it ended, Bigfoot would not be resurrected but the theory of relativity would, written exactly as it was before. Proof, your actions, entangled photons, the physical modeling of Earth’s climate, the Hepatitis C virus and Hepatitis D genome, the sign of magnetic super-exchange in materials, the synthesis of molecular machines, the accelerating expansion of the universe through observation of distant supernovae, green fluorescent protein, GFP, asymptotic freedom in the theory of strong interaction between quarks and gluons, genetic regulation of organ development, density functional theory, cooling and trapping of atoms with laser light, split genes, mobile genetic elements, superconductivity in ceramic materials, chemical synthesis on a solid matrix, peptide hormone production of the brain.

(55:24)
Sandwich compounds, the art of organic synthesis, the bubble chamber, sex hormones, cosmic radiation, the function of neurons, the scattering of light, typhus, the constitution of the bile acids, plant pigments, automatic valves and gas accumulators in lighthouses and buoys, the structure of the nervous system, spontaneous radioactivity. And yet, only now the beginning of the search. What are we living with that is left to uncover?

Kelly Stoetzel (56:15):

Hi, y’all. Good morning everybody, and welcome to the Nobel Prize Summit. I’m Kelly Stoetzel, and I’m going to be your guide through the day. The first thing that I think that we should do is invite out the creator of that piece that we just opened with and have her tell us a little bit more about it. Smriti Keshari. Smriti, that was amazing. I bet nobody here thought that was the way we were going to start today. Can you tell us a little bit more about it and tell us what we just saw?

Smriti Keshari (57:08):

Yeah, absolutely. What you actually saw is inspired by an original-

Kelly Stoetzel (57:16):

Oh, do we-

Regina (57:16):

You can be closer to me.

Kelly Stoetzel (57:17):

Can we get a handheld?

Smriti Keshari (57:18):

There we go.

Regina (57:18):

Move closer to me.

Smriti Keshari (57:19):

There we go. How’s this? Can you guys hear me? Okay.

Kelly Stoetzel (57:22):

Sorry, everybody. Let’s grab a handheld because we have some people online as well that we want to make sure get to hear this. Just a second, y’all. Here we go.

Smriti Keshari (57:39):

Thank you, Ray. Hello. What you actually just saw is an original… it’s an inspiration from an original Beckett piece called Not I. And it was a form that I came across that I was really fascinated by. This idea of a solitary mouth, really a mouth performing, a mouth taking you through this emotional journey. I think so much of the information that you come across, that you hear, that you experience comes from somebody’s mouth. And so when the wonderful Ann Merchant of the National Academy of Sciences approached me about creating a piece to kick off today, I thought this was the perfect opportunity to be able to adapt it around the themes of misinformation and disinformation. One of the central things in all of my work is, how can I take audiences on an emotional journey through a subject matter? That’s really at the heart of this. How can you take someone through an emotional journey of what it feels like to experience misinformation, disinformation?

(58:43)
When you look at the misinformation of the past, it’s so much around pseudoscience and superstition. And there is an allure, there’s a seduction. It’s humorous. When you look at so much of the misinformation and disinformation of the present, there is an absolute techno-optimism or a techno-pessimism. There is fear and fear-mongering. But then there is the reality, and when you think about the most important thing, and that is hope, because the sky is blue, that’s where hope is: always in the action. It was such a joy to adapt it and look at science’s action. What the amazing Regina was presenting as she appeared on stage is 123 years of Nobel discoveries.

Audience (59:30):

[inaudible 00:59:30] mouth.

Smriti Keshari (59:33):

Right? I just want to say thank you, Regina, that was amazing, and thank you to the National Academy of Sciences for confronting some of the bold and pressing issues of our time.

Kelly Stoetzel (59:50):

Ah, thank you so much, Smriti and Regina. Wow. Well, you may have noticed that at the end of that piece, the camera was turned on the room here, and I was thinking we should hear a little bit more about who is here in the room with us. So by applause and cheering and whatever, who has been in this theater before? Come on. That’s a lot of y’all. Okay. Whose first time is it here? Excellent. Okay. Do we have any members of any of the academies or Nobel Laureates here in the room? I mean, how cool is that? That’s crazy. I just wanted to say that. And then yes, amazing. Okay. Who is excited to be here for the day ahead?

(01:00:54)
Well, I know I am, and it’s going to be thrilling. I can tell you that. And it’s going to be provocative and there may be even a few surprises that get you to think a little bit differently by the time we make it to the end of the day. We’re about 800 here in the room and another 5,000 plus listening online. Hello to our online friends. And it’s just going to be an important conversation all day. And it’s so good to have all of you part of it talking about Truth, Trust, and Hope. The program today is going to be broken into three chapters, and we’re going to begin by using history as a guide to better understanding misinformation and disinformation. After all, misinformation has been around as long as we humans have been on the planet. And then we’re going to take a little bit of a closer look at why that is and how we fall prey to that. And then finally we’ll look at how we can pave the road to a more hopeful future.

(01:02:03)
These chapters are really programming chapters and there’s a single extended break scheduled in the day for luncheon that’s going to be from 12:30 to 2:00. So if you’re in here, in the room here and you need to get up, we just ask that you try to do that between speakers. And then while we’re talking theater etiquette, let’s all just take a moment to check our phones and make sure ringers are off. And then just tuck those babies away because you’re going to want to stay totally focused. This stage has got some of the world’s most remarkable humans about to come share their wisdom with all of us. And you are not going to want to be distracted for even a second.

(01:02:43)
We’re going to kick things off today by looking backward, because this is not our first rodeo with misinformation. History can help us understand what circumstances give rise to bouts of misinformation, where the pressure points lie, and how we address the problem. Okay. Remember how we were just yelling and cheering a minute ago? Let’s do that as we bring out our first speaker. He is the President of the John S. and James L. Knight Foundation. He has a long, illustrious career in journalism, and he has received many major awards along the way. He’s lived on the front lines of our experience with news, both real and fake, and he’s here to help us take a look at the past to make sense of today. Let’s welcome Alberto Ibargüen.

Alberto Ibarguen (01:03:44):

Thank you. Thank you very much. It was wonderful to begin with Regina and Smriti and their exploration of how to hear the light. I’m Alberto Ibargüen. I’m not one of the remarkable people you’re going to hear today, so lower your expectations. I want to tell you a little bit about how we’ve thought about this at Knight, and how I’ve thought about it, actually, for most of my life. I’ll tell you two quick stories, one from 1965, the first time I ever saw John Cage perform and met him. He came to Wesleyan, where I was a student. He put some metal sheets thrusting out toward the audience from the back of an old New England brownstone chapel. The program said this was a program in electronic music. I thought, music, I like music, I’ll go hear it. And he put microphones along the side and stood in the back and rolled ball bearings down, and it went plink, plank, plink, plank. And I thought, if this is music, then there are no rules except for the blinders we’re willing to put on our own imagination.

(01:04:55)
But that structure gave me the opportunity to think about music in a different way. Thirty years later, in the nineties, I met Jerry Yang, the co-founder of Yahoo, and I asked him, “Who do you hire? What’s your favorite kind of employee?” This was as Yahoo was beginning to invent how we used the internet. I thought he’d say engineers. I thought he might say, I don’t know, mathematicians or scientists. He said, “Philosophers.” I said, “Philosophers?” “Yes. Because philosophers are the ones that have the most experience at connecting the dots, at seeing the relationship of things that appear to have no connection with each other. They’re the ones who can really guide us into the dark, where we really don’t know where we’re going.”

(01:05:49)
And that’s actually how we’ve thought about approaching these problems at Knight Foundation for the last almost 20 years, whether through news challenges that had basically no rules; it was how to deliver news and information to geographically defined communities on digital platforms so people could have basic information. Or arts challenges that also had basically no rules. Art in Miami: what’s your idea? We’ve got money, you’ve got ideas. And that’s how we’ve done the funding. This conference, this exploration that began today with Regina and Smriti, this exploration of how to think about things, is an absolute natural for us. In all of this, the structure is what allows us to make practical the imagination, the findings that we come up with. In a world of division, I think we need balance. We need more evidence, we need more science.

(01:07:04)
We’ve funded over the last five years about $90 million of social science research. In the world of Google, that’s probably a committee. In the world of social science research, that’s a lot of money, designed actually to explore issues like mis- and disinformation, privacy, antitrust, Section 230. We never tell a researcher what to conclude. We are always looking for ways to disseminate the information produced. And in a sense, that culminated yesterday in the announcement of a new institute at Georgetown University, the Knight Georgetown Institute, that will attempt to translate scholarly research and engage people, particularly policy leaders, in a discussion about ways

Alberto Ibarguen (01:08:00):

… ways in which we might be able to address these important issues. And it will be placed, not coincidentally, at the Georgetown campus that’s about a five-minute walk from the Capitol. It’s a collaboration with Georgetown University, which very importantly has, as its core mission, something that we share, and that is the public good. We share with our friend and colleague Alondra Nelson, the former acting director of the White House Office of Science and Technology Policy, the view that scholarship, to be most effective in a democracy, needs not only to be available, but needs to be as much as possible accessible. That’s how the people can determine their best interest.

(01:08:51)
Your appetite for scholarship, for knowledge, gives me hope, and history gives me confidence. We’ve been there before. I know this is going to sound maybe a little arcane, but think about Gutenberg. Before Gutenberg there was order. The monks would illuminate a manuscript or two a year. The cardinal would put his imprimatur, it was always him, the cardinal would put his imprimatur, and there was truth. You knew what you were reading, you knew it was true. Gutenberg mechanized the Chinese invention of the printing press, and all of a sudden, any Tom, Dick or Martin Luther could run a mimeograph machine, and all hell broke loose, and for, I don’t know how many, for a hundred years, people couldn’t figure it out. And the lament was the volume. How can we deal with the volume? The speed of information was absolutely stunning, shocking.

(01:09:50)
And go through history and look at your favorite crisis: the Industrial Revolution, the Communications Revolution, and now we have ChatGPT on top of the digital revolution. Not everybody is like Tim Berners-Lee, who did not take out a patent on the World Wide Web. We have to deal with the reality of commercial self-interest, and we are here to talk about the public good. This is a balancing act. This is the real world, and we want this conference to make contributions to that public good, like the Science Foundation, like Georgetown, like Nobel. Nothing could please us more at the Knight Foundation than to support your efforts. Many of you we happen to work with, to support your efforts toward the public good. I am thrilled you’re here, and we are really honored to sponsor this conference. Thank you very much.

Kelly Stoetzel (01:10:51):

Thank you so much, Alberto Ibargüen. So we hear a lot about technology and misinformation, and while it’s true that technology can speed up the spread of all communications, it’s not what is at the root of the problem. Our next speaker’s research focuses on the place where technology intersects with politics and social movements. Let’s welcome Rachel Kuo.

Rachel Kuo (01:11:33):

Hi everyone. It’s so wonderful to be here with you this morning. Really excited for the next couple of days. Just to get us started, I want to do something a little interactive and just get a show of hands of how many people in this room in the last couple of years have received information from a loved one that you were like, “This is kind of inaccurate.” And then how many of you were like, “Ooh, let me try to intervene or do something about it.” Okay.

(01:12:04)
So one of the things I think that really brings us collectively into this room is the ways that mis- and disinformation have impacted us in personal ways. And along with that, we also see a connection between information, belief, and action. So from the information that someone is receiving: how does that impact whether or not they get vaccinated? Will they go out to vote? How will they vote? Are they becoming indoctrinated into some forms of racist ideology? And on the other hand, there’s a hope that through some change or intervention, we can change either the distribution or maybe the content of what people are consuming, and that can also change belief and action, and potentially lead to larger-scale change.

(01:12:51)
Beyond the personal, something that also brings us together is the transnational scale of information and harms. We’re here at a global summit, and so something that we’ve thought about with mis- and disinformation is its relationship to how people have thought about governance and disparity, and also its connection to mass atrocities in different places.

(01:13:15)
And so when we think about mis- and disinformation, there’s a kind of work that we have to do in coming to shared definitions. They’ve also become somewhat messy and ambiguous terms that are often used to diagnose a variety of social problems. On the one hand, we’ve seen the connection to how people have talked about electoral outcomes and pandemic responses. But on the other, if we think a little bit more about the everyday, we have also seen the uses of information in different harmful circumstances. That can include the lies that a landlord might tell lower-income tenants to displace them from their homes to build luxury housing. We can think about the day-to-day crime coverage and narratives around criminal history that justify racialized tropes and legitimize different forms of violence against communities.

(01:14:10)
And so the kinds of problems that we’re experiencing now are not new ones, but draw on histories of inequality and the ways there have always been deeper myths and narratives that have existed to justify racial and economic forms of disenfranchisement. So often when I’m thinking about mis- and disinformation, I’m thinking about power and politics. And often, what we discern as true or false is also about applying our political lines, commitments, and values. And I know some of you might be like, “Ugh, politics really got us into this mess in the first place, the polarization of information.”

(01:14:49)
But I think what’s really important to think about is that what I mean by politics is this vision of the world that we’re struggling towards, the world that we want to have together, and also the kind of world that addresses some of these root causes of harm and violence and inequality that exist in the present. We cannot ignore the fact that as mis- and disinformation have entered the public sphere, they have been tied to a resurgence of ethnonationalisms and white supremacy, not only here in the US but also globally, and people are reckoning with that. And this, again, builds on different histories of violence that have also been sanctioned by different institutions. Knowledge production has always served particular interests, and what is considered common knowledge and truth is something that people have always struggled over. And so I’ll offer you an example from the COVID-19 vaccine. On the one hand, it has seemed like a really simple disease prevention and harm reduction tool at very low risk to most people. And at the same time, we’ve really seen the ways that it has polarized people and revealed different fractures in how we collectively, as a society, think about health and safety. What does care for immunocompromised people look like?

(01:16:09)
And so when we think about how history and politics help us illuminate some of these broader issues around information, we can look, for example, at the ways that institutional distrust has been tied to different histories of institutional neglect, harm, and mismanagement. You have the long history of racial and colonial violence in the eugenics movement and forced sterilization, where science and medicine have been used to deem who is quote-unquote fit to live.

(01:16:41)
You have different histories of the uses of public health to justify forms of state violence, from the burning of Honolulu’s Chinatown by the Board of Health during an outbreak of plague, to the detention of Haitian refugees in the early nineties because of fears that they were all carriers of HIV. You have histories of the testing and data harvesting, without consent, of communities often racialized as quote-unquote underdeveloped, like the birth control trials in Puerto Rico.

(01:17:12)
And so all of these different examples offer us a glimpse into why there might be intergenerational distrust, and also how these stories play out today in terms of stark lines: who is getting equipment allocation, who is receiving pain relief in hospitals. And so I think that’s one way that we can look at this.

(01:17:34)
Another thing that the vaccine has illuminated are the contemporary and historical geopolitical tensions. Some people have hesitated around the vaccine because of their perceptions of the governments or nations behind the distributors or manufacturers. Other people might have institutional distrust because of their own lived encounters with forms of government surveillance, policing, and data collection, or, if they have precarious immigration status, certain fears of what encounters with institutions might mean in terms of detention and deportation. And these are not true cause-and-effect links; getting vaccinated doesn’t mean that this will happen to someone, but they play on people’s real lived fears and precarities and how they exist vis-à-vis the state in an everyday way.

(01:18:26)
And finally, when we think about the vaccine, what it has also illuminated is a real lack of adequate language translation and interpretation in social services, including how difficult billing, insurance, and payment services are for people to comprehend. But also, when there is information, it isn’t reaching people in the actual channels, venues, and mediums that they consume on a day-to-day basis, which leaves voids of information that can be readily exploited by other actors.

(01:18:56)
And so I bring all of this up because when we start thinking about information, this is not a problem of individual responsibility to get better information or to get good information, but of how we think about intersecting structural harms. It becomes really important for us to begin to come to shared analyses of what the root problems have been. And I think those questions often seem to exceed the scope of mis- and disinformation, but they really bring us to how information has always been bound up in social, economic, and political systems and structures. And so again, our information problems today are not just technological ones. We aren’t going to be able to fact-check, innovate, or police away racism and information harms from our digital spaces.

(01:19:44)
And so what we really need to understand is that we need to analyze power at local and transnational scales, and in the contexts in which people live, in order to actually address these different histories of harm. I’ll offer some examples of how this plays out from my own political position in Asian American communities. When we start thinking about how we understand root problems in the context of history, the local, and the transnational: at the start of the COVID-19 pandemic, we saw a lot of racialized information about the virus as Asian, which built on long histories of media narratives and propaganda of Asian people as dirty, as deviant, and as caught up in a perpetual enemy framing, as the US has engaged in different wars in and out of Asia. And so understanding this present moment means also understanding how the history of domestic forms of xenophobia and racism has also been a transnational story of US militarism in Asia in different ways.

(01:20:53)
I think another thing we can think about is how power actually applies in different spaces. For many technological platforms, often US-based, the platforms become police, judge, and jury of online violence, but they don’t actually have a nuanced analysis of power and context. And so this can end up further marginalizing and harming people who are already oppressed. For example, in India, you saw the Sikh farmers’ protests. Sikhs are a religious minoritized group, and a lot of the activists were targeted. Content about the protests and the hashtags were deemed separatist activity and then removed and de-platformed from a lot of social media platforms, including Facebook and Instagram. You also have [inaudible 01:21:42] and caste-oppressed activists trying to make critiques of age-old hierarchies, and that is getting flagged as hate speech.

(01:21:48)
And so throughout history, we’ve seen the ways that critiques of institutions by different grassroots organizers have been deemed threatening or labeled terrorist activity. In the US, you have the history of COINTELPRO and the targeting of Black, Latinx, Native and Asian movements in the sixties, the creation of the “Black identity extremist” tag during recent Black Lives Matter movements, and, after 9/11, the targeting of Arab and Muslim organizations as terrorist organizations. And so we really have to think about power and how we analyze it.

(01:22:22)
And finally, I think when we think about power and how we analyze it, that comes up when we start thinking about how we build consensus in our communities. For example, a lot of historical traumas and experiences, we’ve seen them be weaponized and exploited in ways that actually further racist and nationalistic interests. In June, we are about to see a very historic vote around affirmative action. And in some Asian communities, there have been anti-poor and anti-Black narratives around meritocracy and qualification used to justify the erosion of access to higher education and to create wedges. And so we also need to organize and build consensus around that.

(01:23:09)
And so really, what is at stake, it seems, for many of us in this room is that we are thinking about the futures of democracy. That’s something that has already come up this morning. But I think what we have to remember is that our current democracies have also been underwritten by different histories of racial and colonial violence. And so there’s still a vision of democracy that we are continuing to struggle towards. And so we have to continue to ask ourselves: when we think about truth and consensus, what we’re really talking about is our shared convictions, commitments and values for the world we want to live in, like a world without racism, a world where people can access affordable, accessible healthcare. These are all part of that bigger picture.

(01:23:56)
And I really see this world as possible in some of the ways that community groups are building different infrastructures, projects and networks, often in the face of government neglect and harm. You’ve seen the ways that activists have built community housing projects and mobile medical clinics. You see the ways that, right now, people are circulating information about gender-affirming and reproductive healthcare as access to bodily autonomy becomes increasingly criminalized. And there are a lot of different groups trying to do their own in-language translation work through the creation of help desks and hotlines to get their communities information about social services and civic processes, in the channels, streams and languages people need to receive it in. A lot of this work needs to be resourced more. We can’t pour resources into band-aid technological solutions; we really need to look to what is happening on the ground.

(01:24:56)
And so where I’ll close is with the future: what we really need to do is start reckoning with the different histories that have often placed racialized people globally in contexts of disenfranchisement, disposability and displacement, and, again, think towards the kind of world we instead want to build for all people. So what is the world that we want to see, and what do we need to do to build it together? Thank you so much, and it was so nice to be here with you all this morning.

Kelly Stoetzel (01:25:29):

Thank you so much, Rachel. So y’all, it feels pretty amazing to be here and be part of this day with these two organizations coming together that are both just really focused on making the world a better place for all of us. Doesn’t it feel good? So we have here the leaders of both of those organizations. Marcia McNutt, the president of the National Academy of Sciences, and Vidar Helgesen, the executive director of the Nobel Foundation. Marcia and Vidar.

Marcia McNutt (01:26:16):

Well, thank you everyone. I have to say I am blown away by the energy in this room. It makes me feel that, working together, there’s no problem we can’t solve. But of course, the specific issues that we’re dealing with today are misinformation and disinformation. I’m, by background, a geoscientist, and there is probably no topic essential to our future here on this planet that has suffered more from disinformation than climate change. After all, who doesn’t want to be told that, oh, climate change isn’t happening, or, oh, well, maybe it’s happening but people have nothing to do with it. Or the worst possible thing to hear is that it’s happening, it’s here, but there’s nothing we can do about it. So eat, drink, and be merry, for tomorrow we all die.

(01:27:13)
Well, this is a very bad position to take and it threatens our children and our grandchildren. And you just heard from Rachel about the misinformation associated with COVID-19. This misinformation, which probably stemmed from a lack of trust in our medical community, caused people to abandon scientifically proven interventions and treatments and instead grasp at home remedies, which at best were ineffective or at worst were actually dangerous. Science is the one human endeavor that we have had throughout the ages to best predict the outcome of our actions. Science is never complete, it’s never perfect, but it’s the best we have.

Vidar Helgesen (01:28:10):

So we have, as Marcia has said, a problem. The world has a problem, we have a problem, and we believe also that we have solutions that can be brought to the fore. When the National Academy of Sciences and the Nobel Foundation discussed how to go about the next Nobel Prize Summit, the second that we’re holding, we asked: what can we do to address this particular issue? And I think that’s the question we should all ask ourselves. What can we do? What can I do to address this and find solutions?

(01:28:49)
If you consider the universe of Nobel Prizes, the science, literature and peace prizes, at the heart of all of those disciplines is, in one way or another, the quest for truth. And disinformation, misinformation and attacks on truth seekers undermine scientific progress, weaken public deliberation, freedom of expression and democracy, and promote conflict and tension in society. So we should do something about this.

(01:29:40)
And this meeting, as you’ve already experienced, will be looking at the histories of disinformation and misinformation, it will be looking at their current dimensions and challenges, but most importantly, it will be trying to look at solutions. What can we do to really tackle this monumental challenge?

Marcia McNutt (01:30:11):

And the National Academy of Sciences is so thrilled to be partnering with the Nobel Foundation on this. At the core of both of our organizations is the recognition and promotion of excellence and the very highest quality science imaginable. And that is why we’re working together to attack this problem of mis- and disinformation. Our organizations are all about excellence, integrity, truth and trust. And we hope, in the course of this meeting, that we will help all of you identify where you can go to find the truth, who you can trust to deliver it to you, and finally, how we can all find hope in solutions to address these very difficult problems. I’m so pleased that you’re all here today to help us with this. Today we meet in plenary sessions, and then we have two more days: one devoted to breakouts for problem solving, and then dissemination on the third day.

Vidar Helgesen (01:31:24):

And we have such a multitude of people, such a combination of backgrounds, professions and disciplines brought together to address this complex challenge, much like the Nobel Prize spans its disciplines: the scientific disciplines, literature, culture, peace and peaceful political action. When you enter this building, or before you enter it, you’ll see the Einstein Memorial just outside, arguably the most famous Nobel laureate. And actually, next week marks 100 years since his Nobel lecture. He won the Nobel Prize for Physics, but it’s less known that he also has a stake in a Nobel Peace Prize. His actions spanned deep science and political activism for peace: armed with his scientific rigor and excellence and his ethical compass, together with other experts, including a good number of Nobel laureates, he issued the so-called Russell-Einstein Manifesto, which, on the basis of their knowledge of what was at stake, called on world leaders to address the scourge of nuclear weapons and to seek peaceful resolution of conflict. That spurred the so-called Pugwash movement, which later received the Nobel Peace Prize.

(01:32:59)
The audience today and the program over the next few days include 11 living Nobel laureates who will be participating. One of those laureates is Maria Ressa, a peace laureate who won the prize in 2021 exactly for addressing the issue of misinformation and disinformation. And in her Nobel lecture a couple of years back, she said that “an invisible atom bomb has exploded in our information ecosystem.” And she called on all of us to work together to mobilize against that. And she used the words “arms race” about our information ecosystem. And the last weeks and days have indeed demonstrated to us that there is an arms race out there and it needs to be dealt with, and that requires people from different disciplines to work together.

Marcia McNutt (01:34:03):

So as you’ve heard, this is the second Nobel Prize Summit. The first was held in 2021 with the theme Our Planet, Our Future. This was an opportunity to bring together leaders from political spheres, from business, from education, from academia and from science, and to have young people and older people together talk about what a sustainable future for human life on this planet would look like. There were many outcomes from this, but the most important was bringing together these diverse voices to look into the future and figure out how, together, we could address climate change, fight inequality, and spur technical innovation to solve the problems that are plaguing us in terms of our unsustainable use of the resources of this planet.

(01:35:12)
There were a number of important initiatives that came out of this. One of the most visible was a call to action, signed by more than 150 Nobel laureates and science leaders, that called on people from all spheres of influence to look at what their roles and responsibilities were in achieving this brighter future for future generations. Another outcome was the IPIE, the International Panel on the Information Environment. This was an idea generated during one of the breakout sessions at the last Nobel Prize Summit, and it actually provides a bridge to this one, because this information environment is what we have to understand, and we have to figure out how to make it work in more beneficial ways. They will be officially rolling out their agenda at this summit.

Vidar Helgesen (01:36:19):

So we’ll meet them at this summit. We’ll meet others that have done things to foster solutions to this problem. We will meet youth, older people, academics and activists who come together at this summit. And all of us in this hall, and the thousands watching digitally, should take part and ask ourselves: what can we do, what can I do, and more importantly, what can we do together? Because it is possible. We’ll see those examples over the coming days. It is possible, and that’s why the title of the meeting, despite all the difficulties and problems, is Truth, Trust, and Hope. Thank you.

Marcia McNutt (01:37:10):

Thank you.

Kelly Stoetzel (01:37:25):

Thank you so much, Marcia and Vidar. Well, following on all that we’ve heard so far, I can really not think of a more fitting next speaker than Nat Kendall-Taylor. Since 1999, the FrameWorks Institute has conducted cutting-edge social science research to help reframe social problems as policy issues and drive social change. Nat understands better than most how misinformation impacts public trust in science, and he spends a lot of time focused on how we can better understand and address these issues. Let’s welcome Nat Kendall-Taylor.

Nat Kendall-Taylor (01:38:09):

Hello everyone. How are you doing? My name is Nat Kendall-Taylor and I help lead an organization called the FrameWorks Institute. I’m a psychological anthropologist by training, and for the last 20 years, I’ve studied how culture influences the way that we think. And what I’d like to talk to you about today is how psychological anthropology, particularly the idea of cultural mindsets, is a potentially really powerful but largely untapped strategy that we can use in our mission to better navigate misinformation and build trust in science.

(01:38:41)
So the conference organizer has very kindly asked that I not use any charts or graphs in this presentation, so I just wanted to make sure that I got off on the right foot right from the very beginning. What you see here is that, as of December 2021, 22% of Americans indicated that they had not much or no trust at all in scientists. That’s almost one in every four Americans saying that they do not trust science. And as a scientist, and yes, we count too, social science is real science, I find this incredibly striking. Did not think that would get applause.

(01:39:21)
And so trust in science matters because, as it goes, so goes our ability to have all of the effects that we want to have, to use science to solve social problems, to use science to move society forward. But trust in science, it turns out, also matters because of a very interesting emerging relationship between trust in science and misinformation. Numerous studies are finding that where trust in science, science understanding and science literacy are low, our susceptibility to misinformation is high.

(01:39:56)
Okay, so nothing terribly new at this point. Probably everyone in this room realizes that trust in science matters, certainly nothing Nobel-worthy, but this is where the story gets new and interesting. So most of our efforts to build trust in science are educational in nature. We teach people about science to build their knowledge and improve their trust in science. And this work is incredibly important. Do not get me wrong. But I am here to tell you that there are other routes and there are other tools at our disposal. And I believe incredibly strongly that in the world in which we live, and given the challenges that we face, we need to be exploring all routes and using all the tools that we have.

(01:40:35)
And it turns out that trust in science is also influenced by these things that my people, psychological anthropologists, call cultural mindsets. Cultural mindsets are deep, implicit assumptions and understandings that have their effect in shaping our thinking below the level of conscious thought. They are incredibly powerful in shaping how we see the world and how we act in it. It’s important to note that we all, each and every one of us, have multiple mindsets that we can use to make sense of a given social issue. Some of those are productive: they open conversation, they allow us to consider new ideas and solutions. Others are counterproductive: they can close down thinking, derail discussions, and keep us from engaging with new ideas. Most importantly for us here today, the choices that we make in how we communicate, how we say what we have to say, have the ability to activate certain of these mindsets, where they can shape our thinking, and to background others.

(01:41:39)
So let me give you a quick example to make this real and concrete. Individualism is a very dominant and powerful cultural mindset in American culture. This is the idea that the world is the way that it is, that outcomes are the way that they are as an exclusive result of how hard we try, of our decision to

Nat Kendall-Taylor (01:42:00):

… or to not exert our willpower, drive, grit and gumption. And this isn’t wrong, right? It turns out that decisions and willpower matter, but when we adopt this as our dominant lens on the world, there’s a whole bunch of things that surround us and shape us that drop out of view, right? Things like resources, institutions, policies, relationships, and local contexts.

(01:42:26)
And so, just like individualism makes it hard for us to see that what surrounds us shapes us, there’s a set of mindsets that we use to think about science that can actively impede and get in the way of building trust in science. I want to tell you about two of them quickly here today.

(01:42:42)
The first is this sense that science is capricious, right? When people think in this way, science becomes something that we can’t trust because of the rapid pace at which it flips and flops and doubles back on itself, right? You have a very good sense of this mindset being active, as you can hopefully see in the quote there, when people start talking about scientific studies of wine, or chocolate, or eggs, right? Or, coffee, my favorite. And clearly this undermines our ability to have trust in this institution.

(01:43:13)
Second, less capricious and more nefarious, is this sense that science has a hidden agenda. When people think in this way, science and scientists become agents who are out to pull one over on the public, to hoodwink normal people, and who are willing to manipulate data for their own interests and gains. Now, this is clearly related to a larger, and, as other research of ours is finding, growing sense that the system is rigged by those in power to disadvantage the rest of us. But we can see that this is a fairly direct and frontal assault on trust in science, because of the way it positions scientists as actors willing to do the public harm for their own interests and gains.

(01:43:59)
So, I want to pause here and make sure everybody hears this. This might be the only time today that you hear this. There is good news. Okay? So, there are also a set of mindsets that we can activate, that build people’s trust in science. Again, I’m going to tell you quickly about two.

(01:44:17)
The first is this sense that science is a source of awe and wonder, that people are deeply amazed and engaged by the way that science can teach us and help us better understand how the world works. Now, you can clearly see already that this is a fundamentally different lens that people can adopt, one that allows them to approach trust in science from a different perspective.

(01:44:41)
So, second, more pragmatic but no less productive, is the sense that science is this incredibly valuable tool that we can use to solve social problems, this engine of innovation, this way of moving society forward. And again, you can see this is quite productive. So, we’re at this point now where we’ve got one set of mindsets that are really corrosive to trust in science and that increase susceptibility to misinformation, and an alternative set that build trust in science. So, the question now is, what do we do? And the answer to that question really comes from the other half of what I do with colleagues at the FrameWorks Institute, which is the science of framing.

(01:45:24)
So, remember that the choices that we make, how we say what we have to say, have the ability to selectively activate and foreground certain ways of thinking, where they become the dominant lens through which we see the world, and simultaneously to background others. And so, what this does is give us a set of very concrete and applicable practices: first of all, things that we should avoid doing because of their power in activating those unproductive mindsets.

(01:45:53)
And the first is that we have to be really careful and aware that, “Science says,” is not the reason we’re giving people for why our findings matter. Right? It turns out, I know it’s really hard to believe, that, “Science says,” is not the most engaging way of communicating what it is we have to say. Unfortunately, we do this all the time. Who in this room has used one of these phrases as the answer for why what we’re saying matters? I certainly have, and those of you who are not raising your hands, are not being honest. So, clearly, this is a great cue for that hidden agenda mindset.

(01:46:37)
Second, we want to be really careful that we’re avoiding unnecessary contradictions and hedges. Now, when we’re talking to other scientists, qualifying confidence and being careful about probability is essential. But when we’re talking to the public, what we have to do is be really clear about what we’re finding, why it matters, and what remains to be learned. And I think about this as a continuum. On one end of this continuum, you have concise, ethical, appropriate discussions of confidence. And on the other end, you have things that look like this, right? This is an actual piece of science communication. And you can see I ran out of red ink on this one. This is one long, ongoing and continuous hedge. I could not have written a better cue for that science-is-capricious mindset if I had tried.

(01:47:38)
So, there’s also a set of practices that we can deliberately advance in our communications because of their power in activating those more salubrious, more productive mindsets. And the first is really leaning into the power of examples where social science, and science more generally, has solved social problems. Now, we’ve done some work with this kind of strategy with the academies, testing a bunch of different examples of science conferring social benefit. You see one of them on the screen right now: the way in which social science led to the addition of a raised-position brake light on automobiles, and how this saved countless lives. And in our experimental research, when people were exposed to this example, their support for dramatically increasing funding for science rose by 25% as compared to a control condition.

(01:48:32)
Second, and this is a pretty fundamental reorientation of how we think about communications. We want to move away from an approach to communications that is grounded in persuasion and convincing people of the value of science. And instead, we want to move to one which is really focused on explanation. And most of the work that I’ve done for the last 20 years has really been on explanation. And I have seen again and again and again, the value of an explanatory approach in shifting mindsets and opening up thinking.

(01:49:05)
I want to give you a quick example to show you what this looks like. What you’re going to see is a quick before-and-after video. This is a gentleman in Brixton, in London, who’s talking about childhood obesity. So, you’re going to see him answer an open-ended question about childhood obesity. He’s then presented with an explanation, in this case a metaphorical explanation about the food environment. And then he’s asked the same question after that explanation, as a follow-up. And I want you to notice the difference between the before and the after. And I should say, it will not be hard to spot the difference.

Speaker 1 (01:49:45):

Immediately, I’m thinking that maybe the parents are coping with their children in maybe the wrong way, they’re feeding them too much, giving them food to keep them immediately happy.

Speaker 2 (01:49:59):

What would you say needs to happen for child obesity to go down?

Speaker 1 (01:50:03):

Well, as I said, it’s eating less and moving more, really.

(01:50:07)
It just comes down to other things. The producers of food, if they’re making stuff that’s going to make us unhealthy, that needs to be addressed, to how the food is marketed, how aggressive that is, and how much it’s particularly aimed at kids.

Nat Kendall-Taylor (01:50:24):

So, this is a little obnoxious, but you noticed some difference between the before and the after, right? And just a reminder, what you saw there was the power of explanation to foreground certain ways of thinking (in this case, that what surrounds us shapes us) and to background others (in this case, individualism).

(01:50:42)
So, the last thing that we want to be really careful we’re advancing is this practice of leading with broadly resonant and highly shared principles. This is sometimes called values framing. And again, I just want to be clear: the broadly shared and highly resonant principle should not be, “Science says.” Right?

(01:51:01)
And we’re seeing this really powerfully in work that we’re doing right now with the Missouri Foundation for Health in the state of Missouri on gun violence, where we’re testing, both quantitatively and qualitatively, a bunch of different ways of opening that conversation.

(01:51:14)
So, the first two of these, I will just tell you, bombed epically, right? So, the first is that kind of strident, blamey and shamey message, that we all know we should not be doing, but that we all do all the time. The second is that kind of, “Science says.” Science authority. Again, both of these were not only ineffective, but they were counterproductive. The third one, which again is this appeal to something that we can all agree matters, was highly productive in opening up legitimate and considerate conversations about this issue. And I will say, even in focus groups that were composed of individuals who both strongly identify as Republican and strongly identify as Democrat. So, hopefully at the end of this, you are with me in realizing the power and potential of this approach, this way of thinking about cultural mindsets, these implicit understandings and our ability as communicators to activate them as a tool to build trust in science, and with it, all of those things that we care so much about. But also, to have this incredibly powerful collateral benefit in decreasing people’s susceptibility to misinformation.

(01:52:31)
And I want to end with a quote, this is from Walter Lippmann, communications scholar, political commentator, who has this great way of talking about cultural mindsets, even though I don’t think he knew he was talking about cultural mindsets. And Walter Lippmann said, “The way in which the world is imagined determines at any particular moment what people will do.” And with that, I want to say thank you very much and frame on.

Sumi Somaskanda (01:53:03):

Hi, everyone. We’ve just come out of the digital studio right back there, where we’ve been hosting our sessions all morning. And the main program on the stage has just wrapped up its first part, called Let History Be Your Guide. Now they’re heading into the second part of the program, which is called Making Sense of Misinformation. Mis- and disinformation, of course, is the theme of the summit. And at 4:00 PM, the third part of the summit will start. It’s called The Truth is Out There: Hope. So, that’s what we’re going to be seeing on the main stage, and I want to take you in that direction now. So, follow me here. We’re at the National Academy of Sciences.

(01:53:45)
All right. And you can see we’ve got some people getting coffee and snacks between the sessions. And as you can see, this here, the main hall, is a pretty spectacular view. So, here we are, three days of the summit at the National Academy of Sciences. The second part, as I said, of the main program is about to start on stage. I’m supposed to be on stage, so I’m going to go in there. I’ll see you there. Thank you.

Kelly Stoetzel (01:55:05):

Right. So, now is the time that marks the close of chapter one, where we’re looking back. But before we head into chapter two, I’m excited to introduce y’all to a friend that our online friends have already met. Sumi Somaskanda. Oh, okay. Let me just … Sumi Somaskanda, I know, is the BBC’s chief presenter here in Washington.

Sumi Somaskanda (01:55:32):

Hi, everyone.

Kelly Stoetzel (01:55:32):

And she is also the host of our digital broadcast.

Sumi Somaskanda (01:55:36):

Hi, Kelly.

Kelly Stoetzel (01:55:37):

Hi, Sumi.

Sumi Somaskanda (01:55:40):

My name is not easy. I can tell you that. It’s a very long last name.

Kelly Stoetzel (01:55:44):

I just got a little tongue-tied. I know your name.

Sumi Somaskanda (01:55:44):

It’s totally okay.

Kelly Stoetzel (01:55:46):

Well, so how did you magically appear here on stage and jump out of the screen to get here?

Sumi Somaskanda (01:55:53):

Yeah, it’s a Byzantine path from the other end of the building, all the way through the great hall, past the coffee and food and the wood milk perhaps, and I made my way back here. And by the magic of television, I landed here on stage, where I’ve been watching some of the great program and enjoying it along with you guys.

Kelly Stoetzel (01:56:09):

Oh, fantastic. Well, so what’s happening online? What’s it like? What do we have?

Sumi Somaskanda (01:56:13):

Yeah, so online, well, there are really two objectives for the online audience. First, we wanted to have this digital studio where we could directly reach out to the audience that is following, because this is a global summit. There are people literally watching right now all over the world, and we wanted a way to speak directly to them. So, the online stream allows people to type in their thoughts and their comments, perhaps their questions, and we can bring them into our program, in and around your program here on the main stage. That’s what we’re doing in our digital studio.

(01:56:43)
And secondly, we are grabbing some of those speakers before and after they come on stage with you, to ask them to reflect on things with us. So, if they watched another speaker, what did they think? What have they learned about mis- and disinformation? How has it shaped their thinking? So, that’s what we’re up to in the digital studio. And in fact, I probably have to get back there pretty soon, because I think we have more guests joining me there, but that’s-

Kelly Stoetzel (01:57:04):

Well, good. Well, before you go, I think we should do something y’all. Let’s get a camera on the audience here, and if we can wave goodbye to Sumi, we will also be waving hello to our digital audience, our digital friends. Are we down? Thank you, Sumi.

Sumi Somaskanda (01:57:19):

Bye, everyone. Thank you.

Kelly Stoetzel (01:57:28):

Okay. So, now we are heading into chapter two. This is going to be our biggest chapter. And so, that means that we’re going to have a break for lunch midway through this chapter at 12:30, and then we’ll be back here again at 2:00, to pick chapter two back up.

(01:57:42)
So, we have laid some of the foundation for how we might think about misinformation, but now it’s going to be time to dig in a bit deeper. And while we started with some history lessons, we’re ready to focus on our current circumstances. What are some of today’s biggest challenges? What makes us vulnerable to today’s particular brand of misinformation?

(01:58:04)
We’re all stakeholders in this, but what do we really know about how our minds work when it comes to truth and to untruths? And if even the most vigilant among us are susceptible, is there any hope for the future? So, I was going to introduce our next speaker, but I think I’m going to let my friend, Ann, do that.

Speaker 3 (01:58:28):

Good morning. I’m Ann Merchant with the Creative Engagement Programs at the National Academies of Sciences. It is my distinct pleasure to introduce Eric Mead, perhaps the greatest sleight of hand artist in the world today. We are so honored to have him on our stage this morning, that we have been meeting in secret and hatching a plan to create a special Nobel Prize for Magic to honor his many accomplishments.

(01:58:53)
Knowing how modest he is, we were certain he would decline such an award had he heard about it in advance, regardless of how much he clearly deserves it. I am certain you’ll all agree after his presentation today, that such an award is not only warranted but long overdue. With that totally inadequate, but truthful and sincere introduction, please join me in welcoming Eric Mead.

Eric Mead (01:59:18):

Hi. Hi. It’s kind of quiet out there. Hi. Hi.

Audience (01:59:28):

Hi.

Eric Mead (01:59:30):

That’s good. My name is Eric. That introduction is the second video I made of that. The first one was actually so convincing that I got worried that people would think it was real. And so, I made this clownish one. And I want to talk a little bit about the tools that make that possible and how misinformation is absorbed in the brain, and how we, as people, can be fooled.

(01:59:56)
As you heard, I’m a sleight of hand magician. I’m going to begin, I think, with a little bit of magic. Would that be all right?

Audience (02:00:03):

Yes.

Eric Mead (02:00:04):

I need somebody. Oh, that’s a great, the blood pressure in the room just [inaudible 02:00:13] visibly. Somebody who is not afraid to … Gentleman right here, what’s your name?

George (02:00:19):

George.

Eric Mead (02:00:20):

George. Do you speak English, George? You have an accent.

George (02:00:22):

Yes. I do speak English. I’m English.

Eric Mead (02:00:25):

Oh, you’re English. So, in truth, it’s actually me who does not speak English. I speak American. Would you mind, everybody, give a little round of applause and let George make his way up to the stage here. George, how are you?

George (02:00:47):

I’m doing very well, thank you.

Eric Mead (02:00:48):

Thank you very much. English. Okay, great. From England?

George (02:00:51):

Yeah, from England.

Eric Mead (02:00:52):

Oh, okay. Would you stand just on the other side of this table?

George (02:00:57):

Of course.

Eric Mead (02:00:57):

And this is George, everybody. Yes. It is a very brave act to be sitting out there thinking that you’re just going to enjoy yourself and then suddenly be thrust into the spotlight. You have a reserved seat. Are you someone that I should know who you are?

George (02:01:13):

No. I’m just here to help.

Eric Mead (02:01:15):

Just here to help. But you haven’t met me before?

George (02:01:18):

No, certainly not.

Eric Mead (02:01:20):

Seem pretty happy about that. Okay, so here’s what we’re going to do. Do you know what that is?

George (02:01:25):

A cup.

Eric Mead (02:01:27):

It is in fact, a cup. Notice how he’s tentative already. He’s on guard. That’s a good thing. It is a cup. You step inside, walk around, make sure there’s no trap doors or escaping gas or anything funny going on there.

George (02:01:40):

It’s all right.

Eric Mead (02:01:41):

And then, I give the audience a quick peek. And then this, do you know what this is, George?

George (02:01:46):

A ball.

Eric Mead (02:01:48):

A ball, one of many possible correct answers. This is, in fact, a ball. This one is a cat toy. It’s a little wooden ball, you can feel it, and it’s got a sweater crocheted around the outside. I don’t own cats, but I was told that you soak this in catnip and then you give it to your cat, and they chew on it and get a little kitty buzz.

(02:02:08)
So, here’s what’s going to happen. Little ball goes like this and it hides underneath the cup, okay? And you and I are going to play a guessing game. I want to be very clear to the audience before we start, that I’m going to ask him under pressure to guess several things. And sometimes he will get it wrong. This should not reflect poorly on him. You would get it wrong if you were up here too. I’m a magician. I’ve done this a lot, and I cheat.

(02:02:35)
So, a little ball is hidden underneath the cup. And normally this would be done with three of these and I would scramble them all around like this. And then you would be asked to guess which cup, right? This is the simplified version. There’s only one cup.

George (02:02:51):

Okay.

Eric Mead (02:02:51):

It’s not quite that simple. What’s going to happen is this, I’m going to take the ball and I’m going to hide it in my pocket. Sometimes, I’m going to leave it underneath the little cup. And your job is to guess whether it’s in the cup or whether it’s in my pocket. It’s a 50/50 proposition. We’re not playing for money because again, I cheat. Sound good?

George (02:03:17):

I think so. Yes.

Eric Mead (02:03:18):

You think so? How certain are you where it is right now?

George (02:03:21):

80%.

Eric Mead (02:03:23):

80%? Where would you say it is?

George (02:03:27):

I’d like to think it’s in your pocket.

Eric Mead (02:03:28):

Yes. Let’s start at the beginning. There you go. Little ball. Little cup. All right, let’s involve some other senses, George. You hear that rattling around?

George (02:03:37):

Yes.

Eric Mead (02:03:39):

Yes. I’m going to hold it where you can’t quite see. Reach down in there and scoop it out. Yes. Okay. And there’s nothing else hiding down in there?

George (02:03:49):

No.

Eric Mead (02:03:50):

Drop it back inside. How certain are you where it is right now?

George (02:03:56):

100%.

Eric Mead (02:03:58):

100%. It is, in fact, in there. Okay. But it gets harder as we go. Where is it right now? Do you have a guess?

George (02:04:05):

In the cup.

Eric Mead (02:04:07):

In the cup. Oh, so close. So close.

George (02:04:14):

It’s not going well, is it?

Eric Mead (02:04:15):

No, you’re doing great. You’re doing exactly what anyone else would do. I know, right? Let me show you something about this game. Because of the special circumstance that we find ourselves in, where we’re going to talk openly about misinformation and how we process information, let me show you one aspect of this. When I take the little ball like this and I put it in my pocket, it never actually goes in my pocket. Yes. Let’s see that again.

(02:04:52)
Right? Like this, in my pocket. And it’s still right here. So, when I do this, hey, you can do cool variations like this one. In one ear, out the other. Ooh. The sound crew just went insane on that. Sorry, I just whacked right on the microphone. I apologize, gentlemen. Or, this is my 10-year-old daughter’s favorite. Oh, come on. A little bit of fun now and then.

(02:05:22)
Okay. So what happens is, when I take the ball and you’re watching to see if I put it in my pocket, right? What’s actually going on is I come over and I stand up the cup like this while you’re staring at my pocket. And then you go, “Hey, it’s in your pocket.” And I go, “Eh.” Okay. It’s not as good when you explain it. I understand that. You’re soured on the whole deal, aren’t you? I’ll tell you what, we’ll do this. How certain are you where it is right now?

George (02:05:53):

20.

Eric Mead (02:05:58):

20. George, check this out, George. Whoa. Say, it’s a lemon. It’s a great, big, yellow thing. It’s large. How does someone take a great, big, bright yellow thing and sneak it under the cup and no one sees it happen? George, what’s your guess?

George (02:06:16):

From the audience, perhaps.

Eric Mead (02:06:17):

From the audience, perhaps. I don’t actually know the answer, but it must be the same way I did this one. This is George, ladies and gentlemen. Watch out for that. Thank you very much. Thank you. George. Let him hear you.

(02:06:36)
So, let’s talk for just a minute about why we get fooled, right? Why is it that we get fooled? I’ve spent my whole life thinking about this subject since I was a little boy. You could call this talk, I guess, defense against the dark arts. If you know your Harry Potter, that was a course taught at Hogwarts, where they learned to defend themselves against black magic and evil.

(02:07:05)
And although I don’t think that new AI tools that we’re going to discuss for part of today are evil inherently, I do think it’s clear that some people are going to use them in ways that are not for our best interests. So, how can we harden our defenses?

(02:07:20)
Well, the first thing is, there are some of you out there saying to yourselves, “If I had been up there, I would’ve made different guesses than poor George, and it would’ve been harder. I would’ve been counterintuitive.” None of that matters. I’m a magician. You would’ve been wrong too. Okay? And the first thing that I want to say is, everyone is susceptible. No one, no matter how well-educated, no matter how smart you are, no matter how on guard you are, everyone is susceptible to being fooled. And we get fooled all the time.

(02:07:51)
Part of it is that you can’t trust your senses, what your eyes tell you, what your ears tell you, what you touch. He was listening to it. He felt it. I could have done it with taste and smell, but that’s a weird thing to do in a live performance situation. But our senses can be deceived and therefore, certainty, I asked him several times about his certainty, certainty is an illusion. Everything that you think you’re certain about, you should probably start to question, right? This is the Cartesian model about trusting your senses as your only source of information, right?

(02:08:27)
René Descartes took the … Well, I could do it for you. He took a glass of water and he put a stick in it, right? That was his example. I have a pencil, but it’ll do. Ooh, maybe it won’t stay in. I’ll have to hold it like this. I can’t tell if it’s far enough in the water that you can see that it looks bent. Yes?

Audience (02:09:02):

Yes.

Eric Mead (02:09:02):

Is everyone familiar with this illusion?

Audience (02:09:04):

Yeah.

Eric Mead (02:09:05):

Right. I’m going to hold it like this. And so, if you trusted just your eyes, you would say, “Hey, that is not a straight stick, that is a bent stick.” And then some people would be, “Yes, but we are wise. We don’t have to just trust our eyes. We know about refraction and the fact that light travels slower through water than it does through air and it makes it look bent.” And that’s when I pull it out and show you this one actually is bent. So, your certainty is an illusion, especially when someone is intentionally trying to deceive you.
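
(Aside, not part of the talk: the optics behind the bent-pencil effect Eric describes is standard refraction, governed by Snell’s law,

$n_{\text{water}} \sin\theta_{\text{water}} = n_{\text{air}} \sin\theta_{\text{air}}$,

with refractive indices $n_{\text{water}} \approx 1.33$ and $n_{\text{air}} \approx 1.00$. Rays leaving the submerged part of the pencil bend away from the normal as they exit the water; the eye traces them back along straight lines, so the submerged part appears displaced and a straight pencil looks bent at the waterline.)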

(02:09:42)
Everyone can be fooled. Certainty is an illusion. And we have evolved pattern-seeking brains, so much so that we find patterns where they don’t exactly exist. I have a really interesting, I think, demonstration of this. It’s an audio demonstration, not an optical illusion. I’m going to ask the sound crew to play something in a second. I have a couple of audio clips. What you’re going to hear is a recording of a children’s toy. At first, you’ll hear static, like a little white-noise background, and then you’ll hear a robotic voice say the word brainstorm. One word, two syllables. Brainstorm. Could you play that clip for us?

Speaker 4 (02:10:28):

Brainstorm.

Eric Mead (02:10:35):

And again. I don’t know if you can hear it, it’s a whispered robotic voice and it says, “Brainstorm.” Could you play that again for us?

Speaker 4 (02:10:40):

Brainstorm.

Eric Mead (02:10:46):

Can you hear it?

Audience (02:10:47):

Yes.

Eric Mead (02:10:49):

Okay. I had an audio engineer strip the voice out of that and replace it with the same voice, except instead of saying one word, two syllables, I had him put in two words, three syllables. We’re going to hear the same background noise and the same voice, except instead of brainstorm, you’re going to hear the voice say, “Green needle.” “Green needle.” Okay, would you play the second clip for us?

Speaker 4 (02:11:14):

Green Needle.

Eric Mead (02:11:20):

Green Needle. Green needle. One more time. Green needle. The second one.

Speaker 4 (02:11:28):

Green needle.

Eric Mead (02:11:31):

Can you hear it?

Audience (02:11:39):

Yes.

Eric Mead (02:11:39):

The great philosopher, Paul Simon, once said that, “A man hears what he wants to hear and disregards the rest.” That really is a voice saying something, but you’re not hearing two different clips there. I’m playing the same clip over and over, and what you expect to hear is what you hear. And I’m going to prove it, because I’ve shown people this and they go, “That’s not true. You’re lying to me.”

(02:12:06)
So, I’m going to have them play it like three more times. And you decide for yourself in real-time whether you want to hear the word brainstorm or whether you want to hear the words green needle. And that’s what you will hear. Would you just repeat maybe three times that recording for us? And decide.

Speaker 4 (02:12:22):

Green needle.

Eric Mead (02:12:28):

And now switch, if you’d like, or stay. And again.

Speaker 4 (02:12:32):

Brainstorm.

Eric Mead (02:12:37):

Right. It’s crazy. This is a mild demonstration of bias. And the biggest thing that fools us, that allows us to be susceptible to misinformation, is bias, our own biases.

(02:12:58)
Study after study shows that if something confirms what you already believe, you’re more likely to believe it, to accept it. If something challenges what you already believe, you’re more likely to discard it. I’m suggesting that, going forward, especially as the new misinformation starts tsunami-ing in on us, we need to think knowing that we can be fooled. Right? Knowing that your assumptions catch you every time, right? You’ve had so much experience with the world that when I do this, you know where the ball is, right?

(02:13:33)
You assume that because you’ve watched people transfer something from one hand to the other a thousand times, you assume that that’s what’s happening, when it’s not what’s happening. This is why adults are so easy to fool, especially really smart scientists, the best. Really smart scientists, slightly inebriated: best audience ever, because they know they can’t be fooled, so they make all the assumptions. The hardest audience ever? 10-year-olds. Not enough experience with the world to make assumptions. They just go, “It’s in your other hand.” It’s great.

(02:14:10)
So, our assumptions, our biases, our experience of the world, and knowing that we can be fooled: that’s going to be our main protection. I want to make two final points. I want to end on a note of hope, right? That’s the final one: truth, trust, hope. Two things. Number one, we’ve been here before, right? These new tools, the AI tools that are coming out, are hugely powerful and kind of frightening if you’re paying attention. But at the turn of the last century, when photography became a thing, everyone thought it was the end of the world. Art was ruined. We could be lied to. And there was a fad religion at the time called spiritualism, where they could talk to the dead, and the darkroom experts started making pictures of ghosts. And people were like, “How are we going to deal with this?” Well, we learned to deal with that.

(02:15:02)
The same is true of Photoshop in the late ’90s and early 2000s, when that became a ubiquitous tool that anyone could use. We thought we could never have photographs in court anymore, because we couldn’t trust pictures. And we’ve learned to deal with that. So, I believe that in the long term we will learn to deal with these new tools and find our way into using them for our benefit rather than our ruin. That’s number one.

(02:15:30)
I think that in the short term, though, I’m a little worried about what’s going to happen with these new tools. And I think there will be a backlash, and for a while we’re going to be mistrustful of digital life in general. And I think that will force a retreat from digital life a little bit, and a move back to people talking to each other, face-to-face, person-to-person, heart-to-heart. And for me, that’s a very hopeful note to end on. Thank you very, very much.

Kelly Stoetzel (02:16:13):

Wow, thank you, Eric. Whoa, I don’t know about y’all, but that green needle brainstorm thing, I cannot get my head around that. And so we’ve just seen how crazy perception can be, and how easily we can be tricked into believing something that isn’t actually real. But when we get all the facts in a situation right, are we then good to go? Like, when we remember things, how accurate is our recall, and how much trust can we put in it? Memory is the focus of our next speaker’s research, and she’s here to give us some insight. Let’s welcome Elizabeth Loftus.

Elizabeth Loftus (02:16:50):

Thanks. It’s a pleasure and an honor to be here to talk to you about some of the work that I’ve done in the area of memory, because, I mean, we all know how important memory is. Without it, you wouldn’t know how to make the coffee in the morning, or find the car keys, or take the metro, or however you got here. As scientists, we make a distinction between a couple of different kinds of memories that are stored in our memory banks. There are semantic memories, or you might call them fact memories: things like, we are in Washington, DC right now. That’s a fact in our memory. Or something like, Paxlovid is a good treatment for COVID, or global warming is happening. But what I study are personal memories. Things like, I knew I was going to have to go backwards; things like, we just saw some really cool magic tricks; or, maybe a month or so ago I saw a crime, and I want to tell somebody about it. But memory doesn’t always work perfectly. And so I’m going to ask you: do you think I could make you remember?

(02:18:23)
Do you think I could make you remember that you were attacked by a vicious animal as a child, if it didn’t happen to you? Do you think I could make you remember that as a teenager you committed a crime, and it was serious enough that the police actually came to investigate? Do you think I could make you remember that a week ago you played a game, and you cheated in the game, and you took money out of the game bank when you weren’t entitled to that money? Do you think I could make you remember these things? Could I pour these ideas into your mind and make you remember these things, personally, if they didn’t happen to you? I’ve asked that question, and a lot of people say, “No way.” I mean, no way I’d confess to a crime I didn’t do. No way I’d think I was attacked by an animal if I wasn’t. But we’ll see how you feel in another 10 minutes or so. Because I’ve been studying memory for more than 50 years now, and in the course of that career I’ve developed a couple of paradigms for examining human memory.

(02:19:40)
And one of those paradigms is called the misinformation paradigm. What happens in these scientific studies is that people see some event, a crime, an accident; I’ve been particularly interested in legal events. Later on, they get some post-event information, often misleading information, about that event. And then we test people and ask them what they personally remember about their experience. We’ve shown lots and lots of people simulated accidents. For example, in one of my older studies, we showed people an accident where a car goes through an intersection with a stop sign. Later on, they’re going to get some post-event misinformation. Here’s the question that planted the misinformation: Did another car pass that red Datsun while it was at the intersection with the yield sign? I want you to appreciate how clever this question is, and I think our magician will appreciate it too. You think this is about whether another car passed, and you’re thinking about that part of the question.

(02:20:55)
And while you are thinking about it, I slip in the information that it was a yield sign. It invades you like a Trojan horse, because you don’t even detect that it’s coming. And later on, lots and lots of people will tell us they saw a yield sign at the intersection, not a stop sign. We’ve done these kinds of misinformation studies with people who experience naturally upsetting events, not just these staged or simulated events. We’ve planted misinformation in the minds, for example, of soldiers who are learning what it’s going to be like for them if and when they are captured as prisoners of war. Even these horrific experiences, which are staged for a good reason, can be manipulated with post-event information. So now I’ve treated you to a quick summary of about 50 years of work on something that we call the misinformation effect. There’s a kind of cartoon drawing of it: you expose people to misinformation, you put them in a misled condition, and it lowers their memory performance. And why is that important?

(02:22:14)
It’s important because, out there in the real world, misinformation is everywhere. We get it when we talk to other people. We get it when we’re interrogated by somebody who maybe has an agenda and, even inadvertently, suggests things that aren’t true. We get it when we pick up newspapers or online news and are exposed to some misleading information. Well, at some point during this process of studying misinformation, I came upon an even more extreme kind of memory problem, a false memory problem. It turns out, particularly in the nineties, people were going into psychotherapy with one kind of problem, maybe anxiety, maybe depression, and they were coming out of this psychotherapy with a different problem. They had a belief, and these memories, of having been traumatized as children, sometimes in satanic rituals where they were forced into all kinds of horrible activities: animal sacrifice, baby breeding, baby sacrifice. The FBI investigated many, many of these claims and never found any kind of corroboration. So I wanted to study the process by which people could develop these, what we now call, rich false memories.

(02:23:45)
And this old paradigm that we had developed, where we could turn a stop sign into a yield sign, just wasn’t going to cut it. I needed to develop a new procedure, something that’s now called the rich false memory procedure, where there’s no event to begin with, but we ply people with suggestions about the past, and then we see what they remember about their childhood or their more recent past. Our first study planted a false memory that when you were about five or six years old, you were lost in a shopping mall, in a particular place, with particular people there, that you were frightened and crying, and ultimately rescued by an elderly person and put back together with your family.

After we published these findings, other scientists came forward, as did we, and planted false memories of things that would be more traumatic or upsetting if they had actually happened: you nearly drowned and had to be rescued by a lifeguard, or you were attacked by a vicious animal, or you committed a crime as a teenager and it was serious enough that the police came to investigate. All of these things were planted in the minds of otherwise healthy, ordinary adults.

How often does this happen? A meta-analysis looked at data from a collection of these studies, something like 423 subjects who, at that point, had been subjected to these manipulations. About 30% of the time, people developed a false memory, and an additional 23% of the time they developed a false belief that this had happened to them, even though they didn’t have that sense of recollection.

We’ve also shown that these false memories have consequences for people. If I plant a false memory in somebody that they got sick eating a particular food as a child, they’re not so interested in eating that food. We did this with hard-boiled eggs, we did this with pickles, we did this with strawberry ice cream, and we’ve even put foods in front of people: if they develop a false belief or false memory, they don’t eat as much of these offending foods.
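
(Aside, taking the figures as quoted in the talk: $0.30 \times 423 \approx 127$ participants developed a false memory and $0.23 \times 423 \approx 97$ a false belief, so on the order of

$(127 + 97)/423 \approx 53\%$

of participants accepted the suggested event in one form or the other.)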

(02:26:11)
Kind of a nice dieting technique, I think. But you may have lots of questions about all this, like: is there any way to tell the difference between a true memory and a false one? Maybe true memories are more emotional than false ones? But we found no: false memories can be felt with just as much emotion. How about the brain? If we could do some kind of neuroimaging, would the neural signals be different for a true memory and a false memory? We explored this with functional magnetic resonance imaging, and the overwhelming finding is the similarity of the neural signals for true memories and false memories.

Do you need this kind of deception, this sort of Trojan-horse deception? The answer is no. We can plant these false memories in all kinds of ways without any deception, and our recent work on push polls illustrates this to some extent. First of all, you know what polling is, because we get annoyed sometimes when the phone rings and somebody wants to know how we’re going to vote. Well, a push poll masquerades as a legitimate technique for gathering information, but really what the caller wants is to slip some information into your mind. An actual example: would it make any difference in terms of how you plan to vote if I told you that John McCain had fathered an illegitimate Black child? A push poll that was actually done. Would it make any difference in terms of your willingness to vote for Obama if I told you he was really lenient on sex offenders?

We’ve now studied this push polling with my Irish collaborators. We gave people information about a politician, a female politician: background information, her education, her policies, and so on. And afterwards we used a push poll and found out whether it affected their willingness to vote for her, and their memory about her. The push poll was very simple: if I told you that this politician had been accused of cheating on her income tax. Or we might make it a little more elaborate and talk about the ways in which she was accused of cheating on her income tax.

(02:28:56)
And what we found is that the push poll not only affected how likely people said they would be to vote for this person, but they also started to remember that she had committed tax fraud. So this mind technology raises a whole bunch of ethical questions. When should we use this technology, if ever? And when, and how, are we going to regulate it? When I look into the future, the short-term future: we are now able to doctor photographs, and so many of us can. That’s another nifty way to plant false memories: just expose people to doctored photographs. We’ve done this a number of times, and things are going to get even worse with deepfake technology, which is going to get into the hands of so many more people and take us way beyond that original example where you could make Barack Obama look like he was saying and doing anything you wanted him to say and do, when it was really the speech and activities of an actor.

(02:30:16)
Think about the amount of push polling, for example, that AI would make possible. So, I started here by asking you: could I make you remember? Could I pour these ideas into your mind? Could I make you remember that you were attacked by a vicious animal if you weren’t? That you committed a crime as a teenager if you didn’t? That you cheated in a card game when you didn’t? I didn’t talk about that work; there’s some great work on it coming out of Britain. All of these things have been planted in the minds of otherwise healthy, happy adults. So, I’ve got one take-home message. If I’ve learned anything from 50 years of working on memory and memory distortion, it’s this: just because somebody tells you something and they say it with a lot of confidence, just because they give you a lot of detail about it, just because they cry when they tell you the story, it doesn’t mean that it really happened.

(02:31:24)
You need independent corroboration to know whether you’re dealing with an authentic memory, or one that’s a product of some other process. Not quite. Not quite. Thank you. I was going to end there, but I’ve got 19 seconds. So, I’m going to share my favorite quote from Dalí, who once said: “The difference between true memories and false memories is like jewels. It’s the false ones that seem the most real, and the most brilliant.” If I could meet Sal, which I can’t because he has died, I would have to say: you weren’t quite right. It’s not that the false ones are more brilliant and more real than the true ones; they are equally real, and equally brilliant. Thank you for your attention. Thank you.

Kelly Stoetzel (02:32:38):

Elizabeth. So, I mean, we know that we have misinformation coming at us left and right, and now we know we can’t even trust our memory. So what makes us share this stuff? Why do we keep sharing misinformation and spreading it? Well, Gizem Ceylan’s research is focused on understanding just that, and on exploring strategies to mitigate it. Let’s welcome Gizem Ceylan.

Gizem Ceylan (02:33:07):

Hi everyone. I want to take you back to the early days of the COVID-19 pandemic. In those early days, Peter Goodchild, an 84-year-old art gallery owner, shared a post on Facebook. The post included detailed medical advice about the COVID-19 pandemic, details which later turned out to have no factual basis. At the time, however, it sounded very truthy: it allegedly came from a friend whose uncle was a physician at a hospital in China. And in a matter of days, Peter’s post went viral; it was shared by hundreds of thousands of other people. Initially, Peter didn’t know whether this information was accurate. Naturally, everybody was looking for this type of information, and he shared it with others. But what is more interesting about this story, from my research perspective, is that even when Peter learned that this information was completely fake, he didn’t change his sharing behavior. You would think that he would become a little more careful not to spread misinformation, right? If you spread misinformation during a deadly pandemic, it turns out that people could actually die.

(02:34:40)
But in a BBC interview, he explained that he had seen many fact-checking messages later on saying that, no, the post was not correct, the people in the post were not real, the events did not happen. All these warning signs did not stop Peter from spreading more misinformation later on. So this is a very interesting case for my research, in which I study the effect of social media habits on people’s sharing of misinformation. My work goes beyond the current wisdom and examines the possibility that it is not that people are just cognitively lazy or biased, but that there is something in their social media environment that makes them more vulnerable to sharing misinformation. One question you might be wondering about is: how do we build social media habits? You build information-sharing habits on social media in the same way you build any other habit.

(02:35:58)
You do the behaviors that lead to rewards, repeating those behaviors until they become automatic and unconscious. On social media, if you think about it, the rewards are social: you gain attention, you build connections through likes, re-shares, and new followers. When you’re new on a platform, you don’t know which posts are going to get you those rewards, but you learn through practice. You share some posts and get rewards; you share other posts and, unfortunately, get none. Over time you repeat the behaviors that lead to the rewards, and eventually your behavior becomes more automatic. You respond automatically to the cues in your social media environment, and you don’t really need to think about what you are doing much of the time.
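
As a rough illustration of that reward loop, here is a minimal sketch of reward-driven habit formation; it is not the model from this research, and every parameter in it is invented for illustration.

```python
import random

# Minimal sketch of habit formation through social rewards (illustrative
# only, not the study's actual model): an agent's propensity to share a
# type of post drifts toward whatever has been rewarded in the past.
def simulate_habit(reward_prob, rounds=500, learning_rate=0.05, seed=0):
    rng = random.Random(seed)
    p_share = 0.5  # start indifferent
    for _ in range(rounds):
        if rng.random() < p_share:                     # agent shares a post
            rewarded = rng.random() < reward_prob      # platform reacts (likes, re-shares)
            target = 1.0 if rewarded else 0.0
            p_share += learning_rate * (target - p_share)  # habit update
    return p_share

# Posts that reliably draw engagement get shared near-automatically over
# time; posts that draw none get dropped -- with no reference anywhere
# to whether the post is true.
print(simulate_habit(reward_prob=0.8))  # propensity climbs toward ~0.8
print(simulate_habit(reward_prob=0.1))  # propensity falls toward ~0.1
```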

(02:36:56)
And you don’t have to evaluate whether the information is true or false. The more habitual you become on these platforms, the less thinking you have to do. Over time you become like a pigeon pecking at a button in the hope of getting food, but you don’t realize it. To test the effect of social media habits, my coauthors and I conducted a series of experiments with over 2,000 social media users. In these experiments, we gave the users a set of true and false headlines, and they looked like this: the look and feel of these headlines was very similar to what you would encounter on Facebook. We asked them whether they would share this information online. We also measured how often they shared information on Facebook in particular, which is a measure of their social media habits. And the results were fascinating. People with weaker habits, the ones you see on the left, did not share much of the information, as you would expect; the bars are very low.

(02:38:07)
What is more important is that they were also more sensitive to the truthfulness of the information: they shared only about half as many false headlines as true headlines. But people with stronger social media habits, the ones on the right, did most of the sharing; the bars are much higher than the others you see here. And what is more striking, and unfortunate for us, is that they were completely insensitive to the truthfulness of the information. As you can see here, there was very little, and in statistical terms no, difference between the true headlines and the false headlines they shared on these platforms. There is one additional interesting point to the story: when asked, every one of us wants to share accurate information, although we don’t do it all the time, as I have shown you. That is because we don’t realize that our behavior on these platforms is not driven by our motivations; it is driven by the habits we build on these platforms.

(02:39:22)
So, these findings have very important implications for how we combat misinformation. If we assume that misinformation-spreading behavior is driven by people’s cognitive capacities or their biases, then we focus our attention on providing them with accurate information or debunking the myths. But if we recognize that misinformation-spreading behavior is driven by habits built on social media platforms and the reward structure on those platforms, then we can design different interventions. My coauthors and I decided to test what happens if we change the reward structure on these platforms. We recruited participants again and gave them one of two different reward structures. In the accuracy-reward condition, we rewarded people for sharing accurate information and not sharing misinformation. In the misinformation-reward condition, we rewarded people for sharing misinformation and not sharing accurate information. So the rewards worked in opposite directions. And as you would expect, in the experimental conditions people acted in line with whatever we rewarded.

(02:40:55)
However, what we were interested in is what happens when we take away the rewards. That’s how you test habits in my field: you reward the desired behavior, then you take away the rewards and see whether the behavior sticks. So next we gave them a different set of headlines, true and false again, and they decided whether they would like to share them or not. But this time there was no reward attached. I was heartened, and surprised, to see that our new reward structure, which you see on the right this time, led people to form new habits, and habits that we would actually like to see. When we rewarded people for sharing accurate information, that’s what they did, as you can see on the right: they shared more accurate information than misinformation. But when we rewarded people for sharing misinformation, as you can see on the left, they unfortunately continued to do so even after we removed the rewards. This tells us that we have to reward the behavior that we want to see on these platforms. So, in summary, my findings suggest that we can reduce misinformation-sharing behavior, but we need a new reward structure that focuses people on the accuracy rather than the popularity of information. We can accomplish this by changing the reward structure on these platforms, but we are less likely to accomplish it by focusing on people’s individual actions, such as reminding them to fact-check or providing information about accuracy; those alone are not going to solve our problems. Finally, if Peter’s initial post, which I showed you earlier, hadn’t received so much viral attention, Peter would probably have formed different sharing habits.
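
The logic of that “remove the rewards” test can be pictured with a minimal simulation; again, the numbers and the learning rule here are invented for illustration, not taken from the study.

```python
import random

# Illustrative sketch of the 'remove the rewards' habit test (invented
# parameters, not the study's data): train sharing under a reward scheme,
# then measure sharing of true vs. false headlines with no rewards attached.
def train_then_test(reward_true, reward_false, rounds=1000, lr=0.05, seed=1):
    rng = random.Random(seed)
    p = {"true": 0.5, "false": 0.5}  # propensity to share each headline type
    for _ in range(rounds):          # training phase: rewards present
        kind = rng.choice(["true", "false"])
        if rng.random() < p[kind]:
            reward_prob = reward_true if kind == "true" else reward_false
            target = 1.0 if rng.random() < reward_prob else 0.0
            p[kind] += lr * (target - p[kind])
    # Test phase: no rewards are given, so under a habit account the
    # learned propensities simply carry over and drive sharing.
    return p

print(train_then_test(reward_true=0.9, reward_false=0.1))  # accuracy rewarded
print(train_then_test(reward_true=0.1, reward_false=0.9))  # misinformation rewarded
```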

(02:43:04)
However, if people like Peter, like you, like me, continue getting social rewards and recognition even when we share misinformation, we will unfortunately continue to do so. We can change misinformation-spreading behavior if we change the reward structure on these platforms. Let’s focus on behavior change by focusing on the social platforms, and let’s save lives. Thanks so much.

Kelly Stoetzel (02:43:37):

Thank you so much Gizem. So, our next speaker is in a pretty major position to address the spread of misinformation, and disinformation, and hate speech around the globe. She leads the UN’s Department of Global Communications, and she’s currently working on a code of conduct for information integrity on digital platforms. Let’s welcome Melissa Fleming.

Melissa Fleming (02:44:16):

Thanks, Kelly. Well, I’ve been in this communications business for the United Nations for a long time. And I remember when social media burst onto the scene, we communicators were really excited, because we were used to working only through journalists, with our message getting a little diluted along the way. But now we could communicate with people directly, at scale. We could even ask them to join forces with us to improve the world. And our presence on social media soon grew exponentially. This is just our UN channels, but across the whole UN system we have 60 million followers. And the platforms have arguably done much good. They have helped create grassroots movements that have created political change. They’ve connected the isolated and reunited families displaced by war. But before long we saw a dark side, and I had a few personal aha moments myself. I was spokesperson for UNHCR, the UN Refugee Agency, and I remember in 2015, when a million refugees came to Europe, we saw such an outpouring of welcome on our social media platforms, on our TVs.

(02:45:40)
I witnessed the best of humanity there, but soon bad actors started spreading ugly lies online to frighten the public. And the welcome soon soured. Policies became more hostile. Walls came up. And a year later I had my own wake-up call. I was diagnosed with stage three breast cancer, and the course I was told to take, chemotherapy, surgery, radiation, was kind of scary. And I did what most of us do. You’ve heard of Dr. Google; I went to Facebook too, and I started looking for a community. One of the most prominent pages that popped up was a site called The Truth About Cancer. And The Truth About Cancer gave the following advice: don’t treat your cancer with chemotherapy; there are natural remedies instead. If I had taken their advice, I would not be standing here with you today. And the reason I raise this is that the same group is still going strong on Facebook, Twitter, and other platforms with “the truth about vaccines.” They’re trying to convince people to refuse lifesaving vaccines.

(02:47:07)
And during the COVID-19 pandemic, the Center for Countering Digital Hate found that the couple behind both of these pages were part of what it coined the “disinformation dozen”: just 12 accounts that were responsible for 65% of the vaccine disinformation spreading across the world on social media, infecting the minds of millions. And why is this relevant to the UN? Well, when we finally got the vaccines to other parts of the world, to Africa in particular, which had a public that was very vaccine-positive, we found people refusing to take them. They had been so infected by these conspiracies that they were refusing not only the COVID-19 vaccines but childhood vaccines as well. We’re also seeing that in conflicts, social media is being weaponized to provoke the worst in human nature: a free tool used by genocidal governments to say it’s okay to murder and rape and drive out fellow citizens.

(02:48:28)
I met many victims. I’m thinking of the Rohingya refugee population in Bangladesh, women whose babies were torn from their arms and thrown into fires. UN colleagues doing a study later found that Facebook played a significant role in perpetuating that violence. And since then we’ve seen social media platforms deployed in many wars to dehumanize the enemy, but also to increase the density of the fog around atrocities and war crimes. And this is happening against a wider backdrop of online hate. We’re seeing huge spikes in antisemitism, in racism, and in other hate speech, and a normalization of that hate speech. Even the UN is under attack: false allegations spread online are targeting peacekeepers. A recent survey found that 75% of peacekeepers say their own safety and security is being threatened by disinformation, as if they didn’t have enough challenges to face, and they report that it is hampering their operating environment as well. As we’ve heard from many other speakers, social media is also being harnessed to undermine, and distort, and abuse

Melissa Fleming (02:50:00):

… those who are promoting the science around climate change, with the goal of silencing the scientists and activists who are working to secure a livable future for the planet. I’m seeing a number of news reports and surveys in the last couple of days citing climate scientists who are fleeing Twitter in droves. This is sad, because we need those voices to promote the science and secure a livable future for our planet.

(02:50:34)
We’re also obviously very concerned that the new huge leaps in generative AI could take disinformation to new levels. While it’s good to hear the AI founders begging for regulation before it’s too late, we must continue to push all big tech to deliver on their early promises. But obviously experience has shown that we can’t leave it to the social media platforms or the other tech platforms alone, and the UN is working on multiple fronts to bring more balance into our information ecosystems by stepping up our online communications. And in many ways, we’re trying to make UN content cool.

(02:51:27)
But just to note, we do face lots of challenges. Certain social media policies hamper our reach. Meta, for example, has acknowledged that the UN falls under a category called Civic Down Ranking, which puts us at an algorithmic disadvantage from the start. And of course, firings at Twitter have left us with no one to call to flag content that is abusive or even incitement to violence. We’re now seeing that almost anyone can promote disinformation for the price of a blue tick.

(02:52:10)
Still, it’s not all bad. We have teamed up with the platforms to elevate reliable information around COVID and climate, to amplify trusted messengers, and we have quite an army of them out there who want to take UN content and promote it within their followings, and to educate users on how to slow the spread of disinformation. Our new slogan, which we want everybody to have in their ear when they’re online, is: “Pause. Take care before you share.” Yet we do feel like we are in an information war, and we need to massively ramp up our response.

(02:52:56)
So we’re creating, at the UN, a central capacity to monitor, and the ability to rapidly react, when mis- and disinformation and hate speech threaten not just our people and our operations, but also the issues and the causes that we’re working on. We are also going to be gearing up our Verified initiative around climate change and developing the UN Code of Conduct on information integrity on digital platforms, hoping to set global standards that we can all advocate around, so that we can collectively work for a more humane internet.

(02:53:35)
We obviously can’t do this alone, and it is so inspiring to have all of you here in the room, working in so many different ways to create a more humane information ecosystem. I think that, all together, we vastly outnumber the haters. And if we join forces, I think we can heal our troubled information ecosystem together. Thank you.

Kelly Stoetzel (02:54:12):

Thank you, Melissa. Okay, so now, as you can see from the chairs, we are going to bring up two speakers to help us pull together some of the ideas and thinking that we’ve just heard about, and then they’re going to help us with tools that we can use going forward. It’s going to work like this: they’re going to cover three topics for no more than six minutes each, and I’m up here just to keep us moving through each one on time, but otherwise to get out of the way. So, our two speakers.

(02:54:48)
Åsa Wikforss is a philosopher whose work focuses on the mechanisms that drive science denial, and she’s also a member of the Swedish Academy. And Kathleen Hall Jamieson is a communication professor at the University of Pennsylvania and the 2020 recipient of the National Academy of Sciences’ Public Welfare Medal. She thinks a lot about communicating facts in science and in politics. Let’s welcome Åsa and Kathleen.

Asa Wikforss (02:55:14):

Thank you.

Kelly Stoetzel (02:55:20):

And I’m going to need a clicker for the slides. Well, maybe we can just talk through them first. So can we bring up the first question slide while I’m waiting for that clicker? Okay. So this is going to be the first question. What are the one or two key insights that you would like to see more prominently featured in discussions of mis and disinformation? And let’s hear from you first, Åsa.

Asa Wikforss (02:55:46):

Thank you. Yeah. I am a philosopher and I think one thing that we need to remember is how much of our knowledge is not our own, as it were, because human knowledge has this very important social dimension. And one way to put that is to think about the important sources of knowledge. There are two very important sources for us. There’s direct experience. There is what we see and hear and feel. So I can hear the raindrops on the window and I can look out and see the rain and I know it’s raining. And that kind of knowledge from direct experience is, of course, incredibly important for how we navigate our daily lives and we have that in common with other animals.

(02:56:19)
But we also have another way of getting knowledge, and that’s through testimony from reliable sources. And if you think about it, most of what you know about science, even as a scientist, or about society, about history, geography, what have you, you don’t know through direct experience. You know it because you’ve heard it from a reliable source. And this is our great strength. Of course, it means we have knowledge accumulation, we have this division of cognitive labor, which means we can create this amazing amount of knowledge and we can accumulate it across generations. But it is also a great vulnerability because it means that knowledge requires trust. I won’t get the knowledge from the reliable sources unless I trust them. And if I trust the wrong sources, I’ll end up in a bad place. And this is a particular issue right now because a lot of the disinformation is directed at undermining trust in the reliable sources and making us trust the unreliable ones. And we are very, very vulnerable to that.

Kelly Stoetzel (02:57:24):

Kathleen.

Kathleen Hall Jamieson (02:57:26):

I’d like to focus on three things a little more intensively than we have in the last couple of years. First, trying to minimize susceptibilities to misinformation. Second, trying to find ways in which protective knowledge can lay the grounds to minimize those susceptibilities, and also arm us when we’re faced with misconceptions, misconstruals, or misinformation, so that we have at hand some of the explanations the fact-checkers and the scientific community will come back with for why, no, that isn’t actually the way it is. And third, I’d like to increase the likelihood that every time we talk about science, particularly in a crisis, particularly when there’s a lot of new knowledge being generated, we bookmark what we say by indicating clearly, “This is what we know now. We’re learning. Knowledge is provisional, knowledge is subject to updating.”

(02:58:14)
And as a result, when we pick up new knowledge, we will also do a second thing: we indicate that we now know this, and this is how we know it. We didn’t know this back then when we said, for example, that healthy people don’t necessarily need to walk around wearing masks; but now we know there’s asymptomatic transmission, and this is how we learned it. And as a result, you may think you’re healthy, you may look healthy, but you may actually be capable of transmission. That explanation bookmarks to say, “Science has learned,” and as a result minimizes susceptibility to the charge that, “Well, you just changed your mind because it was ideologically convenient,” or, “You’re incompetent. You didn’t know then, so how do you know now?”

(02:58:52)
The notion that there is protective knowledge that we could put in place to minimize susceptibility assumes that there must be categories of knowledge that we offer when you’ve got an infectious disease; think COVID. And I’d like to argue, based on research we did with funding from the Robert Wood Johnson Foundation, that you could parse more than 4,000 misconstruals or distortions checked by one of eight fact-checkers into one of seven categories. I’ve got them up on the slide.

(02:59:18)
And what that tells you is that in every one of those categories, we have basic knowledge simply about infectious diseases, before we even know the specifics about COVID. Having those foundational pieces of knowledge in place should build a repertoire on the part of the public that decreases susceptibility to misconstrual and to deceptions. This pattern also tells us, across time, that of those 4,000 claims, a whole lot were about vaccination. And so it helps us say, “We need to build knowledge about how vaccination works, how we know that it’s safer than getting the disease,” and we need to offer information upfront, such as: viruses mutate, so a vaccine that might work to do something at one time might not work the same way a little bit later. That’s what I mean by foundational knowledge.

(03:00:06)
We also know from this analysis that most of those misconstruals about vaccination are about safety. That makes it really important that our foundational knowledge says, “This is how we know that the vaccine is safer than getting the disease.” “Safer than” is important, because there are going to be some side effects, and there are going to be some that are unanticipated even with the best of clinical trials. We need the public to understand that just because a side effect appears that didn’t show up in a clinical trial doesn’t mean that scientists got it wrong. It means there was something else to learn.

Kelly Stoetzel (03:00:42):

Thank you. So we’ll head on to our next question. Are there specific ways to reduce public susceptibility to misinformation about science? What do you think?

Asa Wikforss (03:00:53):

Yeah. I think that since, indeed, knowledge has this social dimension, so much of the scientific knowledge we have as individuals depends on trust. Our understanding of specific scientific claims is often going to be quite shallow, because you hear a scientist say something and you believe it, and so on. And so it’s very hard to know how to respond when someone challenges that, when a science denialist comes and says, “Well, how do you really know these things?” So I think we now need to talk more about trust, and about why, or whether, we should trust science. This is partly a philosophical question that we need to think hard about, and some people have done that.

(03:01:36)
And I think one thing we need to bring up: when you ask that question, “Why should we trust science?”, people tend to focus on, “Well, we trust scientists because they have this training and they’re kind of smart and all that.” And yeah, that’s true, generally speaking. But as individual scientists, we have our biases. We have confirmation bias. We really want our theories to be true. We are driven by values too; we have things that we value that influence how we act. So I think the thing to do is to lift your gaze a bit and look at the institutions of science, which are set up to counteract the biases and the skewed reasoning that we’re all liable to fall prey to. And I think we need to talk much more about that. What is it about the scientific institutions that works?

(03:02:26)
So look at confirmation bias, for example. Someone was talking about it before: we should all know that we have confirmation bias. Yeah, it’s good to know that. But unfortunately, knowing it is not going to help you one little bit; we know that from a lot of research. Having the knowledge that you have the bias is not going to help. What helps is being in a social setting where people question you, because then you discover your confirmation bias. And of course science is set up that way, as an institution with internal review and questioning all the time. We have peer review, and we have this communal critical thinking: conferences where people ask you critical questions and challenge you.

(03:03:03)
And all these things, I think, people don’t know enough about. I know because I’m out talking a lot, and I tell people this and they’re like, “Oh, peer review, I’ve never heard of that.” We as scientists think that’s an obvious one, but it’s not that obvious. So we need to talk more about that. Moreover, these kinds of characteristics, the internal mechanisms of review and critical thinking, characterize all reliable knowledge institutions, including reliable media. They, too, are set up in ways to weed out error and mistakes and bias. So when it comes to trust in media as well, we need to talk more about how these institutions work and inform the public about that.

Kelly Stoetzel (03:03:41):

Thank you, Åsa.

Kathleen Hall Jamieson (03:03:42):

And I agree, by the way, with everything that Åsa said. And that falls into my category of things we want the public to understand about science. Because if the public understood that, it would be better able to understand, when we talk about science, what it is we’re saying, and why it is that that process of exchange among each other sometimes looks to the public as if we’re confused, when in fact that’s part of the way we’re testing our claims. That’s the culture of critique and self-correction that helps science function well. Now, one of my slides got up when Åsa was talking, so you didn’t get to see that she had a slide that reinforced all of that.

Kelly Stoetzel (03:04:18):

Can we fix that real quick, actually? So I think there should be another one too that… I don’t know, Åsa, if you want to say anything [inaudible 03:04:23].

Asa Wikforss (03:04:23):

Oh, it’s okay. Well, that’s my third one, so that’s-

Kelly Stoetzel (03:04:26):

I don’t know what’s happening.

Kathleen Hall Jamieson (03:04:28):

And sometimes science just gets things wrong. Sometimes the order of the slides is wrong.

Asa Wikforss (03:04:32):

Can’t trust them.

Kelly Stoetzel (03:04:32):

Exactly.

Kathleen Hall Jamieson (03:04:33):

And sometimes that falls into a category called innocent error. We don’t think that there’s a conspiracist behind stage who is trying to do us in because we’re incredibly powerful women and they’re just afraid of us.

Asa Wikforss (03:04:45):

They wouldn’t dare that.

(03:04:54)
No. But the point there that you’re stressing is what sometimes is formulated as the social dimension of human reason. Individually, we are not terribly good. We go wrong in all sorts of ways. Together we are amazing, and that’s the thing to remember about this.

Kathleen Hall Jamieson (03:05:09):

But sometimes the way we communicate gets in the way of people understanding what we know. And so my answer to the second question is: change the Vaccine Adverse Event Reporting System’s name. Change the name of VAERS. We are posting a white paper today on the summit site and on the Annenberg Public Policy Center site that provides the social-scientific reasoning and the evidence behind our argument that we should change the name of the Vaccine Adverse Event Reporting System. The way in which those nouns aggregate implies causality, when everything in there is at best correlational. As a result, it seeds all sorts of deceptions, which we document in the white paper, and people are susceptible to them because they think what’s in that database is causally confirmed when it is not.

(03:05:56)
You see the top left of my slide; that shows the US public being asked whether those events are confirmed or not. Most are unsure. Those who believe that they are confirmed, and those who are unsure, are more likely to say that thousands of people died as a result of COVID vaccination in the United States. We saw, at the bottom left of the slide, a spike in VAERS claims at the beginning, when the vaccinations were first forecast during COVID. And we saw, at the top right, when we looked content-analytically at those claims, that most of them did not accurately describe the nature of VAERS data, and most implied vaccines were unsafe.

(03:06:38)
We’re increasing the likelihood that people will be told that VAERS has confirmed those deaths when it hasn’t. They accept that because the name invites it, and because we’re not careful about saying “raw and unconfirmed” every time we cite those data. It’s an important system; it’s monitoring for any possible effect. But the name, and the fact that we don’t keep “unconfirmed” tied to it, is increasing susceptibility to deceptions, and those deceptions are consequential. If you believe that there are thousands of deaths caused by COVID vaccination, you have a direct predictive route to vaccine hesitancy, in the presence of controls.
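
To see why raw reports are at best correlational, consider a worked base-rate sketch with entirely hypothetical numbers: even a vaccine that causes no deaths at all will be followed, by coincidence, by some deaths in any large vaccinated population.

```python
# Hypothetical base-rate arithmetic (all numbers invented for illustration,
# not drawn from VAERS or the white paper): people die of unrelated causes
# every day, so a raw report of "death after vaccination" is a correlation,
# not a confirmed causal event.
vaccinated = 10_000_000                   # hypothetical vaccinated population
annual_deaths_per_100k = 900              # rough, illustrative background mortality
daily_death_rate = annual_deaths_per_100k / 100_000 / 365
window_days = 7                           # reporting window after the shot

expected_coincidental = vaccinated * daily_death_rate * window_days
print(f"~{expected_coincidental:.0f} deaths expected within {window_days} days "
      "of vaccination from background mortality alone")
```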

Asa Wikforss (03:07:18):

This also shows the importance of philosophy, right? Because…

Kathleen Hall Jamieson (03:07:20):

Yes.

Asa Wikforss (03:07:21):

… philosophers are very good at explaining the difference between correlation and causation. And I think we should talk more about that.

Kathleen Hall Jamieson (03:07:27):

And they have Latin for it. Post hoc, ergo propter hoc.

Asa Wikforss (03:07:29):

Yeah, there you go.

Kelly Stoetzel (03:07:30):

Impressive.

Kathleen Hall Jamieson (03:07:30):

Now we don’t have any idea what slides are coming up next.

Kelly Stoetzel (03:07:35):

Yeah. Should we do PowerPoint karaoke?

Asa Wikforss (03:07:39):

We have them in our heads, it’s okay.

Kelly Stoetzel (03:07:42):

Can we get that third question up? I think there’s… Yeah, this is the one. Okay. So are there actions that the attendees at the summit… Oh.

Kathleen Hall Jamieson (03:07:53):

That conspiracist wants to change the name of VAERS.

Kelly Stoetzel (03:07:55):

There we go. Just hold there for a second and we’ll figure it out. Are there actions that the attendees at the summit can take to bolster trust in science and the knowledge that it generates?

Asa Wikforss (03:08:09):

Forget about the slides. Look at us. Okay?

Kathleen Hall Jamieson (03:08:13):

You know, redundancy is correlated to retention, and that slide wants you to remember, “Change the name of VAERS.”

Asa Wikforss (03:08:21):

Well, so I think… And I’m going to be a little serious for a moment. I think we’re talking about the loss of trust in science, but we need to also say that there’s not a general loss of trust in science. There’s a politically polarized loss of trust in science and a politically polarized loss of trust in media. What is going on largely is a politicization, an intentional politicization, that’s really hard to say for a Swede, of science. Because science tells inconvenient truths for the preferred policies of some politicians. And we need to say this straight out.

(03:08:59)
Now, this is not the fault of the scientists, but it means that scientists who find themselves having to communicate science when it is being intentionally politicized, and when all this disinformation is used to spread conspiracy theories about scientists, have to be super careful with how we communicate. And a really important thing to remember is the difference between the scientific facts, the scientific foundation, and policies. Policies always involve two things: the facts and the goals. Now, in a democracy, the goals are determined by the elected representatives; that is, by the voters, in theory. Science can tell us how we reach those goals, the instruments for reaching them, but science can’t tell us what the goals should be.

(03:09:49)
We are not experts on the values, no more than anyone else who’s a citizen with a vote. So we need to be very clear: “This is what the science says. If these are your goals, this is what you should do, because this is what the science says. If you have some other goals, then there are other things you should do.” And there’s another reason to really keep this distinction, because what happens otherwise is that people who dislike the policy go after the science. That’s the wrong place to go; you should look at the goals and discuss those. And the other advantage of doing it that way is that it also becomes very clear what the goals are. If the goal is to minimize the spread of communicable diseases, we can agree on that goal, just as we can agree on minimizing the number of people killed in traffic accidents. Then we can go to science and see what science tells us.

Kathleen Hall Jamieson (03:10:46):

And I agree that, in this process, the way in which we communicate can increase the likelihood that our science becomes polarized and politicized. The ways in which we communicate can create vulnerabilities. And if you try to put up a slide, I think you’re actually going to get the right one. Let me just tell you this; imagine it in your head, it’s on a slide. 122 claims that came from published science were subjected to polarized partisan distortion during COVID, as documented by one of the eight fact-checkers. There were probably many more, but these were the ones documented by one of the eight fact-checkers across the more than two years of content we coded, within those 4,000 units coded across time.

(03:11:29)
We could categorize those 122 claims into one of four categories; the categories are on the screen. For each of those categories, our journal editors, our scholars, and our structures have a way to minimize the likelihood that the science could have been polarized and, in some cases, conspiracized. In some cases, the science was just flawed, and it was being accurately quoted, but it was flawed science; our protection there is retraction. In some cases, the science was reputable, peer-reviewed, and accurate, but the communication was just muddied. Conspiracists read additional meaning into ambiguities in the conclusions, or partisans who wanted to polarize picked them up and used them as evidence, and they could actually cite real language in doing so, because it was poorly written.

(03:12:17)
You also had instances in which the authorship was being misconstrued. There’s one study that was attributed to a Stanford author, but it wasn’t Stanford University. We can fix that by pushing our ORCID means of identification and by verifying authors’ identification all the way through the publication process. And then some things are just published in journals that aren’t very good, but the public doesn’t necessarily know that, nor do those people who are trying to interpret the science as best they can. Well, that’s what our badging system is for: it indicates what you can trust and not trust. When in doubt, trust the badges. Science has means to protect the integrity of its published knowledge. That those 122 claims were distorted out there and got viral traction we might have had some control over, by doing a better job, is on us.

Asa Wikforss (03:13:13):

Yeah, I mean, finding reliable sources is hard. It requires time, knowledge, and effort, and we are often not able to do it, so we need all the help we can get. So I think badges and things like that are a very good idea.

Kathleen Hall Jamieson (03:13:25):

And we need the National Academy of Sciences and the Nobel Summit to help us do it. Thank you.

Asa Wikforss (03:13:29):

There we go.

Kelly Stoetzel (03:13:34):

Well, thank you so much Åsa and Kathleen. That was fantastic. Thank you. And it’s also great to see that truth and science prevailed over the slide conspiracy.

(03:13:48)
So now it’s time for a little break from this stage program. Online friends, I know that Sumi has some unique and interesting things planned for all of you, and we will see everyone back here at 2:00 Eastern. We’re going to say goodbye to our digital broadcast people. And friends in the room, we have lunch, you can get a tattoo-

Sumi Somaskanda (03:14:15):

All right. Thank you so much, Kelly, and welcome back into our digital studio. This is where we’re going to have the opportunity to reflect on what we saw and heard on stage through the course of the morning. So it’ll be a great opportunity to break down some of the thoughts that maybe you’ve been thinking about as well.

(03:14:30)
I’ll show you who we have here in the room with us. We have Niamh, Martin, and Rebecca, and I will properly introduce you in just a moment. But I do want to bring in some of the thoughts, comments, questions that have been shared in our livestream because we have a really active audience around the globe. And so I’m just going to look at this right here. So this is one of the thoughts that was shared in our livestream. “I think we believe disinformation because many of us are constantly looking for new information and we believe whatever we see. For example, when we see a new virus is discovered, we look for the information rather than looking deeply into it, whether it is good or bad information.” So that’s one thought. “If building trust is the goal, we cannot say what is right or wrong or we put up a wall.” That’s an interesting point as well.

(03:15:19)
And, “Science ‘experts'” in quotations, “divide and conquer approach and peer review are fundamentally flawed with information gaps, competition, and bias. Open innovation platforms are a way to progress.” So that’s something I think we can pick up on. And here, another point. “I view the internet as a partial remedy to the misinformation plague. The internet makes it difficult to hide the truth as it offers up a vast number of sources.” So also an interesting point made there, because often we hear that misinformation, disinformation, thrives on the internet.

(03:15:53)
So as I promised, I’m going to properly introduce you now. So here on the left is Rebecca MacKinnon, Vice President of Global Advocacy at the Wikimedia Foundation. In the middle is Martin Chalfie, the university professor in the Department of Biological Sciences at Columbia University and the 2008 Nobel Prize in Chemistry laureate. In 2008, correct? Yes. And of course, to our right here is Niamh Hanafin, Senior Advisor of Information Integrity at UNDP. So to all three of you, good afternoon. Maybe just off the bat, any of those statements that we got from our audience stand out to you?

Niamh Hanafin (03:16:30):

Sure. I mean, I think this question of the information gap is critical, and we see it playing out everywhere that we work as well. And there’s a nuance to it that I think we don’t explore enough: it’s not just the information gap, it’s our ability to preempt. Are we able to actually get information out preemptively on an issue, so that we are competing with the disinformation as it starts to emerge? We see this happening, unfortunately, in election contexts. We saw it happening with the COVID vaccine; that’s a great example. We were monitoring disinformation around the vaccine some nine months before it actually came on the market, and it was not being combated until we had a vaccine ready for use, which, at that point, was already too late. So I think the gap is very, very critical here, and there is the ability to actually understand where the information is going, how these narratives are developing, and what we need to do in advance to be able to combat that.

Sumi Somaskanda (03:17:29):

Martin and Rebecca, your thoughts?

Martin Chalfie (03:17:30):

So I was struck by one of the comments about, really, cooperation and interaction, and how important it is to get multiple voices in a room talking about things. It was said, in one respect, about peer review, and it was, I think, somewhat negative, and I disagree with that, in the sense that peer review is not about deferring to accepted knowledge. It really means that the people most involved with the work can question it; but that doesn’t mean that others cannot question it as well. Overall, what I took away from the session this morning is that we have to work in many, many different areas, many different directions. No one approach is going to solve the problem of how we get accurate information to people, and how people can get accurate information.

Sumi Somaskanda (03:18:26):

It has to be collaborative.

Rebecca MacKinnon (03:18:28):

Well, a couple of things really stuck out for me. I work for the Wikimedia Foundation, which supports volunteer-run and volunteer-governed information platforms like Wikipedia, and other projects that anyone can participate in and edit. And there were a number of comments about the importance of diversity of participation, open access, and open contribution. There was a discussion this morning, and also in the comments we just saw, about concerns about bias. One of the best ways to address bias is to make sure you have a diversity of people contributing to an information platform, while also having some process for examining what the sources are, debating them, and coming to a consensus on what is verifiable and reliable.

(03:19:32)
But if you only have people from one type of background, you are likely to reinforce blind spots, bias, and, as we heard from Kathleen Hall Jamieson, confirmation bias. And so one of the things we’re doing at the Wikimedia Foundation, and we don’t edit the content on Wikipedia, is supporting activities to bring in more women and more people from marginalized communities who often don’t have access to the elite fora, in terms of providing perspectives on people, events, and the impact on communities of any number of issues. We want to make sure that facts and perspectives are coming from people whose information, and the truth they may be speaking, might be quite threatening to powerful people who may be seeking to prevent them from sharing that information.

Sumi Somaskanda (03:20:40):

So diverse voices help break down some of that bias. Niamh, from what you saw on the stage, how did that shape your thoughts on mis- and disinformation? What did you take away from it?

Niamh Hanafin (03:20:54):

I mean, I think my big-picture takeaway is that we haven’t even touched the technology part of this problem. We spent the morning talking about the human vulnerabilities, and the ways in which we operate as communities and societies that make us susceptible to believing disinformation and being open to it. And for me, it reinforces the notion that the technology, of course, is a huge amplifier, but, a little bit like Marty said, this collaborative approach is really critical, because you need to understand the science of people as well as the science of the technology to be able to address it.

(03:21:33)
But I think the aspect that I really appreciated was Rachel Kuo’s initial speech, where she brought in this question of diversity, inequality, and structural power imbalances that actually leave some communities even more marginalized, more vulnerable, and more threatened by some of these information crises that we see emerging around different events, whether natural disasters or conflict or whatever it happens to be. I think that part of this problem is poorly understood and not fully recognized, and it’s really, really critical for helping us establish exactly how we keep these technologies from further marginalizing those who are already marginalized.

Sumi Somaskanda (03:22:21):

I know. There’ll definitely be a bigger focus on technology through the afternoon, so it’s something that we’re going to get to. Martin, what are examples of mis and disinformation that concern you the most, perhaps in your work or that you encounter on a daily basis?

Martin Chalfie (03:22:35):

So one of my other jobs is as Chair of the Committee on Human Rights of the National Academies. We’re concerned about social and natural scientists, engineers, and health professionals all over the world who have been subjected to human rights abuses, either because of their work or because they have spoken out. And in those situations, the maligning of people, the accusing of people in order to discredit their science, is a major aspect of what is happening. So it’s not only, “Here is some inappropriate information”; this is directed towards discrediting particular individuals who, in many cases, are saying things that governments feel are threatening to them. And we saw that a lot during the pandemic, where doctors would report on what was happening and would be silenced, and in some cases imprisoned, because of it. So to me it’s that use, along with all the other uses that are very important, but that’s a very important use because this is

Martin Chalfie (03:24:00):

… How people discredit, not the work, but the scientists, and therefore we can ignore the work.

Sumi Somaskanda (03:24:08):

So Rebecca, what are the best tools to combat that type of mis- and disinformation?

Rebecca MacKinnon (03:24:13):

Right. Well, what Martin’s talking about makes it really important to distinguish between misinformation, which is inaccuracies that people spread but not necessarily intentionally, and disinformation, which is spreading false information with intent. The scientists and others who are attacked are attacked by people seeking to spread disinformation, who are seeking to promote a different narrative that those scientists are contradicting. We see this at the Wikimedia Foundation and with Wikipedians as well. While Wikipedians are well-equipped to counter misinformation on the platforms themselves through their editing process, when there is a powerful government, or a corporation, or some political movement that is spreading disinformation and trying to discredit the public figures and scientists about whom we have Wikipedia pages, or the people who edit Wikipedia themselves, the disinformation campaigns are combined with threats: with jailing, with physical violence, with smear campaigns, with what we call doxing, outing people online, exposing where they live, suggesting that mobs go after them.

(03:25:47)
So one of the things we find is that, in order to achieve knowledge equity, and in order to protect our community as an antidote to disinformation as well as misinformation, we have to put more and more resources and attention into the safety and security of everyone who contributes to the platform, and of our readers as well. Because in many cases, in some countries, accessing certain articles, and being known to access certain articles frequently, can put a person at risk.

Sumi Somaskanda (03:26:23):

Very quick final question to you, Niamh. How can different disciplines, journalists, academics, civil society, and scientists, work together to fight mis- and disinformation?

Niamh Hanafin (03:26:34):

Yeah, it’s really hard. We talk about multi-stakeholder approaches and we want them to work, but we know it’s incredibly challenging to get people who look at this from very different angles, and who speak very different languages, to work together effectively. What we’ve seen at UNDP working really well at the national level is when you have a really strong set of data and evidence coming into these forums and platforms that have diverse groups and diverse representation in them, and then, within them, you determine your collective action together. So deciding, for a certain piece of election disinformation, in what way journalists will respond to it, how election organizers need to respond, how the security forces may need to react, how civil society can change its messaging around the electoral civic engagement work that it does.

(03:27:25)
But you’re working off the same information. You have an agreed set of facts that you determine and you classify and prioritize as being important, and you work on them together. I think that way you get this really cohesive, coherent response, which is something that we definitely want to support within UNDP.

Sumi Somaskanda (03:27:43):

And something all three of you definitely share; I’ve heard that message from all three of you. So thank you for the really interesting conversation, Niamh, Martin, and Rebecca. We’ll definitely be following up on these thoughts through the course of the afternoon. Thank you.

Rebecca MacKinnon (03:27:55):

Thank you.

Speaker 5 (03:27:55):

Thank you.

Sumi Somaskanda (03:27:56):

Our next two guests took the stage just about 30 minutes ago. Take a look.

Elizabeth Loftus (03:28:00):

Out there in the real world, misinformation is everywhere. We get it when we talk to other people. We get it when we’re interrogated by somebody who maybe has an agenda, and even inadvertently suggests things that aren’t true. We get it when we pick up newspapers or online news and we are exposed to some misleading information.

Asa Wikforss (03:28:26):

And if you think about it, most of what you know about science, even as a scientist, or about society, about history, geography, what have you, you don’t know through direct experience. You know it because you’ve heard it from a reliable source. And this is our great strength. Of course, it means we have knowledge accumulation. We have this division of cognitive labor, which means we can create this amazing amount of knowledge and we can accumulate it across generations. But it is also a great vulnerability because it means that knowledge requires trust. I won’t get the knowledge from the reliable sources, unless I trust them. And if I trust the wrong sources, I’ll end up in a bad place.

Sumi Somaskanda (03:29:06):

All right, from the stage into our digital studio, we have both of them with us right now: Elizabeth Loftus, Distinguished Professor at the University of California, Irvine, and Åsa Wikforss, Professor of Theoretical Philosophy at Stockholm University. I got to watch you both on stage, really interesting discussion. Now we get to dig a little bit deeper together here in our digital studio, a little bit behind the scenes. So Elizabeth, I’ll start with you. What are the sociological reasons that we have mis- and disinformation? We’re using both terms because we do want to address both, but what are the reasons?

Elizabeth Loftus (03:29:40):

First of all, people get misinformation from a variety of sources. Somebody is describing an event that happened and they happen to make some honest mistakes. Somebody does an interrogation and they’ve got an agenda about what happened. They communicate their theory, their hypothesis, even inadvertently. Mostly, what I study is people who innocently are communicating information. That’d be more the misinformation example. Because with personal memory, sometimes people do it deliberately and they’ve got some motive for doing it. They want to win money or they want somebody punished, and so they’re going to basically lie.

Sumi Somaskanda (03:30:27):

What do you think of this, Åsa? What do you see as the reasons that we have this phenomenon? And of course, it’s not a new phenomenon, but it has thrived in the current moment.

Asa Wikforss (03:30:37):

Well, it is the new information landscape. I mean, you can’t get around that fact. We went from a situation in the 20th century of a fairly limited number of fairly reliable sources, to a situation where researchers call it a high-choice information landscape, where the choice of sources is in principle, unlimited. The choice falls increasingly on us as individuals. It used to be that we got our newspaper and our TV channels, but now I can go out and find whatever I want, and things don’t go so well. First of all, we’re inclined to go to sources that confirm what we already believe and what feels good. But it’s also a very arduous task. Determining whether a source is reliable or not requires knowledge and time that we don’t often have. So it’s a very, very difficult situation to navigate for us as individuals, and huge possibilities for those who want to disinform and manipulate and spread propaganda, to do so.

Sumi Somaskanda (03:31:35):

It’s overwhelming, isn’t it, for an average user? I’m a journalist by profession, and of course I find it difficult at times to distinguish what we should be looking at: what is fact and what is not?

Elizabeth Loftus (03:31:48):

Your question is making me think of something that happened to me. I’m very skeptical about information and so on, but not long ago I went to share something on Facebook, and up popped this message that said, “Do you realize this article you’re about to share is eight years old?” And I thought, “No, I don’t want to share something eight years old, I thought this was brand new.” And I thought, why can’t more of these messages pop up to get us to stop and think? Because getting people to stop and think goes a little way toward helping them fend off bad information.

Sumi Somaskanda (03:32:25):

What do you think makes us vulnerable to misinformation?

Elizabeth Loftus (03:32:30):

Well, one of the things that she said so articulately is that people seek out information that confirms what they already believe, their prior biases or wishes, and so on. And I know from the work I do that it’s easier to plant false information when somebody wants to believe it.

Sumi Somaskanda (03:32:53):

Sure. Yeah, that’s a confirmation bias, isn’t it, Asa?

Asa Wikforss (03:32:56):

Yeah, that’s confirmation bias. There’s also motivated reasoning, which is the desire to believe certain things. You want to hold onto a belief for whatever reason. It’s become a valuable belief to you. It can be a belief about yourself or your children, or about vaccines and the climate. And then if someone comes and challenges that belief, you feel challenged. You feel threatened, and then you respond by resisting the evidence. That’s the phenomenon of knowledge resistance that the research program I’m now leading is studying. So that’s another thing that’s going on here. And that interacts with the new information landscape and the unreliable sources in a very complicated way, where indeed our desire to hold onto certain beliefs can be fulfilled if we just pick the right sources.

Sumi Somaskanda (03:33:42):

So how can we ourselves be better at recognizing, and even defending ourselves against, mis- and disinformation?

Asa Wikforss (03:33:52):

It’s not wrong to ask that question, but it’s wrong to stay with that question. I think there’s been a lot of focus on what we can do as individuals, but it ends up being a bit of victim blaming, I think, because again, we are so vulnerable to this, it is so hard to determine. There are certain rules of thumb that we should stick to. We know that we should slow down a bit, like you said: “Hey, wait a minute, do I really know what this is before I share it?” These sorts of things we can do. But the idea that we can solve this by appealing to people’s critical thinking skills, or improving their critical thinking skills, is naive.

(03:34:27)
For one reason, how good you are at critical thinking depends on what you already know, what you already believe. If you’ve been fed a lot of misinformation already, your critical thinking skills are not going to be able to detect what’s unreliable. So there’s this very complicated dependency on background information when it comes to critical thinking. So it’s wrong to put the focus only on the individual, though it’s not wrong to say something about that. We also need to look at the larger media landscape, and think about the responsibility of big tech and the platforms, of course.

Sumi Somaskanda (03:34:58):

So you’re talking about types of regulation to make sure that-

Asa Wikforss (03:35:01):

Well, we may need some type of regulation. And when you say that, people scream, “Oh, censorship.” We don’t have to do that. We don’t have to forbid certain content, but we can demand of the platforms, for example, that these warning signs pop up more regularly, that there are these automatic fact-checking tools. We can use AI now. We know AI is going to be used for all sorts of horrible things, but it can also be used to signal when things are unreliable and so on. So use technology to help us as individuals not to get lost. And then also, of course, change the algorithms and the reward systems, which currently don’t reward the spread of reliable information. We are rewarded when we stay on the platform, and we stay on the platform when things are sensational and emotionally charged and all that.

Sumi Somaskanda (03:35:46):

Right, the types of information. Before I ask another question, we have a really active and engaged audience online following from around the world, and I want to bring in one of the comments that we noticed a little bit earlier, because it came in just after, I believe, you spoke on stage.

(03:36:00)
Let’s see if we can pull that up. This was something that one of our online viewers wrote: “Miss Beth’s closing remark makes a lot of sense. False events can be equally appealing as true ones. Emotions are subjective. To justify the truth of something, we need an investigative approach to seeking truth. We cannot just rely on how we feel about things. Thank you for sharing this out-of-the-box perspective.” What do you think about it?

Elizabeth Loftus (03:36:25):

Marry me, okay.

Sumi Somaskanda (03:36:29):

We’ll connect you two.

Elizabeth Loftus (03:36:29):

I like that comment.

Sumi Somaskanda (03:36:31):

Yeah, indeed.

Elizabeth Loftus (03:36:31):

I like that comment.

Sumi Somaskanda (03:36:32):

Speaks directly to what you were touching on onstage there. So given what you both have just said about mis- and disinformation: you gave us an example of sharing an article and getting that message popping up, saying this is eight years old. What other solutions or tools have you seen to combat mis- and disinformation that you have found effective?

Elizabeth Loftus (03:36:52):

Well, one of the talks I heard today, which was fascinating, was about trying to reward people for sharing accurate information or punish them for sharing inaccurate information. I thought that was fascinating. I’m curious what you would think of that, because I think what offering these kinds of rewards is really doing is getting people to stop and think. So maybe we can find other ways, perhaps more efficient or more practical, of getting people to stop and think, or of rewarding or punishing them, which I think might do something similar.

Asa Wikforss (03:37:29):

Yeah, no, definitely. There’s quite a bit of research showing that it matters what kind of reward you get from it. And we also know that it matters if you tell people that in this case you are going to determine whether it’s true, and it really matters whether it’s true. If you really increase the importance of truth, if you make that more salient, people do better. So there are things like that. There’s framing, which we heard about before. We know that matters a lot, how things are presented, especially when it comes to science denialism. How things are framed matters a lot to how people accept the science. We know that. But again, we need systemic changes as well, I think.

Sumi Somaskanda (03:38:11):

You’re saying it’s not just up to us as individual users or [inaudible 03:38:14].

Asa Wikforss (03:38:14):

No, it can’t be. It’s like climate change. It’s good to talk about each of us flying less and eating less meat, but we can’t put the blame only there. We have to look at the systems that are making this a very big problem, and the solution is not going to lie at the individual level.

Sumi Somaskanda (03:38:28):

Very interesting point. So looking to the rest of the summit, this is three days, and of course, there’s the rest of the program on the main stage as well. I’ll start with you, Elizabeth, what are your hopes and expectations for this summit? What do you hope to see and hear, and perhaps learn?

Elizabeth Loftus (03:38:44):

One of the things I talked about was the two different kinds of misinformation: misinformation about facts about the world, like misinformation about health issues or climate or whatever, and then this personal-memory misinformation. I think these two scientific literatures are in two different places. I and my colleagues are primarily publishing in memory journals. Other people may be publishing in different kinds of public policy journals, and both subgroups have been working on ways to minimize this problem or prevent it. Well, minimize is probably the most we can hope for. But I think they need to be talking to each other and learn what each other has learned.

Sumi Somaskanda (03:39:34):

Who might you be talking to then during this summit?

Elizabeth Loftus (03:39:36):

Well, I probably should talk a little bit more to the health misinformation people.

Sumi Somaskanda (03:39:42):

There you go. And Asa, what are you looking forward to seeing and perhaps hearing over the course of the summit?

Asa Wikforss (03:39:47):

No, I think the cross-disciplinary aspect of this is absolutely key. It’s a cross-disciplinary problem. We can’t do it just in psychology or just in political science. So I think that’s one of the beautiful things.

Elizabeth Loftus (03:39:57):

Philosophy.

Asa Wikforss (03:39:58):

Philosophy is absolutely part of this. We provide the big-picture framework where you have to think about this. But I’m also very, very interested to see the exercise in deliberative polling we have for the digital audiences. It’s an innovation in democracy that’s been used a lot over the last few decades, where people get to actually deliberate about complicated policy issues. They get expert input. They have moderators who help them stay objective and try to reason in nice ways and be nice to each other. It brings reason into democracy and into policymaking in a way that I think we need to strengthen. We’re doing this online now, and the topic that people get to deliberate about is precisely what to do, policy-wise, about the mis- and disinformation problem. So I’m very curious to see the results of that.

Sumi Somaskanda (03:40:54):

And beyond the summit, is there something you’re looking forward to taking back into your work in addressing mis- and disinformation?

Elizabeth Loftus (03:41:02):

Well, I’m anxious to see what AI is going to bring to explode this problem even more. And then we’re going to have to make a better shield to defend against that sword.

Sumi Somaskanda (03:41:19):

And you, Asa?

Asa Wikforss (03:41:21):

I think it’s a lot about making these connections across disciplines, but also across continents, because people across the world are facing these issues now. I lead this research program where we have European researchers, but it’s very nice to meet researchers from other parts of the world now, see what they’re doing, and really try to put together the very best knowledge that we have so we can handle these challenges.

Sumi Somaskanda (03:41:44):

Well, this has been a great meeting of minds. Thank you both very much for the conversation. A lot for me to think over as well. Asa and Elizabeth, wonderful to have you, and looking forward to seeing you over the course of the summit. Thank you.

Asa Wikforss (03:41:54):

Thank you.

Elizabeth Loftus (03:41:54):

Thank you.

Sumi Somaskanda (03:41:56):

You’re watching the Nobel Prize Summit live from the heart of Washington DC. We are at the National Academy of Sciences. Now, we asked David MacMillan, who was awarded the Nobel Prize in chemistry in 2021, to share his view on misinformation. Here you go.

David MacMillan (03:42:12):

Hello, my name is David MacMillan. In 2021, I won the Nobel Prize in chemistry. Why am I telling you that? Well, after you win the Nobel Prize, one of the things that happens is you’re invited to give lots of talks to the general public about who you are and what it is that you have accomplished. And during that, you get lots of questions along the way. And around this idea of misinformation and disinformation, a question that I receive all the time in these public talks is, “Do you believe that vaccines are actually useful?”

(03:42:48)
For someone like myself, this has been extremely dispiriting in different ways. Because as a scientist, I’ve grown up knowing the value of science. I’ve grown up knowing the value of medicine. I’ve grown up knowing the value of vaccines. And science, if you think about it, is everywhere around us; every single thing around us is a direct result of science, of getting to the answers, of getting to knowledge.

(03:43:13)
Everything that we learn about or everything we do is based upon science in so many different ways. So when people ask me questions about vaccines, it’s really, really tough. The other part of that is that my wife, my students, the people who I grew up with as chemists, they all work in the healthcare industry. They all work for pharmaceutical companies, and they are literally spending all of their energy and all of their efforts to try and develop medicines for humankind.

(03:43:42)
I personally view this as one of the most noble jobs that anyone can have: to spend their life worrying about science as a means to deliver new medicines for humans. They take that so seriously and they care so passionately about it. And then to be in a general public seminar and have someone ask me, “Do you even believe in vaccines?” So for my own reflections, I believe one of the most important things we can do as scientists is to really think about the ways in which we can communicate to everyone that the people who actually care about medicines, the people who care about vaccines, they’re not doing it for political reasons, they’re not doing it for financial reasons, they’re doing it because they care about everyone. Thank you.

Sumi Somaskanda (03:44:27):

Thanks to David MacMillan for sharing those thoughts. And as the guests here at the National Academy of Sciences are having lunch, we got to grab the host from the main stage, Kelly, who’s with us here in our digital studio. I got to visit you on stage, and now you get to visit us here in the digital studio. What’s happening right now at lunch? Is everyone just trying to break down everything they saw?

Kelly (03:44:47):

I think so. People are out there really meeting each other. There’s something really special about the people who are here, I feel like. It’s just they’re connecting through these ideas and through this thinking that they’re learning about. I did have a chance to hang out in there, and I met some people who have traveled from a lot of different places far away who were meeting up with other people and really talking about some of the content that they’d heard. It was really fun to see. It’s energetic out there.

Sumi Somaskanda (03:45:20):

Such an interesting mix of speakers on the stage as well. We heard really different perspectives from a magician to scientists to academics. I would love to get your perspective on what you heard and saw on stage. What really stuck out to you?

Kelly (03:45:34):

I mean, wow, it’s almost been a journey, I feel like. I love the way that it’s unfolded, talking about history, and now this current point in time. I think each one, I guess, has certain elements that resonate with me. But really, the ways in which our minds trick us, and how much we believe that we’re sometimes right, from the magician and also from Elizabeth; learning how we take stuff in and it’s wrong, and how we remember even things that we saw correctly in the wrong way. It’s a lot to think about. We have a lot of responsibility as we think about the information that we share out in the world. I’m really thinking about that and how we go forward.

Sumi Somaskanda (03:46:34):

It’s a really interdisciplinary approach too, isn’t it?

Kelly (03:46:36):

Mm-hmm.

Sumi Somaskanda (03:46:37):

Do you get the sense that people from different disciplines are now over lunch, for example, meeting and exchanging ideas about what to do on this topic?

Kelly (03:46:45):

Yeah. I think too, another point that we’ve heard a lot about from several of the speakers is how it’s really this misinformation that’s creating a bigger divide among us in the world, the way we more fervently believe one side or another. So I think it’s great to be able to recognize that, and the science behind some of what’s happening, and then get together with other people to talk it through.

Sumi Somaskanda (03:47:12):

What are you looking forward to seeing this afternoon?

Kelly (03:47:15):

Well, chapter two is a bit more about what’s happening currently. There are some fun and interesting surprises and format changes and things like that that’ll sort of shake it up. What I’m especially looking forward to, of course, is the third chapter, which is about hope. In that chapter, we’ll talk about tools that we can use to move forward and build a future that is hopeful and full of truth and harmony and all of that. So of course, I’m looking forward to that part, because it’s important to have hope and it’s important to work toward something good.

Sumi Somaskanda (03:47:59):

And to have the tools to be able to use.

Kelly (03:48:00):

To have the tools, exactly.

Sumi Somaskanda (03:48:01):

Do you think, beyond the summit, this will change how you approach mis- and disinformation when you come across it in your life?

Kelly (03:48:08):

I hope that it will. I feel so far I’ve learned even more about how deeply pervasive the problem is. We already knew that, but I’m learning more today about how it exists within me and my biology. So I plan to, because it’s just about as important a thing to fix as I can think of.

Sumi Somaskanda (03:48:39):

Thank you, Kelly. We’re going to let you go get some lunch and coffee before things get started again on the main stage. But it was great to have you in our digital studio. We’ll be watching you back on stage. Thanks so much, Kelly.

Kelly (03:48:48):

Thank you, Sumi.

Sumi Somaskanda (03:48:49):

At our last Nobel Prize Summit in 2021, we saw laureates, academics, and thought leaders come together to talk about the climate crisis. One of the participants was Carl Folke of the Stockholm Resilience Centre. Take a look at this video.

Carl Folke (03:49:04):

I think that summit was extremely important in creating a consensus on the challenges for our species on Earth, into the future. And basically because it was now recognized that it’s not just the climate as such, as an isolated issue, or environmental challenges that it’s all about. It’s really about our own future and our own civilization’s welfare and progress on Earth that we are talking about when we talk about climate change and environmental challenges or biodiversity loss.

Owen Gaffney (03:49:37):

And one of the big outcomes from the summit was a statement, and you led the development of that statement, and it was signed by over a hundred Nobel laureates, 126 I think. It has been signed by more Nobel laureates than any other statement. What were the key points in that statement about the transformation that we needed to make, and why was that important?

Carl Folke (03:50:01):

I think one of the key points was the urgency issue. This is not something that can happen several decades into the future; it’s now. We are in the Anthropocene right now. We have left the stable Holocene era and are moving onto a new trajectory. It’s not just the game that is changing, it’s actually the whole playing field for our game that is changing on Earth. That means it’s not enough just to talk about getting rid of some emissions; it’s really about finding new strategies for our own future.

(03:50:38)
We discussed that in the context of climate change and biodiversity loss, as well as in the context of inequality, and also in the context of the enormous technological revolution we’re in right now. Both these problems, but especially the great opportunities, can help us in redirecting how we do our business on Earth.

Owen Gaffney (03:50:57):

What’s your hope for this summit, Truth, Trust, and Hope? What do you hope this summit can achieve?

Carl Folke (03:51:05):

I really hope that the summit can establish the role of science as a very critical sense-making body for navigating the big changes that we are going to experience on Earth right now. Because as I said before, we’re already out of the stable Holocene, into the Anthropocene. We need to take away actions that are meaningless or actions that are completely wrong, or even misinformation that tries to stop the actions, and really have science as a backbone in this complex navigating process into our own future.

Owen Gaffney (03:51:41):

Why do you think these Nobel summits are important?

Carl Folke (03:51:47):

I think they’re extremely important because they gather a lot of Nobel laureates and a lot of other decision-makers at the very highest level around critical topics for human wellbeing and welfare, for economic and social development. So I think they are extremely, extremely timely, and extremely important for setting the stage for many of the challenges that confront humanity.

Owen Gaffney (03:52:12):

Given everything you know, you are one of the leading experts in the world on the state of the planet, are you optimistic about our future?

Carl Folke (03:52:22):

If I had started with these topics just a few years ago, I would’ve been really pessimistic, I think. But since I started already in the early eighties, I think we are in a mental and cultural awareness revolution that is now starting to generate action, and it’s going really fast. That’s why I’m hopeful. You can easily be depressed and think that we are too many, we are doing too much, and we don’t understand that we’re living on the planet. But I see lots of good signs that we are changing in the right direction, and that makes me hopeful.

Sumi Somaskanda (03:52:54):

All right, a longer version of this conversation with Carl Folke will be playing on our digital program tomorrow at 1:00 PM, so be sure to tune in to that. Now we’re going to shift gears a little bit. The Smithsonian Science Education Center assembled a group of high school students to participate in a 12-week program to find sustainable solutions. It’s called Sustainable Communities, and the goal is to identify a real-world problem and come up with a solution to it in the local community. Students collect data and create a local solution, and their work is presented here at the Nobel Prize Summit as a video. So we’ll show you some of this video.

Speaker 6 (03:53:37):

Hi, my name is Shrudi Vasco Leputi, and I’m from Los Angeles, California.

Catalina (03:53:41):

Hello, my name is Catalina, and I am from Argentina.

Sarah (03:53:44):

Hi, I’m Sarah, from Mexico.

Carrie Kang (03:53:47):

Hi, my name is Carrie Kang and I’m from Annandale, Virginia.

Alice (03:53:50):

Hi, my name is Alice, and I live in [Inaudible 03:53:53] in the center of France.

Zoe Jung (03:53:55):

Hello, my name is Zoe Jung, and I’m from Flower Mound, Texas.

Abigail (03:53:58):

I’m Abigail.

Olivia (03:53:59):

And I’m Olivia.

Abigail (03:54:00):

And we live in Arkansas. We’re at our composting area outside of our school. We realized there was a need for composting when we saw the amount of food waste accumulating at our school.

Layla Hunter (03:54:16):

Hi everyone, I’m Layla Hunter, and my concrete action plan is about addressing the unequal distribution of shared spaces in Fairfax County.

Speaker 8 (03:54:25):

Instead of being able to cut down its carbon footprint, South Korea suffers from heavy traffic congestion, poor air quality, and excessive global carbon emissions.

Speaker 7 (03:54:37):

Our project hopes to achieve outcomes that promote sustainable choices about the production and use of electricity. We specifically wish to encourage action by fostering a sense of responsibility in the community.

Speaker 9 (03:54:51):

This stigma has its roots in the social history of rank, race, and occupation, residential segregation, and the historical usage of the bus system by non-car owners who needed to commute into the city center for work.

Speaker 10 (03:55:03):

The main issue that I noticed in my area is that access to diverse public transportation is too limited.

Speaker 11 (03:55:17):

I led a group of students in researching, funding, and carrying out a plan in which we partnered with the school’s horticultural department, as well as local nurseries, to receive native South Floridian plants and plant them on our campus in sectioned-off areas. The project was a success and is now being taken up and continued by younger students to maintain this natural, man-made balance.

Speaker 12 (03:55:37):

To reduce these emissions, a few friends and I started what we like to call the carpool project.

Speaker 13 (03:55:43):

My plan is to spread these both on social media and hang them up in areas where recycling misinformation is prevalent.

Speaker 16 (03:55:48):

First, to prevent the use of chemical fertilizers, we need to search for alternatives, like composting.

Speaker 13 (03:56:04):

So what I asked myself was, what is a way I can reuse all of this paper? That is when I decided to make these biodegradable seedling pots made from a mixture of paper, flour, and water.

Catalina (03:56:36):

Climate change is a global issue that affects everyone around the world, and as the next generation, we will be the ones who are most affected by the climate crisis.

Speaker 14 (03:56:46):

You are important. It may seem that, as a citizen, you can’t do anything big to act against environmental issues. However, you can really change everything just by modifying some small habits in your everyday life. As mentioned before, it is like a snowball effect. If one person acts, another will be more likely to act as well.

Speaker 15 (03:57:08):

I liked how my team and the participants in the program came from different communities and cultures all around the world, but everyone was focused on one goal, making a change no matter how small.

Speaker 17 (03:57:21):

With all this in mind, we hope that we can somehow make our communities a better place. Thank you.

Sumi Somaskanda (03:57:42):

Inspiring projects there. We have three students who worked on this program with us in the digital studio now, so I’m going to introduce them to you. So Brevan de Jesus is here. We also have with us Layla Hunter and Zoe Jung. Welcome, and congratulations on your projects. So Brevan, I’ll start with you. Your project was in Virginia Beach, Virginia. Tell us what it’s about.

Brevan (03:58:03):

So basically I was just trying to find out a bit more about our transportation in Virginia Beach as a whole. I was really looking for the problems with it, for the most part. Seeing what could be done better about it, and what we could fix overall about it.

Sumi Somaskanda (03:58:18):

What is the transportation in Virginia Beach like?

Brevan (03:58:22):

We have a local transit system called Hampton Roads Transit; it’s our bus system, basically. And I wanted to see how we could fix it, make it better, make it more accessible to everybody in Virginia Beach and in the Hampton Roads area.

Sumi Somaskanda (03:58:35):

And why was that for you an important sustainable solution to work on?

Brevan (03:58:39):

Because in Virginia Beach, we’re a sprawling city. A sprawling suburban city. And it’s just kind of everywhere. You can’t really go and take a bus from one place to the other. You rely on your car for the most part in Virginia Beach.

Sumi Somaskanda (03:58:58):

So for people who don’t have cars, they can’t get around.

Brevan (03:59:01):

Yeah. Not that easily. Yeah.

Sumi Somaskanda (03:59:02):

Okay. And what did you find out while researching this?

Brevan (03:59:07):

The biggest problem that I found was that our website, the Hampton Roads Transit website, didn’t have all our bus stops marked. So you could see where the routes were, but you really just couldn’t find out where said bus stops were on the routes. So I feel like this was a problem for people trying to find out where exactly they could get the bus.

Sumi Somaskanda (03:59:32):

Okay. That is a really important problem. If you don’t know that, it’s hard to get the bus.

Brevan (03:59:35):

Yeah, exactly.

Sumi Somaskanda (03:59:36):

We’ll come back to some of the solutions. I want to come to Layla now. Layla, your project is in Fairfax, Virginia. What does it do?

Layla (03:59:43):

So my project is about shared spaces in Fairfax County, and about the unequal distribution of the wealth that each part of the county gets. I’m from Oakton High School, which is one of the rich high schools, and we just got a renovation. But I noticed that when I went to advisory board meetings, I’d go to other schools and their high school looked like a bad middle school. It just hadn’t been updated and renovated. And I wanted to learn about the distribution of the money from Fairfax County.

(04:00:22)
And from there, I looked into the distribution of more shared spaces, such as playgrounds. And I learned that a lot of the underdeveloped areas, or the areas that didn’t get that much money, had way fewer playgrounds, and some didn’t have any playgrounds, whereas my community has a lot. And so that’s where my project came from.

Sumi Somaskanda (04:00:48):

Were you surprised by what you found?

Layla (04:00:50):

Yes, because I didn’t think that Fairfax County was really like that. We are one of the richer counties, so I thought it’d be more equal, and I just found out that it wasn’t. The schools around me look similar to mine, but if you go 30 minutes out, because Fairfax County’s a large county, it just wasn’t the same as my school, and I wanted to know why.

Sumi Somaskanda (04:01:19):

And shared spaces, you mentioned playgrounds, green spaces. Why are those important? Because sometimes we don’t even really notice those when we’re walking down the street.

Layla (04:01:27):

It’s important because that’s where I feel like I met a lot of people: in shared playgrounds and shared spaces in general, like churches and community centers, rec centers. There are not as many rec centers if you go 30 minutes away from my house. Near me we have Cub Run, there’s Oak Marr, there are many. But the farther out you go in Fairfax County, it doesn’t look like there are as many as where I come from.

Sumi Somaskanda (04:01:56):

Those are places for people to gather, you’re saying. Okay, that’s important. And Zoe, we’ll come back again to the solutions in a moment. Your plan is in Dallas, Texas. So welcome to Washington DC. Tell us about what your plan looked at.

Zoe (04:02:10):

Yeah. So I’m actually from a smaller suburb right outside of Dallas, Texas. And right next to my community there was another community that was built, and it was the first walkable community of our town. So it was a really huge deal for all of us. However, it lacked one essential element, which was affordability. And we see a lot of times that suburban people, especially young people, are not involved and not really able to get access to these topics surrounding sustainability and climate change, because oftentimes we are kind of out of it, and a lot of times the stuff happens in urban communities. And so this was a really interesting project for me, to be able to explore how suburban communities still play into all these zoning laws, these zoning commissions, and everything relating to that.

Sumi Somaskanda (04:03:11):

Why walkable communities? What’s important about being walkable?

Zoe (04:03:14):

So I believe that walkable communities are essential to really anywhere and to really any group of people. Because with mixed-use development, I believe that, number one, it creates economic revenue, so it’s not just for some people, it’s for everyone. And when we are able to facilitate walkable communities, we are able to allow access to everything that we really need within close to a 15-minute radius, which is super helpful if you don’t have a car. That’s a huge thing, because I personally don’t drive yet, and so oftentimes I don’t have rides, I don’t really have access to grocery stores. And also, just in general, I feel like if you don’t have a car in a suburban town, you can’t get anywhere. So walkability is super important in that sense, where we can really just get what we need within a walkable radius without having to depend on transportation, because that’s not an option here.

Sumi Somaskanda (04:04:20):

Yeah. All three of you are talking about accessibility, really, and the barriers to that type of accessibility in your daily lives and the spaces around you. So let’s talk about some of the solutions. Brevan, what were you able to achieve, let’s say, and what are you hoping to see change?

Brevan (04:04:34):

What I’m really hoping to see change is for our website to be updated so it shows our routes and our stops. But I’m also trying to see how we can make our public transportation more accessible to everybody, whether you come from a high income or a low income. I want to see people be able to just take the bus, and maybe not have to rely on their own personal transportation, like a car or whatever they may drive. I want to see our local transportation be more accessible to everybody.

Sumi Somaskanda (04:05:11):

How about you Layla?

Layla (04:05:12):

I would like to see more students show up to school board meetings, especially from different schools around our county, because they are so different, so they could have their voices heard by our superintendent, Ms. Reid. Because I feel like that would foster a lot of changes in our community, and it would help them to see the changes that they want to see at their schools and in their shared spaces.

Sumi Somaskanda (04:05:40):

Show up and have that representation. Okay.

Layla (04:05:42):

Yes.

Sumi Somaskanda (04:05:42):

And Zoe.

Zoe (04:05:43):

I mean, I really think it comes down to civic engagement, among young people especially, but also everyone else, because this is an issue that affects primarily young people but really everyone, due to many different factors. And so I believe in calling your representative, emailing your representative, writing letters, and just making sure that your representative knows that this is an issue that you care about. Because I can’t walk into a leasing office and be like, “It’s too expensive.” There are better ways: we can enact legislation that will in turn help so many different people, and not just our community, but other communities that face the same issue.

Sumi Somaskanda (04:06:29):

Okay. Well all three of you are certainly speaking up. Last question to all three of you. What have you learned from each other’s projects, and the other students that you worked with?

Brevan (04:06:37):

What I really learned is that transportation and all these other problems aren’t just things that we face in our own communities. These are things that happen all across the world. I was talking with some of the people I worked with about how bad it was having our bus stops not labeled or anything like that, and not having bus stops close enough to a lot of people. And then I found that some of my teammates in France, and some of them in Richmond, were talking about how there aren’t a lot of bus stops in their areas either. And I was especially surprised about Richmond, because of how big the area is, how developed it is over there.

Sumi Somaskanda (04:07:24):

How about you Layla? What have you learned?

Layla (04:07:26):

I’ve definitely learned the need for more accessible and sustainable areas. Not from my partners, but in another group, I met a girl, and she talked about how her community keeps changing because of climate change and the need for more accessible spaces. And how there’s a lake near her community that keeps getting cleaned, but then it keeps getting littered again. And so the need for more sustainable actions and more groups to help pick up the community.

Sumi Somaskanda (04:08:04):

And Zoe, what have you picked up from your fellow students?

Zoe (04:08:07):

Well, I’ve seen a lot about how different factors within a community and within a shared space all really point to how we need to change, and how we need to start changing and really putting our voices out there. Because we all have different issues, and they all pertain to sustainability and having a better, more well-rounded community, in terms of providing more access to certain things or other things.

(04:08:39)
However, I think what I learned the most from everything was how we all come from different communities, and we all bring different problems to the table, but we cannot solve them without initiative. And I learned that through all these people who were so ambitious about different issues that all sat under the same umbrella. I thought it was really interesting to see how we can all come together from different parts of the world to make change and to bring change.

Sumi Somaskanda (04:09:09):

That’s a really good note to close our conversation on. So congratulations on your projects, Zoe, Layla, Brevan, great to have you with us. Thank you for sharing your projects with us and for the conversation. And you can see all the student videos in our program at 2:00 PM tomorrow afternoon.

(04:09:25)
Okay, we’re going to show you a clip now that’s just a little taste of the next conversation, a clip from earlier on stage with Gizem Ceylan. Take a look.

Gizem Ceylan (04:09:33):

But one question that you might be wondering about is: how do we build social media habits? How do we come to spread misinformation? You build information-sharing habits on social media the same way you build any other habit. You do the behaviors that lead you to rewards, repeating those behaviors until they become automatic and unconscious.

(04:09:58)
But on social media, if you think about it, the rewards are social, right? You gain attention, you build connections through likes, re-shares, and new followers. So when you’re new on a platform, you don’t know which posts are going to get you those rewards. But you learn through practice: you share some posts, and that gets you rewards. And you share other posts, and you don’t get any rewards, unfortunately. But over time you repeat the behaviors that led you to the rewards. And eventually your behavior becomes more automatic; you respond automatically to the cues in your social media environment, and you don’t really need to think about what you are doing much of the time.
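
To make that loop concrete, here is a minimal sketch in Python. It is not from the talk itself: the post types, reward odds, and learning rate are hypothetical illustration values, but the update rule shows how behavior that earns social rewards gets reinforced until it dominates.

    import random

    # Hypothetical odds that each kind of post earns likes or re-shares.
    REWARD_PROBABILITY = {"accurate": 0.3, "sensational": 0.7}
    LEARNING_RATE = 0.1

    # The user's learned tendency to share each kind of post.
    habit_strength = {"accurate": 0.5, "sensational": 0.5}

    for _ in range(10_000):
        # Choose what to share in proportion to current habit strength.
        post = random.choices(list(habit_strength),
                              weights=list(habit_strength.values()))[0]
        reward = 1.0 if random.random() < REWARD_PROBABILITY[post] else 0.0
        # Reinforce (or weaken) whichever behavior was just tried.
        habit_strength[post] += LEARNING_RATE * (reward - habit_strength[post])

    print(habit_strength)  # the sensational habit ends up the stronger one

Run long enough, the simulated user drifts toward sharing whatever pays off in social rewards, regardless of accuracy, which is the point Ceylan makes about reward structure rather than intent.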

Sumi Somaskanda (04:10:47):

That was a clip from on stage just a little bit earlier in the main program. And we’re going to talk now about how new technology plays a role in disinformation and what we can do about that. We have two guests with us here who I’m going to introduce in just a moment. Tristan and Gizem are with us. I’ll introduce you properly, but we do have some of the interaction that’s been taking place in our online stream that I want to bring in right now.

(04:11:11)
So something that one of our viewers online wrote is, “I am alarmed at the growth of information globally. We have enough problems to solve without disagreement over basic facts underlying those problems. How can we begin to move towards solutions?” That’s a good question. “Valuable information is buried in the literature. A way to evaluate this using AI would be extremely beneficial.” Okay. We got another one here. “To address mis/disinformation, we need to pay attention to issues such as the role that racism and inequity play in science.”

(04:11:42)
And let’s see if we have one more here from our viewers. “We should focus on helping people accept and embrace differences of opinion and perform their own reviews of data and research.” From another viewer. “Often scientists blame the public and the media for the spread of misinformation, but we accept that we’re not effectively communicating our science to the public.” An interesting point there. Last one here. “Addressing deficiencies in education, access to technology, and most importantly, promoting a critical thinking mindset would render dishonest sources useless.”

(04:12:15)
Important comments there from those who are watching us online around the globe. So I’ll introduce you two properly now. Tristan Harris, the co-founder and executive director of the Center for Humane Technology, and Gizem Ceylan, a behavioral scientist at the Yale School of Management and the Yale Center for Customer Insights. Right off the bat, Gizem, what did you think of some of those comments that we’ve been getting?

Gizem Ceylan (04:12:38):

I mean, what can I say? That’s so true. Misinformation has become a huge problem for everyone, from the health perspective, from the political perspective. People have lost their lives over this. And social media has been playing an important role. This morning I also talked a little bit about that. So far, when we look at misinformation research, we were always putting the blame on the people, and there are some comments that I see saying that we should tell people what to share, or we should tell people what to believe in.

(04:13:12)
But in my research, I mean, yes, we can tell people everything, but we have seen that these interventions were not that effective so far, right? And what I’m studying is what we can change in the platform design that is going to help us reduce the misinformation pandemic. Because currently, in the social media environment, people are getting social rewards for sharing very emotionally provoking information, information that’s not the most accurate but maybe the most emotionally charged, and it’s going to get the likes or the re-shares or the comments, the discussions from other people.

(04:13:52)
And if we can change this reward system? What if your friends gave you likes for sharing the most accurate information? Then your behavior would change accordingly, because this time you don’t want to get a comment saying, “Oh, that information is actually inaccurate. Why did you share that?” So I think our focus needs to shift from individual actions to the platform design, and how we can change this whole reward structure that is unfortunately making us more vulnerable to these social rewards and, as a result, sharing more misinformation.

Sumi Somaskanda (04:14:27):

Tristan, that’s something you’re looking at as well, isn’t it? How the onus is also on the new technologies and on platforms themselves.

Tristan Harris (04:14:35):

Yeah. Our focus for the last 10 years has actually been on how you change the platform design, because it always comes back to how the design of the technology changes the flows of human attention and information as it restructures that infrastructural pipeline we’re all immersed within. In terms of what can be done, I think one of the things we really need, as Gizem was saying, is to stop rewarding division entrepreneurs. Currently, the better you are at adding inflammation to a cultural fault line, the more we will pay you in likes, followers, and reach. That is a bad reward structure.

(04:15:14)
I would actually say that even if you got rid of all disinformation or misinformation, the subtle exaggeration and black-and-white framing of all information is a kind of deeper, subtler threat. Because what it means is that everybody still gets polarized. And if everybody’s polarized, we can’t actually coordinate and agree, especially in open societies and democracies, on what we want to do about any complex problem.

(04:15:35)
So how do you fix this? I would say one of the skills, and one of the reward structures, that we need is to reward those who are synthesis entrepreneurs rather than division entrepreneurs. What do I mean by that? I mean people who are good at steelmanning instead of strawmanning the other side’s point of view. Instead of giving the least charitable interpretation of what that person was trying to say, we need people who are the best at steelmanning: here’s what all these different perspectives are saying. Then any listener who sees that information sees a reflection, an echo of their own views mirrored back to them. And that creates a space in which people start to see more complexity, which is how we escape the polarization problem we’re having everywhere.

Sumi Somaskanda (04:16:19):

What would that look like practically? Give us an example in our current senses of the platforms that we use.

Tristan Harris (04:16:25):

A simple example: imagine a system where, just like we have jury duty every once in a while, there could be a little bit of rating duty, where we have to look at all the people we’ve been engaging with and ask which of these people appear to be doing more synthesis, steelmanning all these different perspectives. And whoever’s doing the best job of that, we subsidize. Almost like quantitative easing in our financial system, we do a kind of reputational easing, boosting those who are better at synthesis. I think there are also simpler ways; for example, not limiting everything to very short tweets. I mean, Elon Musk expanding the length of posts, I think that’s actually a great move for those who can say: here are these five different perspectives, here’s what they’re trying to say, and here’s actually a place of agreement among all of them. When people can see that there’s a way to hold multiple views at the same time, we need that kind of information reflected back to us. And right now that kind of information gets drowned out by the outrage machine, which gets disproportionately boosted otherwise.

Sumi Somaskanda (04:17:30):

But Gizem, do we have the attention span for that, for what Tristan just described there? Are we open to the idea that there is nuance?

Gizem Ceylan (04:17:37):

Yes, I was going to say that. I think some people are really searching for the truth, or trying to get multiple perspectives. For those people, I think that’s a brilliant solution. If I give you perspectives from all sides, maybe I’m going to convince you not to be polarized, but to be more neutral and process the information in a more unbiased manner.

(04:18:01)
But I think, as Tristan also said at the beginning, we still need some changes in the platform design as well. And one of the things in line with what you’re saying, I’m testing right now; there are no results yet. What happens if we put another button there? There’s the like button, there’s the love button, there’s the re-share button; what about another button for trusting some information? This would actually be compiled, and people could give a trust score to any content.

(04:18:35)
And this would help us actually capture a large spectrum of perspectives. I may be more conservative-leaning, and I will still process this information, and I might say distrust; you may be from the other side. But overall, where are we, on average? We believe in the wisdom of crowds, so where are we going to land overall? Maybe this is one way we are going to bring in a larger spectrum of perspectives.

(04:19:04)
And as a reader, when I see this, oh, this got some distrust, but this got some trust, I should be a little bit careful here. This is going to be a warning sign to the reader, saying that there is something else going on here: maybe I should double-check this information before I share it further, rather than believing it blindly.
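
As a rough illustration of how such a button might work, here is a minimal sketch in Python. Everything in it, the function names, the neutral default, and the warning threshold, is hypothetical, since Ceylan notes the study has no results yet; it only shows the wisdom-of-crowds averaging she describes.

    from collections import defaultdict

    WARNING_THRESHOLD = 0.4    # hypothetical cut-off for showing a warning
    votes = defaultdict(list)  # post_id -> list of 1 (trust) / 0 (distrust)

    def record_vote(post_id: str, trusts: bool) -> None:
        votes[post_id].append(1 if trusts else 0)

    def trust_score(post_id: str) -> float:
        """Average the trust votes from every reader, across the spectrum."""
        ballots = votes[post_id]
        return sum(ballots) / len(ballots) if ballots else 0.5  # no votes: neutral

    def label_for_reader(post_id: str) -> str:
        score = trust_score(post_id)
        if score < WARNING_THRESHOLD:
            return f"Caution: trust score {score:.2f}. Double-check before sharing."
        return f"Trust score {score:.2f}."

    # Usage: readers across the spectrum vote, and the aggregate is shown.
    record_vote("post-1", True)
    record_vote("post-1", False)
    record_vote("post-1", False)
    print(label_for_reader("post-1"))  # Caution: trust score 0.33. ...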

Tristan Harris (04:19:24):

And I would just add to that: if it’s just about trust, then people will often want to trust the things that they already trust. But if we actually deepen everyone’s definition of trust, to say that what we are asking specifically is who do you trust to reflect a more comprehensive view of the situation, which is again finding that synthesis, then when we’re capturing that signal, we don’t narrowly just reinforce the sources people already trust, which are already the polarized ones. We ask how we deepen what trust should be about, which is a level of comprehensiveness on any issue. Every issue is much more complex than what we’re currently making it out to be. And so I think trust is actually such a key factor in how we reboot the information ecosystem.

Sumi Somaskanda (04:20:05):

Absolutely it is. Trust is a currency, if you will, isn’t it? But how do we deal with all of this when the very platforms, and you both talked about platform design, are themselves changing, when the technology is changing? And I’m speaking specifically now about artificial intelligence. What kind of challenge does that pose in trying to find these types of solutions and tools? Maybe you can start.

Tristan Harris (04:20:26):

Well, if we try to be optimistic for a moment, and we all know that there are a lot of dystopian views here, one of the things AI and large language models could do is say: hey, what are the five different memetic tribes around this topic that are most in conflict? And then, what is the synthesis perspective that all five would agree with? And then generate a summary of that information.

(04:20:46)
Because actually, I agree with something you were saying earlier. If you show people more complexity by just saying here are 10 different sources that disagree on an issue, that actually doesn’t work, because people don’t have the attention spans for that, as you were pointing out. But if you synthesize it into a short, powerful synthesis of what I think I hear you saying, it’s just like when we’re building on each other’s conversation: I’d say, I think what you’re saying is this, and I check, is that true? And then I keep building on that.

(04:21:11)
What I’m doing in a motion like that is building trust. I’m showing I’m a good-faith actor. I’m not trying to say, well, the thing that Gizem said was this, and I’m not even going to ask her whether that was true or not. That’s bad-faith communication. We need systems that reward good-faith communication, which means communication that increases faith in the communication process itself. And large language models could be used again to both synthesize that information and search for those signals of who is communicating in good faith.

Sumi Somaskanda (04:21:36):

What’s your thought on this?

Gizem Ceylan (04:21:40):

I do agree with you on everything. But coming back to the social media environment and the limited attention span, we have to find a way. I think maybe we can use the new technology to do that, to synthesize this information and turn it into a score, or into something very tangible in the social media environment, like the likes or the trust buttons; something people can instantly use as a signal: okay, this is safe content that I can share with others. Or, this is something I really would like more people to see, because this is trustworthy, or this has accurate information in it, or this reflects a large spectrum of perspectives. But I think this is where all the entrepreneurs and the new startups should come in: how are we going to build this design, right? I mean, we are both saying, from the research perspective and from your analysis, that there is a need for an additional signal to tell people what they can trust and what they can share with others. But today, we don’t see that, unfortunately.

Sumi Somaskanda (04:22:50):

Yeah, go ahead, Tristan.

Tristan Harris (04:22:53):

I was just going to add that, as much as there should be competition, markets don’t work well with network-effect-based platforms. And so I think a faster way there, since we all care about this, and our democracies are falling apart because of the problems we’re talking about right now, would be some kind of law passed that made democracies the new fiduciary stakeholder. Right now these companies are obligated to maximize shareholder value, and not to care about the democracies or the mental health of the kids that they impact.

(04:23:21)
Just like there’s a duty of care in British law, if these companies had a duty of care to democracy, it would mean that they all actually had to rank not for engagement, but for the coherence of democracies, which means these sorts of synthesis incentives, and that they were actively doing experiments in that. So what we would want as a world is for Facebook, Twitter, and the existing platforms, because we need those platforms, to start working to the benefit of open societies in the fewest number of days, weeks, and years. Because right now they work exactly opposite to open societies: maximizing what each person finds most outrageous just shreds shared reality into a million pieces. And it’s very hard, as we all know now, to reassemble shared reality once that’s happened. So we really do need not just startups working on the side, though it’s nice to do those experiments; what we need is for the main platforms to actually have their incentives changed.

Sumi Somaskanda (04:24:09):

And a multidisciplinary approach, of course, which is what this summit is trying to get at. So I wish we could build on that conversation for another two hours, but I have to let you guys go. Thank you very much, Gizem and Tristan.

Gizem Ceylan (04:24:18):

Thank you.

Sumi Somaskanda (04:24:19):

Wonderful to have a conversation with you here.

Tristan Harris (04:24:20):

Likewise.

Sumi Somaskanda (04:24:21):

Thank you.

Gizem Ceylan (04:24:21):

Thanks so much.

Sumi Somaskanda (04:24:22):

So even if new technology can be a threat, digital solutions can also be a beacon of hope, as we just heard from our conversation. And earlier this year, the Digital Public Goods Alliance and UNDP led a global campaign to discover open source solutions, and the response was overwhelming. Nine solutions were selected. Take a look.

Video (04:24:50):

False, manipulated, and hateful information is undermining truth and trust, but there is hope. Passionate technologists are fighting back. Here is one of their stories.

(04:25:04)
Hi, I’m Fernanda, director of Open Knowledge Brazil. With a fantastic team and a community of hundreds of volunteers, we created Querido Diario, or Dear Diary. Brazil has 5,000 cities, and local policies impact the lives of 200 million people. Still, most cities don’t have transparency portals; they’re like data deserts. But those cities do have information, locked up in the shape of old printed newspapers, just like two centuries ago: the official gazettes, or diarios in Portuguese. They’re usually PDFs with no structured data.

(04:25:43)
We unlock that massive amount of text, enabling friendly searches. Then we share this data through an API, so third parties can also develop applications. It’s already being used to monitor environmental and education policies. Now we cover cities where nearly one in four Brazilians live, and it’s growing. Querido Diario is possible because it’s open. People help us build the code to scrape information. Universities also collaborate with us. Official information on public matters is crucial to fight mis- and disinformation. It helps make sense amid the information deluge. Querido Diario enables a healthier ecosystem by making this valuable data available to everyone.

(04:26:31)
The Digital Public Goods Alliance and UNDP’s Oslo Governance Center and Chief Digital Office are highlighting open source solutions for a more trustworthy and informed future.
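
For a sense of what “unlocking” gazette text for friendly searches might involve, here is a minimal sketch in Python, under stated assumptions: the gazette text is taken as already extracted from the source PDFs, and the small in-memory store and search function merely stand in for Querido Diario’s real scraping pipeline and public API, whose actual interfaces are not shown here.

    import re
    from dataclasses import dataclass

    @dataclass
    class Gazette:
        city: str
        date: str
        text: str  # assumed already extracted from the source PDF

    GAZETTES: list[Gazette] = []  # stands in for the project's real datastore

    def publish(gazette: Gazette) -> None:
        GAZETTES.append(gazette)

    def search(term: str) -> list[dict]:
        """Friendly search: return matching excerpts, as an API endpoint might."""
        pattern = re.compile(re.escape(term), re.IGNORECASE)
        results = []
        for g in GAZETTES:
            for match in pattern.finditer(g.text):
                start = max(match.start() - 40, 0)
                excerpt = g.text[start:match.end() + 40]
                results.append({"city": g.city, "date": g.date, "excerpt": excerpt})
        return results

    # Usage: publish one (made-up) gazette, then query it.
    publish(Gazette("Sao Paulo", "2023-05-24", "...decree on environmental education policy..."))
    print(search("environmental"))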

Sumi Somaskanda (04:26:49):

Marie Santini of NetLab at the Federal University of Rio de Janeiro shares her insights on the solutions found in the project.

Marie Santini (04:27:00):

Some of the main problems of the Global South are the low trust society has in institutions, in governments, in formal social relationships. And it’s because people don’t rely on strong ties, because they don’t have trust in each other, and they are insecure about the information environment because of these weak institutions. And one of the reasons for that is that the population doesn’t have access to quality data and quality information. What I mean by quality data is complete data, accessible and searchable. And it creates a situation of vulnerability to information pollution, manipulation, and disinformation that [inaudible 04:27:44] now our democracy.

(04:27:46)
So of course, information pollution, we know it has no borders and affects society worldwide. But its impacts can be even more devastating in the Global South, in countries with weak government systems, conflicts, and crisis situations. So that’s why this kind of initiative stimulating open source technologies and solutions is so important for the Global South.

(04:28:21)
So data has become a resource as essential as oil in our historical moment. And big companies normally use the Global South for data extraction. At the same time, the situation of the Global South is characterized by inequality and financially weak states, and it’s essential for us to democratize data and information to better people’s lives and to protect our democracies too. So the Global South faces unique challenges in combating information pollution: the lack of transparency we have, the weak governance systems. So free data, collaborative technologies, and innovative open source solutions are essential and fundamental for our countries.

(04:29:20)
It’s very important to promote transparency, collaboration, and easy adoption and adaptation to different contexts. For example, Brazil is a very big country with different situations in different states. So this program mounted by the Digital Public Goods Alliance and the United Nations Development Programme, a global call for open source solutions addressing information pollution, is fundamental to identify initiatives that already exist in the Global South and strengthen them into a global network of solutions, stimulating the initiatives that already exist. But it can also inspire other programs like this, to stimulate and finance researchers, policy makers, and civil society to create new and innovative solutions, inventions, and concepts to combat the desert of data, the desert of information, and the disinformation problem that we have now in the Global South.

Sumi Somaskanda (04:30:29):

And check out our stream tomorrow at 1:35 PM for more from the Digital Public Goods Alliance. Make sure to tune in then. We’re going to speak now about some global examples of disinformation, and the very real-life impact that they can have. And we’ll take the examples of Brazil and India.

(04:30:45)
And I have two guests here with me in the digital studio who will help us look at those issues. On the left here, Flora Rebello Arduini is a business and international human rights law expert. And she focuses her work on the spread of disinformation on social media and the impacts on society at large. And Rana Ayyub is an Indian investigative journalist and global opinions writer at the Washington Post. And she’s worked as a reporter, editor, columnist with some of the leading publications in India and internationally. Welcome to both of you.

(04:31:16)
You’re going to be on stage in about 90 minutes, so I get the chance to talk to you first about your insights. How has disinformation affected you personally in the way that you work? And we can start with you, Rana.

Rana Ayyub (04:31:27):

Well, I mean, the way I work, especially journalists in India, like journalists around the world, we are the new enemies of the state, right? I mean, India is right now in the 168th position in the World Press Freedom Index. And how does that kind of narrative take hold? Because of disinformation about journalists, that journalists like us are somehow bringing disrepute to India’s image internationally. In the absence of a mainstream media that tells the news, that should tell the news, when your mainstream media is captured, there are very few options left. A lot of Indians, for that matter, consume

Rana Ayyub (04:32:00):

… their news on WhatsApp forwards, on Twitter, on Instagram. And these are some of the biggest platforms that are disseminating fake news. Me personally, if you look me up on Google at this point in time, you’ll start with fake news about me before the real news that I’ve covered over the last 15 years, amid the allegations against me. So I think fake news in a way has been weaponized by the state to discredit those who write about the country, especially in a democracy like mine.

Sumi Somaskanda (04:32:28):

How does that affect you personally when you see that? When you Google yourself once in a while, perhaps, and see information that is not true?

Rana Ayyub (04:32:34):

If I’m on my phone, if I’m on Chrome and by mistake it goes to Google, I can tell you it’s triggering, because I do not want to Google myself, and that wasn’t the case two or three years ago. I find doing a Google check on myself triggering, or on my family members, because the same is being done to my family. It’s just that you don’t want to go down that road.

Sumi Somaskanda (04:32:53):

How about for you, Flora?

Flora Rebello Arduini (04:32:55):

Yes. Looking at Brazil, if you look right now at how disinformation has disrupted trust, especially in institutions and media, this is a huge issue. Part of my work is doing research and investigations to try to map out the networks that spread disinformation campaigns, and unfortunately, we see elected officials spreading disinformation as part of their business as usual. Now we have a new government that is really trying to tackle disinformation in a more systemic way.

(04:33:25)
But personally for me, it’s the level of polarization that has happened in Brazil since 2018, the first time that social media was weaponized to actually favor one candidate over the other. Families stopped talking, friends stopped talking, and this is still being felt today. Unfortunately, there were a few people that I loved very much, but we just couldn’t connect anymore because of disinformation.

(04:33:49)
And very similarly to India, Brazil is a country where 9 out of 10 people have WhatsApp. People consume their news on WhatsApp and Instagram and TikTok now, so it is a big issue that we have social media platforms that monetize this kind of content as well. Because the more emotionally triggering something is, the more likely you are to click, the more likely you are to share; it’s just psychology. So these are actually very similar experiences to those of the Indian population as well.

Sumi Somaskanda (04:34:20):

How do you deal with that when mis or disinformation drives a wedge between you and your friends and your family?

Flora Rebello Arduini (04:34:27):

It’s hard. It’s not always easy to just keep calm and be serene and find the correct information and share it. But what I try to do is find the fact check of that information, if it’s available, and then share it with the people that I know are consuming that news. But it’s almost as if you are fighting against a whole tsunami of infodemics. That’s why journalism is so crucial in this fight, because we need accurate facts so we can then find those facts and share them around.

(04:34:56)
But we also need to change how our relationship with the platforms, with social media, works, so they don’t prioritize this kind of content. Not just disinformation, but also hate speech and other kinds of violent content, because they are more clickable. And being more clickable, you engage more with the platform, more data is collected, and that’s the business.

Sumi Somaskanda (04:35:18):

How about yourself, Rana? How do you deal with it?

Rana Ayyub (04:35:20):

There’s this screenshot that does the rounds every second day, which is basically a morphed thing that says, “I hate India and I hate Indians.” As many times as it is fact-checked, some relative or the other, some friend or the other, keeps sending it and saying, “Why are you so ungrateful to India?” But the thing is, there can be fact-checking websites to fact-check your work, but it’s a very niche audience that consumes fact-checked stories. At the end of the day, propaganda and fake news travel faster than real news. The reason why a lot of people have this opinion about me in India, that I’m trying to disgrace India, is because of the kind of fake news that is spread about me. My image was morphed into a porn video, circulated all over the country.

(04:35:59)
The number of times my image has been morphed with Islamic leaders, jihadist leaders all over the world, to call me a jihadist propagandist; the statements attributed to me that I support child rapists in the name of Islam. The number of times I have debunked that bit, that I’ve never said this, but it just goes viral. And my brother’s editor, he’s a well-known publisher, reached out to him, saying, “How could your sister say that?” And he said, “It’s fake news, can’t you see it?”

(04:36:24)
But the thing is, a lot of times when people forward it to you, saying, “How could you say that,” they don’t even look at the screenshot completely. That’s the kind of generation we are living in, instant gratification. We want to consume everything that comes on our WhatsApp, and this is happening in democracies: the world’s largest democracy, India, a country of 1.4 billion people with the kind of digital penetration that we see. The more the digital penetration, the more fake news spreads in India. Of course, it’s a huge, huge pandemic, so to say.

Sumi Somaskanda (04:36:57):

How do you take that, what you called the niche audience for fact checked information, how do you take that and expand it, make people more receptive to fact checked information?

Rana Ayyub (04:37:07):

In the times that we are living in, in India we have a couple of fact-checking websites which themselves are being demonized now, with claims that they fact-check only a specific agenda or organization. But it is important now; I’m so glad we have publications like Alt News and Boom Live and other publications that are doing fact-checking. I think it is important for us to amplify those stories, but there are only so many people doing that at the end of the day. Fact-checking websites do not have the kind of audience we need. I mean, for instance, the Indian government is bringing in a new law under which the Indian government will now decide what is fake news.

(04:37:42)
You literally become the arbiter of what is real news and what is fake news when your members of parliament and your spokespersons are disseminating fake news. I mean, I’ll give you a small example. Because we are at the Nobel event: somebody from the Nobel Committee was in India, Mr. [inaudible 04:38:00], and he was in India. He never gave a statement that Mr. Modi is a front-runner for the Nobel Peace Prize. Somebody attributed a quote to him, that quote was run by almost every news channel in India, it was tweeted by top leaders in India. By the time it was debunked by the fact-checking platforms, it had become a reality for the rest of India that he’s the front-runner for the Nobel Peace Prize. That’s how deep the problem is.

Sumi Somaskanda (04:38:25):

Yeah, just a few days ago here in the US, I’ll add, there was a fake photo of the Pentagon on fire that caused the stock market to drop 10 points, I believe. That’s something that we are seeing in various countries and democracies, as you said. Flora, you mentioned the Brazilian government. What initiatives are you working on with the Brazilian government to combat disinformation?

Flora Rebello Arduini (04:38:46):

That’s actually a great question. Brazil right now has a unique opportunity to pass legislation, because we had the attacks in Brasília, which were pretty much the same as what happened in the US on January 6th, 2021 at Capitol Hill. It was almost copy and paste, and it happened in Brazil just this year. So we had that happening; there was a trigger. It was horrific, having an invasion of official institutions in the country, but the silver lining is that it triggered something in the government: a determination to combat disinformation and to pass legislation that will regulate the industry, because it’s not just disinformation.

(04:39:23)
Disinformation is one of the results of an entire system, an entire business model, that prevails over truth. Right now, there are two main bills being discussed in Congress in Brazil. There is one that is called the Fake News Bill, which is a catchy name, but it’s meant to really regulate the whole industry. The second one is the AI Act. Right now we are trying to work on both of them, but prioritizing the Fake News Bill, which is supposed to be voted on any time now, any moment.

(04:39:56)
And that bill specifically took a lot of inspiration from what happened in Europe with the DSA, the Digital Services Act, which is a milestone piece of legislation that can and should be used as an inspiration for other countries. That’s what the Brazilian government is doing right now. And we really have to fight off the big tech lobby, which is not playing very nicely right now in the country, using illegal tactics to stop and halt the passage of that legislation. So it’s very important that we keep pushing as civil society to make sure that it passes strongly.

Sumi Somaskanda (04:40:31):

We’re at the Nobel Prize Summit and have the opportunity over three days to talk about mis- and disinformation. What are your hopes? And Rana, we’ll start with you: your hopes and expectations for what can come out of the summit that can maybe be carried over to the work that you do or the examples that you gave?

Rana Ayyub (04:40:46):

One is the fact that we have got a platform to speak here, it’s huge… The Nobel Peace Prize is prestigious. Whatever is said here will be taken very seriously by everybody. The fact that there is an entire summit focused on misinformation and disinformation, one, speaks volumes about the problem. This is a serious problem. And the fact that we are sitting here, trying to come to… I mean, I don’t think we can all, in the next three days, come up with a quick-fix solution. We need to have dialogues and arguments, and I think that’s a starting point for where we are. We are speaking to civil society, to people who are engaged in legal cases, to understand what the problem is. You can only fix a problem when you understand what the problem is, right?

(04:41:31)
For the longest time, Facebook and Twitter and all these platforms that have become enablers of fake news and misinformation refused to acknowledge that there is a problem. But now we have oversight boards that are saying, “Okay, now we have a problem, we need to fix it.” So we are at a stage where we are now acknowledging the problem. We are a long way from fixing it, but on a platform like this, I believe we can come to solutions, we can learn from each other, and we can go back with a sense of optimism when there’s so much pessimism around it.

Sumi Somaskanda (04:42:02):

And Flora, your hopes, expectations?

Flora Rebello Arduini (04:42:04):

Yes, it’s the same. I mean, having this room and having this platform to speak about the problem is unprecedented. And it speaks to the necessity that we really need to step up as societies, as democracies, to make sure that we address this issue from an international angle as well. Putting together brilliant minds, people fighting on the front lines, journalists, civil society, academics. Everyone deserves to be put in the same room to work towards a solution. Unfortunately, there is no single silver bullet, to put it this way, no one solution for the problem. That’s why it’s so important that we look at and navigate through different angles to make sure that we address it properly.

Sumi Somaskanda (04:42:44):

Well, really looking forward to seeing both of you interacting with the summit, taking part in the summit, also on stage of course, this afternoon. Flora and Rana, thank you very much for sharing your personal stories and also your solutions to this problem. Great to have you in conversation with each other, thank you.

Rana Ayyub (04:42:59):

Likewise, thank you so much, Sumi for having us. Thank you.

Flora Rebello Arduini (04:43:01):

Thank you, Sumi.

Sumi Somaskanda (04:43:01):

Okay, during the afternoon, you’re going to hear from plenty more speakers on the main stage who will be bringing you thoughts and insights, all speaking in the Covley Auditorium, including Nobel Prize laureate Martin Chalfie, Melissa Fleming, Under-Secretary-General at the United Nations, and many more. We will be back here in the studio with a complete review of the afternoon at 5:15 PM, and we’re going to take a very short break, but we’ll be back in the Covley Auditorium live at 2:00 PM, so don’t go away.

Kelly Stoetzel (04:45:53):

Hey everybody, welcome back from lunch. Sorry about the cowbell, I heard that was a little loud out there, but thank you so much for rallying quickly around it. It was fun out there. I got to meet a lot of new people. Who here made a new friend? Oh, yay, look at that. Is anybody sitting by their new friend? Oh, that’s really… a lot of people raised their hands. I can see here, look, there’s a whole group of friendships right here.

Peter McIndoe (04:46:21):

Birds aren’t real! Birds aren’t real!

Kelly Stoetzel (04:46:23):

Wait, wait, what’s happening here? Do y’all have a leader? Is there a leader? Do you have a leader?

Peter McIndoe (04:46:36):

The birds aren’t real.

Kelly Stoetzel (04:46:37):

Wait, listen, can you come up here? Kim, can you come up here? I feel like, okay, can we get a handheld mic? I think in the interest of truth and trust, let’s hear these guys out. All right.

Peter McIndoe (04:46:55):

Okay, thank you for letting us speak.

Kelly Stoetzel (04:46:59):

What’s this about?

Peter McIndoe (04:47:01):

My name is Peter McIndoe. These are my fellow bird truthers, we’re part of a movement called Birds Aren’t Real. I figured there were a lot of high-IQ people here, so it makes sense that you would know who we are.

Kelly Stoetzel (04:47:16):

What is it?

Peter McIndoe (04:47:17):

We are a movement dedicated to waking up the people, to let them know that every bird is a robot. The government, from 1959 through 2001, the media won’t tell you this, killed every bird in the sky using poisonous toxins dropped from airplanes. With each bird that fell, a drone rose. And now we live in a world where 12 billion birds are surveilling us every second. Is that okay with you?

Kelly Stoetzel (04:47:46):

Wait, wait a minute. Do you have proof of this?

Peter McIndoe (04:47:49):

Oh, we have a lot of proof. Yeah. I mean, the proof is all around us, the proof is all around us. Say the proof is all around us.

Speaker 18 (04:47:55):

The proof is all around us.

Peter McIndoe (04:47:58):

The proof is all around us. For instance, birds have batteries, drones have batteries. You ever wonder how birds charge their batteries? You ever wonder why birds sit on power lines? Open your eyes. I mean, it all starts to unravel when you start looking at it this way. For instance, have you ever seen a baby pigeon? You haven’t, have you? Yet, there are all these adult pigeons walking around. Where are all the babies? They come out of the factory as adults, folks, factory fresh. No organic growth, smoking gun.

Kelly Stoetzel (04:48:37):

Okay, how did you hear about this? How are you communicating it? Tell us about that.

Peter McIndoe (04:48:43):

Oh, I have done a lot of independent research and have seen evidence that I’m sure you all would like to see. It’s on birdsarentreal.com, the only media site that I trust. I also populate all the info on it. And I’ve been spreading this information for the past few years. We hold up signs, we come, we crash media-funded events like this, and we have different slogans to awaken the people, such as, “If it flies, it spies.” Or, “Bird watching goes both ways.”

Kelly Stoetzel (04:49:26):

Okay, so now we’ve met the bird truther you. Let’s meet the real you.

Peter McIndoe (04:49:31):

Okay.

Kelly Stoetzel (04:49:37):

Peter McIndoe.

Peter McIndoe (04:49:41):

Okay, let’s start this talk over. Hi, I’m Peter. Can you say, hi, Peter?

Speaker 18 (04:49:45):

Hi, Peter.

Peter McIndoe (04:49:47):

Hey guys. Let me explain to you what is going on right now. My name is Peter McIndoe and I’m the founder of a satirical conspiracy theory movement called Birds Aren’t Real. And for the past few years I’ve been traveling around in this van, deeply in character, going on the news, holding rallies with hundreds of people also in on the bit, and running social media accounts with millions of followers dedicated to spreading the truth about the important matters in this country. Like how when birds poop on your car, it’s a liquid tracking device. How do you think they’re tracking your vehicles? This project was really a performance art social experiment to see how the media and random people react to conspiracy theorists, online, behind the scenes, and in public. Let me tell you a little bit about my background. I grew up in Arkansas, in Little Rock.

(04:50:53)
I grew up homeschooled in a rural community about an hour outside of the city. Pretty much everyone that I knew believed in some form of conspiracy theory, whether it was that Obama was the anti-Christ, literally, or that there were microchips in the vaccines. This is normal stuff to believe in where I’m from. So when it came time to play the character of a conspiracy theorist going around trying to convince the media that birds aren’t real, that there’s a real movement around this belief, I had a lot of inspiration. Over the past three years, I’ve really played the character that I grew up around, using the same logic, arguments, and cadence, just with a different theory swapped in. Now, over this time, I learned a lot about how people react to conspiracy theorists. I spent a lot of time in this van actually, traveling around, meeting up with our supporters. We have a boots-on-the-ground activism network called the Bird Brigade that does what I do. Some members were actually here with us. And when we’re out in public like this, people don’t even assume that it’s a joke. They see the van, they see me in character, and, I think a testament to the times, they assume that it’s all for real. They’ll often approach me, complete strangers walking up to me saying, “You’re crazy. You’re the problem with this country.”

(04:52:27)
They’ll look at me right in the face, as close as I am to you right now and they’ll say, “You’re uneducated. You have mental problems, you are insane.” And I was so deep in the mind of this character, method acting from the people that I grew up around, taking real inspiration from them, that my emotions that I felt when they would respond to me this way weren’t what I thought they would be. I thought that I would laugh at this funny reaction to my social experiment, but instead, I felt the emotions of the character. I felt hurt, truly. I felt emboldened, I felt othered.

(04:53:12)
And in those moments, these strangers that were ridiculing me could not have been more ineffective at achieving what I would assume they truly want, fewer conspiracy theorists in the world. This social experiment, more so than just finding out how easy it is to trick the media, how simple it is to put ideas into society and have people believe them, taught me something more important, which was that inside the mind of a conspiracy theorist, it had less to do with belief and more to do with belonging. There are people behind these screens. There’s a lot of talk about misinformation and numbers. And while a lot of us cannot control the swarms of misinformation online, the algorithms, and we can’t control censorship, we’re dealing with a lot of new problems that no one’s ever dealt with before.

(04:54:05)
But what we can control is how we respond to these people and how we shift the conversation. I think we need to develop new solutions for new problems. I think that we need to start the conversation at a different place, I think we’ve been starting it at the wrong place. By starting the conversation at truth and not at what is fueling the need for that alternate truth, we’re forgetting that people are receiving real rewards from joining conspiracy theories. They’re not just crazy people joining this for no reason, they’re getting identity in return. They’re getting a sense of purpose, a sense of belonging, something everyone in this room is also looking for. If we can invite, rather than condemn with healthy boundaries, if we can show people that the other side welcomes them in, maybe then we’ll be closer to the shared reality that we all want. Thank you.

Kelly Stoetzel (04:55:05):

Oh, thank you, Peter. Okay, now it’s time for the dynamic dialogue section of programming. Here’s how this is going to go. We have three speakers who are going to join us on stage, and they’re going to come up one at a time and give very short talks. Each one of them will be on a subject that is relevant to misinformation, and I’ll introduce them one by one once they get ready to speak. You’ll also notice that we’ve got four chairs up here. These are for our amazing provocateurs, and each one of them is remarkable and brilliant in their own right, and we are so lucky and grateful to have them here. They’re going to be up on stage during the next three talks, and then they’re going to ask the speakers a couple of questions at the end of their talks. They’re going to kind of serve as a proxy for all the people, all of us here, to ask questions.

(04:56:07)
They’ll just get speakers to dig a little bit deeper on what they’ve just talked about. I guess, let’s first bring up the provocateurs. We have Leslie Brooks, a veterinarian and American Association for the Advancement of Science science and technology policy fellow. We have Roger Carruth, a communications professor at Howard University with a whole lot of degrees and awards. We have DeRay Mckesson, a leading voice as an advocate and activist on issues that impact people of color. And we have Sabrina McCormick, she’s a public health professor, a filmmaker, and a storyteller dedicated to addressing the climate crisis through her work. Thank you, provocateurs.

(04:56:49)
Wait, and there’s another part to this. Since there’s so much going on in this segment, we have to be really vigilant about keeping it on time. I’m going to stay up here, kind of stand back here, to keep the Q&A part moving along. But all of the speakers have also been given a really strict warning that if they go over their time, there will be consequences. So ladies and gentlemen, let me introduce you to consequences. This is Nancy. She is the communications director for the Division on Earth and Life Studies here at the NAS, and she’s also going to be our speaker timer for this segment. So what I would like to do is actually talk for a minute as if I’m a speaker and I’ve just hit my time, and Nancy’s going to show us what that might look like if I just kept talking, and maybe I’ll talk louder now.

Nancy (04:57:50):

I hate to burst your balloon, but we’re taking your mic away.

Kelly Stoetzel (04:57:57):

Oh my gosh. Thank you so much, Nancy. That was absolutely beautiful. And for the sake of our speakers, I hope that it is the only time that we get to hear that this afternoon. Now it’s time for our first speaker. He is a biologist, chair of the Committee on Human Rights at the academies, and the recipient of the 2008 Nobel Prize in Chemistry for his introduction of green fluorescent protein as a biological marker. Let’s welcome Marty Chalfie.

Martin Chalfie (04:58:27):

Hello. We all know about the wonderful promise of science, but I want to talk a little bit at first about some of the downsides, some of the problems that scientists have faced, and many of these have been things that we’ve noticed from the Committee on Human Rights, which works on behalf of social and natural scientists, engineers and health professionals all over the world. And so we’ve already heard a lot of information about misinformation and about disinformation. I want to add a couple of other aspects to this. One of them is the flooding of information, what’s been sometimes called the infodemic, that if there’s so much noise out there, it’s hard to find out where the right information is. And so that flooding of information is an important aspect. The other unfortunate situation is that there are a lot of attacks, not on the science, but on the scientists and the health professionals.

(04:59:47)
And we’ve seen a lot of examples of blame and attacks where, especially if governments don’t like what is being said, they will arrange for people to be dismissed from their jobs or harassed or threatened, sometimes with death threats, and imprisoned, and we’ve seen this really all over the world. We’ve seen a lot of this in terms of COVID-19, where governments, many times to cover up their ineptitude at protecting their citizens, would rather blame the health professionals and deny that anything was taking place. We’ve seen this, for example, in Nicaragua, where several scientists wrote a paper talking about the ecological and social impact of building a canal through the country, which was a top priority for the government. The government didn’t like this. They were dismissed from their jobs, they were harassed. Eventually, the Nicaraguan Academy of Sciences even had its legal charter taken away.

(05:01:05)
We’ve seen this in other respects, in Greece. In order for Greece to be able to get European Union funding, they had to change and have an independent statistical office to present the correct economic data. They did this and then turned around and blamed the head of that office, whose numbers were accepted. And he had several lawsuits against him and even more severe charges, some of which, had they not been thrown out in 2019, would’ve resulted in his actually receiving a life sentence. And we’ve seen this also with health professionals who treat protestors that have been injured, where we’ve seen police turn on the medics because they were helping people who were injured, which is actually their job.

(05:02:09)
And so, what are the ways that we can sort of help solve these things? What are the solutions? Well, clearly, education is an important thing. Having laws that protect people is going to be important. I think it’s also important for people to join together. One of the things that the Committee on Human Rights has recently done is bring together various health professional organizations to, as a united group, come and support health professionals, especially in places of conflict where they’ve been attacked. One fact that you may or may not know: four health facilities are bombed or attacked every day in Ukraine.

(05:03:06)
Obviously, bringing people together helps with this. But what else can we do? There was a wonderful book by Joel Simon and Robert Mahoney, called Infodemic, talking about the consequences of what happened in terms of the COVID-19 pandemic. And one of the things they said is that one of the things that exacerbated it was the fact that local news organizations were not there. And there are several examples that I know of, of local news organizations at various times. For example, in Senegal, the health authorities worked with local community radio stations to get information out about HIV and AIDS, and Senegal had the lowest rate of AIDS of any West African country.

(05:03:55)
There’s a group in New York, called Documented, that has done very similar things to bring information to the Spanish-speaking community and now the Chinese-speaking community, both underserved communities, and gain their trust so that they could, in fact, get appropriate health information. I think what this says is that we have to go and partner with local groups to first gain their trust by listening and helping, and use that to aid ourselves in the future. And my last point is that there is a right to science in the Universal Declaration of Human Rights. We should be bringing science to everybody, thank you.

Kelly Stoetzel (05:04:43):

Thank you, Marty. And you stuck that landing.

Martin Chalfie (05:04:51):

Yeah, no song.

Kelly Stoetzel (05:04:57):

Provocateurs, all right, Leslie.

Leslie Brooks (05:04:59):

I’ll ask a question. I’m curious, to go deeper into what you talked about: what are your thoughts when there are scientists and scientific communities with divergent views, or scientific studies with different reported outcomes? How do you think that might contribute to the issues that you were discussing, and how do we go about that?

Martin Chalfie (05:05:25):

Part of what we do as scientists is to work through controversy, to openly discuss and evaluate and compare various ideas to others. We don’t try to get unanimity, we try to get the best way that we can evaluate a situation. So when there is a difference… Enrico Fermi has a wonderful quote that I’m going to mangle, but it’s basically,

Martin Chalfie (05:06:00):

If you do an experiment and it confirms your hypothesis, you’ve made a measurement, but if it doesn’t confirm your hypothesis, then you’ve made a discovery. And there are a lot more discoveries out there to look at. So controversy is not the problem, because we want to get at the underpinning of it, and that’s very important.

Kelly Stoetzel (05:06:27):

Thank you.

Speaker 19 (05:06:27):

I’m next?

Kelly Stoetzel (05:06:27):

Yeah, if you’d like to be.

Speaker 19 (05:06:33):

I agree with everything that you said. I’m not a scientist, I’m a communications person, and the idea of the demise of local media is very prevalent in this day and age, particularly with social media. And we in the media community, and the press in particular, are theoretically the fourth pillar of government, and I wanted to ask about that relationship, because you have the executive, the judicial, and the legislative, and then the press; the freedom of the press is a big part of that. And pointing to these examples, do you have a way in which the communications community could work better with the science community to articulate some of the issues that you’re talking about?

Martin Chalfie (05:07:12):

So I think this is very important. It was one of the lines that I didn’t actually get a chance to say in my talk, but I think what this also says, the experience with Documented and the experience that I mentioned in Senegal, is that the freedom of an independent press is absolutely essential to this, that we have to be able to have people who will investigate their local communities and communicate with their local communities. It’s really a two-way street, and we should be part of that communication. Absolutely agree.

Kelly Stoetzel (05:07:43):

Thank you. Yes, Sabrina.

Sabrina (05:07:46):

So, thank you. And I really agree with what you just said, that media and transparent media and the press, the freedom of the press, is critically important. And also, I think something that we’ve heard today, and we all experience, is the echo chambers that we are in watching our particular media channel and other people with very divergent perspectives watching, consuming something else. So I wonder what you think about entertainment as a way to transcend those echo chambers, potentially, even in creating accountability and transparency around some of these persecutions that you’ve seen and that, certainly, you advocate to avoid.

Martin Chalfie (05:08:25):

I think anything that makes information available to people helps. If it’s done in an entertaining way, if it’s done in a way that’s provocative, if it’s done in a way that makes people think, that helps. And you probably know that the national academies have a very strong tradition of working with media to get the science right, but also to not ignore the science and to use the science as a vehicle for talking about many different things and many different attitudes, so I think it’s very important.

Speaker 20 (05:09:04):

My question, and this is a push to understand, not to challenge, I’m trying to understand the relationship between the infodemic or the flooding of information that you started with and then the solutions that you offer at the end and trying to figure out, are you saying that you think a free press and a healthy scientific community addresses the flooding piece, or can you help us better understand what you think the response is to flooding?

Martin Chalfie (05:09:28):

That’s a wonderful question, and I don’t have answers to everything. Just like in my science, I don’t have answers to anything. And I agree with you, that’s one of the really important pieces. If there are bad actors that want to flood the area with things, it’s very hard to work with them. That’s why I think there has to be protection from people, that we have to be able… I think one of the examples of what the people at Documented did is they went, first, into the Spanish-speaking communities in New York City and they first just listened. They learned about the backgrounds, where people came from, what were their interests, what were their concerns. Then, among other things, they found, at the beginning of the pandemic, that there were several opportunities for micro grants for essential workers, and they compiled the list so people could actually get at this material, and they presented that to the community so that that community would be able to avail itself of that.

(05:10:38)
Once they had done that and other things to gain trust by their direct interaction and listening, what they found is that people from the community came to them and said, “We’ve heard this about COVID-19,” or about the vaccine, “Can you find us the information that we really need about this?” So they built up trust first and then utilized it. But if somebody floods things, frankly, that’s a real problem, and I’m not sure I have an immediate answer to that.

Leslie Brooks (05:11:20):

Can I ask one more?

Kelly Stoetzel (05:11:21):

Yeah.

Leslie Brooks (05:11:22):

I just want you to talk more about listening, because between this and the fun segment before, it’s the first time we started talking about that. Anything you have to say… I don’t know if you have any experience where you have listened to people who maybe disagreed or didn’t want to understand your science, and then just by listening to them, it helped them come around.

Martin Chalfie (05:11:42):

So, sometimes people think about… This may be an indirect way of answering this, but sometimes when people think about science, they think it’s so competitive that you shouldn’t talk to people about your work until you’re just about ready to publish, or things like that. I have found that every single time I’ve opened my mouth and told people about something I’m excited about, in return, I get so many ideas that I never thought of and things that have helped my science, simply by being able to talk about it. So it’s not that you shouldn’t keep some things secret, but the open communication really helps. It helps in a lab to sort of get rid of the hierarchy, or apparent hierarchy, and have everyone realize that they are actually all collaborators, and when that works, it’s absolutely wonderful. So yes, this has happened several times in the science, but I don’t think you want to hear about all the nitty-gritty of the details right now.

Kelly Stoetzel (05:12:50):

Thank you so much, Marty.

Martin Chalfie (05:12:52):

Thank you all.

Kelly Stoetzel (05:12:55):

And thank you to our provocateurs. Y’all can just toss in the questions as we go too. I don’t need to call on you, you’ve just got this conversation rolling. And we are going to bring out our next speaker, who is an astrophysicist who cares deeply about science, education, and civic engagement. Let’s welcome Anita Krishnamurthy.

Anita Krishnamurthy (05:13:23):

Well, hi everyone. It’s such a pleasure to be here. Gosh, that light’s bright. So to be honest, I had to think for a long time about what I, a research astronomer turned advocate for STEM education, and that’s science, technology, engineering, and mathematics for those who don’t speak that jargon, had to say about this weighty topic of truth, trust, and hope. And then it hit me. Education is an act of hope, right? We educate ourselves hoping that we’re gaining the skills and knowledge we need to earn a living. We educate ourselves hoping that it will help us understand and better engage with the world around us, and that it will help us apply that knowledge in practical ways, both in our own lives and to help the communities around us. And we have the same hope for the young people that we educate. We hope that we’re equipping them to move through life with confidence and dignity, and we have a belief that education will turn us into critical thinkers who can hold pluralistic views and we hope that it will inoculate us against misinformation and disinformation, and that’s a hope.

(05:14:34)
So how do we turn that hope into reality? How do we realize this potential of education? Because as speaker after speaker this morning has said, and as Marty just said, just amassing a body of knowledge is not sufficient. It’s crucial, but it’s insufficient. We have to learn how to apply that knowledge in different contexts, we have to see how it intersects with culture and society and values, and understand that those might clash sometimes, and that we need to be prepared to continually unlearn and learn new things. And this ability to develop, sustain, and nurture curiosity throughout our lives, so that we can respond rather than react to challenge when we face it, is a muscle that has to be developed through practice and strengthened through practice. So how can a learning ecosystem develop this? Because it has to start when we are young and be a lifelong effort.

(05:15:39)
So what is reasonable to expect out of our education system, whether that’s happening in school, or in the afterschool setting where I work, or in other out-of-school-time programs like in museums and science centers? How can we all come together to develop these muscles, and what would we call success? Because our metrics and performance indicators, so far at least, are kind of blunt cudgels, right? They’re not very nuanced, and for the most part, as the saying goes, we value what we can measure because we can’t always easily measure what we really want to value. So how do we change that paradigm and value education as something that supports all young people to thrive, not just to succeed academically and join the workforce, but also to be equipped to become informed citizens who are discerning and can engage in a participatory democracy, and be advocates for themselves, and be change makers when needed?

(05:16:39)
There are examples of amazing organizations that are doing this kind of work and sometimes entire countries. Finland, for example, is quite legendary for its education outcomes as well as its fight against fake news. I serve on the board of Nobel Prize Outreach that has a whole curriculum around critical thinking that we will be launching shortly, and there are many other programs that are doing this kind of work all over the world and in the United States. And it’s tempting to think that we just need to pick the one good program and scale it up because that’s the solution, but we all know, and we’ve heard this again and again, local context matters and local culture matters. So things really have to be adapted to a local community because what works in Finland is unlikely to work here, and what works in an urban environment is unlikely to work in a rural environment.

(05:17:28)
And as I said, I mainly work in the afterschool and out-of-school-time spaces, which are rooted in local communities with local educators and youth, and I recognize that the STEM conversation has become quite transactional even in this space. You know, you study STEM fields so that you can get a job, you offer STEM learning because you can then participate in a workforce. And don’t get me wrong, jobs are important, we all need to earn a living, but it’s much more than that. The STEM narrative has to include STEM knowledge and ways of thinking as a tool for public good and citizenship. So with that in mind, I’ve launched a new initiative called the Collective for Youth Empowerment in STEM and Society, or CS. It’s a project of the Afterschool Alliance, where I work, and we’re trying to bring together people and organizations who are working at this intersection of STEM, civic engagement, and teen leadership, so that we can really give young people a voice and a seat at the table from the very beginning.

(05:18:32)
And by forming a community of these programs, we hope to better support and empower young people to develop and apply that STEM knowledge in local contexts and to enact systemic change. There’s some really amazing programs out there doing this work, and I’m really idealistic and believe that this expanded narrative will allow more young people to see themselves in it, allow more people to engage with science, and hopefully expand a sense of belonging for some of the minoritized groups who are often left out of science. So my hope is pinned on both the young people and the adults who are working with them to empower them, to support them, and most importantly, to act alongside them. And we hope that this approach will also enable everyone engaged in this effort to become more critical consumers of information because when something real is at stake that affects us and the communities we care about very deeply, we want to trust that we’re acting on unbiased information and we will want to check for that.

(05:19:37)
And this work, as I see it, is not optional, it’s really necessary and it fills me with hope that so many of us are trying to engage in this work, that includes me and my colleagues and all of you by participating in this conversation, and I hope you’ll join us in this work. So thank you.

Kelly Stoetzel (05:19:55):

Thank you, Anita. Provocateurs, I’ll let you guys take it away.

Sabrina (05:20:05):

I’ll go. So thank you so much, and I’m very excited to hear about your new collective, your initiative. I’m especially excited to hear about it because I work in the climate space, and obviously we all know about this incredible climate anxiety that youth have now, and what we see in our research is that something that stems that anxiety is action.

Anita Krishnamurthy (05:20:26):

Absolutely.

Sabrina (05:20:26):

Actually engaging with political systems, social systems, friends, neighbors, whomever. So I would love to hear how you’re thinking about that, if there’s a way you see that your program will address that or even just move in that direction.

Anita Krishnamurthy (05:20:39):

Absolutely. So as I said, we’re not trying to launch new programs, we’re trying to support existing programs that are doing this kind of work so that we can expand and amplify the work that’s being done because there’s amazing work that’s happening in a lot of isolated pockets and there isn’t a community of practice that brings people together so that they can learn from each other, share their ideas and expertise and experiences so that we can build on it and sort of define what good looks like and what people need to be able to support each other. Because as I said, it’s not about the one perfect program that’s going to solve this problem for all of us, there’s lots of local communities acting in ways to sort of empower themselves and to tackle challenges in their own communities.

(05:21:26)
So it’s, how do we support them to do more of that, to expand that, to improve it if needed? How can they learn from others in that space? And climate, absolutely, is one of those. The other one is actually data. We see a lot of people working with GIS data sets and other data sets to map issues and challenges in their communities and how to support young people to then activate and advocate for solutions to those challenges.

Speaker 20 (05:21:56):

I’m curious. I used to teach sixth grade math and I was an afterschool provider as well. What are the components… Shout out to afterschool. What are the components of a good STEM program, you’d say? When you talk about, there are great programs across the country in isolated pockets, what is great to you?

Anita Krishnamurthy (05:22:14):

I think a lot of it is how it brings it alive for the young people in that community. Because the truth is that a lot of afterschool providers are not STEM experts, but they are excellent at working with young people. They are youth development experts. So I think some of it is, how do you use STEM as a tool for bringing about agency and confidence in young people? And that happens when you can really contextualize it to what’s happening in that community, so that STEM is seen as relevant to their lives. And the learning happens through that process, and I think there’s very much an element of the afterschool educator or the other community-based organization leaders almost learning alongside the young people. And I often find that being unafraid is a key characteristic. To be able to say, “I don’t know, let’s figure it out,” because that’s how science works.

(05:23:18)
We don’t know the answers all the time. We’re sort of trying to figure it out. So I think some of it is that, and the reality is there’s always a need for more resources, there’s a need for more partnerships. I think partnerships with other science rich organizations in the areas help. So there’s a lot of different components, and I’ll go back to saying there isn’t one single definition. It looks different in every community, but I think a commitment to offering that kind of programming and that sort of co-design, co-creation and co-learning are all hallmarks, I think, of that kind of work.

Leslie Brooks (05:23:58):

I really enjoyed your talk and your focus on context-specific situations and applying knowledge in different contexts. So I’m curious if you could speak to how one might also apply science in different situational contexts where the best scientific answer may not be able to be played out to get the best results because of cultural nuances on the ground, a variety of different situations, politics. We’re told all the time that it’s not always about the science when decisions are having to be made. So I’m wondering if you could speak a little to that, and how do we educate our up-and-coming STEM workforce so that we can have more impactful science in society without this political divisiveness, if you will?

Anita Krishnamurthy (05:24:48):

Yeah, I’m not sure that I can exactly answer the question that you’re asking. Let me try and you can bring me back on track if I go off in a different direction. So, I mean there are certain things… Like, scientific facts are scientific facts, but at the same time, they do intersect with culture and I think it’s this issue of, what is the end goal? What are we actually trying to communicate and where are we trying to get to? And I think it is that humility that we as scientists must have. And I think what Peter said earlier really resonated because I think what people are looking for is sort of validation and a sense of belonging, and I think those are very human needs and I think science is a very human endeavor.

(05:25:43)
So I think it’s, how do we actually bring that into play and into the equation so that there is some common ground that we can then start with so that the problem is phrased differently, framed differently, but we know where we want to go. It may not be the exact path we would’ve taken to get to that end point, and some of that just has to be a lot of almost radically human conversations with people in that community to understand their hopes and dreams and fears about what the information you might be trying to work on with them might be, but is there a different way to approach it? I’m not sure that answered your question, but that’s my answer.

Speaker 19 (05:26:29):

So I have more of a comment than a question. I would say education and hope got me up here on this stage. So for the communities that you’re trying to impact, do you have a recommendation or suggestion for those who may have hope but may feel things are hopeless?

Anita Krishnamurthy (05:26:48):

Yeah, that’s a tough one.

Speaker 19 (05:26:50):

Right.

Anita Krishnamurthy (05:26:51):

And I think that some of that… I work in a policy group here in Washington, and we work with a lot of state-level as well as city-level policy makers, and I think a lot of this is continually looking to see how we can bring programs and resources into those communities, how we can bring mentors into those communities, how we can continue to show what is possible. And sometimes there are other needs that need to be addressed. It’s not just about science literacy; poverty is a very real issue, and there are many other issues. So it’s, how do we work with the other actors who are trying to do their best to support that community, to fit in with the larger agenda, recognizing that the agenda just isn’t about science education, it’s about supporting the people in that community?

(05:27:57)
And I really mean this when I say it’s about supporting that community to thrive and the young people to thrive, and that goes beyond science. And science is a part of it. In this modern day and age, I think scientific literacy should be included in the definition of literacy, because it’s what helps us move through life. But again, it’s that conversation between what the community really needs and what you are bringing to the table. Because if you’re not even in the same boat, if you’re in different lakes, you’re not going to go anywhere very far, right?

Kelly Stoetzel (05:28:35):

Anita, thank you so much.

Anita Krishnamurthy (05:28:37):

Thank you.

Kelly Stoetzel (05:28:44):

So our final dynamic dialogue speaker is Hahrie Han. She is a political scientist who studies civic and political engagement and collective action. She’s here to share recent insights on how miscommunication impacts health messaging. Let’s welcome Hahrie Han.

Hahrie Han (05:29:12):

Hi. As a political scientist who studies democracy and social change, one of the things that I’ve been doing for the past few years is a lot of work with people who are fighting for racial justice within the context of evangelical megachurches in America. And a couple years ago, one of the pastors who I’d gotten to know through that work called me to ask me what I thought about vaccines. It was a couple years ago, at the height of the coronavirus pandemic, and he had been hearing from some of his other pastor friends that the vaccines weren’t safe, and he wanted to know what I thought. Now, he and I are really different from each other. I’m not evangelical, I am a social scientist, and I live on the East Coast. He is deeply a person of faith, he believes that the Bible is a better guide for understanding our world than the scientific method, and he lives in what you might call America’s heartland.

(05:29:56)
But we had gotten to know each other through this work that we had been doing together, and he wanted to know what I thought. So I assured him that I was quite positive that the vaccines were safe and effective. You know, and he laughed and he said, “Well, you would know because you’re a doctor at Johns Hopkins.” And I had to remind him I’m not that kind of a doctor. Which, by the way, that joke would not go over very well in a lot of audiences, but I appreciate your laughter so thank you for that. And we talked a little bit more and he said, “Well, can you send me some information that I can share with some of my friends?” And I said, “Sure, I’ll do that.” But then once we got off the phone, I thought about, what should I send him? And I wasn’t sure.

(05:30:32)
It made me step back and think about, how is it that we know what we know? And Elijah Millgram is a philosopher who writes about the fact that one of the characteristics of modern life is hyper specialization. So we all go through the world and we actually don’t know certain things, but we trust other people who tell us those things. So I don’t actually understand the biology behind how mRNA vaccines work, but I trust the scientists who do. But we’re so specialized that in a lot of cases, if I read a paper about the vaccines, I wouldn’t even know how to differentiate what’s a legitimate expert from a not legitimate expert. And so what I do instead is, I look for markers of trust. I might turn to friends and colleagues at Johns Hopkins or maybe look and see if the author is a member of the National Academy to try to figure out, is this a trustworthy paper or not?

(05:31:19)
And that pattern, what I do in my life, is the same as what a lot of people do: we find people who share our belief systems and then we ask them what they think. And that's why there's so much research showing that when you're talking to someone who believes disinformation, one of the worst things you can do is throw a lot of scientific information at them. It's much more effective to find someone they trust, who's already part of their circle of trust, and have that person have a conversation with them. If I'm a hippie mom in Berkeley who doesn't believe in vaccines, it's much more effective to have another mom come and talk to me about why she vaccinates her children than it is to send me papers that prove that vaccines work. And so the challenge, in our current moment, is to find those kinds of trusted messengers. But that's one of the hardest things to do right now. Why is that?

(05:32:08)
So the structure of our social spaces changed dramatically in recent decades. I think there are three changes that are relevant to identify here. First, a lot of us go through our lives right now having lots of social interactions, but building few social relationships. I might have an interaction with someone on Twitter, I might have an interaction with a barista at Starbucks, but we don’t have a relationship because what differentiates a relationship from an interaction is that both people have a shared expectation for a future. Second, those of us who have relationships, we’re much more likely to have transactional relationships than social relationships. The commodification of American life has meant that we have lots of transactional relationships in which both parties are trying to protect their self-interest, right?

(05:32:55)
I have a transactional relationship with my mechanic: we have an expectation of a shared future, but we're both trying to protect our self-interest in that relationship. My relationship with my college roommate, on the other hand, is really different, because in that relationship I give without knowing what I'm going to get in return. Those are the kinds of relationships that engender trust. Third, the disintegration of a lot of our civic infrastructure and the balkanization of American life has meant that even when we have those social relationships, we're much more likely to have them with people who are like us. And so what that means, if you put all those things together, is that most people have fewer relationships; among those, they have even fewer social relationships; and among those, they have fewer still with people who are different from them.

(05:33:41)
And so, if information is social and the way in which we find information about the complex modern world that we live in is through our social networks, then we’re much more likely to live in these echo chambers in which disinformation can bounce around very frequently. So that’s the bad news. So what’s the good news? So as I thought about how to respond to my friend, the pastor, I thought about, what are some of the lessons that I learned from the work that I’ve been doing with him over the past few years? He’s a pastor in the third largest megachurch in America. It’s a church that gets 35,000 people to show up every Sunday for services, they have 500,000 people who show up online, so they have a scale that’s enormous, but somehow, when you go to that church, it feels really intimate. Everyone in that community feels belonging. And why is that? Well, one of the mottos in the church is this idea that belonging comes before belief, right? Think about that for a minute, right? Belonging comes before belief. In most of the social spaces that we inhabit, belief comes first.

(05:34:44)
Imagine if someone came here and stood up and said, "I don't believe in science." They would probably get a few looks, a few sideways glances; people wouldn't be sure how to react to them. But in my friend's church, their attitude is, "We're really clear about what we believe in. We believe in a Christian God, but you don't have to believe in any God. You don't have to believe in our God. No matter what you believe, you're still a part of our community." And so what I'm finding is that people are looking for that kind of belonging all over the world, and the work that we all have to do is to take those lessons from both this church and from the work on disinformation, which is to create those relationships of belonging with people we don't know, across all the different social spaces that we inhabit: our faith institutions, our kids' schools, our neighborhoods.

(05:35:31)
We all interact with other kinds of people, and we can build those relationships with trust. Some people think that doing that work might take too long for what seems like a short-term problem. And I'll just stop with the saying that people say, which is that the best time to have planted a tree is 20 years ago, but the second best time to plant a tree is tomorrow. Thank you.

Kelly Stoetzel (05:35:52):

Thank you, Hahrie.

Speaker 19 (05:36:02):

Okay, well…

Sabrina (05:36:03):

We don’t have any questions. That was such a…

Kelly Stoetzel (05:36:05):

Done.

Speaker 19 (05:36:05):

No, I think that was great. It touches on so many points that are unique, particularly when you’re talking about church. I automatically thought about the presence and impact of the church in the black community. And also, when you rely on stakeholders and trusted sources in the community, you assume, when you receive some information from them, it’s already been vetted and you can just hit accept because it came from a pastor or someone that you recognize. How do you think we can kind of combat that notion of acceptance when we know that the information is not always correct, particularly for hidden communities or communities that are often marginalized?

Hahrie Han (05:36:48):

Right. It’s a complicated question.

Speaker 19 (05:36:48):

Give me a small…

Hahrie Han (05:36:55):

Small answer. Yeah. So here's what I'll say: I think it is absolutely true that faith institutions, and a lot of the institutions that structure our social lives, are really important in helping us understand what is true. And it's also true that if we think about the causes of disinformation, one of the problems is that it's profitable to spread disinformation, and so people have to become effective consumers of information. I think part of it is education, as Anita was talking about just before, but part of it, also, is to think about the role of doubt in the construction of knowledge. People often think about faith and the scientific method as being polar opposites of each other, but if you actually think about it, at the center of both faith and the scientific method is the idea of doubt. Like, I have hypotheses. I'm not sure what the answer is, and so I ask questions without knowing what the outcome is going to be, right?

(05:37:49)
People who turn to faith communities, I think, are not sure how to make sense of the world, but then just based on faith, are going to accept this kind of idea of a higher power. And so I think if we begin to think about where those commonalities are and then how we return those values to it, then you begin to create a different set of practices.

Speaker 19 (05:38:11):

Thank you.

Speaker 20 (05:38:11):

Is there anything that has surprised you in your research as you study the relationship between evangelical communities and trust or community building, especially in a moment where the national narrative is that fewer people are going to church, that this is the least spiritual generation there's ever been, and you're like, "Well, this one, they've got it going on."

Speaker 19 (05:38:31):

Yeah.

Speaker 20 (05:38:32):

Has anything surprised you in the research?

Hahrie Han (05:38:33):

Yes, a million things. So this is great because I was worried that the woman was going to come out and sing and so I sped through a couple things. So…

Speaker 20 (05:38:44):

I got you. I’m here for you.

Hahrie Han (05:38:45):

I didn't want her to sing. It would've been very embarrassing to be the last person and the only person to have her sing. Okay. So it is true that the median church in America has fewer than a hundred people and is getting smaller, but megachurches, which are defined as churches with 2,000 or more people, have on average experienced 34% growth over the past decade. We've gotten to the point now where the largest 9% of churches in America contain 50% of the churchgoing population.

Speaker 20 (05:39:12):

Shut up.

Sabrina (05:39:12):

Wow.

Hahrie Han (05:39:12):

Yep. It's heavily skewed. So that's shocking in and of itself. And so, what does that say? Well, it tells me that it's not that people are giving up on church necessarily, but people are looking for communities of belonging. People are looking, searching, for something that they're not sure where to find. And I'll give you one more statistic on this: at the average megachurch, 96% of the budget comes from individual donations. And that's just a marker of how committed people are to contributing to these communities that speak to something that they want. It is also true, to your point though, that we also see a rise in what are called religious "nones," people who don't identify with any religion. And so both of those trends are happening at the same time.

(05:40:00)
And the question is, what does it mean to be a "none"? Does it mean that I don't want to identify with the label of evangelical, let's say, because it's become so politicized? Or does it mean that I actually don't believe in God? And we don't really know the answer to that yet.

Sabrina (05:40:14):

Thank you. I have a particularly not-well-formed question, but it's about this concept of belonging, which I think resonates with all of us so strongly. We all want to belong somewhere. But in the climate space, belonging is also a constraining set of beliefs, values, and norms that disallow certain populations from thinking that climate has anything to do with them. And so I guess I'm wondering, in your experience, do you see conceptions of where I belong being malleable in any way? Are there moments in life or particular characteristics that allow people to shift their conception of where they belong, or who they are, what their identity is in that belonging space?

Hahrie Han (05:41:00):

Yeah, I mean, I will say, just speaking personally, when I first started doing this work with evangelical churches, as I said, I didn't grow up Protestant, I didn't grow up in an evangelical community, and I sort of crept into these spaces feeling like I would be a little bit shunned, or I wasn't sure how I'd feel. And this sense of radical hospitality surrounded me just like it surrounded everybody else.

(05:41:23)
And I don't know if it shifted my identity, it certainly didn't, but I look forward to going to this church. I look forward to seeing my friends. I look forward to seeing people. And so I think that sometimes we underestimate the transformative power of the social relationships that we have. And part of it, I think, is because our social infrastructure is fraying. But part of the message that I want to leave people with is this idea that when we think about combating disinformation, we have to address not only the messaging and the narratives and the ideas that are coming out, but also the social infrastructure that underlies how we hear and interact with the information that we receive.

Leslie Brooks (05:42:06):

So I was curious: as an actionable step, how can we create more belonging in our health messaging? I'm thinking of, during COVID, the signs that were in people's front yards that said, "Believe in science." And in my mind, I'm thinking that's not affecting anyone's opinion of science except for reinforcing our own. In the church community, access for the most part is free. But in the healthcare setting, there's a huge dollar sign tied to it, whether you want to become a scientist or a physician and go to school and pay for that, or go and pay for services. So how do we make it more welcoming? It's just very inaccessible on many different levels. And how do we also make that messaging not so judgmental, if you will, so it doesn't push people away?

Hahrie Han (05:43:01):

Well, first I’ll just say I love the example of yard signs. I also do work with political campaigns and stuff like that, and there’s a lot of research that shows that yard signs aren’t actually that effective in getting someone to support a candidate or anything like that. But it’s like a joke among a lot of campaigners because campaigns can’t run a campaign without yard signs because people get so mad if you don’t have one to give them, right? So it’s one of these things that’s… it’s a very expressive act. People want to be able to say, “I’m someone who believes this.” It doesn’t convince anybody, but it makes me feel good.

(05:43:31)
And so I think, related to the question that you're asking, a lot of the… the stickiest problems that we have when it comes to disinformation are so intertwined with structures of inequity in our society, and there's no yard sign that's going to dismantle those structures. And so to me, that's part of the reason why we have to think not only about the messaging, but also the social infrastructure. But in terms of something like health, how you open that space up and make it more equitable, I mean, I think that there's tremendous work going on through public health agencies and through a lot of community-based organizations. I think sometimes we underestimate the institutions that we have that already gather people toward them, that become opportunities to open access to governmental services, to other kinds of information, to different kinds of relationships: food banks, public health agencies, my kids' pediatrician. All of these should be opportunities for connection and belonging that right now we're not necessarily using.

Kelly Stoetzel (05:44:35):

Hahrie, thank you so much. And y'all, thank you to our provocateurs. Thank you, Leslie and Roger and Dray and Sabrina. Thank you so much.

(05:44:51)
And thank you as well to our human timer for being on standby. Thank you, Nancy.

(05:44:58)
Okay, so while we take a moment to clear the stage here and get ready for our next speaker, we thought we'd do something a little fun. So you know those audio tracks that you hear as speakers have been taking the stage? We call them audio stingers; they're like little musical songs. A bunch of those were created by AI and some by humans, so we thought we would take a minute to see if y'all can guess. So my pal Sean in the back is going to play, I think, three for us. Will you play three? He's going to play three for us. And raise your hand if you think that it's the human one. Okay? So Sean, give us stinger number one, will ya? Any hands for human? I see a couple.

Hahrie Han (05:45:52):

It’s going in.

Kelly Stoetzel (05:45:52):

Okay. All right. We got a few people that went in on that one. Thank you. Okay, stinger number two, please. Human, AI? All right. Kind of about… I think we had a few more maybe on this one than the first one. Okay, let’s go for stinger number three. Okay. I don’t think anyone thought that. Oh, wait, there’s a couple. There’s a couple. There’s a couple people. Okay.

(05:46:35)
You know what, actually, so we just did that. I’m going to call them out again, one, two, three, now that you’ve had the chance to hear all three kind of together, and you might have changed your mind, so let’s do it again, but you don’t have to play them again, Sean.

(05:46:46)
Okay. Stinger number one. Who thought that was human? Okay. Stinger number two. Stinger number three. Okay. I think Stinger number two overwhelmingly gets it. That is in fact the one that’s human and human creativity wins. Yay. That’s kind of relieving somehow. Thanks, y’all. Hope.

(05:47:15)
Okay. So when you think about the dangerous power that algorithms hold over our lives and the urgent need that we have for ethical systems change, Tristan Harris would be the first name that pops into many of our minds. He is on a mission to align technology with the best interests of humanity, and it is really hard to imagine the important conversations that we're having here today without his voice. Let's welcome Tristan Harris.

Tristan (05:47:42):

It's great to be here with all of you, and so great to see the broad range of topics that have been covered, because it's so much more than just misinformation. We often think about social media as kind of creating the climate change of culture. And what I hope to do is broaden and maybe zoom out a bit on the collective effects of technology on humanity.

(05:48:13)
We often start our presentations with this quote by E. O. Wilson, the Harvard Sociobiologist, who said, “The real problem of humanity is that we have paleolithic emotions or brains, medieval institutions and God-like technology.” And it’s so clear, we repeat it in every presentation because I just feel like it so quickly summarizes the feeling that we have, that our brains are not matched with the way that our technology is influencing our minds, which is what you’ve been hearing about all day today.

(05:48:48)
And so oftentimes we talk about the alignment problem in AI: how do we align AI with humanity? Well, how do we align our brains and our institutions to have a steering wheel over something that's God-like and moving a million times faster, especially as we get large language models and AIs that are going to be moving way faster? Who here is sort of feeling the explosive rate of AI moving in the world, right? That's going to supercharge so many of the things you've been hearing about and what I'll talk about today.

(05:49:17)
Now, many of you here have probably seen The Social Dilemma. How many people here have actually seen The Social Dilemma? Okay, good. Maybe a good third of you. The Social Dilemma was a Netflix documentary I was a part of that was seen by more than 125 million people in 190 countries and 30 languages, about the effects that social media has had on humanity, rewiring the flows of attention and information in our society.

(05:49:41)
And what we actually talk about is that, though people might have missed it, that was actually first contact between humanity and AI. People ask, how is AI going to affect humanity? We actually already had an AI that rewired and was misaligned with humanity, because when you swipe your finger on TikTok, or swipe your finger on Twitter, or swipe your finger on YouTube, you activate a supercomputer pointed at your brain, calculating, based on what 3 billion other human social primates have watched, seen, or looked at, the perfect next thing to show you when you swipe. That was first contact.
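A minimal sketch can make the mechanism Tristan is describing concrete. The toy code below is illustrative only, not any platform's actual system; the item names and predicted watch times are invented for the example.

```python
# Toy sketch of an engagement-ranked feed. The only objective is
# "what will keep this user watching longest?", not "what is true?"
# All items and scores below are invented for illustration.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    predicted_watch_time: float  # seconds, estimated from similar users

def next_item(candidates: list[Item]) -> Item:
    # Rank purely by predicted engagement and serve the top item.
    return max(candidates, key=lambda item: item.predicted_watch_time)

feed = [
    Item("calm, accurate explainer", 40.0),
    Item("outrage-bait clip", 95.0),
    Item("friend's vacation photos", 20.0),
]
print(next_item(feed).title)  # -> outrage-bait clip
```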

(05:50:13)
How did it go? So in first contact with social media, and I hope you can see these slides here, we have information overload, doom scrolling, a loneliness crisis, influencer culture, sexualization of young kids, which-angle-makes-me-look-best, polarization, bots and deepfakes, and basically the breakdown of our shared reality. Collectively, these effects are something like the climate change of culture. And it's about so much more than just information quality: if we just had good information in our information environment, we would still have doom scrolling. If we just had good information in our information environment, we would still have a loneliness crisis, because people would be by themselves on a screen, scrolling by themselves, and that would affect the belonging dynamics that Hahrie just spoke to.

(05:50:54)
And so this is really important because, as we’re about to go into second contact with AI with large language models, we haven’t really fixed this first misalignment. We lost. How did we lose to this AI? How did we lose in this first contact? Well, what were the stories we were telling ourselves? It seems really aligned, right? Social media is going to connect people with their friends. We’re going to give people what they want. We’re going to show people the most personalized ads that are relevant to them, only the things that they would want to buy.

(05:51:23)
These stories that we were telling ourselves are true, but they somehow hid what was underneath all that, which, as other speakers have talked about today, is the race to maximize engagement. Because how much have you paid for your TikTok account in the last year? How much have you paid for your YouTube account in the last year? How much have you paid for your Facebook account in the last year? Zero. So how are these companies worth a trillion dollars in market cap? People say it's your data, but they're actually also selling your attention. So how do I get your attention? We add slot machine mechanics: pull to refresh, how many likes did I get, did I get more this time than last time? I just checked my email five seconds ago, but I'll check it again. And it's not enough just to get your attention. If I'm a social media company, my goal is actually to get you addicted to getting attention from other people, because a person who is validation-seeking and wants attention from other people is more profitable than someone who does not care about attention from other people. So how do we do that? We add beautification filters. And this is what we call the race to the bottom of the brainstem: if Snapchat adds beautification filters, Instagram will lose if they don't also add beautification filters. TikTok was found to automatically add a beautification of between 1 and 5% without even asking people, because it's mirror, mirror on the wall, which app makes me look best of all, and we use that app.
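The filter arms race described here has the structure of a simple race-to-the-bottom game, sketched below with payoff numbers invented purely for illustration: whatever the rival platform does, adding the filter is the better response, so both add it, even though both end up worse off than if neither had.

```python
# Toy race-to-the-bottom game: two platforms decide whether to add
# beautification filters. Payoff numbers are invented for illustration.
# (my_move, rival_move) -> my market-share payoff
payoffs = {
    ("add", "add"): -1,            # both pay the social cost, no edge
    ("add", "no_filter"): 5,       # I gain users from the rival
    ("no_filter", "add"): -5,      # I lose users to the rival
    ("no_filter", "no_filter"): 0,
}

def best_response(rival_move: str) -> str:
    # Pick the move that maximizes my payoff given the rival's move.
    return max(("add", "no_filter"),
               key=lambda my_move: payoffs[(my_move, rival_move)])

for rival in ("add", "no_filter"):
    print(f"rival plays {rival!r} -> best response: {best_response(rival)!r}")
# "add" wins in both cases, so the race converges on (add, add),
# which is worse for everyone than (no_filter, no_filter).
```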

(05:52:45)
Now, these design decisions that, again, other speakers have been talking about all day also led to the creation of this, because a button that instantly retweets and re-shares content quickly is good at creating attention addicts, right? Because now TikTok is literally competing with Instagram. If Instagram offers you a hundred views on average when you post a video, but TikTok gives you a thousand views on average, and you're a teenager, where are you going to post your next video?

Audience (05:53:16):

TikTok.

Tristan (05:53:17):

TikTok, the one that gives you more reach. So it's a race to who can inflate your ego and give you the most instant sharing as fast as possible. And we've all seen the effects of that: fake news spreads six times faster than true news. And as other people have seen, this is the study from More in Common. This has led to a massive over-representation, a fun house mirror, of the extreme voices relative to the moderate voices, with this instant sharing. Because what's the difference between a moderate voice on the internet versus an extreme voice? Do extreme voices post more? I'll just say it: they post more often, and moderate voices post infrequently. That's the first layer of the double whammy.

(05:53:53)
The second is that when someone says something extreme, it goes more viral than when someone says something more moderate. So even though there’s a very small number of very extreme voices out there, social media takes that like 5% of the population and then just spreads it out and stretches it out over the whole canvas and movie screen of humanity. And you run society through that fun house mirror for about 10 years and you quickly end up in a very distorted world.
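A toy calculation, with all numbers invented, shows how this double whammy compounds: a 5% extreme minority that posts more often and spreads further per post can end up supplying most of what everyone sees.

```python
# Toy model of the "double whammy": 5% of users are extreme, but they
# post more often and each post spreads further. All numbers invented.
population = 1_000
extreme_users = int(population * 0.05)   # 50
moderate_users = population - extreme_users  # 950

posts_per_user = {"extreme": 10, "moderate": 2}    # extremes post more
reach_per_post = {"extreme": 500, "moderate": 50}  # and spread further

impressions = {
    group: users * posts_per_user[group] * reach_per_post[group]
    for group, users in [("extreme", extreme_users),
                         ("moderate", moderate_users)]
}
total = sum(impressions.values())
for group, n in impressions.items():
    print(f"{group}: {n:,} impressions ({n / total:.0%} of the feed)")
# extreme: 250,000 impressions (72%); moderate: 95,000 (28%)
```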

(05:54:18)
I don't know if you can read this. This is a thing that says "social media summarized elegantly in two tweets." The top tweet says, and I'll just zoom in really quickly, "The much vaunted Pandora Papers revealed that the patriotic Zelensky was storing payments from his top funder, Israeli Igor Kolomoisky, in offshore accounts, and this person was also a funder of the neo-Nazi battalion." And notice that first tweet, the top one, got 8,000 retweets. Now, underneath it says, "I was the editor and co-reporter of that story. You can look it up. You've completely twisted it. There's no link between this money and anything to do with it." And that got 58 retweets.

(05:54:53)
We could all just go home because that kind of summarizes what the entire information environment looks like. And if that’s the asymmetry of power that we have granted to every actor in the information ecosystem, we have been living in this fun house mirror.

(05:55:10)
And so we got to a world that looks like this, the climate change of culture. And we have another talk out there that I highly recommend, folks; I have a short amount of time today, but you should check out this talk we recently gave called The AI Dilemma, which talks about this second contact that I unfortunately won't have time to cover today.

(05:55:27)
So what's the solution to this problem? Well, is it content moderation? Is it fact checking? Is it labeling information true versus false? Well, what about all the salaciousness, the partial truths that are spun? What about if we vilify the CEOs? One of the things that I think we really need to get good at is recognizing that the complexity of our problems has actually exceeded our institutions. The complexity of our world is going up, right? Pandemics: what's the right way to respond to a pandemic? What's going on with COVID? Nuclear escalation. How do we deal with energy crises, debt crises, debt-to-GDP ratios, misinformation? The complexity of all these issues also compounds combinatorially, right?

(05:56:11)
And then simultaneously, there's our ability to respond to that complexity, both with our brains and with our institutions. Our paleolithic brains and our institutions collectively represent humanity's ability to respond to the complexity of our problems. But what is runaway technology adding to this? Well, runaway technology, AI, steepens the curve of the complexity, because if synthetic biology was a threat before, AI supercharges the threat of synthetic biology. If misinformation was a threat before, then generative media supercharges the threat of misinformation, because people can publish… just two days ago, there was a fake image of the Pentagon being bombed. If I wanted to cause a bank run in the United States, if I'm Russia or China, I could easily just create photos of lines of people standing in front of Wells Fargo, Chase Bank, et cetera; I could devalue the US dollar like that. There's no Department of Homeland Security or Patriot missile defense system that's going to stop someone from doing something like that. And there are a million more examples like that, by the way.

(05:57:13)
And so the reason I'm going here is that I think we are often thinking too narrowly about how to solve these problems. We're thinking about content moderation and fact checking, when really it's about how we get wisdom; the bottom line is whether our brains and our institutions and our social trust and our social fabric actually match the speed and complexity of the evolutionary curve of technology and our issues.

(05:57:36)
Think of it this way: if your immune system has a slower evolutionary pace and a virus has a faster evolutionary pace, which is going to win, your immune system or the virus? The virus. Our immune system is our institutions, our governance, our regulation, and that's why we have to upgrade those institutions. What this means is that the sense-making and choice-making of humanity, both individually and in terms of our regulation and how our institutions act, has to match the complexity of the world. And I wanted to reframe what we're here to do, which is to say that, in addition to dealing with the information ecosystem, we need an information ecosystem in which, collapsing these lines together, the quality of our sense-making and choice-making matches the complexity of the choices that we face.

(05:58:19)
And one way maybe of saying that, to go back to E. O. Wilson, is that we need to embrace our paleolithic brains, upgrade our medieval institutions, and bind the race dynamics of the God-like technology. And just to give some subtle examples of this: what do we mean by embrace our paleolithic brains? You've been hearing from people today talking about confirmation bias, [inaudible 05:58:44], the social rewards that we get, belonging. That is embracing what it means to be human.

(05:58:50)
Our name, the Center for Humane Technology, is named as such because my co-founder, Aza Raskin, is the son of Jef Raskin, who started the Macintosh project at Apple. He wrote a book called The Humane Interface. And the word humane means having an honest, reflected view in the mirror of how our brains really work. That's how he conceived of the Macintosh: the personal computer, with its blinking green cursor and command-line interface, was a very non-ergonomic match for our brains, but the Macintosh, with the mouse and the menu bar and drag and drop and icons, was a much more humane interface, because it embraced how we really work on the inside.

(05:59:26)
But now what we need to do is apply those kinds of insights to the fact that our brains have confirmation bias, and that we need to feel belonging rather than loneliness. So how would technology embrace these aspects of what it means to be human and design in a richer way? If we were to get rid of the engagement-based business models that sell our attention to advertisers, then when you open up Facebook, the feed could be ranked not in terms of which content to show you, but in terms of things you could do with different communities around you in your physical environment.

(05:59:55)
They could be supercharging the reinstantiation and the reflowing of the social fabric. Instead of showcasing virtual Facebook groups, they could be showcasing actual physical communities where people could spend time with each other. Because as many people have found, when you spend physical time with people face to face, it automatically has healing properties, right? It builds trust, builds connection.

(06:00:15)
Upgrading our medieval institutions: one of the limits of our institutions right now is that they deal with acute, discrete harms. This product hurt this person. But they don't deal with diffuse, chronic, long-term cumulative harms. Think climate change, think forever chemicals in our environment, think the climate change of culture: slow-rolling increases in mental health issues, addiction, loneliness, et cetera, in society. We need institutions that deal with those long-term, cumulative, diffuse issues, rather than just the acute ones. Liability, for example, is something that we need for AI companies, and that's coming down the pipe, but liability only deals with an acute issue, like someone dying while using a car or using a product.

(06:00:56)
And in terms of binding our God-like technology, we need to actually recognize when there's a race: if I don't do it, I lose to the guy that will. If I don't use social media for my nonprofit to try to boost up my influence, I'll just lose to the other nonprofits that boost up their influence with social media. So it becomes a race to create this virtual amount of influence online.

(06:01:17)
If OpenAI doesn't deploy to as many people as possible, it loses to whoever does. Snapchat deployed AI, and now every kid on Snapchat has a My AI bot at the top, a friend who will always answer your questions and will always be there when their other friends go to bed at 10:00 PM. Snapchat is going to win if they do that versus the other companies that don't. So as we are deploying this God-like technology, we need to actually recognize this and not be upset at bad guys, bad CEOs, but get upset at bad games, bad races. And that would be another form of upgrading our institutions.

(06:01:48)
And just to sort of close, a quote that we reference all the time is that "You cannot have the power of gods without the wisdom, love, and prudence of gods." If you have more power than you have wisdom or responsibility, that by definition means you are going to be creating effects in society that you can't see, because you have more power than you have awareness of what those effects are. Think of it this way: people are now aware of things like a biosafety level 4 laboratory from the Wuhan Institute of Virology. If you have biosafety level 4 capabilities, where you're developing pathogens, you need biosafety level 4 safety practices. As we're developing AI, we are now inventing biosafety level 10 capabilities, but we haven't even invented what biosafety level 10 practices would need to be.

(06:02:37)
And one of the hardest things I think humanity's going to have to do in looking in the mirror is thinking about how we bind and limit the power that we're handing out to the appropriate level of wisdom and responsibility. The biggest mistake we made with social media was handing out God-like powers of instant re-sharing to everyone, which can be very helpful in certain cases, but it's helpful specifically because, when it's going well, that power is bound to wisdom. And when it's not going well, it's because that power is not bound to wisdom. So those are some thoughts I wanted to leave you with today. Thank you very much.

Speaker 21 (06:03:07):

Hey there, John. It’s lovely to meet you. I just spent the morning grabbing brunch at The Flowering Tree Cafe in West Hollywood. It was absolutely amazing.

Kelly Stoetzel (06:04:21):

Hey y'all. Next up we have a group conversation around how we can strike a balance between free speech and a flourishing, healthy democracy while mitigating the danger caused by the spread of misleading information. Finn Myrstad, an important global voice on digital ethics, is going to join us as moderator. And then he's going to bring out a group of amazing people, and he'll set up the conversation and introduce you to all of them. Let's welcome Finn.

Finn (06:04:57):

All right, what a pleasure to be here with all of you. I've been here the whole day and I'm really amazed by all the fantastic perspectives that we've been listening to. And I can promise you that you're going to hear some even more incredible stories right now. We have two, no, two plus two, four incredible speakers coming up now with perspectives from all over the world. So I'd love you to give a warm welcome to these four incredible individuals who will inspire you and maybe give you some new perspectives as well. Welcome to the stage. And I'll introduce them as we go along with the opening questions. Rebecca, you are the vice president of the Wikimedia Foundation and you have a long and distinguished career working in this field. The foundation hosts Wikipedia, and you have a host of volunteers curating the Wikipedia sites, but also many other sites that are gathering knowledge freely globally. And today we've been discussing disinformation and misinformation. Those watching digitally have already heard Rebecca define these terms, but for those in the audience, I thought it could be helpful if you could start with: what is your definition of misinformation and disinformation in this context?

Rebecca (06:06:23):

Sure. Well, thanks so much for having all of us on stage today. So misinformation, as some panelists have discussed earlier, is information that is false but is not spread intentionally to deceive. It might be your aunt sending around something she saw that excited her, not realizing that it's false. Disinformation is false information that is spread intentionally, with a specific agenda. And what we find at the Wikimedia Foundation is that disinformation is often accompanied by threats against people who try to counter the disinformation, disinformation campaigns against the people trying to debunk the disinformation, et cetera. So it's quite dangerous, literally and physically.

Finn (06:07:19):

Yeah, I think we're going to hear a bit more about that in the panel as well. I mean, the people here are amazing individuals who've experienced firsthand the price that you can pay for telling the truth in a landscape filled with misinformation and disinformation. And earlier today, we heard a lot about the pandemic, and the Wikimedia Foundation, and in particular the volunteers who run Wikipedia, were in the middle of this battle. Could you describe to us how that played out during the pandemic?

Rebecca (06:07:45):

Sure. Well, since the start of the pandemic, volunteer editors of Wikipedia and other related projects have been at the front lines of countering misinformation and disinformation on the projects. And just because anybody can become an editor of Wikipedia doesn't mean there aren't rules about what you can and cannot do. The rules, and the system for enforcing the rules, are created by the volunteers themselves. Those rules relate to what is considered neutral, well-sourced content, and the people developing and enforcing the rules include experts, people with local knowledge, et cetera.

(06:08:28)
And so how did that play out in the pandemic? Take, for example, one of our heroes, Dr. Netha Hussein, a clinical neuroscientist from Kerala, India, but living in Germany. She started something called WikiProject Medicine, bringing together dozens and dozens of experts, scientists, and doctors to focus on COVID-19-related information, making sure it was well sourced, and monitoring those pages. And so now there are around 7,000 Wikipedia articles related to COVID-19 in over 230 languages, according to my colleagues.

(06:09:12)
And so again, there's COVID-19-related factual information in some languages where it's very hard to get any other information. And what's more, there's also a cohort of volunteers who've been documenting the disinformation and misinformation about COVID-19. And so you can go onto Wikipedia and check what the disinformation is. You can also click on the history tab and the talk tab to see all the edits that have been made and all the debates about what is and isn't factual on every page.
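For readers who want to inspect that edit trail programmatically, the same public history data is exposed through the standard MediaWiki Action API. Here is a minimal sketch in Python; the page title is just an example, and the script needs the third-party requests package.

```python
# Minimal sketch: fetch the recent edit history of a Wikipedia page
# through the public MediaWiki Action API (the data behind the
# "View history" tab). Requires the third-party requests package.
import requests

API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "COVID-19 misinformation",  # example page title
    "rvlimit": 5,                         # five most recent edits
    "rvprop": "timestamp|user|comment",
    "format": "json",
}

data = requests.get(API, params=params, timeout=10).json()
for page in data["query"]["pages"].values():
    for rev in page.get("revisions", []):
        print(rev["timestamp"], rev.get("user", "?"), "-", rev.get("comment", ""))
```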

Finn (06:09:47):

Yeah. Oh, that's really impressive. And I think in a landscape where information is getting more and more, how to say, spread, right? It's really important to have some trusted sources of information that are also freely available, which is maybe also becoming a challenge.

(06:10:04)
So I'm going to get back to you a little bit on this issue, but I want to move now to Maia. You founded the Alliance for Europe and you're working for a more democratic Europe by fighting disinformation through community building, digital intelligence, and campaigns. You're now based in Warsaw, Poland, but you also do a lot of work on the border with Ukraine. And as a result, you are also witnessing the horrors and consequences of Russia's invasion of Ukraine. I think it could be interesting for the audience to hear how misinformation and disinformation are a part of that war and that conflict.

Maia (06:10:39):

Yeah. Thank you very much. It's a pleasure to be here, because the topic of misinformation, and disinformation in particular, is really changing our behaviors. It's changing the way we think, but it's also basically building our future. So it's on us what choices we will make in order to get information. But listening to all the previous panelists, especially the scientists: what Russia and other bad actors are doing is basically putting that science to work.

(06:11:11)
And we are thinking about how to do that, but they are already doing it. The weaponizing of information and disinformation is huge. And we are seeing that in Poland, as Melissa Fleming said. A year ago I started going to Ukraine, and I've been really touched, together with my fellow citizens from Poland, supporting the refugees from Ukraine. But what we've been seeing for the past year was a spread of disinformation and misinformation trying to polarize both of our societies. And it's not only happening in Poland, it's happening pretty much everywhere.

(06:11:52)
So Russia, and also other bad actors that are using disinformation, are working on our cultural biases, but very much also on our brains and the way our brains work. We have been in fear for pretty much the past few years: since the pandemic, we first feared for our lives because of COVID-19, and then in Europe we feared the war. And when we fear, our brains close down. We have the fight, flight, or freeze response, and that's when we have tunnel thinking. And if we look at the information space, when it's flooded by misinformation and disinformation, it's very often working on fear. That's where we're coming from.

(06:12:37)
And what we are trying to do with Alliance for Europe is to take a whole-of-society approach. In one of the regions in Poland, POTA, we are working with a partnership hub, working with 10 different organizations on the ground and with the municipalities, and doing research to see the changes in attitudes and to think about how we can respond, because positive communication and strategic communication are the way to move forward. In social media we have a bit of a war, and whether we're going to know how to communicate our values, or are just going to watch how disinformation is being spread, is a question that's going to build our future, our common future.

Finn (06:13:22):

Yeah. Thank you very much. I think you also touched on a topic that's been a theme today: the reward systems in social media. Tristan Harris was also discussing the effects of certain business models of big tech actors. And that brings me to the next person on the panel, Flora. She's working for a nonprofit and is a champion of social justice issues, and disinformation in particular. And you have witnessed firsthand how systemic misinformation and disinformation can be a threat to democracy. I think this has been already quite clear today, but

(06:14:00)

I'd love for you to tell us briefly how this has played out in Brazil, where there recently was an election, and, considering we are in Washington, DC, there was also something that looked very much like an insurrection and invasion of your capital.

Flora Rebello Arduini (06:14:12):

Yes. First of all, great to be here; thank you everyone for having me. So, in Brazil, what we could see, unfortunately, was nothing new in terms of tactics. You have on one side bad actors, including elected officials, people seeking reelection, people trying to get elected, and bad actors in general, as Maia was saying, boosting disinformation and hate speech content, including calls for violence, including calls for a military intervention in Brazil. So, you have that cohort of actors. Unfortunately, on the other hand, you also have the big tech platforms monetizing this kind of content. So, you had on one hand the platforms boosting and actually recommending this kind of content to users, and also allowing online ads with this kind of content. And I'm not just saying this because I think it: we conducted research and investigative research for quite a long period of time following the elections in Brazil, and we could see the experience of the regular user.

(06:15:18)
You would go to the search bar and look for "electronic ballots," and the first recommendation on TikTok, Instagram, Facebook, and YouTube would be "rigged." Instead of "electronic ballots are safe" or "trustworthy," they would just recommend something that would lead you into a rabbit hole of disinformation. So, we had that situation happening. Authorities obviously tried to step up to the limit that they could within our laws, but that led ultimately to the attacks of January 8th this year in Brazil, which were a copy and paste of what happened after the US 2020 elections at Capitol Hill. We had tens of thousands of people marching into Brasília, which is Brazil's capital, literally invading official buildings, trashing everything, because deep down they believed that the elections were rigged. And the reason they believed that was because for years they had been receiving this message that the elections were going to be rigged, et cetera. So, fortunately, there is a silver lining, right? We now have a new government that is very eager to regulate the platforms. And so we are now seeing a great commitment from the government, and also from civil society at the front lines, aiming to pass legislation that is robust and will regulate the industry as a whole, including disinformation.

Finn (06:16:43):

Thank you. And I think it's interesting; we'll probably get back to that as well. I mean, earlier today we were discussing how misinformation is a deep societal issue: it connects to cultural perceptions, and also to injustices over time. I'd love to hear afterwards maybe how, in your view, social media is refueling those division lines that maybe were already there. But I'm going to move to you, Rana. You are a world-renowned journalist. When I was reading about Rana, I was just amazed by the things that you've been doing. I mean, she's been the longest-serving undercover journalist in the world, working eight months undercover, with, I think, was it eight cameras on your body the whole time? Exposing systematic, state-fueled religious persecution. That's my summary; you can tell it better than me.

Rana Ayyub (06:17:42):

We have very little time for that though. Yeah.

Finn (06:17:46):

And you're also yourself the subject of massive disinformation campaigns, attempts to silence you in court, death threats and much, much more. So, I want to ask you, and this is a twofold question, and we'll get back a little bit to how this has impacted you as a person. But could you tell us a little bit: what is the state of play on misinformation and disinformation in the world's largest democracy? India has 1.4 billion people. What is the media landscape like?

Rana Ayyub (06:18:10):

The media landscape does not exist in a country of 1.4 billion people. The mainstream media is literally captured by the state; we had an independent news channel which recently saw a hostile takeover. Before I begin, I need to acknowledge the fact that the Nobel Committee gave the Nobel Peace Prize to two journalists two years ago, decades after the last time, to acknowledge that journalists are the new enemies of the state. And how do they become enemies of the state? When you discredit them with disinformation and misinformation. I mean, when I Google myself, the first thing that I see is disinformation. I see news and stories about me much later. I come from the world's largest democracy and I take great pride in being an Indian, but not in the government that is ruling it.

(06:18:57)
We're talking about misinformation and disinformation; I'll give you an example. The deputy leader of the Nobel Committee, Mr. Asle Toje, was in India two months ago, and there was a rumor that he said, which he didn't, that Narendra Modi, the Indian Prime Minister, was the front-runner for the Nobel Peace Prize. He never said that. He never said that. But that story was the headline of every news channel in India. The spokesperson of the ruling party, everybody, said it. By the time his statement came out somewhere later, "I did not say that," nobody played it. By the time the news was busted, the idea that Narendra Modi is a front-runner for the Nobel Peace Prize had become the accepted reality. It doesn't stop at that.

(06:19:40)
The Indian Prime Minister, of course, will be in the US on a state visit hosted by President Biden; we are also told he'll be going to Camp David. He has been promoting and endorsing a film in India which is based on a premise of Islamophobia and fake news. And one of his ministers said that people who do not watch this film are terrorists and ISIS supporters. So, that's the landscape we are talking about. In the absence of an independent mainstream media, some of us who are doing our journalism are made enemies of the state, hence we are facing what we are facing: persecution, my bank accounts frozen, money laundering, tax evasion, name the charge, it's there. Sedition, criminal conspiracy, defamation. I'm facing a charge for an article I wrote 15 years ago. God knows, I don't even remember that article. But I am facing a charge for it because apparently I have hurt the sentiments of the majority community.

(06:20:31)
That is the new story. And how does it fuel? How do you make me an enemy of the state? You start fueling every story about me. For instance, there's this screenshot about me that says I hate India and I hate Indians. God knows where it came from. But every second time I tweet something, that screenshot comes up and it's circulated all over the country. There is a screenshot that says I support child rapists in the name of Islam. I have never said that, but that screenshot is there all over the internet. That's how you discredit me: saying, oh, she is a Muslim, marginalized, a critic of the government, and her only job is to defame India on an international level. That's the role disinformation is playing in a country which we call the world's largest democracy.

Finn (06:21:22):

We'll get back a little bit to you, Rana, on how you deal with this. I think Maria Ressa, who's also in the audience and who's going to give the closing remarks later, has been subject to many of the same tactics, and can probably also attest to how this is a very systematic approach that is used to try to silence free journalism, free speech, and also individual activists. But this section of the program is also about hope.

Rana Ayyub (06:21:50):

Where is it?

Finn (06:21:51):

So, we have this section now where, we've had a pre-meeting, so we have agreed we're going to try to end on a hopeful note. So, I'm going to challenge the panel now with two questions, dividing it up very quickly into what we can do at a systemic level and what we can do at the individual level. I think we'll start with the systemic level. Rebecca, we'll start with you and go this way.

Rebecca (06:22:14):

Sure. Well, it's pretty clear that independent journalism, open and independent science, independent research, open data repositories, libraries, and Wikipedia are the world's best antidote to mis- and disinformation. That means you need to protect and support the people who are contributing, who are reporting facts, who are sharing facts, who are working to verify facts. Right now our laws and regulations are not prioritizing that. There are all kinds of debates about how we regulate internet platforms and how we hold technology accountable. Don't forget, Wikipedia is an internet platform. Different business model, but the point is that there's so much discussion about how you punish the bad guys. Just because you see a cockroach in the kitchen, do you blowtorch the whole kitchen? Right?

(06:23:17)
So, let's make sure that when we're addressing harms, we are not actually making independent journalists, independent researchers, and scientists who are under threat, who we heard about earlier before this panel, even more vulnerable, making it even harder for them to do their work without being sued, without being threatened, without being surveilled, without being exposed and doxxed. Make sure the law isn't making it worse for them while you're trying to go after genuine harms. That is the biggest problem we're seeing today, and Wikipedia is facing this all the time, where everybody wants to punish big tech with their laws, but they're catching Wikipedia in the net. Many proposed laws would actually make it much harder for people to edit Wikipedia without going to jail, being surveilled, or being sued. And so this is something that people need to keep in mind.

Finn (06:24:20):

Yeah. No, I think it's such a complicated issue, and there's no quick fix here. I think that's really clear from this summit today: there is really no quick fix. But there are some proposals on the table, and Maia, you've been working on several policy initiatives in the European Union. Could you tell us one or two things that you think would really help on this issue?

Maia (06:24:40):

Definitely. I believe that regulation helps, and it needs to be smart regulation and ethical regulation. Because what we need, we really need more ethics and values on our internet. And being afraid of AI is not something that's going to bring us a future. It's a good technology; the internet is good, strategic communication is good. We just need to use it, but use it in a way that bad actors won't win. So, coming all together in a room is usually the best thing to do, and I'm very grateful that tomorrow that's what we're going to do here. We're going to meet as a group of experts and try to arrange the future. But I also want to say that not only regulations are good. What is also important: I would like to see media literacy education, critical thinking education, and, the one that is most important for me, emotional intelligence education in every single school.

(06:25:34)
Because when we talk about disinformation and how it affects us, it's, as I said, usually something that works on fear, and each of us feels that fear. And when we feel the fear, that's the moment to come back to yourself. And that's something that we as humanity should be learning from the very beginning, from being a kid. Emotions are normal things that are out there, so let's also remember education.

Finn (06:26:04):

Yeah. I think we learned today that this is a systemic issue, but also that we can do work at the individual level. Flora, you've been working both in Brazil and in the European Union on these issues, so what would be your asks to sort of regulate or improve the situation?

Flora Rebello Arduini (06:26:17):

I have three.

Finn (06:26:19):

Yeah. That’s fine.

Flora Rebello Arduini (06:26:20):

So, yes to regulation; time is up for self-regulation of this industry. We don't want to see more Capitol Hills, we don't want to see more Brasílias, we don't want to see more Rohingya genocides being fueled by algorithms that prioritize disinformation and hate speech and violence. So, regulation at the national level and the international level. This is a global panel, this is a global forum, and this is how we're supposed to tackle this issue. So, regulation for sure; disinformation is a symptom, a grave and serious one, but we need to regulate the industry as a whole. The European Union just passed a regulation that we can now look at and get inspired by, which is the Digital Services Act. The second branch of this, as Maia mentioned, is public policies for digital literacy. It's vital. We need to empower people to navigate the new technologies. We need to empower people to understand how to distinguish, as much as possible, disinformation from true facts.

(06:27:18)
And obviously the third pillar is an independent press. We as a society need to figure out how to solve this issue. It's what's happened to Rana, what's happened to Ressa. There are thousands around the world being killed for speaking their truth, and this is unacceptable. The best medicine and antidote for disinformation is accurate information. So, we really need to make sure the press is free, independent, and autonomous, and we need to make sure that our societies make it possible for journalism to flourish and not perish. And if I may add a fourth: platforms and industries don't have to stand around the corner with crossed arms, or just plot aggressive lobbying tactics to stop legislation. They can, for example, respect our laws; that would be a start, that would be really cool. Second, treat citizens equally everywhere. We're not seeing that happening: English-speaking countries, the global North, are taking the majority of the safety and risk budgets of these platforms. And third, at the same level that they invest in development and new business, why don't they invest in ethics and security policies and professionals as well? Right now we are seeing companies firing entire ethics departments. I'm done.

Finn (06:28:43):

As you can tell, these people have so much knowledge and are working on this full time. We could go on all day, but I'd love to give the last word to Rana, because the things you're going through and the things you're doing are, I think, an inspiration to all of us, and your story can serve as an inspiration to do something after this summit, whether you're here in this audience or watching this. Rana, how have you managed to keep working on this and keep going back to India to report on the things you see?

Rana Ayyub (06:29:14):

I mean, the only way for us to tide over this is to keep speaking up. The reason why they use disinformation and misinformation is to silence us. If I go silent, then they win. I do not have the luxury of being silent right now, nor does anyone who is fighting for independent journalism, civil liberties, and human rights. At this point in time the world is witnessing a right-wing churn everywhere. We have Modi, we just saw Bolsonaro, we have the prospect of Trump, and we have others. We have all these leaders globally, and the only antidote to right-wing politics all over the world, as independent journalists, is a platform, is ethical news. You asked how I go on. Honestly, in an ideal world I would just sit back and retire. I don't have that luxury, because it's been a long and relentless ride.

(06:30:02)
Just facing the kind of attack that I face from the government, I do more court cases than I do reporting of late. It's exhausting. It takes a toll on your mental health; you get anxiety attacks on flights. But that's exactly what these dictators and demagogues want: they want to wear you down. How do they wear you down? With misinformation and fake news. The only antidote to everything that we are all witnessing here is to start consuming ethical news, to start supporting independent journalists who are risking everything, who are risking their personal lives to tell you the stories that they're telling, so please support them. This is not the time to be silent, because I believe that when history is written, history will not just record the tyrants; history will also record the silence of the well-meaning. We don't want to go down that road. So, thank you so much.

Finn (06:30:50):

Give them all a big round of applause. Thank you.

Kelly Stoetzel (06:31:19):

Wow. Thank you so much to that amazing group. Thanks so much, y'all. So, we will find our way through this challenging moment in time, and as we just heard, we need a plan and we need to take action. Now we turn to the last chapter of today, where we explore the many reasons to have hope. Indeed, trust abounds, and where there is truth and trust, there is hope. And where there is life, there is poetry. From Poets for Science, let's welcome, first, Jane Hirshfield, and she'll be followed by David Hassler, the other half of Poets for Science. Jane.

Jane Hirshfield (06:32:13):

So, some of you might be wondering, what is poetry doing in a conversation in support of the factual? And that's a fair question. But this is also, of course, a conversation about communication. It is about truth's compass and truth's persuasion. And for that, the microscope and the metaphor, imagination and observation, are not separate. Science and art both began as all things human do, with pre-history. The earliest myths were set into meter and rhyme to help them be memorable, before there was paper and writing to help with that. They were also attempts to find meaning and order in what William James described as the blooming, buzzing confusion of the pre-conscious world. Cave paintings recorded what mattered: mammoths, lions, pregnant horses, human hands. Material science begins with knowing which stone to knap, but also, what can you do with ochre? And the imagination is the proto-faculty of hypothesis making.

(06:33:27)
Now, 50,000 years or so later, things have speciated a bit like Darwin’s finches. We turn now to science for questions that have answers. We turn to art and to poetry for questions that have no answers, but still require response. Both offer what they always have done, a way to go on. Both are acts of discovery. Both are distillations, they take complex thoughts and put them into symbolic systems that are portable, repeatable, given from person-to-person, culture-to-culture, time-to-time. And both of them are things that we do ordinarily all the time because human beings are curious and because discovery is a joy, but they are also needed most direly in times of disaster and crisis. And last, both of them borrow from each other’s strengths. A poem won’t be worth keeping if whatever lightning is in it doesn’t strike the actual ground of our lives and the world.

(06:34:38)
A scientist is most excited by what they don't yet know. So, Poets for Science began as a response to a steepening crisis of silence. We haven't talked much about silence here today. We've talked about misinformation and disinformation; there's also censorship. You might remember that on the fifth day of the last administration, January 24th, 2017, the White House took down from its website all information about climate change and ordered every scientist who worked for the federal government not to speak of their work in public without pre-approval. By the end of that day, I had written a poem. Of course. I sent it to several research science friends, and they sent it to other friends. And three months later I was on the Mall here in Washington DC saying it aloud to 40 or 50,000 people at the March for Science, while my partners at the Wick Poetry Center were hosting a tent covered inside and out with human-sized banners of poems that spoke to every different area of science, and inviting people to write their own. People would come in, they would see this tent with Poets for Science written on it, do a double take, come over, start reading. Their faces would change, their breathing would change. And so I am going to read you the precipitating poem, On the Fifth Day.

(06:36:20)
On the fifth day, the scientists who studied the rivers were forbidden to speak or to study the rivers. The scientists who studied the air were told not to speak of the air, and the ones who worked for the farmers were silenced, and the ones who worked for the bees. Someone from deep in the Badlands began posting facts. The facts were told not to speak and were taken away. The facts, surprised to be taken, were silent. Now it was only the rivers that spoke of the rivers, and only the wind that spoke of its bees, while the unpausing factual buds of the fruit trees continued to move toward their fruit. The silence spoke loudly of silence, and the rivers kept speaking of rivers, of boulders and air. Bound to gravity, earless and tongueless, the untested rivers kept speaking. Bus drivers, shelf stockers, code writers, machinists, accountants, lab techs, cellists kept speaking. They spoke, the fifth day, of silence.

(06:37:28)
Facts are foundational, reality is foundational; no animal ignoring the actual will long survive. But the presence of poetry at the March for Science was not about advocating a particular practical action, it was about advocating for a feeling: the sense of shared lives, shared fates, shared existence. If you want to change a culture, emotion is foundational too. And people will only work to save what we love, what we feel part of the fabric of and kin with. And so Francis of Assisi wrote in his Canticle of Brother Sun and Sister Moon, and Walt Whitman wrote in Leaves of Grass, "Every atom belonging to me as good belongs to you." That none of us ends at our skin is a truth of both poetry and science. Before I give the stage to David Hassler and the Academies' Mirzayan Fellow scientists for an abounding demonstration of hope, I will finish with four quotes. First, John Keats: "Beauty is truth, truth beauty. That is all you know on earth and all you need to know." Next, Rachel Carson. Yes, Rachel Carson, who was surely the most effective science communicator of our era, accepting the National Book Award for The Sea Around Us: "If there is poetry in my book about the sea, it is not because I deliberately put it there, but because no one could write truthfully about the sea and leave out the poetry."

(06:39:12)
The 19th-century philosopher and polymath scientist Herbert Spencer: "Those who have never entered upon scientific pursuits know not a tithe of the poetry by which they are surrounded." And last, Michael Atiyah, the Lebanese-British Fields Medalist mathematician and theoretical physicist: "In the broad light of day mathematicians check their equations and their proofs, leaving no stone unturned in their search for rigor. But at night, under the full moon, they dream, they float among the stars and wonder at the miracle of the heavens. They are inspired. Without dreams there is no art, no mathematics, no life." Thank you.

David Hassler (06:40:01):

Thank you, Jane. So, I had the privilege for this summit of leading three online poetry workshops with four of the Mirzayan fellows here at the National Academy of Sciences and the Nobel laureate Elizabeth Blackburn. I met them for the first time five weeks ago online. Jane said that the microscope and the metaphor are not separate, but she didn't tell you one of my favorite quotes of hers: "The microscope and the metaphor are both instruments of discovery." What is it that a poet, that poetry for science, can discover? I believe it is what I would call the emotional truth of science, a way to give voice to the felt experience of what it is when we peer through the lens of our microscopes with awe-struck observation. And so in our sessions we read three of the poems for science that are in the gallery exhibit, Poets for Science, to charge the air, to both guide our writing and to prompt and inspire us.

(06:41:15)
The poem we’re going to read you in a moment is inspired by Gary Snyder’s poem, For All, James’ own poem, Optimism, and Camille Dungy’s poem, Characteristics of Life. The writing that followed became our own way of integrating what we know in our heads, our knowledge of science, with the emotion we feel for science in our hearts. Our poems were inherently hopeful. For, like all poetry as Donald Hall says, “Poems are the unsayable said.” So, now I’d like to invite the Mirzayan Fellows to come out to perform, Covey Chintham, Mariela Garcia Arredondo, and Nafiza Andrabi. We’re going to share for you what we discovered through the instruments of metaphor and the imaginative language of poetry. A collective poem that we scripted, drawing from our own individual voices. Origin Story.

Speaker 23 (06:42:26):

Origin story.

Speaker 22 (06:42:28):

After Camille Dungy’s characteristics of life.

David Hassler (06:42:31):

Ask me if I speak for myself and I will say…

Speaker 23 (06:42:35):

What is the self but a tangle? A cobweb of stories generations deep stitched together by earth’s fibers. [foreign language 06:42:46].

Speaker 24 (06:42:47):

Nothing I speak is without the trace, channeled scablands of every path carved before me, a single molecular origin story.

Speaker 22 (06:42:57):

The tiniest wriggling organism invisible in its still dark pond water can tell us truths that inspire.

David Hassler (06:43:06):

Yet how can I speak for the molecules dancing in each living creature?

Speaker 24 (06:43:11):

Calling back.

Speaker 23 (06:43:12):

And forth.

David Hassler (06:43:13):

I eavesdrop on their gossipy chatter, ask them to explain, but mostly they take the fifth.

Speaker 23 (06:43:22):

Ask me what I know of the morning banter…

Speaker 24 (06:43:25):

Of the blue jay.

David Hassler (06:43:26):

The scrub jay.

Speaker 22 (06:43:27):

The cooing of doves.

Speaker 24 (06:43:28):

And I will tell you I know only the meaning I make of their song.

Speaker 23 (06:43:33):

How can I speak for the soil? Welcoming all, hugging seeds that spring forward in shoots.

Speaker 22 (06:43:44):

[foreign language 06:43:41] Bursting with black and ruddy colors that envelop our skin and keep our bellies fed.

Speaker 23 (06:43:51):

More and more I have come to trust in the circularity of a tree, how it yearns itself out of the soil.

Speaker 22 (06:43:57):

From sprout…

Speaker 24 (06:43:58):

To seedling…

David Hassler (06:43:59):

To sapling…

Speaker 23 (06:44:00):

Then rots and returns to the earth.

Speaker 24 (06:44:03):

When I trust the coffee shop stranger to watch my backpack, I leap into the woven net of ties between us.

Speaker 22 (06:44:09):

We catch each other, holding doors open.

David Hassler (06:44:12):

Offer our hands in greeting.

Speaker 22 (06:44:14):

Each of us the same star dust rearranged.

David Hassler (06:44:18):

I speak to unself myself.

Speaker 23 (06:44:22):

To tell the truth of the origin story we share with our planet.

Speaker 24 (06:44:26):

A truth lost too often in rushing days and racing thoughts.

Speaker 22 (06:44:32):

A truth coursing through the ventricles in my heart. Whispers of…

Speaker 24 (06:44:37):

Ancestors.

David Hassler (06:44:38):

Trees.

Speaker 23 (06:44:38):

And oceans.

Speaker 24 (06:44:43):

[foreign language 06:44:42] I speak for the earth.

Speaker 23 (06:44:45):

And the earth speaks through me.

All four poets (06:44:48):

Our narratives cannot be untangled.

Kelly Stoetzel (06:45:08):

Wow. Yes to Poets for Science. As many of y'all know, this is the second Nobel Prize Summit that the National Academy of Sciences and the Nobel Foundation have produced together. In 2021 they jointly hosted Our Planet, Our Future, and as part of their involvement in that last summit, our next speakers, Sheldon Himelfarb and Phil Howard, called for an intergovernmental panel on the information environment. Let's just say that they've been very busy since then. Let's welcome Sheldon and Phil.

Sheldon Himelfarb (06:45:51):

Thanks everybody. Hello. What an amazing day, right? Before we say anything else, thank you, thank you, thank you to our summit organizers. So, as you’ve heard and as you can see on the screen, my name is Sheldon Himelfarb, and I’ve had the distinct privilege of spending the better part of the last three decades working with local change makers, local leaders, local peace builders in conflict zones around the world, from Bosnia to Burundi, and even in this country in cities like Baltimore. And what we’ve been doing together is developing technology tools, technology strategies, technology training so that communities could do more, better, faster, since that’s what tech does, it’s a force multiplier. They could do more, better, faster to tackle the drivers of conflict in their communities.

Phil Howard (06:46:52):

Hi, I’m Phil Howard. I work at the University of Oxford and our lab has worked for several years now to identify the complex information operations that really degrade public life. My career hit a new low a few years ago when I spent three months studying the most ridiculous campaign to blame COVID on a shipment of lobsters that had been sent from Maine to China, and since then I’ve been trying to recover. I think one of the ways we’ve recovered is by documenting very carefully, researching the long thread that connects the people who produce misinformation to the platforms that serve it up, to the politicians who benefit from the chaos.

Sheldon Himelfarb (06:47:39):

So, two really different career trajectories, Phil in academia, me working at the local level in conflict zones. But the common denominator between us has always been that we’ve worked to try to figure out how technology could be used to amplify social

Sheldon Himelfarb (06:48:00):

… good. But now we have come together for a different problem, one that's been slowly coming. (Singing). So, pretty terrifying really, especially when you see it happening in all quarters of the world like that. What we've managed to do is actually create a problem so far-reaching that it is rapidly becoming an existential threat to the planet.

Phil Howard (06:49:32):

But this conversation needs to be very much about solutions. I think we’ve all heard today how misinformation, propaganda is targeting science itself, diminishing public trust in evidence. There are examples of how this misinformation diminishes our trust in institutions at sensitive moments in public life. Ultimately, information operations degrade our trust in each other and turn some of the world’s most complex humanitarian disasters into even more complex disasters.

Sheldon Himelfarb (06:50:03):

And of course, we’ve heard throughout the day about AI and how it’s about the turbocharge this problem by delivering millions of messages faster than any and smarter than any human being ever could deliver millions of messages, millions of moments designed to mislead. So it’s no wonder that our policy makers, our governments struggle to keep up. The technology just moves so fast, it moves at a thousand miles an hour. But there are people who can keep up, who do every day, who work on these problems every day.

Phil Howard (06:50:38):

People like our colleague Sebastian Valenzuela from the Catholic University of Chile, one of the world's foremost public opinion researchers. He's been able to demonstrate how social media is critical for young people developing their political identities, but can also have a role in fracturing those networks when misinformation, particularly about COVID health, drops into the networks. Colleagues like Young Mie Kim, who's established ways of studying how political identity formation happens when platforms rapidly A/B test ads to customize content for their audience. Colleagues like Mona Elswah at the University of Sharjah, who's demonstrated that the vast majority of misinformation content in languages other than English originates in a handful of state-backed media outlets where political appointees make editorial decisions. Colleagues like Charlton McIlwain, whose book Black Software demonstrates that this infrastructure is still actually very fundamental to the social movements we have now: those working on civic activism, working for social justice, those working to improve our quality of life.

(06:51:46)
Colleagues like my own colleague Patricia Kingori, a Kenyan researcher based at Oxford, who's been able to demonstrate that most people, most of the time, don't have an appetite for anti-vax information. It's when communities don't have doctors, when they don't have a GP and they don't have a nurse they can check in with, that the pernicious effect appears that seems to give anti-vax messaging some traction. And colleagues like Princeton researcher Molly Crockett, who studies the neuroscience of the culture wars. She's able to demonstrate how we all experience the moral outrage that we perceive in social media and social life as a personal identity threat, and she's able to demonstrate that the social media platforms operationalize this and turn it into a mechanism for manipulation.

Sheldon Himelfarb (06:52:34):

So you see where Phil is taking us. There's been a lot of talk today about the problems. Luckily, this is the section on hope, and he has been taking us through the people who are surfacing solutions every day at universities, in think tanks, and in corporate research. We just have not thought about how to harness that collective power yet. And that's what we're doing today.

Phil Howard (06:53:06):

So it’s a pleasure to be able to introduce the International Panel on the Information Environment, this is an initiative that involves researchers from around the world, from the computer sciences, the natural sciences, the social sciences, engineering, and of course all driven by the great questions, the critical questions that come out of the humanities. This is a large complex problem that requires many domains of inquiry united to create carefully craft questions and purposefully land on answers.

Sheldon Himelfarb (06:53:37):

So: a science advisor, a trusted science advisor, to coordinate the research, to surface the gaps in our knowledge, to help us prioritize the research questions, and also, most importantly, to develop, using evidence-based techniques, the best solutions to this problem, so that we can ensure that the information environment serves humanity instead of destroying it. But it needs to do so at a speed that is commensurate with the urgency of the problem.

Phil Howard (06:54:09):

Fortunately, there are good examples from the last few years of how science can move quickly at scale to solve urgent problems. There's the mRNA vaccine, which appeared in a very quick turnaround, more quickly than any previous vaccine had been developed. The Webb telescope went from being launched into space to fully operational in six months, through a collaborative team of scientists and engineers. And of course, much of the work that we need to do in this domain will be helped by large language models. So in some ways, AI is a new tool in our toolkit for addressing the problem that we're all concerned about.

Sheldon Himelfarb (06:54:47):

So as science is moving faster than ever, so must the IPIE, and we think it is. As you heard, it was two years ago, at the 2021 Nobel Prize Summit, that we first suggested the creation of such a group. Today, I'm so pleased to say that we have 200 research scientists in our ranks from 55 countries, and six wonderful international foundations supporting our work.

Phil Howard (06:55:17):

But more important, we already have some outcomes. Tomorrow in our breakout session from 1:00 to 3:00, we'll be talking through one of the analyses we've done: a meta-analysis, a systematic review of some four and a half thousand scholarly papers published over the last few years about potential solutions to the information environment problems we see now. There are many different proposals, but two surface as having pretty consistent positive effects: flagging content before users go down the rabbit hole into misinformation, and providing corrections, accurate information, at the moment a user encounters misinformation. These are the two most likely operational changes that we could make to social media to improve public understanding of key issues.
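To make those two operational changes concrete, here is a minimal, purely illustrative sketch of how a feed pipeline might apply them. It is not from the talk, and every name in it (Post, fact_checks, moderate) is hypothetical.

```python
# Illustrative sketch of the two interventions described above:
# (1) flag suspect content before the user engages with it, and
# (2) attach accurate corrective information at the moment of exposure.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Post:
    post_id: str
    text: str
    flagged: bool = False              # warning shown before engagement
    correction: Optional[str] = None   # accurate info shown alongside the post

# Hypothetical store of claims already rated false by fact-checkers,
# each mapped to a short corrective summary.
fact_checks = {
    "claim-123": "Health agencies report that mRNA vaccines do not alter DNA.",
}

def moderate(post: Post, matched_claim: Optional[str]) -> Post:
    """Apply both interventions when a post matches a known false claim."""
    if matched_claim in fact_checks:
        post.flagged = True                            # intervention 1: flag first
        post.correction = fact_checks[matched_claim]   # intervention 2: correct at exposure
    return post

# Example: an upstream classifier has matched this post to claim-123.
post = moderate(Post("p1", "Vaccines rewrite your DNA!"), "claim-123")
print(post.flagged, "->", post.correction)
```

The hard parts in practice, matching posts to claims and timing the correction, are left to the upstream classifier this sketch assumes.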

Sheldon Himelfarb (06:56:17):

So again, in the interest of demonstrating its value, the IPIE will also release its second study in about a month, which is a global survey of experts on these problems. The point is that the IPIE is moving swiftly, and that is because it is a group effort. Research scientists from around the world are self-organizing in order to do, frankly, what our legislators, our regulators, and our corporations have really struggled, if not failed, to do. And the reason it works is that they can stand on the shoulders of other scientific organizations, giants in science like the IPCC (what a great model for the IPIE, given what it has done with climate science), or the CERN model, or the Union of Concerned Scientists. I could name several extraordinary scientific organizations that have tackled the greatest challenges of our time.

Phil Howard (06:57:18):

I think for the last few years, as researchers, we and our own teams have very much been working in our own backyards, studying incredibly complex problems at the far end of the universe with our own telescopes, our own infrastructure. We now need to move forward and start to act collectively to build the infrastructure that will let us see this complex phenomenon for what it is. This is something that's going to be multidisciplinary, that's going to require the insight of the humanities, and it's something that we can do together.

Sheldon Himelfarb (06:57:50):

So if you are a researcher, a scientist, whether from academia or industry or otherwise, if you are a foundation, if you are a citizen who is eager for an information environment that really delivers truth, and trust, and hope, then please join us. Thank you.

Speaker 25 (06:58:27):

Information pollution is threatening democracy, peace, and human rights worldwide. It is undermining truth and trust, the basic building blocks of well-governed and peaceful societies. It is a global challenge that demands immediate action. And there is hope. Solutions do exist. Passionate technologists are fighting back in unique and creative ways. Believing that technology can be a powerful catalyst for positive change, the Digital Public Goods Alliance and the United Nations Development Programme launched a global call to developers of open source solutions that strengthen information integrity. Nine solutions from around the world were selected. By promoting and supporting these technologies, we believe that we can help pave the way towards a more trustworthy and informed digital future.

Kelly Stoetzel (06:59:31):

So, that video was a setup for our next speaker, who has had a long career at the White House as a national security expert focused on mitigating the dangers the US faces from foreign malign influence and disinformation, and on cybersecurity and election security. And she's here to tell us about a very hopeful project she's working on. Let's welcome Nicole Tisdale.

Nicole Tisdale (07:00:04):

Hey, y’all. So that was a pretty scary resume, but I promise this is not going to be a scary talk. So as stated, I’m Nicole Tisdale. I’m a longtime national security attorney and advisor. I’ve worked in the United States Congress and I’ve worked at the White House National Security Council. And when I’m not in top secret classified rooms without phones or the internet, I’m in my hometown of Nettleton, Mississippi that has less than 2000 people and no traffic lights, mostly because we don’t need them. But I’m also in barbecue joints in Memphis, Tennessee. I’m talking to students at San Jose State University, and I’m hearing a lot of disinformation about Congress, about the White House, about politics. And so when I’m not in these spaces in DC, I make it my mission to actually go out to these communities and talk to the people who are actually the targets of disinformation. And so we’re going to show a quick video to talk a little bit more about the project that I’ve been working on, and then I’ll give you all more information. I was supposed to show y’all a quick little video. It’s fine. We’ll get to the video. So what we’re going to talk about is the project that we launched in February of this year and we is the Digital Public Goods Alliance and the United Nations Development Program. We issued a call for open source software solutions to fight disinformation. And I know what you all are thinking. Why are you looking for digital solutions to a digital problem? And for us, it’s two easy reasons. The first is disinformation disproportionately impacts governance, human rights, and development in communities and countries with weaker democratic institutions. And so why focus on fighting this with open source tools? I could give you something long and complicated, but basically it’s open y’all

(07:02:24)
Open source solutions are transparent to the people who create them and the folks who use them. They're accessible to many communities, no matter how much funding or expertise those communities have. They're also able to replicate the success of tools that have been created before them, and they actually empower the people who are going to be using them. And so we got together and created our group. We looked for advisors: we wanted advisors from the private sector, from academia, from civil society, we wanted technologists, and yes, y'all, we wanted policy makers. Once we got them together, we issued the call, and we received 99 solutions, which was much more than we could have hoped for. Of the 99 solutions, we picked nine. Some of them are actually here today, and I'll say their names because I want you all to remember them.

(07:03:23)
Open Terms Archive, RegretsReporter, Deep-Fake Fingerprint, Faluda, Ushahidi, Phoenix, Media Cloud, and, debuting my Portuguese publicly for the first time, Querido Diário, Dear Diary. [foreign language 07:03:43] It wasn't that good, all right. So what do those solutions do? Between those nine solutions, we have tools that are going to hold big tech accountable by providing transparency about algorithms online, we are going to build trust online through authentication and analysis, and we're also going to lessen the role of disinformation in global conflicts, including natural disasters.

(07:04:12)
All of these tools provide transparency and access. So that's what we did, that's why we did it, and those are our winners. So as you're sitting here today, what can you do to help? No matter what space you're in, we really hope that with any tool you're creating, any problem you're solving, you focus on inclusivity and collaboration. For us, inclusivity means thinking about the linguistic and language needs of the communities you're going to be working in, thinking about the sociopolitical constructs those communities are dealing with, and also the cultures of the communities you're building tools for. When we say collaboration, we really mean everybody, but we definitely mean start with local communities, and next, civil society, technologists, and policy makers. Y'all, it's always going to be policy makers.

(07:05:10)
And as we sit here today, just a quick homework assignment of something that you can do today, which is visit the Digital Public Goods Alliance website. There you’ll find our nine winners and you’ll see more information about their tools. But I don’t want you to just go to the website and look, that’s not helpful. You need to actually start to use these tools, fund these tools, and implement the solutions that they are creating. And that’s how we turn hope into action. Thank y’all.

Kelly Stoetzel (07:08:39):

That just kind of makes you smile, doesn't it? So, in accordance with the terms of Alfred Nobel's instructions, Nobel Prizes are awarded to those whose work has conferred the greatest benefit to humankind. So: truth, trust, and hope. I could not think of a better name for this panel of three Nobel laureates. Here to introduce our panelists and moderate the discussion, we have Sudip Parikh. He's spent his career at the intersections of science, policy, and business, and he heads up the American Association for the Advancement of Science. Sudip, come on up.

Sudip Parikh (07:09:23):

Wonderful. Thank you. Well, what an exciting day. I flew across the country to get here for this because I could not miss it. And one of the reasons why is because I wanted to impress my kids. I said, "I'm going to be on a stage with three Nobel Prize winners." I thought that would impress them. And what they said was, "What are you doing up there?" And so what I'm doing up here is asking the questions that I hope you want to ask. So that is what we're going to do. On the screen, we have a Nobel Prize winner in physics, Donna Strickland, joining us. And then joining us also is Dr. Rich Roberts, Nobel Prize winner in Physiology or Medicine. And then finally, another Nobel Prize winner in physics, Saul Perlmutter.

(07:10:22)
And so what we’re going to do, I love the title of this, it says truth, trust, and hope, but then there’s a subtitle for ours, which is the truth is out there and it is, it is out there. And one of the things that’s exciting about being with Nobel Prize winners is they actually discovered a little kernel of truth, a little kernel of truth that excited everyone and in some ways turned over something that was thought before. And so we’re going to go through a discussion about what we’ve seen and heard today and our reactions to it.

(07:10:52)
And so a little bit of it will be covering some ground that's been covered, but from the perspectives and the prisms that they bring to the table as Nobel Prize winners. We heard a bit about the historical context of the times in which we live. You heard about it from many of the speakers, including our magician. Now let's hear it from the perspectives of Nobel Prize winners in astrophysics, optical physics, and molecular biology. Saul, why don't you start by giving us your observations about the history and what that history has in terms of relevance for today?

Saul Perlmutter (07:11:22):

Well, I guess earlier today we heard a little bit about the moment when people started printing, with vast amounts of information coming out, and how we overcame that. But it was occurring to me that in some sense, science was dealing with this problem, how do we find the truth and how do we trust what we are discovering together, way before that. And really, I think one thing that's important for a group like this is the idea that science itself offers us ways to try to build trust together. It's been very much a collaborative, interactive activity for all of its history. Newton wouldn't be Newton if he didn't have people to write to and to hear from. And in some sense, we are now able to use, and we should be teaching each other how to use, all those techniques to deal with the current crisis.

Sudip Parikh (07:12:15):

That’s great. Thank you. Donna, I was going to turn to you next. It’s wonderful. You’re actually on stage and it’s like you’re here. So Donna, how does today fit into the context of history?

Donna Strickland (07:12:29):

How does today fit into the context of history?

Sudip Parikh (07:12:31):

Yeah.

Donna Strickland (07:12:33):

Well, I think what we’re talking about is trust. And I think in the past we’ve had more trust in science than we do now. And as Saul was just talking about, scientists always have this peer review process that we go through and go through and go through when we try to write the proposal to get the money, to do the idea, to publishing the idea, and then every step in between. And we go to conferences and have conversations about what we do. And so I think this is a time where scientists are actually being asked to do more than just talk to each other. I think that’s probably one of the biggest differences between now and before. Scientists mostly spoke to each other and now we’re being asked to speak more broadly.

Sudip Parikh (07:13:18):

That’s great. Thank you. Rich, you have a joke that’s along these lines.

Dr. Rich Roberts (07:13:23):

I promised I wouldn’t tell it, but I would say laughter I think is really important. And I think the audiences really come apart if you like as soon as anybody says something funny. So I think the illusionist was really good, but just to get a little more serious for a moment, I think one of the things that I’ve gotten so far from the meeting is the importance of communication. I think not just communication among ourselves, but to the general public, but also teaching our kids from a fairly early age about science. Because one of the wonderful things about kids is they’re very open to science. They’re open to new ideas. They’re not coming in with a bias point of view and unless their parents gave it to them, but most of them, they really want to know about this.

(07:14:13)
And I think we could do a much better job of teaching kids at a younger age about this. This would be good. I also think one thing that came up recently because of COVID is that Zooming, which we do more and more all the time, is not like person-to-person. You cannot beat in-person communication, because Saul can be talking and I can interrupt him, but it's considered very [inaudible 07:14:45] not to do that. If you're on a Zoom call, you have to wait until you're asked. But I like the person-to-person.

Sudip Parikh (07:14:52):

So I take you’re planning on interrupting Saul. All right.

Dr. Rich Roberts (07:14:55):

Possibly.

Saul Perlmutter (07:14:58):

But, I mean, to add on to what you're saying, I think this idea of education and what we teach the kids: often we think of science education as teaching biology, physics, chemistry, but so much more should be taught about how science works. We have, I think, a real opportunity to be teaching a form of critical thinking, which is essentially what science consists of. And that would be powerful for everybody, whether or not they plan to become scientists; it provides them a route to grappling with this world that we live in.

Dr. Rich Roberts (07:15:28):

And I think most kids start off being scientists. They’re creative, they ask questions, and we tend to put them into schools that knock the science and the creativity and the curiosity out of them. And I think we need to flip that. We need to let them blossom, let them question the teacher. There’s nothing wrong with questioning the teacher.

Donna Strickland (07:15:53):

A lot, don’t just add one thing.

Dr. Rich Roberts (07:15:53):

And I speak from personal experience. I got in a lot of trouble when I was a kid.

Sudip Parikh (07:15:58):

Donna, please. Donna’s going to interrupt us.

Donna Strickland (07:16:00):

Yeah, I’m from Zoom, I’m cutting in.

Sudip Parikh (07:16:00):

Yeah, come in.

Donna Strickland (07:16:03):

I also think that one thing we lose in our education system is that we spend most of our time in science teaching what science has already figured out. And then, when we become scientists, our job is to figure out what we don't know. It would be nice if we introduced that concept. I think this goes back to what Rich is saying: they should learn how to ask the question why, in a critical-thinking way, through the education system, and understand that there's far more we don't know than what we do know.

Sudip Parikh (07:16:39):

I think it’s really exciting. That’s worth applause. I think it’s really exciting that each of you in winning the Nobel Prize, you’ve changed the way we think about something. And that was because there was something that we thought we knew or we had some ideas about. And it turned out it didn’t work that way. That’s the scientific process. We argue with each other, we have disagreements and then we test it.

Dr. Rich Roberts (07:17:02):

But we don’t shoot one another.

Sudip Parikh (07:17:04):

That’s good. And I’m glad for that. Would one of you want to speak to what that’s like?

Saul Perlmutter (07:17:12):

Well, I will say, I don't think most people in the general public realize that the thing that really pays off for a scientist is when they get to say, "Oh, I was wrong. The world is completely different than I thought it was." And that's where the excitement comes in. You often get asked by reporters, "So what did you set out to prove when you started working on this project?" And I keep thinking, well, if you set out to prove it, you probably weren't really doing science. The real fun of the game is to find out something that you didn't expect.

Dr. Rich Roberts (07:17:48):

I think one of the important things is that very often you're doing experiments and they don't work, they fail. You thought you knew exactly how they were going to work, but they don't. And you look and see: why did it fail? You do a postmortem on your experiments, and this is when you make the big discoveries. This was certainly how it was for me. Failure is a good thing. I think we don't do our kids any favors when we tell them, "Oh, you are a failure. You failed that exam." Failure is great, failure is terrific.

Donna Strickland (07:18:22):

Well, let me add my example. So many people make such a big deal about the fact that I won the Nobel Prize for my very first paper, but I also point out that I was in my fourth year of my PhD: I had had that many years of failures before finally getting to a really good success. So it kind of averaged out. I had so many failures before I ever got to my first success.

Saul Perlmutter (07:18:45):

I don’t know whether you guys have that experience, but my sense is that one of the things as a professor you are often trying to teach the upcoming generation is that they should not lose hope along the way. And that basically things are going to go wrong most of the time. And it’s only the very, very end that if you’re lucky, something will work out. And maybe in some sense, that’s really one of the lessons for all of us facing misinformation in the current whole story that we’re worrying about, this is going to be a long process. We’ll get things wrong and we’ll try lots of things, but that’s how it works. Eventually you get somewhere.

Sudip Parikh (07:19:22):

In telling these stories of failure, of attempts and trying, do we build trust for the scientific community by telling the truth about the way that science works, the scientific process?

Saul Perlmutter (07:19:36):

Yeah. I would say yes.

Dr. Rich Roberts (07:19:39):

I think so. I totally agree. Truth is something that is really very valuable. And I think when people start lying about stuff, all that happens is they become politicians or they do something else, but I think there should be a law that says if politicians lie, they can be sued and sent to jail. I mean, these are the people who represent us.

Donna Strickland (07:20:06):

But also, I think one of the first speakers today talked about how we can't just say "science says"; we have to explain. And so we have to become good communicators. But we live in the world of the sound bite, and it's very hard to explain science in a sound bite. And going back to the idea of explaining the scientific process, this was one of the concerns I had through COVID: how many people got upset about the masking, no masking, what kind of masking. They threw up their hands and sort of thought scientists were wrong.

(07:20:38)
And really, I think what was happening is that they were watching a science experiment in real time for the first time in their lives, in that something was tried, it was tested, people figured out what was right and what was wrong, and they changed it. This is what goes on in science. Usually we have time to get to the final answer before we broadcast it, but because scientists were trying to save lives, they were broadcasting as we went. And I think there wouldn't have been the frustration if we'd had the chance to explain, and if people had already understood the scientific process and why failure is a big part of it: we learn and move on.

Dr. Rich Roberts (07:21:15):

But you had the advantage that you lived in Canada where you had much better information coming your way.

Donna Strickland (07:21:29):

Or we listened better, one or the other, I don’t know.

Sudip Parikh (07:21:35):

So what does that say about the state of understanding of the scientific process in the United States and Canada and around the world that we went through this test and maybe we didn’t do so well?

Saul Perlmutter (07:21:47):

I think it certainly showed that we have a lot to teach, and that this is a real opportunity for us to get into the schools. In fact, there's a presentation coming up on the last day of this summit

Saul Perlmutter (07:22:00):

about an approach to teaching critical thinking in the schools that Nobel is supporting. And I think this would be the real moment for us to start teaching how it is that you do all these different steps and different processes, so that the students feel that they're part of the activity and that they're able to use it.

Dr. Rich Roberts (07:22:19):

It just makes the case for more education. I think kids, if they're educated well from a fairly early age, and I would say 9 or 10 is a good time to get started, if they learn what the scientific process is, they can understand what people are saying to them. Provided you don't have different people coming at them. The surgeon general in Florida thinks that vaccines are bad. I mean, where on earth did he get his degree? I do not understand it. Maybe we should close the university that he took it from.

Sudip Parikh (07:23:00):

Very subtly done, very subtly done. Donna, was there a difference in Canada in terms of the reaction, because of education, or was it similar?

Donna Strickland (07:23:15):

I don’t know if it’s education. I think politics came into the US system far more than other places. And this was also disheartening to me as a scientist that somehow people put their medical health information with some kind of political blinders on. And so, maybe it’s education, it’s communication, it’s a lot of things all at play. But I don’t think we had it that polarized up here in Canada. And so that was a little bit easier to deal with, but it’s unfortunate when politics and science collide this way.

Sudip Parikh (07:23:53):

And it’s interesting because the fields in which you all worked and did your award- winning work were perhaps not as out in the open at that time. So if folks were spreading disinformation about supernovas when, Saul, you were doing your work, would it have been harder? I’m being a little bit facetious, but on the other hand, what I’m saying is, there’s an active combatant here.

Saul Perlmutter (07:24:19):

No, I always felt that one of the big advantages of working in cosmology is that there's almost no political position about whether the universe is slowing down or speeding up. So you're able to actually talk to people, and their immediate reaction is pleasure. They really, actually enjoy joining in. That's one of the other aspects that science helps with, if you're in one of these areas.

Sudip Parikh (07:24:39):

I thought everybody was against inflation. Come on, give me something here. That was a really good joke. Yeah.

Saul Perlmutter (07:24:47):

There was this period before the Big Bang that we pulled [inaudible 07:24:49] later.

Sudip Parikh (07:24:52):

If you’re explaining, you’re losing, right? So tell me about, Saul, you had one suggestion about things that we should be doing for the future. What should we be doing in this moment as a community? And we’ve heard several prescriptions about it today. And which of those, if they caught your eye or if other things that you’ve heard from your own experience should we be doing as a community to help bridge this gap of trust?

Saul Perlmutter (07:25:16):

Personally, I’m very interested in these issues of participatory democracy and what does it take to bring people into a conversation together so that the scientists are not in the position of being these sort of sages telling everybody else the answer. The scientists know something, and as Donna pointed out, they don’t always know something about the current states of the facts that we have to live with that we’re trying to work around. But the scientists are not the experts in the values and the choices that need to be made once you know the facts. And that’s a place where I think we do much better if we were part of a two-way conversation rather than talking to the public as scientists. So I’m very interested in these techniques, like we’re doing this experiment tomorrow here at the summit in deliberative democracy. Deliberative polling is the kind we’re doing tomorrow. And the idea being there, you can have a public that’s informed by discussion with the experts, but then they are the ones who deliberate and help come up with what does a representative population think about the problem.

Sudip Parikh (07:26:22):

Co-construction, I guess, [inaudible 07:26:23].

Saul Perlmutter (07:26:23):

And it seems like it’s really the place to build a sense of shared belonging and community that we also heard is so important.

Sudip Parikh (07:26:30):

Yeah. Rich, same question.

Dr. Rich Roberts (07:26:32):

Yeah. So I’ve been impressed by people who are good at communicating with the public, who really take the time to learn a language so that they can explain what they do in ways that the general public can understand. And there’s a couple of cities in Zurich, they do a very good job. They have one day every year where a whole bunch of scientists from the ETH go out and talk about their science. They sort of set up little stands on the streets and do it. In Exeter, in England, they do the same thing. The universities send out students, they send out professors, and talk about the work that they do, but in language that can be understood. And one thing we might think about doing when it comes to education is to make sure that science students are taught how to talk to the general public, how to use language that the general public can understand.

(07:27:28)
I have something that I call the grandmother test. The grandmother test is: a student is doing something in the lab. They go home and they explain to their grandmother exactly what they do, to the point where she actually can understand it. And the test is, can she then go and talk to all her friends and tell them how smart her granddaughter is?

Sudip Parikh (07:27:52):

That’s great. Donna, from what you’ve heard today or from your own experience, what should we be doing in the future?

Donna Strickland (07:27:59):

Well, here at the University of Waterloo, we are trying to start a trust network, and, following on what both Rich and Saul say, we do think it's got to be a two-way conversation. I think it's already been said today in the talks that scientists can't just go out and say, this is what science says. And what we're just bringing up is that there are different groups of people who all have different reasons, whatever their culture is or whatever it may be, for not trusting some area of science. So we have to start not only communicating better, but really listening to the communities, to find out what it is that makes them not trust us. And then we're going to have to have our social science and psychology colleagues start figuring out the ways around this. And I hope we really do real science experiments: try something, see whether it moves the needle or not, and whether it works for a number of groups of people or just certain people.

Sudip Parikh (07:29:03):

That’s great. Thank you Donna. In the final couple of minutes that we have, the last part of this is hope. I want you to just in a couple of words, and we’ve got two minutes, a couple of words say, what is it that you’re hopeful about in the next few years, either in the scientific realm or in the relationship between science and society? What are you hopeful about? What’s got you excited?

Saul Perlmutter (07:29:27):

I think this meeting is a great opportunity for hope. I mean, it seems to me that once you've identified a problem, you have a fighting chance to do something about it. And I think that as experimentalists, we'll figure it out. We just need those multiple tries that Donna's talking about until we start getting it.

Sudip Parikh (07:29:47):

Love it. Rich?

Dr. Rich Roberts (07:29:48):

I think it would be very good if we as scientists and the scientific societies could set up a factual database that people could trust. If you want to know whether something is true or false, you go to this gold-standard database and know that what you read and what you find out about is true, it's factual, and the evidence is there: it leads to the evidence, to the papers, and whatever you want. That, I think, could make a huge difference. And it's something that the AI community could use productively, much more so than the rubbish that goes into ChatGPT.
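As a purely illustrative aside (not from the panel), a record in such a gold-standard database might pair each vetted claim with its verdict and direct pointers to the underlying papers. Every name and entry in this sketch (FactRecord, facts, look_up, the placeholder DOIs) is hypothetical.

```python
# Minimal sketch of the "gold standard" factual database idea:
# each vetted claim links directly to the evidence behind it.
from dataclasses import dataclass, field

@dataclass
class FactRecord:
    claim: str                 # the statement being vetted
    verdict: str               # e.g. "supported", "refuted", "contested"
    evidence: list = field(default_factory=list)  # pointers to primary papers

# A tiny in-memory "database" keyed by claim ID; entries are placeholders.
facts = {
    "vaccines-dna": FactRecord(
        claim="mRNA vaccines alter human DNA.",
        verdict="refuted",
        evidence=["doi:placeholder-1", "doi:placeholder-2"],
    ),
}

def look_up(fact_id: str) -> str:
    """Return the verdict for a claim plus pointers to its evidence."""
    record = facts[fact_id]
    sources = ", ".join(record.evidence) or "no sources on file"
    return f"{record.claim} -> {record.verdict} (see: {sources})"

print(look_up("vaccines-dna"))
```

The point of the structure is the evidence field: a trusted answer is only as good as the papers it can point back to.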

Sudip Parikh (07:30:26):

I thought we were going to get 20 minutes without saying AI. All right, Donna.

Dr. Rich Roberts (07:30:30):

Oh, sorry.

Donna Strickland (07:30:32):

I think what I’m hopeful for is the fact that right now this is a global problem, but I also think globally we’re looking for solutions. And so certainly, here at Waterloo with our network, others have reached out to us to say, how can we network together? And now, even though Rich doesn’t like Zoom, luckily because we have Zoom, we can have global meetings, and start talking together, and each doing our own types of experiments and finding out what works. Just like all other science, we can get it out there in the global sphere and maybe start changing the dial. That’s why I’m hopeful it’s going to come together.

Sudip Parikh (07:31:08):

Thank you, Donna. One of the speakers earlier said that one of the solutions was wisdom, and as you can see, there's a lot of wisdom to the right of me on this stage. So thank you, thank all three of you for the panel. Thank you all.

Saul Perlmutter (07:31:47):

Thank you.

Speaker 26 (07:31:48):

The internet has brought humanity together. It's unleashed our creativity, entertained us, educated us, and compelled us to take action for social justice. But our need for connection has been hijacked by big tech companies. They learned fast how to monetize our thirst for information. They do whatever it takes to keep us scrolling, clicking, consuming. The outcome: polarization, online hate, failing democracies, and social media and search engines that have broken our trust in the truth and in each other. But we have a plan.

Maria Ressa (07:32:37):

This is it. 10-Point.

Speaker 26 (07:32:38):

Through it, we can detox algorithms, rebuild independent journalism, and reclaim our right to hope, not hate. By ending big tech's unchecked power, we can make online experiences safe, unite communities, and transform the internet to serve people, not profit.

Speaker 27 (07:33:25):

Our next speaker, and actually the last one of our day together here, is an absolutely fearless defender of freedom of expression and the 2021 Nobel Peace Prize laureate. We all want to have hope, but we have to have a plan. Here's our plan. Let's welcome Maria Ressa.

Maria Ressa (07:33:43):

Hi. Please don’t make me cry before I have to speak for 20 minutes. Oh my God, thank you. Thank you so much. That was actually the very first thing I wanted to say. There are so many people in this audience who have helped us stay alive, stay out of jail. Thank you, thank you. The only weapon a journalist has to fight back is to shine the light. Without you, it is impossible to do it. And this is actually what I hope we’ll do in the next 20 minutes or so, is to show you the micro and the macro you heard from Nobel laureates, you heard from what an incredible day. We have the context of everything that is the problem. But I’m going to kind of help focus it a little bit more because the tech has gone exponential, exponential. And we’re still moving at glacial speed. So I think the first step is please use disinformation, not misinformation. Because misinformation is like a game of telephone, it gets distorted. People make mistakes. We make mistakes. Disinformation is when power and money uses the existing information ecosystem to insidiously manipulate the cellular level of our democracy, which is each of us. Disinformation that leads to information operations and information warfare. In the Philippines and in Southeast Asia, we used disinformation. I think this is my theory in America and in the West, part of the reason that you stay with misinformation is you have a very powerful tech lobby that uses that word to whitewash what exactly is happening. So Robert said earlier that three-second sound bite. Well, let me give you the history of the Philippines with this map.

(07:36:33)
This map here, our history: 300 years in a convent, 50 years in Hollywood. One sentence. Okay? So my ask of you in the next few minutes is really the same question we have had to confront, because the window to act is closing. At the worst of times, it was very cathartic to write. When we went into lockdown, I didn't realize how exhausted I was, and I just kept writing. My editor cut half of what I wrote: 200 pages, he cut 200 pages. It came out around the same time as ChatGPT, actually, as generative AI came out. But since then it's been translated into about 20 languages. There's the French, there's Japanese, there's Korean, there's Mandarin coming in June, and Ukrainian; I just signed.

(07:37:39)
But I think it is this left brain, right brain approach that I tried. How will we move you to act? How can I move journalists to act? It's actually really simple. It's not about me, it's about you. It's about your courage. It's about your ability to look at the world, listen to all the problems and all the nuances of it, and then just say: this is what I stand for, because silence is complicity. I'm going to remind you, in the Nobel lecture in 2021, I talked about how the war isn't Russia invading Ukraine. Yes, there's conventional war, but the war is in your pocket. It is a person-to-person defense of our values, of our democracy, and each of us has to win it. And the second one is: how does it work? Madeleine Albright used to call it slicing the salami. This is how fascism works.

(07:38:42)
In my case, the two biggest stories that I've worked on were both about how they tested tactics to attack America in my country. The 9/11 attacks: when that happened, it was a memory for me. In 1995, the first pilot recruited by Al-Qaeda had been arrested in the Philippines, Abdul Hakim Murad. And there was an interrogation document in my closet that talked about a plot to hijack planes and crash them into buildings. He even named the buildings: the World Trade Center, the Pentagon. Then he added a building that hadn't been attacked, the Transamerica in San Francisco. So that was the first. What's the second one? Cambridge Analytica. Americans had the most compromised accounts. What's the country with the second-highest number of compromised accounts? The Philippines. So I use this phrase, death by a thousand cuts, because this is how we lose our rights.

(07:39:47)
It's like, you're not going to stop because, "Oh, well, we didn't get access. So we'll just stay quiet because maybe…" This is where the fear is used against us. "… maybe if we make too much noise, we won't be able to come in again next time." Democracy is lost by giving up. You have to hold the line of our rights. So that's the first part. And it's ironic: asymmetrical warfare. You heard about this a little bit. This is what Al-Qaeda used to attack America: we will come after America with death by a thousand cuts. It's ironic that the very tactic we fought in terrorism is now being used as asymmetrical warfare against each of us. And then the last one, as I go from macro to micro, is us. We are standing on quicksand. Don't worry, I'll only be really bad for half of the time I talk to you.

(07:40:50)
And this is what I suggest. This is what we did at Rappler. Imagine the worst case possible, whatever it is you are most afraid of. You hold it, you touch it, and you embrace it; you rob it of its sting. Because if you do that, then we can move forward. Nothing can stop you. Embrace your fear. The fears now with generative AI, man, they're off the scale, all of the science fiction films that we've seen. It's worrisome. All right, so what's our 10-point action plan? Tomorrow we're going to have two hours to go over it with you. But I think here, three buckets. This is something that we pulled together; nearly everyone who spoke with you today had bits and pieces of it. But the reason we also focus on disinformation is because it is the business plan. It is money, it is power.

(07:41:49)
And this business plan is the first thing we need to stop to reclaim our rights. Stop surveillance for profit. What that means is, and I'll show you some of this: on this phone, if you're on any of the social media platforms, every single thing that you post, every data point that's collected about you, is used to create a model of you. The tech companies will say that, but what they don't tell you is that that model, replace that word model with clone, they clone us. And then they say, because they used AI to do it, machine learning, that they own that clone. Did you give your permission? No. Stop surveillance for profit. That's the first step, because everything else, fear, anger, and hate, follows from it. The second: stop coded bias. Coded bias means that if you were marginalized in the real world, if you are a woman, LGBTQ+, if you are from the global south, you are further marginalized online.

(07:42:57)
And what happens? Coded Bias was actually a film that was in Sundance the same time as A Thousand Cuts. And this was about a woman; her name is Joy. She was an MIT student. She was given an AI assignment, except that she couldn't do the assignment because the AI wouldn't recognize her face, because she is a woman and she's Black. So what she did was put a white mask on, and then she did the assignment. Stop coded bias. That's the second point. The third is the part that has always been hard to explain. I think when news organizations had power, we never really explained the process. Journalism as an antidote to tyranny, because only journalists are foolish enough to stand up to a dictator and say, "You are wrong." I had a 26-year-old reporter stand up to President Duterte, who was towering over her, trying to bully her. And she kept asking in a very respectful way. This is what journalism does.

(07:44:11)
And if you look at the trends, not only has democracy declined globally, journalists have had to pay more and more: jailed, attacked, killed, every year for the last decade. So we can't do this alone. Let me quickly go through this, because I see the time. It's all about data. And if I say this to our kids, they'll say it's really boring. But think data, not Excel sheet, but your clone. You, in data. That is exactly what we're talking about. And that huge shift is phenomenal. It changed the entire system, but we didn't know it. The two buckets you heard a little bit about earlier: we've been talking about machine learning, the AI of machine learning. This is what builds our clone. What is that? All of our clones, if we were all on social media, are pulled together. That is the mother lode database that's used to micro-target. Micro-targeting is not advertising, the old advertising; micro-targeting is taking your advertising dollars.

(07:45:20)
And that micro-targeting matches a message to your weakest moment. It's almost like you went to a psychiatrist and told your psychiatrist your deepest, darkest secret, and the psychiatrist went out and said, "Who wants Maria's deepest, darkest secret?" and sold it to the highest bidder. So that's the first one. And you heard from Chris: we haven't solved anything. No one has been held accountable, it still continues, and it is getting worse. The second bucket, generative AI, is not just exponential like the first one, but exponential, exponential. And we've played with this stuff at Rappler, because if you don't, again, that arms race, if you don't use it, are you going to be left behind?
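
Mechanically, the "clone" and the auction she describes can be thought of as a feature vector plus a bidding rule: every engagement event updates a per-user profile, and the message whose bid, weighted by inferred susceptibility, is highest wins. Below is a minimal illustrative sketch of that loop; the field names, scoring, and auction rule are invented for clarity, not any platform's actual schema.

```python
# Illustrative sketch of the profile-and-auction loop described above.
# Field names and scoring are invented for clarity, not any platform's schema.
from collections import defaultdict

profile = defaultdict(float)  # the "clone": interests inferred from behavior

def record_event(topic: str, dwell_seconds: float) -> None:
    """Every engagement (view, pause, like) nudges the inferred interests."""
    profile[topic] += dwell_seconds

def auction(bids: dict[str, float]) -> str:
    """Pick the message whose bid, weighted by susceptibility, is highest."""
    return max(bids, key=lambda topic: bids[topic] * (1 + profile[topic]))

record_event("health anxiety", 42.0)  # a weak moment, captured as data
record_event("sports", 3.0)

print(auction({"health anxiety": 1.0, "sports": 1.2}))
# -> health anxiety: the vulnerable topic wins despite the lower bid
```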

(07:46:06)
But the question really is: where are the governments that have abdicated responsibility for protecting the public sphere? Why are the journalists, the politicians (it's hard to govern in this day and age, right?), the scientists, the researchers, why are we left alone? And I'll show you some of that stuff. We'll come back to generative AI. Let me quickly skim you through this, because this is the problem. In 2014, content and distribution were separated. News organizations lost our power, and the tech companies, American tech companies, became the new gatekeepers. We're frenemies. I really love the tech. I drank the Kool-Aid. That was how we set up Rappler: in 2012, we first set it up on a Facebook page. The second, and this is the original sin if we're talking values: the systems that connect us, all of them, spread lies faster than facts. Six times faster.

(07:47:13)
And this is a 2018 MIT study. So if you're a journalist, your hands are handcuffed. Lies laced with anger and hate spread. This is a distribution issue. In the Nobel lecture I talked about how it is fueling the worst of humanity. And then finally, look at what this has become, with information warfare in your 2016 elections. You already have the data out, but nothing happened with the Mueller report. Look at all the footnotes in the Mueller report; it's really interesting. This is now a behavior modification system. This is warfare. So that's why you have to win it. Sorry, last slide before I take you through the solutions. We've talked about behavior modification, and it goes two ways. It pounds you. You heard Rana talk about being pounded. And the end goal of all of that is to take your voice out. But the other part is astroturfing. It's to make this side, like if you see part of the attacks against me or against Rana, it's to make you disbelieve. This is why facts are debatable.

(07:48:22)
So the three sentences I've said over and over, they make me feel like Cassandra and Sisyphus combined: without facts, you can't have truth; without truth, you can't have trust. Without these three, we have no shared reality, we have no democracy. So imagine: you're here in the room with me, and you know we're in Washington, DC. But if I were to seed information operations and pound this part of America with a million tweets, all with blue verified checks, that said this is happening in New Delhi, people outside would think it's happening in New Delhi. Three layers is how it comes: the personal, individual; the sociological, in groups, because we act differently in groups; and then the one that we haven't even begun to talk about, emergent human behavior. We cannot know what we are going to become when we are being pumped full of toxic sludge, when we cannot believe in the goodness of human nature.

(07:49:34)
You have been very good to us. So I believe in the goodness, but it is so hard to do that in our information ecosystem. Aside from the fact that it's addictive, dopamine, our attention span has gone from 10 seconds to 3 seconds. All of this stuff, I worry about our kids. The surgeon general just announced that social media is harmful to our children. We kind of knew that a decade ago. You have to move faster. And the last part, the last two minutes, this is the big-picture stuff. What is happening all around the world? I'm going to reference both the Freedom House report and V-Dem from Sweden. V-Dem pointed out that last year, 60% of the world was under authoritarian rule. I thought, "Well, that includes India and China. So that's okay, 60%." This year it's 72% of the world.

(07:50:40)
You're getting to a tipping point where these trade sanctions are not going to work, because there will be more authoritarian countries than there will be democratic countries. I want to light a fire. Our window is closing. We, with our information ecosystem, are electing illiberal leaders, democratically. I know it in my country. And they are crippling the institutions from within. You now know how easily that can happen. But they're not staying in their own countries; they're allying globally. Would Belarus be a democracy today if Russia hadn't come in to help? So this is it. Our window is closing. I'm going to quickly go through this. I mean, look, I wanted to show you what we lived through, Rana and Patricia Campos Mello, who is also here, from Brazil. This is the first network. This is what we did at Rappler to try to figure it out: go to the facts. What are the networks that are sharing the crap? I can say that in the National Academy of Sciences.

(07:51:49)
So this is the first generation, but this was the same network attacking journalists, opposition politicians, human rights activists, and it was so organized that they had messages out by demographic. I'll show you some of the attacks that I got there. Memetic warfare. So, everything about me. It's a good thing I'm not corrupt, because they couldn't find anything. But they did things like the way I looked, my sex, so gender, gender disinformation. I have eczema. Can you imagine eczema weaponized? I have really dry skin, but take a look, it's not bad. I have really dry skin. But look what they did to all the photos. And then they made up a name for me. Don't drink anything right now. They called me scrotum face. Dehumanization. And this becomes dangerous, because that is the next step to violence. Online violence is real-world violence.

(07:52:52)
America looks for reasons why the shootings happen. We are being pumped with toxic sludge. So it took like a month; my mom sent me this one. And for years, I didn't want to tell you about it. I didn't want to speak about it, because it makes you ashamed. But I realized, if I didn't tell you, you wouldn't know. I mean, who wanted personalization? Who wanted our own separate realities? When I heard that in 2014, I thought, "Oh my god, that's going to be problematic. What happens to the public sphere?" I'm going to say something that's not quite politically correct. You know what they call a place where different people believe in different realities? It's called an insane asylum.

(07:53:48)
We're close. I mean, they still kept going, right? So it took a few months before they took that down. But then A Thousand Cuts, again, my skin is a little bit better than that. But it's public. It astroturfs. VOVph has a Facebook page. And then they did this. Creative, right? But I'm not alone. And this is where UNESCO and The Chilling come in; this is a 300-page book: women journalists under attack, having to deal with things like this. And you can see the statistics; they are horrendous. 73% have experienced online abuse. 25% have gotten death threats. And of that, 20% have been attacked in the physical world, in the real world. Online violence is real-world violence. This is the first big-data case study done by the International Center for Journalists, based in DC, with UNESCO. They took half a million attacks against me. I was getting 90 hate messages per hour. 60% were meant to tear down my credibility, 40% meant to tear down my spirit. With your help, it didn't work.

(07:55:21)
So let me fast-forward and bring you back to DC. You heard about Brazil and January 8th. But this is the same thing. This is the anatomy of how you change reality with information operations. It takes a few years, but it's been a few years, right? So if you take a look, this is the violence on Capitol Hill, #StopTheSteal. The meta-narrative was seeded. This is work by the Election Integrity Partnership. It was seeded on RT a year earlier, August 20th. Mainstreamed by Steve Bannon on YouTube in August 2020, so we're getting closer to your election date. Then you had the

Maria Ressa (07:56:00):

super-spreaders. QAnon dropped it October 7th, and then President Trump came top-down. It's the same thing that happened to us in the Philippines: bottom-up exponential lies, journalist equals criminal. A year later, President Duterte said the same thing about me and Rappler in his State of the Nation address; I immediately tweeted. Then a week later I got my first subpoena, and they just kept coming. In 2018, the government tried to shut us down. In 2019, I had eight arrest warrants in about three months, and then two more followed. Yeah. Like Rana, I wound up spending more time with lawyers than with journalists. This is how it happens. You heard our Yale behavioral scientist talk about how memories can be pulled out and manipulated. We are a country where we overwhelmingly elected the only son and namesake of the man who was our dictator until 1986, a kleptocrat who stole 10 billion US dollars, in 1986 dollars.

(07:57:11)
I didn't mention the name Marcos, did I? President Marcos, then and now. We now have his son and namesake as our president. He was elected in two ways. The first was information operations that began in 2014. The second is that, like in other countries around the world, the world doesn't change overnight: his supporters were still there, and dynastic families helped bring out the vote.

(07:57:39)
All right. Let me quickly go through the solutions. I promised solutions. How do you rebuild trust? At Rappler, our elevator pitch was: we build communities of action, and the food we feed our communities is journalism. Our three pillars, I like the anvil drop, I use that all the time: technology, journalism, and community. You are a powerful community. Let me take technology. In technology, legislation. And I joke that the EU is winning the race of the turtles.

(07:58:19)
We can't wait more years. Now the Digital Services Act and the Digital Markets Act are kicking in this year. Please: on May 31, there's a deadline for who gets access to real-time data. Please just go take a look and weigh in on this. The first is, in the long term, it's going to be education. We heard that here. In the medium term, it will be legislation. That's where it is, so we have to look at that. The second is, I've given up on big tech, even though we still need big tech, but we're building our own platform. It took us much longer, because we were under attack. In our first year and a half, we spent a million dollars on legal fees. That is impossible for a little group like us; we're only a hundred people strong. Lighthouse will come out by Q3 this year. This will give oomph to: how do you build communities of action? How do you have safe spaces to actually speak to each other, to debate, to do the thinking-slow part of both governance and democracy?

(07:59:28)
The last one I want to tell you about is interesting, because this will be announced formally on October 2, but I've been okayed to tell you about it. The Institute of Global Politics is being launched this fall at SIPA, the School of International and Public Affairs at Columbia University. It will be led by the Dean of SIPA, Keren Yarhi-Milo, and Hillary Clinton. The end goal is really, again, don't go into the politics, because that's part of the cascading failure. The original failure is that lies spread faster, remember. When lies spread faster, we go into stranger things. We go into the upside down. We're living in the upside down. In that, that's where your politics, all our politics, become a gladiators' battle to the death. In my book, there's one algorithm that did that. It's the recommendation of friends of friends, for the growth of your social network. It's chapter seven; it's called How Friends of Friends Broke Democracy.

(08:00:36)
At the Institute of Global Politics, our goal is to bring engineers together with lawyers, together with policy people, together with scientists, to try to… I'm excited about the IPIE. We need to pull our efforts together, but we need the short term. Let me do the short term. Journalism: we have no business model. It's dead. Advertising is dead. Micro-targeting, for you in big companies, you are using micro-targeting because it's better ROI. What does that mean for journalists? A year ago we began the International Fund for Public Interest Media. I decided to co-chair it along with Mark Thompson, the former president of the New York Times. He's big and tall and a white male, and I'm short and little and a brown female. We raised $50 million in a year of new money.

(08:01:38)
That is for those journalists, especially in the global south, who are putting their fingers in the dam to stop it from falling on them, and still doing their jobs. There are journalists there, but the incentive structure of our information ecosystem rewards bad journalism. Think about that. The second is, I see CPJ, RSF, ICFJ, the global coalitions who have helped journalists who are under attack, the Hold The Line coalition. I know Courtney, you're here; she was one of the founders who pulled this together. We have to help journalists. Evan is in Russia. I mean, why do we have to sacrifice so much to try to give you the facts?

(08:02:26)
Finally, the last one. I told you I wasn't going to leave you depressed. This is Project Agos. The Philippines is the third most disaster-prone nation globally. Starting in 2013, our very first effort in crowdsourcing was really climate change. We built a tech platform that we handed to the government, and we did it with help. The government accepted it. We have an average of 20 typhoons every year. This is Project Agos, which was something that worked from 2012 all the way to 2016. What we did there is that the tech platform was created for crowdsourcing, for everyone. Anyone who sees someone who needs help can use the hashtag, and it will go through. Then the journalists, we did three phases: before, during, and after. How do you prepare for the typhoon? How do you respond while it's happening, and recover?

(08:03:24)
I’m going to show you something that actually happened in the Philippines. This is the power of the crowd if we’re not manipulated. This is Project Agos.

(08:03:37)
Typhoon Glenda, known internationally as Rammasun, intensifies as it moves closer to the [inaudible 08:03:43] area. On this platform that we handed to the government, and run with the government, you actually see the path of the typhoon. These lidar maps that we had, it took eight months to get our Philippine government to actually release them to the public. We did. Now, if you're in a landslide area, you're going to have to evacuate. These all came from citizens. These are all photos and videos that they uploaded onto this platform, even as we followed the path of the storm. It was pretty incredible what journalists can do with our communities. We did both face-to-face and remote, virtual. The end goal is to work with government, because you don't have to always fight government, and because government can't do it alone, especially not in my country.
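
A minimal sketch of the crowdsourced ingestion the demo describes, where posts tagged with a rescue hashtag become structured, mappable reports. The hashtag and message format here are assumptions for illustration, not Project Agos's actual specification.

```python
# Sketch: turning hashtag posts into structured disaster reports.
# The hashtag and the "location: need" format are illustrative assumptions.
import re
from dataclasses import dataclass
from typing import Optional

HASHTAG = "#RescuePH"  # hypothetical rescue hashtag

@dataclass
class Report:
    location: str
    need: str

def parse_post(post: str) -> Optional[Report]:
    """Parse '#RescuePH <location>: <need>' into a report, else None."""
    match = re.search(rf"{HASHTAG}\s+([^:]+):\s*(.+)", post)
    if not match:
        return None
    return Report(location=match.group(1).strip(), need=match.group(2).strip())

print(parse_post("#RescuePH Marikina: family of 5 on rooftop, needs boat"))
# -> Report(location='Marikina', need='family of 5 on rooftop, needs boat')
```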

(08:04:37)
The third part is the response. That's where it is. Then recovery. The recovery comes in, and we all watched this. It included the Red Cross, which had 14,000 volunteers. The goal is to turn the thousands of deaths per typhoon into #ZeroCasualty.

(08:05:04)
The Duterte administration forced this minister who signed it with us to resign. Now under the Marcos administration, we are building this again. I have three cases left. I could still go to jail for a decade or so, but I hope. I hope. You have to hope, but you’ve got to have a plan. I’m going to end with this. This is, I think, something that we need in every democracy around the world with an election, because you cannot have integrity of elections if you don’t have integrity of facts.

(08:05:44)
This was something we did for our May 2022 elections. We did it with the help of the Google News Initiative, and the data portal was pulled together by a San Francisco startup, [inaudible 08:05:56]. What we did is this whole-of-society approach, because in the medium term, it's legislation; in the short term, it's just us. You have to take what is there with social media, the way it exists right now, and make it work your way, but don't lose your values. How do we do that? Well, this is what we did, because we showed it: 16 news organizations working together for the very first time, because we tend to compete with each other. Those fact checks that we did, they're really boring and they don't spread. I bet you don't read them. What we did was the mesh layer. Now, this is the influencer marketing campaign for facts. We had a total of almost 150 different groups. The mesh layer: their task every day was to take the identified fact checks and share them with their network, but add emotion. They couldn't use anger. What we found from the data, and I will show you this, and this includes the church, the Philippines is Asia's largest Roman Catholic nation, and it includes business; those businesses finally kicked in. What we found was: which emotion spreads as fast as anger? Inspiration. Inspiration. Think about that. A little tougher, but inspiration spreads as fast as anger.

(08:07:28)
The data pipeline that we collected here went all the way up to our academic partners. There were eight of them. Their goal every week was to tell us what meta-narrative is being seeded, who is being attacked, who is benefiting. They went public before they did peer review. That helped. That helped. The accountability layer is the law. How do you have integrity of facts if you don't have integrity? Sorry. You cannot have integrity… One more time. You cannot have rule of law if you don't have integrity of facts. Our lawyers finally kicked in. They were far more excited than we were, the journalists at the bottom of the pyramid, who were exhausted. They filed more than 20 cases in three months to protect this pyramid.

(08:08:21)
I want to show you some of the data, since we're at the National Academy of Sciences. Look at this stuff. I'm not on TikTok, but Rappler is. There was a Creative Commons license, and everyone took what the news groups did and did it their way. Influencers across the board. Please don't call a journalist an influencer.

(08:08:43)
Then we did this. We found from the data that if we did a daily influencer marketing campaign for facts, that we segregate, segregate is the wrong word, that we identified four clusters that spread the information. You can see here, we mobilized… What are boundary spanners? A boundary spanner would be Invil between the Philippines and Norway. I know Invil; Invil is Norwegian. We're boundary spanners to our communities. We looked at boundary spanners and we crafted a data-backed influence strategy, because it isn't about messaging, it's about distribution. It's about using the design of the platform today without losing your morals. This is what we did. We shifted them. Every single partner of ours, the nearly 150, we all grew. We got stronger. That stronger-together actually did work.
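
One common way to operationalize "boundary spanners" in network analysis is betweenness centrality: accounts that sit on many shortest paths between otherwise separate clusters. Below is a minimal sketch using networkx; the graph, the edges, and the tooling are assumptions for illustration, and the talk does not say what Rappler's actual pipeline used.

```python
# Sketch: ranking candidate boundary spanners in a sharing network.
# Edges mean two accounts reshared each other's posts; the data is invented.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("newsroom", "fact_check_group"), ("newsroom", "student_org"),
    ("fact_check_group", "church_network"), ("student_org", "church_network"),
    ("church_network", "diaspora_group"),   # the bridge between clusters
    ("diaspora_group", "overseas_community"),
    ("overseas_community", "business_forum"),
])

# High betweenness marks accounts that bridge clusters: boundary spanners.
scores = nx.betweenness_centrality(G)
for node, score in sorted(scores.items(), key=lambda kv: -kv[1])[:3]:
    print(f"{node}: {score:.2f}")
```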

(08:09:45)
I'm going to end it here. The blue are the original partners. The orange are boundary spanners. The yellow is the ripple effect. You can see, we succeeded in taking over the center of the Facebook information ecosystem. While there's still no law, don't give up. We organize in a different way. Stop being a user and become a citizen. Identify what civic engagement means in the age of exponential lies.

(08:10:20)
I'll leave you with this, the 10-point action plan. We have to turn that constitutional level into tactics. Let me just say the one thing: 2024 is a tipping point to either fascism or democracy. Sorry, I don't use that word lightly. It is because we will tip at that point. The three key elections we are looking at: Indonesia, the world's largest Muslim population; India, you heard from Rana, the world's largest democracy; and you here, the United States. Our tipping point. Please.

(08:11:04)
Thank you.

Marcia McNutt (08:11:50):

Wow.

Kelly Stoetzel (08:11:52):

What a day.

Marcia McNutt (08:11:52):

Today certainly exceeded my expectations. How about you?

Kelly Stoetzel (08:11:56):

Absolutely.

Vidar Helgesen (08:12:05):

This hall has been a perspective generator today, and an inspiration generator, as have the hallways. It's been absolutely amazing. We've heard from science about the problem and the solutions. We've heard from activists who are suffering the problems and shaping the solutions. There is hope for truth and trust.

Marcia McNutt (08:12:25):

Yes. I just wanted to list a few of the things that I learned today. I learned that I can't trust my senses, I can't trust my memory, and I can't trust so-called news that comes over social media. But I also learned about so many positive things we can do. First of all, I learned about the wisdom of the crowd, the wisdom of the crowd to do the right thing, as long as they aren't given perverse incentives. This is something we can really focus on. I learned about the power of education, and how all of us can start with our own families, our own communities, and make sure that education is the cornerstone for every citizen. And of course, we heard from Maria just now about ways that she has beaten the system at its own game. Amazing.

Vidar Helgesen (08:13:42):

It's not over yet. Tomorrow there's going to be more planning. Maria talked about hope with a plan. Tomorrow and the day after, there will be much planning. Tomorrow we'll have physical invitation-only sessions, but it'll all be streamed, so digital participation is secured. We'll also have a citizen deliberation activity that we do in collaboration with Stanford. Registration for that is still open, so do register for quite an exciting exercise. Then there's day three.

Marcia McNutt (08:14:16):

Then there’s day three where we have a number of incredible partners who are going to be helping us with dissemination and outreach of all the insights we’ve had from day one and day two. I hope many of you will also participate in that.

Vidar Helgesen (08:14:33):

What remains is to thank the speakers of today, the participants of today, our moderator, Kelly, and Sumi, our digital presenter, who is still going to be with us for two more days, and some more people to be thanked.

Marcia McNutt (08:14:56):

I want to give a special thanks to all of the production people behind the scenes who did a fabulous job. They transformed an early 20th century building into a studio that could actually deliver on the promise of today.

Kelly Stoetzel (08:15:22):

They did. They did. Thank you to all of you. As you leave here today, we would love for you all to be thinking about what you learned today and what you will now do. As you head out into the big, wide world, one thing that we would love for you to do right away is seize the day.

Sumi Somaskanda (08:15:47):

We're back in the digital studio now. Thank you, everyone, and to Maria Ressa as well. She'll actually be here in the digital studio in about 15 or 20 minutes' time, so stay tuned for that. All of you have been so active on our livestream. You've shared your comments, your thoughts, your inspiration, your encouragement. Thank you for that. We couldn't bring everything into our livestream program, but we have been collecting all of your thoughts, and we're going to make sure that they are included in the continuation of this project. Thank you, again, for staying engaged in everything that the summit has been about.

(08:16:20)
We’re going to have a quick recap of what we saw on stage today. Then we’ll get to our final discussions here in the digital studio. Take a look at our video.

Anita Krishnamurthy (08:16:27):

Just amassing a body of knowledge is not sufficient. It’s crucial, but it’s insufficient. We have to learn how to apply that knowledge in different contexts. We have to see how it intersects with culture and society and values and understand that those might clash sometimes, and that we need to be prepared to continually unlearn and learn new things.

Rachel Kuo (08:16:53):

All right. I don't actually understand the biology behind how mRNA vaccines work, but I trust the scientists who do. We're so specialized that in a lot of cases, if I read a paper about the vaccines, I wouldn't even know how to differentiate a legitimate expert from an illegitimate one. What I do instead is look for markers of trust. I might turn to friends and colleagues at Johns Hopkins, or maybe look and see if the author is a member of the National Academy, to try to figure out: is this a trustworthy paper or not?

Tristan (08:17:24):

TikTok is literally competing with Instagram. If you post a video on Instagram, you get a hundred views on average, but if you post it on TikTok, you get a thousand views on average. If you're a teenager, where are you going to post your next video? TikTok, the one that gives you more reach. It's a race to see who can inflate your ego and give you the most instant sharing as fast as possible. We've all seen the effects of that. Fake news spreads six times faster than true news.

Dr. Rich Roberts (08:17:51):

I think kids, if they're educated well from a fairly early age, and I would say nine or ten is a good time to get started, if they learn what the scientific process is, they can understand what people are saying to them. Provided you don't have different people coming at them. I mean, the surgeon general in Florida thinks that vaccines are bad. Where on earth did he get his degree? I do not understand it. Maybe we should close the university that educated him.

Tristan (08:18:27):

There is a right to science in the Declaration of Human Rights. We should be bringing science to everybody. Thank you.

Sumi Somaskanda (08:18:39):

Those were just some of the insights that we saw on the stage this afternoon. We've been wrangling some of those speakers into the digital studio so that they can share more of their thoughts with us. Two of them are sitting with me right now: Sheldon Himelfarb, CEO of PeaceTech Lab, and Phil Howard, professor at Oxford University and the University of Washington. Welcome. I watched your segment on stage, which was really interesting. What I wanted to do is just dive a little bit deeper into what you talked about. Sheldon, I'll start with you. You talked about linking up with other researchers, essentially, across the globe to tackle mis- and disinformation, and that that can be a really effective front against this problem. How would that play out? How would this panel, let's call it, of scientists and researchers who are working on this issue actually tackle something like generative AI being used for misinformation in an election, or disinformation in an election?

Sheldon Himelfarb (08:19:36):

Well, I'm going to toss that to Phil, because he is our professor from Oxford who is working directly with these brilliant research scientists already. This is just to say that it is already a work in progress. We have 200 research scientists from around the world, some of whom are experts on generative AI. They are finding new answers to these really complex, challenging questions every single day. What our presentation was about was the formation of a new organization, the International Panel on the Information Environment, in order to benefit from their collective wisdom. These things are happening. If anything, I saw that across the entire day today: brilliant research happening all over the place. We need to pull it together. Phil is really in touch with all of these folks.

Sumi Somaskanda (08:20:33):

Yeah. Tell us about that, Phil.

Phil Howard (08:20:34):

Certainly, and this is a great question. It looks like there's an emerging consensus that it's going to take both humans and machine-learning systems to identify problematic content. You wouldn't want to turn it all over to AI to sort out what's high-quality information and what's low-quality information, and you wouldn't want to just have cohorts of humans making those decisions. It looks like some combination. Keeping humans in the loop will help with public understanding of those kinds of issues.
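
A common shape for that human-machine combination is confidence-based triage: a model scores each item, the clear cases are handled automatically, and the uncertain middle band is routed to human reviewers. The sketch below is one way that could look; the thresholds and labels are illustrative assumptions, not anything the IPIE has specified.

```python
# Sketch of confidence-based triage: machines handle the clear cases,
# humans review the uncertain middle band. Thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class Item:
    text: str
    score: float  # model's estimated probability the item is low quality

CLEARLY_FINE = 0.05  # below this, treat as high quality automatically
CLEARLY_BAD = 0.95   # above this, flag automatically

def triage(item: Item) -> str:
    if item.score <= CLEARLY_FINE:
        return "auto: keep"
    if item.score >= CLEARLY_BAD:
        return "auto: flag"
    return "route to human review"  # keep humans in the loop

print(triage(Item("plausible but unsourced claim", 0.60)))
# -> route to human review
```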

Sumi Somaskanda (08:21:02):

These are examples, of course, of the research you're working on, and of researchers and science moving very quickly, right?

Phil Howard (08:21:08):

Right.

Sumi Somaskanda (08:21:09):

Can it move quickly enough to keep up with this technology that’s moving very, very quickly?

Phil Howard (08:21:13):

The technology moves quickly, but if you can get people to talk and study and work to deadlines, to come up with reasonable answers in six or nine months, that is what the IPIE will be about: a focused question, working to a deadline. That is possible. We've seen several great examples today of how science can move quickly.

Sumi Somaskanda (08:21:33):

What standard do you think that can set?

Sheldon Himelfarb (08:21:35):

Well, I just think that we've seen the standard in the last couple of years with the mRNA vaccine, the record-breaking creation of a vaccine, and with the new space exploration, and the creation of generative AI, which you just asked about. We're living in a new era of science. Scientists are leading the way in how you can reach actionable solutions in a timely fashion. We really need to start working more closely with those folks in order to deal with this particular crisis.

Sumi Somaskanda (08:22:19):

Sheldon, let's close that loop. The scientists are coming up with these actionable solutions; those need to be passed on to the people who can take action, so that means policy makers, right?

Sheldon Himelfarb (08:22:30):

Indeed, it does. That, of course, is our Achilles heel. The legislators and the regulators have their own speed at which they operate, but I think that we have an opportunity here. Look at the discussion around AI that's happening right now. I've been working with technology for social good for a long, long time, but we've also seen the power of technology to create harm and hatred. It's because we have rarely paused to think about the unintended consequences of new technologies. Well, the debate right now, the discussion right now, about AI is really different. We are really seeing a gigantic "Let's take a break. Let's think about this before it creates more harm than good." We have a moment. I think we're trying to harness that moment with the IPIE.

Sumi Somaskanda (08:23:37):

It sounds like this is a moment that’s also undergirded this entire summit today. What do you take away from this summit? This first day of the summit. It’s three days, of course.

Phil Howard (08:23:46):

My takeaway from the first day is that there's a lot of energy behind involving journalists and supporting journalists in the difficult work that they do. There's a lot of energy and enthusiasm for getting scientists to collaborate. It's probably the case that we need to do both things. Getting the evidence right, understanding the big-picture problems, and getting our stories in front of policy makers who might actually regulate, provide that policy oversight, and build the institutions we want to have; that's going to need to be a collaborative effort.

Sumi Somaskanda (08:24:18):

For you, Sheldon?

Sheldon Himelfarb (08:24:20):

Yeah. I think my takeaway from today… I participated in this Nobel Summit in 2021, and so did Phil, with me. We were one of just a very few panels talking about this problem. To see an entire day of people all wrestling with both problems and solutions, I guess I'm gratified, and I'm encouraged that we're moving in the right direction. It has felt like everybody's problem and nobody's problem, but now I think that's changing.

Sumi Somaskanda (08:25:01):

We’ve also heard from so many people from different disciplines as well. Have you had the opportunity to connect with others here in the summit who are working in this field and exchange ideas?

Phil Howard (08:25:10):

Absolutely. It's a constant journey to meet other scholars and other journalists working in interesting parts of the world. A huge portion of the misinformation that's out there is not in English. It's in other languages, and it's being studied by researchers in other countries. Developing those international ties is going to be critical to any kind of movement.

Sheldon Himelfarb (08:25:32):

I would add to that a piece that we did put on the table today in our presentation: among those 200 research scientists already in the IPIE are people from neuroscience, psychology, anthropology, communications, computational science, data science. That's the challenge here. The information environment is multidisciplinary, but everybody recognizes that and seems to want to collaborate.

Sumi Somaskanda (08:26:06):

Last quick question to you both. What is the one message, perhaps from the stage today, that you saw or took away, that you'll carry with you and that will shape your work going forward?

Phil Howard (08:26:17):

For me, I think the Nobel is a unique platform. It is certainly one of the most trusted institutions, especially in research. The fact that the Nobel network is lending its weight to this social problem, committing itself to trying to move this hope agenda forward, that's what really struck me as important about today.

Sheldon Himelfarb (08:26:38):

Well, what struck me is that there are a lot of brilliant people out there whom Phil and I need to reach out to very quickly, to get the benefit of their outstanding research into the IPIE, because that's how we will all succeed: when the whole is greater than the sum of its parts.

Sumi Somaskanda (08:26:57):

We certainly benefited from hearing about your work today. Thank you, Sheldon, thank you, Phil, for speaking with us here in the digital studio as well, sharing a bit more on your thoughts. We hope you enjoy the rest of the summit.

Phil Howard (08:27:09):

Thank you.

Sheldon Himelfarb (08:27:09):

Thank you.

Sumi Somaskanda (08:27:10):

Now let's take a look at laureates providing an example of what misinformation has meant to them. We now have Andrea Ghez, who won the Nobel Prize in Physics in 2020. Take a look at her video.

Andrea Ghez (08:27:25):

Let me just begin by saying, as a scientist, I think it's incredibly important that scientists, in general, think very carefully about this topic of truth, trust, and misinformation. Being careful about our work and authentic to what we're trying to understand is just so critical. The universe just doesn't care about our local politics. If we really want to understand and make progress in our understanding of how the world works, how the universe works, it's really important that we be truthful with ourselves and truthful with the public.

(08:28:02)
The way in my own field that I see this raise its head most clearly is with respect to funding. Funding is a really scarce resource, and therefore scientists have to compete for it. I think this means it's really important that we communicate well, in a very accessible way. There's a really interesting boundary between making your science accessible versus overselling, or overhyping, or being disingenuous about what is actually possible. I think that's the most obvious place this arises in my field of astronomy and astrophysics. I think scientists, in general, have a really huge responsibility in this arena.

Sumi Somaskanda (08:28:56):

Thank you to Andrea Ghez for sharing that message with us. We can reflect a bit more on day one of the summit with the speakers who are with us now in our digital studio. I'll introduce them to you. On the left is Maia Mazurkiewicz, an expert on countering disinformation and on behavioral change, co-founder and head of StratCom at Alliance4Europe. In the middle here is Finn Lützow, director of digital policy at the Norwegian Consumer Council, and Melissa Fleming, Under-Secretary-General for Global Communications at the United Nations.

(08:29:28)
Thank you for making time for this conversation. I think it’s important, somehow, to reflect and digest all of those wonderful speeches and presentations we saw on the stage. I’ll start with you, Melissa. What really stuck out to you from what you saw?

Melissa Fleming (08:29:42):

I mean, I think in general it is just really refreshing, having worked on this phenomenon for so long and seen polluted information environments all over the world, to see so many forces for good here, working on healthy information

Melissa Fleming (08:30:00):

… environments. We're pulled together because we're aware of each other when we find each other online or meet individually here and there. But it really seems like a coalition is building and collaboration is building, and that if joined together, we certainly outnumber the haters, and we certainly outnumber those who are trying to inject and monetize disinformation, who want to poison our information ecosystems, which also halts human progress.

Sumi Somaskanda (08:30:41):

Finn, how about for you?

Finn (08:30:42):

Yeah. I mean, really, the breadth of interventions here showed how science came together and explained it, from the point of memory, where we saw how easily you can manipulate a memory, which I thought was fascinating; to how different cultural contexts can influence how you see the world; to how ingrained prejudices in society, whether racism or other prejudices, are built into our institutions or into the cultural mindset; to the role of big tech players in amplifying, and how they can be used to reinforce the divisions that are already in society. So I thought it was really refreshing to see the breadth of approaches to this topic that's so important.

Sumi Somaskanda (08:31:29):

And for you, Maia?

Maia (08:31:31):

For me, it's the fact that we all came together from very different backgrounds and different places, different continents even, but we are fighting and encountering the same challenge. And the solutions in the room are really coming together, making sense of bringing together truth and trust and hope. That's something we really can take out there, working together from our different spaces on the issue to counter disinformation.

Sumi Somaskanda (08:32:00):

Certainly, the speakers brought a global perspective, and we have had a global audience joining us all day online on our stream. I wanted to bring in some of the comments, because we have had so many comments, thoughts, and questions added in our stream, in the chat there. One of them, I want to share with you, says, "Misinformation is an error; disinformation is a sin." A very strong statement there. Another viewer said, "Transform theoretical knowledge into actions and behaviors. Talk less like experts and act more like persons." That's a good point.

(08:32:32)
"In education, we need to broadly rework academic curricula to emphasize media literacy and critical thinking skills." "Fund libraries. Stop cutting library budgets, and recognize the importance and the role of public, school, and health libraries in providing access to information, teaching critical thinking, and evidence-based practice." "Controversy is a great opportunity to meet other people, different ones, to experience respect and admiration, and to learn to be a better human being." And one more from our stream: "Policy makers need to work with evidence and not push political agendas, which sometimes ignore the evidence." So that was a broad spectrum of thoughts. Melissa, anything there that stood out to you, that you want to comment on?

Melissa Fleming (08:33:17):

Well, yeah, we just came out of listening to Maria Ressa, who said, "Stop using the word misinformation." So disinformation being a sin, I think we'd all agree with, because disinformation is intentional and it's being weaponized. And yes, maybe if we can focus on disinformation, we'll go a long way to solving the problem. I completely agree with more media literacy. Obviously, the tech platforms have a huge responsibility, and we have to continue to push for more transparency, accountability, the whole thing. But we've been pushing for years, and not much has changed. So there's a lot that we are actually forced to do ourselves. There's individual knowledge that can shield against succumbing to this barrage of information warfare. And again, if we can also come together, as many of them are saying, and talk like people. Because there were a lot of scientists here, and there are also a lot of people who now have titles that no one has ever heard of, experts in this field of disinformation. But we tend to speak in a language that maybe other people don't understand. So like scientists, like doctors, we mis- and disinformation experts also need to speak in very plain language.

Sumi Somaskanda (08:34:46):

Maia, what’s your thought on that?

Maia (08:34:48):

I think that's a very crucial point, and I agree with funding libraries and bringing more science into the world, but we also need to listen to each other more. We used to have conversations; we used to be fine with disagreeing with each other. I'm coming from Poland, a highly polarized country, and the US is very similar, right? Sometimes it's even hard for us to sit together at the dinner table when we come from different political views. And that's something that is not right. We really need to come together and bring each other along. It's not only about books and scientists, which is really hard to say, but this communication also needs to be simple. We need to communicate in a way that people understand, and we should be fine with that.

Sumi Somaskanda (08:35:34):

And Finn, what’s your thought on this?

Finn (08:35:35):

Well, there was one very interesting note today from a researcher from Yale, who showed the audience how the incentive structures of the social media platforms create these habits. Today you're rewarded for sharing content that's quite provocative or even fake, and it's really hard to change individuals' habits because of these reward systems they have online today. So I really think that research showed us that, whilst I completely agree that we need to work on the individual level here, we also need to recognize the responsibility we need to place at a higher level, because of the collective impact that this can have on society and on big groups of people. It's not something that can be solved at an individual level. So it's really important to have all these levels, from the individual level and libraries and all these things, to really telling policy makers to get their act together and make meaningful, forceful policies.

Sumi Somaskanda (08:36:38):

Very quick last question. I’ll just go down the panel here in 20 seconds or so, if you can. What is your hope or expectation to come out of this summit?

Melissa Fleming (08:36:49):

A joining of the best of minds and the best of forces, and taking it out of academia and think tanks and transforming it into real mobilization. I work for the United Nations, and we're working on developing a UN Code of Conduct on information integrity on digital platforms. Obviously the UN can't do it alone, so I'm making lots of contacts here, to pull together, to bring this out to the world.

Finn (08:37:24):

Yeah, I hope people are inspired. I mean, I'm so inspired by the people I met here today. I hope people see that they can do something, whether it's in their community, whether it's at their work, whether it's where they study, or, if they have a position of power, to actually try to do something, because democracies are eroding and people's trust in science is eroding. That's been clear today. So hopefully, people will go home today and really feel like they can do something.

Maia (08:37:47):

I hope that we not only think of doing but really do, and that there is awareness that it's crucial to start working right now. And I'm very happy, Melissa, that you said that. I very much hope that we can all come together and change the system, because indeed, we need to raise awareness among people so they can access information differently. But we also need to bring it to the systemic level, to the UN, with the governments, with civil society, with the media, and with all the other actors who want to counter disinformation and want to build a future that is trustful.

Sumi Somaskanda (08:38:24):

A call to action, a call to work together. Thank you very, very much, Maia, Finn, and Melissa. Great to speak to you again, and enjoy the rest of the summit.

Melissa Fleming (08:38:32):

Thank you. Great to be with you.

Maia (08:38:34):

Thank you so much.

Sumi Somaskanda (08:38:34):

Thank you. And we'll take a quick look now at the final speaker on stage, whom we just saw and who will be right here in the digital studio with us in a moment, Maria Ressa. Here's a clip from on stage.

Maria Ressa (08:38:44):

Please use disinformation, not misinformation, because misinformation is like a game of telephone; it gets distorted. People make mistakes, we make mistakes. Disinformation is when power and money use the existing information ecosystem to insidiously manipulate the cellular level of our democracy, which is each of us. When news organizations had power, we never really explained the process. Journalism as an antidote to tyranny, because only journalists are foolish enough to stand up to a dictator and say, "You're wrong." In the medium term, it's legislation. In the short term, it's just us. So you have to take what is there with social media, the way it exists right now, and make it work your way, but don't lose your values.

Sumi Somaskanda (08:39:47):

That was a short while ago on stage, and Maria's with us now in our digital studio. Maria, thank you for coming in to talk a little bit more about what we heard on stage. It was fascinating. I want to ask you about one of the things you said as I was listening. You were talking about this being an important window that is now closing, a time to act. Why do you say that?

Maria Ressa (08:40:10):

Well, we're seeing the impact of it already, right? We've talked about the social harms forever. I mean, Russian disinformation on Crimea began in 2014, and we saw this. I tackled this in my book. It was fascinating to watch the information operations come bottom-up. I got some of it in the Philippines. And then to have the same thing said by Russia's foreign minister in Geneva a day later; this was May 14th. So those very same meta-narratives that were seeded were exactly what Russia used to invade Ukraine itself eight years later, in February last year. That was first-generation AI. These information operations, this information warfare, have been making us weaker collectively.

(08:41:04)
And I'm going to quote a Russian, Yuri Andropov, the former KGB chairman who went on to lead the Soviet Union, who said, "Dezinformatsiya is like cocaine. You take it once or twice, you're okay, but if you take it all the time, you're a changed person." We are changed people. It has been years. This is part of the reason we're fragmented. We distrust everything. The end goal of disinformation is really not to make you believe something; it's to make you distrust everything. So that's our reality today.

(08:41:36)
Now you add generative AI. Now I can talk about it, because I was writing about it at the time. But think about generative AI. What makes this interesting? GPT-2, this is about how the machine thinks, word for word. That was the shift. But they have parameters that they use. So GPT-2 was 1.5 billion parameters. GPT-3 escalated to 175 billion parameters. That's not just four times; that's 1.5 to 175. GPT-4, which was just released a few months ago, is one trillion. And now they've stopped talking about it. GPT-5 will be released before the end of the year. The people who release this stuff have warned that they don't know what will happen. I forgot to say this: thanks for coming here. If they've weaponized our anger and our fear with this rudimentary AI, this new generative AI will weaponize intimacy, and it will be far more disastrous. It's exponential, exponential.
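
To make those jumps concrete, here is a quick back-of-the-envelope check of the quoted parameter counts (the one-trillion GPT-4 figure is the speaker's claim, not a number OpenAI has confirmed):

```python
# Scale factors between the parameter counts quoted in the talk.
# The GPT-4 figure is the speaker's claim; it has not been confirmed.
params = {
    "GPT-2": 1.5e9,   # 1.5 billion parameters
    "GPT-3": 175e9,   # 175 billion parameters
    "GPT-4": 1e12,    # ~1 trillion (as quoted)
}

models = list(params)
for prev, nxt in zip(models, models[1:]):
    print(f"{prev} -> {nxt}: {params[nxt] / params[prev]:.0f}x")

# GPT-2 -> GPT-3: 117x
# GPT-3 -> GPT-4: 6x
```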

Sumi Somaskanda (08:42:47):

So how do we keep up?

Maria Ressa (08:42:49):

The first thing is, I think governments need to kick in. Democratic states need to kick in, stop the analysis paralysis, and just say, "This is harmful. You are not going to release this." The same way, look at the vaccinations for COVID-19. They didn't release them right away. Even though people were dying, they tested them first. They were tested before they were released to the public, and there were still dangers. These companies are using us to make their AI better, for free. And at the same time, they're already saying that GPT-4 is writing 40% of the new code, and that they could actually lose control of what this is. So if the first one, I talked in the Nobel lecture about an atom bomb, this is beyond that. We're now in the dystopia. And this is what's scary. Having said that, I'm a sci-fi fan. Sure, I like the fact that I can do lots of stuff with this, but I don't like that we have essentially given the crowd access to the atom bomb, and how it's developed is being crowdsourced.

Sumi Somaskanda (08:44:05):

That’s something that’s certainly being discussed now by regulators. But I want to ask you about something else you said on stage. You said, “Inspiration spreads as fast as anger.”

Maria Ressa (08:44:13):

Isn’t that cool?

Sumi Somaskanda (08:44:13):

What inspires you?

Maria Ressa (08:44:15):

Oh, my gosh. I mean, that audience today inspired me, right? You walk on that stage and you feel the energy. This is what you will never get online; you've got to come here. You feel the energy of the people in the room. It is what makes us human, and it's slightly different every time. But what inspires me? I survived six years of attacks. I'm not in jail. Rappler is still up. In 2016, when we came under attack, I was told, "You're being foolish." I just didn't have any other way to be, because I'm a journalist and I couldn't accept what I was being told was the new reality. So the phrase I always used is #HoldTheLine. Hold the line. This is what the constitution says are our rights. And we did, and we're still here. So what inspires me? Oh my gosh, it's what keeps me working. What inspires you? We are so under attack. Why do you keep doing what you do?

Sumi Somaskanda (08:45:14):

To have conversations like this.

Maria Ressa (08:45:16):

Your TV.

Sumi Somaskanda (08:45:18):

Ready for TV. I should ask you, Maria, what are you taking away from the summit? It’s three days long, but particularly from this first day where we saw such a range of speeches and presentations on stage, what will you take with you to inform your work going forward?

Maria Ressa (08:45:32):

I loved hearing from new voices, starting with the first session with Rachel Kuo, who really put it in context. Oftentimes when you're only talking to the tech bros, they forget that they've coded bias into this thing, and that it is about power, and it's made our society far more afraid. Identity politics, why is that being used? Why is migration being used? Because it inflames us. Black Lives Matter during the 2016 elections was attacked from both sides by the IRA, the Internet Research Agency, and the GRU. They attacked both sides. They weren't trying to make you believe one side or the other. They attacked both sides to inflame, right?

(08:46:26)
In the old days, the task of a leader was to bridge and to heal; that's impossible to do with our information ecosystem today. We journalists are very careful with each word, you have an editor who goes over it, and we're legally liable for it. We do not want to incite violence. That isn't how social media works. In fact, the data is there: it shows us that it is pumped full of violence, and it affects all of us.

Sumi Somaskanda (08:46:59):

Well, I think that’s a really good note to bring two final guests to join you because you have said that you were inspired, and I think two guests who were the hosts and organizers who joined us at the very beginning of the day can tell us what inspired them about the day. Marcia McNutt, the president of the National Academy of Sciences, and Vidar Helgesen, executive director of the Nobel Foundation. You have both been just listening to Maria, so I’ll ask you as well, what inspired you from today?

Vidar Helgesen (08:47:26):

Well, Maria, for one.

Maria Ressa (08:47:28):

Of course. Thank you so much for doing this.

Vidar Helgesen (08:47:31):

I think the multitude of perspectives. I said at the closing that this whole day has been a perspective generator and an inspiration generator. We've heard from scientists who helped us understand the problem, but also helped us understand potential solutions. We've heard from activists, journalists, and journalist activists about the problem, and how they're suffering from it, but also how they are key shapers of solutions. And when science-based evidence meets activist knowledge, I think a lot of things can happen. So I'm looking forward to the next couple of days, when these communities will try to shape even more solutions.

Sumi Somaskanda (08:48:24):

How about for you, Marcia?

Marcia McNutt (08:48:26):

So I think what really stands out for me was the lack of finger-pointing at any one group in particular, and instead the attempt to bring all groups in on the solution. We all have ways in which we have failed. A good example is my own community, the scientists: we have projected this image of privilege, of some unique insight into the world. And we have done a very poor job of saying to every person in the world, "You can be a scientist. You can do science. All you have to do is come up with a hypothesis, test it, and share what you've done for other people to critique." Science is not some magical temple. It is a natural extension of the curiosity we encourage in young people, and we just have to take that natural curiosity and funnel it in positive ways.

Sumi Somaskanda (08:49:38):

Maria, what do you think we journalists can learn from science?

Maria Ressa (08:49:43):

I mean, I think this is our process. We just do it faster. I mean, this is on deadline, right? But that’s the crazy thing. I think this also makes us able to shift quicker, and these are two incredible organizations and we need that support.

Vidar Helgesen (08:50:07):

I'm also left with one consideration, one thought, because there was this talk today about belonging, and how rabbit holes and conspiracy theory communities and fake news communities build on a sense of belonging. And the question is, can the truth-seeking community create that same sense of belonging? As was said today, belonging comes before belief. And I think we need to use the same tactics, in a way, and science definitely needs to get better at creating that sense of belonging.

Marcia McNutt (08:50:45):

Yeah, I would say that I have decided that as a scientist myself, I don’t trust any other scientist who hasn’t at least once changed their mind about something important because new evidence is coming in all the time. And if in light of new evidence you have never changed your position, you’re practicing religion, you’re not practicing science.

Sumi Somaskanda (08:51:16):

Well, I think that's the final word of the day. We have had an excellent time here in the digital studio, particularly with this conversation wrapping it all together. So thank you very much, Vidar, Marcia, and Maria, for speaking with us here in the digital studio. And I'll thank all of our incredible viewers on the livestream, who've been with us all day long, sharing their comments and thoughts and inspiring us to continue this conversation. Thank you to the National Academy of Sciences and, of course, the Nobel Foundation for hosting this Nobel Prize Summit, and to the technical team who put all of this together. So thank you for joining us. We hope you'll continue to follow the summit on day two and day three. Have a good day.

Marcia McNutt (08:51:58):

Thank you, Sumi.
