Oct 28, 2020
Tech CEOs Senate Testimony Transcript October 28
Facebook CEO Mark Zuckerberg, Twitter CEO Jack Dorsey, and Alphabet (Google) CEO Sundar Pichai testified before the Senate on October 28. The CEOs were questioned about content moderation policies.
Chairman Wicker: (00:00)
Internet freedom has been the hallmark of a thriving digital economy in the United States. This success has largely been attributed to a light touch regulatory framework, and to Section 230 of the Communications Decency Act, often referred to as the 26 words that created the internet. There’s little dispute that Section 230 played a critical role in the early development and growth of online platforms. Section 230 gave content providers protection from liability, to remove and moderate content that they or their users considered to be “obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable.” This liability shield has been pivotal in protecting online platforms from endless and potentially ruinous lawsuits. But it has also given these internet platforms the ability to control, stifle and even censor content in whatever manner meets their respective standards.
Chairman Wicker: (01:07)
The time has come for that free pass to end. After 24 years of Section 230 being the law of the land, much has changed. The internet is no longer an emerging technology. The companies before us today are no longer scrappy startups, operating out of a garage or a dorm room. They are now among the world’s largest corporations, wielding immense power in our economy, culture and public discourse. Immense power. The applications they have created are connecting the world in unprecedented ways, far beyond what lawmakers could have imagined three decades ago. These companies are controlling the overwhelming flow of news and information that the public can share and access. One noteworthy example occurred just two weeks ago, after our subpoenas were unanimously approved. The New York Post, the country’s fourth largest newspaper, ran a story revealing communications between Hunter Biden and a Ukrainian official.
Chairman Wicker: (02:15)
The report alleged that Hunter Biden facilitated a meeting with his father, Joe Biden, who was then vice president of the United States. Almost immediately, both Twitter and Facebook took steps to block or limit access to the story. Facebook, according to its policy communications manager began “reducing its distribution on the platform,” pending a third-party check, a third-party fact check. Twitter went beyond that, blocking all users, including the House Judiciary Committee from sharing the article on feeds and through direct messages. Twitter even locked the New York Post account entirely, claiming the story included hacked materials and was potentially harmful. It is worth noting that both Twitter and Facebook’s aversion to hacked materials has not always been so stringent. For example when the president’s tax returns were illegally leaked, neither company acted to restrict access to that information. Similarly, the now discredited Steele dossier was widely shared without fact checking or disclaimers. This apparent double standard would be appalling under normal circumstances, but the fact that selective censorship is occurring in the midst of the 2020 election cycle dramatically amplifies the power wielded by Facebook and Twitter. Google recently generated its own controversy when it was revealed that the company threatened to cut off several conservative websites, including The Federalist from their ad platform. Make no mistake for sites that rely heavily on advertising revenue for their bottom line, being blocked from Google services or demonetized can be a death sentence. According to Google the offense of these websites was posting user submitted comment sections that included objectionable content. But Google’s own platform, YouTube posts user submitted comment sections for every video uploaded.
Chairman Wicker: (04:44)
It seems that Google is far more zealous in policing conservative sites than its own YouTube platform for the same types of offensive and outrageous language. It is ironic that when the subject is net neutrality, technology companies including Facebook, Google, and Twitter have warned about the grave threat of blocking or throttling the flow of information on the internet. Meanwhile, these same companies are actively blocking and throttling the distribution of content on their own platforms and are using protections under Section 230 to do it. Is it any surprise that voices on the right are complaining about hypocrisy or even worse anti-democratic election interference? These recent incidents are only the latest in a long trail of censorship and suppression of conservative voices on the internet. Reasonable observers are left to wonder whether big tech firms are obstructing the flow of information to benefit one political ideology or agenda.
Chairman Wicker: (05:55)
My concern is that these platforms have become powerful arbiters of what is true and what content users can access. The American public gets little insight into the decision-making process when content is moderated, and users have little recourse when they are censored or restricted. I hope we can all agree that the issues the committee will discuss today are ripe for thorough examination and action. I have introduced legislation to clarify the intent of Section 230’s liability protections and increase the accountability of companies who engage in content moderation. The Online Freedom and Viewpoint Diversity Act would make important changes to right-size the liability shield, and make clear what type of content moderation is protected. This legislation would address the challenges we have discussed while still leaving the fundamentals of Section 230 in place.
Chairman Wicker: (06:56)
Although some of my colleagues on the other side of the aisle have characterized this as a purely partisan exercise, there is strong bipartisan support for reviewing Section 230. In fact, both presidential candidates, Trump and Biden have proposed repealing Section 230 in its entirety, a position I have not yet embraced. I hope we can focus today’s discussion on the issues that affect all Americans, protecting a true diversity of viewpoints and free discourse is central to our way of life. I look forward to hearing from today’s witnesses about what they are doing to promote transparency, accountability and fairness in their content moderation processes. I thank each of them for cooperating with us in the scheduling of this testimony. I now turn to my friend and ranking member Senator Cantwell for her opening remarks, Senator Cantwell.
Maria Cantwell: (07:56)
… Beautiful state of Washington in my Senate office here in Washington, DC. It shows the various ecosystems of the state of Washington, which we very much appreciate. I bring that up because just recently the Seattle area was named the number one STEM economy in the United States, that is the most STEM workforce in the United States of America. These issues about how we harness the information age to work for us and not against us is something that we deal with every day of the week, and we want to have discussion and discourse. I believe that discussion and discourse today should be broader than just 230. There are issues of privacy that our committee has addressed and issues of how to make sure there is a free and competitive news market. I noticed today we’re not calling in the NAB or the publishers association asking them why they haven’t printed or reprinted information that you allude to in your testimony that you wish was more broadly distributed.
Maria Cantwell: (09:17)
To have competition in the news market is to have a diversity of voices and diversity of opinion. In my report just recently released, we show that true competition really does help perfect information, both for our economy and for the health of our democracy. I do look forward to discussing these issues today. What I do not want today’s hearing to be is a chilling effect on the very important aspects of making sure that hate speech or misinformation related to health and public safety are not allowed to remain on the internet. We all know what happened in 2016. We had reports from the FBI, our intelligence agencies and a bipartisan Senate committee that concluded that in 2016 Russian operatives, masquerading as Americans, used targeted advertisements, intentionally falsified news articles, self-generated content, and social media platform tools to interact with and attempt to deceive tens of millions of social media users in the United States.
Maria Cantwell: (10:26)
The Director of National Intelligence at the time, former Republican Senator Dan Coats, said in July 2018, “The warning lights are blinking red, that the digital infrastructure that serves our country is literally under attack.” I take this issue very seriously and have for many years, that is making sure, as special counsel Mueller indicated, that 12 Russian intelligence officers hacked the DNC and various information, detailing phishing attacks into our state election boards, online personas and stealing documents. When we had a subcommittee hearing and former Bush Homeland Security Director Michael Chertoff testified, I asked him point blank. Because there were some of our colleagues who were saying, you know what, everybody does election interference. I asked him if election interference was something that we did or should be encouraging. He responded that he agreed, “Interfering with infrastructure or elections is completely off limits and unacceptable.”
Maria Cantwell: (11:37)
That is why I believe that we should be working aggressively internationally to sanction anybody that interferes in our elections. I hope today that we will get a report from the witnesses on exactly what they have been doing to clamp down on election interference. I hope that they will tell us what kind of hate speech and misinformation that they have taken off the books. It is no secret that there are various state actors who are doing all they can to take a whack at democracy. To try to say that our way of government, that our way of life, that our way of freedom of speech and information is somehow not as good as we have made it being the beacon of democracy around the globe. I am not going to let or tolerate people to continue to whack at our election process, our vote by mail system, or the ability of tech platforms, security companies, our law enforcement entities, and the collective community to speak against misinformation and hate speech.
Maria Cantwell: (12:44)
We have to show that the United States of America stands behind our principles, and that our principles do also transfer to the responsibility of communication online. As my colleagues will know, we’ve all been through this in the past. That is why you, Mr. Chairman, and I and Senators Rosen and Thune sponsored the HACKED Act, that is to help increase the security and cybersecurity of our nation, and create a workforce that can fight against that. That is why I joined with Van Hollen and Rubio on the DETER Act, especially in establishing sanctions against Russian election interference, and to continue to make sure that we build the infrastructure of tomorrow. I know that some people think that these issues are out of sight and out of mind, I guarantee you they’re not. There are actors who have been at this for a long time. They wanted to destabilize Eastern Europe, and we became the second act when they tried to destabilize our democracy here by sowing disinformation.
Maria Cantwell: (13:49)
I want to show them that we in the United States do have fair elections. We do have a fair process. We are going to be that beacon of democracy. I hope that as we talk about 230 today, and we hear from the witnesses on the progress that they have made in making sure that disinformation is not allowed online, that we will also consider ways to help build and strengthen that. That is to say, as some of those who are testifying today, what can we do on transparency, on reporting, on analysis? Yes I think you’re going to hear a lot about algorithms today, and the kind of oversight that we all want to make sure that we can continue to have a diversity of voices in the United States of America, both online and offline. I do want to say though, Mr. Chairman, I am concerned about the vertical nature of news and information. Today I expect to ask the witnesses about the fact that I believe they create a choke point for local news.
Maria Cantwell: (14:58)
The local news media have lost 70% of their revenue over the last decade, and we have lost thousands, thousands of journalistic jobs that are important. It was even amazing to me that the sequence of events yesterday had me being interviewed by someone at a newspaper who was funded by a joint group of the Knight Foundation and probably Facebook funds, to interview me about the fact that the news media and broadcast has fallen on such a decline because of loss of revenue as they’ve made the transition to the digital age. Somehow, we have to come together to show that the diversity of voices that local news represents needs to be dealt with fairly when it comes to the advertising market. That too much control in the advertising market puts a foot on their ability to continue to move forward and grow in the digital age. Just as other forms of media have made the transition, and yes, are still making the transition, we want to have a very healthy and dynamic news media across the United States of America.
Maria Cantwell: (16:09)
I plan to ask the witnesses today about that. I wish we had time to go into depth on privacy and privacy issues, but Mr. Chairman you know and so does Senator Thune and other colleagues of the committee on my side, how important it is that we protect American consumers on privacy issues. That we’re not done with this work, that there is much to do to bring consensus in the United States on this important issue. I hope that as we do have time or in the follow-up to these questions, that we can ask the witnesses about that today.
Maria Cantwell: (16:43)
But make no mistake gentlemen, thank you for joining us, but this is probably one of many, many, many conversations that we will have about all of these issues. But again, let’s harness the information age as you are doing, but let’s also make sure that consumers are fairly treated, and that we are making it work for all of us, to guarantee our privacy, our diversity of voices, and upholding our democratic principles and the fact that we, the United States of America, stand for freedom of information and freedom of the press. Thank you.
Chairman Wicker: (17:19)
Thank you Senator Cantwell, and certainly you’re correct that this will not be the last hearing with regard to the subject matter. I also appreciate you mentioning your concerns, which I share, about local journalism. At this point, we are about to receive testimony from our witnesses. Before we begin that, let me remind members that today’s hearing will provide senators with a round of seven minute questioning rather than the usual five minutes that we have done in the past. At a few seconds after seven minutes the gavel will go down. Even so, this hearing could last some three hours and 42 minutes at that rate.
Chairman Wicker: (18:17)
This will be an extensive and lengthy hearing. Members are advised that we will adhere closely to that seven minute limit, and also shortly before noon at the request of one of our witnesses, we will take a short 10 minute break. With that, we welcome our panel of witnesses, thank them for their testimony, and ask them to give their opening statements, summarizing them in some five minutes. The entire statement will be added at this point in the record, and we will begin with Mr. Jack Dorsey of Twitter. Sir, do you hear us? We have contact with you.
Jack Dorsey: (19:04)
Yes can you hear me?
Chairman Wicker: (19:06)
Yes. Thank you for being with us, and you are now recognized for five minutes sir.
Jack Dorsey: (19:11)
Okay. Well, thank you members of the Commerce Committee for the opportunity to speak with the American people about Twitter and Section 230. My remarks will be brief so we can get to questions. Section 230 is the most important law protecting internet speech. Removing Section 230 will remove speech from the internet. Section 230 gave internet services two important tools. The first provides immunity from liability for users’ content. The second provides Good Samaritan protections for content moderation and removal, even of constitutionally protected speech, as long as it’s done in good faith. That concept of good faith is what’s being challenged by many of you today. Some of you don’t trust we’re acting in good faith. That’s the problem I want to focus on solving. How do services like Twitter earn your trust? How do we ensure more choice in the market if we don’t?
Jack Dorsey: (20:08)
There are three solutions we’d like to propose to address the concerns raised, all focused on services that decide to moderate or remove content. It could be expansions to Section 230, new legislative frameworks, or a commitment to industry-wide self-regulation best practices. The first is requiring a service’s moderation process to be published. How are cases reported and reviewed? How are decisions made? What tools are used to enforce? Publishing answers to questions like these will make our process more robust and accountable to the people we serve. The second is requiring a straightforward process to appeal decisions made by humans or by algorithms. This ensures people can let us know when we don’t get it right, so that we can fix any mistakes and make our processes better in the future.
Jack Dorsey: (21:02)
Finally, much of the content people see today is determined by algorithms, with very little visibility into how they choose what they show. We took a first step in making this more transparent by building a button to turn off our home timeline algorithms. It’s a good start, but we’re inspired by the market approach suggested by Dr. Stephen Wolfram before this committee in June 2019. Enabling people to choose algorithms created by third parties to rank and filter the content is an incredibly energizing idea that’s in reach. Requiring, one, moderation processes and practices to be published.
Jack Dorsey: (21:41)
Two, a straightforward process to appeal decisions, and three best efforts around algorithmic choice are suggestions to address the concerns we all have going forward. They’re all achievable in short order. It’s critical as we consider these solutions we optimize for new startups and independent developers. Doing so ensures a level playing field that increases the probability of competing ideas to help solve problems. We mustn’t entrench the largest companies any further. Thank you for the time, and I look forward to a productive discussion to dig into these and other ideas.
Chairman Wicker: (22:22)
Thank you very much Mr. Dorsey. We now call on Mr. Sundar Pichai. You are recognized for five minutes sir.
Sundar Pichai: (22:34)
Chairman Wicker, ranking member Cantwell, and distinguished members of the committee. Thank you for the opportunity to appear before you today. The internet has been a powerful force for good for the past three decades. It has radically improved access to information, whether it’s connecting Americans to jobs, getting critical updates to people in times of crisis, or helping a parent find answers to questions like how can I get my baby to sleep through the night? At the same time, people everywhere can use their voices to share new perspectives, express themselves and reach broader audiences than ever before. Whether you’re a barber in Mississippi or a home renovator in Indiana, you can share a video and build a global fan base and a successful business right from your living room. In this way, the internet has been one of the world’s most important equalizers, information can be shared and knowledge can flow from anyone to anywhere.
Sundar Pichai: (23:36)
The same low barriers to entry also make it possible for bad actors to cause harm. As a company whose mission is to organize the world’s information and make it universally accessible and useful, Google is deeply conscious of both the opportunities and risks the internet creates. I’m proud that Google’s information services like search, Gmail, maps and photos provide thousands of dollars a year in value to the average American for free. We feel a deep responsibility to keep the people who use our products safe and secure, and have long invested in innovative tools to prevent abuse of our services. When it comes to privacy, we are committed to keeping your information safe, treating it responsibly and putting you in control. We continue to make privacy improvements like the changes I announced earlier this year to keep less data by default, and support the creation of comprehensive federal privacy laws.
Sundar Pichai: (24:38)
We are equally committed to protecting the quality and integrity of information on our platforms, and supporting our democracy in a nonpartisan way. As just one timely example, our information panels on Google and YouTube inform users about where to vote and how to register. We’ve also taken many steps to raise up high quality journalism. From sending 24 billion visits to news websites globally every month, to our recent one billion dollar investment in partnerships with news publishers. Since our founding we’ve been deeply committed to the freedom of expression. We also feel a responsibility to protect people who use our products from harmful content, and to be transparent about how we do that. That’s why we set and publicly disclose clear guidelines for our products and platforms, which we enforce impartially.
Sundar Pichai: (25:33)
We recognize that people come to our services with a broad spectrum of perspectives, and we are dedicated to building products that are helpful to users of all backgrounds and viewpoints. Let me be clear, we approach our work without political bias full stop. To do otherwise would be contrary to both our business interests and our mission, which compels us to make information accessible to every type of person, no matter where they live or what they believe. Of course, our ability to provide access to a wide range of information is only possible because of existing legal frameworks, like Section 230. The United States adopted Section 230 early in the internet’s history, and it has been foundational to US leadership in the tech sector. It protects the freedom to create and share content, while supporting the ability of platforms and services of all sizes to responsibly address harmful content.
Sundar Pichai: (26:33)
We appreciate that this committee has put great thought into how platforms should address content, and we look forward to having these conversations. As you think about how to shape policy in this important area, I would urge the committee to be very thoughtful about any changes to Section 230, and to be very aware of the consequences those changes might have on businesses and customers. At the end of the day, we all share the same goal, free access to information for everyone and responsible protections for people and their data. We support legal frameworks that achieve these goals. I look forward to engaging with you today about these important issues and answering your questions. Thank you.
Chairman Wicker: (27:16)
Thank you very much. Mr. Pichai. Members should be advised at this point that we are unable to make contact with Mr. Mark Zuckerberg. We’re told by Facebook staff that he is alone and attempting to connect with this hearing, and that they are requesting a five minute recess at this point to see if that connection can be made. I think this is a most interesting development, but we’re going to accommodate the request of the Facebook employees and see if within five minutes we can make contact and proceed. At this point I declare a five minute recess (silence). I call the hearing back into order, and we’re told that in less than five minutes we have success. Mr. Zuckerberg I’m told that we have both video and audio connection. Are you there sir?
Mark Zuckerberg: (30:38)
Yes I am. Can you hear me?
Chairman Wicker: (30:40)
I can hear you fine. You’re now recognized for five minutes to summarize your testimony. Welcome.
Mark Zuckerberg: (30:46)
All right. Thank you chairman. I was able to hear the other opening statements. I was just having a hard time connecting myself. All right so-
Chairman Wicker: (30:55)
I know the feeling Mr. Zuckerberg.
Mark Zuckerberg: (30:58)
Chairman Wicker, ranking member Cantwell and members of the committee. Every day millions of Americans use the internet to share their experiences and discuss issues that matter to them. Setting the rules for online discourse is an important challenge for our society, and there are principles at stake that go beyond any one platform. How do we balance free expression and safety? How do we define what is dangerous? Who should decide? I don’t believe that private companies should be making so many decisions about these issues by themselves. At Facebook, we often have to balance competing equities. Sometimes the best approach from a safety or security perspective isn’t the best for privacy or free expression. We work with experts across society to strike the right balance. We don’t always get it right, but we try to be fair and consistent. The reality is that people have very different ideas and views about where the lines should be.
Mark Zuckerberg: (31:53)
Democrats often say that we don’t remove enough content, and Republicans often say we remove too much. I expect that we’ll hear some of those criticisms today. The fact that both sides criticize us doesn’t mean that we’re getting this right, but it does mean that there are real disagreements about where the limits of online speech should be. I think that’s understandable. People can reasonably disagree about where to draw the lines. That’s a hallmark of democratic societies, especially here in the US with our strong First Amendment tradition. But it strengthens my belief that when a private company is making these calls, we need a more accountable process that people feel is legitimate and that gives platforms certainty. At Facebook we publish our standards and issue quarterly reports on the content that we take down. We launched an independent oversight board that can overturn our decisions, and we’ve committed to an audit of our content reports.
Mark Zuckerberg: (32:52)
But I believe Congress has a role to play in order to give people confidence that the process is carried out in a way that balances society’s deeply held values appropriately, and that’s why I’ve called for regulation. Right now the discussion is focused on Section 230. Some say that ending 230 would solve all of the internet’s problems. Others say it would end the internet as we know it. From our perspective, Section 230 does two basic things. First, it encourages free expression, which is fundamentally important. Without 230, platforms could potentially be held liable for everything that people say. They’d face much greater pressure to take down more content to avoid legal risk. Second, it allows platforms to moderate content.
Mark Zuckerberg: (33:40)
Without 230 platforms could face liability for basic moderation like removing harassment that impacts the safety of their communities. Now there’s a reason why America leads in technology. Section 230 helped create the internet as we know it. It has helped new ideas get built and our companies to spread American values around the world, and we should maintain this advantage. But the internet has also evolved. I think that Congress should update the law to make sure that it’s working as intended. One important place to start would be making content moderation systems more transparent. Another would be to separate good actors from bad actors, by making sure that companies can’t hide behind Section 230 to avoid responsibility for intentionally facilitating illegal activity on their platforms.
Mark Zuckerberg: (34:32)
We’re open to working with Congress on these ideas and more. I hope the changes that you make will ring true to the spirit and intent of 230. There are consequential choices to make here, and it’s important that we don’t prevent the next generation of ideas from being built. Now, although this hearing is about content policy, I also want to cover our election preparedness work. Voting ends in six days, we’re in the midst of a pandemic, and there are ongoing threats to the integrity of this election. Since 2016 Facebook has-
Mark Zuckerberg: (35:03)
Of this election. Since 2016, Facebook has made major investments to stop foreign interference. We’ve hired more than 35,000 people to work on safety and security. We’ve disrupted more than a hundred networks coming from Russia, Iran, and China, and more. They were misleading people about who they are and what they’re doing, including three just this week. This is an extraordinary election and we’ve updated our policies to reflect that. We’re showing people reliable information about voting and results, and we’ve strengthened our ads and misinformation policies. We’re also running the largest voting information campaign in U.S. history. We estimate that we’ve helped more than 4.4 million people register to vote, and a hundred thousand people volunteer to be poll workers. Candidates on both sides continue to use our platforms to reach voters and people are rightly focused on the role that technology companies play in our elections. I’m proud of the work that we’ve done to support our democracy. This is a difficult period but I believe that America will emerge stronger than ever and we’re focused on doing our part to help.
Chairman Wicker: (36:15)
Well, thank you. Thank you very much, Mr. Zuckerberg, and thanks to all of our witnesses. We will now. I think we’re supposed to set the clock to seven minutes, and I see five minutes up there, but somehow we’ll keep time, so there we are. Okay, well, thank you all.
Chairman Wicker: (36:35)
Let me start then with Mr. Dorsey. Mr. Dorsey, the committee has compiled dozens and dozens of examples of conservative content being censored and suppressed by your platforms over the last four years. I entered these examples into the record on October 1st when the committee voted unanimously to issue the subpoenas, and thank you all three again for working with us on the scheduling, alleviating the necessity for actually exercising the subpoenas.
Chairman Wicker: (37:19)
Mr. Dorsey, your platform allows foreign dictators to post propaganda, typically without restriction. Yet you routinely restrict the President of the United States. And here’s an example. In March, a spokesman for the Chinese Communist Party falsely accused the U.S. military of causing the coronavirus epidemic. He tweeted, “CDC was caught on the spot. When did patient zero begin in the U.S.? How many people are infected? What are the names of the hospitals? It might be the U.S. Army who brought the epidemic to Wuhan” and on and on. After this tweet was up for some two months, Twitter added a fact check label to this tweet, after being up for two months. However, when President Trump tweeted about how mail-in ballots are vulnerable to fraud, a statement that I subscribe to and agree with, and a statement that is in fact true, Twitter immediately imposed a fact check label on that tweet.
Chairman Wicker: (38:35)
Mr. Dorsey, how does a claim by Chinese Communists that the U.S. military is to blame for COVID remain up for two months without a fact check, while the President’s tweet about the security of mail-in ballots gets labeled instantly?
Jack Dorsey: (38:53)
Well, first and foremost, we, as you mentioned, we did label that tweet. As we think about enforcement, we consider severity of potential offline harm, and we act as quickly as we can. We have taken action against tweets from world leaders all around the world, including the President, and we did take action on that tweet because we saw it, we saw the confusion it might encourage, and we labeled it accordingly, and the goal with our labeling-
Chairman Wicker: (39:30)
You’re speaking of the President’s tweet.
Jack Dorsey: (39:32)
Chairman Wicker: (39:33)
Jack Dorsey: (39:33)
The goal of our labeling is to provide more context to connect the dots so that people can have more information so they can make decisions for themselves. We’ve created these policies recently. We are enforcing them. There are certainly things that we can do much faster, but generally we believe that the policy was enforced in a timely manner and in the right regard.
Chairman Wicker: (40:03)
And yet you seem to have no objection to a tweet by the Chinese Communist Party saying the U.S. Army brought the epidemic to Wuhan.
Jack Dorsey: (40:16)
Well, we did, and we labeled that tweet providing more information.
Chairman Wicker: (40:20)
It took you two months to do so, is that correct?
Jack Dorsey: (40:23)
I’m not sure of the exact timeframe, but we can get back to you on that.
Chairman Wicker: (40:26)
So you’re going to get back to us as to how a tweet from the Chinese Communist Party falsely accusing the U.S. military of causing the coronavirus epidemic was left up for two months with no comment from Twitter, while the President of the United States making a statement about being careful about ballot security with the mail was labeled immediately.
Chairman Wicker: (40:59)
I have a tweet here from Mr. Ajit Pai. Mr. Ajit Pai is the Chairman of the Federal Communications Commission. And he recounts some four tweets by the Iranian dictator, Ayatollah Ali Khamenei, which Twitter did not place a public label on. All four of them glorified violence. The first tweet says this, and I quote each time, “The Zionist regime is a deadly cancerous growth and a detriment to the region. It will undoubtedly be uprooted and destroyed.” That’s the first tweet. The second tweet, “The only remedy until the removal of the Zionist regime is firm armed resistance.” Again, left up without comment by Twitter. The third, “The struggle to free Palestine is Jihad in the way of God.” I quote that in part for the sake of time. And number four, “We will support and assist any nation or any group anywhere who opposes and fights the Zionist regime.”
Chairman Wicker: (42:24)
I would simply point out that these tweets are still up, Mr. Dorsey, and how is it that they are acceptable to be there? I’ll ask unanimous consent to enter this tweet from Ajit Pai in the record at this point. That’ll be done without objection. How, Mr. Dorsey, is that acceptable based on your policies at Twitter?
Jack Dorsey: (42:55)
We believe it’s important for everyone to hear from global leaders, and we have policies around world leaders. We want to make sure that we are respecting their right to speak and to publish what they need. But if there’s a violation of our terms of service, we want to label it and-
Chairman Wicker: (43:18)
They’re still up. Do they violate your terms of service, Mr. Dorsey?
Jack Dorsey: (43:21)
We did not find those to violate our terms of service because we considered them saber rattling, which is part of the speech of world leaders in concert with other countries. Speech against our own people, or a country’s own citizens, we believe is different and can cause more immediate harm.
Chairman Wicker: (43:47)
Very telling information. Mr. Dorsey. Thank you very much. Senator Cantwell, you are recognized.
Speaker 1: (43:53)
I think I’m deferring to our colleague, Senator Peters, just because of timing and situation for him.
Chairman Wicker: (44:03)
All right. Senator Peters, are you there?
Senator Peters: (44:05)
I am here. I am here.
Chairman Wicker: (44:07)
Sir, you are recognized for seven minutes.
Senator Peters: (44:10)
Well, thank you, Mr. Chairman, and ranking member Cantwell, appreciate your deferral to me. I certainly appreciate that consideration a great deal. I also want to thank each of our panelists here today for coming forward and being a witness, and appreciate all of you accommodating your schedules so that we could have this hearing. My first question is for Mr. Zuckerberg. And I want to start off by saying how much I appreciated our opportunity last night to speak at length on a number of issues. And as I told you last night, I appreciate Facebook’s efforts to assist law enforcement in disrupting a plot to kidnap, hold a sham trial for, and kill our Governor, Governor Whitmer.
Senator Peters: (44:56)
The individuals in that case apparently used Facebook for a broad recruiting effort, but then they actually planned the specifics of that operation off of your platform. My question is when users reach the level of radicalization that violates your community standards, you often will ban those groups and then drive them off to other platforms. Those platforms tend to have less transparency and oversight. But the issue that I’d like you to address is for those individuals that remain on your platform, they are often far down the path of radicalization, but they are definitely looking for an outlet. And I understand that Facebook has recently adopted a strategy to redirect users who are searching, for example, for election misinformation. But it doesn’t seem that policy applies to budding violent extremists.
Senator Peters: (45:52)
Mr. Zuckerberg, do you believe that your platform has a responsibility to off-ramp users who are on the path to radicalization by violent extremist groups?
Mark Zuckerberg: (46:04)
Senator, thanks for the question. I think this is very important, and my understanding is that we actually do a little of what you’re talking about here. If people are searching for, for example, white supremacist organizations, which we ban and treat as terrorist organizations, not only are we not going to show that content, but I think we try, where we can, to highlight information that would be helpful. And I think we try to work with experts on that. I can follow up and get you more information on the scope of those activities and when we invoke that. But I certainly agree with the spirit of the question that this is a good idea and something that we should continue pursuing and perhaps expand.
Senator Peters: (46:53)
Well, I appreciate those comments. I’m the ranking member on Senate Homeland Security Committee, and what we’re seeing is a rise of violent extremist groups, which is very troubling. And certainly we need to work very closely with you as to how do we disrupt this kind of radicalization, especially for folks that are using your platform. So I appreciate the opportunity to work further.
Senator Peters: (47:13)
And as we talked about last night, you asserted that Facebook is proactively working with law enforcement now to disrupt some of these real-world violent attempts that stem from some of that activity that originated on your platform. Could you tell me specifically how many threats you have proactively referred to local or state law enforcement prior to being approached for a preservation request?
Mark Zuckerberg: (47:41)
Senator, I don’t know the number off the top of my head, so I can follow up with you on that. But it is increasingly common that our systems are able to detect when there are potential issues. And over the last four years in particular, we’ve built closer partnerships with law enforcement and the intelligence community to be able to share those kinds of signals. So we’re doing more of that, including in the case that you mentioned before, around the attempted kidnapping of Governor Whitmer. We identified that and flagged it to the FBI; I think it was about six months ago when we started seeing some suspicious activity on our platform. That’s certainly part of our routine and how we operate.
Senator Peters: (48:30)
Well, Mr. Zuckerberg, the discovery tools and recommendation algorithms that your platforms use do serve up potentially extremist content based on user profiles. As we seek to understand why membership in these extremist groups is rising, I would hope that your companies are right now engaging in some forensic analysis of membership, once you take down an extremist group, to take a look at how that happened on your platform. It’s certainly going to better inform us as to how we can disrupt this type of recruitment into extremist groups.
Senator Peters: (49:05)
My question for you though, is that in 2016, you said, and this was apparently an internal Facebook document that was reported by the Wall Street Journal, that 64% of members of violent groups became members because of your platform’s recommendations. And I’ll quote from that report in the Wall Street Journal. It said, “Our recommendation systems grow the problem.” That’s clearly very concerning. And I know in response to that report in 2016, you made changes to your policies. You made changes to some of the algorithms that existed at that time. My question is, have you seen a reduction in your platform’s facilitation of extremist group recruitment since those policies were changed?
Mark Zuckerberg: (49:52)
Senator, I’m not familiar with that specific study, but I agree with the concern, and making sure that our recommendation systems for what groups people are given the opportunity to join is certainly one important vector for addressing this issue. And we’ve taken a number of steps here including disqualifying groups from being included in our recommendation system at all if they routinely are being used to share misinformation, or if they have content violations, or a number of other criteria. So I’m quite focused on this. I agree with where you’re going with that question. I don’t have any data today on the real-world impact of that yet, but I think that addressing this upstream is very important.
Senator Peters: (50:49)
So I appreciate you agreeing with that and that we need more data. Is it that you don’t have the data just at the top of your head or that it doesn’t exist?
Mark Zuckerberg: (51:00)
Well Senator, certainly the former and potentially the latter as well. I think it probably takes some time after we make these changes to be able to measure the impact. And I’m not aware of what studies are going on into this. It seems like the type of thing that one would want not just internal Facebook researchers to work on, but also potentially a collaboration with independent academics as well.
Senator Peters: (51:33)
Thank you, Mr. Zuckerberg, and thank you-
Mark Zuckerberg: (51:34)
Chairman Wicker: (51:35)
Thank you, Senator Peters. Senator Gardner has also asked to go out of order, and Senator Thune has graciously deferred to him. So Senator Gardner, you are recognized for seven minutes, sir.
Senator Gardner: (51:53)
Well, thank you Mr. Chairman, and thank you Senator Thune for sharing your time or at least deferring your time to me, and thank you, Mr. Zuckerberg, Mr. Pichai, thank you very much. And Mr. Dorsey, thank you for being here.
Senator Gardner: (52:07)
Mr. Dorsey, I’m going to direct these first questions to you. Mr. Dorsey, do you believe that the Holocaust really happened? Yes or no.
Jack Dorsey: (52:15)
Yes.
Senator Gardner: (52:16)
So you would agree that someone who says the Holocaust may not have happened is spreading misinformation? Yes or no.
Jack Dorsey: (52:24)
Yes.
Senator Gardner: (52:25)
I appreciate your answers on this, but they surprise me and probably a lot of other Coloradans and Americans. After all, Iran’s Ayatollah has done exactly this, questioning the Holocaust, and yet his tweets remain unflagged on Twitter’s platform. You and I agree that moderating your platform makes sense in certain respects. We don’t want the next terrorist finding inspiration on Twitter, or any platform for that matter. But you’ve also decided to moderate certain content from influential world leaders, and I’d like to understand your decisions to do so a little bit better. Can you name any other instance of Twitter hiding or deleting a tweet from heads of state?
Jack Dorsey: (53:10)
Not off the top of my head, but we have many examples across world leaders around the world.
Senator Gardner: (53:14)
Would you be willing to provide a list of those?
Jack Dorsey: (53:17)
Yes.
Senator Gardner: (53:18)
I know we’ve established and we agreed content moderation can have certain upsides like combating terrorism, but Twitter has chosen to approach content moderation from the standpoint of combating misinformation as well. So it’s strange to me that you’ve flagged the tweets from the President, but haven’t hidden the Ayatollah’s tweets on Holocaust denial or calls to wipe Israel off the map. And that you can’t recall off the top of your head hidden or deleted tweets from other world leaders, I would appreciate that list. I think it’s important that we all hear that. So that brings my next question to the front. Does Twitter maintain a formal list of certain accounts that you actively monitor for misinformation?
Jack Dorsey: (54:00)
No. We don’t have a general policy against misinformation. We have policies against misinformation in three categories: manipulated media, public health, specifically COVID, and civic integrity, meaning election interference and voter suppression. That is all we have policy on for misleading information. We do not have policy or enforcement for any other types of misleading information that you’re mentioning.
Senator Gardner: (54:25)
So somebody denying the murder of millions of people or instigating violence against a country as a head of state is not categorically falling in any of those three misinformation or other categories Twitter has?
Jack Dorsey: (54:39)
Not misinformation, but we do have other policies around incitement to violence, which may, some of the tweets that you mentioned, or the examples that you’re mentioning, may fall a [inaudible 00:19:49], but for misleading information, we’re focused on those three categories only.
Senator Gardner: (54:55)
So somebody denying that the Holocaust happened is not misinformation?
Jack Dorsey: (55:00)
It’s misleading information, but we don’t have a policy against that type of misleading information. We have-
Senator Gardner: (55:05)
Millions of people died and that’s not a violation of Twitter’s policy. Again, I just don’t understand how you can label a President of the United States. Have you ever taken a tweet down from the Ayatollah?
Jack Dorsey: (55:20)
I believe we have, but we can get back to you on it. We’ve certainly labeled tweets and I believe we have taken one down as well.
Senator Gardner: (55:27)
You know, you said you do not have a list. Is that correct? You do not maintain a list.
Jack Dorsey: (55:32)
We don’t maintain a list of accounts we watch. We look for reports and issues brought to us, and then we weigh it against our policy and enforce if needed.
Senator Gardner: (55:43)
You look for reports from your employees or from the broader news-
Jack Dorsey: (55:46)
No, from the people using the service.
Senator Gardner: (55:50)
Right, and then they turn that over to your review board. Is that correct?
Jack Dorsey: (55:57)
So in some cases, algorithms take action. In other cases, teammates do, in some cases it’s a pairing of the two.
Senator Gardner: (56:05)
There are numerous examples of blue check marks that are spreading false information and aren’t flagged. So Twitter must have some kind of list of priority accounts that it maintains. You have the blue check mark list. How do you decide when to flag a tweet? You got into that a little bit. Is there a formal threshold of retweets or likes that must be met before a tweet is flagged?
Jack Dorsey: (56:24)
No. [inaudible 00:56:27]
Senator Gardner: (56:30)
With your answers on the Ayatollah and others, I just don’t understand how Twitter can claim to want a world of less hate and misinformation while you simultaneously let the kind of content that the Ayatollah has tweeted out flourish on the platform, including from other world leaders. It’s no wonder that Americans are concerned about politically motivated content moderation at Twitter given what we have just said. I don’t like the idea of a group of unelected elites in San Francisco or Silicon Valley deciding whether my speech is permissible on their platforms, but I like even less the idea of unelected Washington, D.C. bureaucrats trying to enforce some kind of politically neutral content moderation.
Senator Gardner: (57:11)
So just as we have heard from other panelists, and as we are going to hear throughout the day, we have to be very careful not to rush to legislate in ways that stifle speech. You can delete Facebook, turn off Twitter, or try to ditch Google, but you cannot unsubscribe from government censors. Congress should be focused on encouraging speech, not restricting it. The Supreme Court has tried teaching us that lesson time and time again, and the Constitution demands that we remember it.
Senator Gardner: (57:38)
I’m running short on time, so I’m going to very quickly go through another question. One of the core ideas of Section 230’s liability protections is this. You shouldn’t be responsible for what someone else says on your platform. Conversely, you should be liable for what you say or do on your own platform. I think that’s pretty common sense. But courts have not always agreed with this approach, even Rep. Chris Cox opined in a recent Wall Street Journal op-ed that, “Section 230 has sometimes been interpreted by courts more broadly than I expected, for example, allowing some websites to escape liability for content they helped create.”
Senator Gardner: (58:12)
Mr. Zuckerberg, I have a simple question for you and each of the panelists today quickly. To be clear, I’m not talking about technical tools or operating the platform itself here. I’m purely talking about content. Do you agree that internet platforms should be held liable for the specific content that you yourself create on your own platforms? Yes or no.
Chairman Wicker: (58:32)
Mark Zuckerberg: (58:34)
Senator, I think that that is reasonable.
Senator Gardner: (58:37)
Yes or no, Mr. Dorsey. If Twitter creates specific content, should Twitter be liable for that content?
Jack Dorsey: (58:43)
It’s reasonable as well.
Senator Gardner: (58:44)
Mr. Pichai, same question to you. Yes or no, should Google be liable for the specific content that it creates?
Sundar Pichai: (58:51)
If we are acting as a publisher, I would say yes.
Senator Gardner: (58:56)
The specific content that you create on your own platform, yes.
Sundar Pichai: (58:59)
That seems reasonable.
Senator Gardner: (59:00)
Thank you. I think one of the other sides of the liability question is in regard to the good faith removal provision in Section 230, which we’ll get into a little bit more in the private questions. I know I’m out of time. So Mr. Chairman, thank you for giving me this time. Senator Thune, thank you as well, and thanks to the witnesses.
Chairman Wicker: (59:14)
Thank you, Senator Gardner. The ranking member has now deferred to Senator Klobuchar. So Senator, you are now recognized.
Senator Klobuchar: (59:27)
Thank you, Chairman. I want to note first that this hearing comes six days before election day and I believe we’re politicizing, and the Republican majority is politicizing, what should actually not be a partisan topic. And I do want to thank the witnesses here for appearing, but also for the work that they’re doing to try to encourage voting and to put out that correct information when the President and others are undermining vote by mail, something we’re doing in every state in the country right now.
Senator Klobuchar: (01:00:04)
Second point, Republicans failed to pass my bipartisan Honest Ads Act and the White House blatantly blocked the bipartisan election security bill that I had with Senator Lankford as well as several other Republicans. And it’s one of the reasons I think we need a new President.
Senator Klobuchar: (01:00:23)
Third, my Republican colleagues in the Senate, many of them I work with very well on this committee, but we have had four years to do something when it comes to anti-trust, privacy, local news, a subject that briefly came up and so many other things. So I’m going to use my time to focus on what I consider, in Justice Ginsburg’s words, to be a blueprint for the future. I’ll start with you, Mr. Zuckerberg. How many people log into Facebook every day?
Mark Zuckerberg: (01:00:55)
Senator, it is more than two billion.
Senator Klobuchar: (01:00:59)
Okay. And how much money have you made on political advertisements in the last two years?
Mark Zuckerberg: (01:01:06)
Senator, I do not know off the top of my head. It is a relatively small part of our revenue.
Senator Klobuchar: (01:01:11)
Okay. Small for you, but I think it’s 2.2 billion over 10,000 ads sold since May 2018. Those are your numbers and we can check them later. Do you require Facebook employees to review the content of each of the political ads that you sell in order to ensure that they comply with the law and your own internal rules?
Mark Zuckerberg: (01:01:34)
Senator, we require all political advertisers to be verified before they can run ads. And I believe we do review advertising as well.
Senator Klobuchar: (01:01:46)
But does a real person actually read the political ads that you [inaudible 00:26:50]? Yes or no.
Mark Zuckerberg: (01:01:53)
Senator. I imagine that a person does not look at every single ad. Our systems are a combination of artificial intelligence systems and people. We have 35,000 people who do content and security review for us, but given the massive amount of money-
Senator Klobuchar: (01:02:09)
I really just had a straightforward question, because I don’t think they do. I think the algorithm’s hidden, because I think the ads are placed instantly. Is that correct?
Mark Zuckerberg: (01:02:20)
Senator, my understanding of the way the system works is we have computers and artificial intelligence scan everything. And if we think that there are potential violations, then either the AI system will act or it will flag it to the tens of thousands of people who do content review. But-
Senator Klobuchar: (01:02:40)
With all the money you have, you could have a real person review them, like a lot of the other traditional media organizations do. So another question. When John McCain and I, and Senator Warner, introduced the Honest Ads Act, we got pushback from your company and others, and you were initially against it. Then we discussed this at a hearing, and you’re for it. I appreciate that. And have you spent any of the money? I know Facebook spent the most money ever lobbying last year. Have you spent any of the money trying to change or block the bill?
Mark Zuckerberg: (01:03:12)
Senator, [crosstalk 01:03:13] no. In fact, I’ve endorsed it publicly and we’ve implemented it into our systems even though it hasn’t become law. I’m a big supporter [crosstalk 00:01:03:22].
Senator Klobuchar: (01:03:24)
Try to change it, no? Have you done anything to get it passed? Because we’re at a roadblock on it. And I do appreciate that you voluntarily implemented some of it, but have you voluntarily implemented the part of the Honest Ads Act where you fully disclosed which groups of people are being targeted by political ads?
Mark Zuckerberg: (01:03:45)
Senator, we have, I think, industry leading transparency around political ads, and part of that is showing which audiences, in broad terms, ended up seeing the ads. Of course, getting the right resolution on that is challenging without it becoming a privacy issue. But we’ve tried to do that and provide as much transparency as we can, and I think we’re currently leading in that area. And to your question about how we’re-
Senator Klobuchar: (01:04:13)
I still have concerns, and I don’t mean to interrupt you, but I have such limited time. One of the things that I, and the last thing I want to ask you about, is divisiveness on the platform. I know recent studies have shown that your algorithms have pushed people towards more polarized content, left, right, whatever. In fact, one of your researchers warned senior executives that our algorithms exploit the human brain’s attraction to divisiveness. The way I look at it, more divisiveness means more time on the platform, and more time on the platform means the company makes more money. Does it bother you what it’s done to our politics?
Mark Zuckerberg: (01:04:54)
Senator, I respectfully disagree with that characterization of how the systems work. We design our systems to show people the content that’s going to be the most meaningful to them, which is not trying to be as divisive as possible. Most of the content on the systems is not political. It’s things like making sure that you can see when your cousin had her baby or-
Senator Klobuchar: (01:05:19)
Okay. Okay. I’m going to move on to Google here and Mr. Pichai, but I’m telling you right now that that’s not what I’m talking about, the cousins and the babies here. I’m talking about conspiracy theories and all the things that I think the Senators on both sides of the aisle know what I’m talking about. And I think it’s been pervasive.
Senator Klobuchar: (01:05:41)
Google, Mr. Pichai, I have not really liked your response to the lawsuit and what’s been happening. I think we need a change in competition policy for this country. I hope I’ll be able to ask you more about it at the judiciary committee. And I think your response isn’t just defensive. It’s been defiant to the Justice Department in suits all over the world. You control almost 90% of all general search engine queries, 70% of the search advertising market. Don’t you see these practices as anti-competitive?
Sundar Pichai: (01:06:15)
Senator, we are a popular general purpose search engine. We do see robust competition in many categories of information, and we invest significantly in R&D. We are innovating. We are lowering prices in all the markets we are operating in, and we’re happy to engage and discuss it further.
Senator Klobuchar: (01:06:36)
Well, one of your employees testified before the Antitrust sub-committee last month, and he suggested that Google wasn’t dominant in ad tech, that it was only one of many companies in a highly competitive ad tech landscape. Yet, Google has 90% of the publisher ad server market, a product of its DoubleClick acquisition. Does the market sound highly competitive to you when you have 90% of it?
Chairman Wicker: (01:06:59)
Very brief answer.
Sundar Pichai: (01:07:01)
Many publishers can simultaneously use many tools. Amazon and The Trade Desk alone have grown significantly in the last two years. You know, this is a market in which we share the majority of our revenue. Our margins are low. We are happy to take feedback here. We are trying to support the publishing industry, but definitely open to feedback and happy to engage and discuss it further.
Chairman Wicker: (01:07:21)
Thank you. Thank you, Mr. Pichai.
Senator Klobuchar: (01:07:21)
Well, I think you’ve gotten feedback from the lawsuits, so I’m looking forward to our next hearing to discuss it more. Thank you.
Chairman Wicker: (01:07:29)
Thank you very much. Senator Thune, you are now recognized.
Senator Thune: (01:07:30)
Thank you, Mr. Chairman, and I appreciate you convening the hearing today, which is an important followup to the sub-committee hearing that we convened in July on Section 230. Many of us here today and many of those we represent are deeply concerned about the possibility of political bias and discrimination by large internet and social media platforms. Others are concerned that even if your actions aren’t skewed, they are hugely consequential for our public debate, yet you operate with limited accountability. Such distrust is intensified by the fact that the moderation practices used to suppress or amplify content remain largely a black box to the public.
Senator Thune: (01:08:09)
Moreover, the public explanations given by the platforms for taking down or suppressing content too often seem like excuses that have to be walked back after scrutiny. And due to the exceptional secrecy with which platforms protect their algorithms and content moderation practices, it’s been impossible to prove one way or another whether political bias exists. So users are stuck with anecdotal information that frequently seems to confirm their worst fears. That is why I’ve introduced two bipartisan bills, the Platform Accountability and Consumer Transparency Act, or the PACT Act, and the Filter Bubble Transparency Act, to give users, regulators, and the general public meaningful insight into online content moderation decisions and how algorithms may be amplifying or suppressing information. And so I look forward to continuing that discussion today. My Democrat colleagues suggest that when we criticize the bias against conservatives, we’re somehow working the refs. But the analogy of working the refs assumes that it’s legitimate even to think of you as refs. It assumes that you three Silicon Valley CEOs get to decide what political speech gets amplified or suppressed, and it assumes that you’re the arbiters of truth, or at the very least the publishers making editorial decisions about speech. So yes or no, I would ask this of each of the three of you: are the Democrats correct that you all are the legitimate referees over our political speech? Mr. Zuckerberg, are you the ref?
Mark Zuckerberg: (01:09:44)
Senator, I certainly think not and I do not want us to have that role.
Senator Thune: (01:09:50)
Mr. Dorsey, are you the ref?
Jack Dorsey: (01:09:57)
No.
Senator Thune: (01:09:58)
Mr. Pichai, are you the ref?
Sundar Pichai: (01:10:01)
Senator, I do think we make content moderation decisions, but we are transparent about it and we do it to protect users, but we really believe and support maximizing freedom of expression.
Senator Thune: (01:10:13)
I’ll take that as three nos, and I agree with that. You are not the referees of our political speech. That’s why all three of you have to be more transparent and fair with your content moderation policies and your content selection algorithms. Because at the moment, it is, as I said, largely a black box. There is real mistrust among the American people about whether you’re being fair or transparent. This extends to their concerns about the kinds of amplification and suppression decisions your platforms may make on election day and during the post-election period, if the results of the election are too close to call.
Senator Thune: (01:10:47)
So, I just want to underscore, again, for my Democratic friends who keep using this really bad referee analogy, Google, Facebook, and Twitter are not the referees over our democracy. Now, second question, the PACT Act, which I referenced earlier, includes provisions to give users due process and an explanation when content they post is removed. So, this is, again, a yes or no question. Do you agree that users should be entitled to due process and an explanation when content they post has been taken down? Mr. Zuckerberg?
Mark Zuckerberg: (01:11:27)
Senator, I think that that would be a good principle to have.
Senator Thune: (01:11:31)
Thank you. Mr. Dorsey?
Jack Dorsey: (01:11:32)
Absolutely. We believe in a fair and straightforward appeals process.
Senator Thune: (01:11:36)
Great. Mr. Pichai?
Sundar Pichai: (01:11:38)
Senator Thune: (01:11:41)
Great. Thank you. Mr. Zuckerberg, Mr. Dorsey, your platforms knowingly suppressed or limited the visibility of this New York Post article about the content on Hunter Biden’s abandoned laptop. Many in the country are justifiably concerned about how often the suppression of major newspaper articles occurs online. I would say, Mr. Zuckerberg, would you commit to provide, for the record, a complete list of newspaper articles that Facebook suppressed or limited the distribution of over the past five years, along with an explanation of why each article was suppressed or the distribution was limited?
Mark Zuckerberg: (01:12:25)
Senator, I can certainly follow up with you and your team to discuss that. We have an independent fact-checking program, as you’re saying. We try not to be arbiters of what is true ourselves, but we have partnered with fact-checkers around the world to help assess that, to prevent misinformation and viral hoaxes from becoming widely distributed on our platform. I believe that the information that they fact-check and the content that they fact-check is public. So, I think that there’s probably already a record of this that can be reviewed.
Senator Thune: (01:13:07)
Yeah. But if you could do that, as it applies to newspapers, that would be very helpful. Mr. Dorsey, would you commit to doing the same on behalf of Twitter?
Jack Dorsey: (01:13:17)
We would absolutely be open to it. We are suggesting going a step further, which is aligned with what you’re introducing in the PACT Act, which is much more transparency around our content moderation process, and also the results, the outcomes, and doing that on a regular basis. I do agree and think that builds more accountability, and ultimately that lends itself to more trust.
Senator Thune: (01:13:43)
Okay, great. Thank you. All right, very quickly, I don’t have a lot of time either, but I often hear from conservative and religious Americans who look at the public statements of your companies, the geographic concentration of your companies, and the political donations of your employees, which often run 80% to 90% to Democrat politicians. You can see why this lack of ideological diversity among the executives and employees of your companies could be problematic and may be contributing to some of the distrust among Conservative and Republican users.
Senator Thune: (01:14:21)
So, I guess the question that I would ask is, and Mr. Zuckerberg, my understanding is that the person that’s in charge of election integrity and security at Facebook is a former Joe Biden staffer. Is there someone that’s closely associated with President Trump who’s in the same sort of election integrity role at Facebook? How do you all respond to that argument that there isn’t sufficient balance in terms of the political ideology or diversity in your companies? How do you deal with the sort of lack of trust that creates among Conservatives?
Chairman Wicker: (01:15:01)
Let’s see if we can have three brief answers there.
Mark Zuckerberg: (01:15:09)
Senator, I think having balance is valuable, and we try to do that. I’m not aware of the example that you cite of someone in charge of this process who worked for Biden in the past. So, we can follow up on that if that’s all right.
Chairman Wicker: (01:15:25)
Yeah, follow up on the record for the rest of this answer, please, Mr. Zuckerberg. Thank you.
Mark Zuckerberg: (01:15:30)
Chairman Wicker: (01:15:32)
Jack Dorsey: (01:15:32)
Well, this is why I do believe that it’s important to have more transparency around our process and our practices, and it’s independent of the viewpoints that our employees hold.
Chairman Wicker: (01:15:48)
Jack Dorsey: (01:15:50)
We need to make sure that we’re showing people that we have objective policies and enforcement.
Chairman Wicker: (01:15:54)
Sundar Pichai: (01:15:57)
In these teams, there are people who are Liberal, Republican, Libertarian, and so on. We are committed, we consult widely with important third-party organizations across both sides when we develop our policies. As a CEO, I’m committed to running it without any political bias, but happy to engage more on this.
Chairman Wicker: (01:16:17)
Thank you, gentlemen, and thank you, Senator Thune. The ranking member has now deferred to Senator Blumenthal. Sir, you are recognized.
Senator Richard Blumenthal: (01:16:27)
Thanks, Mr. Chairman, and thank you to the ranking member. I want to begin by associating myself with the very thoughtful comments made by the ranking member as to the need for broader consideration of issues of privacy and competition and local news. They are vitally important. Also, with the comments made by my colleague, Senator Klobuchar, about the need for antitrust review. I assume we’ll be examining some of these topics in November, before the judiciary committee. I’ve been an advocate of reform of Section 230 for literally 15 years.
Senator Richard Blumenthal: (01:17:14)
When I was Attorney General of the State of Connecticut, I raised this issue of the absolute immunity that no longer seems appropriate. So, I really welcome the bipartisan consensus that we’re seeing now, that there needs to be a constructive review. But frankly, I am appalled that my Republican colleagues are holding this hearing literally days before an election, when they seem to want to bully and browbeat the platforms here to try to tilt them in President Trump’s favor. The timing seems inexplicable except to game the ref, in effect.
Senator Richard Blumenthal: (01:18:11)
I recognize the referee analogy is not completely exact, but that’s exactly what they’re trying to do, namely to bully and browbeat these platforms to favor President Trump’s tweets and posts. Frankly, President Trump has broken all the norms and he has put on your platforms potentially dangerous and lethal misinformation and disinformation. I’m going to hold up one of them. This one, as you can see, pertains to COVID. He says, talking about the flu, “We have learned to live with it, just like we are learning to live with COVID.”
Senator Richard Blumenthal: (01:19:04)
“In most populations, far less lethal.” He has said that children are, I would say almost definitely, almost immune from this disease. He has said about the election, “Big problems and discrepancies with mail-in ballots all over the USA, must have final total on November 3rd.” Fortunately, the platforms are acting to label or take down these kinds of posts, but my Republican colleagues have been silent. They’ve lost their phones or their voices and the platforms in my view have-
Chairman Wicker: (01:19:57)
We just lost your voice there in mid sentence, Richard. Let’s suspend for just a minute till we get-
Senator Richard Blumenthal: (01:20:14)
I hope you can hear me now.
Chairman Wicker: (01:20:15)
There we are. Okay. We can hear you now, Senator Blumenthal. Just start back one sentence before. We had you until then.
Senator Richard Blumenthal: (01:20:27)
I just want to say about this misinformation from the President. There’s been deafening silence from my Republican colleagues. Now, we have a hearing that is in effect designed to intimidate or probably browbeat the platforms that have labeled this disinformation for exactly what it is. We’re on the verge of a massive onslaught on the integrity of our elections. President Trump has indicated that he will potentially interfere by posting information on election day, or the morning after. The Russians have begun already interfering in our elections.
Senator Richard Blumenthal: (01:21:15)
We’ve all received briefings that are literally chilling about what they are doing. The FBI and CISA have recently issued public alerts that, “foreign actors and cyber criminals likely to spread disinformation regarding 2020 results”. They are making 2016 look like child’s play in what they are doing. So, President Trump and the Republicans have a plan, which involves disinformation and misinformation. The Russians have a plan. I want to know whether you have a plan, Facebook, Twitter, Google, a plan, if the President uses your platforms to say on the day of the election that there is rigging or fraud without any basis in evidence.
Senator Richard Blumenthal: (01:22:27)
Or, attempts to say that the election is over and the voting, the counting of votes, must stop either on November 4th or some day subsequent. I would like, as to this question, about whether you have a plan, a yes or no.
Mark Zuckerberg: (01:22:55)
Senator, I could start. We do, we have policies related to all of the areas that you just mentioned. Candidates or campaigns trying to delegitimize methods of voting or the election, candidates trying to prematurely declare victory, and candidates trying to spread voter suppression material that is misleading about how, when, or where to vote. So, we’ve taken a number of steps on that front.
Senator Richard Blumenthal: (01:23:29)
Chairman Wicker: (01:23:29)
Perhaps we could take Mr. Pichai next and then Mr. Dorsey. Mr. Pichai?
Sundar Pichai: (01:23:35)
Senator, yes, we definitely have a robust plan. We have been planning for a while, and we rely on raising up news sources through moments like that, and we have closely partnered with the Associated Press to make sure we can provide users the most accurate information possible.
Jack Dorsey: (01:23:55)
Yes, we also have a plan. So, our plan and our enforcement around these issues is pointing to more information, and specifically to state election officials. So, we want to give the people using the service as much information as possible.
Chairman Wicker: (01:24:11)
Thank you, Senator Blumenthal. Senator Cruz.
Senator Ted Cruz: (01:24:16)
Chairman, I want to thank you, Mr. Chairman, for holding this hearing. The three witnesses we have before this committee today collectively pose, I believe, the single greatest threat to free speech in America, and the greatest threat we have to free and fair elections. Yesterday, I spent a considerable amount of time speaking with both Mr. Zuckerberg and Mr. Pichai. I have concerns about the behavior of both of their companies. I would note that Facebook is at the minimum at least trying to make some efforts in the direction of defending free speech.
Senator Ted Cruz: (01:24:50)
I appreciate their doing so. Google, I agree with the concerns that Senator Klobuchar raised. I think Google has more power than any company on the face of the planet, and the antitrust concerns are real. The impact of Google is profound, and I expect we will have continued and ongoing discussions about Google’s abuse of that power and its willingness to manipulate search outcomes to influence and change election results. But today, I want to focus my questioning on Mr. Dorsey and on Twitter, because of the three players before us, I think Twitter’s conduct has by far been the most egregious. Mr. Dorsey, does Twitter have the ability to influence elections?
Jack Dorsey: (01:25:45)
Senator Ted Cruz: (01:25:48)
You don’t believe Twitter has any ability to influence elections?
Jack Dorsey: (01:25:52)
No, we are one part of a spectrum of communication channels that people have.
Senator Ted Cruz: (01:25:57)
So, you are testifying to this committee right now that Twitter, when it silences people, when it censors people, when it blocks political speech, has no impact on elections?
Jack Dorsey: (01:26:07)
People have choice of other communication channels with [crosstalk 01:26:11].
Senator Ted Cruz: (01:26:11)
Not if they don’t hear information. If you don’t think you have the power to influence elections, why do you block anything?
Jack Dorsey: (01:26:19)
Well, we have policies that are focused on making sure that more voices on the platform are possible. We see a lot of abuse and harassment, which ends up silencing people and driving them from the platform.
Senator Ted Cruz: (01:26:30)
All right, Mr. Dorsey, I find your opening questions, your opening answers absurd on their face, but let’s talk about the last two weeks in particular. As you know, I have long been concerned about Twitter’s pattern of censoring and silencing individual Americans with whom Twitter disagrees. But two weeks ago, Twitter, and to a lesser extent, Facebook crossed a threshold that is fundamental in our country. Two weeks ago, Twitter made the unilateral decision to censor the New York Post, and a series of two blockbuster articles, both alleging evidence of corruption against Joe Biden.
Senator Ted Cruz: (01:27:08)
The first concerning Ukraine, the second concerning communist China. Twitter made the decision, number one, to prevent users, any user, from sharing those stories. Number two, you went even further and blocked the New York Post from sharing on Twitter its own reporting. Why did Twitter make the decision to censor the New York Post?
Jack Dorsey: (01:27:33)
We had a hacked materials policy that we-
Senator Ted Cruz: (01:27:37)
When was that policy adopted?
Jack Dorsey: (01:27:39)
In 2018, I believe.
Senator Ted Cruz: (01:27:41)
In 2018. Go ahead. What was the policy?
Jack Dorsey: (01:27:44)
So, the policy is around limiting the spread of materials that are hacked, and we didn’t want Twitter to be a distributor for hacked materials. We found that the New York Post, because it showed the direct materials, screenshots of the direct materials, and it was unclear how those were obtained, that it fell under this policy.
Senator Ted Cruz: (01:28:10)
So, in your view, if the source of a document is unclear, and in this instance, the New York Post documented what it said the source was, which it said was a laptop owned by Hunter Biden that had been turned in to a repair store. So, they weren’t hiding what they claimed to be the source. Is it your position that Twitter, when you can’t tell the source, blocks press stories?
Jack Dorsey: (01:28:34)
No, not at all. Our team made a fast decision. The enforcement action, however, of blocking URLs, both in tweets and in DM, in direct messages, we believe was incorrect and we changed it within 24 hours.
Senator Ted Cruz: (01:28:50)
Today, right now, the New York Post is still blocked from tweeting two weeks later.
Jack Dorsey: (01:28:56)
Yes, they have to log into their account, which they can do at this minute, delete the original tweet, which fell under our original enforcement actions, and they can tweet the exact same material from the exact same article and it would go through.
Senator Ted Cruz: (01:29:09)
So, Mr. Dorsey, your position is that you have the power to force a media outlet, and let’s be clear, the New York Post isn’t just some random guy tweeting. The New York Post has the fourth highest circulation of any newspaper in America. The New York Post is over 200 years old. The New York Post was founded by Alexander Hamilton, and your position is that you can sit in Silicon Valley and demand of the media that you can tell them what stories they can publish, and you can tell the American people what reporting they can hear, is that right?
Jack Dorsey: (01:29:41)
No, every person, every account, every organization that signs up to Twitter agrees to a terms of service, a terms of service is [crosstalk 01:29:53]-
Senator Ted Cruz: (01:29:53)
So, media outlets must genuflect and obey your dictates if they wish to be able to communicate with readers, is that right?
Jack Dorsey: (01:30:01)
No, not at all. We recognized an error in this policy and specifically the enforcement.
Senator Ted Cruz: (01:30:06)
You’re still blocking their posts. You’re still blocking their posts. Right now, today, you’re blocking their posts.
Jack Dorsey: (01:30:12)
We’re not blocking the post. Anyone can tweet, [crosstalk 01:30:15].
Senator Ted Cruz: (01:30:14)
Can the New York Post post on their Twitter account?
Jack Dorsey: (01:30:19)
If they go into their account and delete the original [crosstalk 01:30:21]-
Senator Ted Cruz: (01:30:21)
No is your answer to that, unless they genuflect and agree with your dictates. So, let me ask you something. You claimed it was because of a hack materials policy. I find that facially highly dubious and clearly employed in a deeply partial way. Did Twitter block the distribution of the New York Times’ story a few weeks ago that purported to be based on copies of President Trump’s tax returns?
Jack Dorsey: (01:30:52)
We didn’t find that a violation of our terms of service and this policy in particular, because it is reporting about the material. It wasn’t distributing the material.
Senator Ted Cruz: (01:31:01)
Okay, well, that’s actually not true. They posted what they purported to be original source materials, and federal law, federal statute makes it a crime, a federal felony, to distribute someone’s tax returns without their consent. So, that material was based on something that was distributed in violation of federal law, and yet Twitter gleefully allowed people to circulate that. But when the article was critical of Joe Biden, Twitter engaged in rampant censorship and silencing.
Jack Dorsey: (01:31:33)
Again, we recognized the errors in our policy and we changed it within 24 hours. So, this is one of-
Senator Ted Cruz: (01:31:39)
But you’re still blocking the New York Post. You haven’t changed it.
Jack Dorsey: (01:31:42)
We have changed it. They can log into their account and delete the original tweet that was [crosstalk 01:31:47].
Senator Ted Cruz: (01:31:46)
You forced a Politico reporter to take down his post about the New York Post as well, is that correct?
Jack Dorsey: (01:31:52)
Within that 24 hour period? Yes, but as the policy has changed, anyone can tweet this article [crosstalk 01:31:58].
Senator Ted Cruz: (01:31:58)
So, [crosstalk 01:31:58], you could censor the New York Post, you can censor Politico, presumably you can censor the New York Times or any other media outlet. Mr. Dorsey, who the hell elected you and put you in charge of what the media are allowed to report and what the American people are allowed to hear, and why do you persist in behaving as a Democratic super PAC, silencing views contrary to your political beliefs?
Chairman Wicker: (01:32:25)
Let’s give Mr. Dorsey a few seconds to answer that, and then we’ll have to conclude this segment.
Jack Dorsey: (01:32:34)
Well, we’re not doing that. This is why I opened this hearing with calls for more transparency. We realized we need to earn trust more. We realized that more accountability is needed to show our intentions and to show the outcomes.
Chairman Wicker: (01:32:49)
Thank you, Senator.
Jack Dorsey: (01:32:50)
So, I hear the concerns and acknowledge them, but we want to fix it with more transparency.
Chairman Wicker: (01:32:56)
Thank you, Senator Cruz. The ranking member has deferred now to Senator Schatz, who joins us remotely. Sir, you are recognized.
Senator Brian Schatz: (01:33:06)
Thank you, Mr. Chairman. Thank you, ranking member. This is an unusual hearing at an unusual time. I have never seen a hearing so close to an election on any topic, let alone on something that is so obviously a violation of our obligation under the law and the rules of the Senate to stay out of electioneering. We never do this, and there is a very good reason that we don’t call people before us to yell at them for not doing our bidding during an election. It is a misuse of taxpayer dollars. What’s happening here is a scar on this committee and the United States Senate.
Senator Brian Schatz: (01:33:48)
What we are seeing today is an attempt to bully the CEOs of private companies into carrying out a hit job on a presidential candidate, by making sure that they push out foreign and domestic misinformation meant to influence the election. To our witnesses today, you and other tech leaders need to stand up to this immoral behavior. The truth is, that because some of my colleagues accuse you, your companies, and your employees of being biased or Liberal, you have institutionally bent over backwards and overcompensated.
Senator Brian Schatz: (01:34:25)
You’ve hired Republican operatives, hosted private dinners with Republican leaders, and in contravention of your terms of service, given special dispensation to right wing voices and even throttled progressive journalism. Simply put, the Republicans have been successful in this play. So, during one of the most consequential elections in American history, my colleagues are trying to run this play again, and it is an embarrassment. I have plenty of questions for the witnesses on Section 230, on antitrust, on privacy, on antisemitism, on their relationship with journalism, but we have to call this hearing what it is, it’s a sham.
Senator Brian Schatz: (01:35:09)
So, for the first time in my eight years in the United States Senate, I’m not going to use my time to ask any questions because this is nonsense and it’s not going to work this time. This play my colleagues are running did not start today, and it’s not just happening here in the Senate. It is a coordinated effort by Republicans across the government. Last May, President Trump issued an executive order designed to narrow the protections of Section 230 to discourage platforms from engaging in content moderation on their own sites. After it was issued, President Trump started tweeting that Section 230 should be repealed as if he understands Section 230.
Senator Brian Schatz: (01:35:49)
In the last six months, President Trump has tweeted “repeal Section 230” five times, in addition to other tweets in which he’s threatened the tech companies. A few weeks later, President Trump withdrew the nomination of FCC Commissioner Michael O’Rielly. Republican Commissioner O’Rielly questioned the FCC’s authority to regulate under Section 230 and the statute is not unclear on this. President Trump then nominated Nathan Simington, who was the drafter of NTIA’s petition to the FCC regarding Section 230, and Republican senators have enthusiastically participated.
Senator Brian Schatz: (01:36:29)
Since June of this year, six Republican-only bills have been introduced, all of which threaten platforms’ ability to moderate content on their sites. As the election draws closer, this Republican effort has become more and more aggressive. September 23rd, DOJ unveiled its own Section 230 draft legislation that would narrow the protections under the current law and discourage platforms from moderating content on their own sites. September 14th and October 1st, respectively, Senators Hawley and Kennedy tried to pass their Republican-only Section 230 bills via unanimous consent.
Senator Brian Schatz: (01:37:08)
Now, what that means is they went down to the floor and without a legislative hearing, without any input from Democrats at all, they tried to pass something so foundational to the internet unanimously without any discussion and any debate. On the same day as Senator Kennedy’s UC attempt, Senator Wicker forced the commerce committee, without any discussion or negotiation beforehand, to vote on subpoenaing the CEOs of Twitter, Facebook, and Google to testify. That’s why we’re here today. Two weeks later, on October 14th, Justice Clarence Thomas, on his own, issued a statement that appeared to support the narrowing of the court’s interpretation of Section 230.
Senator Brian Schatz: (01:37:51)
The very next day, the FCC Chairman Ajit Pai announced that the FCC would seek to clarify the meaning of Section 230. On that day, Senator Graham announced that the judiciary committee would vote to subpoena the tech companies over content moderation. The context of this, in addition to everything, is that Senator Cruz is on Maria Bartiromo talking about a blockbuster story from the New York Post. Senator Hawley is on Fox and on the Senate floor, and the commerce committee itself is tweeting out a campaign-style video that, sort of alarmingly, says “Hunter Biden’s emails, texts, censorship.”
Senator Brian Schatz: (01:38:33)
On October 21st, Senator Hawley reattempted to pass his bill on Section 230 via UC, again, without going through any committee mark-up or vote. On Friday, Senator Graham announced that the CEOs of Facebook and Twitter would testify before the Senate Judiciary Committee on November 17th. This is bullying and it is for electoral purposes. Do not let the United States Senate bully you into carrying the water for those who want to advance misinformation. Don’t let the specter of removing Section 230 protections or an amendment to antitrust law, or any other kinds of threats cause you to be a party to the subversion of our democracy.
Senator Brian Schatz: (01:39:20)
I will be glad to participate in good faith bipartisan hearings on these issues when the election is over, but this is not that. Thank you.
Chairman Wicker: (01:39:32)
Thank you, Senator Schatz. Next is Senator Fischer.
Senator Deb Fischer: (01:39:39)
Thank you, Mr. Chairman. Gentlemen, I’m not here to bully you today, and I am certainly not here to read any kind of political statement right before an election. To me, this hearing is not a sham. I am here to gain some clarity on the policies that you use. I am here to look at your proposals for more transparency, because your platforms have become an integral part of our democratic process for candidates, but also, more importantly, for our citizens as well. Your platforms also have enormous power to manipulate user behavior, to direct content, and to shape narratives.
Senator Deb Fischer: (01:40:34)
Mr. Dorsey, I heard your opening statement. I’ve read it. You also tweeted that the concept of good faith is what’s being challenged by many of you here today. Some of you don’t trust we’re acting in good faith. That’s the problem I want to focus on solving. Mr. Dorsey, why should we trust you with so much power? In other words, why shouldn’t we regulate you more?
Jack Dorsey: (01:41:05)
Well, the suggestions we’re making around more transparency are how we want to build that trust. We do agree that we should be publishing more of our practice of content moderation. We’ve made decisions to moderate content to make sure that we are enabling as many voices on our platform as possible. I acknowledge and completely agree with the concerns that it feels like a black box. Anything that we can do to bring transparency to it, including publishing our policies, our practices, answering very simple questions around how content is moderated, and then doing what we can around the growing trend of algorithms moderating more of this content.
Jack Dorsey: (01:41:56)
As I said, this one is a tough one to actually bring transparency to. Explainability in AI is a field of research, but it is far out. I think a better opportunity is giving people more choice around the algorithms they use, including the ability to turn off the algorithms completely, which is what we’re attempting to do.
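The choice Mr. Dorsey describes, using a ranking algorithm or turning it off entirely, amounts to a toggle between a ranked feed and a plain reverse-chronological one. A minimal sketch of that idea, with made-up field names and a deliberately simple engagement signal (this is an illustration, not Twitter's actual implementation):

```python
# Hypothetical sketch: a feed that can run with the ranking
# algorithm on, or with it turned off (reverse-chronological).

def timeline(tweets, use_algorithm):
    """Return the feed either algorithmically ranked by an
    engagement signal or, with the algorithm off, newest-first."""
    if use_algorithm:
        # illustrative ranking signal: raw engagement count
        return sorted(tweets, key=lambda t: t["engagement"], reverse=True)
    return sorted(tweets, key=lambda t: t["timestamp"], reverse=True)

tweets = [
    {"id": 1, "timestamp": 100, "engagement": 5},
    {"id": 2, "timestamp": 200, "engagement": 50},
    {"id": 3, "timestamp": 300, "engagement": 1},
]
ranked = timeline(tweets, use_algorithm=True)    # ids: 2, 1, 3
chrono = timeline(tweets, use_algorithm=False)   # ids: 3, 2, 1
```

The same content appears in both modes; only the ordering, and therefore what a user sees first, changes.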
Senator Deb Fischer: (01:42:19)
Right, but you can understand the concerns that people have when they see what many consider to be value judgments you’re making on what’s going to be on your platforms. You say users can report content and then you take action, but certainly, you can understand that people are very concerned. They’re very worried about what they see as manipulation on your part. Sir, I would say with respect, I don’t think it’s enough just to say you’re going to have that transparency there and that you’re not influencing people.
Senator Deb Fischer: (01:43:15)
Because any time a free press is blocked, on both sides, what we would view in the political world as both sides here, when views aren’t able to be expressed, that does have a huge amount of influence.
Jack Dorsey: (01:43:37)
I completely understand. I agree that it’s not enough. I don’t think transparency alone addresses these concerns. I think we have to continue to push for a more straightforward, and fast, and efficient appeals process. I do believe we need to look deeply at algorithms and how they’re used and how people have choice on how to use those algorithms or whether they use them.
Senator Deb Fischer: (01:43:59)
But ultimately, somebody makes a decision. Where does the buck stop with the algorithms? Where does the buck stop? Who’s going to make a value judgment? Because in my opinion, it is a value judgment.
Jack Dorsey: (01:44:12)
Well, ultimately, I’m accountable for all the decisions that the company makes, but we want to make sure that we’re providing clear frameworks that are objective and that can be tested, and that we have multiple checkpoints associated with them so that we can learn quickly if we’re doing something in error.
Senator Deb Fischer: (01:44:29)
When your company amplifies some content over others, is it fair for you to have legal protections for your actions?
Jack Dorsey: (01:44:42)
We believe so. Keep in mind, a lot of our algorithms recommending content are focused on saving people time. So, we’re ranking things that the algorithms believe people would find most relevant and most valuable in the time [crosstalk 01:44:57]-
Senator Deb Fischer: (01:44:57)
But it’s your value judgment on what those people would find most relevant?
Jack Dorsey: (01:45:02)
No, it’s not about your judgment, it’s based on-
Senator Deb Fischer: (01:45:03)
… find most relevant?
Jack Dorsey: (01:45:03)
No. It’s not about [inaudible 01:45:03]. It’s based on engagement metrics, it’s based on who you follow, it’s based on activity you take on the network.
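The ranking Mr. Dorsey describes, driven by engagement metrics, follows, and on-network activity rather than editorial judgment, can be illustrated with a minimal sketch. The signal names and weights below are hypothetical assumptions for illustration, not Twitter's actual algorithm:

```python
# Hypothetical sketch of engagement-based timeline ranking:
# score each tweet from behavioral signals, then sort by score.

def score(tweet, follows):
    """Combine engagement signals into a relevance score.
    Weights are illustrative assumptions, not real values."""
    s = 1.0 * tweet["likes"] + 2.0 * tweet["retweets"] + 1.5 * tweet["replies"]
    if tweet["author"] in follows:   # boost accounts the user follows
        s *= 2.0
    return s

def rank_timeline(tweets, follows):
    """Order tweets by descending score for one user's feed."""
    return sorted(tweets, key=lambda t: score(t, follows), reverse=True)

tweets = [
    {"author": "a", "likes": 10, "retweets": 1, "replies": 0},
    {"author": "b", "likes": 5, "retweets": 2, "replies": 1},
]
# A user who follows "b" sees b's tweet boosted above a's,
# even though a's tweet has more raw likes.
ranked = rank_timeline(tweets, follows={"b"})
```

The point of the sketch is that the ordering falls out of per-user behavioral signals, which is the distinction Dorsey is drawing from a human value judgment, while the choice of signals and weights is itself still a design decision.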
Senator Deb Fischer: (01:45:11)
Mr. Zuckerberg, with your ever-expanding content moderation policies, are you materially involved in that content?
Mark Zuckerberg: (01:45:23)
Senator, yes. I spend a meaningful amount of time making sure that we get our content policies and enforcement right.
Senator Deb Fischer: (01:45:31)
Okay, thank you. What, if any changes, do you think should be made to Section 230 to address the specific concerns regarding content moderation that you’ve heard so far this morning?
Mark Zuckerberg: (01:45:46)
Senator, I would outline a couple. First, I agree with Jack that increasing transparency into the content moderation process would be an important step for building trust and accountability. One thing that we already do at Facebook is, every quarter, we issue a transparency report where, for each of the 20 or so categories of harmful content that we are trying to address, so terrorism, child exploitation, incitement of violence, pornography, different types of content, we issue a report on how we’re doing, what the prevalence of that content is on our network, and what percent of it our systems are able to take down before someone even has to report it to us and what the precision is and basically how accurate our systems are at dealing with it.
Mark Zuckerberg: (01:46:41)
And getting to the point that everyone across the industry is reporting on a baseline like that, I think, would be valuable. For people to have these discussions not just with anecdotes of, “Okay, I saw a piece of content and I’m not necessarily sure I agree with how that was moderated,” it would allow the conversation to move to data. That way, we can understand how these platforms are performing overall and hold them accountable.
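The three per-category figures Mr. Zuckerberg describes (how prevalent violating content is, what share is caught before a user report, and how accurate enforcement is) can be expressed as simple ratios. A minimal sketch; the function names and sample numbers are hypothetical, not Facebook's actual reporting format:

```python
# Hypothetical illustration of the three transparency-report
# metrics described: prevalence, proactive rate, and precision.

def prevalence(violating_views, total_views):
    """Share of all content views that landed on violating content."""
    return violating_views / total_views

def proactive_rate(flagged_by_systems, total_actioned):
    """Share of actioned content caught before any user report."""
    return flagged_by_systems / total_actioned

def precision(correct_removals, total_removals):
    """Share of removals upheld as correct on review."""
    return correct_removals / total_removals

# Made-up quarterly numbers for a single harm category:
report = {
    "prevalence": prevalence(violating_views=1_200, total_views=1_000_000),
    "proactive_rate": proactive_rate(flagged_by_systems=940, total_actioned=1_000),
    "precision": precision(correct_removals=910, total_removals=1_000),
}
```

Reporting these same ratios per category, per quarter, across companies is the industry baseline he is arguing would let the debate move from anecdotes to data.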
Chairman Wicker: (01:47:09)
Senator Deb Fischer: (01:47:09)
The issue with your answer, I think, would be the time involved, that it wouldn’t be an immediate response to have that conversation, as you call it. I hope that all three of you gentlemen can answer that question in [inaudible 01:47:24]. So my time is up. Thank you, Mr. Chairman.
Chairman Wicker: (01:47:27)
Thank you, Senator Fischer. I appreciate that. We are going to take, now, Senator Cantwell’s questioning, after which we are going to accommodate our witnesses with a five-minute recess. So Senator Cantwell, you are recognized.
Maria Cantwell: (01:47:44)
Thank you, Mr. Chairman. Can you hear me?
Chairman Wicker: (01:47:47)
Maria Cantwell: (01:47:48)
And can you see me this time?
Chairman Wicker: (01:47:50)
We can now see you, yes.
Maria Cantwell: (01:47:52)
Okay. Well, thank you, Mr. Chairman. And this is such an important hearing. I agree with many of the statements my colleagues have made that this hearing didn’t need to take place at this moment, that the important discussion about how we keep a thriving internet economy and how we continue to make sure that hate speech and misinformation are taken down from the web is something that would probably better have been done in January than now. But here we are today, and we’ve heard some astounding things that I definitely must refute.
Maria Cantwell: (01:48:24)
First of all, I’m not going to take lightly anybody who tries to undermine mail-in voting. Mail-in voting in the United States of America is safe. The State of Washington and the State of Oregon have been doing it for years. There is nothing wrong with our mail-in system. So I think that there’ll be secretaries of state, there’ll be our law enforcement agencies who’ve worked hard with state election officials and others who will be talking about how this process works and how we’re going to fight to protect it.
Maria Cantwell: (01:48:55)
I’m also going to not demean an organization just because they happen to be headquartered in the State of Washington or to have business there. That somebody claims that just because the geography of a company somehow makes it uber political for one side of the aisle or another, I seriously doubt. I know that because I see many of you coming to the State of Washington for Republican fundraisers with these officials. I know you know darn well that there are plenty of Republicans that work in high-tech firms.
Maria Cantwell: (01:49:25)
So the notion that somehow these people are crossing the aisle because of something and creating censorship, the notion that free speech is about the ability to say things and it doesn’t take … Well, maybe we need to have a history lesson from high school again, but yes, free speech means that people can make outrageous statements about their beliefs. So I think that the CEOs are telling us here what their process is for taking down healthcare information that’s, in fact, that’s not true, that is a threat to the public, and information that is a threat to our democracy. That is what they’re talking about.
Maria Cantwell: (01:50:05)
So I want to make it clear that this hearing could’ve happened at a later date and I don’t appreciate the misinformation that is coming across today that is trying to undermine our election process. It is safe. It is the backbone of what distinguishes America from other countries in the world. We do know how to have a safe and fair election. And one of the ways that we’re doing that is to have these individuals work with our law enforcement entities. My colleague, Gary Peters, made it very clear they successfully helped stop a threat on the Governor of Michigan. And why? Because they were working with them to make sure that information was passed on.
Maria Cantwell: (01:50:47)
So this is what we’re talking about. We’re talking about whether we’re going to be on the side of freedom and information and whether we’re going to put our shoulder to the wheel to continue to make sure that engine is there or whether we’re going to prematurely try to get rid of 230 and squash free speech. And so I want to make sure that we continue to move forward.
Maria Cantwell: (01:51:06)
So Mr. Zuckerberg, I’d like to turn to you because there was a time where there was great concern about what happened in Myanmar, about the government using information against a Muslim minority. And you took action and reformed the system. And just recently, in September, Facebook and Twitter announced they had suspended networks’ accounts linked to various organizations and for use of techniques, laundering Russian-backed websites’ accounts and divisive propaganda that we associated with state-run attempts to interfere in our elections. So could you please, Mr. Zuckerberg, talk about what you are doing to make sure state-run entities don’t interfere in US elections?
Mark Zuckerberg: (01:51:53)
Yes. Thank you, Senator. Since 2016, we’ve been building up some very sophisticated systems to make sure that we can stop foreign interference in elections, not just in the US, but all around the world. And a lot of this involves building up AI systems to identify when clusters of accounts aren’t behaving in the way that a normal person would. They’re behaving as fake accounts in some coordinated way. A lot of this is also about forming partnerships. The tech companies here today work more closely together to share signals about what’s happening on the different platforms to be able to combat these threats, as well as working more closely with law enforcement and intelligence communities around the world.
Mark Zuckerberg: (01:52:40)
And the net result of that is that, over the last few years, we’ve taken down more than 100 networks that were potentially attempting to spread misinformation or interfere. A lot of them were coming from Russia or Iran, a growing number from China, as well. And at this point, I’m proud that our company, as well as the others in the industry, I think, have built systems that are very effective at this. We can’t stop countries like Russia from trying to interfere in an election. Only the US Government can really push back with the appropriate leverage to do that, but we have built up systems to make sure that we can identify much faster when they’re attempting to do that. And I think that that should give the American people a good amount of confidence leading into this election.
Maria Cantwell: (01:53:38)
And is it true that those entities are trying to find domestic sources to help with that misinformation?
Mark Zuckerberg: (01:53:45)
Senator, yes. The tactics of these different governments are certainly evolving, including trying to find people outside of their country and, in some cases, we’re seeing domestic interference operations, as well. And the systems have had to evolve to be able to identify and take those down, as well. Of the hundred or so networks that I’ve just cited that we took down, about half were domestic operations, at this point. And that’s in various countries around the world, not primarily in the US, but this is a global phenomenon that we need to make sure that we continue pushing forward aggressively on.
Maria Cantwell: (01:54:26)
Thank you, thank you. Mr. Pichai, I’d like to turn to you for a second because I do want information from Facebook on this point, too, but I’d like to turn to you. There’s information now from media organizations that broadcasters and newspapers may be losing somewhere between 30% and 50% of the ad revenue that they could be getting, losing it to the formats that Google has, as it relates to their platform and ad information. Can you confirm what information you have about this, and do you think that Google is taking ad revenue from these news sources in an unfair way?
Sundar Pichai: (01:55:15)
Senator, it’s an important topic. It’s a complex topic. I do think journalists [inaudible 01:55:20] rightfully have called attention to it; particularly local journalism is very important. The internet has been a tremendously disrupting force and the pandemic has exacerbated it. Today, as Google, I would make the case that we believe in raising up news across our products because we realize the importance of journalism. We send a lot of traffic to news publishers. All the ad technology questions I’m getting asked today, we invest in our technology, share the majority of revenue back to publishers. We are investing in subscription products. We have committed $2 billion in news licensing over the next three years to [inaudible 01:56:00] organizations. We have set up a local emergency fund through COVID for local journalistic institutions.
Sundar Pichai: (01:56:08)
I gave plenty of examples, but the underlying forces which are impacting the industry, which is the internet, and whether it’s Google, if not Google, advertise instead of finding [crosstalk 01:56:20]-
Maria Cantwell: (01:56:21)
Yeah. I don’t have a clock on me, Senator. I don’t know how much time I have, Mr. Pichai.
Chairman Wicker: (01:56:24)
You’re a minute and a half over. So let’s see [crosstalk 01:56:27].
Maria Cantwell: (01:56:27)
Okay. Well, I’ll just leave it with this. Mr. Pichai, you hit on the key word, majority. I don’t think that you return the majority of the revenue to these broadcast entities. I do think it’s a problem. Yes, they’ve had to make it through the transformation, which is a rocky transformation, but the message from today’s hearing is the free press needs to live and be supported by all of us. And we look forward to discussing how we can make sure that they get fair return on their value. Thank you, Mr. Chairman.
Chairman Wicker: (01:56:55)
Thank you, Senator Cantwell. We will now take a five-minute recess and then we’ll begin. Most of our members have not yet had a chance to ask questions. The committee’s in recess for five minutes.
Chairman Wicker: (02:06:38)
Okay. This hearing will return to order and we understand that Senator Moran is next. So Sir, you are recognized.
Senator Moran: (02:06:50)
Chairman Wicker, thank you very much and thank you for you and Senator Cantwell hosting this hearing. Let me address, initially, the topic that seems to be primary today and then, if time, data privacy. Let me ask all three witnesses, how much money does your company spend annually on content moderation? How many people work, in general, in the area of content moderation, including by private contract? Let me just start with those two questions. Ultimately, I also want to ask you, how much money does your company spend in defending lawsuits stemming from user content on the platform?
Chairman Wicker: (02:07:33)
Okay. Mr. Zuckerberg, do you want to go first there?
Mark Zuckerberg: (02:07:38)
Senator, we have more than 35,000 people who work on content and safety review. And I believe our budget is multiple billions of dollars a year on this, I think upwards of three or maybe even more billion dollars a year, which is a greater amount than the whole revenue of our company was the year before we filed to go public in 2012.
Senator Moran: (02:08:13)
Chairman Wicker: (02:08:13)
Sundar Pichai: (02:08:18)
Senator, we use a combination of human reviewers and AI moderation systems. We have well over 10,000 reviewers and we are investing there significantly. Not sure of the exact numbers, but I would say it’s north of a billion dollars that we spend on these things.
Senator Moran: (02:08:41)
Chairman Wicker: (02:08:42)
Jack Dorsey: (02:08:45)
I don’t have the specific numbers, but we want to maintain agility between the people that we have working on this and also just building better technology to automate. So our goal is flexibility here.
Senator Moran: (02:08:58)
Let me ask the question again about how much would you estimate that your company’s currently spending on defending lawsuits related to user content.
Chairman Wicker: (02:09:08)
In the same order. Okay?
Mark Zuckerberg: (02:09:12)
Senator, I don’t know the answer to that off the top of my head, but I can get back to you.
Senator Moran: (02:09:18)
Sundar Pichai: (02:09:21)
Senator, we do spend a lot on legal lawsuits, but I’m not sure how much of it applies to content-related issues. Happy to follow up.
Senator Moran: (02:09:30)
Jack Dorsey: (02:09:32)
I don’t have those numbers.
Senator Moran: (02:09:34)
Let me use your answers to highlight something that I want to be a topic of our conversation as we debate this legislation. Whatever the numbers are, you indicate that they are significant. It’s an enormous amount of money and an enormous amount of employee time and contract labor time in dealing with moderation of content. These efforts are expensive. And I would highlight for my colleagues on the committee that they will not be any less expensive, perhaps less in scale, but not less in cost, for startups and small businesses. And as we develop our policies in regard to this topic, I want to make certain that entrepreneurs, startup businesses and small businesses are considered in what it would cost in their efforts to meet the kind of standards to operate in this sphere.
Senator Moran: (02:10:35)
Let me quickly turn to federal privacy. I introduced the Consumer Data Privacy and Security Act. We’ve tried for months, Senator Blumenthal and I, to develop a bipartisan piece of legislation. We were close, but unsuccessful in doing so. Let me ask Mr. Zuckerberg. Facebook entered into a consent order with the FTC in July of 2012 for violations of the FTC Act and later agreed to pay a $5-billion penalty along with a robust settlement order in 2018, following the Cambridge Analytica incident that violated the 2012 order. My legislation will provide the FTC with first-time civil penalty authority. Do you think this type of enforcement tool for the FTC would better deter unfair, deceptive practices than the current enforcement regime?
Mark Zuckerberg: (02:11:30)
Senator, I would need to understand it in a little bit more detail before weighing in on this, but I think that the settlement that we have with the FTC, we’re going to be setting up an industry-leading privacy program. We have, I think, more than a thousand engineers working on the privacy program now. I know we’re basically implementing a program which is sort of the equivalent of Sarbanes-Oxley’s financial regulation around internal auditing and controls around privacy and protecting people’s data, as well. So I think that that settlement will be quite effective in ensuring that people’s data and privacy are protected.
Senator Moran: (02:12:19)
Mr. Pichai, Google YouTube’s $170-million settlement with the FTC and the State of New York for alleged violations of COPPA involved persistent identifiers. How should federal legislation address persistent identifiers for consumers over the age of 13?
Sundar Pichai: (02:12:42)
Senator, today, we have done two things as a company. We have invested in one of a kind special product called YouTube Kids where content can be safe for kids. Obviously, on the YouTube main product, today, the way internet gets used, families do view content and part of our settlement was adapting so that they can accommodate for those use cases, as well. Privacy is one of the most important areas we invest in as a company. Have thousands of engineers working on it. We believe in giving users control, choice, and transparency. And anytime we associate data with users, we are transparent. They can go see what data is there. We give them delete controls. We give data portability options.
Sundar Pichai: (02:13:28)
And just last year, we announced an important change by which, for all new users, we delete the data automatically without them needing to do anything. And we encourage users to go through privacy checkup. Over a billion people have gone through their privacy checkups, and it’s an area where we are investing significantly.
Senator Moran: (02:13:48)
Thank you. Chairman, I don’t see my time clock. Do I have time for one more?
Chairman Wicker: (02:13:51)
You really don’t. Your time has just expired, but thank you very much for-
Senator Moran: (02:13:56)
Mr. Chairman, thank you.
Chairman Wicker: (02:13:57)
Thank you so much. Senator Markey.
Senator Markey: (02:14:01)
Thank you, Mr. Chairman, very much. Today, Trump, his Republican allies in Congress and his propaganda parrots on Fox News are peddling a myth. And today, my Republican colleagues on the Senate Commerce Committee are simply doing the president’s bidding. Let’s be clear. Republicans can and should join us in addressing the real problems posed by big tech, but instead my Republican colleagues are determined to feed a false narrative about anti-conservative bias meant to intimidate big tech so it will stand idly by and allow interference in our election again.
Senator Markey: (02:14:49)
Here’s the truth. Violence and hate speech online are real problems. Anti [inaudible 02:14:56] bias [inaudible 02:14:58] a problem. Foreign attempts to influence our election with disinformation are real problems. Anti-conservative bias is not a problem. The big tech business model, which puts profits ahead of people, is a real problem. Anti-conservative bias is not a problem. The issue is not that the companies before us today are taking too many posts down. The issue is that they’re leaving too many dangerous posts up; in fact, amplifying harmful content so that it spreads like wildfire and torches our democracy.
Senator Markey: (02:15:41)
Mr. Zuckerberg, when President Trump posted on Facebook that, “When the looting starts, the shooting starts,” you failed to take down that post. Within a day, the post had hundreds of thousands of shares and likes on Facebook. Since then, the President has gone on national television and told a hate group to, quote, “stand by.” And he has repeatedly refused to commit that he will accept the election results. Mr. Zuckerberg, can you commit that, if the president goes on Facebook and encourages violence after election results are announced, that you will make sure your company’s algorithms don’t spread that content and you will immediately remove those messages?
Mark Zuckerberg: (02:16:32)
Senator, yes. Incitement of violence is against our policy and there are not exceptions to that, including for politicians.
Senator Markey: (02:16:42)
There are exceptions, did you say?
Mark Zuckerberg: (02:16:44)
There are not exceptions.
Senator Markey: (02:16:46)
There are no exceptions, which is very important because, obviously, there could be messages that are sent that could throw our democracy into chaos. And a lot of it can be and will be created if social media sites do not police what the president says. Mr. Zuckerberg, if President Trump shares Russian or Iranian disinformation, lying about the outcome of the election, can you commit that you will make sure your algorithms do not amplify that content and that you will immediately take that content down?
Mark Zuckerberg: (02:17:28)
Senator, we have a policy in place that prevents any candidate or campaign from prematurely declaring victory or trying to de-legitimize the result of the election. And what we will do in that case is we will append some factual information to any post that is trying to do that. So if someone says that they won the election when the result isn’t in, for example, we will append a piece of information to that saying that official election results are not in yet. So that way, anyone who sees that post will see that context in line.
Mark Zuckerberg: (02:18:06)
And also, if one of the candidates tries to prematurely declare victory or cite an incorrect result, we have a precaution that we’ve built in to put at the top of the Facebook app, for everyone who signs in in the US, information about the accurate US election voting results. I think that this is a very important issue to make sure that people can get accurate information about the results of the election.
Senator Markey: (02:18:31)
It cannot be stated as being anything less than critically important. Democracy could be seriously challenged, beginning next Tuesday evening and for several days afterwards, maybe longer. And a lot of responsibility is going to be on the shoulders of Facebook and our other witnesses today. Mr. Zuckerberg, if President Trump uses his Facebook account to call on private citizens to patrol the polls on election day, which would constitute illegal voter intimidation and violation of the Voting Rights Act, will you commit that your algorithms will not spread that content and that you will immediately take that content down?
Mark Zuckerberg: (02:19:15)
Senator, my understanding is that content like what you’re saying would violate our voter suppression policies and would come down.
Senator Markey: (02:19:25)
Okay. Again, the stakes are going to be very high and we’re going to take that as a commitment that you will do that because, obviously, we would otherwise have a serious question mark placed over our elections. We know Facebook cares about one thing: keeping users glued to its platform. One of the ways you do that is with Facebook Groups. Mr. Zuckerberg, in 2017, you announced the goal of one billion users joining Facebook Groups.
Senator Markey: (02:20:03)
Unfortunately, these forum pages have become breeding grounds for hate, echo chambers of misinformation and venues for coordination of violence. Again, Facebook is not only failing to take these pages down, it’s actively spreading these pages and helping these groups’ recruitment efforts. Facebook’s own internal research found that 64% of all extremist group joins are due to Facebook’s recommendation tools. Mr. Zuckerberg, will you commit to stopping all group recommendations on your platform until US election results are certified? Yes or no?
Mark Zuckerberg: (02:20:37)
Senator, we have taken the step of stopping recommendations for groups for all political content or social issue groups as a precaution for this. But just to clarify one thing, the vast, vast majority of groups and communities that people are a part of are not extremist organizations or even political. They’re interest-based communities that I think are quite helpful and healthy for people to be a part of. I do think we need to make sure that our recommendation algorithm doesn’t encourage people to join extremist groups. That’s something that we have already taken a number of steps on, and I agree with you it is very important that we continue to make progress on.
Senator Markey: (02:21:25)
Well, your algorithms are promoting online spaces that foster political violence. At the very least you should disable those algorithms that are recruiting users during this most sensitive period of our democracy.
Chairman Wicker: (02:21:37)
Senator Markey: (02:21:38)
Thank you, Mr. Chairman.
Chairman Wicker: (02:21:38)
Thank you, Senator Markey. Mr. Zuckerberg, let me just ask you this. In these scenarios that Senator Markey was posing, the action of Facebook would not be a function of algorithms in those cases, would it?
Mark Zuckerberg: (02:21:56)
Senator, I think that you’re right and that that’s a good clarification. A lot of this is more about enforcement of content policies. Some of the questions were about algorithms. I think group ranking is an algorithm, but broadly I think a lot of it is content enforcement.
Chairman Wicker: (02:22:14)
Thank you for clarifying that. Senator Blackburn, you are recognized.
Senator Marsha Blackburn: (02:22:21)
Thank you, Mr. Chairman. And I want to thank each of you for coming to us voluntarily. We appreciate that. There are undoubtedly benefits to using your platforms. As you have heard, everyone mentioned today, there are also some concerns, which you’re also hearing. Privacy, free speech, politics, religion. And I have chuckled as I’ve sat here listening to you all. That book Valley of the Gods. It reminds me that you all are in control of what people are going to hear, what they’re going to see. And therefore you have the ability to dictate what information is coming in to them. And I think it’s important to realize, you’re set up as an information source, not as a news media. And so therefore, censoring things that you all think unseemly may be something that is not unseemly to people in other parts of the country. But let me ask each of you very quickly. Do any of you have any content moderators who are conservatives? Mr. Dorsey first? Yes or no?
Jack Dorsey: (02:23:35)
But we don’t ask political ideology when we’re hiring.
Senator Marsha Blackburn: (02:23:41)
Okay. You don’t. Okay, Mr. Zuckerberg.
Mark Zuckerberg: (02:23:45)
Senator, we don’t ask for their ideology, but just statistically, there are 35,000 of them in cities and places all across the country and world. So I would imagine, yes.
Senator Marsha Blackburn: (02:23:55)
Sundar Pichai: (02:23:57)
The answer would be yes, because we hire them throughout the United States.
Senator Marsha Blackburn: (02:24:02)
Okay. All right. And looking at some of your censoring. Mr. Dorsey, you all have censored Joe Biden zero times. You have censored Donald Trump 65 times. So I want to go back to Senator Gardner’s questions. You claimed earlier that the Holocaust denial and threats of Jewish genocide by Iran’s terrorist Ayatollah don’t violate Twitter’s so-called rules and that it’s important for world leaders, like Iran’s terrorist leader to have a platform on Twitter. So let me ask you this. Who elected the Ayatollah?
Jack Dorsey: (02:24:47)
I don’t know.
Senator Marsha Blackburn: (02:24:49)
You don’t know. Okay. I think this is called a dictatorship. So are people in Iran allowed to use Twitter or does the country whose leader you claim deserves a platform, ban them from doing so?
Jack Dorsey: (02:25:07)
Ideally, we would love for the people of Iran to use Twitter.
Senator Marsha Blackburn: (02:25:10)
Well, Iran bans Twitter. And Mr. Zuckerberg, I know you are aware, they ban Facebook also. So, Mr. Dorsey is Donald Trump a world leader?
Jack Dorsey: (02:25:22)
Senator Marsha Blackburn: (02:25:23)
Okay. So it would be important for world leaders to have access to your platform, correct?
Jack Dorsey: (02:25:30)
Senator Marsha Blackburn: (02:25:32)
And so why did you deny that platform via censorship to the US president?
Jack Dorsey: (02:25:39)
We haven’t censored the US president.
Senator Marsha Blackburn: (02:25:41)
Oh yes you have. How many posts from Iran’s terrorist Ayatollah have you censored? How many posts from Vladimir Putin have you censored?
Jack Dorsey: (02:25:52)
We have labeled tweets of world leaders. We have a policy around not taking down the content, but simply adding more context around it.
Senator Marsha Blackburn: (02:26:00)
Okay. And the US president, you have censored 65 times. You testified that you’re worried about disinformation and election interference. That is something we all worry about. And of course, for about a hundred years, foreign sources have been trying to influence US policy and US elections. Now, they’re onto your platforms. They see this as a way to get access to the American people. So, given your refusal to censor or ban foreign dictators while regularly censoring the president, aren’t you, at this very moment, personally responsible for flooding the nation with foreign disinformation?
Jack Dorsey: (02:26:48)
Just to be clear, we have not censored the president. We have not taken the tweets down that you’re referencing. They have more context and a label applied to them and we do the same for leaders around the world.
Senator Marsha Blackburn: (02:27:00)
Okay. Let me ask you this. Do you share any of your data mining? And this is to each of the three of you. Do you share any of your data mining with the Democrat National Committee?
Jack Dorsey: (02:27:15)
I’m not sure what you mean by the question, but we have a data platform that has a number of customers. I’m not sure of the customer list.
Senator Marsha Blackburn: (02:27:26)
Okay. And you said you don’t keep lists. I made that note. I found that [crosstalk 02:27:32].
Jack Dorsey: (02:27:29)
Well, we don’t keep lists of accounts that we watch. We don’t keep a list of accounts that we watch.
Senator Marsha Blackburn: (02:27:36)
Yeah. Okay. All right. Okay. Mr. Pichai, is Blake Lemoine, one of your engineers, still working with you?
Sundar Pichai: (02:27:47)
Senator, I’m familiar with this name as a name. I’m not sure whether he’s currently an employee.
Senator Marsha Blackburn: (02:27:53)
Okay. Well, he has had very unkind things to say about me. And I was just wondering if you all had still kept him working there. Also, I want to mention with you, Mr. Pichai, the way you all have censored some things. Google searches for Joe Biden generated approximately 30,000 impressions for Breitbart links. This was on May 1. And after May 5th, both the impressions and the clicks went to zero. I hope that what you all realize from this hearing is that there is a pattern. You may not believe it exists, but there is a pattern of subjective manipulation of the information that is available to people from your platforms.
Senator Marsha Blackburn: (02:28:50)
What has driven additional attention to this, is the fact that more of a family’s functional life is now being conducted online. Because of this, more people are realizing that you are picking winners and losers. Mr. Zuckerberg, years ago, you said Facebook functioned more like a government than a company. And you’re beginning to insert yourself into these issues of free speech. Mr. Zuckerberg, with my time that is left, let me ask you this. You mentioned early in your remarks that you saw some things as competing equities. Is the First Amendment a given right, or is that a competing equity?
Mark Zuckerberg: (02:29:55)
[inaudible 02:29:55] I believe strongly in free expression. Sorry if I was on mute there. But I do think that like all equities, it is balanced against other equities, like safety and privacy. And even the people who believe in the strongest possible interpretation of the First Amendment, still believe that there should be some limits on speech when it could cause imminent risk of physical harm. The famous example that’s used is that you can’t shout fire in a crowded theater. So I think that getting those equities and the balance right-
Senator Marsha Blackburn: (02:30:31)
Right. My challenge is [crosstalk 02:30:32].
Mark Zuckerberg: (02:30:32)
Is the challenge that we face.
Chairman Wicker: (02:30:33)
The time has expired. Perhaps we can follow up.
Senator Marsha Blackburn: (02:30:36)
Well, we believe in the First Amendment and we are going to. Yes, we will have questions to follow up. Thank you, Mr. Chairman. I can’t see the clock.
Chairman Wicker: (02:30:46)
Thank you. Senator Udall.
Senator Tom Udall: (02:30:49)
Mr. Chairman, thank you. And Senator Cantwell, really appreciate this hearing. I want to start by laying out three facts. The US intelligence community has found that the Russian government is intent on election interference in the United States. They did it in 2016. They’re doing it in 2020. The intelligence also says they want to help President Trump. They did so in 2016. The president doesn’t like this to be said, but it’s a fact. We also know that the Russian strategy, this time around, is going after Hunter Biden. So I recognize that the details of how to handle misinformation on the internet are tough. But I think that companies like Twitter and Facebook that took action to not be a part of a suspected Russian election interference operation, were doing the right thing. And let me be clear. No one believes these companies represent the law or represent the public.
Senator Tom Udall: (02:31:55)
When we say work the refs, the US government is the referee. The FCC, the Congress, the Presidency, and the Supreme Court are the referees. It’s very dangerous for President Trump, Justice Thomas and Republicans in Congress and at the FCC to threaten new federal laws in order to force social media companies to amplify false claims, conspiracy theories and disinformation campaigns. And my question to all three of you: do the Russian government and other foreign nations continue to attempt to use your companies’ platforms to spread disinformation and influence the 2020 election?
Senator Tom Udall: (02:32:44)
Can you briefly describe what you are seeing? Please start Mr. Dorsey, and then Mr. Pichai. And Mr. Zuckerberg, you gave an answer partially on this. I’d like you to expand on that answer. Thank you.
Jack Dorsey: (02:33:00)
Yes. So we do continue to see interference. We recently disclosed actions we took on activity originating out of both Russia and Iran. We’ve made those disclosures public, and we can share those with your team. But this remains, as you’ve heard from others on the panel and as Mark has detailed, one of our highest priorities, and we want to make sure that we are focused on eliminating as much platform manipulation as possible.
Sundar Pichai: (02:33:38)
Senator, we do continue to see coordinated influence operations at times. We’ve been very vigilant. We appreciate the cooperation we get from intelligence agencies, and as companies, we are sharing information. To give you an example, and we publish transparency reports, in June we identified efforts, one from an Iranian group, APT35, targeting the Trump campaign, and one from a Chinese group, APT31, targeting the Biden campaign. Most of these were phishing attempts, but our spam filters were able to keep most of those emails from reaching users. And we notified intelligence agencies. That’s an example of the kind of activity we see, and I think it’s an area where you would need strong cooperation with government agencies moving forward.
Chairman Wicker: (02:34:32)
Mark Zuckerberg: (02:34:35)
Senator, like Jack and Sundar, we also see continued attempts by Russia and other countries, especially Iran and China, to run these kinds of information operations. We also see an increase in domestic operations around the world. Fortunately, we’ve been able to build partnerships across the industry, both with the companies here today and with law enforcement and the intelligence community, to be able to share signals to identify these threats sooner. And along the lines of what you mentioned earlier, one of the threats that the FBI has alerted our companies and the public to was the possibility of a hack-and-leak operation in the days or weeks leading up to this election.
Mark Zuckerberg: (02:35:27)
So you had both the public testimony from the FBI and in private meetings alerts that were given to at least our company, I assume the others as well, that suggested that we be on high alert and sensitivity that if a trove of documents appeared that we should view that with suspicion, that it might be part of a foreign manipulation attempt. So that’s what we’re seeing. And I’m happy to go into more detail as well, if that’s helpful.
Senator Tom Udall: (02:36:00)
Okay. Thank you very much. This one is a really simple question. I think a yes or no. Will you continue to push back against this kind of foreign interference, even if powerful Republicans threaten to take official action against your companies? Mr. Zuckerberg, why don’t we start with you and work the other way back?
Mark Zuckerberg: (02:36:20)
Senator, absolutely. This is incredibly important for our democracy and we’re committed to doing this work.
Sundar Pichai: (02:36:29)
Senator, absolutely. Protecting our civic and democratic processes is fundamental to what we do. We will do everything we can.
Jack Dorsey: (02:36:38)
Yes. And we will continue to work and push back on any manipulation of the platform.
Senator Tom Udall: (02:36:45)
Thank you for those answers. Mr. Zuckerberg do Facebook and other social media networks have an obligation to prevent disinformation and malicious actors spreading conspiracy theories, dangerous health disinformation, and hate speech, even if preventing its spread means less traffic and potentially less advertising revenue for Facebook?
Mark Zuckerberg: (02:37:12)
Senator, in general, yes. I think that for foreign countries trying to interfere in democracy, I think that that is a relatively clear cut question where I would hope that no one disagrees that we don’t want foreign countries or governments trying to interfere in our elections, whether through disinformation or fake accounts or anything like that. Around health misinformation, we’re in the middle of a pandemic, it’s a health emergency. I certainly think that this is a high sensitivity time. So, we’re treating with extra sensitivity any misinformation that could lead to harm around COVID that would lead people to not get the right treatments or to not take the right security precautions. We do draw a distinction between harmful misinformation and information that’s just wrong. And we take a harder line and more enforcement against harmful misinformation.
Chairman Wicker: (02:38:09)
Senator Tom Udall: (02:38:15)
Thank you, Mr. Chairman.
Chairman Wicker: (02:38:16)
Thank you Senator Udall. Senator Capito.
Senator Shelley Moore Capito: (02:38:16)
Thank you, Mr. Chairman, and thank all of you for being with us today. I would say that any time that we can get the three of you in front of the American people, whether it’s several days before an election or several days after, is extremely useful and can be very productive. So I appreciate the three of you coming and the committee holding this hearing.
Senator Shelley Moore Capito: (02:38:37)
As we’ve heard, Americans turn every day to your platforms for a lot of different information. I would like to give a shout-out to Mr. Zuckerberg, because the last time he was in front of our committee, I had asked him to extend the reach of Facebook into rural America and help us with our fiber deployments there. And in this COVID environment, we see how important that is. And he followed through with that. I would like to thank him and his company for partnering with us in West Virginia to get more people connected. And I think that is essential. I would make a suggestion as well. Maybe when we get to the end, when we talk about fines, with these million- and billion-dollar fines that some of your companies have been penalized with, we could make a great jump and get to that last household. But the topic today is objectionable content and how you make those judgments. So quickly, each one of you: I know that Section 230 says the term is objectionable content, or otherwise objectionable. Would you be in favor of redefining that more specifically? That’s awfully broad. And that’s where I think some of these questions become very difficult to answer. So we’ll start with Mr. Dorsey: how do you define otherwise objectionable, and how can we improve that definition so that it’s easier to follow?
Jack Dorsey: (02:40:06)
Well, our interpretation of objectionable is anything that is limiting potentially the speech of others. All of our policies are focused on making sure that people feel safe to express themselves. And when we see abuse, harassment, misleading information, these are all threats against that. And it makes people want to leave the internet, makes people want to leave these conversations online. So that is what we’re trying to protect, is making sure that people feel safe enough and free enough to express themselves in whatever way they wish.
Senator Shelley Moore Capito: (02:40:42)
So this is a follow-up to that. Much has been said about the blocking of the New York Post. Do you have an instance of when you’ve actually blocked somebody who would be considered politically liberal, on the other side of the political realm in this country? Do you have an example of that to offset where the New York Post criticism has come from?
Jack Dorsey: (02:41:04)
Well, we don’t have an understanding of the ideology of any one particular account, and that is also not how our policies are written or enforcement taken. So I’m sure there are a number of examples. But that is not our focus. We’re looking purely at the violations of our policies, taking action against that.
Senator Shelley Moore Capito: (02:41:23)
Yeah. Mr. Zuckerberg, how would you define otherwise objectionable? Not how would you define it, but how would you refine the definition of that to make it more objective than subjective?
Mark Zuckerberg: (02:41:39)
Senator, thank you. When I look at the written language in Section 230 and the content that we think shouldn’t be allowed on our services, some of the things that we bucket in otherwise objectionable content today include general bullying and harassment of people on the platform. So somewhat similar to what Jack was just talking about a minute ago. And I would worry that some of the proposals that suggest getting rid of the phrase, otherwise objectionable from Section 230 would limit our ability to remove bullying and harassing content from our platforms, which I think would make them worse places for people. So I think we need to be very careful in how we think through that.
Senator Shelley Moore Capito: (02:42:24)
Well, thank you. Mr. Pichai.
Sundar Pichai: (02:42:28)
Senator, maybe what I would add is that the content is so dynamic. YouTube gets 500 hours of video uploaded per minute. On an average day, 15% of search queries are ones we have never seen before. To give you an example, a few years ago, there was an issue around teenagers consuming Tide Pods. And it was the kind of issue which was causing real harm. When we run into those situations, we are able to act with certainty and protect our users. The Christchurch shooting, where there was a live shooter live-streaming horrific images, was a learning moment for all our platforms. We were able to intervene, again with certainty. And so that’s what otherwise objectionable allows. And I think that flexibility is what allows us to focus. We always state with clear policies what we are doing, but I think it gives platforms of all sizes the flexibility to protect their users.
Senator Shelley Moore Capito: (02:43:28)
Thank you. I think I’m hearing from all three of you, really, that the definition is fairly acceptable to you all. In my view, sometimes I think it can go too much to the eye-of-the-beholder type… the beholder being either you all, or your reviewers, or your AI, and then it gets into a region where maybe it becomes very subjective.
Senator Shelley Moore Capito: (02:43:50)
I want to move to a different topic because in my personal conversations with at least two of you, you’ve expressed the need to have the 230 protections because of the protections that it gives to the small innovators. Well, you sit in front of us. And I think all of us are wondering, how many small innovators and what kind of market share could they possibly have when we see the dominance of the three of you? I understand you started as small innovators when you first started. I get that. How can a small innovator really break through? And what does 230 really have to do with the ability of a… I’m skeptical on the argument, quite frankly. So, whoever wants to answer that. Mr. Zuckerberg, do you want to start?
Mark Zuckerberg: (02:44:39)
Sure, Senator. I do think that if, when we were getting started with building Facebook, if we were subject to a larger number of content lawsuits, because 230 didn’t exist, that would have likely made it prohibitive for me as a college student in a dorm room to get started with this enterprise. And I think that it may make sense to modify 230 at this point, just to make sure that it’s still working as intended. But I think it’s extremely important that we make sure that for smaller companies that are getting started, the cost of having to comply with any regulation is either waived until a certain scale, or it is, at a minimum, taken into account as a serious factor to make sure that we’re not preventing the next set of ideas from getting built.
Chairman Wicker: (02:45:32)
Thank you. Thank you, Senator.
Senator Shelley Moore Capito: (02:45:33)
Thank you, Mr. Chairman.
Chairman Wicker: (02:45:34)
Thank you. Senator Baldwin.
Senator Tammy Baldwin: (02:45:34)
Thank you. I’d like to begin by making two points. I believe the Republicans have called this hearing in order to support a false narrative fabricated by the president to help his reelection prospects. And number two, I believe that the tech companies here today need to take more action, not less, to combat misinformation, including misinformation on the election, misinformation on the COVID-19 pandemic and misinformation and posts meant to incite violence. And that should include misinformation spread by President Trump on their platforms.
Senator Tammy Baldwin: (02:46:25)
So I want to start with asking the Committee Clerk to bring up my first slide. Mr. Dorsey, I appreciate the work that Twitter has done to flag or even take down false or misleading information about COVID-19, such as this October 11th tweet by the president claiming he has immunity from the virus after contracting it and recovering, contrary to what the medical community tells us.
Senator Tammy Baldwin: (02:46:59)
Just yesterday morning, the president tweeted this, that the media is incorrectly focused on the pandemic and that our nation is quote rounding the turn on COVID-19. In fact, according to Johns Hopkins University, in the past week, the seven day national average of new cases reached its highest level ever. And in my home state of Wisconsin, case counts continue to reach record levels. Yesterday, Wisconsin set a new record with 64 deaths and 5,462 new confirmed cases of COVID-19.
Senator Tammy Baldwin: (02:47:39)
That is not rounding the turn, but it’s also not a tweet that was flagged or taken down. Mr. Dorsey, given the volume of misleading posts about COVID-19 out there, do you prioritize removal based on something like the reach or audience of a particular user of Twitter?
Jack Dorsey: (02:48:02)
I could be mistaken, but it looks like the tweet that you showed actually did have a label pointing to both of them, pointing to our COVID resource hub in our interface. So in regards to misleading information, we have policies against manipulated media in support of public health and COVID information and election interference and civic integrity. And we take action on it. In some cases it’s labeling, in some cases it’s removal.
Senator Tammy Baldwin: (02:48:40)
What additional steps are you planning to take to address dangerously misleading tweets like the president rounding the turn tweet?
Jack Dorsey: (02:48:52)
We want to make sure that we are giving people as much information as possible and that ultimately we’re connecting the dots when they see information like that, that they have an easy way to get an official resource or many more viewpoints on what they’re seeing. So we’ll continue to refine our policy. We’ll continue to refine our enforcement around misleading information, and we’re looking deeply at how we can evolve our product to do the same.
Senator Tammy Baldwin: (02:49:27)
Mr. Zuckerberg, I want to turn to you to talk about the ongoing issue of right-wing militias using Facebook as a platform to organize and promote violence. Could the Committee Clerk please bring up my second slide? On August 25th, a self-described militia group called Kenosha Guard created a Facebook event page entitled Armed Citizens to Protect our Lives and Property, encouraging armed individuals to go to Kenosha and quote defend the city during a period of civil unrest, following the police shooting of Jacob Blake. That evening, a 17-year-old from Illinois did just that and ended up killing two protesters and seriously injuring a third. Commenters in this group wrote that they wanted to kill looters and rioters and switch to real bullets and put a stop to these rioting impetuous children.
Senator Tammy Baldwin: (02:50:37)
Although Facebook already had a policy in place banning militia groups, this page remained up. According to press reports, Facebook received more than 450 complaints about this page, but your content moderators did not remove it, something you subsequently called an operational mistake. Recently, as you heard earlier in questions, the alleged plot to kidnap Michigan Governor Gretchen Whitmer and the potential for intimidation or even violence at voting locations show that the proliferation of the threat of violence on Facebook remains a very real and urgent problem.
Senator Tammy Baldwin: (02:51:24)
Mr. Zuckerberg, in light of the operational mistake around Kenosha, what steps has Facebook taken to ensure that your platform is not being used to promote more of this type of violence?
Mark Zuckerberg: (02:51:38)
Thank you, Senator. This is a big area of concern for me personally, and for the company. We’ve strengthened our policies to prohibit any militarized social movement, so any kind of militia like this. We’ve also banned conspiracy networks, QAnon being the largest example of that. That is completely prohibited on Facebook at this point. In this period where, personally, I’m worried about the potential for increased civil unrest, making sure that those groups can’t organize on Facebook may cut off some legitimate uses, but I think it will also preclude greater potential for organizing any harm. And by making the policy simpler, we will also make it so that there are fewer mistakes in content moderation. So I feel like we’re in a much stronger place on these policies at this point.
Chairman Wicker: (02:52:43)
Thank you, Senator Baldwin. Senator Lee.
Senator Mike Lee: (02:52:48)
Thank you very much, Mr. Chairman. I want to read a few quotes from each of you, each of our three witnesses, and from your companies. And then I may ask for a response. So, Mr. Zuckerberg, this one is from you. You said, quote, “We’ve built Facebook to be a platform for all ideas. Our community’s success depends on everyone feeling comfortable, sharing what they want. It doesn’t make sense for our mission or for our business to suppress political content or prevent anyone from saying what matters most to them.” You said that, I believe, on May 18, 2016.
Senator Mike Lee: (02:53:27)
Mr. Dorsey, on September 5th, 2018, you said, “Let me be clear about one important and foundational fact. Twitter does not use political ideology to make any decisions.” Mr. Pichai, on October 28th, 2020, you said, “Let me be clear. We approach our work without political bias, full stop.”
Senator Mike Lee: (02:53:52)
Now, these quotes make me think that there is a good case to be made that you’re engaging in unfair or deceptive trade practices in violation of federal law. I see these quotes where each of you tell consumers and the public about your business practices. But then you seem to do the opposite and take censorship related actions against the president, against members of his administration, against the New York Post, The Babylon Bee, The Federalist, pro-life groups. And there are countless other examples. In fact, I think the trend is clear that you almost always censor. And when I use the word Censor here, I mean block content, fact-check or label content, or demonetize websites of conservative, Republican, or pro-life individuals or groups or companies, contradicting your commercial promises. But I don’t see this suppression of high-profile liberal commentators. So for example, have you ever censored a Democratic senator?
Senator Mike Lee: (02:55:03)
How about President Obama? How about a Democratic presidential candidate? How about Planned Parenthood or NARAL or EMILY’s List? Mr. Zuckerberg, Mr. Dorsey, and Mr. Pichai, can any of you, and let’s go in that order, Zuckerberg, Dorsey and then Pichai, name for me one high profile person or entity from a liberal ideology whom you have censored, and what particular action you took?
Mark Zuckerberg: (02:55:38)
Senator, I can get you a list of some more of this, but there are certainly many examples that your democratic colleagues object to when fact checker might label something as false that they disagree with or they’re not able to-
Senator Mike Lee: (02:55:55)
Yeah, I get that. I get that. I just want to be clear. I’m just asking you if you can name for me one high profile liberal person or company who you have censored. I understand that you’re saying that there are complaints on both sides, but I just want one name of one person or one entity.
Mark Zuckerberg: (02:56:18)
Senator, I need to think about it and get you more of a list. But there are certainly many, many issues on both sides of the aisle, where people think we’re making content moderation decisions that they disagree with.
Senator Mike Lee: (02:56:31)
I got that. And I think everybody on this call could agree that they could identify at least five, maybe 10, maybe more, high profile, conservative examples. But what about you, Mr. Dorsey?
Jack Dorsey: (02:56:45)
We can give a more exhaustive list, but again, we don’t have an understanding of political [inaudible 02:56:52] of our accounts, but-
Senator Mike Lee: (02:56:54)
I’m not asking for an exhaustive list, I’m asking for a single example, one, just one individual, one entity, anyone.
Jack Dorsey: (02:57:00)
We’ve taken action on tweets from members of the House for election misinfo.
Senator Mike Lee: (02:57:05)
Can you identify an example?
Jack Dorsey: (02:57:08)
Yes. Two Democratic Congresspeople-
Senator Mike Lee: (02:57:14)
What are their names?
Jack Dorsey: (02:57:15)
I’ll get those names to you.
Senator Mike Lee: (02:57:17)
Great. Great, Mr. Pichai, how about you?
Sundar Pichai: (02:57:21)
Senator, I’ll give specific examples, but let me step back. We don’t censor; we have moderation policies, which we apply equally. To give you an example-
Senator Mike Lee: (02:57:30)
I get that. I used the word censor as a term of art there, and I defined that term. Again, I’m not asking for a comprehensive list, I want a name, any name.
Sundar Pichai: (02:57:40)
We have turned down ads from Priorities USA and from Vice President Biden’s campaign. We have had compliance issues with World Socialist Review, which is a left-leaning publication. We can give you several examples. For example, we have a graphic content policy: we don’t allow ads that show graphic violent content. And we have taken down ads on both sides of the campaign, and I gave you a couple of examples.
Senator Mike Lee: (02:58:07)
Okay. At least with respect to Mr. Zuckerberg and Mr. Dorsey. And I would point out that with respect to Mr. Pichai, those are not nearly as high profile. I don’t know if anyone, picked at random from the public or even from the politically active community in either political party, could identify those right off the bat. There is a disparity between the censorship, and again, I’m using that as a term of art, as I’ve defined it a moment ago, between the censorship of conservative and liberal points of view. And it’s an enormous disparity. Now you have the right, I want to be very clear about this, you have every single right to set your own terms of service and to interpret them and to make decisions about violations. But given the disparate impact of who gets censored on your platforms, it seems that you’re either, one, not enforcing your terms of service equally, or alternatively, two, writing your standards to target conservative viewpoints.
Senator Mike Lee: (02:59:15)
You certainly have the right to operate your own platform, but you also have to be transparent about your actions, at least in the sense that you can’t promise certain corporate behavior and then deceive customers through actions that blatantly contradict what you’ve stated as your corporate business model or as your policy. So Mr. Zuckerberg and Mr. Dorsey, if Facebook is still a platform for all ideas, and if Twitter “does not use political ideology to make decisions,” then do you state before this committee, for the record, that you always apply your terms of service equally to all of your users?
Mark Zuckerberg: (03:00:02)
Senator, our principle is to stand for free expression and to be a platform for all ideas. I certainly don’t think we have any intentional examples where we’re trying to enforce our policies in a way that is anything other than fair and consistent. But it’s also a big company, so I get that there are probably mistakes that are made from time to time. But our North Star, and what we intend to do, is to be a platform for all ideas and to give everyone a voice.
Senator Mike Lee: (03:00:35)
Okay. I appreciate that. I understand what you’re saying about intentional examples of a big company, but again, there is a disparate impact. There is a disparate impact that’s unmistakable, as evidenced by the fact that neither you nor Jack could identify a single example. Mr. Dorsey, how do you answer that question?
Chairman Wicker: (03:00:50)
Brief answer please, Mr. Dorsey.
Jack Dorsey: (03:00:53)
Yes. So we operate our enforcement and our policy without an understanding of political ideology. Any time we find examples of bias in how people operate our systems or our algorithms, we remove it. And as Mark mentioned, there are checkpoints in these companies and in these frameworks, and we do need more transparency around them and how they work. And we do need a much more straightforward, quick and efficient appeals process, to give us a further checkpoint from the public.
Chairman Wicker: (03:01:28)
Thank you, Senator Lee. Senator Duckworth.
Senator Duckworth: (03:01:32)
Thank you, Mr. Chairman. I’ve devoted my life to public service, to upholding a sacred oath to support and defend the constitution of the United States against all enemies, foreign and domestic. And I have to be honest, it makes my blood boil and it also breaks my heart a little as I watched my Republican colleagues, just days before an election, sink down to the level of Donald Trump. By placing the selfish interest of Donald Trump ahead of the health of our democracy, Senate Republicans, whether they realize it or not, are weakening our national security and providing aid to our adversaries. As my late friend, Congressman Cummings often reminded us, we’re better than this. Look, our democracy is under attack right now. Every American, every member of Congress should be committed to defending the integrity of our elections from hostile foreign interference.
Senator Duckworth: (03:02:23)
Despite all the recent talk of great power competition, our adversaries know they still cannot defeat us on a conventional battlefield. Meanwhile, the members of the United States military and our dedicated civil servants are working around the clock in the cyber domain to counter hostile actors such as Iran, China, and Russia. And they do this while the Commander in Chief cowers in fear of Russia and stubbornly refuses to take any action to criticize or warn Russia against endangering our troops. I have confidence in the United States armed forces, intelligence community, and civil servants. Their effective performance explains why our foreign adversaries have sought alternative avenues to attack our nation. Afraid to face us in conventional military or diplomatic ways, they look for unconventional means to weaken our democracy, and they realize that social media could be the exhaust port of our democracy. Social media is so pervasive in the daily lives of Americans and traditional media outlets that it can be weaponized to manipulate the public discourse and destabilize our institutions. After Russia was incredibly successful in disrupting our democracy four years ago, all of our adversaries learned a chilling lesson: social media companies cannot be trusted to put patriotism above profit. Facebook and Twitter utterly failed to hinder Russia’s sweeping and systemic interference in our 2016 election, which used the platforms to infiltrate our communities, spread disinformation and turn Americans against one another. Of course, the situation has grown far worse today, as evidenced by today’s partisan sham hearing. While corporations may plead ignorance prior to the 2016 election, President Trump and his Republican enablers in the Senate have no such excuse. Senate Republicans cut a deal to become the party of Trump, and now they find themselves playing a very dangerous game.
By encouraging Russia’s illegal hacking, by serving as the spreaders and promoters of disinformation cooked up by foreign intelligence services, and by falsely claiming censorship when responsible actors attempt to prevent hostile foreign adversaries from interfering in our elections, Senate Republicans insult the efforts of true patriots working to counter malign interference and weaken our security.
Senator Duckworth: (03:04:49)
This committee is playing politics at a time when responsible public officials should be doing everything to preserve confidence in our system of elections and system of government. The reckless actions of Donald Trump and Senate Republicans do not let technology companies off the hook. None of the companies testifying before our committee today are helpless in the face of threats to our democracy, small-d democracy. Federal law provides your respective companies with authority to counter foreign disinformation and counterintelligence propaganda. And I want to be absolutely clear, gentlemen, that I fully expect each of you to do so. Each of you will be attacked by the president, Senate Republicans and right-wing media for countering hostile foreign interference in our election, but you have a duty to do the right thing. Because facts still exist. Facts still matter. Facts save lives. And there’s no both sides when one side has chosen to reject truth and embrace poisonous false information.
Senator Duckworth: (03:05:52)
So in closing, I would like each witness to provide a personal commitment that your respective companies will proactively counter domestic disinformation that spreads dangerous lies, such as “masks don’t work,” while aggressively identifying and removing disinformation that is part of foreign adversaries’ efforts to interfere in our election or undermine our democracy. Do I have that commitment from each of you, gentlemen?
Chairman Wicker: (03:06:17)
Okay. We’ll take Dorsey, Pichai and then Zuckerberg. Mr. Dorsey.
Jack Dorsey: (03:06:22)
We’ve made that commitment.
Chairman Wicker: (03:06:23)
Sundar Pichai: (03:06:26)
Senator, absolutely yes.
Chairman Wicker: (03:06:29)
And Mr. Zuckerberg.
Mark Zuckerberg: (03:06:31)
Yes, Senator, I agree with that.
Senator Duckworth: (03:06:35)
Thank you. Your industry’s success or failure in achieving this goal will have far-reaching, life-or-death consequences for the American people and the future of our democracy. Thank you, and I yield back, Mr. Chairman.
Chairman Wicker: (03:06:45)
The Senator yields back. Senator Johnson.
Senator Johnson: (03:06:51)
I’d like to start with a question for all three of the witnesses. There are public reports that you have different chat forums in your companies, and also public reports that the few conservatives who might work for your companies have certainly been harassed on those types of forums. I don’t expect you to have taken a poll of your employees, but I just want to get kind of a sense, because I think it’s pretty obvious. But would you say that the political ideology of the employees of your company is, let’s say, 50/50 conservative versus liberal progressive, or do you think it’s closer to 90% liberal, 10% conservative? We’ll start with Mr. Dorsey.
Jack Dorsey: (03:07:35)
As you mentioned, I don’t know the makeup of our employees, because it’s not something we ask or focus on.
Senator Johnson: (03:07:41)
Just what do you think, off the top of your head, based on your chat rooms and kind of the people you talk to?
Jack Dorsey: (03:07:46)
Not something I look for or look-
Senator Johnson: (03:07:48)
Yeah, right. Okay. Mr. Pichai?
Sundar Pichai: (03:07:52)
Senator, we have over a hundred thousand employees. For the past two years, we have hired greater than 50% of our workforce outside California, and it does tend to be proportionate to the areas where we are. But we have a million message boards at Google; we have groups on the Republican side, liberal side, conservative side and so on, and we have definitely made an effort to make sure people of all viewpoints are welcome.
Senator Johnson: (03:08:18)
So again, you won’t … Mr. Zuckerberg, will you answer the question honestly? Is it 90% or 50/50? Which is it closer to?
Mark Zuckerberg: (03:08:26)
Senator, I don’t know the exact number, but I would guess that our employee base skews left-leaning.
Senator Johnson: (03:08:34)
Thank you for that honesty. Mr. Dorsey, you started your opening comments that you think that people don’t trust you. I agree with that. We don’t trust you. You all say you’re fair and you’re consistent, you’re neutral, you’re unbiased. Mr. Dorsey, I think the most incredible answer I’ve seen so far in this hearing is when Senator Cruz asked, “Does Twitter have the ability to influence the elections?” Again, does Twitter have the ability to influence elections? You said no. Do you stick with that answer, that you don’t even believe … let’s face it, you all believe that Russia has the ability to influence the elections or interfere by using your social platforms. Mr. Dorsey, do you still deny that you don’t have the ability to influence and interfere in our elections?
Jack Dorsey: (03:09:22)
Yeah. I mean, my answer was around people’s choice around other communication channels.
Senator Johnson: (03:09:27)
No, the question was, does Twitter have the ability to influence the elections? You said no. Do you still stand by that answer?
Jack Dorsey: (03:09:38)
Twitter as a company, no. No, we-
Senator Johnson: (03:09:41)
You don’t think you have the ability, by moderation policies, by, as Senator Lee and I would call it, censoring what you do here with posts? You don’t think that censorship, that moderation policies, you don’t think that influences the elections by withholding what I believe is true information from the American public? You don’t think that interferes in the elections?
Jack Dorsey: (03:10:02)
Not our current moderation policies. Our current moderation policies are to protect the conversation and the integrity of the conversation around the elections.
Senator Johnson: (03:10:09)
Okay. For both Mr. Zuckerberg and Mr. Dorsey, who censored New York Post stories or throttled them back: did either one of you have any evidence that the New York Post story is part of Russian disinformation, or that those emails aren’t authentic? Do any of you have any information whatsoever that they’re not authentic or that they are Russian disinformation? Mr. Dorsey.
Jack Dorsey: (03:10:35)
Senator Johnson: (03:10:38)
So why would you censor it? Why did you prevent that from being disseminated, on your platform that is supposed to be for the free expression of ideas and particularly true ideas?
Jack Dorsey: (03:10:48)
We believed it fell afoul of our hacked materials policy. We judged-
Senator Johnson: (03:10:53)
What evidence did you have that it was hacked? They weren’t hacked.
Jack Dorsey: (03:10:56)
We judged in the moment that it looked like it was hacked materials.
Senator Johnson: (03:11:00)
You were wrong.
Jack Dorsey: (03:11:02)
And we updated our policy and our enforcement within 24 hours.
Senator Johnson: (03:11:08)
Mr. Zuckerman, Zuckerberg.
Mark Zuckerberg: (03:11:12)
Senator, as I testified before, we relied heavily on the FBI’s intelligence and alert to us, both through their public testimony and a private briefing, the alerts they gave us.
Senator Johnson: (03:11:24)
Did the FBI contact you and say the New York Post story was false?
Mark Zuckerberg: (03:11:28)
Senator, not about that story specifically [crosstalk 03:11:32].
Senator Johnson: (03:11:33)
Why did you throttle it back?
Mark Zuckerberg: (03:11:35)
They alerted us to be on heightened alert around a risk of hack and leak operations, around a released trove of information. And Senator, to be clear on this, we didn’t censor the content, we flagged it for fact checkers to review and pending that review, we temporarily constrained its distribution to make sure that it didn’t spread wildly while it was being reviewed. But it’s not up to us either to determine whether it’s Russian interference, nor whether it’s true. We rely on the FBI intelligence and fact checkers to do that.
Senator Johnson: (03:12:12)
Mr. Dorsey, you talked about your policies toward misinformation and that you will block misinformation if it’s against civic integrity, election interference, or voter suppression. Let me give you a tweet that was put up on Twitter. It says, “Senator Ron Johnson is my neighbor and strangled our dog, Buttons, right in front of my four year old son and three year old daughter. The police refused to investigate. This is a complete lie, but important to retweet and note that there are more of my lies to come.” Now, we contacted Twitter and we asked them to take it down and here’s the response. “Thanks for reaching out. We escalated this to our support team for their review, and they have determined that this is not a violation of our policies.”
Senator Johnson: (03:13:03)
So Mr. Dorsey, how could a complete lie, it’s admitted it’s a lie, how does that not affect civic integrity? How could you not view that as being election interference? Let’s face it, that could definitely impact my ability to get reelected. How could that not be a violation of voter suppression? Obviously, if people think I’m strangling my neighbor’s dog, they may not show up at the polls. That would be voter suppression. So why didn’t Twitter take that … By the way, that tweet was retweeted something like 17,000 times, and viewed, loved, commented on, and appreciated by over 50,000 people. How is that not voter suppression? How is that not election interference? How does that not affect civic integrity?
Jack Dorsey: (03:13:53)
We’ll have to look into our enforcement, or non-enforcement in this case, of the tweet, and we can get back to you with more context.
Senator Johnson: (03:14:00)
So Mr. Zuckerberg, in that same June hearing … Real quick, Mr. Dorsey, you referred to that June hearing, Stephan [inaudible 03:14:09] had all kinds of good ideas. That’s 16 months ago. Why haven’t you implemented any of those transparency ideas that you thought were pretty good 16 months ago?
Jack Dorsey: (03:14:18)
Well, he was talking about algorithm choice, and we have implemented one of them, which is we allow people to turn off the ranking of their timeline. The rest is work and it’s going to take some time.
Senator Johnson: (03:14:28)
I would get to it if I were you. Thank you, Mr. Chairman
Chairman Wicker: (03:14:31)
Senator Johnson, thank you. Let me just make sure I understood the answer Mr. Dorsey and Mr. Zuckerberg. Mr. Dorsey, did I understand you to say that you have no information indicating that the New York Post story about Hunter Biden has a Russian source? Did I understand correctly?
Jack Dorsey: (03:14:57)
Yes, not that I’m aware of.
Chairman Wicker: (03:14:58)
And is that also your answer, Mr. Zuckerberg, that you have no information at all to indicate that Russia was the source of this New York Post article?
Mark Zuckerberg: (03:15:09)
Senator, I would rely on the FBI to make that assessment.
Chairman Wicker: (03:15:13)
But you don’t have any such information do you?
Mark Zuckerberg: (03:15:16)
I do not, myself.
Chairman Wicker: (03:15:17)
Just trying to clarify the answer to Senator Johnson’s question. Thank you very much for indulging me there. Senator Tester, you are next, sir.
Senator Tester: (03:15:25)
I want to thank you, Mr. Chairman, and I want to thank Sundar and Jack and Mark for being in front of this committee. There is no doubt that there are some major issues with Google and Facebook and Twitter that Congress needs to address. Quite frankly, big tech is the unregulated Wild West that needs to be held accountable. And we do need to hear from all three of you about a range of critical issues that Americans deserve answers on: data privacy, antitrust, the proliferation of misinformation on your platforms. In a moment, I’m going to ask all of you to commit to returning to this committee early next year to have a hearing on these important issues.
Senator Tester: (03:16:05)
But the truth is my Republican colleagues arranged this hearing less than a week from election day for one specific reason: to make a last-ditch case, based on shoddy evidence, that these companies are censoring conservative voices. It is a stunt, and it’s a cheap stunt at that. It is crystal clear that this hearing is designed to cast doubt on the fairness of the upcoming election and to work with the platforms to allow bad information to stay up as November 3rd approaches. It is also crystal clear that the directive to hold this political hearing came straight from the White House. And it is a sad day when the United States Senate, an equal part of an independent branch of government, allows the Senate’s halls to be used for the president’s political stunts.
Senator Tester: (03:16:58)
There is a national election in six days, Mr. Chairman, you had nearly two years to hold this hearing and it’s happening six days before the election. The idea that we should have a sober hearing about putting the reins on big tech six days before the election, quite frankly doesn’t pass the smell test. Today, this hearing is about electoral politics. I know it, you know it, everybody in this room knows it, and I know the American people are smart enough to figure that out. I’m going to talk a little more about that in a second, but first I want to thank the panel once again for being here. I start by asking a question about making a more sincere effort to discuss the issues that surround big tech down the road. So the question for the panel, and this is a yes or no answer, will you commit to returning to testify again in the new Congress? Start with you, Jack.
Jack Dorsey: (03:17:57)
Yes, we’re always happy to, myself or one of our teammates is always happy to talk to people.
Senator Tester: (03:18:06)
Sundar Pichai: (03:18:08)
Senator, yes. We have engaged many times and we are happy to continue that engagement to Congress.
Senator Tester: (03:18:14)
How about you Mark?
Mark Zuckerberg: (03:18:17)
Senator, yes. I hope we can continue to have this conversation and hopefully not just with the CEO’s of the companies, but also with experts who work on these issues every day as part of their jobs.
Senator Tester: (03:18:30)
Absolutely. I think the more information, the better, but not based on politics, based on reality. And I want to thank you for that because we are in a very unreal time when it comes to politics. Quite frankly, we are in a time when fake news is real and real news is fake. And you guys try to shut down the fake news, whether it comes from Joe Biden’s mouth or whether it comes from Donald Trump’s mouth. And the fact is if Joe Biden said some of the crazy and offensive stuff that the president has said, he would get fact checked in the same way. Wouldn’t you agree? You can nod your head to that. Wouldn’t you agree, if Joe Biden said the same stuff that Trump said, that you would do the same sort of fact checking on him?
Chairman Wicker: (03:19:18)
Shall we take Mr. Dorsey, Mr. Pichai and Mr. Zuckerberg in that order?
Jack Dorsey: (03:19:23)
If we found violations of our policy, we would do the appropriate enforcement action.
Senator Tester: (03:19:31)
Chairman Wicker: (03:19:37)
Just go ahead then. Mr. Pichai.
Sundar Pichai: (03:19:41)
Senator, yeah, we would apply our policies without regard to who it is from and we apply it neutrally.
Senator Tester: (03:19:47)
Okay, thank you. Mark.
Mark Zuckerberg: (03:19:51)
Senator, I agree with what Jack and Sundar have said, we would also apply our policies to everyone. And in fact, when Joe Biden tweets or posts and cross posts to Facebook about the election, we put the same label, adding context about voting, on his post as we do for other candidates.
Senator Tester: (03:20:18)
Thank you for that. In 2016, Russia built a network of bots and fake accounts that they used to spread disinformation. This year, it seems they are seeding your networks with disinformation, relying on Americans, including some folks in Congress, to amplify and distribute it. What tools do you have to fight foreign disinformation on your platforms, when it spread by Americans? Jack.
Jack Dorsey: (03:20:44)
We’re looking at … our policies are against platform manipulation, period, no matter where it comes from. So whether that’s foreign or domestic, we see patterns of people or organizations that attempt to manipulate the platform and the conversation, or artificially amplify information.
Senator Tester: (03:21:06)
Mark Zuckerberg: (03:21:09)
Senator, the efforts are a combination of AI systems that look for anomalous behavior by accounts or networks of accounts, a large human effort where we have 35,000 employees who work on security and content review, and partnerships that we’ve made with the other tech companies here, as well as law enforcement and intelligence community and election officials across the world, to make sure that we have all the appropriate input signals and can share signals on what we’re seeing with the other platforms as well.
Senator Tester: (03:21:39)
Sundar Pichai: (03:21:40)
Two things to add Senator, to give different examples. We partner with over 5,000 civic entities, campaign organizations at the federal and the state level, to protect their campaign digital assets with our advanced protection program and training. And I would echo there’s been an enormous increase in cooperation between the tech companies, as companies, we are sharing a lot of information and doing more together than ever before.
Senator Tester: (03:22:07)
Thank you. I just want to close with one thing. We’ve heard a lot of information out here today that when you hire somebody, you’re supposed to ask them their political affiliation. You’re supposed to ask them who they’ve donated to. There’s supposed to be a political litmus test. If you hire a Biden person, you’re supposed to hire a Trump person. Why not hire a Tester person, huh? Let’s talk about business. We want to regulate business, and if that business is run by a liberal, we’re going to regulate them differently than if they’re run by a conservative outfit. That reminds me a lot of the Supreme Court, where you have two sets of rules, one for a Democratic president, one for a Republican. This is baloney, folks. Get off the political garbage and let’s have Congress do its job. Thank you very much.
Chairman Wicker: (03:22:51)
Thank you Senator Tester. Senator Scott.
Senator Scott: (03:22:56)
Thank you, Chairman, for hosting this. I think, first off, if you’ve been following all of this today, you would have clearly come to the conclusion that Republicans believe that you censor and Democrats think it’s pretty good what you’re doing. We’re blessed to live in the United States, a democracy where we grant individual freedoms and liberties under the Constitution. This isn’t the case around the world. We can look at what’s happening in communist China right now. General Secretary Xi has committed horrible human rights abuses against Chinese minority communities, censoring anyone that speaks out about their oppression. The Chinese Communist Party surveils their citizens and uses state-run media to push their propaganda, control the information their citizens consume, and hide their human rights abuses. Twitter and Facebook are banned in communist China. So you can understand why it’s concerning we’re even discussing this issue, that big technology companies are interfering with free speech.
Senator Scott: (03:23:52)
The American people entrust your companies with their information. They believe that you will protect their information and allow them to use your platforms to express themselves freely. I don’t think any one person has signed up for any of your platforms and expects to be blocked or kicked off because of their political views. But it’s becoming obvious that your companies are unfairly targeting conservatives. That’s clearly the perception today. Facebook is actively targeting ads by conservative groups ahead of the elections, either removing the ads completely or adding their own disclosure if they claim the ads didn’t pass their fact-check system. But their fact check is based on points from known liberal media groups, like [inaudible 00:29:36], which clearly is a liberal media group. Twitter censored Senator Mitch McConnell and put warnings on several of the president’s tweets. And until recently, they completely blocked the American people from sharing the New York Post’s story about Hunter Biden’s laptop and suspended the New York Post account.
Senator Scott: (03:24:55)
The New York Post is one of the most circulated publications in the United States; this isn’t some fringe media outlet filled with conspiracy theories. Yet you allow murderous dictators around the world to freely use your platform. Let me give you a few examples. On Twitter, Iran’s supreme leader, the Ayatollah, tweeted calling for the elimination of the Zionist regime. He said on May 21, 2020, “The elimination of the Zionist regime does not mean the massacre of the Jewish people. The people of Palestine should hold a referendum. Any political system that they vote for should govern in all of Palestine. The only remedy, until the removal of the Zionist regime, is firm, armed resistance.” I’d like to know first why Twitter let that stay up, and why the Ayatollah has not been blocked. In May 2019, Maduro, a murderous dictator, tweeted a photo of him backed by his military for a march, after three people were killed and 130 injured during protests in his country. The tweet described the march as a clear demonstration of the moral strength and integrity of our glorious armed forces, which is always prepared to defend peace and sovereignty.
Senator Scott: (03:26:06)
I would say this glorifies violence, which Twitter has flagged President Trump for, but Twitter let that stand. General Secretary Xi’s communist regime has pushed back on the fact that it’s committed a genocide against the Uyghurs, forcing millions into internment camps because of their religion. On September 1, a Chinese government account posted on Twitter, “Xinjiang camps? More fake news. What the Chinese government has done in Xinjiang has created the possibility for the locals to lead better lives. But truth that simply goes against the anti-China narrative will not be reported by some biased media.” A clear lie. It has been widely reported that this claim by the Chinese government is false, but Twitter took no action.
Senator Scott: (03:26:51)
Your companies are inconsistently applying their own rules with an obvious bias. Your companies are censoring free speech. You target the president, the White House Press Secretary, Senator Mitch McConnell, and the Susan B. Anthony List, a pro-life group, while giving dictators a free, unfettered platform. It is our responsibility to hold your companies accountable and protect Americans’ ability to speak freely on your platforms, regardless of their political views or the information they choose to share. You can’t just pick and choose which viewpoints are allowed on your platform and expect to keep the immunity granted by Section 230. So Mr. Dorsey, you allow dangerous dictators on your platform. Tell me why you flag conservatives in America, like President Trump or [inaudible 03:27:39], for potential misinformation, while allowing dictators to spew their propaganda on your platform.
Jack Dorsey: (03:27:47)
We have taken actions around leaders around the world and certainly with some dictators as well. We look at the tweets, we review them and we figure out if they violate our policy or not.
Senator Scott: (03:28:02)
Mr. Dorsey, can you tell me one you did against Iran, the Ayatollah? Can you tell me about one you’ve ever done against the Ayatollah or Maduro?
Jack Dorsey: (03:28:11)
I think we’ve done more than one actually, but we can send you that information on those actions. We want to make sure that we do have a global leader policy, because we believe it’s important that people can see what these leaders are saying. Those tweets remain up, but they are labeled as having violated our terms of service, just to show the integrity of our policy and our enforcement.
Senator Scott: (03:28:39)
With communist China, which we all know has put a million people in camps for years, you did nothing about the tweet when they say that they were just helping them lead a better life. I mean, anybody that follows the news knows what’s happening to the Uyghurs. I mean, it’s genocide, what they’re doing to the Uyghurs. I’ve never seen anything you’ve done on calling out a lie.
Jack Dorsey: (03:29:03)
We don’t have a general policy around misleading information and misinformation, we don’t. We rely upon people calling that speech out, calling those reports out and those ideas. And that’s part of the conversation; if there is something found to be in contest, then people reply to it. People retweet it and say, “This is wrong. This is obviously wrong.” You would be able to tweet that today and say, “This is absolutely wrong.” And we benefit from one of those voices calling that out.
Senator Scott: (03:29:37)
But you blocked Mitch McConnell’s and Trump’s tweets and you just say … Right? I mean, here’s what I don’t get: you guys have set up policies that you don’t enforce consistently, and then what’s the recourse for a user? I talked to a lady this week who just got her Facebook account eliminated.
Senator Scott: (03:30:03)
There’s no recourse. There’s nothing she can do about it. So every one of you have these policies that you don’t enforce consistently. So what should be the recourse?
Jack Dorsey: (03:30:09)
We [inaudible 03:30:11] consistently. And as I said in my opening remarks, we believe it’s critical that we have more transparency around our process, that we have clear and straightforward and efficient appeals. So the woman that you talked to could actually appeal the decision that we made. And we’re focused on algorithms and figuring out how to give people more choice around them.
Chairman Wicker: (03:30:32)
Thank you, Senator Scott. Senator Rosen?
Senator Rosen: (03:30:37)
Thank you, Mr. Chairman. I appreciate the witnesses for being here today, and I want to focus a little bit, thank you, Mr. Dorsey, on algorithms, because my colleagues in the majority [inaudible 03:30:51] that called this hearing in order to argue that you’re doing too much to stop the spread of disinformation, conspiracy theories and hate speech on your platforms. I’m here to tell you that you are not doing enough. Your platforms’ recommendation algorithms, well, they drive people who show an interest in conspiracy theories far deeper into hate, and only you have the ability to change this.
Senator Rosen: (03:31:15)
What I really want to say, on these platforms, and what I’d like to tell my colleagues, is that the important factor to realize is that people, or users, are the initiators of this content, and the algorithms, particularly the recommendation algorithms, are the potentiators of this content.
Senator Rosen: (03:31:36)
Now, I was doing a little cleaning in my garage, like a lot of people during COVID. I’m a former computer programmer. I actually found my old hexadecimal calculator and my little Radio Shack owner’s manual here. So I know a little bit about the power of algorithms and what they can and can’t do, having done that myself. And I know that you have the ability to remove bigoted, hateful and incendiary content that will lead, and has led, to violence.
Senator Rosen: (03:32:04)
So I want to be clear. It’s really not about what you can or cannot do, it’s really about what you will or will not do. So we have adversaries like Russia. They continue to amplify propaganda, everything from the election to coronavirus. We know what they’re doing: antisemitic conspiracy theories. They do it on your platforms, weaponizing division and hate to destroy our democracy and our communities. The US intelligence community warned us earlier this year that Russia is now actively inciting white supremacist violence, which the FBI and the Department of Homeland Security say poses the most lethal threat to America.
Senator Rosen: (03:32:46)
In recent years, we’ve seen white supremacy and antisemitism on the rise, much of it spreading online. And what enables these bad actors to disseminate their hateful messaging to the American public are the algorithms on your platforms, effectively rewarding efforts by foreign powers to exploit divisions in our country.
Senator Rosen: (03:33:07)
To be sure, I want to acknowledge the work you’re already doing in this space. I’m relieved to see that Facebook has really taken that long overdue action in banning Holocaust denial content. But while you’ve made some policy changes, what we have seen time and time again is what starts online doesn’t end online. Hateful words morph into deadly actions, which are then amplified again and again. It’s a vicious cycle.
Senator Rosen: (03:33:34)
Just yesterday, we commemorated the two-year anniversary of the Tree of Life shooting in Pittsburgh, the deadliest targeted attack on the Jewish community in American history. The shooter in this case had a long history of posting antisemitic content on social media sites. And what started online became very real for the families who will now never again see their loved ones.
Senator Rosen: (03:33:59)
So there has to be accountability when algorithms actively contribute to radicalization and hate. So Mr. Zuckerberg and Mr. Dorsey, when you implement a policy banning hate or disinformation content, how quickly can you adjust your algorithms to reduce this content, and, perhaps what I want to ask even more importantly, to reduce or remove the recommendation of hate and disinformation so it doesn’t continue to spread? We know those recommendation algorithms continue to drive someone more and more specifically. Great when you want to buy a new sweater, it’s going to be cold out here, it’s winter; not so great when you’re driving them towards hate. Can you talk to us about that, please? Mr. Dorsey, you can go first, please.
Jack Dorsey: (03:34:47)
As you know, these algorithms, machine learning and deep learning, are complex. They’re complicated and they require testing and training. So as we learn about their effectiveness, we can shift them and we can iterate on them. But it does require experience and it does require a little bit of time. So the most important thing that we need to build into the organization is a fast learning mindset and that agility around updating these algorithms. We do try to focus the urgency of our updates on the severity of harm, as you mentioned, specifically anything that leads to offline harm, or dangerous speech that leads to offline harm.
Senator Rosen: (03:35:42)
And, Mr. Zuckerberg, I’ll ask you to answer that, then I have some more questions about, I guess, the nimbleness of your algorithms. Go ahead.
Mark Zuckerberg: (03:35:50)
Senator, I think you’re focused on exactly the right thing in terms of how many people see the harmful content. And as we talk about putting in place regulation, or reforming Section 230 in terms of what we want to hold companies accountable for, I think that what we should be judging the companies on is how many people see harmful content before the companies act on it. And being able to act on it quickly, being able to act on content that is potentially going viral before it gets seen by a lot of people, is critical. This is what we report in our quarterly transparency reports: what percent of the content that a person sees is harmful, in any of the categories of harm that we track. And we try to hold ourselves accountable for basically driving the prevalence of that harmful content down. And I think good content regulation here would create a standard like that across the whole industry.
Senator Rosen: (03:36:56)
So I liked what you said, your recommendation algorithms need to learn to drive the prevalence of this harmful content down. So I have some other questions, I’m going to ask those, but I would like to see some of the information about how nimble you are on dropping down that prevalence when you do see it trending, when you do see an uptick, whether it’s by bots, by human beings, whatever that is. We need to drive that prevalence down.
Senator Rosen: (03:37:27)
And so, can you talk a little bit maybe more specifically about things you might be doing for antisemitism? We know that white supremacy is the biggest domestic threat; on the Homeland Security Committee, they have testified to this, that it is of course the largest threat to our nation. And I want to be sure that this violence is not celebrated and amplified on your platforms.
Chairman Wicker: (03:37:49)
We’ll have to have a brief answer to that. Senator, to whom are you addressing the question?
Senator Rosen: (03:38:01)
Mr. Zuckerberg [inaudible 03:38:01]-
Chairman Wicker: (03:38:02)
To Mr. Zuckerberg?
Senator Rosen: (03:38:03)
I think I’ve only asked one, but we only have just a few seconds. We can ask that.
Mark Zuckerberg: (03:38:09)
Sure, Senator, thank you. I mean, there’s a lot of nuance here, but in general, for each category of harmful content, whether it’s terrorist propaganda or incitement of violence and hate speech, we have to build specific systems and specific AI systems. And one of the benefits of I think having transparency and transparency reports into how these companies are doing is that we have to report on a quarterly basis how effectively we’re doing at finding those types of contents so you can hold us accountable for how nimble we are.
Mark Zuckerberg: (03:38:45)
Hate speech is one of the hardest things to train an AI system to get good at identifying because it is linguistically nuanced. We operate in 150 languages around the world. But what our transparency reports show is that over the last few years, we’ve gone from proactively identifying and taking down about 20% of the hate speech on the service, to now, we are proactively identifying, I think it’s about 94% of the hate speech that we ended up taking down, and the vast majority of that before people even have to report it to us.
Mark Zuckerberg: (03:39:19)
But by having this kind of a transparency requirement, which is part of what I’m advocating for in the Section 230 reform, I think we’ll be able to have a broader sense across the industry of how all of the companies are improving in each of these areas.
Chairman Wicker: (03:39:33)
Thank you for that answer, Mr. Zuckerberg.
Senator Rosen: (03:39:33)
Well, thank you, and look forward to working with everyone on this. Thank you, Mr. Chairman.
Chairman Wicker: (03:39:41)
As do I, Senator Rosen. Thank you very much. When this hearing convened, I estimated that it would last three hours and 42 minutes. It’s now been three hours and 41 minutes. Four of our members have been unable to join us and that’s the only reason that my prediction was the least bit accurate.
Chairman Wicker: (03:40:11)
So thank you all. Thank you very much. And I thank our witnesses. During my first series of questions to the panelists, I referred to a document that I had entered into the record during our committee meeting, I believe on October the 1st, entitled Social Media Companies Censoring Prominent Conservative Voices. That document has been updated, and without objection, it will be added to the record at this point.
Chairman Wicker: (03:40:52)
And I believe we are now at the point of closing the hearing. The hearing record will remain open for two weeks. During this time, senators are asked to submit any questions for the record. Upon receipt, the witnesses are requested to submit their written answers to the committee as soon as possible, but by no later than Wednesday, November 25th, 2020. I want to thank our witnesses for their cooperation and for bearing with us during a very lengthy hearing. And I want to thank each member of the committee for their cooperation in the conduct of this hearing. And with that, the hearing is concluded and the witnesses are thanked. This hearing is now adjourned.