Feb 1, 2024

Senate Hearing with CEOs of Meta, TikTok, X, Snap and Discord About Child Safety 1/31/24 Transcript

Big Tech CEOs including Meta’s Mark Zuckerberg, X’s Linda Yaccarino and TikTok’s Shou Zi Chew testify on the impact of social media on children during a Senate Judiciary Committee hearing.

Dick Durbin (28:02):

I want to preface my remarks by saying that I’ve been in Congress for a few years. Senator Graham has as well. If you do not believe this is an idea whose time has come, take a look at the turnout here.

(28:17)
Today, the Senate Judiciary Committee will continue its work on an issue on the mind of most American families. How to keep our kids safe from sexual exploitation and harm in the internet age. Online child sexual exploitation includes the use of online platforms to target and groom children, and the production and endless distribution of child sexual abuse material, CSAM, which can haunt victims for their entire lives and in some cases, take their lives. Everyone here will agree this conduct is abhorrent. I’d like to turn to a brief video to hear directly from the victims, the survivors, about the impact these crimes have had on them.

Speaker 5 (29:06):

I was sexually exploited on Facebook.

Speaker 6 (29:08):

I was sexually exploited on Instagram.

Speaker 5 (29:11):

I was sexually exploited on X.

Speaker 7 (29:14):

This is my daughter, Olivia.

Speaker 1 (29:16):

This is our son, Matthew.

Speaker 8 (29:18):

Look at how beautiful Maryam is.

Speaker 9 (29:20):

My son Riley died from suicide after being sexually exploited on Facebook.

Speaker 10 (29:25):

The child that gets exploited is never the same ever again.

Speaker 5 (29:30):

I reported this issue numerous times and it took over a decade before anyone helped me.

Speaker 11 (29:36):

You might be able to tell that I am using a green screen. Why is that? In the internet world, my past abusers can contact me. Fans of my abuse material as a child can find me and contact me.

Speaker 6 (29:51):

As a 17-year-old child, I had to write a victim impact statement after being extorted for four consecutive years. While I was strong enough to resist sending him any more pictures, there were dozens more who were not.

Speaker 12 (30:02):

We got a phone call to find out that my son was in his room and was suicidal. He was only 13 years old at the time. Him and a friend had been exploited online and trafficked and my son reached out to Twitter. Twitter, or now X, whose response was, “Thank you for reaching out. We reviewed the content and we did not find a violation of our policies so no action will be taken at this time.”

Speaker 13 (30:27):

How many more kids like Matthew?

Speaker 7 (30:29):

Like Olivia?

Speaker 9 (30:30):

Like Riley?

Speaker 4 (30:32):

How many more kids will suffer and die because of social media?

Speaker 10 (30:36):

Big tech failed to protect my child from sexual exploitation.

Speaker 6 (30:42):

Big tech failed to protect me from online sexual exploitation.

Speaker 12 (30:45):

And we need Congress to do something for our children and protect them.

Speaker 11 (30:49):

It’s not too late. It’s not too late to do something about it.

Dick Durbin (30:54):

Online child sexual exploitation is a crisis in America. In 2013, the National Center for Missing and Exploited Children, known as NCMEC, received approximately 1,380 cyber tips per day. By 2023, just 10 years later, the number of cyber tips had risen to 100,000 reports a day. That’s a hundred thousand daily reports of child sexual abuse material, also known as CSAM.

(31:27)
In recent years, we’ve also seen an explosion in so-called ‘financial sextortion,’ in which a predator uses a fake social media account to trick a minor into sending explicit photos or videos, then threatens to release them unless the victim sends money. In 2021, NCMEC received a total of 139 reports of sextortion. In 2023, through the end of October alone, this number skyrocketed to more than 22,000. More than a dozen children have died by suicide after becoming victims of this crime. This disturbing growth in child sexual exploitation is driven by one thing: changes in technology.

(32:14)
In 1996, the world’s bestselling cell phone was the Motorola StarTAC. While groundbreaking at the time, the clamshell-style cell phone wasn’t much different from a traditional phone. It allowed users to make and receive calls and even receive text messages, but that was about it. Fast-forward to today: smartphones are in the pockets of seemingly every man, woman, and teenager on the planet. Like the StarTAC, today’s smartphones allow users to make and receive calls and texts, but they can also take photos and videos, support live-streaming, and offer countless apps. With the touch of your finger, that smartphone that can entertain and inform you can become a back alley where the lives of

Dick Durbin (33:00):

Your children are damaged and destroyed. These apps have changed the ways we live, work, and play, but as investigations have detailed, social media and messaging apps have also given predators powerful new tools to sexually exploit children. Your carefully crafted algorithms can be a more powerful force in the lives of our children than even the best-intentioned parent. Discord has been used to groom, abduct and abuse children. Meta’s Instagram helped connect and promote a network of pedophiles. Snapchat’s disappearing messages have been co-opted by criminals who financially extort young victims. TikTok has become a platform of choice for predators to access, engage, and groom children for abuse, and the prevalence of CSAM on X has grown as the company has gutted its trust and safety workforce.

(33:57)
Today, we’ll hear from the CEOs of those companies. They’re not the only tech companies that have contributed to this crisis, but they’re responsible for many of the dangers our children face online. Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk. Coincidentally, several of these companies implemented common-sense child safety improvements within the last week, days before their CEOs would have to justify their lack of action before this committee. But the tech industry alone is not to blame for the situation we’re in. Those of us in Congress need to look in the mirror. In 1996, the same year the Motorola StarTAC was flying off shelves and years before social media went mainstream, we passed Section 230 of the Communications Decency Act. This law immunized the then-fledgling internet platforms from liability for user-generated content.

(35:08)
Interestingly, only one other industry in America has immunity from civil liability. We’ll leave that for another day. For the past 30 years, Section 230 has remained largely unchanged, allowing big tech to grow into the most profitable industry in the history of capitalism without fear of liability for unsafe practices. That has to change. Over the past year, this committee has unanimously reported five bills that would finally hold tech companies accountable for child sexual exploitation on their platforms. Unanimous. Take a look at the composition and membership of the Senate Judiciary Committee and imagine, if you will, that there’s anything we could agree on unanimously. These five bills were the subject of unanimous agreement. One of these bills is my STOP CSAM Act. Critically, it would let victims sue online providers that promote or aid and abet online child sexual exploitation or that host or store CSAM. This stand against online child sexual exploitation is bipartisan and absolutely necessary. Let this hearing be a call to action that we need to get kids online safety legislation to the President’s desk. I now turn to the ranking member, Senator Graham.

Senator Graham (36:27):

Thank you, Mr. Chairman. The Republicans will answer the call. All of us, every one of us, is ready to work with you and our Democratic colleagues on this committee to prove to the American people that while Washington is certainly broken, there’s a ray of hope, and it is here. It lies with your children. After years of working on this issue with you and others, I’ve come to conclude the following: social media companies, as they’re currently designed and operated, are dangerous products. They’re destroying lives, threatening democracy itself. These companies must be reined in, or the worst is yet to come.

(37:12)
Gavin Guffey is a representative, a Republican representative from South Carolina in the Rock Hill area. To all the victims who came and showed us photos of your loved ones, don’t quit. It’s working. You’re making a difference. Through you, we’ll get to where we need to go so other people won’t have to show a photo of their family. The damage to your family has been done. Hopefully, we can take your pain and turn it into something positive so nobody else has to hold up a sign. Gavin’s son got online with Instagram and was tricked by a group in Nigeria that put up a young lady posing as his girlfriend, and as things go at that stage in life, he gave her some photos, compromising sexual photos, and it turned out that she was part of an extortion group in Nigeria. They threatened the young man that if you don’t give us money, we’re going to expose these photos. He gave them money, but it wasn’t enough. They kept threatening and he killed himself. They threatened Mr. Guffey and his son. These are bastards by any known definition.

(38:37)
Mr. Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands. You have a product that’s killing people. When we had cigarettes killing people, we did something about it, maybe not enough. You’re going to talk about guns, we have the ATF. Nothing here. There’s not a thing anybody can do about it. You can’t be sued. Now, Senators Blumenthal and Blackburn, who’ve been like the dynamic duo here, have found emails from your company where they warned you about this stuff, and you decided not to hire 45 people that could do a better job of policing this. So the bottom line is you can’t be sued. You should be, and these emails would be great for punitive damages, but the courtroom is closed to every American abused by all the companies in front of me. Of all the people in America we could give blanket liability protection to, this would be the last group I would pick.

(39:45)
It is now time to repeal Section 230. This committee is made up of the ideologically most different people you could find. We’ve come together through your leadership, Mr. Chairman, to pass five bills to deal with the problem of exploitation of children. I’ll talk about them in depth in a little bit. The bottom line is all these bills have met the same fate. They go nowhere. They leave the committee and they die. Now, there’s another approach. What do you do with dangerous products? You either allow lawsuits, you have statutory protections to protect consumers, or you have a commission of sorts to regulate the industry in question. To take your license away if you have a license, to fine you, none of that exists here. We live in an America in 2024 where there is no regulatory body dealing with the most profitable, biggest companies in the history of the world.

(40:43)
They can’t be sued, and there’s not one law on the books that meaningfully protects the American consumer. Other than that, we’re in a good spot. So here’s what I think is going to happen. I think after this hearing today, we’re going to put a lot of pressure on our colleagues, the leadership of the Republican and Democratic Senate, to let these bills get to the floor for a vote, and I’m going to go down, starting in a couple of weeks, and make unanimous consent requests to do CSAM, do the EARN IT Act, do your bill, do all of the bills, and you can be famous, come and object. I’m going to give you a chance to be famous.

(41:23)
Now, Elizabeth Warren and Lindsey Graham have almost nothing in common. I promised her I would say that publicly. The only thing worse than me doing a bill with Elizabeth Warren is her doing a bill with me. We have sort of parked that because Elizabeth and I see an abuse here that needs to be dealt with. Senator Durbin and I have different political philosophies, but I appreciate what you’ve done on this committee. You have been a great partner. To all of my Democratic colleagues, thank you very, very much. To my Republican colleagues, thank you all very, very much. Save the applause for when we get a result. This is all talk right now, but there’ll come a day, if we keep pressing, when we get the right answer for the American people. What is that answer? Accountability.

(42:12)
Now, these products have an upside. You’ve enriched our lives in many ways. Mr. Zuckerberg, you created a product I use. The idea, I think, when you first came up with this, was to be able to talk to your friends and your family and pass on your life, to be able to have a place where you could talk to your friends and family about good things going on in life, and I use it. We all use it. There’s an upside to everything here, but the dark side hasn’t been dealt with. It’s now time to deal with the dark side because people have taken your idea and they’ve turned it into a nightmare for the American people. They’ve turned it into a nightmare for the world at large. TikTok, we had a great discussion about how maybe Larry Ellison, through Oracle, can protect American data from Chinese communist influence, but TikTok, your representative in Israel quit the company because TikTok is being used in a way to basically destroy the Jewish state.

(43:16)
This is not just about individuals. I worry that in 2024, our democracy will be attacked again through these platforms by foreign actors. We’re exposed and AI is just starting. So to my colleagues, we’re here for a reason. This committee has a history of being tough but also doing things that need to be done. This committee has risen to the occasion. There’s more that we can do, but to the members of this committee, let’s insist that our colleagues rise to the occasion also. Let’s make sure that in the 118th Congress, we have votes that would fix this problem. All you can do is cast your vote at the end of the day, but you can urge the system to require others to cast their vote. Mr. Chairman, I will continue to work with you and everybody on this committee to have a day of reckoning on the floor of the United States Senate. Thank you.

Dick Durbin (44:15):

Thank you, Senator Graham. Today we welcome five witnesses whom I’ll introduce: Jason Citron, the CEO of Discord Incorporated; Mark Zuckerberg, the founder and CEO of Meta; Evan Spiegel, the co-founder and CEO of Snap Incorporated; Shou Chew, the CEO of TikTok; and Linda Yaccarino, the CEO of X Corporation, formerly known as Twitter. I will note for the record that Mr. Zuckerberg and Mr. Chew are appearing voluntarily. I’m disappointed that our other witnesses did not offer that same degree of cooperation. Mr. Citron, Mr. Spiegel and Ms. Yaccarino are here pursuant to subpoenas, and Mr. Citron only accepted service of his subpoena after US Marshals were sent to Discord’s headquarters at taxpayers’ expense. I hope this is not a sign of your commitment or lack of commitment to addressing the serious issue before us. After I swear in the witnesses, each witness will have five minutes to make an opening statement.

(45:20)
Then senators will ask questions in an opening round of seven minutes each. I expect to take a short break at some point during questioning to allow the witnesses to stretch their legs. If anyone is in need of a break at any point, please let my staff know. Before I turn to the witnesses, I’d also like to take a moment to acknowledge that this hearing has gathered a lot of attention, as we expected. We have a large audience, the largest I’ve seen in this room today. I want to make clear, as with other Judiciary Committee hearings, we ask people to behave appropriately. I know there is high emotion in this room, for justifiable reasons, but I ask you to please follow the traditions of the committee. That means no standing, shouting, chanting, or applauding witnesses. Disruptions will not be tolerated. Anyone who does disrupt the hearing will be asked to leave. The witnesses are here today to address a serious topic. We want to hear what they have to say. I thank you for your cooperation.

(46:18)
Could all of the witnesses please stand to be sworn in? Do you affirm the testimony you’re about to give before the committee will be the truth, the whole truth, and nothing but the truth, so help you God? Let the record reflect that all the witnesses have answered in the affirmative. Mr. Citron, please proceed with your opening statement.

Jason Citron (46:43):

Good morning.

Dick Durbin (46:44):

Good morning.

Jason Citron (46:45):

My name is Jason Citron and I am the co-founder and CEO of Discord. We are an American company with about 800 employees living and working in 33 states. Today, Discord has grown to more than 150 million monthly active users. Discord is a communications platform where friends hang out and talk online about shared interests, from fantasy sports to writing music to video games. I’ve been playing video games since I was five years old and as a kid, it’s how I had fun and found friendship. Many of my fondest memories are of playing video games with friends. We built Discord so that anyone could build friendships playing video games, from Minecraft to Wordle and everything in between. Games have always brought us together and Discord makes that happen today. Discord is one of the many services that have revolutionized how we communicate with each other in the different moments of our lives, iMessage, Zoom, Gmail and on and on. They enrich our lives, create communities, accelerate commerce, healthcare, and education.

(48:04)
Just like with all technology and tools, there are people who exploit and abuse our platforms for immoral and illegal purposes. All of us here on the panel today and throughout the tech industry have a solemn and urgent responsibility to ensure that everyone who uses our platforms is protected from these criminals, both online and off. Discord has a special responsibility to do that because a lot of our users are young people. More than 60% of our active users are between the ages of 13 and 24. It’s why safety is built into everything we do. It’s essential to our mission and our business, and most of all, this is deeply personal. I’m a dad with two kids. I want Discord to be a product that they use and love, and I want them to be safe on Discord. I want them to be proud of me for helping to bring this product to the world. That’s why I’m pleased to be here today to discuss the important topic of the online safety of minors.

(49:13)
My written testimony provides a comprehensive overview of our safety programs. Here are a few examples of how we protect and empower young people. First, we’ve put our money into safety. The tech sector has a reputation of larger companies buying smaller ones to increase user numbers and boost financial results, but the largest acquisition we’ve ever made at Discord was a company called Sentropy. It didn’t help us expand our market share or improve our bottom line. In fact, because it uses AI to help us identify, ban, and report criminals and bad behavior, it has actually lowered our user count by getting rid of bad actors. Second, you’ve heard of end-to-end encryption that blocks anyone, including the platform itself, from seeing users’ communications. It’s a feature on dozens of platforms, but not on Discord. That’s a choice we’ve made. We don’t believe we can fulfill our safety obligations if the text messages of teens are fully encrypted, because encryption would block our ability to investigate a serious situation and, when appropriate, report to law enforcement. Third, we have a zero-tolerance policy on child sexual abuse material, or CSAM. We scan images uploaded to Discord to detect and block the sharing of this abhorrent material. We’ve also built an innovative tool, Teen Safety Assist, that blocks explicit images and helps young people easily report unwelcome conversations. We’ve also developed a new semantic hashing technology for detecting novel forms of CSAM, called Clip, and we’re sharing this technology with other platforms through the Tech Coalition. Finally, we recognize that improving online safety requires all of us to work together, so we partner with nonprofits, law enforcement, and our tech colleagues to stay ahead of the curve in protecting young people online. We want to be the platform that empowers our users to have better online experiences, to build true connections, genuine friendships, and to have fun. Senators, I sincerely hope today is the beginning of an ongoing dialogue that results in real improvements in online safety. I look forward to your questions and to helping the committee learn more about Discord.

Dick Durbin (51:43):

Thank you, Mr. Citron. Mr. Zuckerberg.

Mark Zuckerberg (51:49):

Chairman Durbin, Ranking Member Graham and members of the committee, every day, teens and young people do amazing things on our services. They use apps to create new things, express themselves, explore the world around them, and feel more connected to the people they care about. Overall, teens tell us that this is a positive part of their lives, but some face challenges online, so we work hard to provide parents and teens support and controls to reduce potential harms. Being a parent is one of the hardest jobs in the world. Technology gives us new ways to communicate with our kids and feel connected to their lives, but it can also make parenting more complicated, and it’s important to me that our services are positive for everyone who uses them. We are on the side of parents everywhere working hard to raise their kids. Over the last eight years, we’ve built more than 30 different tools, resources, and features so that parents can set time limits for their teens using our apps, see who they’re following, or if they report someone for bullying.

(52:46)
For teens, we’ve added nudges that remind them when they’ve been using Instagram for a while or if it’s getting late and they should go to sleep, as well as ways to hide words or people without those people finding out. We put special restrictions on teen accounts on Instagram. By default, accounts for under 16s are set to private, have the most restrictive content settings and can’t be messaged by adults that they don’t follow or people they aren’t connected to. With so much of our lives spent on mobile devices and social media, it’s important to look into the effects on teen mental health and wellbeing. I take this very seriously. Mental health is a complex issue and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes. A recent National Academies of Sciences report evaluated over 300 studies and found that research, quote, “Did not support the conclusion that social media causes changes in adolescent mental health at the population level,” end quote. It also suggested that social media can provide significant positive benefits when young people use it to express themselves, explore and connect with others. Still, we’re going to continue to monitor the research and use it to inform our roadmap. Keeping young people safe online has been a challenge since the internet began, and as criminals evolve their tactics, we have to evolve our defenses too. We work closely with law enforcement to find bad actors and help bring them to justice, but the difficult reality is that no matter how much we invest or how effective our tools are, there’s always more to learn and more improvements to make, but we remain ready to work with members of this committee, industry and parents to make the internet safer for everyone.

(54:28)
I’m proud of the work that our teams do to improve online child safety on our services and across the entire internet. We have around 40,000 people overall working on safety and security, and we’ve invested more than $20 billion in this since 2016, including around $5 billion in the last year alone. We have many teams dedicated to child safety and teen wellbeing, and we lead the industry in a lot of the areas that we’re discussing today. We build technology to tackle the worst online risks and share it to help our whole industry get better, like Project Lantern, which helps companies share data about people who break child safety rules and we’re founding members of Take It Down, a platform which helps young people prevent their nude images from being spread online. We also go beyond legal requirements and use sophisticated technology to proactively discover abusive material. And as a result, we find and report more inappropriate content than anyone else in the industry.

(55:22)
As the National Center for Missing and Exploited Children put it this week, Meta goes, quote, “Above and beyond to make sure that there are no portions of their network where this type of activity occurs,” end quote. I hope we can have a substantive discussion today that drives improvements across the industry, including legislation that delivers what parents say they want: a clear system for age verification and control over what apps their kids are using. Three out of four parents want app store age verification, and four out of five want parental approval whenever teens download apps. We support this. Parents should have the final say on what apps are appropriate for their children and shouldn’t have to upload their ID every time. That’s what app stores are for.

(56:08)
We also support setting industry standards on age-appropriate content and limiting signals for advertising to teens to age and location and not behavior. At the end of the day, we want everyone who uses our services to have safe and positive experiences. Before I wrap up, I want to recognize the families who are here today who have lost a loved one or lived through some terrible things that no family should have to endure. These issues are important for every parent and every platform. I’m committed to continuing to work in these areas and I hope we can make progress today.

Dick Durbin (56:46):

Thank you. Mr. Spiegel.

Evan Spiegel (56:58):

Chairman Durbin, Ranking Member Graham and members of the committee, thank you for convening this hearing and for moving forward important legislation to protect children online. I’m Evan Spiegel, the co-founder and CEO of Snap. We created Snapchat, an online service that is used by more than 800 million people worldwide to communicate with their friends and family. I know that many of you have been working to protect children online since before Snapchat was created, and we are grateful for your long-term dedication to this cause and your willingness to work together to help keep our community safe. I want to acknowledge the survivors of online harms and the families who are here today who have suffered the loss of a loved one. Words cannot begin to express the profound sorrow I feel that a service we designed to bring people happiness and joy has been abused to cause harm. I want to be clear that we understand our responsibility to keep our community safe.

(57:52)
I also want to recognize the many families who have worked to raise awareness on these issues, push for change and collaborate with lawmakers on important legislation like the Cooper Davis Act, which can help save lives. I started building Snapchat with my co-founder, Bobby Murphy, when I was 20 years old. We designed Snapchat to solve some of the problems that we experienced online when we were teenagers. We didn’t have an alternative to social media. That meant pictures shared online were permanent, public and subject to popularity metrics. It didn’t feel very good. We built Snapchat differently because we wanted a new way to communicate with our friends that was fast, fun and private. A picture is worth a thousand words, so people communicate with images and videos on Snapchat. We don’t have public likes or comments when you share your story with friends. Snapchat is private by default, meaning that people need to opt in to add friends and choose who can contact them.

(58:44)
When we built Snapchat, we chose to have the images and videos sent through our service delete by default. Like prior generations who’ve enjoyed the privacy afforded by phone calls which aren’t recorded, our generation has benefited from the ability to share moments through Snapchat that may not be picture perfect, but instead, convey emotion without permanence. Even though Snapchat messages are deleted by default, we let everyone know that images and videos can be saved by the recipient. When we take action on illegal or potentially harmful content, we also retain the evidence for an extended period, which allows us to support law enforcement and hold criminals accountable.

(59:21)
To help prevent the spread of harmful content on Snapchat, we approve the content that is recommended on our service using a combination of automated processes and human review. We apply our content rules consistently and fairly across all accounts. We run samples of our enforcement actions through quality assurance to verify that we’re getting it right. We also proactively scan for known child sexual abuse material, drug-related content, and other types of harmful content, remove that content, deactivate and device block offending accounts, preserve the evidence for law enforcement and report certain content to the relevant authorities for further action. Last year, we made 690,000 reports to the National Center for Missing and Exploited Children, leading to more than 1000 arrests. We also removed 2.2 million pieces of drug-related content and blocked 705,000 associated accounts. Even with our strict privacy settings, content moderation efforts, proactive detection and law enforcement collaboration, bad things can still happen when people use online services. That’s why we believe that people under the age of 13 are not ready to communicate on Snapchat.

(01:00:25)
We strongly encourage parents to use the device-level parental controls on iPhone and Android. We use them in our own household and my wife approves every app that our 13-year-old downloads. For parents who want more visibility and control, we built Family Center in Snapchat, where you can view who your teen is talking to, review privacy settings and set content limits. We have worked for years with members of the committee on legislation like the Kids Online Safety Act and the Cooper Davis Act, which we are proud to support. I want to encourage broader industry support for legislation protecting children online. No legislation is perfect, but some rules of the road are better than none.

(01:01:03)
Much of the work that we do to protect people that use our service would not be possible without the support of our partners across the industry, government, nonprofit organizations, NGOs, and in particular, law enforcement and the first responders who have committed their lives to helping keep people safe. I’m profoundly grateful for the extraordinary efforts across our country and around the world to prevent criminals from using online services to perpetrate their crimes. I feel an overwhelming sense of gratitude for the opportunities that this country has afforded me and my family. I feel a deep obligation to give back and to make a positive difference, and I’m grateful to be here today as part of this vitally important democratic process. Members of the committee, I give you my commitment that we’ll be part of the solution for online safety. We’ll be honest about our shortcomings and we’ll work continuously to improve. Thank you and I look forward to answering your questions.

Dick Durbin (01:01:52):

Thank you, Mr. Spiegel. Mr. Chew.

Shou Chew (01:01:56):

Chair Durbin, Ranking Member Graham, and members of the committee, I appreciate the opportunity to appear before you today. My name is Shou Chew and I’m the CEO of TikTok, an online community of more than 1 billion people worldwide, including well over 170 million Americans who use our app every month to create, to share, and to discover. Now, although the average age on TikTok in the US is over 30, we recognize that special safeguards are required to protect minors, especially when it comes to combating all forms of CSAM. As a father of three young children myself, I know that the issues that we’re discussing today are horrific and the nightmare of every parent. I am proud of our efforts to address the threats to young people online, from a commitment to protecting them to our industry-leading policies, use of innovative technology and significant ongoing investments in trust and safety to achieve this goal.

(01:02:59)
TikTok is vigilant about enforcing its 13 and up age policy and offers an experience for teens that is much more restrictive than you and I would have as adults. We make careful product design choices to help make our app inhospitable to those seeking to harm teens. Let me give you a few examples of longstanding policies that are unique to TikTok. We didn’t do them last week. First, direct messaging is not available to any users under the age of 16. Second, accounts for people under 16 are automatically set to private, along with their content. Furthermore, the content cannot be downloaded and will not be recommended to people they do not know. Third, every teen under 18 has a screen time limit automatically set to 60 minutes. And fourth, only people 18 and above are allowed to use our livestream feature.

(01:04:02)
I’m proud to say that TikTok was among the first to empower parents to supervise their teens on our app with our family pairing tools. This includes setting screen time limits, filtering out content from the teen’s feeds, amongst others. We made these choices after consulting with doctors and safety experts who understand the unique stages of teenage development to ensure that we have the appropriate safeguards to prevent harm and minimize risk. Now, safety is one of the core priorities that defines TikTok under my leadership. We currently have more than 40,000 trust and safety professionals working to protect our community globally, and we expect to invest more than $2 billion in trust and safety efforts this year alone, with a significant part of that in our US operations. Our robust community guidelines strictly prohibit content or behavior that puts teenagers at risk of exploitation or other harm, and we vigorously enforce them.

(01:05:11)
Our technology moderates all content uploaded to our app to help quickly identify potential CSAM and other material that breaks our rules. It automatically removes the content or elevates it to our safety professionals for further review. We also moderate direct messages for CSAM and related material, and use third-party tools like PhotoDNA and Take It Down to combat CSAM and prevent content from being uploaded to our platform. We continually meet with parents, teachers, and teens. In fact, I sat down with a group just a few days ago. We use their insight to strengthen the protections on our platform, and we also work with leading groups like the Technology Coalition. The steps that we’re taking to protect teens

Shou Chew (01:06:00):

… are a critical part of our larger trust and safety work as we continue our voluntary and unprecedented efforts to build a safe and secure data environment for US users, ensuring that our platform remains free from outside manipulation and implementing safeguards on our content recommendation and moderation tools.

(01:06:22)
Keeping teens safe online requires a collaborative effort as well as collective action. We share the community’s concern and commitment to protect young people online and we welcome the opportunity to work with you on legislation to achieve this goal. Our commitment is ongoing and unwavering because there is no finish line when it comes to protecting teens.

(01:06:43)
Thank you for your time and consideration today. I’m happy to answer your questions.

Dick Durbin (01:06:48):

Thanks, Mr. Chew. Ms. Yaccarino.

Linda Yaccarino (01:06:53):

Chairman Durbin, Ranking Member Graham, and esteemed members of the committee, thank you for the opportunity to discuss X’s work in protecting-

Dick Durbin (01:07:05):

Ms. Yaccarino. Could you check if your microphone is on?

Linda Yaccarino (01:07:08):

My talk button is on.

Dick Durbin (01:07:10):

And you might-

Linda Yaccarino (01:07:10):

How is that?

Dick Durbin (01:07:10):

Better. Thank you very much.

Linda Yaccarino (01:07:12):

Maybe I adjust my chair. Apologies. Start over.

(01:07:18)
Chairman Durbin, Ranking Member Graham, and esteemed members of the committee, thank you for the opportunity to discuss X’s work to protect the safety of minors online.

(01:07:34)
Today’s hearing is titled A Crisis which Calls for Immediate Action. As a mother, this is personal and I share the sense of urgency. X is an entirely new company, an indispensable platform for the world and for democracy. You have my personal commitment that X will be active and a part of this solution.

(01:08:07)
While I joined X only in June of 2023, I bring a history of working together with governments, advocates, and NGOs to harness the power of media to protect people. Before I joined, I was struck by the leadership steps this new company was taking to protect children. X is not the platform of choice for children and teens. We do not have a line of business dedicated to children. Children under the age of 13 are not allowed to open an account. Less than 1% of the US users on X are between the ages of 13 and 17, and those users are automatically set to a private default setting and cannot accept a message from anyone they do not approve.

(01:09:12)
In the last 14 months, X has made material changes to protect minors. Our policy is clear: X has zero tolerance towards any material that features or promotes child sexual exploitation. My written testimony details X’s extensive policies on content or actions that are prohibited and include grooming, blackmail, and identifying alleged victims of CSE.

(01:09:49)
We’ve also strengthened our enforcement with more tools and technology to prevent those bad actors from distributing, searching for and engaging with CSE content. If CSE content is posted on X, we remove it. And now we also remove any account that engages with CSE content, whether it is real or computer-generated.

(01:10:19)
Last year, X suspended 12.4 million accounts for violating our CSE policies. This is up from 2.3 million accounts that were removed by Twitter in 2022. In 2023, 850,000 reports were sent to NCMEC, including our first ever auto-generated report. This is eight times more than was reported by Twitter in 2022. We’ve changed our priorities. We’ve restructured our trust and safety teams to remain strong and agile. We are building a trust and safety center of excellence in Austin, Texas to bring more agents in-house to accelerate our impact. We’re applying to the Technology Coalition’s Project Lantern to make further industry-wide progress and impact. We’ve also opened up our algorithms for increased transparency.

(01:11:36)
We want America to lead in this solution. X commends the Senate for passing the REPORT Act, and we support the SHIELD Act. It is time for a federal standard to criminalize the sharing of non-consensual intimate material. We need to raise the standards across the entire internet ecosystem, especially for those tech companies that are not here today and not stepping up. X supports the STOP CSAM Act. The Kids Online Safety Act should continue to progress, and we will continue to engage with it to ensure it protects freedom of speech.

(01:12:33)
There are two additional areas that require everyone’s attention. First, as the daughter of a police officer, I believe law enforcement must have the critical resources to bring these bad offenders to justice. Second, with artificial intelligence, offenders’ tactics will continue to grow more sophisticated and evolve. Industry collaboration is imperative here. X believes that freedom of speech and platform safety can and must coexist. We agree that now is the time to act with urgency.

(01:13:21)
Thank you. I look forward to answering your questions.

Dick Durbin (01:13:24):

Thank you very much, Ms. Yaccarino. Now we’ll go into rounds of questions, seven minutes each for the members as well. I would like to make note of your testimony, Ms. Yaccarino. I believe you are the first social media company to publicly endorse the STOP CSAM Act.

Linda Yaccarino (01:13:40):

It is our honor, Chairman.

Dick Durbin (01:13:42):

That is progress, my friends. Thank you for doing that. I’m still going to be asking some probing questions, but let me get down to the bottom line here. I’m going to focus on my legislation on CSAM.

(01:13:56)
What it says is civil liability if you intentionally or knowingly host or store child sexual abuse materials or make child sex abuse materials available. Secondly, intentionally or knowingly promote or aid and abet a violation of child sexual exploitation laws.

(01:14:21)
Is there anyone here who believes you should not be held civilly liable for that type of conduct? Mr. Citron.

Jason Citron (01:14:32):

Good morning, Chair. We very much believe that this content is disgusting and that there are many things about the STOP CSAM bill that I think are very encouraging and we very much support adding more resources for the cyber tip line and modernizing that along with giving more resources to NCMEC. And I’d be very open to having conversations with you and your team to talk through the details of the bill some more.

Dick Durbin (01:15:04):

I sure would like to do that, because if you intentionally or knowingly host or store CSAM, I think you ought to at least be civilly liable. I can’t imagine anyone who would disagree with that.

Jason Citron (01:15:15):

Yeah, it’s disgusting content.

Dick Durbin (01:15:17):

It certainly is. That’s why we need you supporting this legislation.

(01:15:21)
Mr. Spiegel, I want to tell you, I listened closely to your testimony here and it’s never been a secret that Snapchat is used to send sexually explicit images. In 2013, early in your company’s history, you admitted this in an interview. Do you remember that interview?

Evan Spiegel (01:15:45):

Senator, I don’t recall the specific interview.

Dick Durbin (01:15:50):

You said that when you were first trying to get people on the app, you would, quote, “Go up to the people and be like, ‘Hey, you should try this application. You can send disappearing photos.’ And they would say, ‘Oh, for sexting?'” Do you remember that interview?

Evan Spiegel (01:16:04):

Senator, when we first created the application, it was actually called Picaboo, and the idea was around disappearing images. The feedback we received from people using the app is that they were actually using it to communicate, so we changed the name of the application to Snapchat and we found that people were using it to talk visually.

Dick Durbin (01:16:20):

As early as 2017, law enforcement identified Snapchat as pedophiles’ go-to sexual exploitation tool. The case of a twelve-year-old girl, identified in court only as LW, shows the danger. Over two and a half years, a predator sexually groomed her, sending her sexually explicit images and videos over Snapchat. The man admitted that he only used Snapchat with LW and not any other platforms because he, quote, “knew the chats would go away.”

(01:16:50)
Did you and everyone else at Snap really fail to see that the platform was the perfect tool for sexual predators?

Evan Spiegel (01:16:59):

Senator, that behavior is disgusting and reprehensible. We provide in-app reporting tools so that people who are being harassed or who have been shared inappropriate sexual content can report it. In the case of harassment or sexual content, we typically respond to those reports within 15 minutes so that we can provide help.

Dick Durbin (01:17:14):

When LW, the victim, sued Snapchat, her case was dismissed under Section 230 of the Communications Decency Act. Do you have any doubt that had Snap faced the prospect of civil liability for facilitating sexual exploitation, the company would’ve implemented even better safeguards?

Evan Spiegel (01:17:34):

Senator, we already work extensively to proactively detect this type of behavior. We make it very difficult for predators to find teens on Snapchat. There are no public friends lists, no public profile photos. When we recommend friends for teens, we make sure that they have several mutual friends in common before making that recommendation. We believe those safeguards are important to preventing predators from misusing our platform.

Dick Durbin (01:17:59):

Mr. Citron, according to Discord’s website, it takes a, quote, “proactive and automated approach to safety only on servers with more than 200 members. Smaller servers rely on server owners and community moderators to define and enforce behavior.”

(01:18:16)
So how do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report themselves for things like grooming, trading CSAM, or sextortion?

Jason Citron (01:18:28):

Chair, our goal is to get all of that content off of our platform and ideally prevent it from showing up in the first place, or prevent people from engaging in these kinds of horrific activities. We deploy a wide array of techniques that work across every surface on Discord. I mentioned we recently launched something called Teen Safety Assist, which works everywhere and is on by default for teen users. It kind of acts like a buddy that lets them know if they’re in a situation or talking with someone that may be inappropriate, so they can report that to us and block that user.

Dick Durbin (01:19:05):

Mr. Citron, if that were working, we wouldn’t be here today.

Jason Citron (01:19:09):

Chair, this is an ongoing challenge for all of us. That is why we’re here today. But we do have 15% of our company focused on trust and safety, of which this is one of our top issues. That’s more people than we have working on marketing and promoting the company. So we take these issues very seriously, but we know it’s an ongoing challenge, and I look forward to working with you and collaborating with our tech peers and the nonprofits to improve our approach.

Dick Durbin (01:19:35):

I certainly hope so.

(01:19:37)
Mr. Chew, your organization’s business is one of the more popular ones among children. Can you explain to us what you are doing particularly and whether you’ve seen any evidence of CSAM in your business?

Shou Chew (01:19:54):

Yes, Senator. We have a strong commitment to invest in trust and safety, and as I said in my opening statement, I intend to invest more than $2 billion in trust and safety this year alone. We have 40,000 safety professionals working on this topic. We have built a specialized child safety team to help us identify specialized issues, horrific issues like the material you have mentioned. If we identify any on our platform, and we do proactive detection, we will remove it and we will report it to NCMEC and other authorities.

Dick Durbin (01:20:30):

Why is it that TikTok is allowing children to be exploited into performing commercialized sex acts?

Shou Chew (01:20:37):

Senator, I respectfully disagree with that characterization. Our live-streaming product is not for anyone below the age of 18. We have taken action to identify anyone who violates that and we remove them from using that service.

Dick Durbin (01:20:54):

At this point, I’m going to turn to my ranking member, Senator Graham.

Senator Graham (01:20:59):

Thank you. Mr. Citron, you said we need to start a discussion. To be honest with you, we’ve been having this discussion for a very long time. We need to get a result, not a discussion. Do you agree with that?

Jason Citron (01:21:16):

Ranking Member, I agree this is an issue that we’ve also been very focused on since we started our company in 2015, but this is the first time we’ve-

Senator Graham (01:21:23):

Are you familiar with the EARN IT Act by myself and Senator Blumenthal?

Jason Citron (01:21:28):

A little bit, yes.

Senator Graham (01:21:29):

Okay. Do you support that?

Jason Citron (01:21:33):

We-

Senator Graham (01:21:35):

Yes or no?

Jason Citron (01:21:36):

We are not prepared to support it today, but we believe that section-

Senator Graham (01:21:38):

Okay. Do you support the CSAM Act?

Jason Citron (01:21:41):

The STOP CSAM Act, we are not prepared to support today, but we-

Senator Graham (01:21:47):

Okay. Do you support the SHIELD Act?

Jason Citron (01:21:48):

We believe that the cyber tip line-

Senator Graham (01:21:50):

Do you support it, yes or no?

Jason Citron (01:21:53):

We believe that the cyber tip line and NCMEC-

Senator Graham (01:21:54):

I’ll take that to be no. The Project Safe Childhood Act, do you support it?

Jason Citron (01:22:00):

We believe that-

Senator Graham (01:22:02):

I’ll take that to be no. The REPORT Act, do you support it?

Jason Citron (01:22:06):

Ranking Member Graham, we very much look forward to having conversations with you and your team.

Senator Graham (01:22:09):

Thank you.

Jason Citron (01:22:10):

We want to be part of the solution-

Senator Graham (01:22:11):

I look forward to passing a bill that will solve the problem. Do you support removing Section 230 liability protections for social media companies?

Jason Citron (01:22:18):

I believe that Section 230 needs to be updated. It’s a very old law.

Senator Graham (01:22:23):

Do you support repealing it so people can sue if they believe they’re harmed?

Jason Citron (01:22:28):

I think that Section 230, as written, while it has many downsides, has enabled innovation on the internet, which I think has largely been-

Senator Graham (01:22:35):

Thank you very much. So here you are. If you’re waiting on these guys to solve the problem, we’re going to die waiting.

(01:22:43)
Mr. Zuckerberg, Mr… Trying to be respectful here. The representative from South Carolina, Mr. Guffey’s son got caught up in a sex extortion ring in Nigeria using Instagram. He was shaken down, paid money. That wasn’t enough and he killed himself using Instagram. What would you like to say to him?

Mark Zuckerberg (01:23:19):

That’s terrible. No one should have to go through something like that.

Senator Graham (01:23:24):

You think he should be allowed to sue you?

Mark Zuckerberg (01:23:31):

I think that they can sue us.

Senator Graham (01:23:33):

Well, I think he should and he can’t. So the bottom line here, folks, is that this committee is done with talking. We passed five bills unanimously in their different ways and look at who did this, Graham-Blumenthal, Durbin-Hawley, Klobuchar-Cornyn, Cornyn-Klobuchar, Blackburn and Ossoff. We’ve found common ground here that just is astonishing, and we’ve had hearing after hearing, Mr. Chairman, and the bottom line is I’ve come to conclude, gentlemen, that you’re not going to support any of this.

(01:24:14)
Linda, how do you say your last name?

Linda Yaccarino (01:24:19):

Yaccarino.

Senator Graham (01:24:21):

Do you support the EARN IT Act?

Linda Yaccarino (01:24:26):

We strongly support the collaboration to raise industry practices to-

Senator Graham (01:24:33):

No, no, no, no. Do you support the EARN IT Act?

Linda Yaccarino (01:24:34):

… prevent CSAM.

Senator Graham (01:24:36):

Do you support the… In English, do you support the EARN IT Act, yes or no? We don’t need double-speak here.

Linda Yaccarino (01:24:40):

We look forward to supporting and continue our conversations. As you can see-

Senator Graham (01:24:43):

Okay, so I’ll take that as no. The reason the EARN IT Act is important is that you can actually lose your liability protections when children are exploited and you didn’t use best business practices. See, the EARN IT Act means you have to earn liability protection. You’re given it no matter what you do. So to the members of this committee, it is now time to make sure that the people who are holding up the signs can sue on behalf of their loved ones. Nothing will change until the courtroom door is open to victims of social media. $2 billion, Mr. Chew, how much… What percentage is that of what you made last year?

Shou Chew (01:25:26):

Senator, it’s a significant and increasing investment. As a private company, we’re not sharing our financials.

Senator Graham (01:25:31):

You pay taxes. $2 billion is what percent of your revenue?

Shou Chew (01:25:36):

Senator, we’re not ready to share our financials in public.

Senator Graham (01:25:38):

Well, I just think $2 billion sounds a lot unless you make a hundred billion. So the point is, when you tell us you’re going to spend $2 billion, great, but how much do you make? It’s all about eyeballs. Well, our goal is to get eyeballs on you, and it’s not just about children. The damage being done, do you realize, Mr. Chew, that your TikTok representative in Israel resigned yesterday?

Shou Chew (01:26:06):

Yes, I’m aware.

Senator Graham (01:26:07):

Okay, and he said, “I resigned from TikTok. We’re living in a time in which our existence as Jews in Israel and Israel is under attack and in danger.” Multiple screenshots taken from TikTok’s internal employee chat platform, known as Lark, show how TikTok’s trust and safety officers celebrate the barbaric acts of Hamas and other Iranian-backed terror groups including the Houthis in Yemen.

Shou Chew (01:26:36):

Senator, I need to make it very clear that pro-Hamas content and hate speech is not allowed on our platform or within our company.

Senator Graham (01:26:42):

Why did he resign? Why did he resign? Why did he quit?

Shou Chew (01:26:45):

Senator, we also do not allow any hateful behavior at work-

Senator Graham (01:26:48):

Do you know why he quit? Do you know why he quit?

Shou Chew (01:26:49):

We do not allow this. We will investigate such crimes.

Senator Graham (01:26:51):

My question is he quit. I’m sure he had a good job. He gave up a good job because he thinks your platform is being used to help people who want to destroy the Jewish state. And I’m not saying you want that. Mr. Zuckerberg, I’m not saying you want, as an individual, any of the harms. I am saying that the products you have created, with all the upside, have a dark side.

(01:27:14)
Mr. Citron, I am tired of talking. I’m tired of having discussions. We all know the answer here and here’s the ultimate answer: Stand behind your product. Go to the American courtroom and defend your practices. Open up the courthouse door. Until you do that, nothing will change. Until these people can be sued for the damage they’re doing, it is all talk. I’m a Republican who believes in free enterprise, but I also believe that every American who’s been wronged has to have somebody to go to complain.

(01:27:46)
There’s no commission to go to that can punish you. There’s not one law in the book because you oppose everything we do, and you can’t be sued. That has to stop, folks. How do you expect the people in the audience to believe that we’re going to help their families if we don’t have some system or a combination of systems to hold these people accountable? Because for all the upside, the dark side is too great to live with. We do not need to live this way as Americans.

Dick Durbin (01:28:20):

Thank you, Senator Graham. Senator Klobuchar is next. She’s been quite a leader on the subject for quite a long time on the SHIELD Act and with Senator Cornyn on the revenge porn legislation. Senator Klobuchar.

Amy Klobuchar (01:28:34):

Thank you very much, Chairman Durbin, and thank you, Ranking Member Graham, for those words. I couldn’t agree more. For too long, we have been seeing the social media companies turn a blind eye as kids have joined these platforms in record numbers. They have used algorithms that push harmful content because that content got popular. They provided a venue, maybe not knowingly at first, for dealers to sell deadly drugs like fentanyl. The head of our own Drug Enforcement Administration has said they’ve basically been captured by the cartels in Mexico and in China.

(01:29:24)
So I strongly support, first of all, the STOP CSAM bill. I agree with Senator Graham that nothing is going to change unless we open up the courtroom doors. I think the time for all of this immunity is done, because I think money talks even stronger than we talk up here.

(01:29:43)
Two of the five bills, as noted, are my bills with Senator Cornyn. One has actually passed through the Senate but is awaiting action in the House. But the other one is the SHIELD Act, and I do appreciate those of you supportive of that bill. This is about revenge porn. The FBI director testified before this committee that there have been over 20 suicides of kids attributed to online revenge porn in just the last year.

(01:30:14)
But for those parents out there and those families, this is for them about their own child, but it’s also about making sure this doesn’t happen to other children. I know because I’ve talked to these parents, parents like Bridgette Norring from Hastings, Minnesota, who is out there today. Bridgette lost her teenage son after he took a fentanyl-laced pill that he purchased on the internet. Amy Neville is also here. Her son Alexander was only 14 when he died after taking a pill he didn’t know was actually fentanyl.

(01:30:58)
We’re starting a law enforcement campaign, One Pill Kills, in Minnesota, going to the schools with the sheriffs and law enforcement. But the way to stop it is, yes, at the border and at the points of entry, but we know that some 30% of the people that are getting the fentanyl are getting it off the platforms.

(01:31:17)
Meanwhile, social media platforms generated $11 billion in revenue in 2022 from advertising directed at children and teenagers, including nearly $2 billion in ad profits derived from users age 12 and under. When a Boeing plane lost a door in mid-flight several weeks ago, nobody questioned the decision to ground a fleet of over 700 planes. So why aren’t we taking the same type of decisive action on the danger of these platforms when we know these kids are dying?

(01:31:59)
We have bills that have passed through this committee, incredibly diverse when it comes to our political views, and they should go to the floor. We should do something finally about liability, and then we should turn to some of the other issues that a number of us have worked on when it comes to the charges for app stores and when it comes to some of the monopoly behavior and the self-preferencing, but I’m going to stick with this today.

(01:32:32)
Facts: One-third of fentanyl cases investigated over five months had direct ties to social media. That’s from the DEA. Facts: Between 2012 and 2022, CyberTipline reports of online child sexual exploitation increased from 415,000 to more than 32 million. And as I noted, at least 20 victims committed suicide in sextortion cases.

(01:33:01)
So I’m going to start with that, with you, Mr. Citron. My bill with Senator Cornyn, the SHIELD Act, includes a threat provision that would help provide protection and accountability for those that are threatened by these predators. Young kids get a picture, send it in, think they’ve got a new girlfriend or a new boyfriend; it ruins their life, or they think it’s going to be ruined, and they kill themselves. So could you tell me why you’re not supporting the SHIELD Act?

Jason Citron (01:33:31):

Senator, we think it’s very important that teens have a safe experience on our platforms. I think that the portion to strengthen law enforcement’s ability to investigate crimes against children and hold bad actors accountable is incredible.

Amy Klobuchar (01:33:46):

Are you holding open that you may support it?

Jason Citron (01:33:48):

We very much would like to have conversations with you. We’re open to discussing further. And we do welcome legislation and regulation. This is a very important issue for our country, and we’ve been prioritizing safety for teens-

Amy Klobuchar (01:34:01):

Thank you. I’m much more interested in if you support it, because there’s been so much talk at these hearings and popcorn throwing and the like, and I just want to get this stuff done. I’m so tired of this. It’s been, what, 28 years since the internet? We haven’t passed any of these bills because everyone’s double-talk, double-talk. It’s time to actually pass them. And the reason they haven’t passed is because of the power of your company. So let’s be really, really clear about that. So what you say matters. Your words matter.

(01:34:29)
Mr. Chew, I’m a co-sponsor of Chair Durbin’s STOP CSAM Act of 2023 along with Senator Hawley, who’s the lead Republican, I believe, which, among other things, empowers victims by making it easier for them to ask tech companies to remove the material and related imagery from their platforms. Why would you not support this bill?

Shou Chew (01:34:52):

Senator, we largely support it. I think the spirit of it is very aligned with what we want to do. There are questions about implementation that I think companies like us and some other groups have, and we look forward to asking those. And of course, if this legislation is law, we will comply.

Amy Klobuchar (01:35:08):

Mr. Spiegel, I know we talked ahead of time. I do appreciate your company’s support for the Cooper Davis Act, which will finally… It’s a bill with Senators Shaheen and Marshall, which will allow law enforcement to do more when it comes to fentanyl. I think you know what a problem this is. Devin Norring, a teenager from Hastings, I mentioned his mom here, suffered dental pain and migraines, so he bought what he thought was a Percocet over Snap, but instead he bought a counterfeit drug laced with a lethal dose of fentanyl. As his mom, who’s here with us today, said, “All of the hopes and dreams we as parents had for Devin were erased in the blink of an eye, and no mom should have to bury their kid.” Talk about why you support the Cooper Davis Act.

Evan Spiegel (01:35:58):

Senator, thank you. We strongly support the Cooper Davis Act, and we believe it will help DEA go after the cartels and get more dealers off the streets to save more lives.

Amy Klobuchar (01:36:06):

Okay. Are there others that support that bill? No? Okay.

(01:36:12)
Last, Mr. Zuckerberg, in 2021, The Wall Street Journal reported on internal Meta research documents asking, “Why do we care about tweens?” These were internal documents; I’m quoting the documents. And answering its own question by citing Meta internal emails: “They are a valuable but untapped audience.”

(01:36:35)
At a Commerce hearing, I’m also on that committee, I asked Meta’s Head of Global Safety why children ages 10 to 12 are so valuable to Meta. She responded, “We do not knowingly attempt to recruit people who aren’t old enough to use our apps.” Well, when the 42 state attorneys general, Democrat and Republican, brought their case, they said this statement was inaccurate.

(01:36:59)
A few examples. In 2021, she, Ms. Davis, received an email from Instagram’s research director saying that Instagram is investing in experiences targeting young users, roughly ages 10 to 12. In a February 2021 instant message, one of your employees wrote that Meta is working to recruit Gen Alpha before they reach teenage years. A 2018 email that circulated inside Meta says that you were briefed that children under 13 will be critical for increasing the rate of acquisition when users turn 13.

(01:37:39)
Square that with what I heard in that testimony at the Commerce hearing, that they weren’t being targeted. And I just ask again, as the other witnesses were asked, why your company does not support the STOP CSAM Act or the SHIELD Act.

Mark Zuckerberg (01:37:55):

Sure, Senator. I’m happy to talk to both of those. We had discussions internally about whether we should build a kids’ version of Instagram, like the kids versions of YouTube and other services.

Amy Klobuchar (01:38:09):

I remember that.

Mark Zuckerberg (01:38:09):

We haven’t actually moved forward with that and we currently have no plans to do so. So I can’t speak directly to the exact emails that you cited, but it sounds to me like there were deliberations around a project that people internally thought was important and we didn’t end up moving forward with.

Amy Klobuchar (01:38:27):

Okay. And the bills, what are you going to say about the two bills?

Mark Zuckerberg (01:38:30):

Sure. Overall, my position on the bills is I agree with the goal of all of them. There are most things within them that I agree with. There are specific things that I would probably do differently. We also have our own legislative proposal for what we think would be most effective in terms of helping the internet and the various companies give parents control over the experience. So I’m happy to go into the detail on any one of them, but ultimately I think that this is-

Amy Klobuchar (01:38:59):

Again, I think these parents will tell you that this stuff hasn’t worked to just give parents control. They don’t know what to do. It’s very, very hard, and that’s why we are coming up with other solutions that we think are much more helpful to law enforcement, but also this idea of finally getting something going on liability, because I just believe with all the resources you have that you actually would be able to do more than you’re doing, or these parents wouldn’t be sitting behind you right now in this Senate hearing room.

Mr. Durbin (01:39:28):

Thank you, Senator Klobuchar-

Mark Zuckerberg (01:39:29):

Senator, can I speak to that or do you want me to come back later?

Ms. Klobuchar (01:39:32):

Yeah, yeah.

Mr. Durbin (01:39:32):

Please, go ahead.

Mark Zuckerberg (01:39:35):

I don’t think that parents should have to upload an ID or prove that they’re the parent of a child in every single app that their children use. I think the right place to do this, and a place where it would be actually very easy for it to work, is within the app stores themselves. My understanding is that Apple and Google, or at least Apple, already requires parental consent when a child makes a payment within an app, so it should be pretty trivial to pass a law that requires them to make it so that parents have control anytime a child downloads an app, and offer consent for that. And the research that we’ve done shows that the vast majority of parents want that, and I think that that’s the type of legislation, in addition to some of the other ideas that you all have, that would make this a lot easier for parents.

Ms. Klobuchar (01:40:22):

Just to be clear, I remember one mom telling me, with all these things she could maybe do that she can’t figure out, it’s like a faucet overflowing in a sink, and she’s out there with a mop while her kids are getting addicted to more and more different apps and being exposed to material. We’ve got to make this simpler for parents so they can protect their kids, and I just don’t think this is going to be the way to do it. I think the answer is what Senator Graham has been talking about, which is opening up the halls of the courtroom, so that puts it on you guys to protect these parents and protect these kids, and then also to pass some of these laws that make it easier for law enforcement.

Mr. Durbin (01:40:58):

Thank you, Senator Klobuchar. We’re going to try to stick to the seven-minute rule. Didn’t work very well, but I’ll try to give additional time on the other side as well. Senator Cornyn.

Mr. Cornyn (01:41:11):

There’s no question that your platforms are very popular, but we know that while here in the United States we have an open society and free exchange of information, there are authoritarian governments, there are criminals, who will use your platforms for the sale of drugs, for sex, for extortion, and the like. And Mr. Chew, I think your company is unique among the ones represented here today because of its ownership by ByteDance, a Chinese company. And I know there have been some steps that you’ve taken to wall off the data collected here in the United States. But the fact of the matter is that under Chinese law and Chinese national intelligence laws, all information accumulated by companies in the People’s Republic of China is required to be shared with the Chinese intelligence services. ByteDance’s initial release of TikTok, I understand, was in 2016. These efforts that you made with Oracle under the so-called Project Texas to wall off the US data began in 2021, and the data was apparently, allegedly, fully walled off in March of ’23. What happened to all of the data that TikTok collected before that?

Shou Chew (01:42:47):

Senator, thank you.

Mr. Cornyn (01:42:49):

From American users.

Shou Chew (01:42:50):

I understand. TikTok is owned by ByteDance, which is majority-owned by global investors, and we have three Americans on the board out of five. You are right in pointing out that over the last three years we have spent billions of dollars building out Project Texas, which is a plan that is unprecedented in our industry. We walled off, firewalled off, protected US data from the rest of our staff. We also-

Mr. Cornyn (01:43:14):

I’m asking about all of the data that you collected prior to that event.

Shou Chew (01:43:18):

Yes, Senator. We have started a data deletion plan. I talked about this a year ago. We have finished the first phase of data deletion from our data centers outside of the Oracle cloud infrastructure. We’re beginning phase two, where we will not only delete from the data centers, we will hire a third party to verify that work, and then we will go into, for example, employees’ work laptops to delete that as well.

Mr. Cornyn (01:43:41):

Was all of the data collected by TikTok prior to Project Texas shared with the Chinese government pursuant to the national intelligence laws of that country?

Shou Chew (01:43:52):

Senator, we have not been asked for any data by the Chinese government and we have never provided it.

Mr. Cornyn (01:44:03):

Your company is unique again among the ones represented here today, because you’re currently undergoing review by the Committee on Foreign Investment in the United States. Is that correct?

Shou Chew (01:44:14):

Senator, yes. There are ongoing discussions and a lot of our Project Texas work is informed by the discussions with many agencies under the CFIUS umbrella.

Mr. Cornyn (01:44:25):

Well, CFIUS is designed specifically to review foreign investments in the United States for national security risks, correct?

Shou Chew (01:44:33):

Yes, I believe so.

Mr. Cornyn (01:44:34):

And your company is currently being reviewed by this interagency committee at the Treasury Department for potential national security risks?

Shou Chew (01:44:46):

Senator, this review is on an acquisition of Musical.ly, which is an acquisition that was done many years ago.

Mr. Cornyn (01:44:54):

I mean, is this a casual conversation or are you actually providing information to the Treasury Department about how your platform operates for evaluating a potential national security risk?

Shou Chew (01:45:09):

Senator, it’s been many years, across two administrations, and a lot of discussions around what our plans are and how our systems work. We have a lot of robust discussions about a lot of detail.

Mr. Cornyn (01:45:24):

63% of teens, I understand, use TikTok. Does that sound about right?

Shou Chew (01:45:31):

Senator, I cannot verify that. We know we are popular amongst many age groups. The average age in the US today for our user base is over 30, but we are aware we are popular.

Mr. Cornyn (01:45:42):

And you reside in Singapore with your family, correct?

Shou Chew (01:45:46):

Yes, I reside in Singapore and I work here in the United States as well.

Mr. Cornyn (01:45:50):

Do your children have access to TikTok in Singapore?

Shou Chew (01:45:54):

Senator, if they lived in the United States, I would give them access to our under-13 experience. My children are below the age of 13.

Mr. Cornyn (01:46:01):

My question is in Singapore, do they have access to TikTok or is that restricted by domestic law?

Shou Chew (01:46:09):

We do not have an under-13 experience in Singapore. We have that in the United States because we were deemed a mixed audience app, and we created an under-13 experience in response to that.

Mr. Cornyn (01:46:21):

A Wall Street Journal article published yesterday directly contradicts what your company has stated publicly. According to the Journal, employees under Project Texas say that US user data, including user emails, birthdates, and IP addresses, continues to be shared with ByteDance staff, again, owned by a Chinese company. Do you dispute that?

Shou Chew (01:46:48):

Yes, Senator. There are many things about that article that are inaccurate. What it gets right is that this is a voluntary project that we built. We spent billions of dollars. There are thousands of employees involved, and it’s very difficult, because it’s unprecedented.

Mr. Cornyn (01:47:06):

Why is it important that the data collected from US users be stored in the United States?

Shou Chew (01:47:15):

Senator, this was a project we built in response to some of the concerns that were raised by members of this committee and others.

Mr. Cornyn (01:47:21):

And that was because of concerns that the data that was stored in China could be accessed by the Chinese Communist Party according to the national intelligence laws, correct?

Shou Chew (01:47:34):

Senator, we are not the only company that does business that has Chinese employees, for example. We’re not even the only company in this room that hires Chinese nationals. But in order to address some of these concerns, we have moved the data into the Oracle cloud infrastructure. We built a 2,000-person team based here to oversee the management of that data. We walled it off from the rest of the organization, and then we opened it up to third parties like Oracle, and we will onboard others to give third-party validation. This is unprecedented access. I think we are unique in taking even more steps to protect user data in the United States.

Mr. Cornyn (01:48:09):

Well, you’ve disputed the Wall Street Journal story published yesterday. Are you going to conduct any sort of investigation to see whether there’s any truth to the allegations made in the article or are you just going to dismiss them outright?

Shou Chew (01:48:24):

We’re not going to dismiss them. So we have ongoing security inspections not only by our own personnel, but also by third parties to ensure that the system is rigorous and robust. No system that any one of us can build is perfect, but what we need to do is to make sure that we are always improving it and testing it against bad people who may try to bypass it. And if anyone breaks our policies within our organization, we will take disciplinary action against them.

Mr. Durbin (01:48:52):

Thanks, Senator Cornyn. Senator Coons.

Mr. Coons (01:48:55):

Thank you, Chairman Durbin. First I’d like to start by thanking all the families that are here today. All the parents who are here because of a child they have lost. All the families that are here, because you want us to see you and to know your concern. You have contacted each of us in our offices expressing your grief, your loss, your passion and your concern. And the audience that is watching can’t see this. They can see you, the witnesses from the companies, but this room is packed as far as the eye can see. And when this hearing began, many of you picked up and held pictures of your beloved and lost children. I benefit from and participate in social media as do many members of the committee and our nation and our world. There are now a majority of people on earth participating in and in many ways benefiting from one of the platforms you have launched or you lead or you represent.

(01:49:54)
And we have to recognize there are some real positives to social media. It has transformed modern life, but it has also had huge impacts on families, on children, on nations. And there’s a whole series of bills championed by members of this committee that tries to deal with the trafficking in illicit drugs, the trafficking in illicit child sexual material, the things that are facilitated on your platforms that may lead to self-harm or suicide. So we’ve heard from several of the leaders on this committee, the chair and ranking and very talented and experienced senators, that the frame through which we are looking at this is consumer protection. When there is some new technology, we put in place regulations to make sure that it is not overly harmful. As my friend Senator Klobuchar pointed out, one door flew off of one plane, no one was hurt, and yet the entire Boeing fleet of that type of plane was grounded and a federal fit-for-purpose agency did an immediate safety review. I’m going to point not to the other pieces of legislation that I think are urgent that we take up and pass, but to the core question of transparency. If you are a company manufacturing a product that is allegedly addictive and harmful, one of the first things we look to is safety information. We try to give our constituents, our consumers, warnings, labels that help them understand what are the consequences of this product and how to use it safely or not. As you’ve heard pointedly from some of my colleagues, if you sell an addictive, defective, harmful product in this country in violation of regulations and warnings, you get sued. And what is distinct about platforms as an industry is most of the families who are here are here because there were not sufficient warnings and they cannot effectively sue you. So let me dig in for a moment if I can, because each of your companies voluntarily discloses information about the content and the safety investments you make and the actions you take.

(01:52:09)
There was a question pressed, I think by Senator Graham earlier, about TikTok; I believe, Mr. Chew, you said you invest $2 billion in safety. My background memo said your global revenue is $85 billion. Mr. Zuckerberg, my background memo says you’re investing $5 billion in safety at Meta, and your annual revenue is on the order of $116 billion. You can hear some expressions from the parents in the audience. What matters is the relative numbers and the absolute numbers. You’re data folks; if there’s anybody in this world who understands data, it’s you guys. So I want to walk through whether or not these voluntary measures of disclosure, of content and harm, are sufficient, because I would argue we’re here because they’re not. Without better information, how can policymakers know whether the protections you’ve testified about, the new initiatives, the starting programs, the monitoring and the take-downs, are actually working? How can we understand meaningfully how big these problems are without measuring and reporting data?

(01:53:17)
Mr. Zuckerberg, your testimony referenced a National Academy of Sciences study that said, “At the population level there is no proof about harm for mental health.” Well, it may not be at the population level, but I’m looking at a room full of hundreds of parents who have lost children, and our challenge is to take the data and to make good decisions about protecting families and children from harm. So let me ask about what your companies do or don’t report, and I’m going to particularly focus on your content policies around self-harm and suicide. And I’m just going to ask a series of yes or no questions, and what I’m getting at is, do you disclose enough? Mr. Zuckerberg, for your policies prohibiting content about suicide or self-harm, do you report an estimate of the total amount of content, not a percentage of the overall, not a prevalence number, but the total amount of content on your platform that violates this policy, and do you report the total number of views that self-harm or suicide-promoting content that violates this policy gets on your platform?

Mark Zuckerberg (01:54:30):

Yes, Senator, we pioneered quarterly reporting on our community standards enforcement across all these different categories of harmful content. We focus on prevalence, which you mentioned, because what we’re focused on is what percent of the content that we take down our systems proactively identify.

Mr. Coons (01:54:48):

Mr. Zuckerberg, I’m going to interrupt you and you’re very talented. I have very little time left. I’m trying to get an answer to a question, not as a percentage of the total, because remember it’s a huge number, so the percentage is small. But do you report the actual amount of content and the amount of views self-harm content received?

Mark Zuckerberg (01:55:09):

No. I believe we focus on prevalence.

Mr. Coons (01:55:10):

Correct, you don’t. Ms. Yaccarino, yes or no, you report it or you don’t?

Ms. Linda Yaccarino (01:55:16):

Senator, as a reminder, we have less than 1% of our users that are between the ages of 13 and 17 and-

Mr. Coons (01:55:24):

Do you report the absolute number of how many images and how often-

Ms. Linda Yaccarino (01:55:27):

We report the posts and accounts that we’ve taken down. In 2023, we took down almost a million posts in regards to mental health and self-harm.

Mr. Coons (01:55:38):

Mr. Chew, do you disclose the number of appearances of these types of content and how many are viewed before they’re taken down?

Shou Chew (01:55:46):

Senator, we disclose the number we take down based on each category of violation and how many of those were taken down proactively before they were reported.

Mr. Coons (01:55:55):

Mr. Spiegel?

Mr. Evan Spiegel (01:55:57):

Yes, Senator, we do disclose.

Mr. Coons (01:55:59):

Mr. Citron?

Mr. Jason Citron (01:56:00):

Yes, we do.

Mr. Coons (01:56:01):

So I’ve got three more questions I’d love to walk through if I had unlimited time. I will submit them for the record. The larger point is that platforms need to hand over more information about how the algorithms work, what the content does, and what the consequences are. Not at the aggregate, not at the population level, but the actual numbers of cases, so we can understand the content.

(01:56:27)
In closing, Mr. Chairman, I have a bipartisan bill, the Platform Accountability and Transparency Act, co-sponsored by Senators Cornyn, Klobuchar, Blumenthal on this committee and Senator Cassidy and others. It’s in front of the Commerce Committee, not this committee, but it would set reasonable standards for disclosure and transparency to make sure that we’re doing our jobs based on data. Yes, there’s a lot of emotion in this field, understandably, but if we’re going to legislate responsibly about the management of the content on your platforms, we need to have better data. Is there any one of you willing to say now that you support this bill? Mr. Chairman, let the record reflect a yawning silence from the leaders of the social media platforms. Thank you.

Mr. Durbin (01:57:17):

Thanks, Senator Coons. We’re on the first of two roll calls and so please understand if some of the members leave and come back. It’s no disrespect. They’re doing their job. Senator Lee?

Mr. Lee (01:57:29):

Thank you, Mr. Chairman. Tragically, survivors of sexual abuse are often repeatedly victimized and revictimized over and over and over again by having non-consensual images of themselves on social media platforms. There’s an NCMEC study that pointed out there was one instance of CSAM that reappeared more than 490,000 times after it had been reported. After it had been reported. So we need tools in order to deal with this. We need, frankly, laws in order to mandate standards so that this doesn’t happen, so that we have a systematic way of getting rid of this stuff, because there is literally no plausible justification, no way of defending this.

(01:58:28)
One tool, one that I think would be particularly effective, is a bill that I’ll be introducing later today, and I invite all my committee members to join me. It’s called the Protect Act. The Protect Act would, in pertinent part, require websites to verify age and verify that they’ve received consent of any and all individuals appearing on their site in pornographic images. And it would also require platforms to have meaningful processes for an individual seeking to have images of him or herself removed in a timely manner. Ms. Yaccarino, based on your understanding of existing law, what might it take for a person to have those images removed, say, from X?

Ms. Linda Yaccarino (01:59:15):

Senator Lee, thank you. What you’re going to introduce, in terms of ecosystem-wide user consent, sounds exactly like part of the philosophy of why we’re supporting the SHIELD Act, and no one should have to endure non-consensual images being shared online.

Mr. Lee (01:59:38):

And without that, without laws in place, and it is fantastic anytime a company, as you’ve described with yours, wants to take those steps, it’s very helpful, it can take a lot longer than it should, and sometimes it does, to the point where somebody had images shared 490,000 times after it was reported to the authorities, and that’s deeply concerning. But yes, the Protect Act would work in tandem with, it’s a good complement to, the SHIELD Act.

(02:00:15)
Mr. Zuckerberg, let’s turn to you next. As you know, I feel strongly about privacy and believe that one of the best protections for an individual’s privacy online involves end-to-end encryption. We also know that a great deal of grooming and sharing of CSAM occurs on end-to-end encrypted systems. Does Meta allow juvenile accounts on its platforms to use encrypted messaging services within those apps?

Mark Zuckerberg (02:00:48):

Sorry, Senator. What do you mean juvenile?

Mr. Lee (02:00:50):

Underage, people under 18.

Mark Zuckerberg (02:00:52):

Under 18. We allow people under the age of 18 to use WhatsApp and we do allow that to be encrypted, yes.

Mr. Lee (02:00:59):

Do you have a bottom-level age at which they’re not allowed to use it? A child of any age?

Mark Zuckerberg (02:01:04):

Yeah, I don’t think we allow people under the age of 13.

Mr. Lee (02:01:09):

What about you, Mr. Citron. On Discord, do you allow kids to have accounts to access encrypted messaging?

Mr. Jason Citron (02:01:18):

Discord is not allowed to be used by children under the age of 13, and we do not use end-to-end encryption for text messages. We believe that it’s very important to be able to respond to well-formed law enforcement requests, and we’re also working on proactively building technology. We’re working with a nonprofit called Thorn to build a grooming classifier so that our teen safety assist feature can actually identify these conversations if they might be happening so we can intervene and give those teens tools to get out of that situation or potentially even report those conversations and those people to law enforcement.

Mr. Lee (02:01:51):

And then encryption, as much as it can prove useful elsewhere, can be harmful, especially if you are on a site where you know children are being groomed and exploited. If you allow children onto an end-to-end encryption enabled app, that can prove problematic.

(02:02:09)
Now, let’s go back to you for a moment, Mr. Zuckerberg. Instagram recently announced that it’s going to restrict all teenagers from access to eating disorder material, suicidal ideation themed material, self-harm content, and that’s fantastic. That’s great. What’s odd, what I’m trying to understand, is why it is that Instagram is restricting access to sexually explicit content, but only for teens ages 13 to 15. Why not restrict it for 16 and 17-year-olds as well?

Mark Zuckerberg (02:02:56):

Senator, my understanding is that we don’t allow sexually explicit content on the service for people of any age.

Mr. Lee (02:03:04):

How is that going?

Mark Zuckerberg (02:03:09):

Our prevalence metrics suggest that, I think, it’s 99% or so of the content that we remove that we’re able to identify automatically using AI systems. So I think that our efforts in this, while they’re not perfect, are industry leading. The other thing that you asked about was self-harm content, which is what we recently restricted. And we made that shift because, I think, the state of the science is shifting a bit. Previously, we believed that when people were thinking about self-harm, it was important for them to be able to express that and get support, and now more of the thinking in the field is that it’s just better to not show that content at all, which is why we recently moved to restrict that from showing up for those teens at all.

Mr. Lee (02:03:57):

Is there a way for parents to make a request on what their kid can see or not see on your sites?

Mark Zuckerberg (02:04:07):

There are a lot of parental controls. I don’t think that we currently have a control around topics, but we do allow parents to control the time that the children are on the site and also a lot of it is based on monitoring and understanding what the teen’s experience is, what they’re interacting with et cetera.

Mr. Lee (02:04:28):

Mr. Citron, Discord allows pornography on its site. Now, reportedly, 17% of minors who use Discord have had online sexual interactions on your platform, 17%. And 10% have had those interactions with someone that the minor believed to be an adult. Do you restrict minors from accessing Discord servers that host pornographic material on them?

Mr. Jason Citron (02:04:57):

Senator, yes, we do restrict minors from accessing content that is marked for adults. Discord also does not recommend content to people. Discord is a chat app. We do not have a feed or an algorithm that boosts content. So we allow adults to share content with other adults in adult labeled spaces and we do not allow teens to access that content.

Mr. Lee (02:05:16):

Okay, I see my time’s expired. Thank you.

Speaker 15 (02:05:22):

Welcome, everyone. We are here in this hearing because, as a collective, your platforms really suck at policing themselves. We hear about it here in Congress with fentanyl and other drug dealing facilitated across platforms. We see it and hear about it here in Congress with harassment and bullying that takes place across your platforms. We see it and hear about it here in Congress with respect to child pornography, sex exploitation, and blackmail, and we are sick of it.

(02:06:13)
It seems to me that there is a problem with accountability, because these conditions continue to persist. In my view, Section 230, which provides immunity from lawsuit, is a very significant part of that problem. If you look at where bullies have been brought to heel recently, whether it’s Dominion finally getting justice against Fox News after a long campaign to try to discredit the election equipment manufacturer. Or whether it’s the moms and dads of the Sandy Hook victims finally getting justice against Infowars and its campaign of trying to get people to believe that the massacre of their children was a fake put on by them. Or even now more recently with a writer getting a very significant judgment against Donald Trump after years of bullying and defamation.

(02:07:38)
An honest courtroom has proven to be the place where these things get sorted out. And I’ll just describe one case, if I may. It’s called Doe v. Twitter. The plaintiff in that case was blackmailed in 2017 for sexually explicit photos and videos of himself then aged 13 to 14. A compilation video of multiple CSAM videos surfaced on Twitter in 2019. A concerned citizen reported that video on December 25th, 2019, Christmas Day. Twitter took no action. The plaintiff, then a minor in high school in 2019, became aware of this video from his classmates in January of 2020. You’re a high school kid and suddenly there’s that. That’s a day that’s hard to recover from. Ultimately, he became suicidal. He and his parents contacted law enforcement and Twitter to have these videos removed on January 21st and again on January 22nd, 2020. And Twitter ultimately took down the video on January 30th, 2020 once federal law enforcement got involved. That’s a pretty foul set of facts.

(02:09:44)
When the family sued Twitter for all those months of refusing to take down the explicit video of this child, Twitter invoked Section 230, and the district court ruled that the claim was barred. There is nothing about that set of facts that tells me that Section 230 performed any public service in that regard. I would like to see very substantial adjustments to Section 230 so that the honest courtroom, which brought relief and justice to E. Jean Carroll after months of defamation, which brought silence, peace, and justice to the parents of the Sandy Hook children after months of defamation and bullying by Infowars and Alex Jones, and which brought significant justice and an end to the campaign of defamation by Fox News to a little company that was busy just making election machines.

(02:11:18)
My time is running out. I’ll turn to, I guess, Senator Cruz is next. But I would like to have each of your companies put in writing what exemptions from the protection of Section 230 you would be willing to accept, bearing in mind the fact situation in Doe v. Twitter, bearing in mind the enormous harm that was done to that young person and that family by the non-responsiveness of this enormous platform over months, and months, and months, and months.

(02:12:00)
Again, think of what it’s like to be a high school kid and have that stuff up in the public domain, and have the company that is holding it out there in the public domain react so disinterestedly. Okay, will you put that down in writing for me? 1, 2, 3, 4, 5 yeses, done. Senator Cruz?

Mr. Cruz (02:12:36):

Thank you, Mr. Chairman. Social media is a very powerful tool, but we’re here because every parent I know, and I think every parent in America, is terrified about the garbage that is directed at our kids. I have two teenagers at home, and the phones they have are portals to predators, to viciousness, to bullying, to self-harm, and each of your companies could do a lot more to prevent it. Mr. Zuckerberg, in June of 2023, the Wall Street Journal reported that Instagram’s recommendation systems were actively connecting pedophiles to accounts that were advertising the sale of child sexual abuse material. In many cases, those accounts appeared to be run by underage children themselves, often using code words and emojis to advertise illicit material. In other cases, the accounts included indicia that the victim was being sex trafficked.

(02:13:42)
Now, I know that Instagram has a team that works to prevent the abuse and exploitation of children online, but what was particularly concerning about the Wall Street Journal exposé was the degree to which Instagram’s own algorithm was promoting the discoverability of victims for pedophiles seeking child abuse material. In other words, this material wasn’t just living in the dark corners of Instagram; Instagram was helping pedophiles find it by promoting graphic hashtags, including #pedwhore and #preteensex, to potential buyers. Instagram also displayed the following warning screen to individuals who were searching for child abuse material: “These results may contain images of child sexual abuse.” And then you gave users two choices: get resources or see results anyway. Mr. Zuckerberg, what the hell were you thinking?

Mark Zuckerberg (02:15:03):

All right, Senator, the basic science behind that is that when people are searching for something that is problematic, it’s often helpful, rather than just blocking it, to direct them towards something that could be helpful for getting them to get help. We also-

Mr. Cruz (02:15:22):

I understand, get resources. In what sane universe is there a link for see results anyway?

Mark Zuckerberg (02:15:28):

Well, because we might be wrong. We try to trigger this warning or we try to, when we think that there’s any chance that the results might be harmful-

Mr. Cruz (02:15:38):

Okay, you might be wrong. Let me ask you, how many times was this warning screen displayed?

Mark Zuckerberg (02:15:43):

I don’t know, but if there’s-

Mr. Cruz (02:15:43):

You don’t know, why don’t you know?

Mark Zuckerberg (02:15:47):

I don’t know the answer to that off the top of my head, but-

Mr. Cruz (02:15:49):

You know what, Mr. Zuckerberg? It’s interesting you say you don’t know it off the top of your head, because I asked it in June of 2023 in an oversight letter and your company refused to answer. Will you commit right now to answering this question for this committee within five days?

Mark Zuckerberg (02:16:06):

We’ll follow up on that.

Mr. Cruz (02:16:06):

Is that a yes? Not a “we’ll follow up”; I know how lawyers write statements saying we’re not going to answer. Will you tell us how many times this warning screen was displayed, yes or no?

Mark Zuckerberg (02:16:17):

Senator, I’ll personally look into it. I’m not sure if we have-

Mr. Cruz (02:16:19):

Okay, so you’re refusing to answer that. Let me ask you this. How many times did an Instagram user who got this warning that you’re seeing images of child sexual abuse, how many times did that user click on see results anyway, I want to see that?

Mark Zuckerberg (02:16:34):

Senator, I’m not sure if we stored that, but I’ll personally look into this and we’ll follow up after.

Mr. Cruz (02:16:38):

And what follow-up did Instagram do when you have a potential pedophile clicking on, I’d like to see child porn? What did you do next when that happened?

Mark Zuckerberg (02:16:52):

Senator, I think that an important piece of context here is that any content that we think is child sexual abuse-

Mr. Cruz (02:16:57):

Mr. Zuckerberg, that’s called a question. What did you do next when someone clicked, “You may be getting child sexual abuse images.” And they clicked see results anyway, what was your next step? You said you might be wrong. Did anyone examine, was it in fact child sexual abuse material? Did anyone report that user? Did anyone go and try to protect that child? What did you do next?

Mark Zuckerberg (02:17:25):

Senator, we take down anything that we think is sexual abuse material on the service and we do report to-

Mr. Cruz (02:17:30):

Did anyone verify whether it was in fact child sexual abuse material?

Mark Zuckerberg (02:17:35):

Senator, I don’t know if we’re following up on every single search result, but-

Mr. Cruz (02:17:39):

Did you report the people who wanted it?

Mark Zuckerberg (02:17:40):

Senator, do you want me to answer your questions?

Mr. Cruz (02:17:42):

Yeah, I want you to answer the question I’m asking. Did you report?

Mark Zuckerberg (02:17:45):

Give me some time to speak then.

Mr. Cruz (02:17:45):

The people who clicked see results anyway?

Mark Zuckerberg (02:17:48):

That’s probably one of the factors that we use in reporting, and in general, we’ve reported more people and done more reports like this to NCMEC, the National Center for Missing & Exploited Children, than any other company in the industry. We proactively go out of our way across our services to do this and have made, I think, more than twenty-six million reports, which is more than the whole rest of the industry combined. So I think that illustrates that we take this seriously.

Mr. Cruz (02:18:10):

Mr. Zuckerberg, Mr. Zuckerberg, your company and every social media company needs to do much more to protect children. All right, Mr. Chew, in the next couple of minutes I have, I want to turn to you. Are you familiar with China’s 2017 National Intelligence Law, which states, “All organizations and citizens shall support, assist and cooperate with national intelligence efforts in accordance with the law and shall protect national intelligence work secrets they are aware of”?

Mr. Shou Chew (02:18:40):

Yes, I’m familiar with this.

Mr. Cruz (02:18:42):

TikTok is owned by ByteDance. Is ByteDance subject to the law?

Mr. Shou Chew (02:18:47):

For the Chinese businesses that ByteDance owns, yes, it will be subject to this, but TikTok is not available in mainland China. And Senator, as we talked about in your office, we built Project Texas to put this out of reach.

Mr. Cruz (02:18:58):

So ByteDance is subject to the law. Now, under this law, which says, “Shall protect national intelligence work secrets they’re aware of,” it compels people subject to the law to lie to protect those secrets, is that correct?

Mr. Shou Chew (02:19:17):

I cannot comment on that. What I said again is that we have moved it out of reach-

Mr. Cruz (02:19:21):

Because you have to protect those secrets?

Mr. Shou Chew (02:19:23):

No, Senator, TikTok is not available in mainland China. We have moved the data into an American cloud infrastructure.

Mr. Cruz (02:19:28):

TikTok is controlled by ByteDance, which is subject to this law. Now, you said earlier, and I wrote this down, “We have not been asked for any data by the Chinese government and we have never provided it.” I’m going to tell you, and I told you this when you and I met last week in my office, I do not believe you, and I’ll tell you, the American people don’t either. If you look at what is on TikTok in China, you are promoting to kids science and math videos, educational videos, and you limit the amount of time kids can be on TikTok. In the United States, you are promoting to kids self-harm videos and anti-Israel propaganda. Why is there such a dramatic difference?

Mr. Shou Chew (02:20:18):

Senator, that is just not accurate. There is a lot-

Mr. Cruz (02:20:20):

There’s not a difference between what kids see in China and what kids see here?

Mr. Shou Chew (02:20:24):

Senator, TikTok is not available in China, it’s a separate experience there. What I’m saying is-

Mr. Cruz (02:20:28):

But you have a company that is essentially the same except it promotes beneficial materials instead of harmful materials?

Mr. Shou Chew (02:20:35):

That is not true. We have a lot of science and math content here on TikTok. There’s so much of it, we created a STEM feed with 100 billion views.

Mr. Cruz (02:20:42):

Let me point to this, Mr. Chew. There was a report recently that compared hashtags on Instagram to hashtags on TikTok and what trended, and the differences were striking. So for something like #TaylorSwift or #Trump, researchers found roughly two Instagram posts for every one on TikTok; that’s not a dramatic difference. That difference jumps to eight to one for the hashtag Uyghur, and it jumps to 30 to one for the hashtag Tibet, and it jumps to 57 to one for the hashtag TiananmenSquare, and it jumps to 174 to one for the hashtag HongKongProtest. Why is it that on Instagram people can put up the hashtag HongKongProtest 174 times compared to TikTok? What censorship is TikTok doing at the request of the Chinese government?

Mr. Shou Chew (02:21:46):

None. Senator, that analysis is flawed.

Mr. Cruz (02:21:49):

Can you explain that differential?

Mr. Shou Chew (02:21:49):

The analysis is flawed, it’s been debunked by other external sources like the Cato Institute. Fundamentally, a few things happen here. Not all videos carry hashtags, that’s the first thing. The second thing is that you cannot selectively choose a few words within a certain time period-

Mr. Cruz (02:22:03):

Why the difference between TaylorSwift and Tiananmen Square? What happened in Tiananmen Square?

Mr. Shou Chew (02:22:07):

Senator, there was a massive protest during that time, but what I’m trying to say is our users can freely come and post this content.

Mr. Cruz (02:22:15):

Why would there be no difference on Taylor Swift or a minimal difference and a massive difference on Tiananmen Square or Hong Kong?

Mr. Durbin (02:22:20):

Senator, could you wrap up please?

Mr. Shou Chew (02:22:22):

Senator, our algorithm does not suppress any content simply based on this, it doesn’t.

Mr. Cruz (02:22:27):

So answer that question, why is there a difference?

Mr. Shou Chew (02:22:28):

Like I said, I think this analysis is flawed. You’re selectively choosing some words over some periods, we haven’t been around as long as other apps.

Mr. Cruz (02:22:34):

There is an obvious difference. 174 to one for Hong Kong compared to Taylor Swift is dramatic.

Mr. Durbin (02:22:42):

Senator Blumenauer.

Mr. Blumenthal (02:22:45):

Thanks, Mr. Chairman. Mr. Zuckerberg-

Mr. Durbin (02:22:47):

Blumenthal, I’m sorry.

Mr. Blumenthal (02:22:50):

Thank you.

Mr. Durbin (02:22:50):

I know both of them.

Mr. Blumenthal (02:22:52):

That was good enough. Mr. Zuckerberg, you know who Antigone Davis is, correct?

Mark Zuckerberg (02:22:59):

Yes.

Mr. Blumenthal (02:22:59):

She’s one of your top leaders. In September of 2021 she was Global Head of Safety, correct?

Mark Zuckerberg (02:23:09):

Yes.

Mr. Blumenthal (02:23:09):

And you know that she came before a subcommittee of the Commerce committee that I chaired at the time, Subcommittee on Consumer Protection, correct?

Mark Zuckerberg (02:23:17):

Yes.

Mr. Blumenthal (02:23:18):

And she was testifying on behalf of Facebook, right?

Mark Zuckerberg (02:23:21):

Meta, but yes.

Mr. Blumenthal (02:23:23):

It was then Facebook, but Meta now. And she told us, and I’m quoting, “Facebook is committed to building better products for young people and to doing everything we can to protect their privacy, safety, and well-being on our platforms.” And she also said kids’ safety is an area where, quote, “We are investing heavily.” We now know that statement was untrue. We know it from an internal email that we have received. It’s an email written by Nick Clegg. You know who he is, correct?

Mark Zuckerberg (02:24:02):

Yes.

Mr. Blumenthal (02:24:03):

He was Meta’s president of Global Affairs and he wrote a memo to you which you received, correct? It was written to you.

Mark Zuckerberg (02:24:18):

I can’t see the email but sure. I’ll assume that you got that correct.

Mr. Blumenthal (02:24:25):

And he summarized Facebook’s problems. He said, “We are not on track to succeed for our core well-being topics. Problematic use, bullying and harassment, connections, and SSI, meaning suicidal self-injury.” He said also in another memo, “We need to do more and we are being held back by a lack of investment.” This memo has the date of August 28th, just weeks before that testimony from Antigone Davis, correct?

Mark Zuckerberg (02:25:12):

Sorry, Senator, I’m not sure what the date of the testimony was.

Mr. Blumenthal (02:25:15):

Well, those are the dates on the emails. Nick Clegg was asking you, pleading with you, for resources to back up the narrative, to fulfill the commitments. In effect, Antigone Davis was making promises that Nick Clegg was trying to fulfill, and you rejected that request for 45 to 84 engineers to do well-being or safety. We know that you rejected it from another memo, from Nick Clegg’s assistant, Tim Colburn, who said, “Nick did email Mark,” referring to that earlier email, “to emphasize his support for the package but it lost out to the various other pressures and priorities.” We’ve done a calculation that those potentially 84 engineers would’ve cost Meta about $50 million in a quarter when it earned $9.2 billion, and yet it failed to make that commitment in real terms, and you rejected that request because of other pressures and priorities.

(02:26:55)
That is an example from your own internal document of failing to act and it is the reason why we can no longer trust Meta, and frankly any of the other social media, to in effect grade their own homework. The public, and particularly the parents in this room, know that we can no longer rely on social media to provide the kind of safeguards that children and parents deserve, and that is the reason why passing the Kids Online Safety Act is so critically important. Mr. Zuckerberg, do you believe that you have a constitutional right to lie to Congress?

Mark Zuckerberg (02:27:46):

Senator, no. But, I mean, you showed a bunch of things, and I’d like the opportunity to respond to you.

Mr. Blumenthal (02:27:47):

Well, let me clarify for you. In a lawsuit brought by hundreds of parents, some in this very room, alleging that you made false and misleading statements concerning the safety of your platform for children, you argued in not just one pleading but twice, in December and then in January, that you have a constitutional right to lie to Congress. Do you disavow that filing in court?

Mark Zuckerberg (02:28:23):

Senator, I don’t know what filing you’re talking about, but I testified honestly and truthfully, and I would like the opportunity to respond to the previous things that you showed as well.

Mr. Blumenthal (02:28:34):

Well, I have a few more questions and let me ask others who are here because I think it’s important to put you on record. Who will support the Kids Online Safety Act? Yes or no, Mr. Citron.

Mr. Jason Citron (02:28:50):

There are parts of the act that we think are great.

Mr. Blumenthal (02:28:53):

No, it’s a yes or no question. I’m going to be running out of time, so I’m assuming the answer is no if you can’t answer yes.

Mr. Jason Citron (02:29:00):

We very much think that the National Privacy Standard would be great.

Mr. Blumenthal (02:29:03):

That’s a no. Mr. Spiegel?

Mr. Evan Spiegel (02:29:06):

Senator, we strongly support the Kids Online Safety Act, and we’ve already implemented many of its core provisions.

Mr. Blumenthal (02:29:11):

Thank you. I welcome that support, along with Microsoft’s support. Mr. Chew?

Mr. Shou Chew (02:29:16):

Senator, with some changes, we can support it.

Mr. Blumenthal (02:29:19):

In its present form, do you support it, yes or no?

Mr. Shou Chew (02:29:22):

We are aware that some groups have raised some concerns. It’s important to understand how-

Mr. Blumenthal (02:29:26):

I’ll take that as a no. Ms. Yaccarino?

Ms. Linda Yaccarino (02:29:30):

Senator, we support KOSA, and we’ll continue to make sure that it accelerates and continues to offer community for teens that are seeking that voice.

Mr. Blumenthal (02:29:40):

Mr. Zuckerberg?

Mark Zuckerberg (02:29:42):

Senator, we support the age-appropriate content standards but would have some suggestions on how to implement them.

Mr. Blumenthal (02:29:48):

Yes or no Mr. Zuckerberg. Do you support the Kids Online Safety Act? It’s a measure that is public and I’m just asking whether you’ll support it or not.

Mark Zuckerberg (02:29:58):

These are nuanced things. I think that the basic spirit is right, I think the basic ideas in it are right, and there are some ideas that I would debate how to best implement them.

Mr. Blumenthal (02:30:05):

Unfortunately, I don’t think we can count on social media as a group or big tech to support this measure, and in the past we know it’s been opposed by armies of lawyers and lobbyists. We’re prepared for this fight, but I am very, very glad that we have parents here because tomorrow we’re going to have an advocacy day, and the folks who really count, the people in this room who support this measure are going to be going to their representatives and their senators, and their voices and faces are going to make a difference. Senator Schumer has committed that he will work with me to bring this bill to a vote and then we will have real protection for children and parents online. Thank you Mr. Chairman.

Mr. Durbin (02:31:05):

Thank you, Senator Blumenthal. We have a vote on. Senator Cotton, have you voted? And Senator Hawley, you haven’t voted yet. You’re next, and I don’t know how long the vote will be open, but I’ll turn it over to you.

Mr. Cotton (02:31:21):

Thank you, Mr. Chairman. Mr. Zuckerberg, let me start with you. Did I hear you say in your opening statement that there’s no link between mental health and social media use?

Mark Zuckerberg (02:31:32):

Senator, what I said is I think it’s important to look at the science. I know people widely talk about this as if that is something that’s already been proven, and I think that the bulk of the scientific evidence does not support that.

Mr. Cotton (02:31:43):

Well, really? Let me just remind you of some of the science from your own company. Instagram studied the effect of your platform on teenagers. Let me just read you some quotes from the Wall Street Journal’s report on this. “Company researchers found that Instagram is harmful for a sizable percentage of teenagers, most notably teenage girls.” Here’s a quote from your own study: “We make body image issues worse for one in three teen girls.” Here’s another quote, “Teens blame Instagram,” this is your study, “for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.” That’s your study.

Mark Zuckerberg (02:32:22):

Senator. We try to understand the feedback and how people feel about the services. We can improve those-

Mr. Cotton (02:32:29):

Wait a minute, your own study says that you make life worse for one in three teenage girls, you increase anxiety and depression, that’s what it says, and you’re here testifying to us in public that there’s no link. You’ve been doing this for years. For years you’ve been coming in public and testifying under oath that there’s absolutely no link, your product is wonderful, the science is nascent, full speed ahead, while knowing internally full well that your product is a disaster for teenagers. And yet you keep right on doing what you’re doing.

Mark Zuckerberg (02:33:00):

That’s not true, that’s not true.

Mr. Cotton (02:33:02):

Let me show you some other facts. [inaudible 02:33:03]. Wait a minute, wait a minute, that’s not a question, that’s not a question. Those are facts, Mr. Zuckerberg, that’s not a question.

Mark Zuckerberg (02:33:10):

Those aren’t facts.

Mr. Cotton (02:33:11):

Let me show you some more facts. Here’s some information from a whistleblower who came before the Senate and testified under oath in public. He worked for you, a senior executive. Here’s what he found when he studied your products. So for example, this is girls between the ages of 13 and 15 years old. 37% of them reported that they had been exposed to unwanted nudity on the platform in the last seven days. 24% said that they had experienced unwanted sexual advances, they’d been propositioned in the last seven days. 17% said they had encountered self-harm content pushed at them in the last seven days. Now, I know you’re familiar with these stats because he sent you an email where he laid it all out, we’ve got a copy of it right here. My question is, who did you fire for this? Who got fired because of that?

Mark Zuckerberg (02:34:07):

Senator, we study all of this because it’s important and we want to improve our services.

Mr. Cotton (02:34:10):

Well, you just told me a second ago that you studied it and there was no linkage. Who did you fire?

Mark Zuckerberg (02:34:15):

I said you mischaracterized it.

Mr. Cotton (02:34:16):

37% of teenage girls between 13 and 15 were exposed to unwanted nudity in a week on Instagram. You knew about it, who did you fire?

Mark Zuckerberg (02:34:26):

Senator, this is why we’re building all these tools.

Mr. Cotton (02:34:28):

Who did you fire?

Mark Zuckerberg (02:34:30):

Senator, I don’t think that that’s…

Mr. Cotton (02:34:31):

Who did you fire?

Mark Zuckerberg (02:34:33):

I’m not going to answer that.

Mr. Cotton (02:34:35):

Because you didn’t fire anybody, right? You didn’t take any significant action?

Mark Zuckerberg (02:34:39):

I don’t think it’s appropriate to talk about individual or HR decisions.

Mr. Cotton (02:34:43):

It’s not appropriate? Do you know who’s sitting behind you? You’ve got families from across the nation whose children are either severely harmed or gone, and you don’t think it’s appropriate to talk about steps that you took, the fact that you didn’t fire a single person? Let me ask you this: have you compensated any of the victims?

Mark Zuckerberg (02:35:01):

Sorry?

Mr. Cotton (02:35:02):

Have you compensated any of the victims? These girls, have you compensated them?

Mark Zuckerberg (02:35:06):

I don’t believe so.

Mr. Cotton (02:35:09):

Why not? Don’t you think they deserve some compensation for what your platform has done? Help with counseling services, help with dealing with the issues that your services caused?

Mark Zuckerberg (02:35:21):

Our job is to make sure that we build tools to help keep people safe across our platforms.

Mr. Cotton (02:35:25):

Are you going to compensate them?

Mark Zuckerberg (02:35:27):

Senator, our job and what we take seriously is making sure that we build industry-leading tools to find harmful content, take it off the services.

Mr. Cotton (02:35:35):

To make money, to make money.

Mark Zuckerberg (02:35:35):

And to build tools that empower parents.

Mr. Cotton (02:35:37):

So you didn’t take any action, you didn’t take any action, you didn’t fire anybody, you haven’t compensated a single victim. Let me ask you this. There are families of victims here today, have you apologized to the victims? Would you like to do so now? They’re here, you’re on national television, would you like now to apologize to the victims who have been harmed by your product? Show them the pictures. Would you like to apologize for what you’ve done to these good people?

Mark Zuckerberg (02:36:06):

I’m sorry for everything that you have all gone through, it’s terrible. No one should have to go through the things that your families have suffered, and this is why we invest so much and are going to continue doing industry-leading efforts to make sure that no one has to go through the types of things that your families have had to suffer.

Mr. Cotton (02:36:30):

Mr. Zuckerberg, why should your company not be sued for this? Why is it that you hide behind a liability shield, you can’t be held accountable? Shouldn’t you be held accountable personally? Will you take personal responsibility?

Mark Zuckerberg (02:36:45):

Senator, I think I’ve already answered this. These issues-

Mr. Cotton (02:36:48):

We’ll try this again, will you take personal responsibility?

Mark Zuckerberg (02:36:51):

Senator, I view my job and the job of our company as building the best tools that we can to keep our community safe.

Mr. Cotton (02:36:58):

Well, you’re failing at that.

Mark Zuckerberg (02:37:00):

Well, Senator, we’re doing an industry-leading effort, we build AI tools-

Mr. Cotton (02:37:03):

That’s nonsense. Your product is killing people. Will you personally commit to compensating the victims? You’re a billionaire, will you commit to compensating the victims? Will you set up a compensation fund with your money?

Mark Zuckerberg (02:37:12):

Senator, I think these are-

Mr. Cotton (02:37:12):

With your money.

Mark Zuckerberg (02:37:14):

Senator, these are complicated issues.

Mr. Cotton (02:37:15):

No, that’s not a complicated question though. Yes or no, will you set up a victims’ compensation fund with your money, the money you made on these families sitting behind you, yes or no?

Mark Zuckerberg (02:37:25):

Senator, my job is to make sure we build good tools.

Mr. Cotton (02:37:27):

Sounds like a no. Sounds like a no. Your job is to be responsible for what your company has done. You’ve made billions of dollars on the people sitting behind you here, you’ve done nothing to help them, you’ve done nothing to compensate them, you’ve done nothing to put it right. You could do so here today and you should, you should, Mr. Zuckerberg. Before my time expires, Mr. Chew, let me just ask you. Your platform, why should your platform not be banned in the United States of America? You are owned by a Chinese communist company or a company based in China. The editor-in-chief of your parent company is a communist party secretary. Your company has been surveilling Americans for years. According to leaked audio from more than 80 internal TikTok meetings, China-based employees of your company have repeatedly accessed non-public data of United States citizens. Your company has tracked journalists, improperly gaining access to their IP addresses and user data in an attempt to identify whether they’re writing negative stories about you. Your platform is basically an espionage arm for the Chinese Communist Party, why should you not be banned in the United States of America?

Mr. Shou Chew (02:38:45):

Senator, I disagree with your characterization. Much of what you have said, we have explained in a lot of detail. TikTok is used by 170 million Americans.

Mr. Cotton (02:38:53):

I know, and every single one of those Americans is in danger from the fact that you track their keystrokes, you track their app usage, you track their location data, and we know that all of that information can be accessed by Chinese employees who are subject to the dictates of the Chinese Communist Party. Why should you not be banned in this country?

Mr. Shou Chew (02:39:15):

Senator, that is not accurate. A lot of what you described, we don’t collect.

Mr. Cotton (02:39:18):

It is 100% accurate. Do you deny that repeatedly American’s data has been accessed by ByteDance employees in China?

Mr. Shou Chew (02:39:28):

We built a project that cost us billions of dollars to stop that, and we have made a lot of progress.

Mr. Cotton (02:39:33):

And it hasn’t been stopped. According to the Wall Street Journal report from just yesterday, “Even now, ByteDance workers, without going through official channels, have access to the private information of American citizens.” I’m quoting from the article, “Private information of American citizens, including their birthdate, their IP address, and more.” That’s now.

Mr. Shou Chew (02:39:52):

Senator, as we know, the media doesn’t always get it right.

Mr. Cotton (02:39:56):

But the Chinese Communist Party does?

Mr. Shou Chew (02:39:58):

I’m not saying that. What I’m saying is that we have spent billions of dollars to build this project. It’s rigorous, it’s robust, it’s unprecedented, and I’m proud of the work that the 2,000 employees are doing to protect the data of American citizens.

Mr. Cotton (02:40:09):

It’s not protected, that’s the problem. Mr. Chew, it’s not protected at all. It’s subject to Communist Chinese Party inspection and review. Your app, unlike anybody else sitting here, and heaven knows I’ve got problems with everybody here, but your app, unlike any of those, is subject to the control and inspection of a foreign hostile government that is actively trying to track the information and whereabouts of every American that they can get their hands on. Your app ought to be banned in the United States of America for the security of this country. Thank you, Mr. Chairman.

Mr. Durbin (02:40:40):

Senator Hirono.

Ms. Hirono (02:40:42):

Thank you, Mr. Chairman. As we’ve heard, children face all sorts of dangers when they use social media, from mental health harms, to sexual exploitation, even trafficking. Sex trafficking is a serious problem in my home state of Hawaii, especially for native Hawaiian victims. That social media platforms are being used to facilitate this trafficking as well as the creation and distribution of CSAM is deeply concerning, but it’s happening. For example, several years ago a military police officer stationed in Hawaii was sentenced to 15 years in prison for producing CSAM as part of his online exploitation of a minor female. He began communicating with this 12-year-old girl through Instagram, he then used Snapchat to send her sexually explicit photos and to solicit such photos from her. He later used these photos to blackmail her.

(02:41:44)
And just last month, the FBI arrested a neo-Nazi cult leader in Hawaii who lured victims to his Discord server. He used that server to share images of extremely disturbing child sexual abuse material interspersed with neo-Nazi imagery. Members of his child exploitation and hate group are also present on Instagram, Snapchat, X, and TikTok, all of which they use to recruit potential members and victims. In many cases, including the ones I just mentioned, your companies played a role in helping law enforcement investigate these offenders, but by the time of the investigation, so much damage had already been done. This hearing is about how to keep children safe online, and we’ve listened to all of your testimony about seemingly impressive safeguards for young users. You try to limit the time that they spend, you require parental consent, you have all of these tools, yet trafficking and exploitation of minors online and on your platforms continues to be rampant.

(02:42:55)
Nearly all of your companies make your money through advertising, specifically by selling the attention of your users. Your product is your users. As a Meta product designer wrote in an email, “Young ones are the best ones. You want to bring people to your service young and early.” In other words, hook them early. Research published last month by Harvard’s School of Public Health estimates that Snap makes an astounding 41% of its revenues by advertising to users under 18. With TikTok, it’s 35%. Seven of the 10 largest Discord servers attracting many paying users are for games used primarily by teens, by children. All this is to say that social media companies, yours and others, make money by attracting kids to your platforms, but ensuring safety doesn’t make money, it costs money.

(02:44:04)
If you are going to continue to attract kids to your platforms, you have an obligation to ensure they’re safe on your platforms because the current situation is untenable, that is why we’re having this hearing. But to ensure safety for our children, that costs money. Your companies cannot continue to profit off young users only to look the other way when those users, our children, are harmed online. We’ve had a lot of comments about Section 230 protections, and I think we are definitely heading in that direction, and some of the five bills that we have already passed out of this committee talk about limiting the liability protections for you. Last November, this is for Mr. Zuckerberg, last November, the privacy and

Ms. Hirono (02:45:00):

… technology subcommittee heard testimony from Arturo Béjar. In response to one of my questions about how to ensure that social media companies focus more on child safety, he said, and I am paraphrasing a little bit, Mr. Béjar said, “What will change their behavior is at the moment that Mark Zuckerberg declares earnings, and these earnings have to be declared to the SEC,” so he has to say, “Last quarter, we made $34 billion.”

(02:45:31)
And the next thing he has to say is, “How many teens experienced unwanted sexual advances on his platform?” Mr. Zuckerberg, will you commit to reporting measurable child safety data on your quarterly earnings reports and calls?

Mark Zuckerberg (02:45:53):

Senator, it’s a good question. We actually already have a quarterly report that we issue and do a call to answer questions for how we’re enforcing our community standards. That includes not just the child safety issues and metrics.

Ms. Hirono (02:46:06):

So is that a yes?

Mark Zuckerberg (02:46:08):

We have a separate call that we do this on, but we’ve led the industry-

Ms. Hirono (02:46:12):

You have to report your earnings to the SEC. Will you report to them this kind of data? And by numbers, by the way, because Senator [inaudible 02:46:22] and others have said percentages don’t really tell the full story.

(02:46:25)
Will you report to the SEC the number of teens? And sometimes, you don’t even know whether they’re teens or not, because they just claim to be adults. Will you report the number of underage children on your platforms who experience unwanted CSAM and other kinds of messaging that harm them? Will you commit to citing those numbers to the SEC when you make your quarterly report?

Mark Zuckerberg (02:46:58):

Well, Senator, I’m not sure it would make as much sense to include it in the SEC filing, but we file it publicly so that way everyone can see this, and I’d be happy to follow up and talk about what specific metrics.

(02:47:08)
I think on some of the specific things that you just mentioned around underage people on our services, we don’t allow people under the age of 13 on our service. So if we find anyone who’s under the age of 13, we remove them from our service. Now, I’m not saying that people don’t lie and that there aren’t-

Ms. Hirono (02:47:21):

Yes, apparently there are-

Mark Zuckerberg (02:47:22):

… anyone who’s under the age of 13 who’s using it, but we’re not going to be able to count how many people there are, because fundamentally, if we identify that someone is underage, we remove them from the service.

Ms. Hirono (02:47:31):

I think that’s really important that we get actual numbers, because these are real human beings. That’s why all these parents and others are here, because each time that a young person is exposed to this kind of unwanted material and they get hooked, it is a danger to that individual. So I’m hoping that you are saying that you do report this kind of information, if not to the SEC, then publicly. I think I’m hearing that, yes, you do, so-

Mark Zuckerberg (02:47:58):

Yeah. Senator, I think we report more publicly on our enforcement than any other company in the industry, and we’re very supportive of transparency measures, and I think-

Ms. Hirono (02:48:09):

I’m running out of time, Mr. Zuckerberg, so I will follow up with what exactly it is that you do report. Again, for you, when Meta automatically places young people’s accounts, and you testified to this, on the most restrictive privacy and content sensitivity settings, and yet teens are able to opt out of these safeguards, isn’t that right?

Speaker 16 (02:48:29):

Yeah.

Ms. Hirono (02:48:30):

It’s not mandatory that they remain on these settings. They can opt out?

Mark Zuckerberg (02:48:37):

Senator, yes. We default teens into a private account, so that they have a private and restricted experience. But some teens want to be creators and want to have content that they share more broadly, and I don’t think that that’s something that should just blanketly be banned.

Ms. Hirono (02:48:53):

Why not? I think it should be mandatory that they remain on the more restrictive settings.

Mark Zuckerberg (02:49:01):

Senator, I think that there’s-

Ms. Hirono (02:49:02):

We have to start somewhere.

Mark Zuckerberg (02:49:03):

I mean, a lot of teens create amazing things, and I think with the right supervision and parenting and controls, I don’t think that that’s the type of thing that you want to just not allow anyone to be able to do. I think you want to make it so that-

Ms. Hirono (02:49:18):

My time is up, but I have to say that there is an argument that you all make for every single thing that we are proposing. And I share the concern about the blanket limitation on liability that we provide all of you, and I think that that has to change. And that is on us, on Congress, to make that change. Thank you, Mr. Chairman.

Senator Dick Durbin (02:49:38):

Thank you, Senator Hirono. Senator Cotton.

Senator Tom Cotton (02:49:42):

Mr. Chew, let’s cut straight to the chase. Is TikTok under the influence of the Chinese Communist Party?

Mr. Shou Chew (02:49:50):

No, Senator. We are a private business.

Senator Tom Cotton (02:49:53):

Okay. So you concede that your parent, ByteDance, is subject to the 2017 national security law, which requires Chinese companies to turn over information to the Chinese government and conceal it from the rest of the world. You concede that, correct?

Mr. Shou Chew (02:50:06):

Senator, the Chinese business-

Senator Tom Cotton (02:50:08):

There’s no question. You conceded it earlier.

Mr. Shou Chew (02:50:11):

Any global business that does business in China has to follow their local laws.

Senator Tom Cotton (02:50:13):

Okay. Isn’t it the case that ByteDance also has an internal Chinese Communist Party committee?

Mr. Shou Chew (02:50:19):

Like I said, all businesses that operate in China have to follow the local law.

Senator Tom Cotton (02:50:22):

So your parent company is subject to the national security law that requires it to answer to the party. It has its own internal Chinese Communist Party committee. You answer to that parent company, but you expect us to believe that you’re not under the influence of the Chinese Communist Party?

Mr. Shou Chew (02:50:37):

I understand this concern, Senator, which is why we built Project Texas.

Senator Tom Cotton (02:50:41):

Okay. It was a yes or no. Okay. But you used to work for ByteDance, didn’t you? You were the CFO for ByteDance?

Mr. Shou Chew (02:50:45):

That is correct, Senator.

Senator Tom Cotton (02:50:46):

In April 2021, while you were the CFO, the Chinese Communist Party’s China Internet Investment Fund purchased a 1% stake in ByteDance’s main Chinese subsidiary, the ByteDance Technology Company. In return for that so-called 1% golden share, the party took one of three board seats at that subsidiary company. That’s correct, isn’t it?

Mr. Shou Chew (02:51:07):

It’s for the Chinese business.

Senator Tom Cotton (02:51:09):

Is that correct?

Mr. Shou Chew (02:51:10):

It’s for the Chinese business, yes.

Senator Tom Cotton (02:51:11):

That deal was finalized on April 30th, 2021. Isn’t it true that you were appointed the CEO of TikTok the very next day, on May 1, 2021?

Mr. Shou Chew (02:51:21):

Well, it’s a coincidence.

Senator Tom Cotton (02:51:22):

It’s a coincidence that you-

Mr. Shou Chew (02:51:24):

It is.

Senator Tom Cotton (02:51:24):

… were the CFO-

Mr. Shou Chew (02:51:24):

Senator, it is.

Senator Tom Cotton (02:51:25):

… that the Chinese Communist Party took its golden share and its board seat, and the very next day, you were appointed the CEO of TikTok? That’s a hell of a coincidence.

Mr. Shou Chew (02:51:33):

It really is, Senator.

Senator Tom Cotton (02:51:34):

Yeah, it is. Okay. And before ByteDance, you were at a Chinese company called Xiaomi. Is that correct?

Mr. Shou Chew (02:51:44):

Yes, I used to work around the world.

Senator Tom Cotton (02:51:46):

Where did you live when you worked at Xiaomi?

Mr. Shou Chew (02:51:48):

I lived in China, like many expats.

Senator Tom Cotton (02:51:50):

Where exactly?

Mr. Shou Chew (02:51:51):

In Beijing, in China.

Senator Tom Cotton (02:51:51):

How many years did you live in Beijing?

Mr. Shou Chew (02:51:54):

Senator, I worked there for about five years.

Senator Tom Cotton (02:51:56):

So you lived there for five years?

Mr. Shou Chew (02:51:57):

Yes.

Senator Tom Cotton (02:51:57):

Isn’t it the case that Xiaomi was sanctioned by the United States government in 2021 for being a communist Chinese military company?

Mr. Shou Chew (02:52:05):

I’m here to talk about TikTok. I think they then had a lawsuit and it was overturned. I can’t remember the details.

Senator Tom Cotton (02:52:10):

No, no. It was-

Mr. Shou Chew (02:52:10):

It’s another company.

Senator Tom Cotton (02:52:11):

… the Biden administration that reversed those sanctions, just like, by the way, they reversed the terrorist designation on the Houthis in Yemen. How’s that working out for them? But it was sanctioned as a Chinese communist military company.

(02:52:23)
So you said today, as you often say, that you live in Singapore. Of what nation are you a citizen?

Mr. Shou Chew (02:52:30):

Singapore, sir. Senator.

Senator Tom Cotton (02:52:31):

Are you a citizen of any other nation?

Mr. Shou Chew (02:52:33):

No, Senator.

Senator Tom Cotton (02:52:34):

Have you ever applied for Chinese citizenship?

Mr. Shou Chew (02:52:36):

Senator, I served my nation in Singapore. No, I did not.

Senator Tom Cotton (02:52:40):

Do you have a Singaporean passport?

Mr. Shou Chew (02:52:42):

Yes, and I served in the military for two and a half years in Singapore.

Senator Tom Cotton (02:52:45):

Do you have any other passports from any other nations?

Mr. Shou Chew (02:52:47):

No, Senator.

Senator Tom Cotton (02:52:48):

Your wife is an American citizen, your children are American citizens.

Mr. Shou Chew (02:52:51):

That’s correct.

Senator Tom Cotton (02:52:52):

Have you ever applied for American citizenship?

Mr. Shou Chew (02:52:54):

No, not yet.

Senator Tom Cotton (02:52:56):

Okay. Have you ever been a member of the Chinese Communist Party?

Mr. Shou Chew (02:53:01):

Senator, I’m Singaporean. No.

Senator Tom Cotton (02:53:03):

Have you ever been associated or affiliated with the Chinese Communist Party?

Mr. Shou Chew (02:53:07):

No, Senator. Again, I’m Singaporean.

Senator Tom Cotton (02:53:08):

Let me ask you some hopefully simple questions. You said earlier in response to a question that what happened at Tiananmen Square in June of 1989 was a massive protest. Did anything else happen in Tiananmen Square?

Mr. Shou Chew (02:53:20):

Yes, I think it’s well-documented. There was a massacre.

Senator Tom Cotton (02:53:22):

There was an indiscriminate slaughter of hundreds or thousands of Chinese citizens. Do you agree with the Trump administration and the Biden administration that the Chinese government is committing genocide against the Uyghur people?

Mr. Shou Chew (02:53:34):

Senator, I’ve said this before. I think it’s really important that anyone who cares about this topic or any topic can freely express themselves on TikTok.

Senator Tom Cotton (02:53:41):

Very simple. It’s a very simple question that unites both parties in our country and governments around the world. Is the Chinese government committing genocide against the Uyghur people?

Mr. Shou Chew (02:53:49):

Senator, anyone, including you can come on the TikTok-

Senator Tom Cotton (02:53:52):

Yes, sir.

Mr. Shou Chew (02:53:52):

… and talk-

Senator Tom Cotton (02:53:52):

Yes, sir. Yes, sir.

Mr. Shou Chew (02:53:53):

… about this topic-

Senator Tom Cotton (02:53:53):

And I’m asking you yes or no.

Mr. Shou Chew (02:53:54):

… or any topic that matters to you.

Senator Tom Cotton (02:53:55):

You are a worldly, cosmopolitan, well-educated man who’s expressed many opinions on many topics. Is the Chinese government committing genocide against the Uyghur people?

Mr. Shou Chew (02:54:03):

Actually, Senator, I talk mainly about my company, and I’m here to talk-

Senator Tom Cotton (02:54:07):

Yes or-

Mr. Shou Chew (02:54:07):

… about what TikTok does.

Senator Tom Cotton (02:54:08):

Yes or no? You’re here to give-

Mr. Shou Chew (02:54:09):

We allow-

Senator Tom Cotton (02:54:09):

You’re here to give testimony that’s truthful and honest and complete. Let me ask you this. Joe Biden last year said that Xi Jinping was a dictator. Do you agree with Joe Biden? Is Xi Jinping a dictator?

Mr. Shou Chew (02:54:19):

Senator, I’m not going to comment on any world leaders.

Senator Tom Cotton (02:54:22):

Why won’t you answer these very simple questions?

Mr. Shou Chew (02:54:24):

Senator, it’s not appropriate for me-

Senator Tom Cotton (02:54:25):

Are you scared?

Mr. Shou Chew (02:54:25):

… as a businessman to comment on world leaders.

Senator Tom Cotton (02:54:27):

Are you scared that you’ll lose your job if you say anything negative about the Chinese Communist Party?

Mr. Shou Chew (02:54:31):

I disagree with that. You will find-

Senator Tom Cotton (02:54:32):

Are you scared that-

Mr. Shou Chew (02:54:32):

… content that is critical-

Senator Tom Cotton (02:54:33):

… you will be arrested and-

Mr. Shou Chew (02:54:33):

… of China on our platform.

Senator Tom Cotton (02:54:34):

… disappear the next time you go? Are you scared that you’ll be arrested and disappear the next time you go to mainland China?

Mr. Shou Chew (02:54:39):

Senator, you will find content that’s critical of China and any other country freely on TikTok.

Senator Tom Cotton (02:54:44):

Okay, okay. Let’s turn to what TikTok, a tool of the Chinese Communist Party, is doing to America’s youth. Does the name Mason Edens ring a bell?

Mr. Shou Chew (02:54:55):

Senator, you may have to give me more specifics if you don’t mind.

Senator Tom Cotton (02:54:57):

Yeah. He was a 16-year-old Arkansan. After a breakup in 2022, he went on your platform and searched for things like inspirational quotes and positive affirmations. Instead, he was served up numerous videos glamorizing suicide until he killed himself by gun. What about the name Chase Nasca? Does that ring a bell?

Mr. Shou Chew (02:55:19):

Would you mind giving me more details, please?

Senator Tom Cotton (02:55:21):

He was a 16-year-old who saw more than 1,000 videos on your platform about violence and suicide until he took his own life by stepping in front of a train. Are you aware that his parents, Dean and Michelle, are suing TikTok and ByteDance for pushing their son to take his own life?

Mr. Shou Chew (02:55:37):

Yes, I’m aware of that.

Senator Tom Cotton (02:55:39):

Okay. Finally, Mr. Chew, has the Federal Trade Commission sued TikTok during the Biden administration?

Mr. Shou Chew (02:55:48):

Senator, I cannot talk about whether there’s any ongoing-

Senator Tom Cotton (02:55:51):

Are you currently being sued by the Federal Trade Commission?

Mr. Shou Chew (02:55:54):

Senator, I cannot talk about any potential lawsuits, whether they-

Senator Tom Cotton (02:55:57):

I didn’t say potential. Actual.

Mr. Shou Chew (02:55:57):

… happen or not.

Senator Tom Cotton (02:55:59):

Are you being sued by the Federal Trade Commission?

Mr. Shou Chew (02:56:00):

Senator, I think I’ve given you my answer. I cannot talk about-

Senator Tom Cotton (02:56:02):

The answer is no. Ms. Yaccarino’s company is being sued, I believe, Mr. Zuckerberg’s company is being sued, I believe. Yet TikTok, the agent of the Chinese Communist Party, is not being sued by the Biden administration. Are you familiar with the name Cristina Caffarra?

Mr. Shou Chew (02:56:22):

You may have to give me more details.

Senator Tom Cotton (02:56:23):

Cristina Caffarra was a paid advisor to ByteDance, your communist-influenced parent company. She was then hired by the Biden FTC to advise on how to sue Mr. Zuckerberg’s company.

Mr. Shou Chew (02:56:37):

Senator, ByteDance is a global company and not a Chinese company.

Senator Tom Cotton (02:56:41):

Public reports indicate-

Mr. Shou Chew (02:56:42):

It’s owned by global investors.

Senator Tom Cotton (02:56:42):

Public reports indicate that your lobbyist visited the White House more than 40 times in 2022. How many times did your company’s lobbyist visit the White House last year?

Mr. Shou Chew (02:56:52):

I don’t know that, Senator.

Senator Tom Cotton (02:56:54):

Are you aware that the Biden campaign and the Democratic National Committee is on your platform, and they have TikTok accounts?

Mr. Shou Chew (02:57:01):

Senator, we encourage people to come on to create-

Senator Tom Cotton (02:57:03):

Which by the way-

Mr. Shou Chew (02:57:03):

… [inaudible 02:57:04] content.

Senator Tom Cotton (02:57:04):

… they won’t let their staffers use their personal phone. [inaudible 02:57:07] give them separate phones that they only use TikTok on.

Mr. Shou Chew (02:57:09):

We encourage everyone to join, including yourself, Senator.

Senator Tom Cotton (02:57:11):

So all these companies are being sued by the FTC. You’re not. The FTC has a former paid advisor of your parent talking about how they can sue Mr. Zuckerberg’s company. Joe Biden’s reelection campaign and the Democratic National Committee is on your platform.

(02:57:25)
Let me ask you, have you or anyone else at TikTok communicated with or coordinated with the Biden administration, the Biden campaign, or the Democratic National Committee to influence the flow of information on your platform?

Mr. Shou Chew (02:57:39):

We work with anyone, any creators who want to use our platform. It’s all the same process that we have.

Senator Tom Cotton (02:57:45):

Okay. So what we have here, we have a company that’s a tool of the Chinese Communist Party that is poisoning the minds of America’s children, in some cases, driving them to suicide, and that at best, the Biden administration is taking a pass on, at worst, maybe in collaboration with. Thank you, Mr. Chew.

Senator Dick Durbin (02:58:02):

Thank you, Senator Cotton. So we’re going to take a break now. We’re on the second roll call. Members can take advantage as they wish. The break will last about 10 minutes. Please do your best to return.

Senator Alex Padilla (03:17:29):

Thank you, Mr. Chair. Colleagues, as we reconvene, I’m proud once again to share that I am one of the few senators with younger children.

(03:17:44)
And I lead with that because as we are having this conversation today, it’s not lost on me that between my children, who are all now in the teen and preteen category, and their friends, I see this issue very up

Mr. Padilla (03:18:00):

… up close and personal. And in that spirit, I want to take a second to just acknowledge and thank all the parents who are in the audience today, many of whom have shared their stories with our offices, and I credit them for finding strength through their suffering, through their struggle, and channeling that into the advocacy that is making a difference. I thank all of you. Now, I appreciate, again, personally the challenges that parents and caretakers, school personnel, and others face in helping our young people navigate this world of social media and technology in general.

(03:18:47)
Now, the services our children are growing up with provide them unrivaled access to information. This is beyond what previous generations have experienced, and that includes learning opportunities, socialization, and much, much more. But we also clearly have a lot of work to do to better protect our children from the predators and predatory behavior that these technologies have enabled.

(03:19:17)
And yes, Mr. Zuckerberg, that includes exacerbating the mental health crisis in America. Nearly all teens we know have access to smartphones and the internet and use the internet daily.

(03:19:35)
And while guardians do have primary responsibility for caring for our children, the old adage says it takes a village. And so society as a whole, including leaders in the tech industry, must prioritize the health and safety of our children.

(03:19:56)
Now, I’ll dive into my questions and be specific, platform by platform, witness by witness, on the topic of some of the parental tools you have each made reference to.

(03:20:06)
Mr. Citron, how many minors are on Discord and how many of them have caretakers that have adopted your Family Center tool? And if you don’t have the numbers, just say that quickly and provide that to our office.

Mr. Jason Citron (03:20:18):

We can follow up with you on that.

Mr. Padilla (03:20:22):

How have you ensured that young people and their guardians are aware of the tools that you offer?

Mr. Jason Citron (03:20:27):

We make it very clear to teens on our platform what tools are available and our Teen Safety Assist is enabled by default.

Mr. Padilla (03:20:36):

What specifically do you do? What may be clear to you is not clear to the general public, so what do you do in your opinion to make it very clear?

Mr. Jason Citron (03:20:41):

So our Teen Safety Assist, which is a feature that helps teens keep themselves safe in addition to blocking and blurring images that may be sent to them, that is on by default for teen accounts and it cannot be turned off.

(03:20:53)
We market to our teen users directly on our platform. We launched our Family Center, we created a promotional video, and we put it directly on our product. So when every teen opened the app, in fact, every user opened the app, they got an alert like, “Hey, Discord has this and they want you to use it.”

Mr. Padilla (03:21:10):

Thank you. Look forward to the data that we’re requesting-

Mr. Jason Citron (03:21:12):

[Inaudible 03:21:13]

Mr. Padilla (03:21:13):

Mr. Zuckerberg, across all of Meta services from Instagram, Facebook, Messenger, and Horizon, how many minors use your applications, and of those minors, how many have a caretaker that has adopted the parental supervision tools that you offer?

Mark Zuckerberg (03:21:29):

Sorry, I can follow up with the specific stats on that.

Mr. Padilla (03:21:31):

Okay. It would be very helpful not just for us to know, but for you to know as a leader of your company. And same question, how are you ensuring that young people and their guardians are aware of the tools that you offer?

Mark Zuckerberg (03:21:44):

We run pretty extensive ad campaigns both on our platforms and outside. We work with creators and organizations like Girl Scouts to make sure that there’s broad awareness of the tools.

Mr. Padilla (03:21:57):

Okay. Mr. Spiegel, how many minors use Snapchat and of those minors, how many have caretakers that are registered with your Family Center?

Mr. Spiegel (03:22:04):

Senator, I believe in the United States, there are approximately 20 million teenage users of Snapchat. I believe approximately 200,000 parents use Family Center and about 400,000 teens have linked their account to their parents using Family Center.

Mr. Padilla (03:22:18):

So 200,000, 400,000 sounds like a big number, but it’s a small percentage of the minors using Snapchat. What are you doing to ensure that young people and their guardians are aware of the tools you offer?

Mr. Spiegel (03:22:29):

Senator, we create a banner for Family Center on users’ profiles, so that accounts we believe may be of the age that they could be parents can easily see the entry point into Family Center.

Mr. Padilla (03:22:40):

Okay. Mr. Shou, how many minors are on TikTok and how many of them have a caregiver that uses your family tools?

Mr. Shou Chew (03:22:46):

Senator, I’d need to get back to you on the specific numbers, but we were one of the first platforms to give what we call Family Pairing to parents. You go to Settings, you turn on the QR code, your teenager’s QR code and yours. You scan it, and what it allows you to do is you can set screen time limits, you can filter out some keywords, you can turn on a more restricted mode, and we are always talking to parents. I met a group of parents and teenagers and high school teachers last week to talk about what more we can provide in the Family Pairing mode.

Mr. Padilla (03:23:15):

Ms. Yaccarino, how many minors use X and are you planning to implement safety measures or guidance for caretakers like your peer companies have?

Ms. Linda Yaccarino (03:23:24):

Thank you, Senator. Less than 1% of all US users are between the ages of 13 and 17.

Mr. Padilla (03:23:32):

Less than 1% of how many?

Ms. Linda Yaccarino (03:23:34):

Of 90 million US users.

Mr. Padilla (03:23:37):

Okay, so still hundreds of thousands? Continue.

Ms. Linda Yaccarino (03:23:38):

Yes. Yes, and every single one is very important. Being a 14-month-old company, we have reprioritized child protection and safety measures, and we have just begun to talk about and discuss how we can enhance those with parental controls.

Mr. Padilla (03:23:56):

Let me continue with a follow-up question for Mr. Citron. In addition to keeping parents informed about the nature of various internet services, there’s a lot more we obviously need to do for today’s purposes.

(03:24:09)
While many companies offer a broad range of quote, unquote “user empowerment” tools, it’s helpful to understand whether young people even find these tools helpful.

(03:24:18)
So I appreciate you sharing your Teen Safety Assist and the tools and how you’re advertising it, but have you conducted any assessment of how these features are impacting minors’ use of your platform?

Mr. Jason Citron (03:24:32):

Our intention is to give teens tools, capabilities that they can use to keep themselves safe and also so our teams can help keep teens safe. We launched Teen Safety Assist just last year, and I do not have a study off the top of my head, but we’d be happy to follow up with you on that.

Mr. Padilla (03:24:47):

Okay. My time is up. I’ll have follow-up questions for each of you either in the second round or through statements for the record on a similar assessment of the tools that you’ve proposed. Thank you, Mr. Chairman.

Mr. Chairman (03:24:58):

Thank you, Senator Padilla. Senator Kennedy.

Mr. Kennedy (03:25:03):

Thank you all for being here. Mr. Spiegel, I see you hiding down there. What does yada, yada, yada mean?

Mr. Spiegel (03:25:21):

I’m not familiar with the term Senator.

Mr. Kennedy (03:25:24):

Very uncool. Can we agree that what you do, not what you say, is what you believe, and everything else is just cottage cheese?

Mr. Spiegel (03:25:40):

Yes, Senator.

Mr. Kennedy (03:25:42):

You agree with that? Speak up. Don’t be shy. I’ve listened to you today. I’ve heard a lot of yada, yada, yading, and I’ve heard you talk about the reforms you’ve made and I appreciate them. And I’ve heard you talk about the reforms you’re going to make, but I don’t think you’re going to solve the problem. I think Congress is going to have to help you.

(03:26:15)
I think the reforms you’re talking about to some extent are going to be like putting paint on rotten wood, and I’m not sure you’re going to support this legislation. I’m not.

(03:26:29)
The fact is that you and some of your internet colleagues who are not here are no longer… You’re not companies, you’re countries. You’re very, very powerful and you and some of your colleagues who are not here have blocked everything we have tried to do in terms of reasonable regulation, everything from privacy to child exploitation.

(03:27:03)
And in fact, we have a new definition of recession. A recession is when… We know we’re in a recession when Google has to lay off 25 members of Congress. That’s what we’re down to. We’re also down to this fact that your platforms are hurting children.

(03:27:27)
I’m not saying they’re not doing some good things, but they’re hurting children. And I know how to count votes, and if this bill comes to the floor of the United States Senate, it will pass.

(03:27:43)
What we’re going to have to do, and I say this with all the respect that I can muster, is convince my good friend, Senator Schumer, to go to Amazon, buy a spine online, and bring this bill to the Senate floor, and the House will then pass it. Now, that’s one person’s opinion. I may be wrong, but I doubt it.

(03:28:09)
Mr. Zuckerberg, let me ask you a couple of questions. Might wax a little philosophical here. I have to hand it to you. You have convinced over 2 billion people to give up all of their personal information, every bit of it, in exchange for getting to see what their high school friends had for dinner Saturday night. That’s pretty much your business model, isn’t it?

Mark Zuckerberg (03:28:48):

It’s not how I would characterize it, and we give people the ability to connect with the people they care about and to engage with the topics that they care about.

Mr. Kennedy (03:28:58):

And you take this information, this abundance of personal information, and then you develop algorithms to punch people’s hot buttons and steer to them information that punches their hot buttons again and again and again to keep them coming back and to keep them staying longer. And as a result, your users see only one side of an issue, and so to some extent, your platform has become a killing field for the truth, hasn’t it?

Mark Zuckerberg (03:29:40):

I mean, Senator, I disagree with that, that characterization. We build ranking and recommendations because people have a lot of friends and a lot of interests and they want to make sure that they see the content that’s relevant to them.

(03:29:53)
We’re trying to make a product that’s useful to people and make our services as helpful as possible for people to connect with the people they care about and the interest they care about. That’s what we-

Mr. Kennedy (03:30:02):

But you don’t show them both sides. You don’t give them balanced information. You just keep punching their hot buttons, punching their hot buttons. You don’t show them balanced information so people can discern the truth for themselves, and you rev them up so much that so often your platform and others become just cesspools of snark where nobody learns anything, don’t they?

Mark Zuckerberg (03:30:29):

Well, Senator, I disagree with that. I think people can engage in the things that they’re interested in and learn quite a bit about those. We have done a handful of different experiments and things in the past around news and trying to show content on a diverse set of perspectives. I think that there’s more that needs to be explored there, but I don’t think that we can solve that by ourselves. One of the things that I saw-

Mr. Kennedy (03:30:54):

Do you think… I’m sorry to cut you off, Mr. President, but I’m going to run out of time. Do you think your users really understand what they’re giving to you, all their personal information and how you process it and how you monetize it? Do you think people really understand?

Mark Zuckerberg (03:31:14):

Senator, I think people understand the basic terms. I mean, I think that there’s-

Mr. Kennedy (03:31:20):

Let me put it-

Mark Zuckerberg (03:31:20):

I actually think that a lot of people overestimate how much information [inaudible 03:31:24]-

Mr. Kennedy (03:31:23):

… put it another way. It’s been a couple of years since we talked about this. Does your user agreement still suck?

Mark Zuckerberg (03:31:30):

I’m not sure how to answer that, Senator. I think basic-

Mr. Kennedy (03:31:33):

Can you still hide a dead body in all that legalese where nobody can find it?

Mark Zuckerberg (03:31:40):

Senator, I’m not quite sure what you’re referring to, but I think people get the basic deal of using these services. It’s a free service. You’re using it to connect with the people you care about.

(03:31:49)
If you share something with people, other people will be able to see your information. It’s inherently… and if you’re putting something out there to be shared publicly or with a private set of people, you’re inherently putting it out there, so I think people get that basic part of how the service works.

Mr. Kennedy (03:32:04):

But Mr. Zuckerberg, you’re in the foothills of creepy. You track people who aren’t even Facebook users. You track your own people, your own users who are your product even when they’re not on Facebook.

(03:32:24)
I’m going to land this plane pretty quickly, Mr. Chairman. I mean, it’s creepy. And I understand you make a lot of money doing it, but I just wonder if our technology is greater than our humanity. I mean, let me ask you this final question. Instagram is harmful to young people, isn’t it?

Mark Zuckerberg (03:32:52):

Senator, I disagree with that. That’s not what the research shows on balance. That doesn’t mean that individual people don’t have issues and that there aren’t things that we need to do to help provide the right tools for people, but across all of the research that we’ve done internally, I mean the survey that the Senator previously cited, there are 12 or 15 different categories of harm that we asked teens if they felt that Instagram made it worse or better. And across all of them, except for the one that Senator Hawley cited, more people said that using Instagram [inaudible 03:33:29]-

Mr. Kennedy (03:33:29):

I’ve got to land the plane.

Mark Zuckerberg (03:33:30):

… issues they face.

Mr. Kennedy (03:33:30):

Mr. Zuckerberg-

Mark Zuckerberg (03:33:31):

… either positive or-

Mr. Kennedy (03:33:33):

… we just have to agree to disagree. If you believe that Instagram… I’m not saying it’s intentional, but if you agree that Instagram… if you think that Instagram is not hurting millions of our young people, particularly young teens, particularly young women, you shouldn’t be driving. It is. Thanks.

Mr. Chairman (03:33:56):

Senator Butler.

Ms. Butler (03:33:59):

Thank you, Mr. Chair, and thank you to our panelists who’ve come to have an important conversation with us. Most importantly, I want to appreciate the families who have shown up to continue to be remarkable champions of your children and your loved ones for being here, in particular two California families that I was able to just talk to on the break, the families of Sammy Chapman from Los Angeles and Daniel Puerta from Santa Clarita.

(03:34:32)
They are here today and are doing some incredible work to not just protect the memory and legacy of their boys, but the work that they’re doing is going to protect my 9-year-old, and that is indeed why we are here.

(03:34:49)
There are a couple of questions that I want to ask some individuals. Let me start with a question for each of you. Mr. Citron, have you ever sat with a family and talked about their experience and what they need from your product? Yes or no?

Mr. Jason Citron (03:35:06):

Yes. I have spoken with parents about how we can build tools to help them.

Ms. Butler (03:35:10):

Mr. Spiegel, have you sat with families and young people to talk about your products and what they need from your product?

Mr. Spiegel (03:35:16):

Yes, Senator.

Ms. Butler (03:35:18):

Mr. Shou?

Mr. Shou Chew (03:35:19):

Yes. I just did it two weeks ago, for example, I did-

Ms. Butler (03:35:22):

I don’t want to know what you did for the hearing prep, Mr. Chew. I just wanted to know if-

Mr. Shou Chew (03:35:27):

No, it’s an example.

Ms. Butler (03:35:27):

… anything-

Mr. Shou Chew (03:35:28):

Senator, it’s an example.

Ms. Butler (03:35:29):

… in terms of designing the product that you are creating. Mr. Zuckerberg, have you sat with parents and young people to talk about how you design product for your consumers?

Mark Zuckerberg (03:35:47):

Yes. Over the years, I’ve had a lot of conversations with parents.

Ms. Butler (03:35:50):

You know that’s interesting, Mr. Zuckerberg, because we talked about this last night and you gave me a very different answer. I asked you this very question.

Mark Zuckerberg (03:36:01):

Well, I told you that I didn’t know what specific processes our company had for answer-

Ms. Butler (03:36:08):

No, Mr. Zuckerberg, you said to me that you had not.

Mark Zuckerberg (03:36:13):

I must have misspoke.

Ms. Butler (03:36:14):

I want to give you the room to misspeak, Mr. Zuckerberg, but I asked you this very question. I asked all of you this question and you told me a very different answer when we spoke, but I won’t belabor it. A number of you have talked about the… I’m sorry, X, Ms. Yaccarino, have you talked to parents directly, young people, about designing your product?

Ms. Linda Yaccarino (03:36:40):

As a new leader of X, the answer is yes. I’ve spoken to them about the behavioral patterns because less than 1% of our users are in that age group, but yes, I have spoken to them.

Ms. Butler (03:36:54):

Thank you, ma’am. Mr. Spiegel, there are a number of parents whose children have been able to access illegal drugs on your platform. What do you say to those parents?

Mr. Spiegel (03:37:10):

Well, Senator, we are devastated that we cannot-

Ms. Butler (03:37:13):

To the parents. What do you say to those parents, Mr. Spiegel?

Mr. Spiegel (03:37:17):

I’m so sorry that we have not been able to prevent these tragedies. We work very hard to block all search terms related to drugs from our platform. We proactively look for and detect drug related content. We remove it from our platform, preserve it as evidence, and then we refer it to law enforcement for action.

(03:37:35)
We’ve worked together with nonprofits and with families on education campaigns because the scale of the fentanyl epidemic is extraordinary. Over 100,000 people lost their lives last year, and we believe people need to know that one pill can kill. That campaign was viewed more than 260 million times on Snapchat. We also-

Ms. Butler (03:37:52):

Mr. Spiegel, there are two fathers in this room who lost their sons. They’re 16 years old. Their children were able to get those pills from Snapchat.

(03:38:07)
I know that there are statistics and I know that there are good efforts. None of those efforts are keeping our kids from getting access to those drugs on your platform. As California companies, all of you, I’ve talked with you about what it means to be a good neighbor and what California families and American families should be expecting from you. You owe them more than just a set of statistics, and I look forward to you showing up on all of these pieces of legislation, all of you showing up on all pieces of legislation to keep our children safe.

(03:38:40)
Mr. Zuckerberg, I want to come back to you. I talked with you about being a parent to a young child who doesn’t have a phone, is not on social media at all, and one of the things that I am deeply concerned with as a parent to a young black girl is the utilization of filters on your platform that would suggest to young girls utilizing your platform that they are not good enough as they are.

(03:39:24)
I want to ask more specifically and refer to some unredacted court documents that revealed that your own researchers concluded that these face filters that mimic plastic surgery indeed negatively impact youth mental health and wellbeing.

(03:39:47)
Why should we believe, why should we believe that you are going to do more to protect young women and young girls when you give them the tools to affirm the self-hate that is spewed across your platforms? Why is it that we should believe that you are committed to doing anything more to keep our children safe?

Mark Zuckerberg (03:40:12):

Sorry, there’s a lot to unpack there.

Ms. Butler (03:40:14):

There is a lot.

Mark Zuckerberg (03:40:15):

We give people tools to express themselves in different ways and people use face filters and different tools to make media and photos and videos that are fun or interesting across a lot of the different products that are [inaudible 03:40:30]-

Ms. Butler (03:40:30):

Plastic surgery filters are good tools to express creativity?

Mark Zuckerberg (03:40:36):

Senator, I’m not speaking to that specific-

Ms. Butler (03:40:38):

Skin lightening tools are tools to express creativity?

Mark Zuckerberg (03:40:41):

I’m not defending-

Ms. Butler (03:40:43):

This is the direct thing that I’m asking about.

Mark Zuckerberg (03:40:44):

I’m not defending any specific one of those. I think that the ability to filter and edit images is generally a useful tool for expression. For that specifically, I’m not familiar with the study that you’re referring to, but we did make it so that we’re not recommending this type of content to teens [inaudible 03:41:06]-

Ms. Butler (03:41:06):

I made no reference to a study. I referred to court documents that revealed your knowledge of the impact of these types of filters on young people, young girls in particular.

Mark Zuckerberg (03:41:20):

Senator, I disagree with that characterization. I think that there’s… There have been hypothesis-

Ms. Butler (03:41:22):

With court documents?

Mark Zuckerberg (03:41:25):

I haven’t seen any document that says… but-

Ms. Butler (03:41:26):

Okay, Mr. Zuckerberg, my time is up. I hope that you hear what is being offered to you and are prepared to step up and do better. I know this Senate committee is going to do our work to hold you to greater account. Thank you, Mr. Chair.

Mr. Chairman (03:41:44):

Senator Tillis.

Mr. Tillis (03:41:47):

Thank you Mr. Chair. Thank you all for being here. I don’t feel like I’m going to have an opportunity to ask a lot of questions, so I’m going to reserve the right to submit some for the record, but I have heard… We’ve had hearings like this before.

(03:42:03)
I’ve been in the Senate for nine years. I’ve heard hearings like this before. I’ve heard horrible stories about people who have died, committed suicide. I’ve been embarrassed. Every year, we have an annual flogging. Every year. And what materially has occurred over the last nine years?

(03:42:29)
Just a yes or no question, do any of y’all participate in an industry consortium trying to make this fundamentally safe across platforms? Yes or no? Mr. Zuckerberg.

Mark Zuckerberg (03:42:38):

Yes.

Mr. Tillis (03:42:38):

[inaudible 03:42:39]

Ms. Linda Yaccarino (03:42:40):

There’s a variety of organizations that we work-

Mr. Tillis (03:42:41):

Do you participate?

Ms. Linda Yaccarino (03:42:43):

… which organizations-

Mr. Tillis (03:42:44):

I should say, does anyone here not participate in an industry consortium? I actually think it would be immoral for you all to consider it a strategic advantage to keep secret, or to keep private, something that would secure all these platforms to avoid this sort of problem. Do you all agree with that?

(03:43:02)
That anybody that would be saying, “You want ours because ours is the safest and these haven’t figured out the secret sauce,” that you as an industry realize this is an existential threat to you all if we don’t get it right. Right?

(03:43:12)
I mean, you’ve got to secure your platforms. You got to deal with this. Do you not have an inherent mandate to do this? Because it would seem to me if you don’t, you’re going to cease to exist.

(03:43:23)
I mean, we could regulate you out of business if we wanted to, and the reason I’m saying… It may sound like a criticism, it’s not a criticism. I think we have to understand that there should be an inherent motivation for you to get this right.

(03:43:37)
Our Congress will make a decision that could potentially put you out of business. Here’s the reason I have a concern with that though. I just went on the internet while I was listening intently to all the other members speaking, and I found a dozen different platforms outside the United States, 10 of which are in China, two of which are in Russia.

(03:43:59)
Their daily average active membership numbers in the billions. Well, people say you can’t get on China’s version of TikTok. It took me one quick search on my favorite search engine to find out exactly how I could get an account on this platform today, and so the other thing that we have to keep in mind…

(03:44:23)
I come from technology. Ladies and gentlemen, I could figure out how to influence your kid without them ever being on a social media platform. I can randomly send texts and get a bite and then find out an email address and get compromising information.

(03:44:41)
It is horrible to hear some of these stories and I’ve had these stories occur in my hometown down in North Carolina, but if we only come here and make a point today and don’t start focusing on making a difference, which requires people to stop shouting and start listening and start passing language here, the bad actors are just going to be off our shores.

(03:45:06)
I have another question for you all. How many people, roughly… If you don’t know the exact number it’s okay, roughly, how many people do you have looking 24 hours a day at these horrible images and filtering them out? Just go real quick with an answer down the line.

Mark Zuckerberg (03:45:20):

It’s most of the 40,000 or so people who work on safety and-

Mr. Tillis (03:45:24):

And again?

Ms. Linda Yaccarino (03:45:25):

We have 2,300 people all over the world.

Mr. Tillis (03:45:27):

Okay.

Mr. Shou Chew (03:45:27):

We have 40,000 trust and safety professionals around the world.

Mr. Jason Citron (03:45:33):

We have approximately 2000 people dedicated to trust and safety and content moderation.

Mr. Evan Spiegel (03:45:39):

Our platform is much smaller than these folks’. We have hundreds of people looking at the content, and 15% of our workforce is focused on it.

Mr. Tillis (03:45:45):

I’ve already said, these people have a horrible job. Many of them experience… They have to get counseling for all the things they see. We have evil people out there and we’re not going to fix this by shouting past or talking past each other.

(03:46:00)
We’re going to fix this by every one of y’all being at the table and hopefully coming closer to what I heard one person say: supporting a lot of the good bills, like one that I hope Senator Blackburn mentions when she gets a chance to talk.

(03:46:12)
But guys, if you’re not at the table and securing these platforms, you’re going to be on it. And the reason why I’m not okay with that is that if we ultimately destroy your ability to create value and drive you out of business, the evil people will find another way to get to these children. And I do have to admit, I don’t think my mom’s watching this one, but there is good.

(03:46:36)
We can’t look past the good that is occurring. My mom, who lives in Nashville, Tennessee, and I talked yesterday about a Facebook post that she made a couple of days ago. We don’t let her talk to anybody else. That connects my 92-year-old mother with her grandchildren and great-grandchildren. That lets a kid who may feel awkward in school get into a group of people and relate to people. Let’s not throw out the good while we all together focus on rooting out the bad.

(03:47:09)
Now, I guarantee you, I could go through some of your governance documents and find a reason to flog every single one of you because you didn’t place the emphasis on it that I think you should. But at the end of the day, I find it hard to believe that any of you people started this business, some of you in your college dorm rooms for the purposes of creating the evil that is being perpetrated on your platforms, but I hope that every single waking hour, you’re doing everything you can to reduce it.

(03:47:38)
You’re not going to be able to eliminate it, and I hope that there are some enterprising young tech people out there today that are going to go to parents and say, “Ladies and gentlemen, your children have a deadly weapon. They have a potentially deadly weapon, whether it’s a phone or a tablet. You have to secure it. You can’t assume that they’re going to be honest and say that they’re 16 when they’re 12.”

(03:48:05)
We all have to recognize that we have a part to play and you guys are at the tip of the spear, so I hope that we can get to a point where we are moving these bills.

(03:48:17)
If you got a problem with them, state your problem. Let’s fix it. No is not an answer, and know that I want the United States to be the beacon for innovation, to be the beacon for safety, and to prevent people from using other options that have existed since the internet has existed to exploit people. And count me in as somebody that will try and help out. Thank you, Mr. Chair.

Mr. Chairman (03:48:43):

Thank you, Senator Tillis. Next is Senator Ossoff.

Mr. Ossoff (03:48:46):

Thank you, Mr. Chairman, and thank you to our witnesses today. Mr. Zuckerberg, I want to begin by just asking a simple question, which is, do you want kids to use your platform more or less?

Mark Zuckerberg (03:49:01):

Well, we don’t want people under the age of 13 using-

Mr. Ossoff (03:49:03):

Do you want teenagers 13 and up to use your platform more or less?

Mark Zuckerberg (03:49:09):

Well, we would like to build a product that is useful and that people want to use more.

Mr. Ossoff (03:49:13):

My time is going to be limited, so it’s just… Do you want them to use it more or less? Teenagers, 13 to 17 years old, do you want them using Meta products more or less?

Mark Zuckerberg (03:49:23):

I’d like them to be useful enough that they want to use them more.

Mr. Ossoff (03:49:26):

You want them to use it more. I think herein we have one of the fundamental challenges. In fact, you have a fiduciary obligation, do you not, to try to get kids to use your platform more?

Mark Zuckerberg (03:49:43):

It depends on how you define that. We obviously are a business, but-

Mr. Ossoff (03:49:49):

I’m… Mr. Zuckerberg, our time, it’s not… It’s self-evident that you have a fiduciary obligation to get your users, including users under 18, to use and engage with your platform more rather than less. Correct?

Mark Zuckerberg (03:50:04):

Over the long term, but in the near term, we often take a lot of steps, including we made a change to show fewer videos on the platform that reduced the amount of time spent by more than 50 million hours.

Mr. Ossoff (03:50:16):

Okay, but if your shareholders ask you, “Mark…” I’ll stick with Mr. Zuckerberg here, but your shareholders might be on a first name basis with you, “Mark, are you trying to get kids to use Meta products more or less?” You’d say more, right?

Mark Zuckerberg (03:50:29):

Well, I would say that over the long term, we’re trying to create the most value-

Mr. Ossoff (03:50:33):

I mean, let’s look… So the 10-K you file with the SEC. A few things I want to note. Here are some quotes, and this is a filing that you signed, correct?

Mark Zuckerberg (03:50:40):

Yes.

Mr. Ossoff (03:50:40):

Yeah. “Our financial performance has been and will continue to be significantly determined by our success in adding, retaining, and engaging active users.”

(03:50:49)
Here’s another quote: “If our users decrease their level of engagement with our products, our revenue, financial results, and business may be significantly harmed.”

(03:50:57)
Here’s another quote: “We believe that some users, particularly younger users, are aware of and actively engaging with other products and services similar to, or as a substitute for, ours,” and it continues, “in the event that users increasingly engage with other products and services, we may experience a decline in use and engagement in key demographics or more broadly, in which case our business would likely be harmed.”

(03:51:16)
You have an obligation as the chief executive to encourage your team to get kids to use your platform more.

Zuckerberg (03:51:29):

Senator, I think this-

Mr. Ossoff (03:51:31):

Fundamental, is that not self-evident? You have a fiduciary-

Zuckerberg (03:51:33):

Senator, I think it’s not.

Mr. Ossoff (03:51:34):

… obligation to your shareholders to get kids to use your platform more.

Zuckerberg (03:51:36):

I think that the thing that’s not intuitive is the direction is to make the products more useful so that way people want to use them more. We don’t give the teams running the Instagram feed or the Facebook feed a goal to increase the amount of time that people spend.

Mr. Ossoff (03:51:52):

Yeah, but you don’t dispute, and your 10-K makes clear, that you want your users engaging more and using the platform more. And I think this gets to the root of the challenge, because it’s the overwhelming view of the public, certainly in my home state of Georgia, and we’ve had some discussions about the underlying science, that this platform is harmful for children.

(03:52:16)
I mean, you are familiar with… And not just your platform, by the way, social media in general. The 2023 report from the Surgeon General about the impact of social media on kids’ mental health cited evidence that kids who spend more than three hours a day on social media have double the risk of poor mental health outcomes, including depression and anxiety. You’re familiar with that Surgeon General report and the underlying study?

Zuckerberg (03:52:37):

I read the report, yes.

Mr. Ossoff (03:52:39):

Do you dispute it?

Zuckerberg (03:52:40):

No, but I think it’s important to characterize it correctly. I think what he was flagging in the report is that there seems to be a correlation and obviously the mental health issue is very important, so it’s something that needs to be studied further.

Mr. Ossoff (03:52:52):

The thing is everyone knows there’s a correlation. Everyone knows that kids who spend a lot of time, too much time on your platforms are at risk and it’s not just the mental health issues. I mean, let me ask you another question. Is your platform safe for kids?

Zuckerberg (03:53:08):

I believe it is, but there’s important-

Mr. Ossoff (03:53:09):

Hold on a second.

Zuckerberg (03:53:10):

… difference between correlation and causation.

Mr. Ossoff (03:53:12):

Because we’re not going to be able to get anywhere. We want to work in a productive, open, honest, and collaborative way with the private sector to pass legislation that will protect Americans, that will protect American children above all, and that will allow businesses to thrive in this country. If we don’t start with an open, honest, candid, realistic assessment of the issues, we can’t do that.

(03:53:35)
The first point is you want kids to use the platform more. In fact, you have an obligation to, but if you’re not willing to acknowledge that it’s a dangerous place for children. The internet is a dangerous place for children. Not just your platform, isn’t it? Isn’t the internet a dangerous place for children?

Zuckerberg (03:53:50):

I think it can be. Yeah. There’s both great things that people can do and there are harms that we need to work on today.

Mr. Ossoff (03:53:54):

Yeah, it’s a dangerous place for children. There are families here who have lost their children. There are families across the country whose children have engaged in self-harm, who have experienced low self-esteem, who have been sold deadly pills on the internet. The internet’s a dangerous place for children, and your platforms are dangerous places for children. Do you agree?

Zuckerberg (03:54:13):

I think that there are harms that we need to work to mitigate. I mean, I’m not going to… I think overall the-

Mr. Ossoff (03:54:17):

Why not? Why not? Why not just acknowledge it? Why do we have to do the very careful code?

Zuckerberg (03:54:23):

I disagree with the characterization that you have.

Mr. Ossoff (03:54:25):

Which characterization? That the internet’s a dangerous place for children?

Zuckerberg (03:54:28):

I think you’re trying to characterize our products as inherently dangerous, and I think that-

Mr. Ossoff (03:54:32):

Inherently or not, your products are places where children can experience harm. They can experience harm to their mental health, they can be sold drugs, they can be preyed upon by predators. They’re dangerous places, and yet you have an obligation to promote the use of these platforms by children. All I’m trying to suggest to you, Mr. Zuckerberg, and my time is running short, is that in order for you to succeed, you and your colleagues here, we have to acknowledge these basic truths.

(03:55:07)
We have to be able to come before the American people, the American public, the people in my state of Georgia and acknowledge the internet is dangerous, including your platforms. There are predators lurking. There are drugs being sold. There are harms to mental health that are taking a huge toll on kids’ quality of life.

(03:55:26)
And yet you have this incentive, not just you, Mr. Zuckerberg, all of you have an incentive to boost, maximize use, utilization and engagement, and that is where public policy has to step in to make sure that these platforms are safe for kids so kids are not dying, so kids are not overdosing. So kids are not cutting themselves or killing themselves because they’re spending all day scrolling instead of playing outside. And I appreciate all of you for your testimony. We will continue to engage as we develop this legislation. Thank you.

Dick Durbin (03:56:00):

Senator from Tennessee.

Senator Blackburn (03:56:02):

Thank you Mr. Chairman. Thank you to each of you for coming and I know some of you had to be subpoenaed to get here, but we do appreciate that you all are here. Mr. Chew, I want to come to you first. We’ve heard that you’re looking at putting a headquarters in Nashville and likewise in Silicon Valley and Seattle, and what you’re going to find probably is that the welcome mat is not going to be rolled out for you in Nashville like it would be in California.

(03:56:33)
There are a lot of people in Tennessee that are very concerned about the way TikTok is basically building dossiers on our kids, the way they are building those on their Virtual You, and also that that information is held in China, in Beijing, as you responded to Senator Blumenthal and me last year in reference to that question. And we also know that a major music label yesterday said they were pulling all of their content off your site because of your issues on payment, on artificial intelligence, and because of the negative impact on our kids’ mental health. So we will see how that progresses.

(03:57:23)
Mr. Zuckerberg, I want to come to you. Senator Blumenthal and I, of course, have had some internal documents and emails that have come our way. One of the things that really concerned me is that you referred to your young users in terms of their lifetime value being roughly $270 per teenager.

(03:57:50)
And each of you should be looking at these kids. The T-shirts they’re wearing today say, “I’m worth more than $270.” We’ve got some standing up in those T-shirts. And some of the children from our state, some of the children and the parents that we have worked with, just to think, whether it is Becca Schmidt, David Molik, Sarah Flat, Anna Lee Short: would you say that life is only worth $270? What could possibly lead you?

(03:58:42)
I mean, I listened to that. I know you’re a dad. I’m a mom, I’m a grandmom, and how could you possibly even have that thought? It is astounding to me, and I think this is one of the reasons that 42 states are now suing you because of features that they consider to be addictive that you are pushing forward.

(03:59:12)
And in the emails that we’ve got from 2021 that go from August to November, there is the staff plan that is being discussed. Antigone Davis, Nick Clegg, Sheryl Sandberg, Chris Cox, Alex Schultz and Adam Mosseri are all on this chain of emails on the wellbeing plan.

(03:59:32)
And then we get to one: Nick did email Mark to emphasize his support for the package, but it sounds like it lost out to various other pressures and priorities. See, this is what bothers us. Children are not your priority. Children are your product. Children you see as a way to make money, and protecting children in this virtual space…

(04:00:07)
You made a conscious decision even though Nick Clegg and others were going through the process of saying, this is what we do. These documents are really illuminating. And it just shows me that growing this business, expanding your revenue, what you were going to put on those quarterly filings, that was the priority. The children were not. It’s very clear.

(04:00:46)
I want to talk with you about the pedophile ring, because that came up earlier and the Wall Street Journal reported on it. One of the things that we found out was that after it became evident, you didn’t take that content down, and it was content that showed that teens were for sale and were offering themselves to older men. And you didn’t take it down because it didn’t violate your community standards.

(04:01:15)
Do you know how often a child is bought or sold for sex in this country? Every two minutes. Every two minutes a child is bought or sold for sex. That’s not my stat. That is a TBI stat. Now finally, this content was taken down after a congressional staffer went to Meta’s Global Head of Safety. So would you please explain to me and to all these parents why explicit predatory content does not violate your platform’s terms of service or your community standards?

Zuckerberg (04:02:00):

Sure, Senator. Let me try to address all of the things that you just said. It does violate our standards. We work very hard to take it down.

Senator Blackburn (04:02:06):

Didn’t take it down.

Zuckerberg (04:02:08):

Well, we’ve reported, I think it’s more than 26 million examples of this kind of content.

Senator Blackburn (04:02:13):

Didn’t take it down until a congressional staffer brought it up.

Zuckerberg (04:02:16):

It may be that in this case we made a mistake and missed something.

Senator Blackburn (04:02:19):

I think you make a lot of mistakes, so let’s move on.

Zuckerberg (04:02:21):

But we have leading teams that identify more than-

Senator Blackburn (04:02:23):

I want to talk with you about your Instagram Creators Program and about the push. We found out through these documents that you actually are pushing forward because you want to bring kids in early. You see these younger teenagers as a valuable but untapped audience, quoting from the emails, and suggesting teens are actually household influencers who bring their younger siblings into your platform, into Instagram. Now, how can you ensure that Instagram Creators, your product, your program, does not facilitate illegal activities when you fail to remove content pertaining to the sale of minors? And it is happening once every two minutes in this country.

Zuckerberg (04:03:22):

Senator, our tools for identifying that kind of content are industry leading. That doesn’t mean we’re perfect. There are definitely issues that we have, but we continue to invest a ton in it.

Senator Blackburn (04:03:33):

Mr. Zuckerberg, yes, there are a lot that is slipping through. It appears that you’re trying to be the premier sex trafficking site-

Zuckerberg (04:03:39):

Of course not, Senator.

Senator Blackburn (04:03:39):

… in this country.

Zuckerberg (04:03:39):

Senator, that’s ridiculous. Senator.

Senator Blackburn (04:03:41):

No, it is not ridiculous. You want to turn around and tell these people that-

Zuckerberg (04:03:44):

Of course we don’t want this content on our platforms and we-

Senator Blackburn (04:03:46):

Why don’t you take it down?

Zuckerberg (04:03:47):

We do take it down.

Senator Blackburn (04:03:48):

We are here discussing. We need you all to work with us.

Zuckerberg (04:03:51):

We do more work to take it down than-

Senator Blackburn (04:03:53):

No, you are not. You are not. And the problem is we’ve been working on this… Senator Welch is over there. We’ve been working on this stuff for a decade. You have an army of lawyers and lobbyists that have fought us on this every step of the way. You work with NetChoice, the Cato Institute, Taxpayers Protection Alliance and Chamber of Progress to actually fight our bipartisan legislation to keep kids safe online. So are you going to stop funding these groups? Are you going to stop lobbying against this and come to the table and work with us? Yes or no?

Zuckerberg (04:04:34):

Senator, we have a-

Senator Blackburn (04:04:35):

Yes or no?

Zuckerberg (04:04:37):

Of course we’ll work with you on the legislation.

Senator Blackburn (04:04:39):

Okay, the door is open. We’ve got all these bills. You need to come to the table. Each and every one of you need to come to the table and you need to work with us. Kids are dying.

Dick Durbin (04:04:57):

Senator Welch.

Senator Welch (04:05:00):

I’m going to thank my colleague, Senator Blackburn, for her decade of work on this. I actually have some optimism. There is a consensus today that didn’t exist, say, 10 years ago, that there is a profound threat to children, to mental health, to safety. That’s not in dispute the way it was before. That’s a starting point. Secondly, we’re identifying concrete things that can be done in four different areas. One is industry standards, two is legislation, three is the courts. And then four is a proposal that Senator Bennet, Senator Graham, myself and Senator Warren have to establish an agency, a governmental agency whose responsibility would be to engage in this on a systematic, regular basis with proper resources.

(04:06:07)
And I just want to go through those. I appreciate the industry-standard decisions and steps that you’ve taken in your companies, but it’s not enough. And that’s what I think you’re hearing from my colleagues. For instance, where there are layoffs, it is in the trust and verify programs. That’s alarming, because it looks like there is a reduction in emphasis on protecting things. Like, you just added, Ms. Yaccarino, 100 employees in Texas in this category, and how many did you have before?

Yaccarino (04:06:48):

The company is just coming through a significant restructuring. So we’ve increased the number of trust and safety employees and agents all over the world by at least 10% so far in the last 14 months. And we will continue to do so specifically in Austin, Texas.

Senator Welch (04:07:05):

All right, Mr. Zuckerberg, my understanding is there have been layoffs in that area as well. They’ve added jobs there at Twitter, but at Meta, have there been reductions in that area?

Zuckerberg (04:07:16):

There have been across the board, not really focused on that area. I think our investment is relatively consistent over the last couple of years. We invested almost $5 billion in this work last year and I think this year we’ll be on the same order of magnitude.

Senator Welch (04:07:30):

All right, another question that’s come up is when, to the horror of a user of any of your platforms, somebody has an image on there that’s very compromising, often of a sexual nature. Is there any reason in the world why a person who wants to take that down can’t have a very simple same-day response to have it taken down? I’ll start with Twitter on that.

Yaccarino (04:07:59):

I’m sorry, Senator. I was taking notes. Could you repeat the question?

Senator Welch (04:08:03):

Well, there’s a lot of examples of a young person finding out about an image that is of them and really compromises them and actually can create suicidal thoughts, and they want to call up or they want to send an email and say, “Take it down.” I mean, why is it not possible for that to be responded to immediately?

Yaccarino (04:08:26):

Well, we all strive to take down any type of violative content or disturbing content immediately. At X, we have increased our capabilities with a two-step reporting process.

Senator Welch (04:08:39):

Shouldn’t that just be standard? If I’m a parent or I’m a kid and I want this down, shouldn’t there be methods in place where it comes down? You can see what the image is.

Zuckerberg (04:08:49):

Yes.

Senator Welch (04:08:50):

Mr. Zuckerberg.

Yaccarino (04:08:52):

An ecosystem-wide standard would improve and actually enhance the experience for users on all our platforms.

Senator Welch (04:08:58):

All right.

Zuckerberg (04:08:59):

There actually is an organization that I think a number of the companies up here are a part of called Take It Down. It’s some technology that we and a few others built, but basically-

Senator Welch (04:09:08):

All right, you all are in favor of that because-

Zuckerberg (04:09:09):

Yeah, this should exist.

Senator Welch (04:09:10):

… that is going to give some peace of mind to people. All right? It really, really matters. I don’t have that much time. So we’ve talked about the legislation and Senator Whitehouse had asked you to get back with your position on section 230, which I’ll go to in a minute, but I would welcome each of you responding as to your company’s position on the bills that are under consideration in this hearing. All right? I’m just asking you to do that.

(04:09:41)
Third, the court, this big question of section 230, and today I’m pretty inspired by the presence of the parents who have turned their extraordinary grief into action and hope that other parents may not have to suffer what for them is a devastating, for everyone, a devastating loss.

(04:10:03)
Senator Whitehouse asked you all to get back very concretely about Section 230 and your position on that, but it’s an astonishing benefit that your industry has that no other industry has. You just don’t have to worry about being held accountable in court if you’re negligent. So you’ve got some explaining to do, and I’m just reinforcing Senator Whitehouse’s request that you get back specifically about that.

(04:10:36)
And then finally, I want to ask about this notion. It’s this idea of a federal agency who’s resourced and whose job is to be dealing with public interest matters that are really affected by big tech. It’s extraordinary what has happened in our economy with technology and your companies represent innovation and success.

(04:11:02)
But just as when the railroads were ascendant and were in charge and ripping off farmers because of practices they were able to get away with, just as when Wall Street was flying high but there was no one regulating, no blue sky laws, we now have a whole new world in the economy. And Mr. Zuckerberg, I remember you testifying in the Energy and Commerce Committee, and I asked you your position on the concept of a federal regulatory agency. My recollection is that you were positive about that. Is that still the case?

Zuckerberg (04:11:36):

I think it could be a reasonable solution. There are obviously pros and cons to doing that versus the current structure of having different regulatory agencies focused on specific issues, because a lot of the things trade off against each other. Like one of the topics that we talked about today is encryption, and that’s obviously really important for privacy and security, but-

Senator Welch (04:11:56):

Right. Can we just go down the line? I’m at the end, but thank you. Ms. Yaccarino.

Yaccarino (04:12:00):

Senator, I think the industry initiative to keep those conversations going would be something X would be very, very proactive about. If you think about our support of the Report Act, the SHIELD Act, the Stop CSAM Act, our support of the Project Safe Childhood Act, I think our intentions are clear to participate in SHIELD here.

Senator Welch (04:12:19):

Mr. Chew.

Chew (04:12:20):

Senator, we support national privacy legislation for example. So that sounds like a good idea. We just need to understand what it means.

Senator Welch (04:12:27):

All right, Mr. Spiegel.

Spiegel (04:12:30):

Senator, we’ll continue to work with your team and we’d certainly be open to exploring the right regulatory body for big technology.

Senator Welch (04:12:35):

But the idea of a regulatory body is something that you can see has merit.

Spiegel (04:12:41):

Yes, Senator.

Senator Welch (04:12:42):

And Mr. Citron.

Citron (04:12:44):

Yeah, we’re very open to working with you and our peers and anybody on helping make the internet a safer place. I think you mentioned this is not a one-platform problem, so we do look to collaborate with other companies, with nonprofits and with the government to make it safe for everybody.

Senator Welch (04:12:59):

Okay. I thank you all. Mr. Chairman, I yield back.

Dick Durbin (04:13:01):

Thank you, Senator Welch. Well, we’re going to conclude this hearing and thank you all for coming today. You probably have your scorecard out there. You’ve met at least 20 members of this committee and have your own impressions of their questioning approach and the like. But the one thing I want to make clear as chairman of this committee for the last three years is this was an extraordinary vote on an extraordinary issue.

(04:13:24)
A year ago, we passed five bills unanimously in this committee. You heard all the senators, every spot on the political spectrum was covered. Every single senator voted unanimously in favor of the five pieces of legislation we’ve discussed today. It ought to tell everyone who follows Capitol Hill and Washington a pretty stark message. We get it and we live it. As parents and grandparents we know what our daughters and sons and others are going through.

(04:13:56)
They cannot cope. They cannot handle this issue on their own. They’re counting on us as much as they’re counting on the industry to do the responsible thing. And some will leave with impressions of our witnesses and the companies they represent. That’s your right as an American citizen. But you ought to also leave with the determination to keep the spotlight on us to do something. Not just to hold a hearing, bring out a good strong crowd of supporters for change, but to get something done. No excuses, no excuses. We’ve got to bring this to a vote.

(04:14:32)
What I found in my time in the House and in the Senate is that’s the moment of reckoning. Speeches notwithstanding, press releases and the like, the moment of reckoning is when we call a vote on these measures. It’s time to do that. I don’t believe there’s ever been a moment in America’s wonderful history when a business or industry has stepped up and said, “Regulate us. Put some legal limits on us.” Businesses exist by and large to be profitable, and I think that we’ve got to get behind that and ask: profitability at what cost?

(04:15:02)
Senator Kennedy, my Republican colleague, said, “Is our technology greater than our humanity?” I think that is a fundamental question that he asked. What I would add to it: are our politics greater than our technology? We’re going to find out. I want to thank a few people before we close up here. I’ve got several staffers who’ve worked so hard on this. Alexandra Gelber. Thank you very much, Alexandra. Jeff Hanson. Scott Jordan Hanson. The last point I’ll make, Mr. Zuckerberg, is just a little advice to you. I think your opening statement on mental health needs to be explained, because I don’t think it makes any sense. There isn’t a parent in this room who’s had a child that’s gone through an emotional experience like this that wouldn’t tell you and me, “They changed right in front of my eyes.” They changed. They holed themselves up in their room. They no longer reached out to their friends. They lost all interest in school. These are mental health consequences that I think come with the abuse of this right to have access to this technology. So let’s see, my colleague, do you want to say a word?

Speaker 18 (04:16:14):

I think it was a good hearing. I hope something positive comes from it. Thank you all for coming.

Dick Durbin (04:16:17):

The hearing record’s going to remain open for a week for statements, and questions may be submitted by senators by 5:00 PM on Wednesday. Once again, thanks to the witnesses for coming. The hearing stands adjourned.
