Sep 30, 2021
Facebook Head of Safety Testimony on Mental Health Effects: Full Senate Hearing Transcript
Antigone Davis, Facebook’s Head of Safety, testified on social media’s effects on children and teens before the Senate on September 30, 2021. Read the full transcript of the hearing here.
Mr. Blumenthal: (00:00)
This effort and to the Ranking Member, Senator Wicker, who is also with us and who has helped to lead this effort has been very, very bipartisan. And I think the ongoing series of hearings that we will have, similarly will be bipartisan in its objective and its conduct. I want to welcome our witness, Ms. Davis, who is appearing on behalf of Facebook, thank you for being with us. This hearing is the third in a series intended to help us draft legislation, but not just educate, not just legislate, but also to prompt action by Facebook itself. And that action has to address the harms that children and teens face on social media. I want to make clear that our interests are not limited to Facebook and Instagram. Our subcommittee has secured commitments from several social media companies to appear in the coming weeks. We will hold them to those promises.
Mr. Blumenthal: (01:10)
We’re here today because Facebook has shown us once again, that it is incapable of holding itself accountable. This month, a whistleblower approached my office to provide information about Facebook and Instagram. Thanks to documents provided by that whistleblower as well as extensive public reporting by The Wall Street Journal and others, we now have deep insight into Facebook’s relentless campaign to recruit and exploit young users. We now know while Facebook publicly denies that Instagram is deeply harmful for teens, privately Facebook researchers and experts have been ringing the alarm for years. We now know that Facebook routinely puts profits ahead. This [inaudible 00:02:10] reports that Facebook has not disclosed why. That will be a question that I think will resonate throughout this hearing, because the fact of the matter is Facebook has concealed research, studies, experts that show the harm that has been caused to children on its site, how it knew about that harm and how it concealed it continually.
Mr. Blumenthal: (02:39)
In August ahead of this hearing Senator Blackburn and I wrote to Mark Zuckerberg, and we asked, as you can see from this poster board, “Has Facebook research ever found that its platforms and products have a negative effect on children’s and teens’ mental health or wellbeing such as increased suicidal thoughts, heightened anxiety, unhealthy usage patterns, negative self-image or other indications of lower well-being?” Facebook’s response was, “We are not aware. We are not aware of a consensus among studies or experts about how much screen time is too much.”
Mr. Blumenthal: (03:38)
That response was simply untrue. Facebook knows. It knows the evidence of harm to teens is substantial and specific to Instagram. In new, previously undisclosed documents provided by the whistleblower, which we are making available now through these quotes, we know that its own comprehensive internal review indicated that Facebook employees found, and I quote, “substantial evidence suggests that experiences on Instagram and Facebook make body dissatisfaction worse, particularly viewing attractive images of others, viewing filtered images, posting selfies, and viewing content with certain hashtags.” I’m going to repeat that quote. “Substantial evidence suggests that experiences on Instagram and Facebook make body dissatisfaction worse, particularly viewing attractive images of others, viewing filtered images, posting selfies, and viewing content with certain hashtags.”
Mr. Blumenthal: (05:05)
That finding was not some disgruntled Facebook employee making a complaint. It was Facebook’s own employees making a formal finding based on their research. And it was available at the highest levels of Facebook’s management. In our August letter, we also asked, “Has Facebook ever found that child or teenage users engage in usage patterns that would indicate addictive or unhealthy usage of its platforms or products?” Facebook didn’t even bother to respond directly and pointed us to a previous evasion.
Mr. Blumenthal: (05:55)
And there was a reason it responded in that way because Facebook knows, they know that children struggle with addiction on Instagram and they didn’t want to admit it. Facebook researchers have concluded that teens, “Have an addict’s narrative about their use.” Another survey also not disclosed publicly found that, “Over one third of teens felt they have only a little control or no control at all over how Instagram makes them feel.” Again, this conclusion is not solely one report, one Facebook employee’s perspective, it is a pattern of findings repeated across sophisticated and extensive studies that Facebook itself conducted over the past four years, not displaced or disgruntled employees, but Facebook’s formal findings and conclusions. Facebook knows the destructive consequences that Instagram’s design and algorithms are having on our young people in our society, but it has routinely prioritized its own rapid growth over basic safety for our children.
Mr. Blumenthal: (07:42)
There is a teenage mental health crisis in America. After years of decline starting in 2007, the suicide rate for young people has begun to skyrocket. The suicide rate for 10- to 14-year-olds has doubled; for young girls, it has quadrupled. Instagram didn’t create this crisis, but from the documents provided by the whistleblower clearly Facebook’s own researchers describe Instagram itself as a “perfect storm” that, and I quote again, “exacerbates downward spirals.” Facebook knew it was a perfect storm through Instagram that exacerbates downward spirals. My office did its own research. We created an Instagram account identified as a 13-year-old girl and followed a few easily findable accounts associated with extreme dieting and eating disorders.
Mr. Blumenthal: (08:52)
Within a day its recommendations were exclusively filled with accounts that promote self-injury and eating disorders. That is the perfect storm that Instagram has fostered and created. So Facebook has asked us to trust it, but after these evasions and these revelations, why should we? It’s clear that Facebook has done nothing to earn that trust. Not from us, not from parents, not from the public. In truth, Facebook has taken big tobacco’s playbook, it has hidden its own research on addiction and the toxic effects of its products. It has attempted to deceive the public and us in Congress about what it knows, and it has weaponized childhood vulnerabilities against children themselves. It’s chosen growth over children’s mental health and wellbeing, greed over preventing the suffering of children. These internal Facebook studies are filled with recommendations, recommendations from Facebook’s own employees. And yet there is no evidence, none that Facebook has done anything other than a few small, minor, marginal changes. We all know that Facebook treated protecting kids with disregard. If it had protected kids like it drove up revenue or growth, it would have done a whole lot more. Instead, Facebook has evaded, misled and deceived. I hope that this hearing provides real transparency and marks the start of a change from Facebook. Parents deserve the truth. Thank you to everyone for being here this morning, I’ll turn to the Ranking Member. And then if the chairwoman or the Ranking Member have remarks, I’d be happy to call on them.
Mrs. Blackburn: (11:06)
Thank you, Mr. Chairman. And I want to say thank you to you and your staff for working in partnership with us on this hearing. And I wish that Senator Markey was still here. He and I have been on this issue since we were each in the House and working on privacy, Big Tech accountability. So this is the type of hearing that has been a long time coming. And this is truly an important conversation for us to be having, to continue, and to be bringing our findings forward so that the public is aware. There are a lot of moms, security moms I call them, that are very concerned about what they see happening in virtual space. In 2019, the CDC released some data, and adding to what you were talking about, I think this is important. In 2019, the CDC data showed that 20%, 20% of our American high school students seriously considered attempting suicide.
Mrs. Blackburn: (12:20)
40% reported experiencing sadness, hopelessness. Now our children, who have lived through COVID, school closings, and more upheaval in their lives than ever before, deserve better than this. Yet the social interaction and relationships that they so desperately need, they are finding on social media, on sites like Instagram, TikTok, Snapchat. And now we know that at least one of these sites, Facebook, knows that its services are actively harming young children. They know this. How did they know this? Because they did their own research, as Chairman Blumenthal just said.
Mrs. Blackburn: (13:15)
In 2019 and 2020, Facebook’s in-house analysts performed a series of deep dives into teen use of Instagram, and it revealed, and I’m quoting from the report, “Aspects of Instagram exacerbate each other to create a perfect storm,” and that perfect storm manifests itself in the minds of teenagers in the form of intense social pressure, addiction, body image issues, eating disorders, anxiety, depression, and suicidal thoughts. But it gets even worse than this because Facebook, despite touting their compliance with COPPA, was scheming to bring even younger users into their fold.
Mrs. Blackburn: (14:10)
Instagram announced this week that it is temporarily shelving their plans for Instagram Kids. But until this week they were moving forward with this trying to bring younger children onto their platforms. Yet at the same time, that we’re learning this, The Wall Street Journal reported how Facebook tried to use play dates… that is right, play dates to attract more children to its Messenger Kids service. In fact, Facebook is fully aware that underage children are using their platforms. Not only that, but they encourage older teen siblings to recruit their younger siblings and are actually devising marketing plans to help kids and teens… Get this, create secondary or anonymous accounts that they can hide from their parents. And they perform market research on kids as young as eight years old, so they can learn how to recruit them to their sites. Facebook is also aware of other types of harmful content on their site.
Mrs. Blackburn: (15:28)
In fact, a report shows how Facebook knew about content devoted to coercing women into domestic servitude, yet they chose to do nothing to stop it until Apple threatened to pull Facebook from the app store. That’s correct. It took Apple standing up to get them to stop this. In fact, this seems to be a recurring theme with this company, “Do everything and anything to mold the world into your own image for your own profit, without any regard for any harm that is going to be done because your focus is on your pocketbook.” Adam Mosseri, CEO of Instagram, continues to double down on youth marketing. He said on the Today Show earlier this week when asked about Instagram Kids, and I’m quoting him, “I firmly believe it’s a good idea. As a father, the most important thing to me is the safety of my children.”
Mrs. Blackburn: (16:39)
Well, Mr. Mosseri, I am a mother and I’m a grandmother and I really beg to differ with you. In fact, I would imagine that most of the chief mommas in charge at their own households would disagree with you. I think they would vehemently disagree with you. They don’t want their kids going on platforms like Instagram, even if you assure us that it will be safe for tweens. As the Chairman said, “You’ve lost the trust.” And we do not trust you with influencing our children, with reading in their minds. They also don’t want Facebook collecting data on their children because… Call them whatever you want, tweens, teens, young adults, the bottom line is these are children. They’re children, and you and Mr. Zuckerberg, both of you being parents, should understand that Facebook has both a legal and a moral obligation to forego collecting and using children’s data.
Mrs. Blackburn: (18:04)
So Mr. Chairman, I’m grateful for the opportunity that we have this hearing today to continue to investigate, continue to expose what is happening in the virtual space. And I am certain that we will be holding Facebook to account as other tech platforms will be held to account. Ms. Davis, I do thank you for appearing before us today, and I hope that we can have a very frank and candid conversation. Thank you, Mr. Chairman.
Mr. Blumenthal: (18:36)
Thanks Senator Blackburn. I call on Senator Cantwell.
Ms. Cantwell: (18:40)
Thank you Mr. Chairman, and thank you Ranking Member Blackburn for this hearing today and for your long standing work on this very important public policy area. I think it’s very important to understand that our committee would like to move forward on stronger privacy legislation. And yesterday’s hearing clearly crystallized that we need to update the Children’s Online Privacy Protection Act. And this hearing, I’m sure, will put even more focus on the fact that we need to do that. I want to thank Senator Markey for his questioning yesterday. This month, The Wall Street Journal published a series of articles about Facebook and Instagram showing that management knew in great detail about the impacts of these products, the harm to children, the harm to teenagers, and in spite of it continued to bury that knowledge.
Ms. Cantwell: (19:37)
So as our colleagues said, data collection of children is something that should have more aggressive attention. They should not have the products and services track and follow these young children, and updating COPPA will be essential. As we said yesterday, the committee also talked about first-time privacy and data security violations. There was unanimous support for that. So it’s very important that we continue to take steps on this issue. I agree that the safeguards in place are not enough and we need to do more. So I look forward to hearing from the witness today.
Mr. Blumenthal: (20:20)
Thank you, Senator Cantwell. Senator Wicker do you have any [crosstalk 00:20:24]?
Mr. Wicker: (20:23)
Yeah, thank you Senator Blumenthal. And I’ll be brief because we need to get to our witness. Facebook is one of a handful of Big Tech companies wielding immense power over our internet experiences. Using its market dominance, Facebook maintains unprecedented control over the vast flow of news, information and speech on the internet. To maintain a free, open, safe, and secure internet, many of us on this committee have long called for more transparency and accountability from Facebook and other social media platforms. Today, the content moderation and data collection practices of Big Tech remain largely hidden to consumers. Too often, Americans are left wondering why their online posts have been deleted, demoted, demonetized or outright censored without a full explanation. Users also remain in the dark about what data is being collected about them, how it’s being used and to whom it’s being sold and for what purpose.
Mr. Wicker: (21:31)
Recent reports from The Wall Street Journal may have shed new light on why Facebook’s platform management practices have been kept from public view. This month, the Journal revealed that Facebook’s so-called cross-check program reportedly exempts certain public figures from its terms of service and community standards. The Journal also disclosed Facebook’s own internal research documenting the harmful mental effects of the platform and its photo-sharing site’s effects on children and teens. Both of these reports are deeply troubling and only amplify concerns about Facebook’s inconsistent enforcement of its content moderation policies and its disregard for the wellbeing of children and teens. This morning, I hope Facebook will be forthcoming about its platform management practices and take this opportunity to address The Wall Street Journal’s reports. I also hope Facebook will outline what it is doing to increase transparency and begin protecting users of all ages on its platforms.
Mr. Wicker: (22:40)
Following yesterday’s data privacy hearing, what remains clear to me is that Congress must act to address Big Tech’s continued reign to censor content, suppress certain viewpoints, prioritize favored political speech, stockpile consumer data, and act in other unfair and anticompetitive ways. The time to act is now, and I’m the fourth member of this committee this morning to say that. Addressing these issues is essential to preserving a free and open internet and a thriving digital economy for generations to come. We are serious about taking action. Thank you, Mr. Chairman.
Mr. Blumenthal: (23:36)
Thanks Senator Wicker. We will now turn to our witness Ms. Antigone Davis, who is the Global Head of Safety at Facebook. She spearheads Facebook’s safety advisory board efforts. And she earned her JD from the University of Chicago Law School and her BA from Columbia University, Ms. Davis, the floor is yours.
Ms. Antigone Davis: (24:03)
Thank you, Chairman Blumenthal, Ranking Member Blackburn, and distinguished members of the subcommittee. Thank you for the opportunity to appear before you today. My name is Antigone Davis. I’m a parent, a former teacher and the Global Head of Safety at Facebook. Like you, I care deeply about the safety and wellbeing of young people online. And I have dedicated the better part of my adult life to these issues. In my current role, I work with internal teams and external stakeholders to ensure that Facebook remains a leader in online safety, including issues of bullying and combating child exploitation. This is some of the most important work that I’ve done in my career. And I’m proud of the work that my team does every day. At Facebook, we take the privacy, safety and wellbeing of all those who use our platform very seriously, especially the youngest people on our services.
Ms. Antigone Davis: (24:58)
We work tirelessly to put in place the right policies, products, and precautions so they have a safe and positive experience. We have dedicated teams focused on youth safety, and we invest significant resources in protecting teens online. We also know that we can’t do this work alone. We work closely with experts and parents to inform the features we develop. We require everyone to be at least 13 years of age on Facebook and Instagram. When we learn that an underage user has created an account, we remove them from our platform. When it comes to those between 13 and 17, we consult with experts to ensure that our policies properly account for their presence, for example, by age-gating content. We work constantly to improve safety and privacy protections for young people. For example, earlier this year, we announced that all users under 16 in the US will now be defaulted into a private account when they join Instagram.
Ms. Antigone Davis: (25:55)
We also think it is critical to give parents and guardians the information, resources, and tools they need to set parameters for their children and help them develop healthy and safe online habits. That’s why we publish a variety of guides and portals intended to foster important conversations around online safety. And we’re fortunate to do all this work with the help of industry experts, including our youth advisors, a group of experts in privacy, youth development, psychology, parenting, and youth media. We understand that recent reporting has raised a lot of questions about our internal research, including research we do to better understand young people’s experiences on Instagram.
Ms. Antigone Davis: (26:35)
We strongly disagree with how this reporting characterized our work. So we want to be clear about what the research shows and what it does not show. The research shows that many teens say that Instagram is helping them with hard issues that are so common to being a teen. In fact, one of the main slides referenced in the article includes a survey of 12 difficult and serious issues like loneliness, anxiety, sadness, and eating disorders. We asked teens who told us that they were struggling with these issues whether Instagram was making things better, worse, or having no effect. On 11 of the 12 issues, teen girls who said they struggled with those issues were more likely to say that Instagram was affirmatively helping them, not making it worse. That was true for teen boys on 12 of 12 issues. I want to be clear, I’m not diminishing the importance of these issues or suggesting that we will ever be satisfied while anyone is struggling on our apps.
Ms. Antigone Davis: (27:31)
That’s why we conduct this research to make our platform better, to minimize the bad and maximize the good and to proactively identify where we can improve. And the most important thing about our research is what we’ve done with it. We’ve built AI to identify suicide content on our platform and rapidly respond with resources. We’ve launched tools to help control time spent on our apps. We’ve built a dedicated reporting flow for eating disorder related content and we offer resources when people try to search for it. We have a long track record of using our internal research, external research and close collaboration with experts to improve our apps and provide resources for people who use them.
Ms. Antigone Davis: (28:14)
And our work to respond to this research is ongoing. One idea we think has promise is finding opportunities to jump in if we see people dwelling on certain types of content and point them to content that inspires and uplifts them. Finally, I want to speak to our work on an Instagram experience for those under 13. As every parent knows when it comes to kids and tweens, they’re already online. We believe that it is better for parents to have the option to give tweens access to a version of Instagram that’s designed for them, where parents can supervise and manage their experience, than to have them lie about their age to access a platform that wasn’t built for them, or rely on an app’s ability to verify the age of kids who are too young to have an ID. That’s why we’ve been working on delivering age appropriate, parent supervised experiences.
Ms. Antigone Davis: (29:02)
… working on delivering age appropriate parent supervised experiences, something YouTube and TikTok already do, but we recognize how important it is to get this right. We have heard your concerns, which is why we announced that we are pausing the project to take more time. We’ll keep listening to parents, keep talking with policy makers and regulators like yourselves, keep taking guidance from experts, and we’ll revisit this project at a later date.
Ms. Antigone Davis: (29:27)
There’s an important part of what we’ve been developing for Instagram Kids that we won’t be pausing: supervisory tools for parents. We will continue our work to allow parents to oversee their children’s accounts by offering these tools to teen accounts on Instagram. These new features, which parents and teens can opt into, will give parents tools to meaningfully shape their teen’s experience. As a parent, this development means a lot to me. I know I would’ve truly appreciated more insight and tools to help me support my daughter and manage her online experience as she learned how to navigate social media.
Ms. Antigone Davis: (30:03)
I want to thank you for the opportunity to discuss these important issues with you today and answer your questions. Youth safety and wellbeing are areas where we are investing heavily, and we welcome productive collaboration with lawmakers and elected officials. Thank you, and I look forward to our discussion.
Mr. Blumenthal: (30:19)
Thank you, Miss Davis. We both know, all of us know as parents, how vulnerable teens are at this age. How they can succumb to eating disorders, even suicidal tendencies, and how susceptible they are. So, the effect, known to Facebook, of its site condoning and even encouraging those tendencies is so deeply repugnant. Facebook knows from its own report, previously undisclosed, that it found in a December 2020 survey of over 50,000 Facebook users that, “Teens, women of all ages, and people in Western countries experience higher levels of both body image concerns and problems with appearance comparison on Instagram.”
Mr. Blumenthal: (31:24)
In an April 2021 report, which also has not been disclosed, it found a quarter of teen girls felt discouraged about their own life and worse about themselves, often or very often, after using Instagram. Another undisclosed report, from March 2020, found, “Social comparison is worse on Instagram,” in part because its recommendations “enable never ending rabbit holes” and because it is “perceived as real life”. I don’t understand, Miss Davis, how you can deny that Instagram isn’t exploiting young users for its own profits.
Ms. Antigone Davis: (32:31)
Thank you, Senator, for your question. I’d like to speak specifically to this as an experienced mom of a teenage daughter, as someone who was a teenage girl herself, and as someone who’s taught middle school and teenage girls. I’ve seen firsthand the troubling intersection between the pressure to be perfect, body image, and finding your identity at that age. I think what’s been lost in this report is that in fact with this research we found that more teen girls actually find Instagram helpful.
Ms. Antigone Davis: (33:11)
More teen girls who are suffering from these issues find Instagram helpful than not. Now, that doesn’t mean that the ones that aren’t, aren’t important to us. In fact, that’s why we do this research. It’s leading to [crosstalk 00:33:24].
Mr. Blumenthal: (33:23)
Well, if I may interrupt you, Miss Davis, these are your own reports. These findings are from your own studies and your own experts. You can speak from your own experience, but will you disclose all of the reports, all of the findings, will you commit to full disclosure?
Ms. Antigone Davis: (33:46)
Senator, thank you. I know that we have released a number of the reports and we are looking to find ways to release more of this research. I want to be clear that this research is not a bombshell. It’s not causal research. It’s in fact just directional research that we use for product-
Mr. Blumenthal: (34:04)
Well, I beg to differ with you, Miss Davis. This research is a bombshell. It is powerful, gripping, riveting evidence that Facebook knows of the harmful effects of its site on children and that it has concealed those facts and findings. So, I ask you to commit that you will make full disclosure of all of the thousands of pages of documents that the whistleblower has and more that can be made available. I want to switch to a separate topic because I think that you’ve indicated that you’re not willing at this point to make a commitment that you’ll fully disclose everything. Unless I’m mistaken, I’ll give you a chance to respond.
Ms. Antigone Davis: (35:04)
Thank you, Senator. We are looking for ways to release more research. There are privacy considerations that we need to take into account, but I think more importantly, we’re actually also looking for ways to give external researchers access to data so that they can do independent research as well.
Mr. Blumenthal: (35:21)
Well, I think that’s a very important point. You haven’t provided that access to researchers. You’ve refused to make it available to independent experts and researchers. I will ask you as well for a commitment to do so. I recognize you’re not going to answer this question here, but let me ask you separately. In your remarks you say, “We think it is important to help provide parents and guardians the information, resources, and tools they need.”
Mr. Blumenthal: (35:57)
I want to talk about one major source of concern for parents: Finstas. Finstas are fake Instagram accounts. Finstas are kids’ secret, second accounts. Finstas often are intended to avoid parents’ oversight. Facebook depends on teens for growth. Facebook knows that teens often are the most tech savvy in the household, and that they are a critical way for Facebook to acquire new, older users. But Facebook also knows that nearly every teen in the United States has an Instagram account.
Mr. Blumenthal: (36:46)
It can only add more users as fast as there are new 13-year-olds. So, what Facebook has done is Finstas. In multiple documents, Facebook describes these secret accounts as “a unique value proposition”. It’s a growth strategy, a way to boost its monthly active user metric. That active user metric is of great interest to your investors, to the markets, and it looks to me like it’s another case of prioritizing growth over children’s safety. So, Facebook claims it’s giving tools to parents to help their kids navigate social media and stay safe online.
Mr. Blumenthal: (37:43)
But behind the scenes, your marketers see teens with multiple accounts as a unique value proposition. Let me quote, “unique value proposition”. We all know that means Finstas. You’re monetizing kids deceiving their parents. You make money when kids deceive their parents. You make money from these secret accounts. You make money from heightening the metrics that have impressed the markets, your investors, raised the stock price. How can parents trust you?
Ms. Antigone Davis: (38:34)
Respectfully, Senator, that’s not how I would actually characterize the way we build our products. We build our products to provide the best experience for young people. Interestingly, when you mentioned Finstas, in my engagement with teens, Finstas are not something that we actually built; teens built them. They did it to provide themselves with a more private experience, which is one of the things that led us to think about offering them more privacy. So, that’s actually where Finstas come from: teens, not us.
Ms. Antigone Davis: (39:07)
But I think more importantly, our announcement that we are going to be providing supervisory tools that give parents actual insight into what their children are doing on Instagram is exactly contrary to what you’re suggesting.
Mr. Blumenthal: (39:22)
Well, with all due respect, these are private. Yes, they are secret. They are secreted from parents so that whatever the tools you may have, parents can’t apply them. They are part of the metrics. They’re measured, so that you can show growth. I’ll turn to the ranking member.
Mrs. Blackburn: (39:43)
Thank you, Mr. Chairman and Miss Davis, thank you for your testimony and for being with us today. We appreciate that, and I congratulate you on a perfectly curated background. It looks beautiful coming across the screen. I wish the message that you were giving us were equally as attractive. Let me go to Instagram CEO Adam Mosseri recently saying, and I used his interview in my opening statement, that children under 13 are not allowed on Instagram. Is that true? Yes or no.
Ms. Antigone Davis: (40:22)
13-year-olds and above are allowed on Instagram; under-13-year-olds are not.
Mrs. Blackburn: (40:27)
But we know that you are doing research on children as young as eight and are marketing to eight to 12-year olds. Correct?
Ms. Antigone Davis: (40:41)
We do not market to eight-to-12-year-olds because they’re not on Instagram; it’s 13-year-olds and above. If we find an account of someone who’s under 13, we remove them. In fact, in the last three months we removed 600,000 accounts of under-13-year-olds.
Mrs. Blackburn: (40:56)
Okay. So, talk to me about how you enforce the policy that a child under 13 cannot be on Instagram.
Ms. Antigone Davis: (41:07)
Yeah, actually, I appreciate that question. So, there are a number of different things that we do. We have an age screen when someone tries to join Instagram. If we see someone trying to repeatedly change the date to get past that, we actually will restrict their ability to access the app. We also allow people to report underage accounts, even if you’re not on Facebook and we will remove them. We’re investing and using AI and other signals to remove underage accounts.
Mrs. Blackburn: (41:40)
Not to interrupt, but I’ve got five minutes. Then, talk to me about what the MAP is, because I know your research, your research, shows that you’ve looked into using the MAP for kids under 13. So, why don’t you explain that to us?
Ms. Antigone Davis: (42:00)
Map is just a measure of how many people are using the site in a month. It’s monthly active people.
Mrs. Blackburn: (42:08)
Okay. But you were going to apply that to children under 13. So, therefore, you were trying to quantify the number of children that were under 13 years of age that were using your site, correct?
Ms. Antigone Davis: (42:22)
Respectfully, that doesn’t sound accurate to me. In fact, what we are trying to do is to remove under-13s from our site.
Mrs. Blackburn: (42:26)
Okay. Well, then let’s have you clarify that for the record, because your research shows that you were using the MAP on children under 13. I want to move on and talk to you about the information I’ve seen about the presence of content on Facebook and Instagram that is used to recruit women into domestic servitude. This is a kind of trafficking where people are forced to work against their will for little or no pay. Their passports are often taken away from them. They can be auctioned online and abused.
Mrs. Blackburn: (43:08)
I have a poster that is behind me, I hope that you can see this. I’ve seen information suggesting that Facebook knew this content was on its website but did nothing to delete it until Apple threatened to drop Facebook from the Apple app store. To quote from a Facebook internal report, and I quote, “Was this issue known to Facebook before the BBC inquiry and Apple escalation? Yes.” But quoting again, “Due to the underreporting of this information and absence of proactive detection, domestic servitude content remained on the platform.
Mrs. Blackburn: (43:56)
Removing our applications from Apple platforms would have had potentially severe consequences to the business.” Miss Davis, did Facebook know about content on its platform used to recruit women into forced slavery, and why did you not remove it until Apple threatened to drop Facebook from the app store?
Ms. Antigone Davis: (44:27)
Respectfully, Senator, I don’t agree with that characterization of what occurred. In fact, we have policies against sex trafficking on our platform.
Mrs. Blackburn: (44:36)
This is your reporting. Miss Davis, this is your company’s reporting. You knew this was there. You knew it was there, but you didn’t do anything about it. Is it still there? Are you still allowing sex trafficking on Facebook? Is this something that girls, as young as eight, who are on your site are exposed to? Let’s get a little bit more definition around this. One more question for you. One of the Wall Street Journal articles that came out Monday shared Facebook research about the product segments it would like to target in the future.
Mrs. Blackburn: (45:27)
It shows younger and younger kids. This is your poster. This is your graphic. I put it on a poster, where we’ve been and where we’re going. In fact, documents I saw showed Facebook doing market research on eight-year olds. I’m quoting from you all now, “Tweens and younger teens are very similar in digital behaviors. Even kids as young as eight are interested in similar digital experiences.” The documents show survey results into the digital interests of eight to 10-year olds. So, with this categorization in mind, does Facebook conduct market research on tweens? Yes or no.
Ms. Antigone Davis: (46:23)
Thank you, Senator. I’d first like to actually clarify that that document that you have up behind you, that document is actually from an age appropriate design code. Something that Senator Markey and others have actually given to tech companies as a way for us to think about how we design for different ages. It is actually a direction on policy not a direction on our product development.
Mrs. Blackburn: (46:46)
So, you are admitting to me that you’re designing for eight to 12-year olds? I think that that is something that is very interesting because you know that’s a violation of the Children’s Online Privacy Protection Act. I guess what you’re telling us then is that you also are doing market research on children and that you are continuing to collect data on children, as you try to figure out what type of digital experience children, children ages eight to 12, are interested in having. I’m over time, Mr. Chairman. I will yield back.
Mr. Blumenthal: (47:33)
Thanks, Senator Blackburn. Senator Klobuchar.
Senator Amy Klobuchar: (47:36)
Thank you very much. Miss Davis, we now know that Facebook’s own research found that Instagram worsens body image issues for one in three teenage girls. Were you aware of those internal findings before the Wall Street Journal articles came out?
Ms. Antigone Davis: (47:54)
Senator Klobuchar, I’d just like to correctly characterize those findings. What those findings are are actually of teen girls who already expressed having that issue. Mind you, even one is too many.
Senator Amy Klobuchar: (48:08)
Okay. I have five minutes and I appreciate that. We’ll put that on the record, but were you aware of the internal findings before the Wall Street Journal articles came out?
Ms. Antigone Davis: (48:21)
Thank you, Senator Klobuchar. I and my team work on a weekly, maybe even times a [crosstalk 00:48:25]-
Senator Amy Klobuchar: (48:25)
Could you just answer it? I’m sorry. I was actually asking a polite question. Were you aware, could you answer yes or no?
Ms. Antigone Davis: (48:33)
Yes, I was.
Senator Amy Klobuchar: (48:35)
What specific steps did you then take in response to your own research and when?
Ms. Antigone Davis: (48:43)
Senator Klobuchar, I don’t know that I’ll be able to give you exact dates, but what I can tell you is that this research has fueled numerous product changes. So, for example, in the context of eating disorders, we now have a dedicated reporting flow for eating disorder content. We also pop up resources for individuals if they try to search for this content. That’s just two of the numerous changes.
Senator Amy Klobuchar: (49:05)
Okay. What I’ll do is in writing ask the questions so we can find out the dates from when the research came out and what you did. You were creating, Facebook was creating a version of Instagram that targeted kids under 13. You announced this week that you are pausing that program. What specific criteria will you use to determine whether to unpause the plan and who will make that decision?
Ms. Antigone Davis: (49:32)
Thank you, Senator Klobuchar. I think what we intend to do at this point in time is to step back to talk with more parents, to engage with more policymakers like yourself, to engage with more experts. What I do know is that parents are, eight out of 10 parents, in fact, for kids under the age of 13, are allowing their children onto sites, between the ages of eight and 12. What we really want to do is ensure that they have-
Senator Amy Klobuchar: (50:00)
Okay. But I asked who’s going to make the decision. I so appreciate, if you were answering the question I would let you go ahead, but I was asking who will make the decision about whether to unpause the work on developing that program?
Ms. Antigone Davis: (50:15)
Well, certainly, it would be a collaborative team within the company, but it will be done with the guidance and expertise of our youth advisors, hearing from parents, hearing from policymakers like yourselves and [crosstalk 00:50:28].
Senator Amy Klobuchar: (50:28)
Okay. All right. I know that’s guidance, but I was asking the identity of the person who will make the decision. That’s all. I’ll do that in writing [crosstalk 00:50:35].
Ms. Antigone Davis: (50:35)
I don’t have a single person. I’m sorry, Senator Klobuchar.
Senator Amy Klobuchar: (50:37)
Okay. Last quarter, Facebook publicly reported that its advertising revenue per user in the US and Canada, this is for a quarter, was $51 per quarter. That didn’t even compare with any other industrialized nation or any other country. They’re making so much money off of American users. I asked your colleague Steve Satterfield about that last week at a hearing in my Judiciary antitrust subcommittee, the hearing we had on big data. In his response he said he wasn’t entirely sure whether the data included Instagram revenue. Does it include Instagram revenue, and specifically, does it include revenue from kids under 18?
Ms. Antigone Davis: (51:22)
That is not something I work on, but it’s not how we build products, particularly in relation to young people. We actually have always limited ads for young people. More recently, based on guidance from experts, we’ve reduced it so that we don’t target young people other than on issues [crosstalk 00:51:44].
Senator Amy Klobuchar: (51:43)
Okay. Again, I appreciate that. We’re good at filibustering in the Senate too, but I am really concerned about the answer, because I think it’s specific. Again, I will do this in writing. I will publish the answers, but I’m just asking a fact. You guys published these quarterly revenues. We have them on different countries, right? How much money you make. We got that information. So, I’m trying to figure out if it includes Instagram. I’m trying to figure out if it includes kids, which I assume it does, and I will keep pursuing it another way.
Senator Amy Klobuchar: (52:19)
When you estimate the lifetime value of a user, you must do that because I know your profit model and how it works now after years of taking on this monopoly dominant platform issue. What do you estimate the lifetime value of a user is for kids who start using Facebook products before age 13?
Ms. Antigone Davis: (52:41)
Respectfully, Senator, that’s not how we think about building products for young people. We actually are quite focused on ensuring that parents have the kinds of supervisory tools that they need. It’s just not the way we think about it. It’s certainly not the way my team thinks about it.
Senator Amy Klobuchar: (52:59)
Miss Davis, that may be true about your team, but are you saying that Facebook, in developing products, has never considered, and you are under oath, has never considered the profit value of developing products when they make their decisions of how those products look?
Ms. Antigone Davis: (53:17)
Respectfully, Senator, we are a business. I’m fully aware of that. But what we are thinking about is how do we provide the best experience. If we have a very shortsighted version without focusing on providing a better experience for people or a good experience, that’s just a terrible business model.
Senator Amy Klobuchar: (53:37)
Well, we’ll follow up in writing. I am out of time. I’ll try to come back if there’s a second round. Thank you very much.
Mr. Blumenthal: (53:44)
Thanks, Senator Klobuchar. I’m hopeful we will have a second round. I don’t know whether Senator Thune is available. If not, Senator Moran or Senator Lee. I will turn to Senator Markey in the absence of a Republican Senator wishing to ask questions. I am going to vote. So, you’re in charge, Senator Markey.
Senator Ed Markey: (54:17)
Okay. Thank you, Mr. Chairman, very much. We will recognize Republican members as they arrive. In April, Senator Blumenthal and I wrote to CEO Mark Zuckerberg ringing the alarm about Facebook’s plan to launch a version of Instagram for kids 12 and under. I’m pleased that Facebook responded to our concerns and is backing down, at least temporarily, from its plans. But a pause is insufficient. Let’s be clear, the problem isn’t that Instagram hasn’t developed a safe product for kids. The problem is Instagram itself.
Senator Ed Markey: (55:05)
According to Facebook’s own research, teen users consistently blame Instagram for increases in their anxiety and depression. In fact, 32% of teen girls said when they felt bad about their bodies, Instagram made them feel worse, and 6% of American teen users trace their desire to kill themselves to Instagram. For teens, Instagram is worse than a popularity contest in a high school cafeteria, because everyone can immediately see who’s the most popular or who’s the least popular. Instagram is that first childhood cigarette meant to get teens hooked early.
Senator Ed Markey: (55:57)
Exploiting the peer pressure of popularity and ultimately endangering their health. Facebook is just like big tobacco, pushing a product that they know is harmful to the health of young people, pushing it to them early, all so Facebook can make money. IG stands for Instagram, but it also stands for Instagreed. The last thing we should allow Facebook to do is push young kids to use Instagram. Miss Davis, will you commit that Facebook will not launch any platforms targeting kids 12 and under that includes features such as like buttons and follower counts that allow children to quantify popularity? Yes or no?
Ms. Antigone Davis: (56:54)
Senator Markey, I’d like to actually take a second to disagree with your comparison. Our products actually add value and offer … enrich teens’ lives. They enable them to connect with their friends, with their family, and actually during COVID, during the pandemic-
Senator Ed Markey: (57:10)
I appreciate that. I appreciate that. Senators just have limited time in the question and answer period. I have a question to you. Will you stop launching? Will you promise not to launch a site that includes features such as like buttons and follower counts that allow children to quantify popularity? That’s a yes or a no.
Ms. Antigone Davis: (57:34)
Senator Markey, those are the kinds of features that we will be talking about with our experts trying to understand, in fact, what is most age appropriate and what isn’t age appropriate. We will discuss those features with them, of course.
Senator Ed Markey: (57:47)
Well, let me just say this, we’re talking about 12-year olds. We’re talking about nine-year olds. If you need to do more research on this, you should fire all the people who you’ve paid to do your research up until now, because this is pretty obvious and it’s pretty obvious to every mother and father in our country.
Senator Ed Markey: (58:02)
Because all recent scientific studies by child development experts found that not getting enough likes on social media significantly reduces adolescents’ feelings of self-worth. Here’s another threat to young people on Instagram. The app is full of images and videos of popular influencers who peddle products while they flaunt their lavish lifestyles to users. Ms. Davis, will you commit that Facebook will not launch any platforms targeting kids that host influencer marketing, commercial content that children may be incapable of identifying as advertisements, yes or no?
Ms. Antigone Davis: (58:48)
Senator, that’s actually one of the questions that we will be working on with our experts as well. I do think it’s important to point out that our messaging app for young kids under 13 doesn’t show ads at all, and that was based on the feedback that we got from parents and from our experts.
Senator Ed Markey: (59:08)
I will just say, it’s not acceptable that you don’t have answers for these questions right now. These are the obvious problems that exist. In television, we don’t allow the host of a program to hawk a product to a child. It’s illegal. And I’m the author of those laws, so I know it’s illegal. And the same thing is true here. Why Facebook just can’t say flat out, no, we won’t allow influencers to be trying to push a child towards buying something because that child has now seen a video, is just, again, completely and totally unacceptable, because we know that children lack the cognitive ability to decipher whether something is an advertisement.
Senator Ed Markey: (59:56)
And influencer marketing is inherently manipulative to kids. The same thing was true on television. It’s true over here. We have to move the same values from television over to the internet, or else the same exploitative policies will be adopted by marketers. Research also finds that your algorithms send teen users into a spiral of harmful content, including misinformation about COVID and ads for diet pills and appetite suppressants. Ms. Davis, will you commit that Facebook will not launch any platforms targeting children and employ algorithms promoting this dangerous content?
Ms. Antigone Davis: (01:00:44)
Thank you, Senator Markey. We actually don’t allow weight loss ads to be shown to people under the age of 18 already.
Senator Ed Markey: (01:00:52)
Okay. Well then, that’s reassuring, because that content shouldn’t exist anywhere on your platform. Your platforms, however, from my perspective are actively promoting these materials, and we can’t let that happen to kids. So you seem to disagree with whether or not you are doing that, but my research says that you are. So that’s also something that I think we should just codify. If Facebook has taught us anything, it’s that self-regulation is not an option. We need rules, rules that are federally mandated and that have to be adhered to by companies. And that’s why today I am re-introducing the Kids Internet Design and Safety Act, the Kids Act, partnering with Senator Blumenthal, who I thank for working with me on this bill. Our legislation bans damaging website design features like follower counts, autoplay and push alerts that are harmful to kids, limits advertising and commercial content like product placement and influencer marketing to kids, and prohibits amplification of harmful and violent content to kids. Ms. Davis, do you agree that Congress needs to pass this legislation and enact these critical safeguards for children online, yes or no?
Ms. Antigone Davis: (01:02:28)
Senator Markey, I think our company has made its position really well-known that we believe it’s time for the update of internet regulations. And we’d be happy to talk to you and work with you on that.
Senator Ed Markey: (01:02:40)
Okay. Well, do you support this legislation?
Ms. Antigone Davis: (01:02:46)
We’ll be happy to follow up most certainly.
Senator Ed Markey: (01:02:48)
Well, your company has had this legislation in your possession for months, and you’re testifying here today before the committee that would have to pass this legislation. And again, I just feel that delay and obfuscation is the legislative strategy of Facebook, especially since Facebook has spent millions of dollars on a marketing campaign calling on Congress to pass internet regulations. And Facebook purports to be committed to children’s wellbeing, so it’s simply wrong that you will not support this legislation to enact protections for kids online. That’s the only conclusion I can reach since you’ve had it in preparation for this hearing for a long period of time.
Senator Ed Markey: (01:03:32)
So we know that Facebook’s top priority is its bottom line. Congress has to step in. We have an obligation to enact a bold agenda for young people online, and that means passing the Kids Act to take on big tech’s damaging and coercive tactics to hook kids. Two, updating the Children’s Online Privacy Protection Act to finally give young people up to the age of 16 a privacy bill of rights for the 21st Century. And passing my CAMRA Act to launch a major research project at the National Institutes of Health on the effects of tech on children. It’s time for us to do this. We cannot wait. This is a crisis and we must act. Let me now turn and recognize Senator Thune.
Senator Thune: (01:04:20)
Thank you, Mr. Chair. I, along with many of my colleagues, am deeply concerned about the lack of consumer transparency and limited accountability of big tech companies. Consumers have become increasingly troubled about the way that their information is used by social media platforms and how these sites decide what news and information we see. Because of the secrecy with which platforms protect their algorithms and content moderation practices, which largely have been and continue to be a black box, consumers have little or no idea how the information they see has been shaped by the sites that they’re visiting. I’ve [inaudible 01:04:56] two bipartisan bills to address these issues in platform accountability and consumer transparency, the PACT Act and the Filter Bubble Transparency Act. The PACT Act would increase transparency around the content moderation process and provide consumers more due process when a platform like Facebook removes a post. And the Filter Bubble Transparency Act would give consumers the option to engage with internet platforms without being manipulated by opaque algorithms.
Senator Thune: (01:05:24)
And I’d like to, Ms. Davis, just very briefly discuss those with you today. The Wall Street Journal recently revealed that Facebook overhauled its algorithm in 2018 in an effort to boost quote, meaningful social interactions, or MSI, meant to strengthen bonds between friends and family. Instead, the overhauled algorithm rewarded outrage and sensationalism, making Facebook’s platform an angrier place. Mr. Zuckerberg reportedly resisted proposed fixes because he was worried it would hurt Facebook’s objective to make users engage more with Facebook. Ms. Davis, should consumers be able to use Facebook and Instagram without being manipulated by algorithms designed to keep them engaged on the platform?
Ms. Antigone Davis: (01:06:11)
Respectfully, Senator, that’s not how we think about our newsfeed. In fact, our newsfeed is designed to connect people to people that they have a meaningful connection to. So friends, family, things that they’re interested in. That particular change actually reduced the amount of time that was spent on our platform by about 50 million hours a day. The goal there was really to promote that more meaningful connection between friends and family. That said, I do think we have instituted additional controls for people’s news feeds so people can actually have a newsfeed that’s based on chronological order as opposed to a ranking. And we have made numerous investments in transparency broadly. We have an annual transparency report. We actually submit to human rights impact assessments. We have an oversight board all because we too, like you, believe more transparency is important.
Senator Thune: (01:07:15)
The PACT Act, which I referenced earlier, is section 230 legislation I introduced with Senator Schatz. It would, among other things, require that large online platforms remove court-determined illegal content and activity within four days, or lose their section 230 liability protection. Do you believe that Facebook and other large internet platforms should remove content that has been found by a court to be illegal?
Ms. Antigone Davis: (01:07:40)
Certainly, Senator. We have policies against illegal activity on our platform and illegal content.
Senator Thune: (01:07:47)
There was a study published in the Proceedings of the National Academy of Sciences way back in 2014, that revealed that Facebook had conducted a massive experiment on 700,000 users on its platform that found, and I quote, emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness, end quote. Today, seven years later, we’re learning through media leaks that Facebook’s internal studies continue to show the emotional contagion its services can produce among its users, most recently with teen users on Instagram. What do you think should be done to make users and the public more aware of the emotional contagion that occurs on Facebook and Instagram? And what can be done to counter-program against emotional contagion on Facebook and Instagram?
Ms. Antigone Davis: (01:08:38)
Thank you, Senator. I really appreciate that very thoughtful question. In fact, the research that we did wasn’t exactly about emotional contagion. Nonetheless, the recent research really identified areas where we could actually improve our products. So for example, we saw that young people indicated that when they saw uplifting content or inspiring content, that could move them away from some other issues that they were struggling with. And so one of the things that we’re looking at is something called nudges, where we would actually nudge somebody who we saw maybe potentially rabbit-holing down content toward more uplifting or inspiring content to break that, what you’re referring to as sort of contagion.
Senator Thune: (01:09:26)
Mr. Chairman, my time’s expired, but let me just close by saying that I think it’s time for us to look at some of these reforms. I’ve got a couple of bills, as I mentioned. And I just think that users need to know, they need more transparency. These algorithms are opaque. And I think in many cases, at least, users ought to have an option to be able to see content that hasn’t been moderated by the platform. So I hope that we can make some headway on that and I hope we do it soon. Thank you.
Mr. Blumenthal: (01:09:58)
Thanks, Senator Thune. I call on Senator Lujan.
Senator Lujan: (01:10:03)
Thank you so much, Mr. Chairman. Ms. Davis, we’ve heard from you today and from others that Facebook contests The Wall Street Journal’s reporting on its internal research. Rather than argue details, I have a simple question. Yes or no, does Facebook have internal research indicating that Instagram harms teens, particularly harming perceptions of body image, which disproportionately affects girls?
Ms. Antigone Davis: (01:10:36)
Senator, we have released the two studies in relation to this. What our research showed was that for people who are struggling with these issues, that actually more of them found their engagement on Instagram helpful than harmful. And in fact, of the 12 issues that we looked at, 11 of those were the case for young girls, and 12 of 12 for teen boys.
Senator Lujan: (01:11:03)
So one of the challenges that I’m facing here, Ms. Davis, is that there’s two sides to this story. The problem is Facebook is telling both sides. You’re saying your own internal research is misleading and then taken out of context. So please help us get to the bottom of this. Yes or no, will Facebook release the basis of the research, the data set minus any personally identifiable information to allow for independent analysis?
Ms. Antigone Davis: (01:11:36)
Senator, we’ve already released two of the primary pieces of research. We are actually looking to release additional research and to create greater transparency. We’re also quite invested in giving external researchers an opportunity to access data in a way that’s privacy protective. In addition, we fund external independent research through grants.
Senator Lujan: (01:12:02)
Ms. Davis, I apologize. I don’t have a lot of time. If you could please give me a yes or no to this question, and it’s either yes or it’s no. Will Facebook release the basis of the research, the data set minus any personally identifiable information, to allow for independent analysis?
Ms. Antigone Davis: (01:12:21)
We’re looking to release more of that research.
Senator Lujan: (01:12:24)
That sounds like a yes. Am I incorrect?
Ms. Antigone Davis: (01:12:29)
[inaudible 01:12:29] … we have privacy obligations, but we’re looking to provide greater transparency. Yes, sir.
Senator Lujan: (01:12:34)
That’s why I’m trying to be clear here, Ms. Davis. I’m asking for you to release the data minus any personally identifiable information. And if I’m incorrect with your answer being interpreted as yes, please correct me.
Ms. Antigone Davis: (01:12:48)
Respectfully, Senator, I want to be really clear-
Senator Lujan: (01:12:51)
So that sounds like a no. [crosstalk 01:12:54] … I’m just, yes or no. And if the answer is not yes, then it’s a no. On April 11th, 2018, I asked Mr. Zuckerberg if Facebook creates shadow profiles for non-users that utilize the site without logging on or officially creating an account. Despite ongoing and public reporting on this issue, in response, Mr. Zuckerberg claimed that he had never heard the term quote shadow profile. Ms. Davis, now in the context of today’s discussion, I’ll ask a slightly different question. Yes or no, does Facebook or Instagram collect personally identifiable information specific to individual children under the age of 13 without the consent of those children’s parents or guardians?
Ms. Antigone Davis: (01:13:40)
Senator, children under the age of 13 are not allowed on Instagram or Facebook.
Senator Lujan: (01:13:47)
Does Facebook or Instagram collect personally identifiable information specific to individual children under the age of 13? Is your answer no?
Ms. Antigone Davis: (01:14:01)
Respectfully, Senator, we do not allow children under the age of 13 onto our app-
Senator Lujan: (01:14:04)
That’s not the question that I’m asking. The question I’m asking in the same way that I asked Mr. Zuckerberg on April 11th about the collection of this information, does Facebook or Instagram collect personally identifiable information specific to individual children under the age of 13 without the consent of those parents or guardians? If the answer is no, that’s sufficient.
Ms. Antigone Davis: (01:14:28)
Senator, it would be my understanding that we don’t, since we don’t allow them on our apps.
Senator Lujan: (01:14:34)
I appreciate that. I understand that the algorithms underpinning content moderation and recommendations on Facebook and Instagram change on a regular basis. Ms. Davis, yes or no, in preparation of changes to existing algorithms, has Facebook ever first tested potential impacts of those changes before they are rolled out broadly?
Ms. Antigone Davis: (01:14:57)
Senator, this is not my area of expertise. I know that we do do testing to understand the impact of changes, but I can’t speak specifically to this one.
Senator Lujan: (01:15:08)
Publicly Facebook has said they do do that. Yes or no, has Facebook ever tested whether a change in its platform will later increase growth in users or growth in revenue?
Ms. Antigone Davis: (01:15:21)
Sorry, Senator. Would you mind repeating the question? Your voice sped up weirdly.
Senator Lujan: (01:15:24)
Has Facebook ever tested whether a change to its platform would later increase growth in users or growth in revenue?
Ms. Antigone Davis: (01:15:37)
Respectfully, Senator, this is not my particular area of expertise. I can certainly take the question back to the team, but I’m sure that we think about business issues of those kind.
Senator Lujan: (01:15:48)
Facebook has said publicly they do. Yes or no, has Facebook ever tested whether a change to its platform increases an individual or a group of users’ propensity to post violent or hateful language?
Ms. Antigone Davis: (01:16:05)
Again, Senator, this is not my area of expertise, but I’d be happy to take your questions back to the right team and get you answers.
Senator Lujan: (01:16:12)
And Mr. Chairman, I think with that last question, we might get more responses to that one next week. Yes or no, has Facebook ever tested whether a change to its platform makes an individual or a group of users more likely to consider self-harm?
Ms. Antigone Davis: (01:16:29)
Actually, the research that has been released and has been reported on looks at whether a young person thinks that their first thoughts of suicide occurred on our platform. And while the numbers there show about 0.5%, about a half a percent do, that’s one too many. As someone who had a brother who died by suicide, as well as a very close college friend, if there’s one person on our platform who attributes their suicidal ideation to our platform, that’s one too many and we care deeply about it. And we’ve built product changes to address that. So we have a suicide prevention reporting flow, where you can actually connect with a crisis counselor right from that reporting flow. Family members who report something can actually connect with the person immediately, because our experts have told us that when they connect with that person, that’s one of the best ways to prevent suicide. We take this issue very seriously, and we’re the industry leader when it comes to addressing suicide content on our platform.
Senator Lujan: (01:17:40)
And my final question, Mr. Chairman, yes or no, has Facebook ever found a change to its platform would potentially inflict harm on users, but move forward because the change would also grow users or increase revenue?
Ms. Antigone Davis: (01:17:56)
It’s not been my experience at all at Facebook. We care deeply about the safety and security of the people on our platform. We’ve invested $12 billion in it. We have thousands and thousands of people working on this issue. That’s just not how we would approach it.
Senator Lujan: (01:18:11)
Now, Mr. Chairman, I hope that the answer to the very first question that I asked will be a profound yes. The one area where Facebook can make structural changes here is by simply making research public by default, to allow real, independent oversight. And I look forward to that information being given to the committee. If not, I look forward to requesting it formally. Thank you, Mr. Chairman.
Mr. Blumenthal: (01:18:38)
Thanks, Senator Lujan. And I think you’re absolutely right in that regard. Let me just ask. Ms. Davis, you’ve refused to commit that these research and findings will be made public. Who will make that decision at Facebook?
Ms. Antigone Davis: (01:18:53)
I don’t know that there’s any, Senator, there’s any one person who will make that decision. I do know that there are many people looking at how we can provide greater transparency.
Mr. Blumenthal: (01:19:05)
Well, let me just ask you. Isn’t it a fact that Mark Zuckerberg is the one who will make that decision?
Ms. Antigone Davis: (01:19:11)
Respectfully, Senator, this is a kind of decision that would involve many people in the company. We need to look at our privacy obligations. And we are looking to provide more transparency.
Mr. Blumenthal: (01:19:23)
Well, with all due respect to you, the word transparency is easy to use, it’s hard to do. And so far there is nothing that you have said to indicate that disclosure of these findings, conclusions, recommendations, facts known to Facebook about the harmful effects of its products will be made available, or that a decision will be made by any specific time or by any particular individual. Can you tell us more?
Ms. Antigone Davis: (01:20:02)
Respectfully, Senator, I think that our commitment to transparency in the last few years should be a very good indication of our commitment. We’ve launched a transparency report regularly. We have set up an oversight board. We have human rights impact assessments. We’re doing a tremendous amount to ensure transparency around our platform. And we’re looking for ways to give independent researchers access to data so that they can do independent studies as well.
Mr. Blumenthal: (01:20:35)
That is perhaps one of the most discouraging parts of your testimony, that you’re relying on your past record of transparency for what you will do in the future. The fact of the matter is there are thousands of documents that we have only because a whistleblower has come forward. Documents that show your own findings. That is directly the opposite of transparency, Ms. Davis. I realize that you are testifying here about the efforts of Facebook to counter those documents, but the only way to counter facts is with real transparency. Let me ask you while we’re waiting for other senators to arrive. I know that some are on their way. For years, Instagram did nothing about eating disorders. It began to take some small steps, only when a 14-year-old girl, her name is Molly Russell, took her own life. She was getting trapped in that perfect storm that Facebook researchers described. Your own researchers called it a perfect storm.
Mr. Blumenthal: (01:21:59)
Our research has shown that right now in real time, Instagram’s recommendations will still latch on to a person’s insecurities, a young woman’s vulnerabilities about their bodies, and drag them into dark places that glorify eating disorders and self-harm. That’s what Instagram does. In fact, according to documents provided to me as recently as April of 2021, that’s this year, a Facebook engineer raised concerns that quote, no one has decided to dial into eating disorders, end quote. They documented the problems we have verified. So you knew. You knew. How long should it take to fix these problems? What are you going to do to address what we have found just within the past week or so?
Ms. Antigone Davis: (01:23:13)
Senator, we’ve been working with suicide prevention experts since 2006. We also work with eating disorder experts. We don’t allow the promotion of either kind of content on our platform. We do allow individuals to talk about their journeys to recovery, because our experts have told us that that’s really important and helpful to them. We have a dedicated reporting flow when it comes to eating disorder content, and we actually offer resources of support. That’s all work that’s been generated out of both this research and working with our experts. And that will be ongoing [crosstalk 01:23:52]-
Mr. Blumenthal: (01:23:55)
Ms. Davis, because our time is limited. In your answer in response to my question, what are you going to do to fix the problem? You’re essentially saying there’s no problem. Is that right?
Ms. Antigone Davis: (01:24:09)
Respectfully, Senator, no. In fact, that’s not what I’m saying. As long as there’s one person dealing with the issue on our platform, we consider it a problem. And actually, there are additional product changes that we’re looking at. So for example, I think I mentioned earlier that we’re looking at nudges towards uplifting content, one of the things that teens themselves have identified as helpful to them when they’re dealing with certain issues that they’re struggling with, like eating disorders. We’re also looking at something called Take a Break, where we would encourage somebody to take a break when we think they may be rabbit holing down certain kinds of content or on the app too long.
Mr. Blumenthal: (01:24:44)
So you’re not committing to any specific steps by any specific time, but you do acknowledge there is a problem with eating disorders, with suicidal tendencies that may be fostered or promoted?
Ms. Antigone Davis: (01:25:04)
Sure. Certainly Senator, I think we actually have issues in relation to teens and suicide and eating disorders within our society. And to the extent that those things play out on our platform, we take them extraordinarily seriously. And while you mentioned a time commit, I can’t give you a time commit. But I can tell you that we’re working on it, and I can tell you that in addition to all the things that we already do, we’d be happy to follow up with you and share with you our progress in that direction. We take the issues very seriously.
Mr. Blumenthal: (01:25:35)
I know you take it seriously, at least that’s what you’re telling us. But all you’re doing is looking at these possible steps. And with all due respect, these steps are baby steps, not even baby steps, in the direction of trying to improve Instagram and meet the very serious problems that have been disclosed. Let me come right to the point. Instagram for kids has been paused. How long will it be paused?
Ms. Antigone Davis: (01:26:17)
I don’t have a specific date, but I do have a commitment from all of us at Facebook that we will be speaking to parents, we’ll be talking to policymakers like yourselves, we’ll be talking to experts. We want to get this right. We also know that young people are online under the age of 12 on apps that aren’t designed for them, that we want to get their parents the supervisory tools and insights that they need so that they can manage the amount of time that their child is spending, so they can determine what their child should be seeing or should not be seeing. Actually, fundamentally to allow them to parent their children.
Mr. Blumenthal: (01:26:54)
Who will make the decision about how long Instagram for kids is paused? Mark Zuckerberg, right?
Ms. Antigone Davis: (01:27:03)
There’s no one-
Mr. Chairman: (01:27:03)
… Mark Zuckerberg, right?
Ms. Antigone Davis: (01:27:03)
There’s no one person who makes a decision like that. We think about that collaboratively. But quite honestly, we’ll be working with experts to understand and get to feel that people are in a comfortable place before doing so.
Mrs. Blackburn: (01:27:18)
Thank you, Mr. Chairman. Ms. Davis, let’s go back to this issue, all of the data that you all are collecting on kids through your program where you’re tracking them, you’re doing the digital experience surveys. What do you do with that data? And how long do you keep it? And do you have the parents’ permission to do that research?
Ms. Antigone Davis: (01:27:50)
Whenever we do research, we use the most stringent privacy protections. And whenever we do research with minors, we certainly get parental consent.
Mrs. Blackburn: (01:28:00)
You get parental consent. Why don’t you submit to us for the record a screenshot of what you use as a parental consent form? Will you do that?
Ms. Antigone Davis: (01:28:15)
Senator, I’d be happy to take your request back to the teams that do the research.
Mrs. Blackburn: (01:28:20)
No, we want a copy of the form. If you get parental consent, there has to be some kind of form that’s signed, even if it’s a digital signature. So why don’t you submit that to the record? Will you submit the form for the record?
Ms. Antigone Davis: (01:28:39)
Senator, I will go back to the teams and bring that request to them.
Mrs. Blackburn: (01:28:43)
Okay. The Wall Street Journal articles have had a big impact and have helped to bring some sunlight to your practices. And I’m sure that Mr. Zuckerberg was not pleased with this. And in some of the documents we’ve seen, that there’s a real lack of governance. It’s his way or the highway at Facebook. So how long have you worked at Facebook?
Ms. Antigone Davis: (01:29:15)
I’ve worked there for seven years.
Mrs. Blackburn: (01:29:17)
Seven years. Okay. And have you all deleted any documents since you learned about the whistleblower in The Wall Street Journal reporting?
Ms. Antigone Davis: (01:29:30)
Senator, we would not do anything in violation of any law. There are 60,000 employees. I would never suggest that I know what emails one of our 60,000 employees has deleted.
Mrs. Blackburn: (01:29:48)
Okay. Well, how are you restricting access to data internally? Have your policies changed since The Wall Street Journal articles?
Ms. Antigone Davis: (01:30:03)
Senator, not that I’m aware of, certainly.
Mrs. Blackburn: (01:30:08)
Okay. So you don’t know if there’s a parental consent form, even though you say you have people sign one if you’re going to do research on their children. I would be interested to see if it’s similar to a medical release form that parents have to sign. And you don’t know if you’ve changed any practices about data handled internally or if you’ve eliminated data. Okay. Let me ask you this. Will you commit that Facebook will not take revenge, retribution, or retaliation against the whistleblower?
Ms. Antigone Davis: (01:30:49)
Senator, we would never retaliate against someone for speaking to Congress. That’s just not who we are.
Mrs. Blackburn: (01:31:00)
Okay. But you’re not going to speak to the actions. I wasn’t asking about speaking to Congress. I was asking about the actions, but we’ll leave that where it is. Are you aware of Facebook enabling tracking on the [inaudible 01:31:19] Muslims in Xinjiang Province in China when they would download Messenger? Are you aware that they’ve put tracking and spyware in Messenger in China?
Ms. Antigone Davis: (01:31:38)
Senator, in fact, we did not put that tracking spyware. We found that tracking spyware. We removed it. We’ve briefed the Senate on it.
Mrs. Blackburn: (01:31:48)
Okay. Why did Senator Blumenthal’s office so easily access Instagram and set up this account for a 13-year-old, and then immediately begin to receive information about eating disorders and self-harm content? What kind of artificial intelligence are you using that would direct them that way?
Ms. Antigone Davis: (01:32:22)
Senator, we do not direct people towards content that promotes eating disorders. That actually violates our policies and we remove that content when we become aware of it. We actually use AI to find content like that and to remove it.
Mrs. Blackburn: (01:32:34)
Okay. So what you’re saying is the experience that Senator Blumenthal’s office had is an outlier or an anomaly. Is that correct?
Ms. Antigone Davis: (01:32:46)
Senator Blackburn, I haven’t seen the particular things. I would take a look, but I can tell you that it’s against our policies.
Mrs. Blackburn: (01:32:53)
I’m certain he can send you the digital copy of the poster that he had here. So thank you. And Mr. Chairman, I yield my time.
Mr. Chairman: (01:33:03)
Thank you, Senator Blackburn. And we can make available to you, Ms. Davis, all of the information about how easily and readily we put this profile of a 13-year-old young woman on and the reactions on eating disorders. I’m sure you already have the findings and evidence that supports our conclusions. Senator Lummis is on remotely.
Senator Lummis: (01:33:33)
Thank you, Mr. Chairman. Like many members of this committee, I’m alarmed by the revelations of The Wall Street Journal article, demonstrating the disturbing conclusions of Facebook’s own research, conclusions which merit repeating. Just under one in three teen girls reported that the Instagram app made them feel worse about their body image. Another significant portion of users also reported increased feelings of anxiety, suicidal thoughts, depression, and eating disorders as a result of the app’s use. Unfortunately, this research did nothing more than confirm many of our earlier intuitions and suspicions. Social media can be dangerous to your mental health. I look forward to more studies on the impact of social media on mental health. However, I’m concerned by the consistent lack of transparency from Facebook. The fact is that this committee would not be here without the brave whistleblower who stepped forward to shed light on this issue, an issue that many of us had previously sought answers to before and that we now seek answers to today.
Senator Lummis: (01:34:46)
We must remember that despite apps that purport to be free for us to use, there’s a very real cost, one that often comes with the price of our youth’s mental health. I was fortunate to grow up without the pressures of social media, but for the first time, today’s generation of children struggle with how to grow up managing a virtual version of themselves, all while the billion dollar industries compete for their time, information, money, and attention. So I firmly believe that more must be done. I recently signed on to a bill that would update the children’s online privacy protection act by placing strict restrictions on behavioral advertising directed at children. So my question for you, Ms. Davis, has Facebook conducted research into how children are more easily manipulated by highly personalized advertising?
Ms. Antigone Davis: (01:35:43)
Senator, I would not be familiar with that research. What I can tell you is that we have very limited advertising to young people. You can only actually now target a young person based on their gender, age, or location. For Messenger Kids, which is for those at or under 13, we actually don’t allow ads at all. And I think we would want to know how we can safely provide an experience for young people on our apps in relation to advertising, and that’s why we have our rules in place.
Senator Lummis: (01:36:18)
Has Facebook withheld any other relevant information relating to its services’ impact on mental health? Here’s why I ask. When asked during a congressional hearing in March of earlier this year about the impact of social media on children’s mental health, Mr. Zuckerberg responded, and I quote, “The research that we’ve seen is that using social apps to connect with other people can have positive mental health benefits.” That’s only one side of the coin. This answer clearly only told part of the story. These documents reveal Facebook knew that. How can Congress or Facebook users have confidence in the credibility and safety of Facebook moving forward? And is Facebook withholding information about studies they’ve done on negative mental health consequences?
Ms. Antigone Davis: (01:37:22)
Thank you for your question, Senator. Actually, I would say that the one-sided and misleading reports actually were in The Wall Street Journal, which didn’t provide the full context. In fact, the research showed that many more people, actually more teens, found Instagram use helpful when they were struggling with these particular issues. Our research is not bombshell research. It’s research that is [inaudible 01:37:50]. It’s similar to research out of Harvard, out of Pew, out of Berkeley. That doesn’t mean we don’t take it seriously. We do this research to improve our product, to make our products better for young people, to provide them with a positive experience. Right now, eight out of 10 young people tell us that they have a neutral or positive experience on our app. We want that to be 10 out of 10. If there’s someone struggling on our platforms, we want to build product changes to improve their experience and help support them.
Senator Lummis: (01:38:18)
So do you have information from the two out of 10 who have not had neutral or positive experiences, so that you know how to adapt the presentation of your product to consider the fact that some children seem harmed or negatively impacted by what they’re seeing?
Ms. Antigone Davis: (01:38:43)
Thank you, Senator. I really, really appreciate that thoughtful question. Actually, some of the research that we did was to actually find out from those teens what they thought could be particularly helpful to them. And one of the things that they identified is inspiring content or content that talks about people overcoming these particular issues, uplifting content. And so we are actually looking at some product changes to find ways to nudge that content to individuals who are struggling.
Senator Lummis: (01:39:09)
Thank you, Ms. Davis. Mr. Chairman, I yield back.
Mr. Chairman: (01:39:14)
Thank you, Senator. I’m going to do the second vote and yield to Senator Cruz for his questions, and I will be back shortly.
Senator Cruz: (01:39:27)
Thank you, Mr. Chairman. Ms. Davis, where are you right now?
Ms. Antigone Davis: (01:39:35)
Washington, DC, in a conference room.
Senator Cruz: (01:39:39)
So you’re in Washington, DC. Why aren’t you in this hearing room right now?
Ms. Antigone Davis: (01:39:47)
This is where I was told to come, there’s COVID protocols, for the safety and security of my family.
Senator Cruz: (01:39:55)
Facebook is in the process of hiding. Facebook is in the process of trying to avoid accountability. You’re not physically here even though you are blocks away from us. So you’re sitting in a conference room, but you don’t want to actually face senators and answer questions. Last week, a colleague of yours, I guess, didn’t have the instincts of hiding that you did. Mr. Satterfield actually came physically and was here for a hearing. And by the way, we have hearings every week, even with COVID. So it is witnesses that want to hide and avoid us that are not physically here and choose to do it over video as well. But your colleague, Mr. Satterfield, played the Sergeant Schultz game. You remember the old TV show, Hogan’s Heroes. His testimony was essentially, “I hear nothing. I see nothing.” And so when it came to Facebook’s research concerning the incredible harm that Instagram is inflicting on young girls, your colleague, Mr. Satterfield, said he didn’t know anything about it, that he didn’t cover those issues, he didn’t know anything about it. So I would assume, Ms. Davis, as global head of safety, you are familiar with these issues?
Senator Cruz: (01:41:10)
So you’re not going to plead ignorance as Mr. Satterfield did. Is that right?
Ms. Antigone Davis: (01:41:14)
To answer questions in my area of expertise, of course.
Senator Cruz: (01:41:21)
Okay. One of the things The Wall Street Journal reported was that Mark Zuckerberg was personally and directly aware of that research. Is that correct?
Ms. Antigone Davis: (01:41:33)
Senator, Mark pays attention to a lot of the impact research that we do, and I don’t know whether he was aware of the specific piece of research, but I know that he’s looking at the research, as we all are. I work with the research teams on a weekly basis, a daily basis, actually, in relation to the safety and security of the people on our platform.
Senator Cruz: (01:41:55)
All right. So you said you weren’t going to plead ignorance. Your very next answer is, I don’t know. It was reported Mark Zuckerberg was personally aware. Have you ever discussed this research with Mark Zuckerberg? Yes or no.
Ms. Antigone Davis: (01:42:07)
This particular research, I don’t remember discussing that with him. No.
Senator Cruz: (01:42:12)
Okay. A minute ago, you said that this research was, and I wrote it down because the phrase really jumped out at me, you said this is not bombshell research. I found that a pretty remarkable statement. The Wall Street Journal reported that your Facebook research concluded that 13% of British users and 6% of American users trace their desire to kill themselves to Instagram. Is that right? Is that a conclusion of your research?
Ms. Antigone Davis: (01:42:47)
Respectfully, Senator, actually what the research shows, if you look at it more carefully, is that about 0.5% of teens indicate a connection of suicidal ideation to their Instagram use, and these are just teens who have that ideation. That’s 0.5% too many, and we’ve invested incredibly heavily in suicide prevention on our platform. For example, we have reporting flows specifically dedicated just to suicide crisis response.
Senator Cruz: (01:43:16)
You just suggested a moment ago that I look at the research more carefully. How would you propose I do that? Have you released the research?
Ms. Antigone Davis: (01:43:23)
We’ve released two of the primary pieces of research that are part of that story, and we’re looking to release additional research.
Senator Cruz: (01:43:32)
So was The Wall Street Journal not telling the truth when it said, “13% of British users and 6% of American users trace their desire to kill themselves to Instagram.” That is from The Wall Street Journal. Was that true or false?
Ms. Antigone Davis: (01:43:48)
It’s a misunderstanding of the research, but I’d point you to the blog post that our vice president of research wrote that goes through the [inaudible 01:43:56]; it explains the research.
Senator Cruz: (01:43:58)
And has the full research been released or not?
Ms. Antigone Davis: (01:44:03)
Actually, Senator, we’ve released two of the specific studies and we’re looking to release more research-
Senator Cruz: (01:44:09)
So what is the research you haven’t released? What are you keeping secret? Because you’re telling us, if only you knew the full research, and then at the same time, you’re not releasing the research. So which is it?
Ms. Antigone Davis: (01:44:23)
I understand your question.
Senator Cruz: (01:44:25)
Do you want us to examine the full research or not?
Ms. Antigone Davis: (01:44:30)
We’ve released two of the primary sources, we’re trying to release others. I believe you have-
Senator Cruz: (01:44:34)
So you’ve cherry-picked the ones you want us to see. Have you released the research? I haven’t seen this research. So if you have released it, I will happily take a look at it. Have you released the research that The Wall Street Journal said concluded, and this is your own researcher concluding, that 13% of British users and 6% of American users trace their desire to kill themselves to Instagram? Have you released the entire underlying research behind that?
Ms. Antigone Davis: (01:45:00)
Respectfully, Senator, again, I disagree with the characterization of the research in The Wall Street Journal.
Senator Cruz: (01:45:08)
Have you released the research behind it, the entire research?
Ms. Antigone Davis: (01:45:13)
We’ve released the two studies, as I just said-
Senator Cruz: (01:45:15)
So you’ve cherry-picked part of the research that you think helps your spin right now. So let me ask you, if 6% of American users trace their desire to kill themselves to Instagram, you just said that’s not bombshell research. Tell me, what would be bombshell research? If 6% is not, what would be?
Ms. Antigone Davis: (01:45:36)
Respectfully, Senator, that is again a mischaracterization of the research. Maybe more importantly, what the research showed is that in those small instances, in that 0.5%, there’s actually an opportunity for us to help, that teens do find that we can be helpful in these instances-
Senator Cruz: (01:45:54)
So has Facebook changed your policies after you had a report that said teenagers using your product were significantly more likely to kill themselves? Did you change your policies in any regard to prevent that?
Ms. Antigone Davis: (01:46:09)
Respectfully, Senator, we have a set of suicide prevention experts that we work with on a regular basis, and we are constantly updating our policies in regard to-
Senator Cruz: (01:46:19)
Did you change your policies as a result of this research informing you that your products were making teenage girls significantly more likely to kill themselves?
Ms. Antigone Davis: (01:46:31)
We update our policies on an ongoing basis-
Senator Cruz: (01:46:33)
You’re not answering my question. Did you change your policies in response to this research? That’s a yes or no.
Ms. Antigone Davis: (01:46:40)
We change our policies based on expert guidance, not based on-
Senator Cruz: (01:46:44)
So you’re not going to answer that question. I’m just going to ask one final question, which is, your company conducted and paid for research that informed you that your products were making teenage girls more likely to kill themselves. I have a two-part question. Number one, have you quantified how many children have taken their own lives because of your products? And number two, as the global head of safety for Facebook, what would you say to a mother, what would you say to a father who lost a child because of Facebook’s products?
Ms. Antigone Davis: (01:47:29)
First of all, Senator Cruz, the research that you are referring to is, in fact, not causal research. So that’s important to understand. Second of all, as someone who’s had a brother who died by suicide as well as a close college friend who’s died by suicide, I would offer any family who’s ever lost a child, regardless of whether it has to do with Facebook or not, in relation to suicide, the utmost of empathy.
Senator Cruz: (01:47:53)
So you didn’t answer the question of whether you’ve made any effort to quantify how many children have taken their own lives because of Facebook’s products. Has the company made any effort to quantify, to put a number to it?
Ms. Antigone Davis: (01:48:07)
Causal research of that kind, Senator, requires an extraordinarily long period of time. In fact, we have made significant investments to understand-
Senator Cruz: (01:48:15)
Is that a no? Is that a no?
Ms. Antigone Davis: (01:48:20)
This is not causal research, Senator.
Senator Cruz: (01:48:22)
So it’s a no. You’ve done no research to determine how many children have taken their own lives because of Facebook’s products.
Ms. Antigone Davis: (01:48:29)
That’s not research that we could do easily. That’s a long-term set of research. It’s not, but-
Senator Cruz: (01:48:37)
I’m sorry it’s not easy. Let me suggest that when you have children taking their own lives, it’s worth doing. Your characterization that this is not bombshell research is inaccurate. And for the parents who are losing their children, it is a bombshell in their lives. And I understand Facebook needs to make a buck. And so if the research isn’t easy, apparently you guys aren’t doing it. But there is a reason people across the country are horrified by this behavior.
Mrs. Blackburn: (01:49:05)
Thank you, Mr. Cruz. Senator Lee, you’re recognized.
Senator Lee: (01:49:10)
Thank you, Madam Chair. Ms. Davis, I’ve long been concerned about the targeting of adult-themed ads to minors, because adult content or sexually suggestive content has unique psychological effects on minors. I think it should be addressed when we’re talking about teen mental health. My first question to you is, does Facebook, and by Facebook, here, I mean Facebook and Instagram, allow these businesses to target their advertisements to children using your platform, children who are between the ages of 13 and 17? I just need a yes or no answer on that. Do you allow businesses to target those kinds of advertisements to kids between 13 and 17?
Ms. Antigone Davis: (01:50:10)
Thank you, Senator. If you’ll allow me to explain how we do advertisements for our app, I think that would be helpful to answer your question.
Senator Lee: (01:50:19)
I’d like a yes or a no. If you need a sentence to add to that, that’s fine. But I would like a yes or no, and you can take a sentence or so to do it. But I’ve got a lot of content I want to cover.
Ms. Antigone Davis: (01:50:30)
So when we do ads to young people, there are only three things that an advertiser can target, around age, gender, and location. We also prohibit certain ads to young people, including weight loss ads. And one of the reasons that we’re so invested in looking at things like Instagram Youth is to try to create more age appropriate experiences for teen [inaudible 01:50:59] control.
Senator Lee: (01:50:59)
Okay. You do allow some businesses to target their advertisements to young children. I get that. I’d like a yes or no on this one also. I really just need you to work with me on this because we need to get through material. I’m not trying to play gotcha. I just need you to give me a yes or no. If you need a sentence to explain, that’s fine. Does Facebook collect data? Does either Facebook or Instagram collect data or assign interests in adult-related material to the profiles of children using your platform? By adult-related material, I mean things not limited to sexually suggestive content, but also things like, I don’t know, cigarettes, alcohol, or other things that would be considered more appropriately themed to adults.
Ms. Antigone Davis: (01:51:57)
Senator, thank you for your question. Let me answer certain parts of it as directly as I can here. We don’t allow tobacco ads at all. We don’t allow them to children either. We don’t allow alcohol ads to minors. We also have policies against some of the content, the kind of content that you’re referring to. So for example, we don’t allow adult [inaudible 01:52:22]. We don’t allow the sexualization of minors on our platform.
Senator Lee: (01:52:26)
You’re answering a different question than the one I asked, but we’ve got to move on because I’ve got limited time. Does Facebook and does Instagram allow businesses to target children on your platforms with advertisements that are sexually suggestive, sexually explicit, or that contain other adult themes or products?
Ms. Antigone Davis: (01:52:55)
Senator, I’d have to understand more what you mean, but we don’t allow young people to see certain types of content, and we have age gating around certain types of content. I’d have to see specifically what you’re talking about, and I’d be happy to follow up with you for sure.
Senator Lee: (01:53:13)
What’s the process then for determining what advertisements are age appropriate and permitted by Facebook to be targeted at children?
Ms. Antigone Davis: (01:53:22)
There are categories of ads that we don’t allow for young people. So I mentioned a few of them, tobacco, alcohol, weight loss products. I’d be happy to get you the full list.
Senator Lee: (01:53:36)
I would very much like to see that. I think that would be important. And I also hope that in our follow-up, you can also let us know what data you are collecting about the interests that your users in those age groups have. Now, I hear countless stories about how platforms, including Instagram, but by no means limited to Instagram, can facilitate child exploitation, as well as easy access to pornography. Each of these platforms has an app that’s available through the Apple App Store and the Google Play app store, which have an age rating to guide consumers to what’s considered age appropriate content. For example, on Apple’s App Store, Instagram and Facebook are rated for children 12 and up. And on Google Play, Instagram and Facebook are both rated T for Teen. Is that 12-plus rating from Apple and the Teen rating from Google the recommendation that Facebook made to Apple and Google for suggesting the app rating? In other words, did those age ratings come from Facebook? I can’t hear you. I think you’re on mute. I need you to unmute.
Ms. Antigone Davis: (01:55:24)
Sorry about that, Senator. This is not necessarily my area of expertise, but I’ll answer it to the best of my understanding. We don’t submit the age and say, “This is the age of our app.” We actually submit a form that we fill out and then an app rating is assigned to our app.
Senator Lee: (01:55:44)
Assigned by whom? By the app stores?
Ms. Antigone Davis: (01:55:46)
Again, this is not exactly my area of expertise, so I’m probably not the best person to answer you, but I am happy to get more information. I think it’s an interesting line of questioning and I don’t mean to not be able to answer for you.
Senator Lee: (01:56:00)
Okay. A lot of these-
Senator Lee: (01:56:03)
Okay. Well, a lot of these questions that I’m asking you relate to the allegations that we’re hearing about today, about problematic content, including content that is sexually explicit or suggestive, or in some cases adult themed, if not sexually explicit or suggestive. And in light of the fact that that content does exist, why is there such a disparity between the app’s rating on the one hand and the content that’s available on the platform on the other? And what are you doing to promote appropriate age ratings and transparency about the content that’s on your platforms? Taking into account that you’ve got a whole lot of teenage and child users and that not all of that content is appropriate for them.
Ms. Antigone Davis: (01:56:58)
Senator, I’m glad you’re asking this question. Because as a parent, this is one of the things that I thought about quite a bit in relation to raising my teenage daughter, in terms of access particularly to sexually suggestive content, as well as content that I thought could, across media and social media broadly, impact her sense of her own body image and wellbeing. And one of the things that we committed to when we paused Instagram Youth was actually giving parents supervisory tools in relation to their teen that’s on Instagram. In part for exactly what you’re talking about, which is to give them the ability to better manage their child’s experience, to have visibility into it, to actually potentially control portions and meaningfully control their child’s experience. And certainly to give them the visibility to set boundaries.
Senator Lee: (01:57:56)
Right. On the existing apps, the existing apps have an age rating. And so I really would like to know whether you recommended those age ratings and regardless of whether you recommended them, whether you think those are appropriate given the availability of content that’s not suitable for children?
Ms. Antigone Davis: (01:58:17)
We did not recommend those ratings. And we are very focused on building age appropriate experiences. It’s why we’re investing in AI and it’s why we’re looking at Instagram Youth. And it’s actually why we’re going to be launching these supervisory tools on Instagram. Because like you, we care very much that parents can help determine what their child should see and not see.
Senator Lee: (01:58:42)
Okay, I’m out of time, but I want to just leave you with a parting question. Are those ratings appropriate? Say Apple’s rating, 12 and up. Are they appropriate for Facebook or Instagram or any other platform like them? If there were a platform that, like Facebook and Instagram, does allow minors to access, and in some cases be targeted with, material that’s not appropriate for children, are those age ratings appropriate?
Ms. Antigone Davis: (01:59:25)
Senator, I would really love for you to invite Apple to answer those questions. And I’d love to hear from Apple on their child safety-
Senator Lee: (01:59:35)
Oh believe me, I’ve asked Apple this question many times, I’ve had many conversations with Apple and I will continue to have conversations with Apple. But I’m asking your opinion as a Facebook executive.
Ms. Antigone Davis: (01:59:47)
I don’t have visibility into the decisions that they make. But what we do have control over is building age appropriate experiences. And that’s what we’re trying to do. So we’re trying to actually develop experiences where parents have supervisory control. Where under the age of 13, we can actually ensure age appropriate content. We’re putting policies in place on our apps for users who are 13 and over to ensure that kids don’t have access to inappropriate content. This is all part of our ongoing work and our commitment to families.
Senator Lee: (02:00:21)
The term attractive nuisance, a term used in the common law, keeps coming to mind. I wish we could talk more about attractive nuisances, but my time has expired. Thank you, Mr. Chairman.
Mr. Chairman: (02:00:31)
Thanks, Senator Lee. And we will be inviting other tech companies to testify. And I hope that they will respond to the kinds of questions that you’ve raised here, Senator Lee. Just a few final questions. You were asked about possible retaliatory action. And you said, I think, that it’s not who we are, there would be no retaliation against the whistleblower. Will you commit there’ll be no legal action based on the disclosure of the whistleblower’s documents?
Ms. Antigone Davis: (02:01:12)
I’m aware that there are rules in terms of the Senate and we will comply with those rules.
Mr. Chairman: (02:01:21)
I’m asking you whether there will be any legal action based on the disclosure of the documents? Either from the whistleblower or anyone else publicly.
Ms. Antigone Davis: (02:01:32)
We’ve committed to not retaliating for them coming to the Senate.
Mr. Chairman: (02:01:37)
So that’s a yes, there will be no legal action based on the disclosure of documents, Facebook documents, that is a yes, correct?
Ms. Antigone Davis: (02:01:50)
Senator we’ve committed to not retaliating for this individual speaking to the Senate.
Mr. Chairman: (02:02:04)
Can you tell me Ms. Davis, following up on Senator Blackburn’s question. Regarding these documents that have been disclosed publicly, all thousands of them, not just the two that Facebook disclosed last night. Have you locked down these documents shutting out other Facebook employees?
Ms. Antigone Davis: (02:02:31)
Senator, it’s not my understanding that we’ve done that. No, it’s not my understanding that we’ve done that.
Mr. Chairman: (02:02:39)
You have not, that’s your testimony?
Ms. Antigone Davis: (02:02:42)
I just am not the right person to ask and I can certainly follow up and get an answer for you. But it’s not my understanding that we have done anything like that.
Mr. Chairman: (02:02:49)
I’d like you to confirm if you would, that those documents, the research, the findings and recommendations are available to others at Facebook. And I’m just going to ask you finally, you’ve declined to commit that any more of those documents will be made available. Who in the company will get back to us in response to that question?
Ms. Antigone Davis: (02:03:17)
We will be sure to follow up with your office. I’ll take it back to the team that worked with your office to come back to you.
Mr. Chairman: (02:03:26)
Will you commit to ending Finsta?
Ms. Antigone Davis: (02:03:33)
Senator, again let me explain, we don’t actually do Finsta. What Finsta refers to is young people setting up accounts where they may want to have more privacy. You refer to it as privacy from their parents. In my interaction with teens, what I’ve found is that they sometimes like to have an account where they can interact just with a smaller group of friends-
Mr. Chairman: (02:03:59)
Well Finsta is one of your-
Ms. Antigone Davis: (02:04:00)
That said we’ve actually-
Mr. Chairman: (02:04:02)
Finsta is one of your products or services. We’re not talking about Google or Apple, it’s Facebook, correct?
Ms. Antigone Davis: (02:04:11)
Finsta is slang for a type of account, it’s not a product.
Mr. Chairman: (02:04:17)
Okay. Will you end that type of account?
Ms. Antigone Davis: (02:04:20)
I’m not sure I understand exactly what you’re asking. What I can say is that based on what we’ve seen in terms of teens using those kinds of accounts, we’ve actually given them additional privacy options, to address those kinds of issues, where they want more privacy so that they can have more privacy.
Mr. Chairman: (02:04:41)
Well, I don’t think that’s an answer to my question. I think we have reached the end of our hearing. We have another vote. I don’t think any of my colleagues have any other questions. So sorry, Senator Sullivan?
Senator Sullivan: (02:05:06)
Thank you Mr. Chairman.
Mr. Chairman: (02:05:08)
I’m so glad you’re here. I’m glad I ran into you on the floor.
Senator Sullivan: (02:05:11)
Yes. Thank you for holding this hearing. And I think it’s a really important hearing and I know you care a lot about it, I care a lot about it. So Ms. Davis, it’s probably… Well I won’t ask you, but I have three daughters. And when I read the Wall Street Journal story, I was shocked. But in some ways not surprised because I think we’ve seen a lot of this. So when you are looking at your applications, your services, do you balance the mental health needs of Americans versus the addictive nature of the products that you sell?
Ms. Antigone Davis: (02:06:04)
Thank you, Senator. I think you were going to ask me if I have children. I do, I have a 23 year old daughter. First of all, I don’t agree with the characterization of our product. But in fact we do think quite seriously about the safety and-
Senator Sullivan: (02:06:21)
Sorry, I’m going to interrupt here. What don’t you agree with? I just said addictive nature.
Senator Sullivan: (02:06:28)
I said addictive nature versus mental health. What two phrases did you not agree with?
Ms. Antigone Davis: (02:06:35)
So I disagree with calling our product addictive. I also think that’s not how we build products. But to your question about-
Senator Sullivan: (02:06:45)
Sorry. I’m going to drill down on that. You don’t think your products are addictive in terms of teenagers constantly wanting to be engaged in social media?
Ms. Antigone Davis: (02:07:01)
Senator, as a parent and as someone who talks to parents quite a bit, I haven’t met a parent who doesn’t think about the time that their child spends on their phone. And one of the things that we’ve actually done to try to address that is to make people aware of how much time they’re spending. There’s a dashboard where they can see it. They can actually set a reminder to let them know that they’ve been on so that they’ll get off. In addition, we’re looking at something called take a break, which would prompt somebody when they’ve been on to take a break. So that I think gets at your question. But I wanted to go back to your mental health-
Senator Sullivan: (02:07:44)
Well, I’ll let you get to mental health, but I want to drill down on this addictive element. But isn’t part of your business model to have more eyeballs for a longer amount of time engaged using your services?
Ms. Antigone Davis: (02:07:59)
Respectfully, Senator, that’s not actually how we build our products. In fact, we made changes to our newsfeed to allow for more meaningful interactions, knowing that that would impact the time spent. In fact, it did impact the time spent by about 50 million hours per day. But we did it anyway because we were trying to build a more positive experience.
Senator Sullivan: (02:08:21)
So can you address the issue of mental health? Were you aware of these mental health challenges for teenage girls? I’m sure you’ve seen the statistics more broadly about suicides for teenage American females. What are you doing to address that? And were you aware of these challenges, according to the Wall Street Journal that was in that study?
Ms. Antigone Davis: (02:08:51)
Certainly, Senator. I am aware of the issues that teens face. I used to be a middle school and high school teacher, and I had a teenage daughter and was a teen myself. And being a teen comes with challenges. And that is reflected sometimes in our platform. And what we have done, and why we did this research, was to identify where those challenges may be on our platform and how we could potentially change our product to help. What we saw with that research was that on 11 out of 12 really challenging issues, issues like anxiety, depression, loneliness, sadness, more teens said that they thought that their experience on the platform was helpful than harmful. Now the teens where they found it harmful-
Senator Sullivan: (02:09:46)
But do you believe that? I don’t-
Ms. Antigone Davis: (02:09:47)
We actually want to make those changes. We want to make changes to actually provide them with a better experience. And so I was talking about-
Senator Sullivan: (02:09:53)
Sorry, I’m going to interrupt, my time’s getting shorter.
Ms. Antigone Davis: (02:09:56)
That’s all right.
Senator Sullivan: (02:09:56)
Do you have evidence that those issues, isolation, mental health, do you have evidence that those challenges and mental health challenges are actually helped by using, for example, Instagram, more or less? Are you telling me that the use of your products actually limits those challenges? I think it’s almost obvious that they increase those challenges. So what’s your testimony today? I thought you said that it actually reduced that, is that what you just said? Because I find that quite remarkable.
Ms. Antigone Davis: (02:10:47)
Thank you, Senator. Actually, there’s a blog post that our Vice President of research has released on this. I want to be really careful that this research is not causal research. It’s what teens said about their experiences on our platform. And the numbers that you’re talking about speak specifically to teens who identified as suffering from these particular issues.
Ms. Antigone Davis: (02:11:14)
I think what’s really important here, though, is that this research actually is being used to make product changes, to identify places where we can be more supportive of teens. So for example, take a break is something I mentioned earlier in some of the questioning. This is something that we would surface to a teen who may be online for a long period of time and give them an opportunity to take a break, so they don’t rabbit hole down a direction that may not be positive. We’re also looking at something called nudging, where we would nudge them towards uplifting or inspiring content because they told us that that content can be helpful. Our goal here is to really… Right now, the research shows that eight out of 10 teens say that they have a positive to neutral experience. Our goal is for that to be 10 out of 10 and for it to be positive. We want to provide a better experience for teens.
Senator Sullivan: (02:12:11)
Okay, let me end here. I’m going over my time but I don’t see any other senators waiting for questions. And I know the chairman is going to come back to wrap this up. But look, I think the issues of mental health, of depression, of isolation… I think the social media engagement, particularly for teenagers, enhances these challenges. And I think we’re going to see this in more and more studies. And you mentioned take a break. I’m not a big fan of the Chinese communist party. Matter of fact, most things they do I instinctively disagree with. But you may have seen recently that they have a way they do things. It wasn’t a law, I guess it was an edict from on high from the party and Xi Jinping. But they have told Chinese teenagers to take a real break and to limit the amount of time that a teenager in China can spend on social media or gaming or things like this.
Senator Sullivan: (02:13:32)
Do you think the US government needs to look at doing something like that, an edict, if you guys won’t? I personally believe that we’re going to look back like 20 years from now and see the massive social, mental health challenges that were created by this era when teenagers had phones in their faces. Starting in seventh and eighth grade and continuing to have them. And we’re going to look back, and we’re going to go, what in the hell were we thinking? Maybe it might be the one time where we say, why didn’t we, like the Chinese communist party, say take a break.
Senator Sullivan: (02:14:12)
What do you think about the Chinese new edict on taking a break for over a billion people? And should the United States government think about doing something like that? Mandate, not relying on you guys. Because I do think your business model in part is eyeballs and time spent online with your services. I think that’s pretty obvious. If you have less viewers and less time, you’re going to get less revenue. So can you really, on your own, help people take a break? Or do we, the US government have to help people take a break like the Chinese are doing right now?
Ms. Antigone Davis: (02:15:04)
Respectfully, Senator, I think that there’s some complexity here. So for example, during Covid young people used apps like ours to actually stay connected. It was a lifeline for them. They couldn’t go to school, they couldn’t go to their colleges, they couldn’t do their graduations. Social apps actually provided them with a way to stay connected to their friends and their family. So I think it’s a bit more complex than that. That said, I think I would certainly like for apps like ours to build experiences where parents can actually have some control over the time that their children are spending. Similar to what we did in Messenger Kids. I think parents would far more welcome the ability to set time controls than to have an edict on high tell them how to parent their children.
Senator Sullivan: (02:15:53)
So what do you think of the Chinese edict? I know you guys aren’t allowed in China, but what do you think of it?
Ms. Antigone Davis: (02:16:03)
As a parent, I’d much prefer to be able to determine my child’s time online than to have China tell me how to raise my child.
Senator Sullivan: (02:16:15)
Okay, fair enough. I’m going to move to recess this hearing for a few minutes until the chairman comes back. So Ms. Davis, if you can just hold on for a few more minutes, the chairman will be back in I think about just a couple minutes. So for now, this hearing stands in recess until the arrival back of the chairman.
Speaker 1: (02:18:42)
We are now back.
Mr. Chairman: (02:18:49)
We’re back from a brief recess. I’m hopeful that our witness is still online and with us. I was going to offer her the opportunity if she has anything to add in conclusion.
Mr. Chairman: (02:19:21)
Give her a couple-
Ms. Antigone Davis: (02:19:21)
Sorry, Senator, was that directed to me?
Ms. Antigone Davis: (02:19:27)
Thank you. Really the only thing that I would add is that I look forward to the hearings where TikTok and others will come. I think it’s important for us to hear from companies that have already started providing these types of apps to young people under the age of 13. TikTok, I think, does; YouTube, Google does. It would be good to understand what they’re trying to solve for. As an industry, we face a real issue, and we’re trying to figure out how to best serve young people in a way that actually meets the needs of their parents and families.
Mr. Chairman: (02:20:03)
I take your point, Ms. Davis, and TikTok, along with others, has been invited and will be here. At this point we’re not specifying who. But I would emphasize that each company bears its own responsibility. The race to the bottom has to stop. Facebook, in effect, has led it. And if Facebook can’t hold itself accountable, Congress must act. The record so far is that Facebook can’t be trusted to hold itself accountable. Nothing personal to you.
Mr. Chairman: (02:20:42)
And you have in fact said that some amorphous team will make these decisions about disclosure, about Instagram for kids on pause, about potential legal action. These kinds of decisions, I’m assuming, will ultimately be made by Mark Zuckerberg. But the point is, right now Facebook has failed to hold itself accountable, and Congress and the public must do so. So we are concluding this hearing, and the record will be kept open for a week in case any of my colleagues have written questions. Thank you very much, Ms. Davis, we really appreciate your participating and we look forward to your responses to the questions that you said that you would respond to. Thank you very much. This hearing is adjourned.
Ms. Antigone Davis: (02:21:38)
Thank you for the opportunity.