Assembly Committee Hearing

Assembly Privacy And Consumer Protection Committee

March 17, 2026 · Privacy And Consumer Protection · 36,233 words · 21 speakers · 1000 segments

Assemblymember Wicks

My mic's on. Good afternoon. We're going to call this hearing of the Privacy and Consumer Protection Committee to order. This is an informational hearing that we are hosting today on online safety controls. I want to start by thanking all of our panelists for attending and participating in our hearing today. And of course I want to thank the privacy staff, which is, I believe, one of the best in the building, the Rules Committee, the Sergeants' office, and other support staff for helping to organize this hearing. This hearing was really born out of my experience serving on this committee for now seven years.

We hear, and we'll begin with, the lived experience of parents navigating social media with their children, who say that the systems are broken for their kids and aren't working. And then we kept hearing from industry that the answer is parental controls. So I decided that we needed to put these two perspectives together and have a real moment to talk about parental controls: how they work, what they are, what they're doing. Do people know they exist? Are people using them? How and why? And can they fail children? And how do we navigate a path forward online that is safer for California's children?

I want to really acknowledge all of the speakers who are here today to have that conversation with us. As I said, we're going to start with the Hinkses, who will share what it means to be parents navigating this world. We're going to hear from folks representing children. It's a voice that often isn't loud enough in this building, but we have two incredible organizations, Common Sense Media and Children Now, who will be here to speak on behalf of California's children. I also owe a huge debt of gratitude to the four companies, California companies, I will say: Google, Meta, OpenAI, and Roblox, who all agreed to be here. Often we have to go a second round to get people to participate, but every single one of the companies that we invited in the first round agreed to come and have a conversation with us about parental controls. I really want to express my gratitude. I think this is an important conversation. Thank you for being here to have it with us.

Then lastly, we'll have a panel to discuss potential solutions with some experts who research this space every day. I think this conversation will hopefully help us navigate a path forward in the social media space that is informed by what is happening online, the realities of these products, and the experts' research. Because I know that all of us, hopefully, are committed to a safer online future for California's children.

I will say that I'm entering this conversation personally with a fundamental question in my mind: if we know that, in some cases, these online spaces are designed to be addictive and to keep our children engaged, can any amount of time be safe for them? That's where I come at this from. But I also think that this is the reality our kids are growing up with, and so we need to figure out what the way is for California to create the safest spaces for our children. So with that, I want to turn it over to my colleagues, if they have any opening remarks. Senator Lowenthal or Wicks?

Senator Lowenthal

I'll be very brief. First of all, I just want to thank our chair. This committee has led with moral clarity in a way nowhere else in the United States has, actually, including our federal government. And I am grateful as a father. Thank you, Madam Chair, for today and every day that we do this work. I also want to thank everybody who has come here. I believe we are all a community together. All of us ultimately want the same things: a healthy consumer, a robust business, a future in which we all know this generation is surpassing the generation before it, and so forth. And so I look forward to having individual relationships with each and every one of you. And I know that everybody on this committee feels this way. It's just a joy that you showed up today. So thank you so much.

Assemblymember Buffy Wicks

Thank you, Madam Chair, for pulling together this hearing and for your leadership in this space. I also want to thank my colleague here from Long Beach, who has been a tremendous leader as well. I've been working in this space since day one when I got to the legislature, and oh my, has technology changed in those almost eight years. I've done a number of bills, many of which have ended up being challenged in the courts, and I continue every single year to figure out how we keep our children safe. One thing I'm inspired by, honestly, is the fact that you have lawmakers who are first and foremost parents before they're Democrats or Republicans. We have a bipartisan group of parent lawmakers who are just trying to figure out how to keep our kids safe. That is the goal, and we welcome industry in that conversation and in being a part of the solution to that problem. We love our tech companies. They're a big part of our economic engine in California, and they need to make sure that our children are safe. I think we can have all of those things, and I obviously appreciate the expertise and the diverse points of view of the children's advocates who are part of this conversation as well. I also know that often what we do in California leads not only the nation but the globe. We have regular conversations with our counterparts in the European Union, the UK, and other places as well. We're looking at what other countries are doing, modeling work from them, and learning from some of their lessons. But I think we all stand here committed to our number one job. I've always said this: the most important thing we need to do is keep our community safe, and from my perspective, most specifically our children. That is my goal, my mission, in an incredibly complex, complicated space that is technologically evolving all the time.
We also want to create legislation that can be implemented, that is implementable, that is doable. And so that's where I always welcome conversation with opposition. I genuinely love conversation with opposition, because you learn more about what you're trying to do in that context. And I think you get better policy when you are really in the weeds trying to figure out how to adhere to these guardrails, but in a way that can be implemented. So with that, I'm excited to be here, and thanks for your leadership.

Assemblymember Wicks

Thank you, Assemblymember. With that, we will start our first panel. As I mentioned, our opening remarks will be from Victoria and Paul Hinks, who are advocates for social media safety. So if you guys want to come up. And as you get comfortable, I just want to express our gratitude for you being here. I think you provide a really critical, humanizing voice to any conversation around social media. So thank you for being here.

Victoria Hinks

Thank you so much for having us. Good afternoon. My name is Victoria Hinks, and I'm a survivor parent who lost our daughter, Alexandra Hinks. Everyone knew her as Owl. Forever 16. We lost her to suicide 587 days ago. She was a beautiful girl, inside and out. She was kind. She was a cross country runner. She wanted to be a preschool teacher one day and have a family. This is a loss that has so profoundly changed our family's life, and it's left me living with severe PTSD. Since her death, I've dedicated my life to speaking out about the ways that social media can impact vulnerable young people and families. I share this story so that other families will not have to endure the horrible tragedy that happened to our family, so that hopefully it can help bring awareness, accountability, and stronger protections for others, so no other family has to go through what we went through.

We didn't solve car deaths with parental controls. We fixed the product itself by implementing mandatory seatbelt laws. Owl would be graduating from Redwood High School in Marin County in June, and while her friends are all eagerly awaiting their college acceptance letters, we've been eagerly awaiting her headstone finally being put up. I brought pictures of that for you all today. These so-called parental controls never worked. She found a way around them, and we never really stood a chance. This is why the work that you all are doing is so important to us. Because this could be anyone's child; it doesn't discriminate, Republican or Democrat. She had a bright future ahead of her. The grief that we live with is the most painful thing ever. And it could be anyone's child. We thought this was something that could never happen to us. So thank you so much for having us here.

Speaker D

Thank you, Victoria.

Paul Hinks

Afternoon. My name is Paul Hinks. I'm Victoria's husband, Alexandra's father. I have been a software engineer in Silicon Valley and San Francisco for over 30 years. We as a family considered ourselves to be tech savvy, and our children grew up in a house full of gadgets: video games, smart TVs, speakers. Our house is pretty much controlled by an app. We never let our children have a TV in their rooms. We always discouraged prolonged tech use, and we didn't allow devices at the dinner table or out in public. We held off getting them phones until much later than their peers. Her older sister had already been through this successfully; we weren't starting from scratch. When we finally gave in to the inevitable and bought 13-year-old Alexandra an iPhone, we thought we did everything right. We researched the dangers. We made her sign a contract acknowledging that the phone was our property, that we controlled it, and that we could take it away from her at any time. No phones at night. She happily agreed to show us what was happening on it and to keep track of her own usage. We set up screen time limits, age-appropriate content restrictions, and a firm 9pm curfew, after which the phone could only be used to play music or call her family. We felt prepared. This was an Apple device. They make great devices that just work, so what could go wrong? We weren't naive parents stumbling into this blindly. We thought we genuinely understood the dangers our daughter was being exposed to. We'd attended meetings at school where online bullying was discussed, and the contagion of self-harm and eating disorders among teenagers. We took it seriously, but the threat felt manageable, local, even her school friends talking among themselves: the kind of thing that could be sorted out with a phone call to another parent. After all, we had all the parental controls on. We had devices on our network that were supposed to filter out dangerous websites.
No random stranger from across the world would affect our child. What we didn't realize was that the dangers were coming from inside some of the apps that Apple told us were trusted. Initially, we did not allow social media at all. Slowly we added apps as our daughter grew older and wanted them to keep in touch with friends. Each app had its own parental controls, and we set them up to keep her as safe as possible. But again, surely the app manufacturers had their customers' best interests at heart. Surely they would not allow dangerous content to reach the screen of a teenager. What had worked at 13 did not work at 15. Our daughter began obsessing over her phone. She seemed very fragile and upset all the time. We didn't know the cause; there were probably many. She was transitioning from middle school to a high school that none of her friends were attending. Her older sister, who she was very close to, had left for college. She was desperate to make friends, and some of the people she chose were not great people. She felt isolated and turned more and more to social media for companionship. We were aware of this, but it wasn't a major concern at the time. Surely one of social media's major benefits was to keep her in touch with friends from her old school. And we had all the controls and limitations turned on, so surely nothing bad could be going on. She was happy to show us the apps when we asked, but she had ways of hiding things she did not want us to see. When we finally accepted that something was seriously wrong, we had lots of fights. We began to suspect that the phone was causing her problems. We restricted her use of social media to one hour a day, not realizing that these restrictions were broken: she could simply tap to ask for more time and stay on the phone as long as she wanted. We took the phone away from her for days, weeks at a time. That helped. She would apologize and ask for the phone back so she could keep in touch with friends.
Her therapist told us that taking the phone away was actually harmful and isolating, and that letting her use it, even to listen to music, would help. So we agreed to this. Who wants to totally isolate their teenager from their friends? These devices can be made safe. Consider what happens when a company issues a device to an employee. There is an IT department, there are policies, there are people whose job it is to ensure that that device is configured correctly, that dangerous content cannot reach it, and that someone is accountable if it does. The company has legal obligations. The device manufacturer has contractual obligations. The chain of responsibility is clear. But when a parent buys a device for their child, there is no IT department. There are settings buried in menus that most people cannot find and that are ambiguous as to their effect, connected to restrictions that can be bypassed with a tap. There are app manufacturers shielded from liability by law, and a platform company that takes no responsibility for what is displayed on its screens. The chain of responsibility leads nowhere. Nobody is accountable. The companies don't care. They will happily feed a 15-year-old girl content about self-harm if that will keep her engaged and scrolling for longer. And the people paying the price are children. Our daughter was presented with content that painted suicide as a rational and reasonable way to deal with her problems. Eventually she was able to use social media to find the best way to kill herself. Thank you.

Speaker D

Thank you both so much for being here. I know this cannot be easy, but your advocacy absolutely makes us better.

Eliza Jacobs

Thank you so much for having us. Thank you.

Assemblymember Wicks

We'll now move to the first panel, which is an overview of the types of parental controls, the challenges, and the reasons for failure. We have Sunny Liu, Director of the Stanford Social Media Lab; Lashawn Francis, policy analyst and advocate with Children Now; and Anika, how do you pronounce her last name? Buffon. Did I get that right? Anika, yeah, who is a PhD and the founder and CEO of Clara, Clear AI Risk Assurance. They will be opening our first panel, and then we'll take questions after they finish.

Sunny Liu

Victoria, thank you so much for sharing your story and having the courage to be here. As a mom myself, I started to research online harms because of tragedies like this; like so many parents, we simply just want to protect our children. Madam Chair and committee members, today I will share our research at the Stanford Social Media Lab on the challenges parents face in digital parenting. The views I present here are my own and should not be interpreted to represent the views of the university. So I'll start my presentation.

We asked about 500 parents and kids across the United States. I know the next panelist will talk about children's perspectives, so I just want to briefly highlight the key findings here. We asked kids aged 10 to 18 what they wish their parents knew about their social media use and online world. The answers were mostly: trust them more, give them clearer guidelines, and have clear expectations. When we asked parents what they are most concerned about in their children's experiences, the top concerns were excessive use, harms, risks, privacy, and the impact on mental and social well-being. If we look at all those different perspectives, we can see both alignments and misalignments. Children and parents are aligned on the goals: they both want a safe and healthy online world. The misalignment is centered on the approach: how to set boundaries, and what the way to control that is.

So what are parents doing now? What are the ways they try to prevent harms and protect their children online? What they use is parental controls. So what are parental controls? Parental controls are the tools and features parents use to manage their kids' digital access, from screen time limits to content filters to app limits: for example, Apple's Family Sharing, Android's Family Link, and third-party tools like Bark, Qustodio, and Net Nanny. Those tools definitely exist, but we're still here today talking about how to protect our children and reduce harms. Clearly those tools are not sufficient to prevent the harms we are discussing today.

I want to share our lab's research on the core challenges parents face in really protecting their children online. The first is that digital parenting is challenging. The second is that tech is complicated. Third, there are constraints on parental controls. And last, there are accessibility and equity gaps.

First, digital parenting is challenging. A Pew report suggests that two-thirds of parents today think parenting is harder than it was 20 years ago because of technologies like social media and smartphones. Digital parenting is just one part of parenting, and parenting is challenging. Here's what one mom shared with us: there is a pressure to be everything, everywhere, all at once to your children; the sense of constantly needing to do more, to be around more, to be more of this and more of that, within an environment that doesn't really offer support. The online world makes it even more challenging. Parents have to constantly understand and navigate those complicated online safety worlds. Here's what one parent shared with us: "It's a struggle to make sure my child doesn't see inappropriate content, images, or pornography, knows who he interacts with, and that he's not bullied." In our research at the lab, we identified 22 types of harms young people can encounter online, from cyberbullying to sextortion to harmful content, online hate, and algorithmic risks. Parents have to constantly navigate those evolving technologies and evolving harms. And third, there is a knowledge gap. Kids know these technologies better than their parents. Parents always feel that they are one step, or even ten steps, behind their kids and what's happening online. So those three points make digital parenting really challenging.

Second, tech is complicated. There are so many different platforms, features, interfaces, and products. Parents have to constantly navigate all those different settings, and as soon as they figure them out, there are updates and they have to relearn everything again. The settings at the device level usually don't work at the app level, and app-level settings will not work at the device level. One parent told us: "I have a 16-year-old daughter who loves to use Instagram. I deleted it on her phone, and now she uses it on her laptop, which might be even more risky, or interfere with her studies or her life more."

The third point is that there are constraints on parental controls. Kids circumvent: they find all the different ways to bypass parental controls. Some parents cut the wi-fi at midnight; then kids go to a neighbor's house for connections. And controls can backfire as well. Overly controlling or too many restrictions can sometimes erode parent-kid cohesion, erode trust, and increase conflict in families. Screen time is the number one conflict in families now.

Lastly, I want to highlight the accessibility and equity gaps. Not all parents have the time and energy to constantly moderate. We have single-parent families, families where parents work multiple jobs, and caregivers like grandparents and older siblings. They don't have the time and energy to constantly monitor. Tools do exist, but not every family can afford those tools. Third-party apps, from Qustodio to Bark, cost from $10 per month to $40 per month.

Our research shows that current parental controls don't work for four main reasons: digital parenting is challenging, tech is complicated, there are constraints on parental controls, and there are accessibility and equity gaps. As we can see, it's really a complicated issue, and the stakes are so high. For this reason, I'm so glad that the committee takes this seriously and has brought such a wide range of stakeholders here. I hope that my research will help to frame the discussions. I look forward to hearing from the other panelists and witnesses, and I look forward to questions in the discussion.

Speaker D

Thank you so much. Ms. Francis?

Lashawn Francis

Thank you so much, Madam Chair and Members. My name is Lashawn Francis, and I'm with Children Now. We are a statewide research, policy, and advocacy org focused on the whole child. Our organization also leads the Children's Movement, a California network of more than 6,000 direct service, parent, youth, civil rights, faith-based, and community groups dedicated to improving children's well-being. Our goal overall is to sound the alarm about how kids are doing in our state in regards to mental health, addiction, and online spaces: not well. And the data makes it clear that digital spaces are both a reflection and a driver of that crisis. I know that today the header of this hearing is social media, but I'm going to talk broadly about digital spaces. I grew up in a time of AOL online chat rooms, and obviously that has changed. So I'm going to say digital spaces more broadly, because the iteration of things is constantly changing; that's just the nature of tech. In 2021, Children Now wrote a letter to the governor asking him to declare a state of emergency for California's youth due to the mental health crisis. That declaration was never made, and the urgency around the mental health crisis for kids remains today. The connection between mental health, addiction, and social and digital spaces has never been clearer. According to our 2025 youth poll, about 94% of young people in California report experiencing regular mental health challenges, with one-third describing their mental health as fair or poor. Nearly all of those reporting poor mental health, 98%, were youth of color. More than 1 in 3 LGBTQ youth in California seriously considered suicide in the last year. For transgender and non-binary youth, that number climbs to nearly 4 in 10. Indigenous youth in California bear the highest rate of suicide deaths among any youth group, by a wide margin. On overdoses, fentanyl has transformed the crisis entirely.
Adolescent drug fatalities remain more than twice pre-pandemic levels: 708 deaths nationally in 2023 compared to 282 in 2019. The National Crime Prevention Council estimates that 8 in 10 fentanyl overdose deaths are connected to social media contact, with dealers actively using these platforms to reach young people. Psychiatrists warn that generative AI affirms, enables, and fails to challenge delusional beliefs. The connection between digital spaces and these mental health and addiction outcomes is no longer speculative. According to our youth poll, nearly a third of California young people say social media has been harmful to their mental health. About one in three report being cyberbullied, and roughly seven in 10 say social media contributed to a negative body image. So what has the industry offered as a solution? Parental controls. One of the things I really want to flag, and one of the reasons I spent the majority of my introduction on the state of young people's mental health and addiction, is that we're not actually answering a tech problem; we're answering a child safety problem. Once we understand that, I think the solutions will be clear. We need to be a little clear-eyed about what parental controls actually are and where they come from. The design and definition of parental controls have so far been dictated by tech companies themselves. That means the industry has controlled the narrative around what safety looks like, and too often it looks good on paper while doing very little in practice. When companies use parental control features as a public relations shield, it allows them to sidestep the deeper systemic problems: harmful design, exploitative engagement algorithms, and inadequate privacy protections. A 2025 report titled Teen Accounts, Broken Promises tested 47 of Instagram's teen safety and parental control features and found only eight worked as intended.
Most were ineffective, unavailable, or easy to bypass. Fairplay found that parental controls do not accurately reflect what a teen is actually experiencing online. Parents are not notified by default when their child reports a post or account, and children can easily open a "finsta" account with no indication appearing in parental supervision tools. In 2025, pediatric experts warned that YouTube Kids still allows low-quality and borderline harmful content to slip through even when parental controls are enabled, because creators can self-label videos as "for kids" and game the system with friendly thumbnails and keywords. These aren't isolated glitches. They reveal a pattern: parental controls designed to look like protection without actually having to provide it. Young people see through that. When we talk to youth about technology and online safety, parental controls are rarely what they bring up. In fact, when I bring them up, they actually chuckle. It's not because they don't care about safety; it's because they know these tools don't work. Many of their parents aren't fully equipped to manage or understand how these systems work. Setting them up requires technological skill, time, and patience that parents simply don't have. And even when parents do engage with these tools, young people say the controls are set in such a way that they can easily navigate around them. So when I ask what would be effective, because I know they care about their safety online, they say that instead of focusing on parental controls, they want online literacy, digital responsibility, and corporate accountability. They understand that the online environment they inhabit is not shaped by personal choices alone; it's engineered by the design decisions tech companies make about platforms, algorithms, and engagement tools. In their view, teaching young people to critically evaluate content and understand data practices is more empowering than any parental dashboard.
Young people also want their parents to be educated, not just on how to use parental controls, but on how to have open, informed conversations about tech. They want collaboration, not surveillance. When parents understand digital culture, social media norms, gaming communities, and content creation spaces, they connect with their kids on a human level rather than a policing one. Importantly, the approach to digital safety needs to evolve as children grow. One of the things we see very often is that we write legislation where the tech rules apply to a three-year-old in the same way that they would apply to a 17-year-old.

That is not sufficient.

Perhaps what's most telling is this: when I spend time with youth advocates and ask why they keep using platforms they clearly dislike, their answers reveal just how much the stakes have changed. They tell me they feel compelled to participate not for entertainment, but because school announcements live on social media, political activism happens on social media, and job opportunities are shared online. For today's young people, these platforms are not a fun pastime like my AOL chat rooms. They are infrastructure. Opting out isn't really a choice. That is precisely why the burden of safety cannot rest on families alone. The real question before us is not how to build better parental controls. It's how to shift the conversation entirely, away from tech companies defining what digital safety means, and towards families, young people, and policymakers outlining what is expected from corporations that provide products to our kids. This should be no different than the safety protocols for vehicles, car seats, toys, cribs, and the like. We need policymakers to come together with urgency to examine which rules and regulations need to change, address the structural crises in our digital spaces, and put meaningful guardrails on corporations. Because our children do not feel that they have the ability to leave these digital spaces that are offering them different ways to engage in life. The resources and reforms we pursue must reflect the full scope of this, both offline and on. Thank you for your time.

Speaker D

Thank you for that insight. And now we will turn to Dr. Buffon.

Assemblymember Wicks

Okay.

Monica Buffon

Okay, you can hear me. All right.

Monica Buffonother

I'm Monica Buffon. I am a PhD-trained social psychologist and positive psychologist, and I've done most of my research on well-being and empathy until I transitioned to tech itself, where I spent seven years. And unlike a lot of other researchers, I was actually on growth and safety teams, so I understand the full stack pretty well. And then the last two years I worked on age assurance, particularly on the youth well-being team. In this particular space, I'm here because I founded a company, a nonprofit, that is very, very new. But the goal is to do research-based advocacy, to build the right products in this space, to have the right conversations, and to suggest the technical solutions that can actually work.

And so I want to start by maybe breaking code a little bit here. You know, last night here was the Nine Inch Nails concert. I don't know if anyone here was there. So when my parents were parenting me, their biggest worry was that I might like Marilyn Manson and that I was going to go to the Love Parade. And so they said no to both. And today, parenting is so much more difficult because we don't actually know what the kids are seeing on these different apps, because a lot of it is hidden from review. And so I get the question a lot: is it the algorithmic change that we need? Do we need to raise minimum age? Do we need to change minimum controls? And I think my answer is that this isn't the right question, because we need a lot of different layers. And we need a lot of layers because every family is different. And so we're not going to convince every family to be as strict as possible. We're not going to convince any family to be as loose as possible. And the rules that we make have to work for the conservative Christian parent that wants to shield their children from certain ideologies and for the parent with the LGBTQ teenager that wants to protect their child from hate speech, and so on and so forth.

Monica Buffonother

And so we need the whole stack to address the problem. If raising hands were appropriate, I would ask which of us in the room have changed their child's age up on the device because things otherwise weren't working and things were broken. Like me, a lot of us have. A lot of us have noticed that when we put on parental controls, things we want to be using don't work anymore, be it that they can't listen to an Audible book for bedtime, be it that they can't get sent Apple Cash so that we can have them try independence and go to the store with their friends after school. It really needs to be a redesign of the whole system. So I think, again, we need everything. Age assurance is the number one barrier, right? Because if we have kids on with false ages because things otherwise break, then we can't protect the child in the end, because the child will be assumed by the app to be an adult. So privacy-preserving age assurance is really, really important. I get the question, can't we just verify everyone with an ID? My personal opinion is that that may not be the right approach, because a lot of adults don't want to do it, and so it just leads to circumvention by adults. But it can also lead to circumvention by parents, because 70 to 80% of parents say that they're really concerned about the privacy of their children, about data breaches, and so on. Now, the good news is we have a lot of technology that can go beyond IDs and beyond these approaches, with unobtrusive ways to get to age assurance. And California passed this amazing device-level age assurance law, which is really great because it opens up a lot of privacy-preserving methods for children.

Then there's device-level controls. Those are really great, but they're very high level. You can set things like screen time at the very high level, but you can't actually touch what happens in the app. So it's kind of like Vegas: what happens in the app stays in the app. You can't really see it, you can't really influence it. Or a global Google regulation usually can't touch individual countries' legislation. So you gotta understand how that works. And then on the app level, that's where you can set different controls. But as Sunny was saying, if every single app has their own interface and different symbols, and some things exist on one app but not on another, then it just gets very thin on what you can actually control. And then there's the third-party tools, which are great gap fillers, but they have the same problem: they can't really see very deeply inside the different apps, and they cost money, which is an equity issue, which both of my previous speakers have spoken about. Again, I think privacy-preserving age assurance is really, really important, and having multiple different ways. Parents often help kids; the younger the child, the more likely it is that when they're on something, the parent has helped them. Kids, of course, can also get around it. And then what we see now is kids moving to less safe apps. So there's all these apps coming out, some of which you probably haven't heard of before. I hadn't really either before I started doing the research. So there is Yubo and Lemonade and Locket Widget and Coverstar. And some of them have atrocious things happening on them. A lot of them are actually trying to do the right thing; they're trying to be safe for children. But if you're starting out, you're not going to have a huge safety team, right? So you're limited. So when kids leave TikTok and Instagram, they might go to these other places. And so we just have to make sure that we make them safe everywhere. Right now it's about 120 hours of setup required of a parent, between initial setup and setting up every single app and doing all the monitoring. These controls are often hard to find, multiple clicks, like often if you.

Assemblymember Wicksassemblymember

Did you say 120 hours?

Monica Buffonother

100 to 120 hours a year, yeah. This is research-based; it's based on expert opinion of people that have tried. It just takes a long time. You have to find all the different settings, you have to set it up. It's just a lot of work. Then at the end you have a lot of awesome dashboards that have a lot of data and no information, and the settings break. A lot of parents give up. Parents tried linking; a lot of times it's just a link that gets sent to an email or some QR code being scanned, so that's very easy to get around. Sometimes there's adult verification; we have hardly any cases of actual parent verification or guardian verification. A lot of times parents need their own account for the app in order to supervise, which I also think is unacceptable. There is silent graduation a lot of times; there was a famous example where kids got an email: hey, you can soon unlink yourself from supervision. So obviously we don't want to do that. Kids usually can remove supervision unilaterally, and if the parent is lucky, that app has decided that parents should get a notification, which is also not always the case. And then there's false positives. Every few days I get a "your daughter got a nude picture" alert. And it's never a nude picture; it's just 12-year-olds taking really bad pictures and sending pictures of their warts and things like that. So again, parents can only control and see the tip of the iceberg. It's things like screen controls, and some content, like sensitive content, can be blocked. But what does the algorithm optimize for? What kind of profiling is there? What kind of advertising? What autocompletes for search? What about posting privacy? One of my daughter's apps, when you do a dance challenge, suddenly it becomes public. So my engineering husband had to flag that, and so she no longer has that app. But you find out over time.

Speaker Oother

Which app was that?

Monica Buffonother

That was Coverstar. So, AI chatbots are the next unregulated frontier. Kids use these apps, but no one is really empowered to watch. OpenAI is the only app that has had any meaningful controls and age checks here, in my opinion. And really, especially for AI, it is very disappointing to me personally, as someone that has worked in tech, because we have seen social media. And so the fact that a lot of age verification is just a checkbox and that there aren't parental controls is a really big concern, especially with how powerful these apps really are. And parents really are in the dark. When you look at research, parents don't know how these apps really work, what to worry about, how to keep their kids safe. Teachers say the same things. And of course, we've already seen some pretty bad harms happen to children.

The data and tools, and this is one of my last points here, absolutely exist within the companies. Companies have the data, companies have the capabilities. And now with large language models, it really is in reach. It used to be harder, admittedly, to classify content and to provide some of these controls, but it is absolutely possible, and it is being used in other ways. Of course, safety investments can create a lot of competitive disadvantage for companies. Age-verifying everyone loses a lot of adults, and youth are important to the business. And this isn't some earth-shattering fact, right? Every company, be it Nike or social media, wants the next generation to be customers too. And so I actually do see myself also as an advocate for the safety researchers that are working in companies today, because I have been there, and a lot of us want these things to happen. This is my personal opinion, obviously, but a lot of us share the values of wanting to do the right thing, but aren't always empowered to do that work. And especially now, a lot of safety researchers have been laid off, and so there's even less of a feeling of being protected to speak out and to really advocate for change internally, which is one of the reasons I'm doing this work outside right now.

The chair, Rebecca Vauer, started with saying, is it safe at all? And I think there is this idea of Pleasure Island in Pinocchio, where the kids go and they get handed cigarettes and whatever. I do think that if you have apps that are optimized for engagement, optimized for content that is meant for adults, and then you tack on safety, in the end it may not be good enough. One thing that will need to happen is really thinking through how these apps should work, and also: is there a responsibility to make the safe version of the app just as fun and entertaining as the adult version? Because otherwise that also will drive circumvention.

Self-regulation is not working. This is my last point. We need independent standards. We really need to know: what are the base rates of kids with false ages on the app? Are the control features and the age assurance features that the companies are putting in actually reducing that rate? What is the harm base rate? What are the interventions? Is it going down? Every intervention and safety feature that isn't meeting that bar really isn't good enough, and we need the standards for that.

My four takeaways: kids' and youth safety needs a lot of different layers, and we need multiple approaches. We need minimal standards. AI chatbots really need more regulation than there is right now. And we need independent standards so that we have a real baseline for cause and effect and we can make sure that kids no longer get harmed. So, yeah, that is my pitch.

Speaker Dother

Thank you. Turning to questions: Senator Lowenthal.

Senator Lowenthalsenator

Okay, before I say anything to the panel here, I just want to acknowledge the Hinks family. It is so important to have your voice in this conversation. And I know how. Well, I don't know how challenging it is for you to come up and relive this all the time, but I can tell you that your presence here is meaningful to all of us and helpful for this conversation, because we are able to make it real. So thank you for being here. I'm struggling, and let me tell you why: I don't understand what "safe" is all about. Does safe mean that we're stopping harm, crisis-type harm, from taking place, interactions that can be deadly, suicidal ideation, things that are absolutely catastrophic? What about intellectual harm, academic harm? The empirical data that we're hearing right now, about our kids no longer surpassing this generation, I alluded to that earlier, which is a grave concern, I think, to all of us. I want to ask an open question about that, and I'd like to hear how you answer it. And I also want to ask you about China and your feelings about what's happening in China. China, to me, is the only country that I know of that was ahead of this from a regulatory standpoint. I don't think of them as a beacon of civil and human rights whatsoever, and clearly they don't have a constitution with the Bill of Rights that we have here in the United States. And yet I wonder: do we have any empirical information about mental health disorders among youth in China right now as a result of those things? I know that their efforts have been quite draconian. But to me, when it comes back to this issue of harm, they're very focused on STEM and STEAM. They're very focused on making sure kids raise the bar on their goals and their dreams and their hopes. They're focused on teaching kids healthy lifestyles and healthy choices and so forth. To me that's very attractive. I just wanted to ask for your comments and thoughts on these things.

Sunny Liuother

Yeah, thanks so much for those two questions. I think those are the same question. Fundamentally, how we can support children to have healthy development, we don't fully know; I think our research related to technology is limited. But we know a lot about what will make kids thrive. There are fundamental needs: physical, psychological, mental, emotional. Those parts we know; the psychological literature has that. So I will answer those questions in three ways.

First, if you think about how to think about harms, I think that reducing harms is one part of supporting kids' development. Kids cannot thrive when they are bullied, when they see all that online hate, when they encounter harmful content and content-risk harms. So that's one aspect. But the absence of harm does not equal benefit. So not only do we not want harms, but we also want kids to develop their identities in a healthy way, to know who they are, to fulfill their potential, maybe intellectual, maybe social, maybe emotional. So freedom from harm is one aspect, but how to make our environment, both online and offline, support kids' development and their fundamental needs, I think that's all part of the picture.

And then, about China, a little bit of background: I think a few years ago China had this regulation specific to video games, so kids could only play video games at certain times. I don't remember the details, but maybe half an hour on Fridays, one hour on Sundays; it's less than two hours per week for all the kids. And they actually implemented that: all the platforms had to cut off those kids, and the families had to take responsibility. You could not have those kids playing video games. There was one piece of high-quality research coming out, I think a few years ago; I'm happy to share that article. It shows that kids' time playing video games actually did not decrease. So the policies as implemented did not decrease kids' time online. But I think we do need more research to understand: do kids under those regulations develop better, have more time to play with friends, more intellectual development? Happy to do more research and then figure that out.

Speaker Dother

Thank you. And I assume Ms. Francis would like to respond as well. Yeah.

Lashawn Francisother

So I'll say first, I don't know much about China, so I can't answer that question. What I will say is that, unfortunately, the way we've operated in the US, corporations and businesses seem to believe they have more rights than individual children and families, and they will sue to prove it. And that tells me everything I need to know about how we are engaging here with corporations and who is really trying to set the bar and the parameters for safety. What I'll also say, in the mental health context, in terms of what is even healthy and safe and thriving, is that we know healthy face-to-face interactions are the best. That is the gold standard, not online interactions. The gold standard is face-to-face, in-person interaction: the ability to read micro-expressions, the ability to hug someone, the ability to put your hand over someone's and show comfort. That's the gold standard. We have begun talking as if the gold standard is online spaces with interaction. When it comes to mental health, whether it's through how we provide therapy or how we find community, it's not the gold standard. It's what we've done because we have a workforce shortage, but it's not the gold standard. So I just want to say that it's actually something we should be thinking about as secondary, not as primary.

Monica Buffonother

So I think my answer is that there absolutely are good things that kids can do online. One of my favorite examples actually was Meta's Portal, which I don't know if anyone besides me remembers, but it had a storytime feature where kids were able to talk to their grandparents. The grandparents would turn into the big bad wolf, and it actually did something remarkable: it let kids talk to their grandparents and actually want to keep talking to them. And the grandparents thought it was weird at first, because you don't look super attractive as a big bad wolf as grandma, but it sort of worked. And there are lots of equivalents of that in online spaces. So I think it absolutely can have benefits for kids to connect to different interests that they may not have a community for themselves, to like-minded kids that maybe have special needs in the same way that they do. But I think these benefits can't really be reached in a safe way unless we have the right minimal safety standards and the right controls. So I think it absolutely is possible, but when I review what's out there and what the safety protections are, it's just not where we need it to be. And so I think there really is a lot of research needed to see how we can make sure that we create these right spaces for kids so that they can claim the benefits. Obviously, I do agree that in-person experiences are the best.

Then the second piece, I think, is that in terms of the oversight model, there's also something really broken about how oversight was ever created, because it actually is a process where the tech company ends up winning, because it puts kids up against their parents. Right. The parents are the police officers in a system that's not even working, with false alerts and all these different things. You accuse your kid of something they didn't actually do, and there isn't any education, like what you were saying, Ms. Francis. And I think there really needs to be some accountability of companies as well, to be part of educating kids about what the dangers are. You know, how can you tell that something is upsetting you? What are the controls you can use? How can you report? And I think there needs to be accountability on what happens to all these reports. How many reports that kids send in, about eating disorders or anything else, actually get acted on? We have no idea. I've never seen that data on what percentage of reports are just getting dismissed. So I think there's just a lot of accountability that we can ask for, and I think then we can make real progress on answering that question.

Speaker Dother

Thank you.

Assemblymember Wicksassemblymember

Yeah. Just one thing that I think Ms. Francis said that's really interesting is the First Amendment law that is coming out on this, because the First Amendment does not allow for all speech with no exceptions. Right. There are exceptions when there is a public health risk or another recognized risk. Hate speech, for example, is not protected under the First Amendment. And yet the case law on the social media companies appears to be protecting everything they do under the First Amendment, with no exception. And I find it fascinating, because that is not, as I understood it as a young law student and practitioner, the way the First Amendment works. And hopefully we will get to a place where we are weighing both sides of that debate evenly in courts. So I just thought that was a really interesting point.

Speaker Dother

With that, Ms. McKinnon.

Assemblymember McKinnonassemblymember

Thank you guys so much for coming and testifying today. I have one question. Is online harm today more a technology problem, a business model problem, or a regulatory gap?

Paul Hinksother

Good question.

Lashawn Francisother

All of the above.

Speaker Mother

Yeah.

Assemblymember McKinnonassemblymember

Wow.

Assemblymember Wicksassemblymember

That was a short answer.

Speaker Dother

Oh, I'm sorry.

Assemblymember McKinnonassemblymember

One last thing. What does success look like? How should we measure whether platforms are actually safer, actually reducing harm?

Monica Buffonother

So I think, I mean, this is my big point, right? The only way we can is if we know what the base rates are. So, for example, what is the estimated percentage of children under the age of 13 on these different apps? And if we change our age predictions and improve them, does that rate go down? If we see certain harms that are emerging, and we have better safety systems and standards and controls, do those harms go down? I think without research to really see the data on cause and effect, it will be really difficult. And as much as we can, experimental data, but also seeing if the interventions are actually working, because I think that's one of the big problems. If the mandate is "do this thing" and then the implementation of that doesn't actually fix the problem, then really it's just lip service. Right. And so it is accountability that I think benefits everyone. It benefits the tech companies, it benefits business, it benefits regulators, and it benefits the families that we're trying to serve as well.

Assemblymember McKinnonassemblymember

And are we. One more thing. I'm so sorry.

Assemblymember Wicksassemblymember

Well, I was just going to say no. An interesting point on that is that Member Gabriel had a bill a long time ago, while I was here, on disclosures of hate crimes on social media platforms. It was challenged by industry and struck down by the courts under the First Amendment, saying that they did not have to disclose those materials, which makes it harder to track all of what she's saying. So I just thought I would point that out.

Assemblymember McKinnonassemblymember

Thank you.

Victoria Hinksother

That answers.

Lashawn Francisother

And do you mind if I add one thing really quickly? Every young person I've talked to has used one of the reporting features to report content or something happening online. Every single one of them said they've never heard back. It just kind of goes off into the ether.

Speaker Dother

Yes. Yes.

Assemblymember Buffy Wicksassemblymember

Thank you for the testimony. I don't know a single parent that feels great about their tech situation with their children, that thinks, this is awesome. Every time it's like a war, it's a fight. It's the parents trying to navigate something very complicated. I mean, I can't even navigate my own phone; I can't keep up with all of it myself. And then to manage your children's as well. So parents are just at their wit's end, in the most generous terms. And in the most horrifying terms, we hear testimony from Victoria and Paul, and I also want to recognize their testimony. This is obviously the worst-case scenario as a parent. So thank you for testifying. And that is like every parent I've ever talked to about this. When I do pick-up and drop-off, this is what parents are talking about. When taking your kids to birthday parties, going to soccer practice, it's all-consuming, and everyone's looking for a solution, and they need help. And they're eager for government to take action, because it feels like if it's a parent against a tech company, it's just an unfair fight, especially when the kids are often aligned with the tech company, because they want the product more and more and more. And so that's why a holistic approach, I think, is critical. Ms. Francis, I'd love to ask you a question. Is there any benefit to social media access for kids? And if so, at what age does that benefit outweigh the risk? The answer might be no, but I'm just curious, because I don't know the answer to that question. I'd love to know your thoughts.

Lashawn Francisother

So is this a personal question? You know, there's how I feel and there is what young people tell me, so I want to be clear about that. What young people say is that they see a benefit: it's how they are engaging politically, it's how they're finding jobs, it's how they're interacting with their school, and there's a social benefit. I remember a time before social media, so I'm not as convinced that we need it. Right. So my personal feeling is it's probably not that great of a product; we probably shouldn't expose it to children of any age. I know there's been a lot of conversation around 16. I think 16 is an arbitrary number. The science and data tell us that your brain doesn't really fully develop until 25, so I'm not really even thrilled about that. I know we would never get something through that banned social media for 25 and under. So I get that desire. But to me, that train has left the station, unfortunately. And one of the things that I am concerned about is creating an environment where young people feel like they have to sneak and use social media. That's what I'm also trying to avoid when I talk about this: I don't want to create this environment where they're hiding social media now, because that's even more harmful and more problematic. So, no, I don't love it. I'm barely on social media these days. Social media didn't come out until I was already an adult, so the impact was completely different. But they want to be engaged with the world differently, and I think we should make sure that it's safe for them to do so.

Speaker Dother

Right.

Assemblymember Buffy Wicksassemblymember

On that note, and I'm happy to entertain your response, but others as well on this: are all social media platforms created equal? Are you seeing any of the companies actually put forth meaningful guardrails? And again, the answer could be no, or I don't know, but I'd be curious about your thoughts.

Monica Buffonother

I think my website on this just went live today, so I will share that with you. But I think my answer is that a lot of them have areas where they're better than others, but I don't think there is one that is better than all the others. So, for example, Instagram teen accounts, I think, were an important step forward. TikTok has certain minimal safety standards that are quite good. So it really depends on the area. But the problem is that not one of the platforms right now is doing the right things across the board, across all the different controls. And I think that's where legislation is needed; that's where mandated standards and minimum standards and also parental controls are going to come in as really, really important.

And to your last question, very quickly: I think one problem that I see is that today kids go from activity to activity; they're so busy all day, all afternoon, they have no free minute. And so I think technology ends up becoming the solution to that. You can only talk to your friends for five minutes between soccer practice and tutoring. And I think this is in some ways a societal problem, where kids are expected to be in all these different activities and have no unstructured time to play and to be, just to be free and to be with each other. And so in some ways these technology companies have picked up on a need for kids to want to socialize as teens and to be independent. And I think we have to understand the ecosystem that they're operating in. This doesn't mean I think tech or social media is good or bad, but it does mean that if we take certain things away and there isn't space for that to be filled with real-life interactions, that's a problem too. So this is not really answering the question of should it be, but as a social psychologist, it's important for me to point out why we have this system, and maybe why it is that kids even want to be in these apps as opposed to being in person. So I thought that was important to mention.

Assemblymember Buffy Wicksassemblymember

Thank you. Assemblymember Pellerin.

Speaker Oother

Yeah, this is hard stuff. It's taken me a while to really digest everything, and I want to thank Victoria and Paul Hinks for being here. Your story is so powerful, and I know how hard it is to tell it. Thank you for being here and sharing Alexandra with us, and thank you all for your testimony. You've given us lots to think about. I mean, shouldn't we be designing safety systems from the very start? It seems like we're putting a lot of it on the parents to control. Is that happening at any level of speed and urgency?

Lashawn Francisother

Yes, we should be. No, we are not.

Speaker Oother

Okay.

Assemblymember Wicksassemblymember

I love how succinct Ms. Francis is. Okay.

Speaker Oother

I feel like I want to scream. Mental health is something that's very concerning to me, and the connections that we're seeing between social media use and youth anxiety, depression, and self-harm. Are there platform features that are most harmful to a healthy kid? Or, I mean, have we identified. Yes, there's.

Sunny Liuother

Yeah, I think that usually those harms are not equally distributed. They mostly target extremely vulnerable populations. And when I talk about vulnerabilities, mostly these kids have at-risk factors in their daily lives. They don't have supporting systems in their offline world, and the online setting doesn't have guardrails for them either. Kids who have, for example, an eating disorder, the algorithms drive them more toward that kind of content. So the algorithms enhance their offline vulnerabilities and make them even more vulnerable. I'm not sure if that answers the question.

Monica Buffonother

Very concretely, I think end-to-end encryption is a big problem. I think that's extremely unsafe for children in these chats. There's just no way, if a predator talks to them, it's even hard for law enforcement to track those conversations. So there are certainly features. I think private versus public posting, whether kids expose themselves publicly. I think there's group chats, so for bullying and harassment I'm quite concerned about them. Even in iMessage, you don't even have to go to social media and tech; there are school-wide text message threads going on in my own kids' school. So yes, there are definitely some features that are particularly concerning. I agree also about the way the algorithm is designed. And on your question on apps, there was famously the example of Instagram Kids that got shut down. And I think that tech companies probably do need more guidance on, when such apps are designed, how they should be designed, so that the attempts to do so can actually be successful. And of course, my personal opinion is that those apps should not be optimized for engagement, because I don't think that is ultimately safe; it will easily lead to these rabbit holes toward unsafe trajectories. But I do agree with you, and I think that is where regulatory support can really come in on what that should look like.

Speaker Oother

I'm grateful my kids are 28 and soon to be 31. I can't imagine raising young children in this environment right now. And quite frankly, I feel like we should just ban, you know, smartphones for kids age 16 and under. And I know you raised a good point, and that was good, because I need to hear that, because that's just how I feel. I feel like this is an evil device for them and this is hurting them and it's causing kids to take their own lives, and I can't stand by and watch that. I just want to take it all away.

Lashawn Francisother

Oh, I feel the same way.

Speaker Oother

Okay.

Assemblymember Wicksassemblymember

Yeah.

Lashawn Francisother

It's just not realistic, but I feel the same way.

Speaker Oother

I know. So I guess I'm just struggling. Other countries are taking, I think, bolder, more aggressive actions. Are those successful, and should we be thinking about those here? I know we're all trying to navigate this to the end path where everyone's happy and thriving and no one's having mental health crises.

Monica Buffonother

I think it depends. I mean, the honest answer is we don't know yet, because we would need a lot more data, and these interventions are all so new. I think there is a good chance that it will reduce the number of kids on these apps. But there are also kids that are moving to these newer, less safe apps that aren't as well regulated, that aren't affected by the regulation. And so my biggest concern with trying these things, and maybe I can't really weigh in on whether it's right or not right, but there are definitely concerns about whether it is keeping the child with less supervision safe as well. Right? Because if there are parents that are willing to help their kids circumvent, and give the kid a phone and tell the phone the kid is 18, at that moment that child is less safe. And I think it's very hard to say this is the right versus the wrong way. I think we should have a lot of data behind it when we make those kinds of decisions. But those are the trade-offs that I think about: where are kids wandering to, which kids are least protected, and so how do we keep them safe in the end? And I think those are all very tough questions that need a lot of data support that we just don't have yet. I think it'll be about monitoring those countries and what happens there.

Speaker Oother

Okay, so since I can't ban social media and phones for 16-year-olds, what does the research tell us about which safety tools actually work and which ones are largely ineffective?

Sunny Liuother

Yeah, I'll share a little bit more about the solutions in the third panel, but to briefly answer: I think safety tools work when, first, there's reporting from the bottom up, and the report really connects, so kids understand that action is taken. Kids have to feel that they're empowered, that the tools have efficacy and efficiency, and that those tools actually work. So the first part is that the tools actually work. The second is the education part. Kids have to know about the tools. Sometimes they don't want to report because they don't want to get their friends into trouble, so they never report, even offline. So we do need to educate our kids and families so they know these functions exist and how they work, and make those tools actually work. I think those are the two things that are really important to keep kids safe online.

What's currently missing, I think, is that in the offline world we have it all figured out. We have schools, we have people, we have communities, we have coaches. We built those circles of care and circles of support and circles of safety for the offline world. But in the online world we don't have that yet. That's why, very likely, we cannot protect those kids and hold them up, because we don't have those circles of protection, and I think that's where those safety fears come from.

Monica Buffonother

My answer is screen time works. Taking the phone away works too. So with my own kids, that's what I do. They have screen time controls, and the rule is they charge the phone in my room at night, and I constantly just take it and put it in my pocket. And I think that is about the extent of the parental controls on the device. I think the only one that I really trust is the screen time one, and all the other ones I think have holes right now.

Speaker Oother

I've actually implemented that for myself. So thank you.

Monica Buffonother

Me too.

Assemblymember Wicksassemblymember

But then you can just press that button that says ignore. Yes, I know.

Speaker Rother

Nine out of ten times.

Emily Cashman Kirsteinother

Thank you all.

Assemblymember Wicksassemblymember

And I think that some important points were made. I will say that with my own children, my mother-in-law was a second grade teacher her entire career, and when they were little they would FaceTime with her for hours. She had puppets, and she lives far away, and she read to them. It was honestly really positive connecting time that was happening through a device. And so I am a huge believer that there is a way to do it that is real connection with a real person.

And I will say that I love what you said about circles of trust online, because, and I come from a very large southern family, one of our safety mechanisms is that aunties get to follow their nieces and nephews online. And so I watch all my nieces' and nephews' Instagrams, and they know I'm watching, but it's different than their parents. And so I do think it is about building circles of love even in these spaces, of people who care about you and whom you trust. Again, these are not technological solutions, but they're important things to think about as we navigate the future in a way that really centers public health. We keep trying to talk about this in a public health centered way, because it's so important to remember that at the bottom of this problem, it's not technology, it's the health and safety and well-being of California's kids.

And I really appreciate all of you being here. Another thing that I wanted to point out was something you said, Ms. Francis, about kids reporting content and not getting a response. One of the things we are working on this year, which I hope will be successful as a bipartisan coalition, is a consumer-facing regulatory regime that will allow customers to come to California and say, this isn't working for me, I need your help, regulator. We do this for so many other industries, and yet we have not done this for technology. And I think that would be game changing. And so I hope that in the future, when kids do face that, they have the state to turn to. And I really appreciate us continuing to have this conversation in a way that helps our kids, because their lives matter.

And so I want to close, lastly, by just reiterating my immense gratitude for both Victoria and Paul, with whom I know I've had many conversations. And you say that coming here and telling your story is part of what empowers you every day. But I am just so grateful for it, because as a mother, as an auntie, as someone who cares deeply about California's children, at the end of the day, we want to make sure that no parent experiences what you experienced. And it will take hard work to do that. And you just remind us that that work is worth doing and showing up for every day. So thank you.

Assemblymember Wicksassemblymember

And with that, I will turn to the next panel. Thank you, guys. So the next panel is our industry panel. We're going to hear about these online tools, some of which, I know from my own experience, have been updated, so we may get some updates on what is new and exciting online. First, we have Nicole Lopez, who's the director of Global Litigation Strategy at Meta. You guys can sit wherever you'd like. We have Emily Cashman Kirstein, and I apologize if I'm butchering Kirstein. Thank you. Child safety manager at Google. Lauren Haber Jonas, head of youth well-being and families at OpenAI. And Eliza Jacobs, senior director of product policy at Roblox.

Assemblymember Wicksassemblymember

And I will say that I didn't plan to have all-female panels. We didn't choose who's here, but I'm not, not mad at it.

Assemblymember Wicksassemblymember

So with that, we will turn it over to whoever is supposed to be first.

Emily Cashman Kirsteinother

Nicole Lopez from Meta.

Nicoleother

First, do I need to press anything here or can you hear me? Okay. All right.

Assemblymember Wicksassemblymember

Okay.

Nicoleother

So I also want to thank Victoria and Paul. I appreciated it, and it meant a lot that you shared your story today. And, Chair, as well as assembly members: I'm Nicole. I'm here testifying on behalf of Meta. But first and foremost, like you, I'm a parent. I have two tweens who are online quite a bit. Screen time is the battle that we fight often in our household. I'm also here as a California resident, born and raised in Oakland, where I live five minutes from my parents today. I joined Meta roughly three and a half years ago, and I've continued to work on both the policy and the legal side of the house on what I care deeply about, which is the safety and well-being of young people. I have done this for the bulk of my career, both in the private and public sectors, including eight and a half, almost nine years as a prosecutor in California, where I did two stints in the domestic violence unit. I worked on child endangerment, child abuse, and child exploitation cases, and then I worked in the community violence reduction unit, where I focused on violence impacting teens and their families. I care deeply about protecting young people online, as well as supporting their parents, which we've touched on today: parents are supporting their teens in navigating these online spaces. I want to talk first about Meta's approach to teen safety, because I think it's really important as a backdrop for how we build these features and experiences for teens. At Meta, our teams work together to build safe, positive, and age-appropriate experiences for teens and their families. But in order to design products with the right mitigations that support the users who are actually using them, and we've been talking about this today, it's critical, and it's complicated, as has come up today as well, to bring the right voices into the room. And there are a lot of voices that matter. Teens.
You have regulators, policymakers like yourselves, internal experts at Meta, as well as external experts who are going to have different focus areas and who come with a blank slate, because they're not actually working at Meta and have their own experiences to bring to bear. But importantly, and relevant to the question that you posed at the beginning, we need to listen to parents. No kid is the same, no teen is the same. I say this from personal experience, having two very different boys who are 10 and 12, and parents know their teens best. In terms of the approach that we take to building, it is not a one-and-done, static experience. As technology changes, and Assemblymember Wicks talked about this, it's evolving really quickly. It is complex. We have moved into a different era than not just the AOL chats, but even four years ago. It's constantly shifting, and so we need to continue to listen, to build, and to improve. It's not static. And as I'll discuss, we have to get parents' feedback, and it's not just about parental controls. I want to make sure this is not a dichotomy. Parental controls are important, very important, but so are the baseline experiences that need to be protective of all teens who are using the apps. And in terms of how we get parents' feedback, we do it in a number of ways. One way that I've been deeply involved in includes listening to parents live, in person. Meta has hosted Screen Smart events in California. I hosted one in San Francisco, and we've had them in LA and San Diego, where we provide hands-on workshops for parents so that they actually understand how the tools and experiences work. We want parents to feel confident about raising their teens in an increasingly digital age. And we also want to make sure that they have boundaries and protections that are going to work for each family, because again, it's not just that every teen is different; every family is different in what they want.
So I want to take a step back and share some of the work that we've done to address parents' concerns, some of which actually predates my joining Meta. Before I joined, we started building out a number of parent supervision tools. I'm not going to spend a lot of time on every tool that we've built, because there are a lot. I just want to highlight some that I think give you an understanding of how things have shifted over time. We've given parents the ability to view how much time their teens spend on Instagram, set time limits, get notified when a teen reports an account or content, view what accounts their teens follow and the accounts that are following their teens, and see who their teen has been speaking to in the last seven days, again, hoping that parents feel empowered to have conversations with their teens. These conversations, as I said, are ongoing, and they're continuing to shape and improve how we design experiences for teens. More recently, in the last two years, again, this is a trajectory that continues to develop, parents said they wanted to feel more confident around their teens' social media use without having to worry about their top three concerns, which, again, shift over time: what content their teen is seeing, who their teen is talking to, and how their teen is spending their time. And that's why, in September 2024, we launched Teen Accounts, which was talked about earlier today, for Instagram, Facebook, and Messenger. And I think this is really important: all teens are defaulted into protective settings that address those three concerns. Who talks to their teens? We limit messaging. We limit the content that teens see, and we make sure that time is well spent by putting teens into sleep mode at night. And again, any teen under 16 cannot wiggle out of these defaults, these strict settings, without a parent allowing them to do so. We also heard from parents more recently that they have different views on what's appropriate for their teens.
Think about this as a parent. Parents looked at content on Instagram, thousands of parents looked at millions of pieces of content, and they all had different views in their feedback on what was age appropriate. We took that feedback and distilled it into how we draw lines across content that teens can see, and that expanded, again iterating and improving, the Teen Accounts experience. We revamped our content policies, inspired by PG-13 movie criteria and, more specifically, parent feedback. That means that teens under 18 are now automatically placed into these 13-plus experiences, and they'll see content similar to what they'd see in an age-appropriate movie. They also can't see 18-plus content anywhere, whether it's recommended, posted by a friend, or something they're searching for. We also listened to parents, and they told us they may not want their teens to see even 13-plus experience content, because again, not every teen is the same; one 13-year-old may not be as mature as another 13-year-old. So we created an even more restrictive setting that parents can put their teens in. Again, every family is different. We took in that feedback, and we implemented that feedback. We've also taken a similar approach to providing age-appropriate interactions for teens who use our AI. Teens can access information and educational opportunities through Meta's AI assistant, again with default age-appropriate protections in place, and we're continuing our work to give parents insights into those conversations. We're again using content guidelines inspired by movie ratings for 13-plus, meaning that AI should not give responses that would feel out of place in an age-appropriate movie. The other recent announcement that we made, which was highlighted earlier today, is that Instagram will start notifying parents in supervision if their teen repeatedly tries to search for terms related to suicide or self-harm within a short period of time.
The vast majority of teens are not looking for this content, but when they do, we already have a policy in place to block those searches and to direct them to resources should that happen. These new alerts, though, are designed to make sure that parents are aware if their teen is repeatedly trying to search for this content, and to give them the resources they need to support their teen. And again, we worked with experts on this, but we heard directly from parents that they wanted to know, and we incorporated that feedback. I really want to revisit something that's been raised today, because it's been said so many times across a variety of conversations: parents, myself included, are feeling overwhelmed. Teens, and I'm sure Australia will come up at some point during the conversation, are fleeing to apps that we've never heard of. Teens use, on average, according to a University of Michigan study, 40 apps per week, and parents have no idea what they're doing. And again, we supported Assemblymember Wicks' bill to require operating system providers and app stores to implement an age assurance signal. That's important because in order to get teens into age-appropriate experiences, you absolutely need to know how old they are. And everybody here at the table will tell you it is complicated and it is hard to know how old somebody is. So we applaud that bill for passing; we supported it. But I think what we're getting at here today is that parents want visibility into what their teens are doing online. They want to be able to decide whether their teen is ready for an app or not. And that's why we've supported OS and app store legislation that requires app stores to get a parent's approval before their teen downloads an app. Under this approach, if a teen attempts to download an app, the parent gets a notification on their phone, and it's a one-stop shop: they approve it or they don't.
And again, it addresses parents' concerns that they don't know what's going on, and it puts them in the driver's seat. It still requires all of the apps to do the work to create age-appropriate experiences. That work is not done; it's work that we're still going to be doing. I want to close with this. I actually know the people at this table, and I think industry-wide, and certainly at Meta, we all care. We're all parents. We care about creating safe experiences. We've been told by an expert today that teens want to be online. My experience of AOL chat rooms, which I did get on when I was 16, is not the experience of my kids today. It is here to stay. We need to support them, and we need to do so in a way where we're part of the solution: empowering apps to continue doing the work that they're doing, but also making sure that parents are in the loop, have visibility, and can support their teens, while continuing to require that we develop protective experiences for teens as a baseline. Thank you.

Speaker Dother

Thank you.

Assemblymember Wicksassemblymember

And I'll say, as a kid that was in those AOL chat rooms, there was filth in there too. So, not a safe

Nicoleother

Experience, which I shared with another person here. Not safe.

Monica Buffonother

Yes.

Assemblymember Wicksassemblymember

No, I would agree with that. Through lived experience.

Speaker Dother

Okay.

Assemblymember Wicksassemblymember

And now we will turn to Emily Cashman.

Emily Cashman Kirsteinother

Kristin. Kirstine. Kirstine.

Assemblymember Wicksassemblymember

But I got it.

Emily Cashman Kirsteinother

Nope, not at all.

Speaker Oother

Yes.

Emily Cashman Kirsteinother

So I'm Emily Cashman Kirstein. I lead child safety public policy at Google. And I'd also like to thank Mr. and Mrs. Hinks for being here, for sharing your story, and for your advocacy. I come to this job from industry today, but I've also worked on the NGO side. I led public policy.

Speaker Nother

Oh, sorry.

Emily Cashman Kirsteinother

I led public policy work at Thorn, the nonprofit combating child sexual abuse material online, and on the government side, working in the U.S. Senate. I appreciate the opportunity to be with you all today to talk through parental tools, but also how Google frames it in the larger context: how we're thinking about building for kids and families overall. I think you all have slides, and we have them up here. As Assemblymember Wicks said, there have been a lot of updates, and we wanted to put those in front of you all today. Our overarching mission at Google is to organize the world's information and make it universally accessible and useful. And when it comes to youth, we want to be doing that in a way that offers them the benefits and the utility of the online world with the appropriate safeguards in place. And that last part: bolded, underlined, underscored, all of that. Meaning, of course, we want to protect kids in, not from, the digital world. How we're doing that is based on these three pillars here. The first is protect. This refers to everything from baseline protections for all users, including our industry-leading efforts to combat child sexual abuse material and exploitation online, to default settings that we have for under-18 users, backed by age assurance. Respect is the core of what we're talking about today, which is parental tools: knowing that each family has a different relationship with technology, how do we respect that? And third, the empower pillar is how we're building those enriching, not just okay, activities; how we're building enriching educational experiences for youth online and building the digital skills of the future, learning to use the latest technologies, again in that safeguarded environment. So starting with protect: we have default settings for under-18 users even before we get to parental tools, and I'm going to go through a little bit of these here.
So on search, for example, we have SafeSearch on by default, which helps filter explicit content. Location sharing is off by default, and 18-plus apps are blocked on Play. We'll get into YouTube and Gemini in a bit more depth, but I do want to emphasize here that Google does not serve personalized ads to minors. And on YouTube, regardless of parental tools, again for all under-18 users, we've built protections into our personalized recommendation systems to ensure that teens aren't overly exposed to specific kinds of content: content that, while not violating our policy guidelines, may be innocuous in a single view but could potentially become problematic if recommended repeatedly. We worked with independent experts, YouTube's youth and family advisory council, to develop these content categories, and we continue updating them. We also have take a break and bedtime reminders on by default. The take a break reminder is a full-screen takeover; the default setting is for an hour, but parents can adjust that as needed.

And to properly ensure that those under-18 default settings are getting to the right users, we have rolled out age assurance on our own first-party platforms, and we're also working toward compliance, of course, with AB 1043, the approach to responsibly share signals across the broader app ecosystem. So how do we do that? First, of course, we start with declared age, starting from somewhere. Then we run an inference model. So without taking more information from the user, we're looking at things like: has this account been around for 20 years? Probably not a minor. Are they searching for mortgage rates and tax assistance? Again, probably not a minor. That goes into how that inference model works. If the model is unsure that this is an adult, and that user tries to access something that would otherwise be age-gated, like a music video on YouTube with explicit lyrics, they will be prompted to confirm their age, whether that's through an ID, and we know not everyone wants to offer an ID, so we also offer selfie, email lookup, credit card verification, things like that. So getting into the parental tools themselves: all the protections I was speaking about before are defaults for under-18 users, before we even get to parental tools. And the premise is that no one family and no one child is the same. Of course we've talked about this, we've heard about it, and we have to build with that reality. We've had Family Link since 2017; that's our flagship parental tool for Google. But we have of course heard, as we've heard today, that parents are overwhelmed. They want quick and easy setup, and they want options that fit their families best. So in addition to Family Link, this past year we announced parental device controls right on the device. At the point of a parent having the device, they can set up things like screen time and web filters and approve or block apps. That exists now, and it's all backed by a PIN that the parent knows, right there on the phone.
If the parent would like a more robust experience with parental tools, that's where Family Link comes in; the ones before are just on the device. Family Link is an app that parents can have on their own phone for a more robust, remote experience. As it stands now, they can block and approve apps right through Family Link, block or approve websites, set screen time settings, and set school time, which makes the phone not work during the day at school. All of those exist right now through Family Link.

Assemblymember Wicksassemblymember

Is that all free?

Speaker Dother

I heard a question.

Speaker Gother

Yeah.

Emily Cashman Kirsteinother

And child accounts, I should say, remain in a supervised state after they turn 13 unless the parent approves removing that supervision. This helps make sure that those decisions are made as a family. And again, in talking through all of the ways we were incorporating parents' feedback, we heard that it took too long to set up YouTube accounts in the YouTube app. So we rolled out, within the past couple of months, an easier way for parents to set up YouTube accounts and, just as important, to toggle back and forth between a parent's account and a kid's account. It's incredibly important, as we know, for minors to be on their own account to be able to take advantage of the default settings we talked about and of the parental tools that their parent has set up. Another piece to this: we just rolled out a YouTube Shorts timer. This allows the parent to decide how much time may be appropriate for their child to spend on YouTube Shorts. An important piece to note is that the timer can go down to zero, so parents can decide if they don't want Shorts on at all for their child. And the last pillar is empowerment.

Assemblymember Wicksassemblymember

I'll wrap up.

Emily Cashman Kirsteinother

This is about using technology to help young people learn and create and explore. One of the most important things here, top of everyone's mind of course, is generative AI. We want youth to have access to the benefits and the opportunities that come with it, but, as we said before, with those appropriate safeguards in place. A bit on the safeguards themselves. Before rolling out the youth experience on Gemini in 2023, we worked with our in-house team of researchers, cognitive psychologists, and child development experts, in addition to an independent youth advisory council that we have at Google, to develop policies and protections for youth. And recognizing that youth could be more vulnerable to developing an emotional connection with AI, we built persona protections for youth into Gemini from day one, since 2023. So for younger users, Gemini is designed not to say "I love you," not to say "I need you," and not to make any explicit claims of humanness or of feeling emotions. We've additionally built protections against sexually explicit content, dangerous activities, age-restricted substances, violence and gore, medical advice, and unhealthy behaviors. Those are all baseline protections in Gemini, many of them for all users, but especially for under-18s. And our suicide and self-harm protocols refer users to crisis service providers and encourage them to seek real-world support and help from someone they trust. So we're committed, again, to empowering both parents and youth to explore Gemini responsibly. We've heard a lot about parents wanting more resources, and I should say that for Gemini, parents are in control to decide if it's right for their child or not. But if they would like more information, we offer AI literacy guides.
Some have been designed specifically for teens, for their developmental stage, along with family conversation guides for talking as a family about how to use AI. Both of these help reinforce the importance of knowing the limitations of AI, how to think critically about responses, and how to double-check answers as needed. We also offer things like podcasts for parents and a video series on how to use AI with your children. We're always looking for new ways to make Gemini usable and useful for youth. As an example, we recently announced a partnership with the Princeton Review to make free, on-demand SAT prep available within Gemini. So I know I've gone on a little bit.

Assemblymember Wicksassemblymember

You have a lot of products.

Emily Cashman Kirsteinother

Yes, there's a lot to go through and this is really complex. But I hope that we were able to show the many different layers through which we're thinking about this, of which parental tools are just one, all under the umbrella of wanting youth to have the benefits of this technology with those appropriate safeguards in place. Thanks.

Speaker Dother

Thank you. And I will say the only one of these products that my kid has is YouTube and I didn't know about

Assemblymember Wicksassemblymember

a lot of this, so I learned myself.

Speaker Dother

So I think the education piece is really important. I know I could turn Shorts off. He will not be happy when I

Assemblymember Wicksassemblymember

get home and that's the next thing I do. And then is Family Link available even if you're on an Apple device or

Lashawn Francisother

do you have to be?

Assemblymember Wicksassemblymember

Okay, Yep, I was curious about that. Okay.

Speaker Dother

We will obviously have more questions, but I just baseline.

Assemblymember Wicksassemblymember

Now we will turn it over to Lauren Haber Jonas, head of Youth Well-

Speaker Dother

Being and families at OpenAI.

Lauren Haber Jonasother

Thank you so much. First, like my colleagues, I want to thank Victoria and Paul for the time

Speaker Gother

and for the testimony here today.

Lauren Haber Jonasother

As a, as a parent, I cannot

Speaker Gother

imagine the experience that you've had.

Lauren Haber Jonasother

Good afternoon. Chair Bauer-Kahan and members of the committee, thank you for the opportunity to

Speaker Gother

be here and to testify and for

Lauren Haber Jonasother

your leadership on youth safety. My name is Lauren Haber Jonas. I lead youth well-being and families at OpenAI. In particular, I come at this as a builder: I lead product and engineering, not only policy, for OpenAI. My teams are the ones building these things.

Speaker Gother

We build parental controls, we build age

Lauren Haber Jonasother

assurance technologies, we build age verification. So we understand deeply the technical requirements

Speaker Gother

and how difficult it is to do this well and what the opportunity is

Lauren Haber Jonasother

and any limitations might be. I have been doing this for 10 years, so this is very much my life's work. I have been building both on the product and the engineering side in youth safety at large companies, at small companies, at my own companies as an entrepreneur for 10 years. So our goal here when I got to OpenAI two years ago, from nearly the moment that ChatGPT launched, was to

Speaker Gother

build this with youth safety at the

Lauren Haber Jonasother

start, from the moment that this was

Speaker Gother

in the hands of teens again, this

Lauren Haber Jonasother

is my life's work and core to

Speaker Gother

the mission of the company.

Lauren Haber Jonasother

I'm also the mother of three young children, all seven and under. I don't sleep a lot, if you see the bags under my eyes, as many have said. So I think about this both professionally and personally. We appreciate the committee's focus on parental controls as AI becomes more integrated into how young people learn, create, and explore information. The companies developing these technologies have a responsibility to build the protections in from the start and also to give families meaningful tools. At the same time, it's important to recognize that generative AI systems like ChatGPT operate differently than social media platforms. ChatGPT does not have feeds. We do not have engagement algorithms or public posting. We have only been available since November of 2022. But precisely because this technology is new and so powerful, we have focused on building these strong protections and learning from

Speaker Gother

the lessons of platforms that have come before us.

Lauren Haber Jonasother

I'll talk a little bit today about the approach we're taking, the partnerships that guide our work, the multi layered approach. Again, not just relying on parents as

Speaker Gother

some of my peers have stated, parents

Lauren Haber Jonasother

and parental controls to guide families on how best to make sure that their

Speaker Gother

teens are using these tools responsibly.

Lauren Haber Jonasother

So fundamentally at OpenAI our belief is that young people should be able to

Speaker Gother

benefit from these tools, whether that means

Lauren Haber Jonasother

learning, exploring ideas or developing new skills. Learning is one of the most common

Speaker Gother

use cases on ChatGPT today.

Lauren Haber Jonasother

1 in 3 US students use it to study. Many use it as a learning support tool: creating practice quizzes and study plans, reviewing drafts of assignments. It is a tool that helps them test their knowledge and clarify difficult concepts. And for many students, this kind of personalized support was previously only available through one-on-one tutoring. These benefits are immense, but they must

Speaker Gother

be paired with intentional safeguards and responsible design. As we've said.

Lauren Haber Jonasother

One of the things that we said publicly from the start is that our

Speaker Gother

approach to this is a priority of

Lauren Haber Jonasother

safety ahead of privacy and freedom for teens, full stop. This is a new technology, it is a powerful technology and we believe minors need significant protection. We have said this, our CEO has said this a number of times before. This is a very serious responsibility that we take both to our teen users and to their parents to have a

Speaker Gother

layered set of protections.

Lauren Haber Jonasother

I want to talk a little bit about how we partner with experts. One of the things that we have learned from companies that have come before us is that we cannot solve youth safety challenges on our own. We have built two external, third-party organizations that we partner with, the first being an Expert Council on Well-Being and AI. The folks on that council are researchers who study youth development, mental health, and the effects of technology. They come from Boston Children's Hospital, Georgia Tech, Northwestern, and the University of Oxford. We have also built a global physicians network. So this is a network of 250

Speaker Gother

clinicians and physicians across over 60 countries.

Lauren Haber Jonasother

So the goal here is a global lens, not purely a domestic lens that guide and help evaluate how our systems respond and help guide our policies and our principles and the content restrictions we have in place. Beyond that, we work closely with organizations

Speaker Gother

that have long been leaders in the space. We work with Common Sense Media, the

Lauren Haber Jonasother

American Psychological Association, AFT, and ConnectSafely. Today, in fact, I'm here and not there, but we're hosting a convening of a cross-sector group of leaders, CEOs of the nation's leading mental health organizations, the American Psychological Association and others, to help guide our work in mental health for youth and for adults. That is happening today in our San Francisco headquarters, in

Speaker Gother

this particularly unique convening. There we go.

Lauren Haber Jonasother

Building on this input that we get from third parties, we introduced what we call our Teen Safety Blueprint. The blueprint is meant to serve as both an internal framework for every team building within OpenAI and as a starting point for broader policy conversations about responsible

Speaker Gother

AI and young people.

Lauren Haber Jonasother

And it has a number of pillars. The first is, as we've talked about, is identifying users under the age of 18 and that is age estimation as

Speaker Gother

the approach, the initial approach we've taken.

Lauren Haber Jonasother

The second is a default safety layer

Speaker Gother

of protections once those teens are identified.

Lauren Haber Jonasother

The third is a layer on top of that, that offers parents the ability

Speaker Gother

to have control, as we've talked about here quite a bit today.

Lauren Haber Jonasother

The fourth is designing systems that support

Speaker Gother

that are not just a safety floor, but support well being. What does that mean? How do we support the well being of teens, not just the baseline safety for teens.

Lauren Haber Jonasother

And then the last is transparency.

Speaker Gother

The goal here is to be as

Lauren Haber Jonasother

transparent as possible about our approach. The moral of the story here is that no single safeguard is sufficient on its own. We have taken a multi layered approach here, all working together. Product design, behavioral policies, parental tools, consultation with experts, and most importantly, we work in the open.

Speaker Gother

So we have published what we call

Lauren Haber Jonasother

our Model Spec, which are principles that guide how our AI systems behave. So this guides how the model is built and how the model should be

Speaker Gother

steered when interacting with teens.

Lauren Haber Jonasother

There is a specific section of the

Speaker Gother

Model Spec that is dedicated to teens and to teen safety which has been published and we're happy to share with the committee.

Nicoleother

Ooh, backwards one.

Lauren Haber Jonasother

I want to talk a little bit about the content restrictions that we have

Speaker Gother

in place for teens.

Lauren Haber Jonasother

And again, these are default on for teens when a teen is identified. Our system should not romanticize self harm or suicide. We should not engage in immersive role play with minors. They should avoid reinforcing harmful body ideals. They should encourage young people to seek

Speaker Gother

support from trusted adults outside of the

Lauren Haber Jonasother

technology when facing difficult situations. Again, these are behavioral guardrails. These are content guardrails that are a foundation. They are not the only mitigation, but they are the foundation on which everything is built. Now I want to turn a little bit to parental controls. We introduced a set of parental controls in the fall. And our overarching goal as a product

Speaker Gother

and an engineering team was not to just build a new settings page, but

Lauren Haber Jonasother

it was to lead the industry and to pull the industry with us. And we'll talk a little bit about how we did that and how we

Speaker Gother

feel we've done that.

Assemblymember Wicksassemblymember

I want to talk a little bit

Lauren Haber Jonasother

about our parental controls and how we feel that this is empowering families and educators. The protections reduce exposure to the types of content described that research shows may

Speaker Gother

be harmful for adolescents.

Lauren Haber Jonasother

So this is based on teen developmental psychology. Parents link their account to their teen's account and manage settings from a single dashboard. It allows parents to tailor the experience. In particular, the setup process is very straightforward and can happen in either direction. A teen can invite their parents to parental controls, or a parent can invite their teen.

Speaker Gother

It goes both ways.

Lauren Haber Jonasother

If a teen later unlinks their account, a parent is notified.

Nicoleother

If a teen asks to change a

Lauren Haber Jonasother

setting, a parent is notified. They cannot do that on their own.

Speaker Gother

This is only available for parents. So the goal here was to design

Lauren Haber Jonasother

a system that encourages communication between parents and teens and is transparent on both sides. A teen can't do anything in terms of editing these controls their parents don't

Speaker Gother

know about and vice versa.
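The two-way transparency described here, where no change on either side of the link is silent, can be pictured as a small state object. This is a minimal sketch with invented names, not OpenAI's implementation.

```python
class LinkedAccount:
    """Sketch of a parent-teen account link in which every teen-side
    action generates a parent notification (illustrative names only)."""

    def __init__(self, parent: str, teen: str) -> None:
        self.parent, self.teen = parent, teen
        self.linked = True
        self.notifications: list[str] = []

    def teen_requests_setting_change(self, setting: str) -> None:
        # Teens cannot change controls on their own; the parent is
        # notified and must approve the change.
        self.notifications.append(
            f"notify {self.parent}: {self.teen} requested change to {setting}")

    def teen_unlinks(self) -> None:
        # Unlinking is allowed, but never silent.
        self.linked = False
        self.notifications.append(
            f"notify {self.parent}: {self.teen} unlinked their account")
```

The point of the design, per the testimony, is to force a conversation: the teen keeps some agency (they can unlink, they can request changes), but nothing happens invisibly.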

Lauren Haber Jonasother

Once accounts are linked, there are a number of different controls that a parent has. So the goal here is to get as granular as possible. A parent should be able to turn on and off image generation, on and off voice mode, on and off.

Speaker Gother

The sensitive content restrictions they have.

Lauren Haber Jonasother

Maybe for their family, they're comfortable with their child seeing more mature content. Parents can also receive alerts if the system detects possible signs of suicide and distress, and we'll talk about that in a little more detail. And they can opt their teen out of model training. The goal is to give parents options that are as granular and flexible as possible, in as simple a way as possible. All of these parental controls are default on; a parent does not have to opt in. And I want to talk a little bit about safety notifications and how we built this. This launched last fall. We were the first in the industry to build it, and we're heartened to see some of our peers follow us in that regard. What this is is the following: it is a safety notification system, an industry first, and it doesn't require an opt-in. If you are in parental controls, you do not have to raise your hand as a parent and say you want to receive safety notifications. It is on by default. And we will notify you in three

Speaker Gother

ways: in ChatGPT, via text, and via email.

Lauren Haber Jonasother

We could do it via carrier pigeon

Monica Buffonother

if we, if we had any ability to, we would.

Lauren Haber Jonasother

But the goal is to get to a parent and to share that a teen is prompting for distressing content. The content that a teen is prompting for is never shared with the parent. So we understand and value the privacy of teens. We are not sharing the specific prompt

Speaker Gother

and generation text that a teen is prompting. But the goal is to encourage a

Lauren Haber Jonasother

parent to take action for them to

Speaker Gother

have enough information for a parent to take action.

Lauren Haber Jonasother

One thing that is important to note is that when a teen is prompting for distressing content, before a parent notification is triggered, that content goes to trained, full-time human employees inside OpenAI for review

Speaker Gother

to make sure that that content is

Lauren Haber Jonasother

what we think it is, that we haven't had a false positive or done this in an incorrect way.
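The pipeline described, detection, then human review, then a category-only alert that never forwards the teen's actual prompt, might be sketched like this. The keyword list, category names, and message format are illustrative assumptions, not OpenAI's real classifier or alert text.

```python
from enum import Enum
from typing import Optional

class RiskCategory(Enum):
    NONE = "none"
    SELF_HARM = "possible signs of self-harm or suicide"

def classify(prompt: str) -> RiskCategory:
    """Toy keyword classifier standing in for the real detection models."""
    flagged = ("suicide", "self-harm", "hurt myself")
    if any(keyword in prompt.lower() for keyword in flagged):
        return RiskCategory.SELF_HARM
    return RiskCategory.NONE

def maybe_notify_parent(prompt: str, human_reviewer_confirms: bool) -> Optional[str]:
    """Category-only alert: the prompt text itself is never forwarded,
    and nothing is sent until a trained human reviewer confirms."""
    category = classify(prompt)
    if category is RiskCategory.NONE or not human_reviewer_confirms:
        return None
    # Delivered over several channels (in-app, text, email), on by default.
    return f"Alert: your teen may be prompting about {category.value}."
```

Note the two privacy-preserving properties the testimony emphasizes: the outgoing message contains only a category, never the teen's words, and a human gate sits between detection and notification.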

Speaker Gother

Before we send a notification to parents,

Lauren Haber Jonasother

we love that this has become industry norm and this is one of the

Speaker Gother

ways that we hoped to sort of

Lauren Haber Jonasother

pull the industry along in the parental controls space. As some of my colleagues have noted, our work is not done here. We are continuing to learn. We are continuing to improve. We partner with some of our friends over at Common Sense Media. We believe these are first steps; this is not the end. Additionally, because we know that parents need additional support and guidance, and teens need support and guidance on how to use our tools, we have family guides on how to use AI responsibly and a set of conversation starters for parents. These resources were developed with input from safety experts and organizations like

Speaker Gother

ConnectSafely and Common Sense Media.

Lauren Haber Jonasother

I want to end with recognizing that protecting young people online is an ongoing responsibility. No single company, product, feature, or law will solve these challenges on its own. We believe that progress comes from thoughtful guardrails, transparency, collaboration with experts, and empowering families. In fact, today we joined a group of kids' safety advocates, community groups, and other organizations as part of the Parents and Kids Safe AI Coalition to pass what we hope will be the nation's strongest child safety AI law. We appreciate the committee's work in this area. We look forward to continuing to partner

Speaker Gother

with you and thank you for the opportunity to testify.

Speaker Dother

Thank you.

Assemblymember Wicksassemblymember

I just want to clarify on question, you said that when parents get that notification, it doesn't say what the prompt

Speaker Dother

was, it just gave them a category.

Speaker Gother

Yes.

Monica Buffonother

It says.

Assemblymember Wicksassemblymember

Yeah, so it would say suicidality, for example.

Speaker Gother

It'll say your teen is prompting for suicidal content.

Assemblymember McKinnonassemblymember

Okay.

Eliza Jacobsother

Yes.

Assemblymember Wicksassemblymember

Turn my mic off. Now we're going to turn to Eliza Jacobs, who is not sitting here, but

Speaker Dother

her assistant and very talented government relations colleague is. So Eliza should be online.

Speaker Nother

Eliza, do we have you?

Eliza Jacobsother

Hi, everyone.

Emily Cashman Kirsteinother

Perfect.

Speaker Rother

Can you hear me?

Emily Cashman Kirsteinother

Great. Yep.

Sunny Liuother

Hi.

Eliza Jacobsother

Thank you so much for having us today.

Speaker Rother

And thank you to all the previous speakers. I think it's just a testament to how much this needs to be a group effort for all these different components to come together and talk about this important issue.

Eliza Jacobsother

And also, thank you so much for letting me testify remotely.

Speaker Rother

It lets me be home with my

Eliza Jacobsother

kiddo for dinner tonight.

Speaker Rother

So I really, really appreciate it.

Eliza Jacobsother

As Chair Bauer-Kahan said, my name is Eliza Jacobs and I lead policy at Roblox. First of all, I don't know how many people know what Roblox is, but Roblox is an immersive gaming platform.

Speaker Rother

People can connect with their friends and family and play and explore. Molly, you can go to the Next slide.

Eliza Jacobsother

We have over 150 million daily active users all across the world. About 66% of them are over 13. But that means there's a significant portion of our users that are under 13.

Speaker Rother

And we have always been an all-ages platform, which has really informed our approach to safety over our 20-year history. Next slide.

Eliza Jacobsother

Do we miss, do we miss the slide there? No.

Paul Hinksother

Okay.

Eliza Jacobsother

Yeah. So Roblox has been around for a while. We've always been an all ages platform and as a result we've always built with safety at our core. We have a multi tier, multi level approach to safety.

Speaker Rother

As many people have noted today, there

Eliza Jacobsother

is no one tool that is the

Speaker Rother

silver bullet for safety. You have to have many layers and many tools to keep your community safe. And that's what we do at Roblox.

Eliza Jacobsother

So we start with robust policies. Can we go back, Molly? Yeah, we start with robust policies. Our policies are purposefully more restrictive than most of the Internet.

Speaker Rother

Again because we're an all ages platform.

Eliza Jacobsother

We don't allow profanity, for example on the platform.

Speaker Rother

We don't allow any references to drugs or alcohol on the platform. We are optimizing for the safety of our youngest users in our policies. We also have robust automated moderation systems. At our scale, you need to have AI working in partnership with humans to moderate the content on the platform. We then have teams of human experts doing human moderation for more complex cases. We have a team of deep subject matter experts on all manner of child safety issues: grooming, suicide and self-harm, terrorist content, all of that. We have a team of internal investigators that work on those more complex issues.

Eliza Jacobsother

And we also have a wide variety

Speaker Rother

of safety partnerships with NGOs, with Common Sense Media. You know, we work with all the organizations that people have spoken about earlier today.

Eliza Jacobsother

And I also want to highlight that we have a teen Council and a global parent council and those are groups

Speaker Rother

of users and parents that engage with the platform where we're constantly talking to them about what they want to see, what would be helpful for them.

Eliza Jacobsother

We think it's really important to value

Speaker Rother

the teen voice and value the parent voice in all of these conversations.

Eliza Jacobsother

So there are, as I said, many layers of safety on the platform. To start with, on communication safety: we do not encrypt any of our communication, so all of our communication can be monitored.

Speaker Rother

We have AI models running in the background constantly to monitor for grooming and other critical-harm behaviors.

Eliza Jacobsother

We have internal experts that are looking

Speaker Rother

at that communication and reaching out to law enforcement where necessary. We think it's really important when we're talking about kids that we're not encrypting communication.

Eliza Jacobsother

We also have a text filter that operates on communication on the platform.

Speaker Rother

So we're filtering inappropriate communication before it

Eliza Jacobsother

can be sent to other users. And specifically, it's designed to block the

Speaker Rother

sharing of personal identifying information.

Eliza Jacobsother

So kids can't be sharing phone numbers, addresses, Instagram handles, anything that would make it

Speaker Rother

easier for people to meet up with them offline or online, but on another platform.
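The pre-send filter described, blocking phone numbers, addresses, and social handles before a message ever reaches another user, can be pictured as a redaction pass over outgoing chat. This is a minimal sketch with invented patterns and an invented redaction token; a production filter would be far broader and more sophisticated.

```python
import re

# Illustrative patterns only; not Roblox's actual filter.
PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),   # US-style phone numbers
    re.compile(r"@\w{2,}"),                              # social handles, e.g. Instagram
    re.compile(r"\b\d+\s+\w+\s+(street|st|ave|avenue|road|rd)\b", re.I),  # street addresses
]

def filter_outgoing_message(text: str) -> str:
    """Runs before a message is delivered to other users: redact anything
    that could help someone take a conversation off-platform."""
    for pattern in PII_PATTERNS:
        text = pattern.sub("####", text)
    return text
```

For example, `filter_outgoing_message("call me at 555-123-4567 or find me @coolkid99")` would redact both the phone number and the handle before delivery.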

Eliza Jacobsother

Next slide. And we know that it's important to design again with kids and teens in mind and to have additional protections for our younger users. There are real challenges here.

Speaker Rother

As many people have noted, as kids

Eliza Jacobsother

grow up, become teenagers, they have growing independence.

Speaker Rother

They often have their own devices.

Eliza Jacobsother

Maybe they're alone in their bedrooms on those devices. They're moving between apps.

Speaker Rother

You know, for everyone that has spoken today, our users are on their platforms as well. And we can only control what they do on our platform; once they leave the platform, we just don't have visibility into that. So a few things we've built into the product as safety features. First of all, there's no image or video sharing in chat, so you cannot share a photo from your camera roll in chat on Roblox, and you can't forward a video. As I said, we don't encrypt communication, so we're constantly monitoring all communication between users for potential harms.

Eliza Jacobsother

We also, and I'll talk about this

Speaker Rother

a little bit more later, we require age checks to access any communication features on the platform. That is a facial age estimation process

Eliza Jacobsother

that we rolled out starting in the

Speaker Rother

fall and is globally required as of January.

Eliza Jacobsother

And we've open sourced many of our safety models.

Speaker Rother

You know, the companies that are testifying today are some of the bigger players,

Eliza Jacobsother

but there are lots of apps that

Speaker Rother

just don't have the resources to build the kinds of systems that we're talking about today. And so we think it's really important to share this technology in an open source way with the whole industry to keep everybody safe. We want kids to be safe, not just on Roblox, but everywhere.

Eliza Jacobsother

And we're constantly engaging with policymakers like

Speaker Rother

yourself and child safety experts, child development experts, to understand what is necessary, what we need to build in the next generation.

Eliza Jacobsother

Next slide. So specifically talking about parental controls, and just to reiterate: all of those things that I just talked about come as a factory setting, out of the box.

Speaker Rother

You don't need to engage with parental controls to have any of that be

Eliza Jacobsother

true on the platform.

Speaker Rother

And we think it's really important that you're starting from a place of default

Eliza Jacobsother

safety and that parental controls are just another layer in the arsenal, another tool

Speaker Rother

so that parents and families can personalize their Roblox experience.

Eliza Jacobsother

But by all means, we don't think

Speaker Rother

that they're the end all, be all, and we don't think that they should be necessary for kids to be safe on our platform.

Eliza Jacobsother

But that being said, our parental controls were the result of extensive partnership and consultation with experts. We work with a variety of rating boards. So in the gaming space, similar to

Speaker Rother

movies, there are lots of different international

Eliza Jacobsother

ratings boards that rate content.

Speaker Rother

Some of them are here.

Eliza Jacobsother

We are working on integrating with IARC,

Speaker Rother

which is the International Age Rating Coalition,

Eliza Jacobsother

so that sometime in the next year our users will get localized ratings.

Speaker Rother

Right now, and I'll talk about this a little later, you get our standard Roblox platform ratings. But in the future, kids in the US will get ESRB ratings.

Eliza Jacobsother

For those who have gamers in your

Speaker Rother

life, you'll recognize those as things like

Eliza Jacobsother

E for everyone and T for teen.

Speaker Rother

But in, for example, Germany, there are USK ratings. In the UK there are PEGI ratings. So those will be familiar to parents and will be displayed for their kids when they're accessing Roblox games.

Eliza Jacobsother

Next slide. So how do our parental controls work? Similar to what other people have spoken about, we have a sort of parent link approach where parents create their own Roblox account. They link their account to their child's account, and then their phone becomes sort

Speaker Rother

of a remote control for their kid's Roblox experience.

Eliza Jacobsother

As a parent myself, I know that

Speaker Rother

often you're making these choices like late at night when you finally sit down after doing the dishes. And so it's really important, we think that you have the ability to have asynchronous control over these things.

Eliza Jacobsother

You will also get notifications if your

Speaker Rother

kids request a settings change.

Eliza Jacobsother

So if they are at a friend's

Speaker Rother

house and they want to play a game that you haven't allowed in their settings, they will send a request, and you will get that request on your phone and be able to approve or deny it from your phone. But you don't have to be on their device to make that choice.
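The asynchronous approval flow described, where a request raised on the child's device is approved or denied from the parent's phone, can be sketched as a small request queue. The class and method names are illustrative assumptions, not Roblox's API.

```python
from typing import Optional

class ParentRemote:
    """Sketch of asynchronous parental approval: the child requests
    access to a game outside their allowed settings, and the parent
    decides from their own phone (illustrative names only)."""

    def __init__(self) -> None:
        self.pending: dict[str, str] = {}   # request_id -> game title
        self.allowed: set[str] = set()

    def child_requests(self, request_id: str, game: str) -> None:
        # Fired e.g. from a friend's house; surfaces on the parent's
        # phone as a notification.
        self.pending[request_id] = game

    def parent_decides(self, request_id: str, approve: bool) -> Optional[str]:
        """Approve or deny a pending request; returns the game title if
        newly allowed, else None."""
        game = self.pending.pop(request_id, None)
        if game is not None and approve:
            self.allowed.add(game)
            return game
        return None
```

The design choice worth noting is that decisions are decoupled from the child's device: the parent never has to be physically present to adjust what the child can do.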

Eliza Jacobsother

In order to link your account as a parent, you have to verify your age.

Speaker Rother

You can do that either with a credit card or with an id. And once you've done that, you'll have access to the full suite of parental controls.

Eliza Jacobsother

One thing I want to note here is that another advantage of the parent link approach is that it encourages parents

Speaker Rother

to get in the game themselves. We really believe that the more you're opening a dialogue with your kids and talking to them about their Roblox experience or any online experience, the easier it will be to hear from them their honest experience. And if they believe that you care

Eliza Jacobsother

about what they're doing on Roblox and

Speaker Rother

that your instinct isn't just to ban

Eliza Jacobsother

it, the more likely they are to

Speaker Rother

be open with you about what's happening.

Eliza Jacobsother

So, you know, create a fun avatar,

Speaker Rother

play a game with your kids. We think that's a big component of parental controls and parental involvement.

Eliza Jacobsother

Next slide. So as I said, parent accounts must be age verified with government issued ID or credit card. We only use this information to verify

Speaker Rother

your age and so it's not an identity marker in an ongoing way. Next slide.

Eliza Jacobsother

And then once you do that, you'll have access to this user-friendly dashboard with the controls we've heard from teens and parents that parents most want. Something that we've heard a lot today is that parents are overwhelmed.

Speaker Rother

And I can totally understand that as a parent myself, I think what's most important is that we're giving parents the tools that they most want and not a million controls and a million radio buttons that are overwhelming. And that sort of become like an eye chart for parents to have to review. So we really focus on the things that parents have told us they want the most.

Eliza Jacobsother

And in general, those fall into a couple of categories. Content restrictions, what your kid can play, communication, who your kid can talk to, spending what they can purchase, and screen

Speaker Rother

time, as well as a few other key controls.

Emily Cashman Kirsteinother

Next slide.

Eliza Jacobsother

So parents can see who their kids friends are and they can set daily screen time limits. They can block individual connections, which means that your kids won't be able to talk to those users. And once that connection is blocked, kids

Speaker Rother

can't go in and change that setting.

Eliza Jacobsother

Parents can also set daily screen time limit

Speaker Rother

within the app.

Eliza Jacobsother

I think what's just one thing to note, like this might change. You know, like my daughter was home

Speaker Rother

on Friday sick and she got a

Eliza Jacobsother

lot more screen time that day than she would normally get.

Speaker Rother

And so again, we want this to be really easy for parents to do from their phones to be able to like quickly make adjustments if it's, you know, a sick day or a snow day. We've had many of here this year and they, they want to let their kids have a little more screen time that day. Next slide.

Eliza Jacobsother

Parents can also set spending restrictions. It should be noted that parents are setting spending restrictions by loading the money into their account in the first place, but they can also set additional restrictions and notifications. So if you want to get a notification every time your kid buys something on Roblox, you can do that. You can also get a notification just when the spend hits a certain limit, and set an overall limit as well.

Eliza Jacobsother

Next slide. Content maturity limits. So this is where the ratings come in. We currently maintain a sort of universal Roblox standard of content maturity limits. Think of this like movie ratings. By default, users under nine only have access to minimal or mild content. Users over the age of nine will have access to moderate content. Restricted content requires that users be 18 plus. But again, just to reiterate, our content policies are just much more restrictive than the rest of the Internet. So again, no profanity, no drugs and alcohol, no sexual content. All of those things are just flat out prohibited on the platform. And so these buckets are actually much more restrictive than traditional G, PG, or PG-13 kinds of ratings. Next slide.

Eliza Jacobsother

In terms of content restrictions, parents can block individual experiences that they don't want their kids to play. And something that we took directly from the research: first, we show parents what their kids' 20 most played experiences are, so that they know where their kids are actually spending time. And then they can go into those, explore them, and decide whether those are appropriate, on top of the ratings level restrictions. And this, we think, really surfaces the information parents need to make choices about what they want their kids to be able to play. Next slide.

Eliza Jacobsother

So, as I said, we started rolling this out in November, and in January we required globally that all users who wish to access communication features on the platform complete a facial age estimation process. Once they do so, they will be able to access communication features, and they'll only be able to chat with other kids in their peer group. We're very optimistic that this step, though not required by anybody, will become a gold standard for age verification on the Internet and for child safety. For a long time, knowing how old kids were was just incredibly difficult, right? For adults we have IDs, but for kids it was very difficult to know. So we're very excited to launch this globally. And we also have continual age estimation running in the background. I think Google talked about this as well. If we have any reason to believe that the age estimated on your account is not the age of the person using that account, for example, by the nature of the games you are playing or the types of folks that you're friends with on the platform, if there seems to be a mismatch, we will introduce additional friction and ask you to verify again. Next slide.

That's it from us, but I look forward to hearing your questions. We're very passionate about safety at Roblox and appreciate California's leadership on this issue.

Speaker Dother

Thank you so much. We are going to open questions with Assemblymember McKinnon. Thank you guys so much.

Assemblymember McKinnonassemblymember

And I'm rushing, and I'm sorry that I have another meeting, but this has been such an important topic today, and I thank you, Chairwoman, for bringing this forward. I want to start with the settings. Is there any way we can make these settings that protect ourselves and kids better and more user friendly? I just started trying to protect myself from, you know, allowing people to know my location, and, you know, just privacy things on my own iPhone, and it's been taking hours to go through there and try to figure out what to turn off, what to turn on, what to keep on, because I'm nervous about being followed and such myself, for privacy. And so is there any way you guys can make these settings more user friendly?

Emily Cashman Kirsteinother

Well, I think from our perspective, from the Google perspective, we're always looking to improve. This is an ongoing process. For parents in particular, we have a variety of resources, whether that's, you know, families.google, where parents can go to get instructions and more information on the different settings, in addition to, you know, the setup in Family Link. But I think, you know, what we believe is that this is a process that is going to evolve, right? As different technological tools evolve, so will protections, and so will the settings. And that's also why we prioritize working with our independent advisory groups, which we have on both the Google and the YouTube side, and also with civil society, NGOs, and government, having that back and forth. And this will be an ongoing discussion.

Lauren Haber Jonasother

I think the single biggest thing that we're doing at OpenAI, and I think the easiest thing to do, would be to have all the default settings on, so you don't have to figure out which ones are right, but the baseline safe, private experience is on. And that's the approach that we've taken for our parental controls, at the very least for parents and teens. We know that parents often don't know what they are. As my Google colleague has mentioned, we have literacy resources and in person consultation, and we can get better at education, and I think we should, as we've noted. But by default, the controls should be on, and a parent shouldn't have to turn them on and figure out what they are.

Assemblymember McKinnonassemblymember

So when we purchase the phone and first get it, it should just be on the default already. And then you go from there.

Lauren Haber Jonasother

Speaking to the OpenAI, you know, ChatGPT experience in particular, that's the approach we've taken.

Assemblymember Wicksassemblymember

And remind me, because now I've probably conflated all of these different safety programs, I apologize. So OpenAI, it's on by default for under 18, and then is that self attestation? How are you determining a ChatGPT user's age?

Lauren Haber Jonasother

Similar to our Google colleagues, three ways. Self declaration of age first; age estimation that runs in the background, which will determine whether a user is over or under the age of 18; and then, if we are not certain of a user's age using age estimation, we default that user down to the under 18 experience. If we get it wrong and we have defaulted you down to the under 18 experience, you can use age verification, either via selfie or government issued ID, to rectify.

Nicoleother

Got it.

Assemblymember Wicksassemblymember

And then you will all, I assume, be complying with Assemblymember Wicks's bill when the time comes, which I know is not yet. Although I will say, before I turn it back over to Assemblymember McKinnon: my device manufacturer has now turned on age signaling by their own choice. This is not legally required yet, and I downloaded an app that was choosing to limit itself to 18 plus. My device then warned me I was downloading an app that was 18 plus and asked me if I wanted to change my age prior to sending the age signal to get the app. So even the device manufacturers, and it's not technically against the law, we didn't think of that. We didn't think the device manufacturers would be inviting people to change their age. So we'll be cleaning that up. But I just feel like every time we try to do these things, there's somewhere an end run around. But we're going to keep fighting the fight and closing the loopholes.

Speaker Dother

Keep pushing.

Assemblymember McKinnonassemblymember

Given the subject matter of this hearing, I would like the panelists to comment on how we protect vulnerable youth who may not have active caregivers, but rather may be neglected or have experienced trauma at home, given that research shows that children who have experienced abuse or maltreatment are at heightened risk for suicidal ideation.

Nicoleother

I can start.

We've been talking about defaults, and in a lot of the conversations I've had with policymakers, this has come up. Again, not every parent is going to be involved. A lot of parents can't be involved; they're working multiple jobs. I used to do domestic violence cases; there may be home situations where teens don't want their parents involved. But again, as we said earlier today, it's an outlet for teens to connect, to get educated, to find their passions, to communicate with their friends. And that's why we were the first to roll out the teen defaults with teen accounts. We understood how important it was that even if a parent can't get involved, we need to have the strictest settings in place. And again, we default all teens under 18 into them. I will say, separately, we work with three or four different expert advisory councils, and they drew a differentiating line between under 16 year olds and over 16 year olds. So if you're under 16, so between 13 and 16, you cannot get out of those protective defaults without a parent relaxing them. Older teens can drive, they may be studying, they have jobs, executive functioning wise. Again, every teen is different, but there is a line between them. But we still default everybody into it. I think what's really important is that it's not just the default experience itself; it's, substantively, what are we protecting against. And we want to make sure that, again, the content teens are seeing is age appropriate. That goes to your question: sometimes vulnerable teens are looking for content that maybe they shouldn't see. So it's really important not only to have policies on it, but to enforce on it and to make sure that we are keeping that content away from vulnerable teens, especially if their parents aren't involved and cannot have conversations with them. I think who you talk to is really important. You want to make sure that teens are not getting randomly messaged by people and that they are in a protected experience when it comes to messaging restrictions. So we default them into that. And so I think the point here is that everything should be automatic, without the teen even having to hit anything to try to get out of it. And if they do want to get out of it, that's when they go to a parent or guardian.

Speaker Dother

Thank you.

Assemblymember McKinnonassemblymember

One last question, please. Did you want to.

Emily Cashman Kirsteinother

Well, I think, you know, when we're talking through how complex it can be, building for every type of child and every type of family and their unique experiences is why this can be hard, and why we want the ability to have different settings. From the YouTube perspective in particular, we're talking about access to a video library and the way that can help teens or users in vulnerable situations: finding authoritative content, finding content that validates some of what they may be feeling in a certain family situation or what have you. I think this is also about not cutting off access for some of those teens who may need that information. From a YouTube perspective, teens are using this to listen to music while they're doing homework. For younger users, this is the largest video library of Sesame Street, for example. So this is a video sharing platform, and on top of that there are digital well being pieces built in. If someone is searching for suicide, self harm, or disordered eating content, there are going to be protections defaulted in place: the screen takeovers encouraging them to seek authoritative content and to take a beat, and, you know, elevating content about self compassion, about, you know, grounding exercises, things like that. So there's a variety of different ways that they could be supported as well.

Speaker Dother

And are yours on by default? Sorry, these teen protections you're mentioning.

Speaker Nother

Yes, yes.

Speaker Dother

And then how do you do age?

Emily Cashman Kirsteinother

So we have age assurance. We rolled that out on our first party platforms, and it goes through an inference model that will say whether or not we think this user is above or below the age of 18, taking into account things like, again, how long the account has been in place and whether they are looking for different kinds of content.

Speaker Dother

Okay, that's fascinating.

Assemblymember Wicksassemblymember

I just will say, again, I don't want to put you in the hot seat, because my kids are on YouTube, and part of the reason they're on YouTube is because my son has learned to play chess on YouTube. He became a magician on YouTube. I actually think YouTube has really good content that my kids have grown from. And at the same time, I will say, my son does have his computer in the kitchen, so I see what he's seeing. He's also getting fed incredibly disturbing content every single day. And so I'm surprised by some of these answers, because it's all great, but it's not playing out in my household.

Assemblymember McKinnonassemblymember

So, very last question. And it is good to see you guys coming up with great ideas. That's good to see, because this is my second year in privacy. With no visible representation of people of color among your leadership here today, why should Black communities trust that your platforms are safe for their youth? For our youth? What measurable actions have you taken to eliminate systemic racism in your systems? And how are you being held accountable for those outcomes?

Nicoleother

I know, mic drop there.

Monica Buffonother

I mean, I can, I can address.

Nicoleother

I saw you doing that.

Nicoleother

I can address it and say, I think we need to do better. I mean, the fact that you pointed out that, you know, there aren't enough Black leaders at companies across the board, not just our companies here today, I think it's something that we all need to work on. It's important. I can only speak to my experience. I will say that when I used to lead youth safety policy, which I did for two and a half years, we brought a lot of different perspectives into the group of people who were advising on how we built the products. And it was across race, it was across gender, it was across socioeconomic status, it was across lots of different countries, and also different kinds of families and different types of teens and parenting. And I do think, and I believe wholeheartedly, that the way you best design these experiences is by making sure that you're getting all sorts of viewpoints in the room, that you're accounting for them, and that if you don't feel like you have enough diversity in the room, you have to try harder. So that I can speak to in terms of how our team worked, both with experts, parents, and policymakers; it was a very, very diverse group of voices.

Lauren Haber Jonasother

A couple of things to note here. I agree, I think we can all do better here. I am Latina, of Mexican descent, and I don't think that there is enough representation, writ large, in the technology industry. And so I'm in full support of that more broadly. As we work with third parties in the mental health space in particular, the CEOs of several of the major mental health organizations we work with are people of color. We ensure that on our Wellbeing Advisory Council there are people of color, and that the Global Physicians Network is globally representative, so that we are not taking a very particular approach in the decisions that we're making.

I also want to address the prior question, since it sort of dovetails together. As we are building some of these systems, for example, the parental notification piece that we've talked about, we understand that even when a parent is involved, that parent might not always have the best intentions. And this is something that has come through from some of the third party organizations that we've worked with on mental health. Which is to say, before we send a notification, we are assessing for risk at home: what else is that teen prompting for? To be sure that they're not prompting for suicidal content because there is risk at home. So I think a lot of this dovetails together, and representative viewpoints from our Wellbeing Advisory Council, our Global Physicians Network, and this sort of broader representative data set have really guided our approach here.

Assemblymember McKinnonassemblymember

And thank you for that. And to the companies that you guys work for: in leadership and decision making, we need to see a more diverse group of people so that they can give their input, because this is affecting all of our children. And it's great to see women sitting here. That is very good to see. But we do need a more diverse perspective. So in the coming years, that's what I'll be looking at. Like, where are you guys with AI, with online tools? How are you guys making sure that all kids are going to be safe? Because this affects all of our children.

Emily Cashman Kirsteinother

Thank you.

Speaker Dother

Thank you, Assemblymember. Assemblymember Ward.

Speaker Uother

Thank you for the presentations. Obviously this is a key interest of the committee, given some of the work that's coming before us, and certainly a discussion out in the community among parents, schools, and anybody who cares about our kids, myself included, with my 11 year old and 7 year old. And I sympathize as well. The 7 year old, you know, is loving YouTube, but maybe a little too much. And it raises a question, because, you know, I'm still educating myself on how to set things up well, and maybe we don't have enough education, right, when you're creating a new account. And I kind of want to ask two things of any companies that, you know, are creating accounts or pointing you in the direction of making sure the good controls are in place. You know, how do I even know to access these controls, or what options are available to me, as we're learning things here just today that we never even knew? And, you know, if you are creating accounts, you said the technology is sort of screening that, you know, this viewer might be a youth, might be a teen. Are there proactive ways to prompt the teen, or any other viewers there, hopefully a parent in the room, to know about the options that are there, so that they can start to avail themselves of parental controls or other systems?

Lauren Haber Jonasother

Yeah, I'll speak to OpenAI in particular. At every possible point, we are attempting to surface the concept of parental controls. It's available in our settings pages, and we point users constantly to our help center and to our notification systems. The goal is to drive as many parents to this as possible. I think industry wide we can do better at education, as we've said here today, but the goal is, in the product, to surface as many notification moments as possible, both to parents and to teens, pointing to our literacy resources, to our help centers, and to the settings page, to engage in these parental controls.

Speaker Uother

Yeah, I think that could certainly be a takeaway that we need to work on more immediately in this moment: we want to make sure that there is a lot more opportunity for all of the software and product that you're developing to help be a part of the solution here, so that that information is getting out there and can be availed of. And maybe related to this is, you know, a youth, say in my case, you know, a seven year old. You know, we sort of get him on there and he wants to watch a little bit, and I literally am typing in the search bar, you know, educational videos for 7 year olds, and there are a lot of great options out there, right? And so he starts going on those and he's kind of clicking around, and I'm out of the room for 10 minutes, and the next thing I know I come back in and he's watching, like, you know, hyper graphic, you know, war scenes and gunplay, and it's like, how did I get from here to here? Right? If I was typing in educational videos for seven year olds, well, one, hopefully you're realizing that a seven year old is watching, and so it sort of would have self corrected, but it wasn't happening in this case. And two, why would algorithms even, sort of, you know, link these two? So, kind of an open question there. And I really raise this because we're having that challenge right now. Next thing I know, I literally this week got a call from the principal about gunplay at school. Gunplay at school, you know. And it's like, okay, well, yeah, I guess he can't watch YouTube, and I don't want that prohibition on it, because I recognize the positive benefit of it. But something is just not actively working in practice right now, or there's not enough check in there. Fortunately, there wasn't a real problem, right, like it didn't really have a serious outcome, but left unchecked, I could see more and more real problems surfacing.

Emily Cashman Kirsteinother

Well, I'm happy to take that. I think, to kind of fuse the two questions, if I'm understanding them correctly: when a user, for example, starts a Google account, if they're telling us they're under 13, they're automatically going to a flow that says you need a parent and gets that parent involved. And they can't access anything until they connect with the parent. So they would go through that Family Link flow, which has all of those settings we showed there. But if a user says they're above 18, that's when our age assurance comes in, and if it isn't sure, then before they try to access any age restricted material, they'd have to confirm their age. And in that default setting, if we're seeing that it is indeed a seven year old, we would send them to Family Link; but say it's a teen, we're putting those default settings in place. For the parts of YouTube, I think one of the things that's really important, and I certainly can't speak to any specific incident, is that what we're trying to do is elevate high quality content and limit low quality content. And so for the teen experience, we have principles that we've worked through with third party experts, for both kids under 13 and teens, to figure out what does high quality look like, what does low quality look like, and how do we adjust those personalization recommendations accordingly. And I think the other thing, and not to say that this is the case here, but it's one of the things we hear a lot, is the importance of children being on their own accounts. And we've actually made it much easier for parents to toggle between accounts. If a child is on a parent's account, they're not necessarily going to have those default settings, with that personalization and those high quality principles in the feed, that they would have on a child's account, and we want to make sure that they can take advantage of not only the under 18 default settings that we talked about, but also whatever parental tools are in place. And so we're making it easier for parents to go back and forth, and, you know, wanting to show the importance of kids being on their own account.

Speaker Uother

Thank you for that. I wanted to switch, because I'm overdue for a 4 o'clock meeting, Madam Chair, but I did want to make sure that we at least, you know, were able to work on another sort of community issue. I'm the chair of our LGBT caucus, and this comes up often as we're thinking about, you know, how to manage social media and whatever kinds of constraints we're putting on there. We do have concerns sometimes, because we recognize both the positive benefits and the negative challenges around social media use. You can imagine a number of scenarios where a youth might be identifying or questioning themselves, but they might not be in a supportive environment, or they really just want to go to more kind of constructive, proactive things. Think the Trevor Project, think your local teen LGBT center, a support group, you know, just sort of positive information. And, you know, with parental controls, with the ability to sort of manage all that, things get a little dicey, right? Because, you know, they're watching what they are accessing, and then that might be, you know, kind of getting into their space of privacy a little bit too much, when they're not ready to come out, or they may not be coming out in a very, you know, supportive environment, or worse, right, like a very, very hostile environment. And so that's something that comes up in this committee conversation as well, as we're thinking about these regulations. And I guess, what do you see? I know that this has been studied; the Surgeon General is looking at, you know, studies of both positive and negative effects. What do you see as sort of, you know, the kind of lens that you're thinking through when it comes to LGBTQ youth, to make sure that they're protected overall, but that privacy considerations are embedded as well, and positive benefits are directed?

Emily Cashman Kirsteinother

I think that's incredibly important. And when we do talk about how parental tools should work at, you know, different levels, that's why this is a difficult conversation. We need to be balancing the fact that teens do have an increased developmental capacity for autonomy. We want to make sure they have, of course, all of those default settings, but there are all really good reasons why they should be having a more autonomous experience. And it's really important to think through those exact kinds of examples as we're thinking through what public policy looks like, and why it's important not to completely cut off access, but to allow access within a safeguarded environment.

Lauren Haber Jonasother

On the ChatGPT side, and again, we're not social media, so it's a little bit of a different game here. But on the parental control side, one of the core tenets and principles in the way that we built this is that a parent will never have access to, and will never see, the exact prompt and generation text that a teen is putting into ChatGPT. And it's why, as we built parental notifications and all of this, the general topic of the distressing content, it being suicide specific, is shared; the exact prompt and generation text is not. Because why that teen is suicidal, and everything that surrounds that, is their privacy. But we want to give parents enough to have the ability to take an action. So the goal is to preserve the privacy of the teen and allow a parent to have enough information to do something about it. But we recognize, and have thought extensively about, and worked with our third party experts and councils and the APA on, this exact question. And so I appreciate it.

Assemblymember Wicksassemblymember

And is there a difference in... and I appreciate, I think it was Instagram, that you mentioned there's a difference in your programs between under 16 and 16 to 18, for example. Do you see any distinction between age groups, or is everything under 18 privacy protected like a 7 year old?

Nicoleother

Can I actually correct that?

Assemblymember Wicksassemblymember

Sorry if I misspoke.

Nicoleother

You're right in everything that you said, but I think it depends on what the experience is. So, just to elaborate: when we launched the new, expanded version of teen accounts, we took a different approach when it came to content. And we decided that teens should not see, whether you're 13, 14, 16, or 17, you shouldn't see content that's 18 plus. So depending on the type of experience, we actually sometimes delineate at under 16 versus over 16. And then there are other experiences that squarely fit in: this is an experience that teens should have, and they should not be accessing adult, inappropriate content. So I just wanted to clarify.

Assemblymember Wicksassemblymember

Okay, no, I appreciate that clarification.

Speaker Dother

Didn't want to misspeak for you. So do you have any distinctions within under 18, or is everyone under 18 privacy-protected the same way?

Speaker Gother

As you just mentioned, today everything under 18 is privacy protected.

Speaker Dother

Interesting.

Eliza Jacobsother

We at Roblox, just to jump in there.

Speaker Dother

Go ahead.

Eliza Jacobsother

We have, you know, kids at a variety of ages and stages, right? So as you age on the platform, you have access to an expanded set of products, features, content, and all of that. We think of that as sort of a training-wheels approach: we want to teach kids good digital habits. And we know that at Roblox, for many kids, we're the first account they ever have on the Internet. We take that really seriously. So we make distinctions. Under nine, for example, there is no access to direct messaging on the platform. As you age up, you have expanded access to communication features and, after an age check, to different kinds of content. And then at 18-plus you have access to restricted content on the platform.

But one thing that would be incredibly helpful, and this is from a couple questions ago: we're all talking about safe-by-default and then layering on parental controls, but none of us uses the identical language or the identical terms for settings and buttons and tools. And that makes it really hard for parents to navigate across platforms. I think the stat is that most kids are on upwards of 40 different apps. So to the extent that legislation can standardize some of that language, to make the cognitive load easier on parents, I think we would welcome that as an industry: to say, this is what this word means; everyone use this word when you're talking about this control. That would be incredibly helpful. We're all engaging with experts and teens and parents and NGOs and pediatricians and all of that, but we're all landing in slightly different places, even though we're all trying to get to the same outcome. The more we can standardize that language, the better and safer everyone is.

Speaker Mother

Yeah.

Speaker Dother

And I think that leads me to my next question. Assemblymember Wicks, who had to leave, passed the Age-Appropriate Design Code, which was really intended to get at how we design these products to be safe for children. And some of what I'm hearing today, I think, is unclear to me. Are you changing the algorithms or the recommendation engines? Are you just shielding content? I don't know; that's a little unclear, if you want to answer that. The Age-Appropriate Design Code was then sued on, and is now only minimally lawful, mostly not lawful, according to the 9th Circuit. So I guess I'm a little bit lost. Okay, great, we're here; you're talking about all these things. We had an assemblymember who's led in this space for a long time. We tried to put that forward; it was then sued by industry. So is that the gold standard? Like, is this the gold standard? Should we be saying what's safe for kids online? Is that something the industry will ever allow to happen, I guess is the question. I don't know if I said that well.

Eliza Jacobsother

But I would say Roblox supported the California Age-Appropriate Design Code for precisely the reason that I just discussed. And, you know, I can't speak to the legality necessarily and what those arguments were, but I do think industry standards that people can align on would be incredibly valuable.

Speaker Dother

Anyone else want to weigh in on age appropriate design in concept?

Emily Cashman Kirsteinother

Well, speaking to the purpose of age-appropriate design, we are in favor and have been. We had a legislative framework to protect children and teens; I think we released it back in 2023, with things like requiring companies to take the best interest of the child into account and to have offerings that prioritize mental health and well-being. And with regard to age-appropriate design, age assurance was part of that, privacy by design is part of that, and we have those in place. More broadly, as we've talked about, and maybe this is unsatisfying in some ways, but I think my colleague from Meta said it: this isn't static. This is an ongoing conversation, an ongoing way that we want to be meeting the moment for both parents and for minors.

Nicoleother

I was just going to jump in to say that I agree with Eliza. I think standards are good. And you've heard we all have different versions of default settings, different versions of parental controls, different versions of content ratings, if that's what you're going to call them. So we're all solving for the same root issues, we're all trying to put in mitigations, and we're all working with experts and parents. We're all facing the same things. I think, though, what we heard earlier today, and I know you're going to have another panel on this too: as a parent, and given that Eliza cited the same University of Michigan and Common Sense Media research showing that teens are on an average of 40 apps per week, it's a lot for parents to be jumping through those hurdles. And frankly, parents have said time and time again, and teens have said, that the digital world is not going away. There's a lot of good, across everything that everybody has said today; it's not going away. But parents should be able to support their teens when they're online. And if a parent doesn't want their teen on 40 apps per week, they should be able to pick the apps and approve them. You like YouTube? If you want your teenager or kid to be on YouTube, that's your choice. It doesn't remove the obligations on all of our companies to build those age-appropriate experiences; they still have to happen. But I think we need to make it easier on parents, because every person here has described a different version of the hoops parents are jumping through, whether streamlined or not, to support their teens. And we've pushed for federal legislation and state legislation to get parental consent at the OS and app store level. If you can make it easy on parents, and the apps continue to build these safeguards as technology changes, you're supporting not only teens but also their parents. So I think it's everything that we've been discussing, and then some.

Speaker Dother

And you said... it's funny you say "I like YouTube." I actually have a love-hate relationship with it, fair enough, as we do with most tech companies, most technology frankly, so not to pick on YouTube again. But I think it's so complicated. And look, this is my life's work, and I didn't know about the parental controls on YouTube. So if I don't know about them, then that really says something.

Lauren Haber Jonasother

But that's the point.

Nicoleother

If it can be easier for everybody at the OS and app store level, where it's the same thing, the same standards, then parents can decide: I'm okay with this app. Maybe my 12- or 13-year-old is fine with YouTube, but maybe I have a kid with ADHD who's not okay with it. You as a parent should be able to decide, and if you change your mind, you change your mind. But that's a parent's decision.

Speaker Dother

I also think, look, I love the training-wheels analogy, because I truly believe, and this is why the computer is in the kitchen in my family, that my kid will leave home, and he will have these devices, and he will have access to these things. And it's my job, while he is in my home and living under my roof, to help him learn to navigate these spaces. You know, I went to college long before these spaces existed, but we knew the kids who were sheltered a little bit too much and got to college and, with other things, went a little bit wild, because they hadn't been taught how to manage things that are exciting. And so I struggle, because I think kids should be in these spaces with their parents, learning how to navigate them. How do you think critically about content on YouTube when you're being fed something that is maybe toxic or problematic or not factually based? How do you ask questions and look up sources? That is something people have to learn. But at the same time, when I sit and watch my daughter be fed content, frankly, that is different from what my son is fed, and that is incredibly disturbing from a body-image perspective, I'm like: should I be allowing this at all? And so I think that if we can create spaces where they can learn and grow and start to get these critical-thinking skills, we are better for it. And the problem is, I think we're not there right now.

Speaker Dother

So, Assemblymember Wicks wanted me to ask you her questions. I think we've answered the first one. She asked: for under 18, are the default settings the strictest? I think the answer was yes for everybody. Correct me if I'm wrong.

Emily Cashman Kirsteinother

Yes.

Speaker Dother

Okay. And then: who can change them? Can kids override them? I think I heard you say only 16-to-18-year-olds can override them, in some contexts, unless they're in parent supervision.

Nicoleother

So some 16, 17 year olds may want to be in parent supervision. If they're not in parent supervision, they can undo some of the settings. Not all.

Assemblymember Wicksassemblymember

Okay.

Speaker Dother

And then for Google, so we have.

Emily Cashman Kirsteinother

So for Google, a teen, a supervised user, would remain on supervision after the age of 13. With YouTube, there's a voluntary teen experience that I'm happy to get more information for you on. But I also want to go back to the point that was made earlier and clarify that parents right now, both on Android and through Family Link, have the ability to approve or block apps. I just want to make sure that that's very clear.

Speaker Dother

And that's true on Apple too, I think. Yeah, but I have Apple devices in our house.

Speaker Gother

Yeah, I think we answered this.

Assemblymember Wicksassemblymember

You said nobody can override it.

Lauren Haber Jonasother

Right.

Speaker Gother

Okay.

Speaker Dother

Yeah. And then for Roblox, I heard you say it depends on the age.

Eliza Jacobsother

It's sort of, as I said, a training-wheels approach. We have parental visibility through, I think it's 18, but it might be 16; I will double-check. And then we also have youth mental health tools, again to the digital-literacy point that you were making. We worked with our teen council to ask them what would be most valuable to them. And so at 13, they have a series of youth mental health tools available to them in their own dashboard, to make choices for themselves.

Speaker Dother

Got it.

Monica Buffonother

Okay.

Speaker Dother

And then her next question, and this is kind of a tough one; I'm giving you her question, and she asked tough questions: would our prior panel believe that the strictest settings, presuming they keep them on the strictest settings and don't make different choices, as they can in some of these programs, are good enough? And she gave an example; I'm going to read her example of what she meant. She said: for example, Google said you have a bedtime reminder. Can the kids just close that window and keep scrolling?

Emily Cashman Kirsteinother

Well, I think there are bedtime reminders, and I want to make sure I get it right, so let me follow up after. But I think there are bedtime reminders and then there's downtime, and those are all available through Family Link, for parents to completely shut down the phone: whether it's the reminder itself, but also having the phone itself be off.

Speaker Dother

Okay. So would... I mean, I think it's a challenging question: would they think that these are sufficient? The answer I heard them say themselves was no, so I don't know if you want to speak for them; that feels... But I guess the last question, which is her question but one I actually share, is this: you're all sitting here saying you're trying, yet kids are dying. Right? I mean, kids are being harmed. Kids are having eating-disorder behavior because they're being fed too much content of that nature. I think that's the lived experience of me and my peers. And it sounds like every single one of us is a mom, I think. Right?

Nicoleother

We're all the same vintage.

Speaker Dother

All the same vintage of mom. So you're probably getting the same questions and comments we are at the soccer games. I am. It's not working, right? We see our teens and our younger children, I mean, addicted: wanting that device so badly, not wanting to go out and play because the iPad is sitting there, even if it's turned off. And so I guess the question is: why, if you're doing all of these things and you think they're best in class, are we continuing to see harms?

Speaker Gother

I'll answer from an OpenAI ChatGPT perspective, and I think we've all said this: this is a marathon, not a sprint. The way that teens engage with ChatGPT in particular changes over time, as they grow, as they age. It's a learning source; it's a teach-me, quiz-me source; it evolves. The product is so new and so early, at least for us; it's been around since November 2022, and the mitigations and the controls and the content restrictions are constantly changing. We're evolving them because of the way that teens are using the tool. In the ChatGPT case in particular, it is such new technology, and the technology changes over time. So for us, the approach of iterative deployment is how we think about this, which is to say: we restrict, and then learn and evaluate. To one of our prior panelists' points: we learn, we look at metrics, we have dashboards, at the individual-user level and at the aggregate level, to understand how our mitigations are working. And so, at least for ChatGPT, this is such a new technology that this will be a process. It's a marathon, not a sprint.

Speaker Dother

And have you pulled back models because they were harmful?

Speaker Gother

Yeah, GPT-4o in particular was deprecated.

Speaker Dother

And that was, as I understood it, mostly a sycophancy problem. Is that right, or am I missing something?

Speaker Gother

There were a number of reasons that model was deprecated, but it's no longer in production, available to users.

Speaker Dother

Okay, I think.

Speaker Nother

Oh, sorry, go ahead.

Eliza Jacobsother

Sorry, jump in on that.

Speaker Tother

No, you're good.

Eliza Jacobsother

I totally agree. Look, I think it is a marathon, not a sprint, and the technology is constantly changing. We only launched facial age estimation a couple of months ago, when we felt the models were accurate enough to give us an accurate age signal. We did not have that tool before. As the technology improves and becomes available, we will use it. And as our platforms grow and change, we will need to add more tools on top of them.

I think the other thing about this is that all of these platforms are a little bit different. They have a little bit of a different offering, all of our kids are a little bit different, and what they need is a little bit different. It's not one-size-fits-all, at the platform level or at the user level. We're not talking about car safety, right? The same seat belt protects all of us; the same airbag protects all of us. But when you're talking about different populations, as was talked about earlier, for some kids parental controls are incredibly important, and for some, that same parental control might actually expose them to harm, because their parent now knows something about their private internal life that might cause them to harm them. So it's just so complex and so multilayered that there isn't one solution, because every kid is different and every platform is different. And that's why it's a never-ending problem to solve.

Speaker Dother

No, I appreciate that.

Speaker Dother

And I think what I struggle with... I've told this story before. When my kids were born, I had a vibrating chair. It was the only place my babies would sleep. It was my favorite thing in the world, because it got me a nap and a shower most days. I believe it was five babies who flipped over and suffocated in the chair. The chair was recalled, because the United States of America wouldn't accept the deaths of five babies.

Speaker Rother

Was it the Rock 'n Play?

Speaker Dother

And so I get that this is hard. But we have accepted far too many deaths of children through online harms. And so I just... I hear you. I think it's hard. I understand that. But I just get to a point where I

Source: Assembly Privacy And Consumer Protection Committee · March 17, 2026 · Gavelin.ai