Snapchat's new parental controls try to mimic real-life parenting, minus the hovering

Snapchat is rolling out new parental controls that allow parents to see their teenager's contacts and confidentially report to the social media company any accounts that concern them. A child lies in bed illuminated by the glow of a cell phone. (Elva Etienne / Getty Images)

Snapchat is rolling out parental controls that allow parents to see their teenager's contacts and report to the social media company — without their child's knowledge — any accounts that may worry them.

The goal, executives say, is to enable parents to monitor their child's connections without compromising teens' autonomy. Named Family Center, the new suite of tools released Tuesday requires both caregiver and teen to opt in.

"It allows parents to see who's in their teen's universe," said Nona Farahnik, director of platform policy for Snap, the company that makes Snapchat. "It offers parents the ability to ask who someone might be, how they might know a contact, which prompts those kinds of real-time conversations about who teens are talking to."

Farahnik says Family Center is modeled on real-life parenting.

"If your teen is headed to the mall, you might ask who they're going with. 'How do you know them? Are you guys on a sports team together? Do you go to school together?'" said Farahnik. "But you won't be sitting there at the mall with them listening to their conversations."

Similarly, parents cannot see the content that their teen is sending or receiving on Snapchat. They can only view whom their child has communicated with in the past seven days. Snapchat is popular with young people, partially because messages on the platform disappear within 24 hours.

The company says it consulted with safety experts and academics and conducted focus groups with parents to develop Family Center, and it plans to roll out more features in the coming months. The tool is only for parents of kids under 18.

With Family Center, Snap follows other social media platforms, including Instagram, that have recently boosted parental controls. According to at least one survey, Snapchat is the second-most popular social network among teens. The first, TikTok, offers "Family Sharing," which gives parents a few ways to limit the videos shown to their kids.

A promotional screengrab of Snapchat's new Family Center, shared by the company ahead of the rollout. (Snapchat)

"I think these platforms want to show that they can take steps to protect kids, that they can self-regulate and that they're capable of doing it themselves without getting the government involved," said Irene Ly, policy counsel for Common Sense Media, which reviews apps, games and media for families.

Bipartisan legislation in Congress would require more sweeping changes targeted at protecting kids on social media, but lawmakers have yet to vote on the measures.

Advocate: Social media networks should be 'safer by design' for kids

Parental controls can be helpful to some families, says Josh Golin, executive director of Fairplay, an advocacy group focused on improving online safety for children. But they require parents to have the time, energy and commitment to figure out social media tools and use them regularly.

"Are you going to spend 20 minutes a day figuring out what's going on in Snap and another 20 minutes on TikTok and another 20 on Instagram?" he said. "I don't think that parents particularly want to be spending their time this way. What they would prefer to see is that these platforms take real steps to be safer by design."

For example, Golin says, it should be easier for kids to put down their phones and take a break from social media.

"As a 12-year-old, you might feel like, 'Oh my God, my life is going to be over if I don't communicate with my friend today on Snapchat,'" Golin said. "I don't think that we should be giving kids rewards and badges and things for using online platforms more. That's not fostering intentional, thoughtful use. I think that's fostering compulsion and only benefits the company."

Snap's terms of use require a child to state that they're 13 or older before signing up for the service. Snap says it screens for underage users in compliance with the Children's Online Privacy Protection Act.

"We have millions of young people already on Snap, including millions who are under 13 and shouldn't even be there in the first place," said Golin.

He says the companies could do a better job of verifying their users' ages, rather than taking users at their word.

Ly, of Common Sense, says companies also could look at how their algorithms amplify content that might be harmful for children.

For example, Ly said, a kid might interact with a post that encourages healthy eating for a fitness routine. But algorithms created to show users more of what they like could quickly lead that child down a rabbit hole of misinformation about disordered eating or other harmful eating habits.

Copyright 2024 NPR

Raquel Maria Dillon
Raquel Maria Dillon has worked on both sides of the country, on both sides of the mic, at Member stations and now as an editor with Morning Edition. She specializes in documenting wildfires and other natural disasters, translating the intricacies of policy into plain English and explaining the implications of climate change.