Colorado leads lawsuit against social media giant Meta

KRDO

COLORADO SPRINGS, Colo. (KRDO) - The Colorado attorney general is leading a joint federal lawsuit joined by dozens of other states.

The lawsuit accuses social media giant Meta, the parent company of Facebook and Instagram, of causing physical and mental harm to young children and teens. The lawsuit claims that Meta knew its addictive features harmed young people’s physical and mental health.

Ray Merenstein, executive director of the non-profit National Alliance on Mental Illness (NAMI) in Colorado Springs, said he's eager to see how the lawsuit shakes out.

"That's why we really want to look at this AG [attorney general] suit and say, what can the companies do to protect the youth," said Ray Merenstein the Executive Director of the non-profit National Alliance on Mental Illness in Colorado Springs.

Not only is Merenstein an advocate for mental health, but he's also a father of three. He says navigating his kids' use of social media can be difficult as a parent.

"What scares me is, so much is going on in this world that the youth have to deal with that. If we don't give them the resources to deal with it, they're going to get more and more anxious and more and more overwhelmed and potentially more and more depressed," said Merenstein.

The federal complaint, joined by 33 states and filed in the U.S. District Court for the Northern District of California, alleges that Meta knew of the harmful impact of its platforms, including Facebook and Instagram, on young people. Instead of taking steps to mitigate those harms, the lawsuit claims, Meta misled the public and concealed the extent of the psychological and health harms suffered by young users addicted to its platforms.

The complaint further alleges that Meta knew that younger users, including those under 13, were active on its platforms, and knowingly collected data from those users without parental consent.

"That becomes the issue is if we're not looking over their shoulder, we're not regulating it," said Merenstein.

The suit also accuses Meta of pushing users down "rabbit holes" in an effort to maximize engagement, pointing to features like infinite scroll and near-constant alerts.

"The companies are trying to get them to look more and more at the content. They're going down that rabbit hole," said Merenstein.

Many researchers, along with the U.S. Surgeon General, have voiced similar concerns, but Meta denies the claims.

“We share the attorneys general’s commitment to providing teens with safe, positive experiences online, and have already introduced over 30 tools to support teens and their families. We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”  

-a Meta spokesperson

A Meta spokesperson also provided KRDO 13 with additional background on the investigation that may be useful for parents. The following is in the spokesperson's own words:

  • Since this investigation has begun, we have engaged in a meaningful dialogue with the attorneys general regarding the ways Meta already works to support young people on its platforms, and how Meta is continuously working to improve young peoples’ experiences. 
  • The issues identified by the attorneys general lend themselves to cross-industry standards for young people and the need to work with companies across the industry in addressing these topics. Instagram is one of the top social media platforms young people use today, but other platforms are just as popular — and in some cases more popular — for teens. See research from Gallup last week pointing to TikTok, YouTube, and Snap as examples of the other most popular platforms.  That makes it particularly disappointing that the attorneys general have chosen to single out one company, instead of taking the opportunity to work productively across the industry, especially if their motivations are, as they say, to address the impacts of social media at large.
  • While we share the attorneys general’s concern around teen mental health trends in the US, it's also important to recognize the complexity of mental health and the many issues teens struggle with in their daily lives, such as growing academic pressure, substance use, rising income inequality and limited access to public mental healthcare. The cause of these mental health trends is still being investigated, but some, including the American Psychological Association, have acknowledged that social media can play a positive role in young people’s psychological development. To support teens, we need to look at the bigger picture and examine the many issues they struggle with in their daily lives, and we need to recognize that social media is a place where many teens come to find support and community when they're struggling.
  • Contrary to the allegations made, we’ve developed over 30 tools to support teens on our apps, and to make it simple for parents to shape their family’s online experiences, and we see that they work. Examples include:

      ◦ Setting teens' accounts (under 16 in the US) to private when they join; we also limit the amount of potentially sensitive content they can see in Explore, Search, and Reels.
      ◦ Age verification technology to help ensure teens are in the right experiences for their age.
      ◦ Parental supervision tools that let parents and guardians set time limits on their teen's app use, and see information like who their teen reports or blocks.
      ◦ Showing teens reminders to take regular breaks from Instagram and turn on tools like Quiet Mode and Take a Break, designed to help them manage the time they spend and the notifications they receive.
      ◦ Sharing expert resources when someone searches for, or posts, content related to suicide, self-injury, eating disorders, or body image issues.

On U13 data allegations specifically:

  • Instagram’s Terms of Use prohibit users under the age of 13 (or higher in certain countries), and we do not knowingly collect information from anyone under 13. If we suspect that someone is under the age of 13, we deactivate their account and delete their data if they cannot demonstrate they meet our minimum age requirements.
  • People under 13 years old are not allowed on Instagram and Facebook. When we learn someone potentially under 13 has created an account, we remove them if they can’t demonstrate they meet our minimum age requirement. 
  • When people open our apps to sign up for an account, we require them to share their date of birth. As part of the sign-up process we will restrict users who repeatedly try to enter different birthdays, and those who are underage are not allowed to sign up.
  • If someone enters an age under 13, they are unable to create an account. 

On how we restrict ads to teens: 

  • Last year, we announced several updates to how advertisers can reach teens on our platforms. The changes we announced build on those updates so that we can continue providing teens with a more age-appropriate ad experience. We recognize that teens aren’t necessarily as equipped as adults to make decisions about how their online data is used for advertising, particularly when it comes to showing them products available to purchase. For that reason, we’ve further restricted the options advertisers have to reach teens, as well as the information we use to show ads to teens.  These changes reflect research, direct feedback from parents and child developmental experts, UN children’s rights principles, and global regulation.
  • This means that age and location will be the only information about a teen that we’ll use to show them ads.

On allegations of “social media addiction”:

  • The body of research on social media and negative mental health is not conclusive, nor does it support the suggestion that social media use causes teen mental health issues.
  • We also know that how people use social media matters and can influence their overall experience, which is why we have so many features, like Quiet Mode and Take a Break, that encourage teens to take regular breaks from social media and let parents set scheduled times they can use social media. These are intended to help teens be intentional and set thoughtful boundaries about their time online. We also see that these features work: Over the course of a given week, 44% of teens who see a 'Take a Break' reminder on Instagram will respect it at least once by leaving Instagram for 10 minutes or more.
  • We need to look at the bigger picture and examine the many issues teens struggle with in their daily lives, and we need to recognize that social media is a place many teens come to find support and community when they're struggling.


Today, we have more than 30 features designed to support teens and their families. We’ll continue working with parents, experts, and many others who are invested in this important issue to develop more features like these:

Changes we’ve made so teens can express themselves in a safe environment: 

  • We use age verification technology to help teens have experiences that are appropriate for their age, including limiting the types of content they see and who can see and interact with them.
  • We automatically set teens’ accounts (U16) to private when they join Instagram. We also don’t allow people who teens don’t follow to tag or mention them, or to include their content in Reels Remixes or Guides. These are some of the best ways to help keep young people from hearing from adults they don’t know, or that they don’t want to hear from.
  • We’ve developed technology to help prevent suspicious adults from engaging with teens. We work to avoid showing young people’s accounts in Explore, Reels or Accounts Suggested For You to these adults. If they find young people’s accounts by searching for their usernames, they won’t see an option to follow them. They also won’t be able to see comments from young people on other people’s posts, nor will they be able to leave comments on young people’s posts.
  • We limit the types of content teens can see in Explore, Search and Reels with our Sensitive Content Control. The control has only two options for teens: Standard and Less. New teens on Instagram who are under 16 years old are automatically placed into the Less state. For teens who are already on Instagram, we send prompts encouraging them to select the Less experience.
  • We don’t allow content that promotes suicide, self-harm or eating disorders. Of that content we take action on, we identify over 99% before it is reported to us.
  • We show expert-backed, in-app resources when someone searches for, or posts, content related to suicide, self-harm, eating disorders or body image issues. They see a pop-up with tips and an easy way to connect to organizations like NEDA in the US. We also have a dedicated reporting option for eating disorder content.

Tools we’ve built to foster a supportive and positive experience for teens:

  • We show teens notifications to take regular breaks from Instagram. Take a Break shows full-screen reminders to leave the app. 
  • We prompt teens to turn on Quiet Mode if they’re on the app for a specific amount of time at night. Quiet Mode gives teens more ways to focus and set boundaries with friends and family. Once on, we won’t show any notifications, activity status will change to In quiet mode and an automatic reply is sent when people receive DMs.
  • We notify teens that it might be time to look at something different if they’ve been scrolling on the same topic for a while.
  • We give people the option to hide like counts, so they don’t have to show others like counts on their own posts or see likes on other peoples’ posts. 
  • We show people a warning if they try to post an offensive comment. The warning reminds them of our Community Guidelines and that we may remove or hide their comment if they proceed to post.
  • We give teens the option to turn on Hidden Words for comments and DMs. Once on, comments and DMs containing emojis, words or phrases selected by the user will be hidden. 
  • We make it easy for teens to use Restrict to help prevent bullying. We developed Restrict specifically for teens, because they told us they wanted a more subtle way to block bullies without them knowing they’d been blocked. 
  • We let people easily manage what recommended content they see in Reels, Search and Explore. You can select Not Interested on posts in Explore, and we’ll aim to show you less of this kind of content anywhere you might see recommendations. 
  • We give people ways to avoid seeing specific topics in recommended content. If you say you don’t want to see things like “recipes” or “fitness,” we’ll work to no longer show you content with those words in the caption or hashtag.

Features we’ve developed to help parents navigate our apps with their teens:

  • We’ve developed supervision tools that help parents see how much time their teen is spending on Instagram and to set time limits. We also allow parents to see who their teen follows, who follows their teen and to be notified when their teen reports or blocks someone.
  • We’ve built a Family Center to help teens and families build healthy online habits. It includes expert-backed guidance to help teens have positive online experiences using our technologies.
  • We offer tools and resources to support parents, guardians and teens on Instagram and in VR and to help teens spend more meaningful time online.
  • We have Instagram and VR Parents Guides, developed in partnership with experts like The Child Mind Institute and ConnectSafely, that give parents easy ways to talk to their teens about safety and well-being.

Partnerships we’ve built with some of the world's leading experts and safety organizations:

  • We consult regularly with our Youth Advisors, a group of external experts across online safety, privacy, media literacy, wellness, social and emotional health. Our Youth Advisors advise on app development, new policies and more ways to support teens.
  • We look to our Safety Advisors, comprised of independent safety organizations around the world, to inform our overall approach to safety. 
  • We’ve formed and meet regularly with our Body Image Expert Circle, a group designed specifically to help create new tools and policies that improve teens’ feelings about body image and eating disorders.
  • We work in collaboration with the National Center for Missing and Exploited Children (NCMEC) to address content that could exploit children. Specifically, we’ve funded NCMEC’s hash-sharing platform that helps thwart the proliferation of young peoples’ intimate images. 
  • We partnered with The Jed Foundation to develop our Pressure to be Perfect program, which supports teens and parents with resources on how to manage pressures teens may feel to look or act a certain way online.
  • We launched a digital leadership program in partnership with Girl Scouts USA. The program helps prepare girls with the skills they need to navigate online spaces and includes activities, conversation guides and education on topics like managing social comparison and balancing time online.
  • We created a Well-being Creator Collective, in partnership with experts, to fund and educate creators on how to develop content that inspires teens and supports their well-being. Our Expert Steering Committee includes Dr. Linda Charmaraman, Director of Youth, Media and Well-being Research Lab at Wellesley Centers for Women, Dr. Robin Stevens, Director of the Health Equity and Media Lab at USC and Dr. Earl Turner, Founder of Therapy for Black Kids.
  • We joined Google, Microsoft and 15 other tech companies to form Project Protect, a plan to combat online child sexual abuse as part of the broader Technology Coalition.

The federal complaint seeks injunctive and monetary relief to rectify the alleged harms caused by Meta's platforms.

KRDO 13 reached out directly to Colorado's Attorney General, Phil Weiser, but he was unavailable for an interview by our deadline.

Barbara Fox

Barbara is a reporter based out of Pueblo for KRDO NewsChannel 13. Learn more about her here.