Instagram head Adam Mosseri will on Wednesday urge the creation of an industry body to determine best practices to help keep young people safe online, in his first appearance before Congress.
Mosseri, in written testimony before a Senate panel, said the industry body should address “how to verify age, how to design age-appropriate experiences, and how to build parental controls.”
Photo-sharing app Instagram and its parent company Meta Platforms Inc, formerly Facebook, have come under intense scrutiny over the potential impact of their services on the mental health and online safety of young users.
Mosseri said companies like Instagram “should have to adhere to these standards to earn some of our Section 230 protections,” referring to a key U.S. internet law which offers tech platforms protections from liability over content posted by users.
In a slew of announcements on young users’ safety on Tuesday, Instagram said it would be stricter about the types of content it recommends to teens, and would switch off the ability for people to tag or mention teens who do not follow them on the app. It will also introduce parental controls next year.
Since September, the company has paused plans for a version of Instagram for kids, amid growing opposition to the project.
That pause followed a Wall Street Journal report that said internal documents, leaked by former Facebook employee Frances Haugen, showed the company knew Instagram could have harmful mental health effects on teens.
In his written testimony on Wednesday, Mosseri echoed the company’s previous statements that public reporting mischaracterized the internal research.