Instagram has made a significant organizational shift, cutting 60 technical program manager positions and effectively removing a layer of management within the company, as reported by The Information. Affected employees have been given a two-month window to find other roles within the organization; if they are unable to secure a different position at Instagram after that transitional period, their employment will be terminated.

A former Instagram employee noted in a LinkedIn post that some technical program managers may be required to “re-interview for PM roles,” that is, project manager positions. Meta, Instagram’s parent company, declined to comment on the job cuts but pointed to CEO Mark Zuckerberg’s March 2023 blog post outlining the company’s “Year of Efficiency” initiative. In that post, Zuckerberg emphasized Meta’s commitment to improving its financial performance and reducing its overall headcount.

Beyond the job cuts, Instagram has also informed employees of a restructuring of its product teams. The team responsible for content creation and sharing on the platform is being reorganized into three distinct focus areas: Creation, Creators, and Friend Sharing.


This strategic move underscores Instagram’s commitment to supporting the creators who drive teen engagement on the platform. While Instagram’s focus on teens is not surprising, it comes amid legal challenges: more than 40 U.S. states are suing Meta, alleging that its services contribute to mental health issues among young users.

The lawsuit contends that over the past decade, Meta has profoundly influenced the psychological and social experiences of a generation of young Americans, employing powerful technologies to captivate and engage youth and teenagers.

As Meta faces mounting regulatory scrutiny, it continues to prioritize teen engagement and retention. Nonetheless, recent changes show the company working to address lawmakers’ concerns about teen safety. Meta recently announced that it will automatically restrict the types of content that teen Instagram and Facebook accounts can access, shielding young users from harmful posts related to self-harm, graphic violence, and eating disorders.

Amid regulatory pressure and heightened scrutiny, Meta appears committed to evolving its platforms into a safer and more responsible online environment, particularly for younger users. These developments are expected to figure prominently when Meta executives testify before the Senate on child safety on January 31, alongside executives from other major social media platforms. Committee members are poised to question the platforms about their measures to protect children online.