Facebook parent company Meta has boosted its parental controls, bringing greater safety to its virtual reality offerings as well as social media.
In the UK, this means parents will be able to set daily time limits of between 15 minutes and 2 hours on apps such as Instagram, which is also owned by Meta.
There will also be a new Instagram feature called “nudge”, which will suggest alternative content to users who repeatedly search for the same topic.
For its virtual reality headset, the Quest, Meta is also giving parents a supervision dashboard, along with purchase-approval and app-blocking options.
These new virtual reality controls come after the tech giant faced accusations earlier this year that the metaverse was rife with abuse.
An investigation by Channel 4’s Dispatches in April found that the metaverse poses safety risks for young people.
Working undercover, journalist Yinka Bokinni posed as a 22-year-old woman and a 13-year-old girl on the Meta Oculus Quest Store apps, VRChat and Rec Room.
Upon arrival in the online world, she was met with a flood of abuse; this included the discussion of sex acts with minors, as well as the use of racial slurs.
The Times carried out a similar investigation in January, with a reporter describing the virtual world as a “home to sex predators”.
The announcement also comes as the UK pushes forward with the Online Safety Bill, which aims to create a safer online community for Brits by clamping down on Big Tech.