WhatsApp has announced new parent-managed accounts, which allow parents and guardians to set up the messaging app for pre-teens, with controls that limit the experience to messages and calls.
These accounts, which will be rolled out globally in the coming months, will include "stricter default settings", parental controls and options for parents to guide their pre-teens (under 13 in the US) through their first messaging experiences, WhatsApp reports.
New controls to limit the experience
"With input from families and experts, we are implementing the new parent or guardian-managed accounts that allow them to set up WhatsApp for tweens, with new controls to limit their experience on WhatsApp to messages and calls"
These accounts must be created and actively managed by parents or guardians, and must remain linked to the adult's own account on WhatsApp, the Meta-owned messaging app.
To link the two accounts, parents will need their own device side by side with the phone they bought for their child.
Decide who can communicate with the account
After setting up the account, the parent or guardian will be able to control it, deciding who can communicate with it and which groups it can join. These adults can also review message requests from unknown contacts and manage the account's privacy settings.
The new parental controls and settings are protected by a parent PIN on the managed device. Only parents can access and modify the privacy settings, allowing them to tailor their family's experience, WhatsApp explains.
"All personal chats remain private and are protected by end-to-end encryption, meaning that no one, not even WhatsApp, can see or hear them," the messaging network said.
Available "later this year"
WhatsApp says that as these accounts are gradually rolled out over the coming months, it hopes to gather user feedback to further develop the app and "provide the safest and most private way for families to communicate."
The announcement comes almost two weeks after Meta, WhatsApp's parent company, said that Instagram will start notifying parents if their child "repeatedly searches for terms related to suicide or self-harm" on the social network.
This latest measure will be rolled out to parents in the US, UK, Australia and Canada who use the social network's parental supervision tools, and will be available in other regions "later this year".
