Just over a year ago, social media companies were put on notice about how they treat, and too often fail to protect, their youngest users. Here's a look at the parental controls those platforms offer today.
During a series of congressional hearings, executives from Facebook (FB), TikTok, Snapchat, and Instagram were grilled by lawmakers over how their platforms can steer young users toward harmful content, damage teens' mental health and body image (particularly among teenage girls), and lack adequate parental controls and safeguards to protect teens.
These hearings, which followed disclosures in the “Facebook Papers” by whistleblower Frances Haugen about Instagram’s impact on teens, prompted the companies to promise change. The four social networks have since introduced more tools and parental control options intended to better protect younger users. Some have also adjusted their algorithms, for example by automatically showing teens less sensitive content, and stepped up their moderation efforts. But some lawmakers, social media experts, and psychologists argue that these solutions are still inadequate and that more needs to be done.
Instagram
In the wake of the leaked documents, Instagram, which is owned by Meta, paused its heavily criticized plan to build a version of Instagram for children under 13 and turned its attention to making its main service safer for young users.
The company has since launched an educational hub for parents that includes resources, tips, and articles from experts on user safety. It has also introduced a tool that allows guardians to monitor their children’s Instagram activity by seeing how much time they spend on the app and setting time limits. Parents can receive updates on the accounts their teens follow and the accounts that follow them. They can also view and receive notifications if their child makes any changes to their privacy and account settings. The company has also created video tutorials on how to use the new monitoring tools.
In addition, Instagram now offers a feature that encourages users to take a break from the app. After a set period of time, the app suggests that users take a deep breath, write something down, check a to-do list, or listen to a song. The company has also said it is taking a more rigorous approach to the content it recommends to teens: if a user has been dwelling on one type of content for too long, particularly content that may be harmful or negative, Instagram will nudge them toward other topics, such as architecture or travel destinations.
Facebook
Facebook’s Safety Center offers various monitoring tools and resources, including articles and advice from leading experts. “Our long-term goal for the Family Center is to enable parents and guardians to help their teens manage experiences across Meta technologies, all from a single location,” Liza Crenshaw, a Meta spokesperson, told CNN Business.
The hub also includes a guide to Meta’s VR parental supervision tools from ConnectSafely, a nonprofit dedicated to helping children stay safe online, to assist parents in discussing virtual reality with their teens. Guardians can access supervision tools and see which accounts their teens have blocked. They can also approve a teen’s download or purchase of an app that was blocked based on its rating, or block specific apps that may not be suitable for their teen.
Snapchat
Snapchat launched a parent guide and hub in August that aims to give parents more insight into how their teens use the app, including whom they have communicated with over the past week (without revealing the content of those conversations). To use the feature, parents must create their own Snapchat account, and teens must opt in and give consent.
Snapchat already had some safety measures in place for young users, such as requiring mutual friends before they can communicate and prohibiting public profiles. Teen users have the Snap Map location-sharing tool turned off by default, but they can use it to share their real-time location with a friend or family member even while the app is closed, as a safety measure. Meanwhile, the Friend Check Up tool encourages users to review their friend lists and make sure they still want to be in touch with certain people.
Snapchat lets parents view their teen’s new friends and confidentially report concerning accounts that may be interacting with their child. It is also working on a tool to give younger users the option to notify their parents when they report an account or a piece of content.
The company has stated that it will continue to improve its safety features and consider feedback from the community, policymakers, safety and mental health advocates, and other experts to enhance the tools over time.
TikTok
TikTok unveiled additional measures in July to screen out mature or potentially problematic content, including assigning a “maturity score” to videos flagged as containing mature or complex themes. It also rolled out a feature to help users decide how much time they want to spend on the app. The feature lets users schedule regular screen-time breaks and provides a dashboard showing how many times they opened the app, a breakdown of usage by time of day, and other statistics.
TikTok’s Family Pairing hub lets parents and teens customize their safety settings. A parent can link their TikTok account to their teen’s and set controls, such as limiting how much time the teen can spend on the app each day, restricting exposure to certain content, and deciding whether the teen can search for videos, hashtags, or Live content. TikTok also offers a Guardian’s Guide that outlines how parents can protect their kids on the platform.
To protect younger users, TikTok restricts access to some features, including Live and direct messaging. When teens under 16 are ready to publish their first video, a pop-up asks them to choose who can watch it. Push notifications are also curbed after 9 p.m. for users aged 13 to 15 and after 10 p.m. for users aged 16 to 17.
TikTok plans to increase awareness of its parental control features in the future.
Discord
Discord did not appear before the Senate last year, but it has faced criticism over how difficult it is to report problematic content and how easily strangers can contact young users. In response, the company recently updated its Safety Center, which gives parents guidance on enabling safety settings, FAQs on how Discord works, and advice on discussing online safety with teenagers. Existing parental controls include the ability to prevent minors from receiving friend requests or direct messages from unknown individuals.
However, minors can still interact with strangers on public servers or in private chats if another member of the room invites them or if a channel link is shared in a public group they have joined. By default, all users, including those aged 13 to 17, can receive friend requests from anyone in the same server, which in turn opens the door to private messaging.