Discord is rolling out new “teen-by-default settings” globally, a significant update aimed at making the platform safer and more age-appropriate for users aged 13 and above. The phased rollout, scheduled to begin in early March, will automatically apply age-appropriate experiences to all new and existing users, including updated communication preferences, restricted access to age-gated content, and advanced privacy-preserving content filtering.
Enhancing User Safety Through Default Settings
The core of this initiative is the automatic configuration of safety features. These “teen-by-default settings” protect younger users out of the box, and changing them requires specific actions and, in some cases, age verification. By embedding these safeguards into the foundational experience for teenagers rather than offering them as optional features, Discord is making a clear commitment to user well-being.
Key safety measures include content filters that blur sensitive media; users must complete age assurance before they can view it unblurred. Access to age-gated channels, servers, commands, and message requests will be restricted for non-adults. In addition, message requests from unknown users will be routed to a separate inbox, and warning prompts will appear for unfamiliar friend requests, adding layers of protection against unwanted contact. These features align with broader industry efforts to shield vulnerable users from evolving threats, including phishing attacks that target people through everyday communication channels.
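To make the idea of “teen-by-default” concrete, the following is a minimal, hypothetical sketch of how such defaults could be modeled for an account that has not been verified as an adult. The type names, fields, and function here are illustrative assumptions, not Discord’s actual code or API.

```typescript
// Hypothetical illustration only: not Discord's actual code or API.
// Models how teen-safe defaults could be applied to any account that has
// not been verified as an adult.

type AgeStatus = "unverified" | "teen" | "adult";

interface SafetySettings {
  blurSensitiveMedia: boolean;         // content filter blurs flagged media
  allowAgeGatedContent: boolean;       // age-gated channels, servers, commands
  routeUnknownDMsToRequests: boolean;  // unknown senders go to a separate inbox
  warnOnUnfamiliarFriendRequests: boolean;
}

function defaultSettingsFor(status: AgeStatus): SafetySettings {
  // Protective defaults apply unless the account has been verified as an adult.
  const isVerifiedAdult = status === "adult";
  return {
    blurSensitiveMedia: !isVerifiedAdult,
    allowAgeGatedContent: isVerifiedAdult,
    routeUnknownDMsToRequests: !isVerifiedAdult,
    warnOnUnfamiliarFriendRequests: !isVerifiedAdult,
  };
}

// A new or unverified account receives the protective defaults.
console.log(defaultSettingsFor("unverified"));
```

The key design point this sketch captures is that protective settings are the starting state for everyone, and only a verified adult status relaxes them.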
Age Verification and Privacy Considerations
To make these settings effective and to manage access to age-restricted content, Discord will implement age verification processes. Users may be prompted to verify their age in order to change certain default settings or to access specific content. Discord’s age assurance methods involve either facial age estimation or the submission of identification documents to trusted vendor partners. Crucially, these processes are designed with privacy in mind: video selfies are processed on-device, and identity documents are deleted promptly after verification. The approach aims to balance robust safety measures with user privacy, an ongoing challenge for digital platforms that also surfaces in regulatory debates over online surveillance and data handling, such as reports of Dutch police hacking iPhones for real-time monitoring.
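As a rough illustration of the gating described above, the sketch below assumes a hypothetical verification record that stores only the outcome of age assurance, consistent with on-device selfie processing and prompt deletion of documents. All identifiers are invented for this example and do not reflect Discord’s implementation.

```typescript
// Hypothetical sketch: an age-assurance gate in front of protected settings.
// Names are invented for illustration and do not reflect Discord's systems.

type VerificationMethod = "facial_age_estimation" | "id_document";

interface AgeAssuranceRecord {
  verifiedAdult: boolean;
  method?: VerificationMethod;
  // Consistent with the stated privacy approach, only the outcome is kept:
  // selfies are processed on-device and documents are deleted after checks.
}

function requestSettingChange(record: AgeAssuranceRecord, setting: string): string {
  if (record.verifiedAdult) {
    return `Setting "${setting}" updated.`;
  }
  // Unverified users are prompted to complete age assurance first.
  return `Age verification required before changing "${setting}".`;
}

console.log(requestSettingChange({ verifiedAdult: false }, "blurSensitiveMedia"));
```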
These changes arrive as digital platforms face increased scrutiny over the protection of younger users. Discord’s move is a timely response to concerns raised by security agencies worldwide about online interactions; German security agencies, for instance, have warned of state-sponsored phishing attacks delivered via messenger services, underscoring the need for robust platform-level protections.
The Discord Teen Council: Integrating Youth Perspectives
In a further step towards creating a truly user-centric safety framework, Discord is launching its inaugural Teen Council. This council will comprise 10-12 teenagers who will play an integral role in shaping product features, policies, and educational resources by offering authentic adolescent perspectives. The application period for the Teen Council is open until May 1, 2026. This initiative underscores Discord’s commitment to involving its younger user base directly in the development of a safer platform, building on existing safety architectures as noted by Savannah Badalich, Discord’s Head of Product Policy.
The global rollout of these teen-by-default settings marks a significant evolution in Discord’s commitment to user safety, ensuring a more protected and age-appropriate online experience for millions of young users worldwide.

