On October 6, the Children’s Advertising Review Unit (CARU) announced its finding that Discord, a social media platform that provides text, voice and video communication services via desktop, browser and mobile applications, is not an online service directed to children as defined by the Children’s Online Privacy Protection Act (COPPA) and CARU’s Guidelines for Online Privacy Protection.
Discord is popular with gamers who play multiplayer video games. During its routine monitoring of child-directed content, CARU identified a number of Discord channels devoted to games with significant numbers of players under 13, including Fortnite, Pokémon Go and Roblox. Many YouTube influencers with large teen and tween followings also have channels on Discord and encourage viewers to visit them. Concerned about the potential for an online audience of children under age 13, CARU considered whether the online service would be “child-directed” under COPPA and CARU’s Guidelines.
When CARU initially reviewed Discord, it found that, in order to register for an account, a user must supply an email, username and password. Discord did not ask for the age of the user. CARU was concerned that, if Discord were in fact directed to children under COPPA, it was collecting and allowing the disclosure of personally identifiable information from children younger than 13 without first obtaining verifiable parental consent.
The space for each community on Discord is called a “server.” The majority of servers are invitation-only spaces where users communicate and collaborate only with users they choose. Discord also hosts some public servers, which users may opt into. Discord noted that, since its inception, its product has expanded into a general communication platform for all sorts of groups. Discord argued that its platform should be considered a general audience platform by design and that, since its inception, it has taken the following steps to ensure that its online service is and remains directed primarily to a general audience:
- Ensuring its mobile app is not, and has never been, posted in the “Kids” or “Family” section of any app store (and further noting that it is rated in Google Play as “Teen,” which means “Content is generally suitable for ages 13 and up,” and as 12+ in the Apple App Store).
- Not marketing or advertising the platform to individuals under 13 or to younger teens—for example, by recruiting only participants ages 18 and up for marketing focus groups and setting its Facebook interest-based ad campaign to audiences ages 18 and up.
- Designing its user interface so that it is adult-oriented and challenging, even for sophisticated users. Discord explained that this adult-oriented design is reflected in the popularity of user-created servers related to complex, hardcore games—such as Rainbow 6, Valorant and Destiny—and to nongaming topics, such as software engineering, COVID-19 and LGBTQ+ activism and solidarity.
- Developing a business model that is based on paid subscriptions sold at the high price point of $99.99 a year or $9.99 a month. Discord further noted that it has never been supported by behavioral advertisements or the monetization of user data.
During the CARU investigation, Discord acknowledged that its platform attracts teen users. For that reason, it has taken steps to promote the online safety of its users and prevent users under 13 from joining the service, including:
- Launching a Safety Center to guide parents on how to help their teen children stay safe on the platform, including step-by-step instructions on how to prevent teen users from receiving messages from strangers and how to block inappropriate content.
- Creating a Trust and Safety team of 35 highly trained employees (which, it noted, makes up over 15 percent of its workforce) to investigate and review reports and complaints and enforce the company’s policies and community guidelines, including use of the platform by underage users.
- Making available a variety of methods for reporting underage users, including a Trust & Safety form for submitting a “Parent of a User” request.
- Ensuring reports of users under 13 are promptly investigated, and immediately removing users from the platform if the service becomes aware that they are under 13, regardless of whether a report or complaint has been submitted. Discord was able to show that users who are removed for this reason can go back on the platform only if they satisfy a rigorous appeals process.
Based on the arguments set forth by Discord, CARU had to determine whether the platform, which users use to communicate through text, voice and video, was directed to children as defined in COPPA and, if so, whether it complied with COPPA and CARU’s guidelines on online privacy protection. After reviewing the platform and all the evidence presented by Discord, CARU determined that the platform was not directed primarily or secondarily to children under 13.
Why it matters: CARU agreed with Discord’s evidence that its platform is a general audience service that is primarily directed to adults and secondarily directed to teens. In reaching its determination, CARU relied on the following evidence:
- Evidence of intent to direct its online service to adults who engage in competitive online gaming
- The mature content of the most popular games for which Discord servers are created on the platform
- Discord’s dissimilarity with social media platforms that are popular with children (e.g., it does not include searchable public profiles or rewards and reputations for “liking” and sharing content, features traditionally attractive to children and younger teens)
- The lack of advertising campaigns directing its service to children or to a younger teen (ages 13-17) audience
- Use of a subscription-based business model rather than a typical data-driven social media model that monetizes online behavioral data
- The implementation of a large Trust & Safety team that monitors the service and responds quickly to reports of underage use
The factors cited above by CARU in finding that Discord is a general audience service can be used as practical guidance for operators of websites and online services in determining whether a service is directed to children under COPPA. Also, as a general audience service, Discord is permitted under COPPA to screen users on the basis of age and to block children under 13 from using its services. CARU was pleased that Discord decided, after CARU opened its investigation, to implement a neutral age screen as an added measure to ensure that children under 13 do not register for and use the platform.
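The article does not describe how Discord’s neutral age screen works; as a general illustration only, a neutral age screen typically asks for a full birth date without hinting at the qualifying age, so users cannot simply select the “right” answer. The sketch below is a hypothetical example of that idea, not Discord’s actual implementation:

```python
from datetime import date
from typing import Optional

# COPPA threshold. A neutral screen does not reveal this cutoff to the user.
MIN_AGE = 13

def is_old_enough(birth_date: date, today: Optional[date] = None) -> bool:
    """Return True if the user is at least MIN_AGE years old."""
    today = today or date.today()
    # Subtract one year if this year's birthday has not yet occurred.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age >= MIN_AGE
```

Under this approach, a registration flow would collect the birth date as a neutral question alongside other sign-up fields and block account creation when `is_old_enough` returns `False`, rather than asking “Are you over 13?” directly.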
In the wake of the COVID-19 pandemic and distance learning, children are spending significantly more time on mobile and digital devices, both for education and entertainment purposes, which has resulted in both the Federal Trade Commission and CARU ramping up their COPPA enforcement efforts.