
[co-author: Stephanie Kozol]*
On February 4, the Office of the Minnesota Attorney General (AG) released its second Report on Emerging Technology and Its Effect on Youth Well-Being, outlining the effects young Minnesota residents allegedly experience from using social media and artificial intelligence (AI). The report highlights alleged adverse effects that technology platforms have on minors and claims that specific design choices exacerbate these issues.
Minnesota law requires the AG to issue reports that (1) evaluate the impact of technology companies and their products on the mental health and well-being of Minnesotans, with a focus on children; (2) discuss proposed and enacted consumer protection laws related to the regulation of technology companies in other jurisdictions; and (3) include policy recommendations to the Minnesota legislature. Upon releasing the report, Minnesota AG Keith Ellison voiced his concerns about minors’ use of social media and expressed his belief that companies need to “establish reasonable guardrails to protect young people online[.]”
Specific findings discussed in the report include:
- The Effects of Emerging Technology on Minnesotans, Especially Youth: The report claims that, even though some users find community on social media platforms, younger users experience bullying and harassment on them. The report also raises concerns that younger users may view unwanted sexual content; experience negative social comparison; receive unwanted contact from strangers; risk the misuse of their personal data; and experience risks of compulsive technology use, with potential impacts on sleep and socialization.
- The Legislative and Legal Landscape: According to the report, legislation regulating social media has been introduced and enacted in various jurisdictions, such as the UK, the EU, and Australia. Per the report, these regulations have focused on regulating illegal content, such as graphic or violent images, and attempting to reduce the prevalence of bullying and harmful content. For example, the EU enacted the “Digital Services Act,” which requires social media platforms to disclose how their algorithms operate and to develop added transparency for advertising and content moderation. In the U.S., several states have begun implementing similar legislation, but some have faced opposition based on potential First Amendment violations.
- What Can We Learn From Previous Legislative Efforts?: As detailed in the report, Minnesota believes that for states to successfully regulate social media, proposed legislation must survive challenges based on the First Amendment and Section 230 of the Communications Decency Act. California previously faced obstacles in its attempts to regulate social media because its proposed legislation applied directly to expressive content. Minnesota and Ohio have likewise faced difficulty enforcing their social media regulations due to the alleged overbreadth and vagueness of their proposed legislation. In response, Minnesota claims that constitutional issues can be avoided by including precise language in proposed legislation and by refraining from regulating expressive content.
- The Current and Expanding Impact of AI: Minnesota has concerns regarding the widespread presence and use of AI. In the report, Minnesota claims AI has been used to create “deepfakes,” which may contribute to online bullying. Additionally, Minnesota alleges that AI may lack necessary and appropriate safeguards for young users, making its use risky.
- Policy Recommendations: Minnesota recommends implementing new policies that protect children, rather than attempting to restrict their social media usage, all while respecting privacy and First Amendment concerns. These policy recommendations include banning “deceptive patterns” in social media, implementing technology education in schools, and implementing greater transparency about platform features.
Ellison also included six model bills aimed at promoting additional safety measures on social media, translating the report’s recommendations into draft legislation for the Minnesota legislature as well as for other jurisdictions.
Why It Matters
The Minnesota AG’s report demonstrates that state legislatures and AGs have prioritized scrutinizing social media companies, with a particular focus on how those companies may affect minors. This trend is further evidenced by the increasing number of social media bills introduced in state legislatures, as well as by ongoing social media litigation in nearly every state. As a result, social media platforms — particularly those whose user bases include a significant number of minors — must closely monitor industry litigation trends and be prepared to adjust their policies as necessary.
*Senior Government Relations Manager