What’s Trending? (Privacy a la Mode)
Notable fashion brands have entered a “trial period” with new technologies just as privacy laws and privacy enforcement are trending: exploring branded digital assets in video games, virtual reality (VR) and augmented reality (AR) technology, metaverses, and non-fungible tokens (NFTs). Fashion naturally pushes the envelope, taking on risk rather than losing relevance and cachet by being left behind. These ventures raise several legal issues, from trademark infringement by NFT creators to marketing collaborations, as influencers become an essential component of a brand’s commercial success.
In the case of privacy compliance, new laws (including state laws in California, Colorado, Connecticut, Virginia and Utah), enforcement actions and plaintiff lawsuits are making privacy law in vogue. Whether a fashion brand prefers style over trend and has no plans to explore new online practices, or favors avant-garde tactics and is actively adopting cutting-edge technologies, there are new considerations to account for.
Does the Dress Fit? (Law Applicability)
Fashion brands should take into careful consideration whether these and other new privacy laws apply. Many have specific thresholds, and before raising a red flag to product and marketing teams, the extent of applicability should be measured. For example, the California Privacy Rights Act (CPRA) includes a concept called “common branding”: If a business that does not itself meet the CPRA’s applicability thresholds shares common branding (e.g., similar trademarks) with a business that must comply with the CPRA, the first business must comply as well, despite falling below the thresholds on its own. Fashion brands should consider the corporate structure at play and where consumers may understand brands to be connected.
It’s Giving … Needs a Makeover Vibes (Cookies and Other Tracking Technologies)
According to Gen Z, Flares Are Back In (Minors’ Privacy)
Additionally, it is difficult to deny that the fashion industry has grown in popularity among the young: the prevalence and accessibility of social media enable everyone to consume fashion content. This has changed fashion culture so that what’s “in” may differ drastically from week to week and from one age group to another, depending on what’s trending across pockets of online communities. Fashion brands looking to expand their revenue by appealing to minors under the age of 18 should also consider new and existing children’s privacy laws. The Children’s Online Privacy Protection Act (COPPA), which is enforced by the Federal Trade Commission (FTC), may come into play if a fashion brand markets its products on platforms popular among children under 13. Additionally, the California Age-Appropriate Design Code Act (CAADC), enforceable in 2024, applies broadly to online properties that minors under 18 may interact with, leaving little room for fashion brands to avoid its obligations. Fashion brands whose marketing is even remotely targeted to those under 18 should consider the CAADC’s requirements, including performing a Data Protection Impact Assessment that, among other things, identifies how personal information is used, the risks of material harm to minors that may arise, the design of the product and how that design mitigates potential harm, and the use of targeted advertising and algorithms.
Fabric Comes in Many Patterns (Algorithms)
Changes in fashion culture may lead brands to venture into diversity and inclusion programs, externally for consumers as well as internally for employees and designers. These programs can drive successful marketing moments, such as online avatar expression in AR/VR and metaverse experiences. Just as “who you’re wearing” matters in the physical realm, fashion will matter in online experiences: “who” your avatar is wearing carries weight, because the way imagery comes across in an online experience is an individual’s social currency. To improve these experiences, brands may explore technology such as algorithms or machine learning to reduce bias. For example, certain AR experiences, such as beauty filters, may require data sets reflecting ethnic and racial diversity so the filters do not represent the consumer inappropriately during use (e.g., a makeup filter that does not work on a darker skin tone). Such algorithms, however, may be subject to regulatory scrutiny and compliance standards. There is precedent for the FTC requiring an entity to delete an algorithm, its outputs and the data that fed the algorithm when unfair or deceptive business practices occur. Additionally, the CPRA regulations (not yet fully finalized) are meant to address consumer rights to opt out of certain automated decision-making. In a different vein, New York City’s local law on automated employment decision tools may also require compliance steps if a fashion brand seeks to use machine learning to ensure a diverse pool of candidates to drive its brand forward.
Serving Looks … in a Blockchain (NFTs)
Outfitting Your Privacy Strategy (Key Takeaways)
In any case, whether or not a fashion brand is considering new technologies, new legal trends in privacy demand that brands mitigate risk through a privacy compliance implementation strategy. Depending on the applicable law, such a strategy could, or in some cases must, include:
- Publishing robust terms of service and privacy policies and notices on online properties, including terms addressing digital asset purchases, digital subscriptions and refunds.
- Executing data processing agreements or addenda with vendors that will process personal information for the fashion brand, such as app developers, analytics service providers and independent designers.
- Performing privacy impact assessments (PIAs) to assess the mitigations necessary when implementing new or novel personal information processing, such as using data in a new way, collecting new data types or applying a new technology to data already collected. For example, a brand not historically marketed to minors should perform a PIA to determine what risk mitigations should take place before expanding into that age group. A PIA should also be conducted for brands looking to use algorithms to further diversity goals.
- Using vendors to cover privacy compliance areas for efficient resource allocation, such as using a tracking technology/cookie consent manager vendor to facilitate opt-outs or opt-ins on online properties or for data mapping purposes. BakerHostetler has connections with such vendors.
- Considering how platforms and vendors that the fashion brand uses comply with applicable law, such as those that may require consent on the platform side.
- Implementing user interface design guidelines to avoid privacy “dark patterns” and address specific legal requirements, such as how consent interfaces should look and feel. These can be consistent with the brand’s aesthetic. Consider a recent FTC Staff Report addressing “dark patterns.”
- Considering privacy at all levels of the fashion brand, from consumers to employees to B2B relationships. Consider the FTC’s Advance Notice of Proposed Rulemaking (ANPR) seeking public comment on possible new rules for data security, commercial surveillance and other broad privacy issues.
- Keeping up with privacy legal movements such as plaintiff attorney litigation trends and legislation. Consider the American Data Privacy and Protection Act moving through Congress, which may not pass but provides a sense of what is on the minds of regulators and legislators.
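To make the opt-out mechanics above concrete, one signal a consent manager (or a brand’s own web stack) may need to honor is the Global Privacy Control (GPC), which browsers can transmit as a `Sec-GPC: 1` request header. The sketch below is a hypothetical, minimal server-side check, not legal advice or a complete compliance implementation; the function name and the decision to treat the signal as a CPRA-style opt-out of sale/sharing are illustrative assumptions.

```python
def should_opt_out_of_sale(headers: dict) -> bool:
    """Return True when the request carries a Global Privacy Control signal.

    Under the GPC proposal, a `Sec-GPC: 1` request header signals the user's
    opt-out preference (which California regulators have indicated should be
    treated as a request to opt out of the sale/sharing of personal
    information). This helper is a simplified illustration only.
    """
    # HTTP header names are case-insensitive, so normalize keys before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

In practice, a brand would typically rely on its consent-management vendor to detect and propagate this signal to downstream advertising and analytics tags rather than hand-rolling the check.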