Asia Funds ESG Report 2023

Morrison & Foerster LLP
What is Responsible Tech for Private Equity Funds?

At a minimum, responsible tech is the active consideration of the potential unintended consequences and negative impacts of technology on individuals, society, and the environment, and the proactive mitigation of those risks and harms. At its best, responsible tech means designing, deploying, and using technology for good, so that it promotes a fair, inclusive, and thriving society founded on fair and transparent interactions between people and technology and is mindful of technology's impact on the environment. Responsible tech fits within Corporate Social Responsibility, whereby companies, in addition to following the law, integrate social, environmental, ethical, consumer, and human rights concerns into their business strategies and operations. Numerous laws around the globe make compliance with responsible technology principles a statutory duty, for example, in relation to energy use, bias, use of data, and more. In certain areas, fines for non-compliance can reach 4% or 6% of global turnover, and wrongful acts or non-compliance with law can, at times, result in imprisonment.

Often, responsible tech is equated with ensuring that deployment of new technologies complies with data protection and cybersecurity requirements. Responsible tech, however, can only be achieved if this notion is applied throughout the entire lifecycle of disruptive technologies, which consists of three stages: how these technologies are designed and developed; how they are deployed or to whom they are sold; and how they are used by an organization other than the original creator.[1] Responsible tech requires different actions by different players in each phase.

Responsible Tech in the Three Phases of the Lifecycle

  • Design and Development. Responsible innovation principles should be designed into the technology from the beginning. For example:
    • Inclusive Design. Consider use of the product by differently-abled individuals in the design phase. Individuals with visual impairments or mobility constraints often have to wait months or years for accommodations to be built into applications and devices so they can use them. Various countries’ laws also demand a certain level of inclusivity for products and services.
    • Bias. Despite the attention given in recent years to racial and gender inequity, as well as to individuals from underprivileged backgrounds, underrepresented individuals still face persistent bias from new technologies. Smartphone biometrics and cameras can struggle to perceive and render the skin tones of non-white individuals. Unbalanced training datasets in payments and banking products can result in Black or lower-income applicants being denied credit at higher rates than white or wealthier applicants. There is already case law on this, as well as laws that apply to bias and discrimination in technology. The use of diverse and representative datasets can mitigate bias and reduce discrimination in algorithms. Diverse and representative development teams can further provide alternate perspectives or new cultural contexts that help identify unanticipated issues or unintended use cases for review.
    • Energy Efficiency. New technologies should be designed and developed to reduce energy use, resource use, and waste; to increase energy efficiency; and to be able to measure and log energy and resource consumption and, where technically feasible, other environmental impacts.
  • Deployment: Kranzberg's first law of technology holds that technology is neither good nor bad, but nor is it neutral. Its impact depends on the purposes for which it is deployed. A company can seek to influence how its technology is deployed by requiring buyers and users to sign Acceptable Use Policies, and by enforcing them. Alternatively, some high-risk uses or users can be contractually excluded. Examples range from drones, facial recognition techniques, and autonomous weapons to algorithmic decision-making. Certain statutes around the globe also designate particular use cases as illegal. This is important to consider when deploying, but also when developing, technology. If you are deploying a third party's technology, it is important to ensure that the deployment is legal, as various use cases are either not permitted in certain countries or, as mentioned above, high risk.
  • Use/Application: Once the technology company has done what it can during development (and, where it assists with deployment, at that stage too), responsibility also falls to the organization using the technology, including at the deployment stage if it deploys the technology itself. For example, in the retail industry, stores are deploying facial recognition to prevent theft and to share data about shoplifters across shops. Depending on how shop owners use this technology, one small mistake by a minor could mean they are never able to shop there again, which would seem an excessive punishment when weighed against the interests of the shop owner.[2] Given the known error rates of facial recognition for non-white persons, this further creates discrimination risks, especially for vulnerable populations and marginalized groups.

Take China as an example: responsible tech practices are evident in leading online streaming websites like Tencent Video and iQiyi, which offer “Youth Mode” to monitor and restrict age‑inappropriate content, akin to U.S. film ratings. Additionally, WeChat and Alipay, two of China’s most popular mobile apps, have developed accessibility features for visually impaired users, enhancing their social and payment experiences. Moreover, apps such as Meituan (a food delivery and service platform) and DiDi (a ride-sharing service) have introduced “Elderly‑Friendly Mode,” simplifying interfaces and accommodating older users, reflecting a commitment to inclusivity and responsible tech adoption. These examples demonstrate how China’s tech industry is actively responding to diverse user needs while promoting a safer and more accessible digital environment.

How Can PE Funds Factor Responsible Tech into Their Investment Decisions?

By assimilating essential responsible tech considerations into their investment procedures, private equity (“PE”) funds can simultaneously mitigate risks and position themselves as conscientious custodians of capital. In doing so, they contribute to the wider acceptance of ethical and sustainable technology practices across the corporate landscape. PE funds can:

  • Incorporate ESG criteria into their investment screening and due diligence protocols, conducting thorough assessments of the technology-related aspects of potential investments. In some countries, non-compliance with statutory energy-consumption requirements may affect value.
  • Conduct responsible tech-specific reviews to evaluate how a company develops, uses, and safeguards its technology assets. This may also involve pre-investment examination of whether a key technology has the potential to yield enduring value or, conversely, result in unforeseen adverse ramifications.
  • Ensure that any data used by the technology company for its products or services (whether training data, data used for marketing purposes, or otherwise) has been used lawfully. Non-compliance here can be hugely damaging both to valuation and from a PR perspective; there are numerous examples of share prices falling significantly, and of customer attrition running into the millions, where there is a perception or an actual finding that technology was not used or deployed responsibly.

How Can PE Funds Factor Responsible Tech into the Management of Their Portfolio Companies?

PE funds can use their influence to push for sustainability initiatives and responsible data handling within their portfolio companies. At a minimum, they should regularly assess those companies’ compliance with laws, responsible technology principles, and societal expectations (encouraging self-reporting as well as periodic audits) and, if needed, help put in place a program for improvement, especially to ensure the companies are fit for a future sale. For example, they can review the measures in place to protect user data, ensure data privacy compliance, and address potential risks related to data breaches, which can have serious legal, financial, and reputational consequences. They can also encourage their portfolio companies to share knowledge and best practices related to responsible tech to foster common value recognition. Portfolio management to enhance responsible tech should also include encouraging portfolio companies to report on their responsible tech practices. Transparent reporting can enhance the reputations of both the portfolio company and the parent fund and attract socially responsible investors; it will become increasingly important for valuations in many countries in the coming years and is already a prominent topic in the UK and continental Europe, for example. Finally, it is important for PE funds’ portfolio companies to engage with various stakeholders, including customers, employees, and communities, to understand their concerns and expectations regarding responsible tech.

Is it Appropriate to Apply Responsible Tech in Asia?

The comparatively low level of familiarity in parts of Asia with the foundational concepts underlying responsible tech can present a challenge. To many businesspeople in the region, “responsible tech” is synonymous with, and limited to, data privacy and cybersecurity. Post‑GDPR, many Asian countries have adopted or updated prescriptive data privacy regimes and considered cybersecurity implications, so implementing “responsible tech” is often equated in the region with somewhat mechanical compliance with specific statutory rules in these areas.

This narrow approach misses key benefits of the broader concept of responsible tech: it helps ensure compliance with the laws of responsible tech-focused markets, inside or outside China, into which companies may sell or license, and it provides the ability to mitigate business risk in novel areas where regulatory approaches have not yet coalesced.

Recent advances in AI technology have had a tremendous market impact with the introduction of sophisticated generative AI tools that leverage large datasets to generate text, images, computer code, music, and other content. By reducing the cost and effort involved in content production, this technology may transform entire industries, but it also challenges established IP ownership concepts and has the potential to facilitate deepfakes, systemic bias, and misinformation campaigns; the situation is ripe for regulatory intervention.

In the last few months, China has issued binding regulations on generative AI, and Japan has indicated that it will bolster its existing machine-learning IP rules with a draft code of conduct by the end of 2023 to try to address these risks. Other Asian jurisdictions seem likely to follow, and many non-Asian countries have addressed, or are addressing, these risks, which again becomes important where products or services are to be marketed, sold, or licensed around the globe. The specific approaches that China and Japan have taken differ considerably and may continue to evolve. This rather fluid regulatory situation injects unwelcome uncertainty into the value (and possibly even the viability) of portfolio companies operating in the AI space. However, while it may be difficult to predict the exact laws that will emerge once the initial dust has settled, any laws are likely to be based on foundational responsible tech concepts and principles, such as curating AI training datasets to avoid systemic bias, applying ethical filters to limit the risk of unlawful or discriminatory outputs, designing products to give users control over any reuse of their inputs, and considering environmental impacts. By embracing this broader concept of responsible tech and incorporating it into their investment screening and due diligence protocols, as well as their ongoing work with their portfolio companies, PE funds can effectively mitigate many of the risks associated with the current and emergent regulatory environment.


[1] See: “Responsible Use of Technology” White Paper.

[2] See: “With facial recognition, shoplifting may get you banned in places you’ve never been”.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Morrison & Foerster LLP | Attorney Advertising
