Navigating the Intersections of Data, Artificial Intelligence, and Privacy

Companies can expertly address AI-related privacy concerns with the right knowledge and team.

While the U.S. is still working out privacy laws at the state and federal levels, artificial and augmented intelligence (AI) are evolving and becoming commonplace for businesses and consumers. These technologies are driving new privacy concerns. Years ago, consumers feared a stolen Social Security number; now, organizations can uncover political views, purchasing habits, and much more. The repercussions of data exposure are broader and deeper than ever.

H5 recently convened a panel of experts to discuss these emerging issues and ways leaders can tackle their most urgent privacy challenges in the webinar “Everything Personal: AI and Privacy.”

The panel featured Nia M. Jenkins, Senior Associate General Counsel, Data, Technology, Digital Health & Cybersecurity at Optum (UnitedHealth Group); Kimberly Pack, Associate General Counsel, Compliance, at Anheuser-Busch; Jennifer Beckage, Managing Director at Beckage; and Eric Pender, Engagement Manager at H5; and was moderated by Sheila Mackay, Managing Director, Corporate Segment at H5.

While the regulatory and technology landscape continues to change rapidly, the panel highlighted key takeaways and solutions that leaders should consider to protect and manage sensitive data:

  • Build, nurture, and utilize cross-functional teams to tackle data challenges

  • Develop robust and well-defined workflows to work with AI technology

  • Understand the type and quality of data your organization collects and stores

  • Engage with experts and thought leadership to stay current with evolving technology and regulations

  • Collaborate with experts across your organization to learn the needs of different functions and business units and how they can deploy AI

  • Enable your company’s innovation and growth by understanding the data, technology, and risks involved with new AI

Develop collaboration, knowledge, and cross-functional teams

While addressing challenges related to data and privacy certainly requires technical and legal expertise, the need for strong teamwork and knowledge sharing should not be overlooked. Nia Jenkins said her organization utilizes cross-functional teams, which can pull together privacy, governance, compliance, security, and other subject matter experts to gain a “line of sight into the data that’s coming in and going out of the organization.”

“We also have an infrastructure where people are able to reach out to us to request access to certain data pools,” Jenkins said. “With that team, we are able to think through, is it appropriate to let that team use the data for their intended purpose or use?”

In addition to collaboration, well-developed workflows are paramount. Kimberly Pack explained that her company has a formalized team that meets bi-monthly and defined workflows that are improving daily. She emphasized that it all begins with “having clarity about how business gets done.”

Jennifer Beckage highlighted the need for an organization to develop a plan, build a strong team, and understand the type and quality of the data it collects before adopting AI. Businesses have to address data retention, cybersecurity, intellectual property, and many other potential risks before taking full advantage of AI technology.

Engage with internal and external experts to understand changing regulations

Keeping up with a dynamic regulatory landscape requires expanding your information network. Pack was frank that it’s too much for one person to keep up with alone. She relies on following law firms, becoming involved in professional organizations and forums, and connecting with privacy professionals on LinkedIn. As she continually educates herself, she creates training for various teams at her organization, including human resources, procurement, and marketing.

“Really cascade that information,” said Pack. “Really try to tailor the training so that it makes sense for people. Also, try to have tools and infographics, so people can use it, pass it along. Record all your trainings because everyone’s not going to show up.”

The panel discussed how their companies are using AI and whether there’s any resistance. Pack noted her organization has carefully taken advantage of AI for HR, marketing, enterprise tools, and training. She said that providing your teams with information and assistance is key to comfort and adoption.

“AI is just a tool, right?” Pack said. “It’s not good, it’s not bad.” The privacy team conducts a privacy impact assessment to understand how the business can use the technology. Then her team places any necessary limitations and builds controls to ensure the technology is used ethically. Pack and Jenkins both noted that companies must proactively address potential bias and not allow automated decision-making.

Evaluate the benefits and risks of AI for your organization

The panel agreed organizations should adopt AI to remain competitive and meet consumer expectations. Pack pointed out the purpose of AI technology is for it to learn. Businesses adopting it now will see the benefits sooner than those that wait.

Eric Pender noted advanced technologies are becoming more common for particular uses: cybersecurity breach response; document production, including privilege review and identification of Personally Identifiable Information (PII); and defensible disposal. Many of these tasks have tight timelines and require efficiency and accuracy, which AI provides.
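As a simplified illustration of the PII-identification task described above, the sketch below flags two common PII types with rule-based patterns. This is an assumption-laden toy, not the panelists' method: production review platforms typically pair rules like these with trained AI models to reach the accuracy Pender described.

```python
import re

# Toy patterns for two common PII types (illustrative only; real review
# platforms combine rules like these with machine-learning models).
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def flag_pii(text: str) -> dict:
    """Return every PII match found in `text`, grouped by type."""
    return {
        kind: pattern.findall(text)
        for kind, pattern in PII_PATTERNS.items()
        if pattern.search(text)
    }

# Hypothetical document snippet for demonstration.
doc = "Contact jane.doe@example.com; SSN on file: 123-45-6789."
print(flag_pii(doc))
```

A pass like this can help triage large document sets quickly, but false negatives (names, addresses, unformatted numbers) are exactly the gap that AI-based review aims to close.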

The risks of AI depend on the nature of the specific technology, according to Beckage. It’s each organization’s responsibility to perform a risk assessment, determine how to use the technology ethically, and perform audits to ensure the technology is working without unintended consequences.

Facilitate innovation and growth

It is also important to remember that in-house and outside counsel don’t have to be “dream killers” when it comes to innovation. Lawyers with a good understanding of their company’s data, technology, and ways to mitigate risk can guide their businesses in taking advantage of AI now and years down the road.

Pack encouraged compliance professionals to enjoy the problem-solving process. “Continue to know your business. Be in front of what their desires are, what their goals are, what their dreams are, so that you can actively support that,” she said.

Pender said companies are shifting from a reactive approach to a proactive one, and advised that “data that’s been defensively disposed of is not a risk to the company.” Though implementing AI technology is complex and challenging, managing sensitive, personal data is achievable, and the potential benefits are enormous.

Jenkins encouraged the “four B’s.” Be aware of the data, be collaborative with your subject matter experts, be willing to learn and ask tough questions of your team, and be open to learning more about the product, what’s happening with your business team, and privacy in an ever-changing landscape.

Beckage closed out the webinar by warning organizations not to reinvent the wheel. While it’s risky to copy another organization’s privacy policy word for word, organizations can learn from practitioners in the privacy space who are doing this work well.

Written by:

H5