Embracing Data Privacy to Drive Business Growth with Aihong Yu, Chief Privacy Counsel of CDK Global

In this episode of On Record PR, Gina Rubel goes on record with Aihong Yu, Chief Privacy Counsel of CDK Global, to discuss how embracing privacy and security measures can accelerate growth for businesses. Aihong oversees CDK Global’s privacy, AI, and data governance programs.

Would you tell us a little bit about CDK Global and what you do for the company?

CDK is a software company. We provide SaaS solutions to car dealers and manufacturers. I usually use simple language to explain what we do because the products themselves are so complicated. We provide the tech solutions that car dealerships and manufacturers use to run the entire dealership. From the time you walk into a dealership and they take your information, to the time you buy a car, take out a loan from the bank, get your insurance, and come back for service and warranty work, all of those back-office operations are handled by CDK’s systems.

Tell us about the significant trends or events that have shaped data privacy and why they’re so important.

CDK Global is a B2B company, but at another level it’s also a B2C company, because we process data on behalf of the dealers, and that means we process all of these consumers’ data. I will tell you that today’s privacy landscape is a very complex web.

In addition to California, which has been the leader in state privacy law in the U.S., four other states had comprehensive privacy laws become effective in 2023. Eight more states have passed their own laws, and most of them will become effective later this year. It’s a huge wave of privacy legislation. Many people are hoping that instead of dealing with this patchwork of state laws, we will get a comprehensive federal law. That has been in talks for a couple of years now, but I don’t think there’s any hope of passing a federal law in an election year. However, some federal agencies have stepped into this area.

Privacy and data security go hand in hand. They’re inseparable. The SEC passed an important regulation last year that requires companies to disclose their cybersecurity practices and risks in their annual and quarterly filings. One of the most impactful provisions of the SEC rule requires public companies to report a material cybersecurity incident within four business days. That is a very demanding timeline, because if there’s a cyber attack, or even a human-error incident, it usually takes a long time to investigate what exactly happened. By the fourth day, you probably still know very little.

Public companies face a dilemma here. On the one hand, you don’t want to disclose too much, only to find out some of it isn’t accurate and then have to retract those statements or risk misleading investors. On the other hand, if you report too little, you’re not complying with the SEC rules. It’s a delicate balance for companies to strike.

The FTC has also been very active in publishing rules, and it is using existing privacy laws to enforce data protection. There were a lot of FTC enforcement cases in 2023. One important rule the agency has updated is the Safeguards Rule, which now imposes a similar reporting requirement: it requires non-banking financial institutions, such as car dealerships and mortgage brokers, to report data incidents, defined as unauthorized access to unencrypted data. That will be very impactful for some industries, because those types of incidents happen frequently. It’s a big change.

On the international front, the EU and the U.S. signed the EU-U.S. Data Privacy Framework. This framework provides a mechanism for companies to transfer personal data from the EU to the U.S., but its future is very uncertain. The U.S. and the EU signed two similar frameworks in the past, one called Safe Harbor and the other called Privacy Shield, and both were invalidated. The Court of Justice of the EU said those agreements were invalid because the U.S. does not provide adequate protection for personal data. The fundamental reason for that is a U.S. law called the Foreign Intelligence Surveillance Act, an old law from 1978. It was enacted to give the U.S. government the authority to conduct surveillance of certain individuals or organizations for foreign intelligence purposes. The concern and criticism is that the law is often abused, that there is a lack of transparency, and that it enables indiscriminate surveillance of individuals and businesses.

Therefore, the future of this new privacy framework is really in doubt. Max Schrems, the Austrian activist and lawyer who successfully challenged the previous two frameworks, filed the complaints that eventually resulted in their invalidation, and he has already indicated he is going to file a complaint against this third one.

The privacy compliance world is complicated for businesses. The critical question is: how do you comply with all of these privacy laws without slowing down technology, innovation, and creative work? I always have this picture of the privacy landscape in my mind. Businesses are driving racing cars, rushing forward across a new field. In the field, you see a lot of objects. Are those objects road blockers or accelerators? The answer depends on how you approach them and from what angle, what kind of strategy you have, and how much effort you’re willing to put in.

Gina Rubel: That’s a brilliant analogy. Are they accelerators or road blockers? You highlighted a number of challenges, one of which is simply the inconsistencies, the ambiguities, and the sheer number of different privacy regulations a global company has to comply with. You mentioned the U.S. and the EU, but as a global company, you’re also managing every single state, because there’s no federal law. I just wrote a blog about New Jersey’s new privacy act, which passed a few weeks ago. Now you have to come up to speed with that on top of everything else. I don’t know how you do it. I think it’s incredibly daunting, and I hope the U.S. comes up with something at the federal level to bring some order to the playing field for people who are working in that space.

What are some of the opportunities that can arise from effectively managing and communicating your data privacy practices?

It really depends on how you look at these things. If it’s a difficult problem and you can solve it the right way, then you’re the winner. If you have a strategy that incorporates privacy into every aspect of your business, where you don’t treat privacy as purely legal compliance but as an operational matter, then you’re going to have a competitive advantage.

What does that mean? I talk to our service people about this every day. When you view privacy as compliance, it becomes a burden; you have to jump through all these hoops to reach the end. But think about what privacy really is. It’s different from anti-bribery rules or a code of business conduct. Privacy is all about how you use data: how you collect personal data, how you store it, how long you keep it, and whether you are implementing a data minimization policy. Are you collecting too much data? Are you considering what consumers’ expectations are?

In today’s digital world, data travels fast, but you can treat privacy as an operational matter. If I tell you there’s a product feature you can add that will instantly make you millions of dollars in revenue, it’s a no-brainer; you’re going to do it right away. The same is true of privacy.

If you incorporate privacy into product development, service processes, and your website, and you earn trust and a good reputation, then you’re going to make money, because at some point that trust converts into profit, either directly or indirectly.

Apple is a great example. When you talk about the big tech companies, we very often hear about enforcement cases against Facebook, Google, Amazon, you name it. All of those companies have paid some sort of fine in enforcement cases, but Apple has been a privacy champion. That’s how it has positioned itself.

I was at an International Association of Privacy Professionals (IAPP) conference where Tim Cook, Apple’s CEO, was the keynote speaker. It was pretty inspiring. That was a couple of years ago, but Apple started its privacy strategy back in 2014. I remember that was the year Tim Cook published a public letter to consumers announcing that the whole reason Apple collects information is to make good products, to make something that is good for you. They’re not going to sell your information. If someone tells you that you have to give them your data so they can make money, don’t trust them. That’s literally what he was saying.

Since then, they have built out all of this privacy infrastructure. For example, the hardware in iPhones and MacBooks is built to process data locally. The devices have enough processing power that Apple can keep certain data on your local device. It is never sent to the cloud, so Apple doesn’t see it. We all use facial recognition instead of a password to unlock our phones, and Siri records our requests; all of that data is kept on your local device. You own your data, and you own your privacy. Apple doesn’t have it, doesn’t see it, and doesn’t send it to the cloud the way other players do.

For example, Apple can do a lot with your photos, like arranging them into a little movie. It uses AI on the device itself to make those videos, and it’s a very attractive feature. Google does something similar; I’m not an Android user, but I know it offers similar features. The difference is that Google has to send the photos and videos to the cloud, and Amazon Alexa works the same way; the data isn’t kept locally.

Apple has positioned itself as a privacy champion, and it has paid off. When you use a third-party app on an iPhone, a message pops up asking whether you will allow the app to track you. Apple forces app developers to get users’ permission before using their data to track them across apps the developers don’t own. I think Apple’s story is going to be a textbook example of how to use privacy as a competitive advantage.
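For readers curious about the mechanics, the prompt Aihong describes comes from Apple’s App Tracking Transparency framework. The snippet below is a minimal, illustrative sketch, not something discussed in the interview, showing how an iOS app must ask for that permission before it can read the device’s advertising identifier:

```swift
import AppTrackingTransparency
import AdSupport

// Minimal sketch: ask the user for permission before any cross-app tracking.
// Until the user taps "Allow," the advertising identifier (IDFA) is unavailable.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user consented; the IDFA can be read for ad attribution.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorized, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // No consent: the IDFA is returned as all zeros and must not be used.
            print("Tracking not permitted")
        @unknown default:
            break
        }
    }
}
```

If the user declines, the identifier the app receives is all zeros, which is what makes the permission meaningful rather than cosmetic.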

The world has witnessed many high-profile data breaches. How have those types of incidents affected the perception of data privacy among consumers, and what lessons can businesses learn from those situations?

That’s a question we deal with every day. We all know the high-profile data breaches involving big names – Target, Walmart, Facebook, Microsoft, Verizon, SolarWinds, Colonial Pipeline. All of these big names have been involved in some sort of major data breach.

Those high-profile data breaches are very visible because they’re headline news. A lot of us have received letters from these companies telling us, “Unfortunately, your data was breached, leaked, or hacked and obtained by someone else.” In today’s world, data breaches are simply unavoidable.

Ironically, if there’s anything good that comes out of it, it’s awareness. These breaches have helped raise the public’s awareness of data protection. Consumers are now starting to say, “Whichever business has my data, I want to know what kind of company it is and whether it has reasonable security measures in place to safeguard my data. I don’t even know what information you have about me. I have all these rights under the privacy laws, and I can go to any business and ask: What information do you have about me, and how do you use it? Do you sell my information? Who do you share it with, and how long do you keep it? After I’m done with your business, are you going to purge everything, or are you going to keep my data somewhere, where it will travel around the internet?” That rise in consumer awareness is a great thing. For companies, it has had the same effect; it has raised awareness among CEOs. The question for a company nowadays is not whether you’re going to have a data breach, it’s when.

Every company has had some sort of incident, whether it’s a cyber attack or something caused by human error, like a data leak. That gives companies an incentive to train their employees and make sure they have the right security and privacy programs in place, which makes the work of privacy and security professionals a little bit easier.

Another lesson companies have learned is that you have to be prepared. That means having a very solid data incident response plan. You not only have to have it written down on paper; you have to test it again and again, so that when the real thing comes, you are prepared and can respond the way you planned.

Gina Rubel: You are speaking my language. I cannot tell you how often we say that as a public relations agency. Most of what I do is crisis communications, everything from crisis planning to tabletop exercises. For law firms and outside counsel, make sure your firms have them, because your partners expect them. And anyone who doesn’t carry cyber insurance almost feels years and years behind.

What emerging trends do you foresee in the intersection of data privacy and business strategy?

One thing is for sure: privacy laws are here to stay. Not only are they here to stay; enforcement will accelerate, and companies should take note. We’re in the sixth year of the General Data Protection Regulation (GDPR) and the fourth year of the California Consumer Privacy Act (CCPA). So far, enforcement has been quiet in the U.S.; under the CCPA, the only publicly announced case has been the Sephora case.

I think we’re at the stage where the regulators have been giving companies a little time, a grace period to get their programs in place and do whatever is needed to enhance their privacy and security programs. 2024 and 2025 are going to be the years of enforcement. The regulators have given you time to get where you should be; now enforcement will accelerate.

The question for businesses is: how are you going to form or improve your strategy so that privacy and security are, at a minimum, not road blockers and, in the best scenario, accelerators? I think companies are doing a lot here. You will see a trend of companies using advanced technology to promote privacy.

Once Apple started requiring app developers to ask users’ permission to track them, you probably remember that Facebook openly complained, and its stock tanked because one stream of revenue was threatened: advertising. But guess what? After a year, Facebook had revamped its business model and found new ways to make money.

Another example is Google, which also relies on digital advertising. Google has already announced that it plans to stop supporting third-party cookies, the cookies that track people online. That tracking has been a longstanding complaint. People think, “I’m behind my screen, but somehow I feel there’s no privacy, because once I start to do something online, it seems like an invisible person is tracking me every minute, knows what I do, and pushes all this advertising at me.”

All of these complaints and concerns about privacy have given companies an incentive to use technology to promote privacy. Privacy-enhancing technology (PET) is being used more and more in the digital world. It anonymizes or tokenizes personal information, using different methods to reduce the chance that individuals can be identified from a data set. All of these new technologies are emerging in response to the restrictions and legal requirements around respecting consumers’ personal data and privacy.
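To make the tokenization idea concrete, here is a minimal, hypothetical sketch (an editorial illustration, not a technique CDK described): a keyed hash replaces a direct identifier, such as an email address, with a stable token, so records can still be linked and analyzed without exposing the raw value.

```swift
import Foundation
import CryptoKit

// Minimal sketch of pseudonymization by tokenization: replace a direct
// identifier with a keyed hash (HMAC-SHA256). The same input always maps to
// the same token, so joins and analytics still work, but without the secret
// key the original value cannot be recovered from the token.
let secretKey = SymmetricKey(size: .bits256)   // in practice, held in a key vault

func tokenize(_ identifier: String) -> String {
    let mac = HMAC<SHA256>.authenticationCode(
        for: Data(identifier.utf8),
        using: secretKey
    )
    // Render the MAC bytes as a hex string to use as the token.
    return mac.map { String(format: "%02x", $0) }.joined()
}

// Example: the email address never travels through the system in the clear.
let token = tokenize("jane.doe@example.com")
print("Pseudonymized customer ID: \(token)")
```

A keyed hash like this is pseudonymization rather than true anonymization; indirect identifiers elsewhere in the data set would still need to be addressed to prevent re-identification.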

The other thing I have to mention in 2024 is generative AI. You cannot do business without thinking about generative AI. Its rise has generated new legal issues around IP and data privacy: What input data are you using? What is the output data? How does your algorithm work, how is the sausage made, and who owns the output? Did you even have the right to use that data to train your model in the first place? All of these legal questions are waiting to be answered. ChatGPT was announced in November 2022, and since then we’ve already seen some 25 lawsuits over IP, data privacy, and other issues.

Generative AI is really powerful, and yes, it has created legal issues that we have to answer. On the other hand, maybe generative AI is also the answer to the problems it has created. I’m quite sure a lot of smart people are working on those issues right now.

Aihong Yu

Learn more about CDK Global

LinkedIn: https://www.linkedin.com/in/aihong-yu-2561474/

The views and opinions expressed in this program are those of the speakers and do not reflect the views or positions of the entities they represent.

Written by:

Furia Rubel Communications, Inc.