As I look back at calendar year 2019 and contemplate the legal work that came my way, it is impossible to overlook the impact that privacy laws have on organizations. Just as businesses began to exhale after a grueling race toward the GDPR compliance deadline, the California legislature enacted a law that required the focused attention of every company collecting personal information on California consumers.
Most Americans agree that data privacy is an issue that matters to them and over which they feel little control. A June 2019 Pew Research Center survey of randomly selected U.S. adults found that 62% of Americans believe “it is not possible to go through daily life without companies collecting data about them.” The survey also probed how Americans weigh the risks of personal data proliferation against its benefits. These findings underscore the attention that legislators and privacy advocates give to the protection and regulation of personal data.
Privacy laws, once enacted, send in-house legal staff, technologists, compliance teams, and data management personnel scurrying to develop policies and processes, write code, and train employees in an effort to ensure compliance by a looming effective date. Companies spend significant sums and dedicate large teams to address CCPA requirements, having just done the same in response to GDPR.
I, like some of you, am a lawyer who has spent the last 18 months advising companies about their CCPA compliance obligations. I participated in the design of a CCPA compliance program for a multi-billion-dollar data company, and I have spent a great deal of time counseling smaller companies with respect to their CCPA compliance efforts.
While researching a topic the other day, I happened upon a video in which Elon Musk introduces his new company, Neuralink, to an audience of potential recruits. Somehow, while I was busy offering sage counsel in the arcane space of the CCPA, Neuralink’s debut slipped under my radar. I watched the presentation and was struck. It became frighteningly clear that these privacy laws that legislators, lawyers, lobbyists, advocates, and businesses are so busy negotiating and around which they are so busy navigating may be one critical defense against humanity’s demise.
Neuralink is a company operating in the realm of “neurotechnology,” and its immediate goal is to repair broken brain circuitry with a device consisting of a chip and tiny threads implanted in a human brain by a robot, thereby creating a “brain/machine interface.” Neuralink is already testing the device in monkeys and forecasts that it will become part of a human study by the end of 2020, likely in a quadriplegic patient with a C1–C4 spinal injury.
Mr. Musk foresees that the procedure to implant the Neuralink device could ultimately be as simple as LASIK: no hospital stay, and only conscious sedation. A brain chip would be as accessible as hair transplants or breast augmentation. Ready availability of a device that could restore mobility and successfully treat conditions such as Parkinson’s, ALS, depression, and chronic pain would be extraordinary. Having lost my mother at too young an age to Multiple System Atrophy, a neurodegenerative disorder, I understand the suffering such diseases inflict on patients and their families.
Yet, in listening carefully to Mr. Musk’s presentation, I felt a chill creep up my still human spine. He spoke of a “symbiosis with AI.” He opined that Neuralink’s technologies will be “important at a civilization level scale” in that they will give humans the “option of merging with AI.” He hopes that Neuralink will help to mitigate the “existential threat of AI” by enabling a “well-aligned future” between humans and machines, which I interpreted as a scenario where humans and machines coexist peaceably rather than in a war of the worlds. Just how long is the evolutionary timeline between chip implantation at the corner doc-in-the-box and a cyborg nation? And what happens when this technological “tertiary super-intelligence layer,” controlled by an app from the Apple App Store, becomes available on a mass scale?
Will the privacy laws devised by humans be able to protect us once our every thought, emotion, and memory is recorded, and our moods decoded? Can we effectively defend these devices, and ourselves, from the effects of malignant custom code? How do we design and build an ethical framework with which to employ this technology? Can we, as mere humans, be prescient enough to craft legal protections that will preserve humanity? Happy New Year, and welcome to our brave new world.