Artificial Intelligence and Learned Intermediaries

by Reed Smith

In the July 7, 2017, “Artificial Intelligence” issue of Science, we were intrigued by a short piece in the “Insights” section on “Artificial Intelligence in Research” that discussed the future use of autonomous robots in surgery.  Surgeonless surgery would “allow[] work around the clock with higher productivity, accuracy, and efficiency as well as shorter hospital stays and faster recovery.” Science, at 28.  The listed drawbacks were:  “technical difficulties in the midst of a surgery,” the “loss of relevance of surgeons,” and “how to equip artificial intelligence with tools to handle . . . inherent moral responsibility.”  Id.

Fascinating.  In addition to driverless cars, do we also need to contemplate surgeonless surgery?  We’ve long been aware of the advent of robots as an adjunct to surgery.  Bexis filed a (largely unsuccessful) PLAC amicus brief in Taylor v. Intuitive Surgical, Inc., 389 P.3d 517 (Wash. 2017), but the surgical robot in Taylor in no way threatened to displace the surgeon, and the applicability (if not application) of the learned intermediary rule in Taylor was undisputed.  Id. at 526-28.

We checked the Internet, and sure enough there were plenty of articles from reputable sources:

“Completely automated robotic surgery: on the horizon?” (Reuters)

“Autonomous Robot Surgeon Bests Humans in World First” (Inst. of Electrical & Electronics Engineers)

“Would you let a robot perform your surgery by itself?” (CNN)

“The Future Of Robotic Surgery” (Forbes)

Science fiction?  Apparently not anymore.  As the last article stated:

Having totally automated procedures was once a thing of science fiction, very futuristic and not very practical. . . .  But over the last three or four years, technology has evolved and this has become a possibility.  I think potentially we’ll see some automated tasks in the medical field in the next five years.

All these articles are from 2016.

Since we’ll still be practicing law in five years, we thought we’d better start thinking about this.

First, will there be product liability litigation involving autonomous surgical robots at all?  Existing surgical robots appear to have been “cleared” by the FDA, Taylor, 389 P.3d at 520, so there hasn’t been much of a preemption barrier to bringing suit.  We’re not FDA regulatory specialists, but we have some doubt about how a fully autonomous surgical robot – described as something out of “science fiction” in the articles – could be marketed as “substantially equivalent” to existing devices.  If autonomous surgical robots, or the software that runs them, must go through FDA pre-market approval, then they would be protected by preemption, subject only to “parallel claims” that the manufacturer somehow violated relevant FDA regulations.  We are assuming, perhaps incorrectly, the continuity of the current preemption regime for medical devices.

Second, what happens to the learned intermediary rule where the product itself – an autonomous surgical robot – stands in the shoes of the traditional learned intermediary?  Plaintiffs would, of course, give the same answer as always:  Abolish the rule as outdated.  We disagree.  Any consideration of the jurisprudential reasons for the learned intermediary rule, discussed here, suggests just the opposite.  The rule exists because patients can’t be expected to understand for themselves the complexities of prescription medical products, so the law demands that the scientific and technological information necessary to make intelligent use of these products be provided to trained, professional “learned intermediaries,” who are then expected to counsel their patients about individualized treatment decisions.

Does this rationale apply to autonomous surgical robots?  Absolutely.  These products will be some of the most advanced and complex medical technology yet produced, and the law cannot expect their manufacturers simply to provide patients with the instructions for use, tell them to “have at it” and make up their own minds.  More than ever, patients will need medical professionals to explain the risks, benefits, and alternatives of automated surgery.  Who, then, becomes the learned intermediary when the traditional role of the surgeon is performed by a “product” in a potential legal action?  Looking to the purposes of the learned intermediary rule, our answer, at this point, is whichever physician whose legal duty it is to conduct the informed consent discussion with the patient.  The learned intermediary rule exists in large part to ensure that the doctor who will be advising the patient has adequate information to do so.  The professional standard that the medical community ultimately adopts to handle informed consent in automated surgery is its own business.  But however the medical community resolves that issue, the duty of the robot manufacturer should be the same as ever:  to provide information about the product adequate to enable the learned intermediary to evaluate that information, along with the patient’s medical history, in order to make proper treatment decisions and to explain these decisions to the patient.

Third, what will the advent of autonomous surgical robots do to the legal distinction between “services” and “product sales” that has traditionally protected health care providers – including hospitals – from strict liability?  We don’t know.  The answer probably depends on how the medical community integrates these robots into the health care system generally.  If robotic surgery is carried out under the close supervision of medical professionals, then probably not much will change in terms of the sales/services distinction.  That has been the case with currently available robot-assisted surgery.  See Moll v. Intuitive Surgical, Inc., 2014 WL 1389652, at *4 (E.D. La. April 1, 2014) (robot use did not remove surgical claim from scope of malpractice statute).

However, if cost consciousness leads to “routine” automated surgery being conducted with only technicians on hand to ensure that the robots are functioning properly, then the entire exercise starts to look more like the use of a product than the provision of medical services. Once again, it will be up to the medical community to develop its standards of care for the use of autonomous surgical robots.  If necessary, the law will adapt.

A number of sources of potential liability associated with automated surgery, such as failure to detect an unexpected cancer, or a non-robot-related intra-operative complication (like an adverse reaction to anesthesia), would appear to implicate medical malpractice theories of liability (e.g., “lost chance”) rather than product liability.  How will courts handle claims at the intersection of medical malpractice and product liability – that, however good the robotic software is at its intended surgical use, it does not allow the robot to react to the unexpected as a human surgeon can?

Fourth, in terms of product liability, what’s the “product?”  Here, we mean whether the software, including the MRIs, CAT scans and other patient imaging data, is considered something separate from the physical robot itself.  Is the software purchased, or provided, separately from the hardware that is the visible robot?  This distinction could make a big difference in available theories of liability.  It could also be important in determining component part liability in cases where the hardware and software manufacturers point fingers at one another.  In such cases, possible defendants include healthcare professionals, hospitals that maintain the robots, manufacturers of robotic hardware, and providers of software – both the software that runs the robot and patient-specific electronic scans.  As now, there is also the possibility that the patient may not follow proper instructions.  Will autonomous surgical robots be required to have aviation-style “black boxes” to provide post-accident information?

The prevailing view under current law has been that software is not a “product.”  “Courts have yet to extend products liability theories to bad software, computer viruses, or web sites with inadequate security or defective design.”  James A. Henderson, “Tort vs. Technology: Accommodating Disruptive Innovation,” 47 Ariz. St. L.J. 1145, 1165-66 n.135 (2015).  The current restatement defines a “product” as “tangible personal property.”  Restatement (Third) of Torts, Products Liability §19(a) (1998).  In a variety of contexts, software has not been considered “tangible.”  See 2005 UCC Revisions to §§2-105(1), 9-102; Uniform Computer Information Transactions Act §102(a)(33) (NCCUSL 2002); ClearCorrect Operating, LLC v. ITC, 810 F.3d 1283, 1290-94 (Fed. Cir. 2015); United States v. Aleynikov, 676 F.3d 71, 76-77 (2d Cir. 2012); Wilson v. Midway Games, Inc., 198 F. Supp.2d 167, 173 (D. Conn. 2002) (product liability case); Sanders v. Acclaim Entertainment, Inc., 188 F. Supp.2d 1264, 1278-79 (D. Colo. 2002) (product liability case).  However, a couple of cases have gone the other way.  Winter v. G.P. Putnam’s Sons, 938 F.2d 1033, 1036 (9th Cir. 1991) (dictum in case involving books); Corley v. Stryker Corp., 2014 WL 3375596 at *3-4 (Mag. W.D. La. May 27, 2014), adopted, 2014 WL 3125990 (W.D. La. July 3, 2014).  Also of possible note, a legally non-binding 2016 FDA draft guidance considers software to be a “medical device” subject to FDA regulation in situations that would probably include autonomous surgery.

The availability – or not – of strict liability could be a big deal in cases alleging injuries arising from fully automated surgery performed by autonomous surgical robots.  What caused the injury?  Was there a problem with the robot’s hardware (such as a blade or needle malfunction)?  Was the robot incorrectly maintained?  These issues would not implicate the robot’s software.  On the other hand, was there a defect in the surgical software’s algorithms (that is, a design defect)?  Was the software designed properly but somehow corrupted (that is, a manufacturing defect), or hacked (an intervening cause)?  Or, to introduce a different defendant, was there some sort of error in the electronic patient-imaging files that told the robot how to operate on this particular patient?

In strict liability, a “product” defect is the key element of liability (as is a “good” for warranty claims).  A product malfunction, in the absence of reasonable secondary causes, can in many jurisdictions establish a jury-submissible case.  In negligence, the plaintiff must also prove breach of duty, and an accident is not generally considered probative of such a breach.  Res ipsa loquitur – the negligence version of circumstantial proof of defect – is almost unheard-of in the context of medical treatment.  If there is a “product,” then strict liability is available.  If there isn’t a “product,” the plaintiff is obliged to prove negligence.  This distinction can be important, given how difficult proof of defect is likely to be.  Cf. Pohly v. Intuitive Surgical, Inc., 2017 WL 900760, at *2-3 (N.D. Cal. March 7, 2017) (rejecting theory that invisible “microcracks” caused burns during robot-assisted surgery).

These are the issues that jump out at us as we consider the possibility of autonomous surgical robots for the first time.  There are undoubtedly others.  The technological possibilities are amazing.  As defense lawyers, it is our job to ensure that these possibilities are realized, and are not put out of reach by excessive liability.
