On July 1, James Kurose, the National Science Foundation's Assistant Director for Computer and Information Science and Engineering, and Keith Marzullo, the Director of the National Coordination Office for Networking and Information Technology Research and Development, announced the release of the National Privacy Research Strategy (the “Privacy Strategy”), a policy document establishing priorities and objectives for privacy research and development.  The chief goal of the Privacy Strategy is to “produce knowledge and technology that will enable individuals, commercial entities, and the government to benefit from transformative technological advancements, enhance opportunities for innovation, and provide meaningful protections for personal information and individual privacy.”

In pursuit of these objectives, the Privacy Strategy identifies seven priorities for privacy-related research:  

  • Foster a multidisciplinary approach to privacy research and solutions.  The Privacy Strategy recognizes that research efforts across various disciplines are required to identify and understand privacy risks and establish sophisticated privacy protection systems.  Such disciplines include computer science, social and behavioral sciences, biomedical science, psychology, economics, law, and ethics.  The Privacy Strategy points out that “[m]ultidisciplinary approaches are needed to understand how the adoption of privacy protections is advanced or impeded by policy and regulatory factors, organizational and business aspects, market competition, and economic and social incentives or disincentives.”
  • Understand and measure privacy desires and impacts.  The Privacy Strategy notes that privacy values and desires vary by context and individual.  The paper states that “[s]ystem designers and developers need to better understand what people value regarding privacy, what are people’s privacy desires and expectations, and in what ways privacy might be infringed upon, in order to develop systems that are more respectful of peoples’ privacy choices.”
  • Develop system design methods to incorporate privacy desires, requirements, and controls.  The Privacy Strategy emphasizes that privacy concerns must be accounted for in the design of systems and technologies, noting that “[d]esigning for privacy must connect individuals’ privacy desires with system requirements and controls in a way that effectively bridges the aspirations with development.”  A key research question here is what “metrics and measurements can measure both privacy and system utility, to understand the tradeoffs between the two, and to support the development of systems that can maximize both.”
  • Increase transparency of data collection, sharing, use, and retention.  The Privacy Strategy states that in today’s vast information ecosystem, the degree of transparency involved in collecting individuals’ data can vary by place and technology.  The Privacy Strategy recommends research into enhanced transparency with respect to data collection and use, which “would enable individuals to better evaluate the privacy implications and potential benefits of their activities and would permit data collectors/users to develop data practices that respect and protect individuals’ privacy desires.”
  • Assure that information flows and use are consistent with privacy rules.  The Privacy Strategy notes that technologies must ensure that personal data is properly linked to the rules for managing that data (e.g., by tagging and processing data to preserve a user’s data preferences).  Per the Privacy Strategy, “[i]mproved technology for managing data use would make it possible for data-processing and storage organizations to determine, rapidly and reliably, if their handling of private information meets legal, regulatory, and ethical standards.”
  • Develop approaches for remediation and recovery.  The Privacy Strategy recommends that privacy research focus on measuring the usefulness of redress mechanisms (e.g., credit freezes) when privacy events occur and evaluating the consequences when remedial steps are not taken.  The paper states that “[r]emediation techniques might also provide the capabilities to correct or delete erroneous data about individuals, exclude improperly used data, and effect a change in the processing systems that caused the privacy event.”
  • Reduce privacy risks of analytical algorithms.  The Privacy Strategy highlights the increasing use of analytical algorithms to classify and assess information.  The Privacy Strategy identifies certain privacy issues in this space, such as “when the information used by algorithms is inappropriate or inaccurate, when incorrect decisions occur, when there is no reasonable means of redress, when an individual’s autonomy is directly related to algorithmic scoring, or when the use of predictive algorithms chills desirable behavior or encourages other privacy harms.”  Accordingly, the paper recommends additional research on the use of these algorithms.

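To make the fifth priority concrete, the “tagging” approach the Privacy Strategy alludes to can be sketched in a few lines of Python.  This is only an illustration of the general idea of binding usage rules to data at collection time and checking them at use time; the names (`TaggedRecord`, `allowed_uses`, `use_record`) are illustrative and do not come from the Privacy Strategy or any particular system.

```python
from dataclasses import dataclass, field

@dataclass
class TaggedRecord:
    """A piece of personal data carrying the usage rules attached when it was collected."""
    value: str
    allowed_uses: set = field(default_factory=set)  # e.g. {"billing", "research"}

def use_record(record: TaggedRecord, purpose: str) -> str:
    """Release the value only if the requested purpose matches the record's tags."""
    if purpose not in record.allowed_uses:
        raise PermissionError(f"use for '{purpose}' not permitted by the record's tags")
    return record.value

# A record whose subject consented only to billing use:
email = TaggedRecord("alice@example.com", allowed_uses={"billing"})
```

Under this sketch, an organization could answer the Privacy Strategy’s question (“does our handling of private information meet the applicable rules?”) mechanically: any use of `email` for a purpose outside its tags raises an error rather than silently proceeding.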
The Privacy Strategy concludes by noting that it is meant to give guidance to the Executive Branch, policymakers, and others on how to direct their research efforts, but it is up to each agency to implement these priorities in its own work.  In addition, at least one organization has criticized the Privacy Strategy as “flawed.”  The Electronic Privacy Information Center commented that the Privacy Strategy “focuses on measuring the ‘privacy desires’ of users rather than the extent of the problem or goals to safeguard privacy, such as coding Fair Information Practices, developing genuine Privacy Enhancing Techniques, or complying with Privacy Act obligations.”