Research Outline

Data Privacy and Security Research

Goals

To provide research that answers six questions reviewing data privacy and security in 2019, as well as looking ahead to 2020. These questions are:

  1. Why is data privacy and security becoming an even bigger issue now than ever before for companies using AI and robotics?
  2. What were the biggest data privacy and/or security regulations that came into effect and impacted US AI and robotics companies in 2019?
  3. How are tech companies in the US adjusting to the GDPR?
  4. Which companies are good examples, leading the charge in data security and protection in 2019, and are there any cautionary tales?
  5. What data privacy and security regulations (e.g., CCPA, GDPR) do US tech companies have to be aware of in 2020?
  6. What steps can be taken now to develop a plan to stay in compliance with new legislation and/or regulation?

Early Findings

  • According to technology attorney Stephen Wu, the "use of artificial intelligence, machine learning and robotics has enormous potential, but along with that promise come critical privacy and security challenges." "The HIPAA Security Rule doesn't talk about surgical robots and AI systems," he notes. "Nevertheless, HIPAA's administrative, physical and technical safeguard requirements still apply."
  • The development of facial recognition software that purports to predict a person's sexual orientation from facial features alone clearly shows why ethical review must go beyond protecting human subjects. Known as “gayface” software, "this experimental facial recognition tool was trained on publicly available photographs and claims to predict sexual orientation from facial characteristics alone. With no foreseeable beneficial use of this technology, it might not be ethical to develop an algorithm when it can only be used for harmful, discriminatory purposes."
  • A study from the University of California Berkeley states that advances in AI have made the Health Insurance Portability and Accountability Act of 1996 (HIPAA) obsolete.
  • Facebook, in 2017, rolled out a “suicide detection algorithm” to promote suicide prevention and awareness. The system uses artificial intelligence to collect data from posts and then makes predictions about users' mental states and their potential to commit suicide. This algorithm falls outside the jurisdiction of HIPAA, so even though this is a positive use case for AI in healthcare, the fact remains that Facebook is gathering and storing mental health data without consent.
  • “In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying health care data from another company and matching the two,” says Anil Aswani, lead engineer on UC Berkeley’s privacy study. (A minimal, hypothetical sketch of this kind of data matching is included after this list.)
  • AI has been called one of the great human rights challenges of the 21st century. It is not just about doing the right thing or making the best AI systems possible; it is about who wields power and how AI affects the balance of power in everything it touches.
  • According to a recent report from the Stanford Institute for Human-Centered AI (HAI), the growing proliferation of AI could lead to great imbalance in society.
  • In addition to this public search, we scanned our proprietary research database of over 1 million sources and were unable to find any specific research reports that address the stated goals.
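
The data-matching scenario Aswani describes can be illustrated with a minimal, hypothetical sketch. The code below is not drawn from any of the sources above; it simply shows how step-count data from a phone app could be linked to separately purchased health records using only quasi-identifiers such as zip code, birth year, and gender. All column names and records are invented for illustration.

```python
import pandas as pd

# Hypothetical step-count data collected by a smartphone app.
app_steps = pd.DataFrame({
    "zip": ["94704", "94110"],
    "birth_year": [1985, 1992],
    "gender": ["F", "M"],
    "avg_daily_steps": [3200, 11000],
})

# Hypothetical health records purchased from a third party.
purchased_health_records = pd.DataFrame({
    "zip": ["94704", "94110"],
    "birth_year": [1985, 1992],
    "gender": ["F", "M"],
    "diagnosis": ["type 2 diabetes", "none"],
})

# Joining on quasi-identifiers alone links activity data to diagnoses,
# re-identifying individuals even though no name or ID is ever shared.
linked = app_steps.merge(purchased_health_records,
                         on=["zip", "birth_year", "gender"])
print(linked)
```

The join succeeds in this toy example because each quasi-identifier combination is unique, which illustrates why combining datasets can undermine de-identification even when obvious identifiers have been removed.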

Summary Of Our Early Findings Relevant To The Goals

  • Our first hour was spent confirming that publicly available data exists to answer all six questions posed, and we did confirm that there is a wide array of opinion from credible experts. We also assumed a United States focus while scanning. If a broader approach is desired, for example a global focus, this should be clearly communicated to us in any reply.
  • Our first hour of research also returned preliminary data and information on several of the questions, as reflected in the Early Findings above.
  • While the request asked for the deliverable to be written like a mini blog post, that format is not possible for a Wonder deliverable; we present all research in bullet points.
  • Please select one or more of the options provided in the proposed scoping section below.