Being human in the time of neuroscience and artificial intelligence involves carefully exploring the nexuses of complexity where valid ideas are nevertheless in tension, manifesting subtleties and challenges that must not be overlooked. Each page expresses the tension(s) between ideas within a given theme that emerged in the collective discussions, complemented by insights from researchers of the NHNAI network.

Complexity on Health #6: Developing AI and Health technologies without undermining persons’ privacy and integrity

Participants largely acknowledge the benefits that can be gained from developing AI and health technologies in healthcare and medicine, as well as in the domain of human enhancement (improved medical decision-making, automation of certain tasks, better access to healthcare and health-related information, enhancement of physical and mental capacities, …).

At the same time, participants also worry about the risk that sensitive health information is collected for non-medical uses. Health data collected by AI or digital tools should only serve medical and healthcare purposes. Digital solutions should not allow intrusion by outside organizations (such as insurance companies).

Moreover, with the convergence of neuroscience and AI, data could be used to enhance the power to predict persons’ behaviors and thoughts, as well as the possibilities for cognitive manipulation. Mind privacy should therefore be protected.

The following ideas can be found in the global and local syntheses, downloadable here:

  • AI and health technologies can improve medicine and health care: (Global – Health) Acknowledging the positive contribution of health technologies to healthcare
  • Potential positive outcomes of enhancement technologies: (Global – Health) Exploring the potential contributions of health technologies to humans’ self-improvement
  • Importance of (mind) privacy protection: (Global – Health) Ensuring privacy protection (protection of sensitive health information and mind privacy)
Insights from the NHNAI academic network:

The possibility of using healthcare data to help finance costly healthcare innovation is a point of recurrent debate. This could prove to be an interesting avenue, provided that the protection of such data is convincing and that the data are used in anonymized form and with informed consent. However, a number of studies have documented cases where anonymization has failed, leading to a risk of re-identification (Ohm, 2010). It has also been highlighted that, under the effect of economic incentives, particularly vulnerable populations could be subject to various types of abuse (Vayena & Tasioulas, 2016).
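To make the re-identification risk concrete, the following minimal sketch (with entirely hypothetical data, and assuming the pandas library is available) illustrates the kind of linkage attack discussed by Ohm (2010): records stripped of names can still be tied back to individuals by joining them with a public register on quasi-identifiers such as postal code, birth date, and sex.

```python
# Illustrative sketch only: hypothetical data showing how "anonymized" health
# records can be re-identified through quasi-identifiers (zip, birth date, sex).
import pandas as pd

# Health records with names removed, but quasi-identifiers kept.
health_records = pd.DataFrame({
    "zip": ["69001", "69001", "75011"],
    "birth_date": ["1980-03-14", "1992-07-02", "1980-03-14"],
    "sex": ["F", "M", "F"],
    "diagnosis": ["diabetes", "asthma", "depression"],
})

# A public register (e.g. an electoral roll) containing the same
# quasi-identifiers alongside names.
public_register = pd.DataFrame({
    "name": ["A. Martin", "B. Dupont", "C. Leroy"],
    "zip": ["69001", "69001", "75011"],
    "birth_date": ["1980-03-14", "1992-07-02", "1980-03-14"],
    "sex": ["F", "M", "F"],
})

# Joining on the quasi-identifiers re-attaches names to sensitive diagnoses.
re_identified = health_records.merge(public_register, on=["zip", "birth_date", "sex"])
print(re_identified[["name", "diagnosis"]])
```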

It is generally recognized as very important that external actors such as insurance companies should not be able to access health data. Public trust could be seriously undermined by the use of health data by private organizations for commercial or discriminatory purposes. The restriction of health data to explicitly medical purposes is meant to be guaranteed by the GDPR, which imposes clear restrictions on access to and use of personal data for this purpose (Floridi & Taddeo, 2016).

The convergence of AI and neurotechnologies opens the door to the prediction or manipulation of cognitive behavior, and thus poses new threats to cognitive privacy and mental freedom. Several authors therefore insist on the importance of protecting the “privacy of the mind”, notably through regulation (Ienca & Andorno, 2017).

Faced with all these challenges, tools such as blockchain are sometimes mentioned as a possible way of enabling individuals to control access to their health data, as well as its possible availability for innovation purposes, provided that voluntary and rigorous regulation is developed in parallel.
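As a purely conceptual illustration of that idea (not a description of any existing platform, and with hypothetical names throughout), the sketch below models a minimal hash-chained consent ledger in which each grant or revocation of access is appended as a tamper-evident record; a real deployment would additionally require identity management, distribution, governance, and regulatory oversight.

```python
# Conceptual sketch only: a minimal hash-chained ("blockchain-like") consent
# ledger in which a patient records who may access their health data, and
# when that consent is revoked.
import hashlib
import json
import time


def _hash(record: dict) -> str:
    # Deterministic hash of a record, used to chain entries together.
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()


class ConsentLedger:
    def __init__(self):
        self.entries = []

    def append(self, patient: str, grantee: str, action: str, purpose: str):
        # Each entry embeds the hash of the previous one, making tampering evident.
        prev_hash = self.entries[-1]["hash"] if self.entries else "genesis"
        record = {
            "timestamp": time.time(),
            "patient": patient,
            "grantee": grantee,
            "action": action,      # "grant" or "revoke"
            "purpose": purpose,
            "prev_hash": prev_hash,
        }
        record["hash"] = _hash(record)
        self.entries.append(record)

    def is_access_allowed(self, patient: str, grantee: str) -> bool:
        # The most recent grant/revoke entry for this pair decides access.
        allowed = False
        for entry in self.entries:
            if entry["patient"] == patient and entry["grantee"] == grantee:
                allowed = entry["action"] == "grant"
        return allowed


ledger = ConsentLedger()
ledger.append("patient-42", "research-lab-A", "grant", "diabetes study")
ledger.append("patient-42", "insurer-X", "revoke", "any")
print(ledger.is_access_allowed("patient-42", "research-lab-A"))  # True
print(ledger.is_access_allowed("patient-42", "insurer-X"))       # False
```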

Academic References:

  • Floridi, L., & Taddeo, M. (2016). What is data ethics? Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 374(2083), 20160360.
  • Ienca, M., & Andorno, R. (2017). Towards new human rights in the age of neuroscience and neurotechnology. Life Sciences, Society and Policy, 13(1), 5.
  • Ohm, P. (2010). Broken promises of privacy: Responding to the surprising failure of anonymization. UCLA Law Review, 57(6), 1701-1777.
  • Vayena, E., & Tasioulas, J. (2016). The ethics of personalized medicine: New challenges and opportunities. Journal of Medical Ethics, 42(8), 451-454.

Because medical data is so sensitive, protecting the privacy of the human beings present in these data will be of paramount importance. It is not just “data”; it is data about people who have dignity and are worthy of respect.