In our consulting practice, we frequently encounter manufacturers or products which do not adhere to the requirements of the General Data Protection Regulation (GDPR). In such cases, we are often told that compliance with the GDPR is not mandatory for manufacturers and products, and that only those who ultimately perform the data processing are responsible for complying with its requirements, since it is they who determine the means and purposes of the processing. This assumption is overly simplistic in many cases, as we will explain below. In addition, the EU Commission proposed a Regulation on Artificial Intelligence (AI) in April 2021 which is expected to create special protections for citizens’ rights and safety from the use of AI, and which may involve additional requirements for manufacturers of smart medical devices.
Data protection requirements for manufacturers
The truth is that manufacturers are not bound by the GDPR’s requirements per se. They are addressed only in Recital 78 to the Regulation, which states that manufacturers should be “encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to the state of the art, to make sure that controllers and processors are able to fulfil their data protection obligations.” The reason the GDPR’s requirements do not apply directly to manufacturers lies in the Regulation’s scope, which extends only to the actual processing of personal data. That processing is carried out not by the manufacturer but by the operator of the AI application, i.e. the controller, which ultimately determines the means and purposes of the processing activities. Nevertheless, manufacturers should adhere to data protection principles when designing their products; otherwise, future operators will be unable to use the product without violating data protection law. It therefore follows that the GDPR does apply to manufacturers, if only indirectly.
Processing of health data
The use of smart medical devices or digital health applications inevitably involves the processing of large quantities of sensitive health data. Several points need to be kept in mind in this regard: “Personal data concerning health should include all data pertaining to the health status of a data subject which reveal information relating to the past, current or future physical or mental health status of the data subject.” (Recital 35 to the GDPR). The law affords special protection to data subjects, i.e. the individuals whose data is being processed. “Technical and organizational measures to protect the integrity and confidentiality of health data are not only required by law but are also necessary to prevent abuse of the data and to counteract errors in processing,” according to the Commissioner for Data Protection of the Federal State of Bremen (3rd Annual Report for 2020, p. 60). Accordingly, particular caution is required when processing personal data using an AI application: because the processing actions are not subject to human supervision, they could produce outcomes with adverse effects for data subjects.
Data transfers to the US
Data transfers to third countries present a unique challenge, particularly for manufacturers of digital health applications. In accordance with § 4(3) of the Digital Health Applications Ordinance, personal data may only be transferred to countries outside the European Economic Area (EEA) in cases where there is an adequacy decision from the EU Commission. But no such decision has existed for the US since the ECJ’s “Schrems II” ruling, and transfers to the US have been legally problematic ever since. Even the question of whether digital health apps can still be offered in the Google and Apple app stores has yet to be clarified. Unfortunately, the guidelines and information (only in German) from Germany’s Federal Institute for Drugs and Medical Devices (BfArM) fail to provide sufficient clarity as to the legal situation with regard to data transfers. Our close analysis of these publications can be found here. We would generally advise manufacturers of digital health applications to closely examine their data flows from both a factual and a legal viewpoint and to be aware of the considerable legal risks to which they could be exposed when using non-European providers or their subsidiaries.
New requirements from the coming AI Regulation?
In April 2021, the EU Commission presented the world’s first regulatory framework for AI, which aims to provide special protections for citizens’ rights and safety from the use of AI systems. The Proposal follows a risk-based approach, under which the use of AI systems is subject to various requirements depending on the risk to protected interests, such as the potential impact on humans and the associated risk to vital legal interests. For example, AI systems which are considered to present a “clear threat to the safety, livelihoods and rights of people” (unacceptable risk) are prohibited. At the next level are “high-risk” AI systems, which include AI technology used as a safety component in products; the Commission cites an AI application for robot-assisted surgery as an example of this category. Accordingly, manufacturers of smart medical devices will need to closely examine the requirements and precisely determine the risk associated with their AI application. Regardless of the risk in any individual case, the transparency requirement will likely become an issue for all manufacturers of AI applications; transparency is a core principle of the GDPR as well. The regulations for AI systems also overlap with the GDPR and the rules for the protection of health data wherever personal (health) data is processed using AI applications, which could present an elevated risk given the particular significance of health data in data protection law.
More detailed information about the AI Regulation, as well as a discussion as to which additional requirements will likely apply for manufacturers, can be found here in an article by our colleagues Philipp Reusch and Niklas Weidner.
Our recommendations for future action
We advise manufacturers to look into the legal requirements for their companies and their products as soon as possible. This examination should take into account indirect requirements, particularly those arising from the GDPR. It remains to be seen how strict the supervisory authorities will be in case of violations, but it is clear that data protection requirements will take on enormous importance given the intensive enforcement campaign by the supervisory authorities, as well as the sensitive nature of data relating to health. The French data protection authority CNIL, for example, has announced that this will be a focus of its work in 2021 and has already imposed its first fines. We would advise companies to implement a compliance management system for the legal requirements relating to data protection and medical devices, in order to ensure that they are prepared for investigations by the supervisory authorities and can avoid the negative consequences of violations, such as fines or prohibition orders.