Data protection and the European AI Regulation: new legal challenges for smart medical devices?!

In our consulting practice, we frequently encounter manufacturers or products which do not adhere to the requirements of the General Data Protection Regulation (GDPR). In such cases, we are often told that compliance with the GDPR is not mandatory for manufacturers and products, and that only those who ultimately perform the data processing are responsible for complying with its requirements, since it is they who determine the purposes and means of the processing. But this assumption is often overly simplistic, as we will explain below. In addition, the EU Commission proposed a Regulation on Artificial Intelligence (AI) in April 2021 which is expected to create special protections for citizens’ rights and safety from the use of AI, and which may involve additional requirements for manufacturers of smart medical devices.

Data protection requirements for manufacturers

The truth is that manufacturers are not bound by the GDPR’s requirements per se. They are addressed only in Recital 78 to the Regulation, which states that manufacturers should be “encouraged to take into account the right to data protection when developing and designing such products, services and applications and, with due regard to the state of the art, to make sure that controllers and processors are able to fulfil their data protection obligations.” That the GDPR’s requirements do not apply to manufacturers directly is due to the scope of the GDPR, which extends only to cases involving the actual processing of personal data. This processing is carried out not by the manufacturer but rather by the operator of the AI application, i.e. the controller, which ultimately determines the purposes and means of the processing activities. Nevertheless, manufacturers should adhere to data protection principles when designing their products; otherwise, future operators will be unable to use the product without violating data protection law. It therefore follows that the GDPR does apply to manufacturers, if only indirectly.

Processing of health data

The use of smart medical devices or even digital health applications inevitably involves the processing of large quantities of sensitive health data. A few things need to be kept in mind in this regard: “Personal data concerning health should include all data pertaining to the health status of a data subject which reveal information relating to the past, current or future physical or mental health status of the data subject” (Recital 35 to the GDPR). The law affords special protection to data subjects, i.e. the individuals whose data is being processed. “Technical and organizational measures to protect the integrity and confidentiality of health data are not only required by law but are also necessary to prevent abuse of the data and to counteract errors in processing,” according to the Commissioner for Data Protection of the Federal State of Bremen (3rd Annual Report for 2020, p. 60). Accordingly, particular caution is required when processing personal data using an AI application: since the processing actions are not subject to human supervision, they could lead to outcomes with adverse effects for data subjects.

Data transfers to the US

Data transfers to third countries present a unique challenge, particularly for manufacturers of digital health applications. In accordance with § 4(3) of the Digital Health Applications Ordinance (DiGAV), personal data may only be transferred to countries outside the European Economic Area (EEA) where there is an adequacy decision from the EU Commission. But no such decision has existed for the US since the ECJ’s “Schrems II” ruling, and transfers to the US have become legally problematic as a result. Even the question of whether digital health apps can still be offered in the Google and Apple app stores has yet to be clarified. Unfortunately, the guidelines and information (only in German) from Germany’s Federal Institute for Drugs and Medical Devices (BfArM) fail to provide sufficient clarity as to the legal situation with regard to data transfers. Our close analysis of these publications can be found here. We would generally advise manufacturers of digital health applications to closely examine their data flows from both a factual and a legal viewpoint and to be aware of the considerable legal risks to which they could be exposed when using non-European providers or their subsidiaries.

New requirements from the coming AI Regulation?

In April 2021, the EU Commission presented the world’s first regulatory framework for AI, which aims to provide special protections for citizens’ rights and safety from the use of AI systems. The Proposal follows a risk-based approach, under which the use of AI systems is subject to varying requirements depending on the risk to protected interests, such as the potential impact on humans and the associated risk to vital legal interests. For example, AI systems which are considered to present a “clear threat to the safety, livelihoods and rights of people” (unacceptable risk) are prohibited. At the next level are “high-risk” AI systems, which include, for example, AI technology used as a safety component in products; the Commission cites an AI application for robot-assisted surgery as an example of this category. Accordingly, manufacturers of smart medical devices will need to closely examine the requirements and precisely determine the risk associated with their AI application. Regardless of the risk in any individual case, the transparency requirement will likely become an issue for all manufacturers of AI applications; transparency is a core principle of the GDPR as well. Of course, the rules for AI systems also overlap with the GDPR and the rules for the protection of health data wherever personal (health) data is processed using AI applications, which could present an elevated risk given the particular significance of health data in data protection law.

More detailed information about the AI Regulation, as well as a discussion of the additional requirements that will likely apply to manufacturers, can be found here in an article by our colleagues Philipp Reusch and Niklas Weidner.

Our recommendations for future action

We advise manufacturers to look into the legal requirements for their companies and their products as soon as possible. This examination should take into account indirect requirements, particularly those arising from the GDPR. It remains to be seen how strict the supervisory authorities will be in case of violations, but it is clear that data protection requirements will take on enormous importance given the enforcement campaign by the supervisory authorities, as well as the sensitive nature of data relating to health. The French data protection authority CNIL, for example, has announced that this will be a focus of its work in 2021 and has already imposed its first fines. We would advise companies to implement a compliance management system covering the legal requirements relating to data protection and medical devices, in order to ensure that they are prepared for investigations by the supervisory authorities and can avoid the negative consequences of violations, such as fines or prohibition orders.
