Data protection supervisory authority on the use of artificial intelligence (AI)

CNIL publishes guidance for AI developers and operators

The European Union (EU) is currently preparing an AI Regulation and an AI Liability Directive. However, there is already a large number of legal requirements that must be observed when developing or using AI, not least because of the processing of training data. While the requirements of the Data Act will be decisive for non-personal data in the future, the General Data Protection Regulation (GDPR) must be observed for personal data. Against this background, the French data protection supervisory authority CNIL recently published a handout on the privacy-compliant use of AI, the key contents of which we present and evaluate below.

Determination of the purpose of deployment

According to the CNIL, it is essential to define a clear purpose when developing and using AI systems. Even before any personal data is processed, it should be clearly specified exactly what the purpose of the processing will be. Only on the basis of this purpose is it possible to assess whether, for example, the principle of data minimisation is being observed. Determining the legal basis for processing likewise requires a defined purpose.

Deployment and development

In its guidance, the CNIL distinguishes two phases: the development phase, in which an AI system is trained with data, and the deployment phase, in which the algorithm is used. The two phases should be treated separately under data protection law, even though they are often interwoven in practice, since AI systems also use data processed during deployment to further improve the algorithm. In the CNIL's view, the development phase should overall be subject to lower data protection requirements than the deployment phase. During deployment, an AI system should only process data that has proven effective in achieving the pre-defined purpose during the development phase.

The data sets

Developers can either generate the data needed to develop or train the AI themselves, or they can put existing data to a new use. In doing so, they must ensure that the compilation of the data sets complies with the law. Merging data from different sources is therefore a particular challenge. According to the CNIL, an AI system developed by means of unlawful processing of personal data may not be used.

Rights of data subjects

When personal data is processed by means of AI systems, the controller must in principle safeguard the rights of data subjects. For companies it is encouraging that the CNIL sees exemptions from the information requirements of the GDPR for AI systems. For example, informing data subjects may be waived if the data was not collected directly from them and the controller can prove that informing them is impossible or would require disproportionate effort.

Summary

The handout is a practical aid for data controllers. At the same time, the CNIL's extensive engagement with AI shows that data protection requirements play a key role in the development and deployment of AI and are a focus of data protection supervisory authorities. Regardless of whether supervisory authorities are additionally entrusted with monitoring AI systems under the planned European AI Regulation, developers and operators of AI systems should therefore take the applicable data protection requirements into account and ensure compliance with them through a compliance management system.
