Cloudpoint's blog

AI, teaching and data privacy

Written by Sami Marttinen | 27.1.2023 14:22:25

I confess.

My first reaction after trying ChatGPT was how to block it. Artificial intelligence makes it practically impossible to identify plagiarism, and as a history teacher, my thoughts immediately raced through a stack of essays, wondering for each one who, or what, had ultimately written it.

However, after a little thawing, I started to warm up. I began looking for examples from the business world of how different artificial intelligence algorithms have been put to use, even in small ways, and ah - the flood of examples set the pedagogue's mind on fire! Logos, homepages, marketing plans, stories, language revision... the possibilities seem almost limitless from the perspective of history and social studies alone.

Artificial intelligence and data protection

It is essential to gradually introduce artificial intelligence, and the services that utilize it, into teaching and learning. However, the use of AI services in education is ultimately governed by exactly the same conditions as any other use of digital technology in education: whether it produces pedagogical added value, whether it is of practical use, and whether it meets data protection requirements.

Take OpenAI, for example, the developer of the ChatGPT and DALL·E 2 AI models. Its privacy policy states the following: “By using our Service, you understand and acknowledge that your Personal Information will be transferred from your location to our facilities and servers in the United States.”

The transfer of data to the United States is not always a data protection threat in itself, as long as it takes place in accordance with the General Data Protection Regulation (GDPR). The GDPR requires specific measures to be taken when data is transferred to a so-called third country. At the time of writing, the privacy policy of the aforementioned service does not indicate that the transfer criteria are met.

At the end of the day, the education provider is the one who determines which services are allowed for use in education. If the service provider does not take the necessary additional measures in connection with transfers of personal data to a third country, using the service in education may pose challenges. It is clear to anyone with even a remote understanding of the GDPR what risks third-country transfers may pose to a data controller when additional measures are disregarded.

Image: Mari Helin/Unsplash Photos

 

What options does the teacher have?

So, what options does a teacher really have if they want to introduce learners to the possibilities offered by artificial intelligence and educate them about its ethical use, while at the same time ensuring that privacy and security requirements are met?

1. A teacher can use different artificial intelligence services in teaching with their own account, as long as the risks related to the use of the service have been identified thoroughly and the learners' personal data will not be endangered, even indirectly.

2. If necessary, tasks to be evaluated can be assigned in closed environments to minimize the interference of artificial intelligence algorithms. The most familiar example of this is a Forms test in Classroom, taken on managed Chromebooks. For all types of devices there is, for example, Trelson Assessment, which can be used to assign a wide variety of tasks and also make attachment files and individual websites available as materials for the test taker.

3. The provider of education can allow the use of certain services with learners only after conducting a data protection impact assessment* and after signing a data processing agreement with the supplier. The provider of education must, nevertheless, evaluate all digital services used in education, though the evaluation need not be as thorough when the service in question does not pose a significant threat to the rights and freedoms of the data subjects. However, a data processing agreement should always be concluded with third-party service providers to ensure the legitimacy of the processing activities. External consultants and services, such as Edudata.io, can also be used to support the provider of education in ensuring compliance with relevant data protection legislation.

4. A small hope rests on the EU and the United States reaching a data transfer agreement which would, among other things, ensure guarantees comparable to those recognized under EU law – as is well known, a large portion of service providers are based in the US, where privacy standards differ. The draft adequacy decision will next go through the approval procedure and is expected to enter into force in the summer of 2023. It is estimated that the draft decision will not fully resolve the concerns raised in Schrems II; in addition, the adequacy decision will only apply to “certified undertakings”, i.e. those undertakings that have committed to the agreed protective measures. The European Data Protection Board is preparing a statement on the draft decision, which can be expected in the coming weeks.

*Conducting an impact assessment is mandatory when the criteria of Article 35 of the General Data Protection Regulation are met.

Epilogue

It is expected that artificial intelligence will be widely used to support learning. For example, Practice sets, a feature soon to be added to Google Classroom, will in time develop into a tool that scales tasks to each student's skill level, leaving more of the teacher's resources for the students who need them.
One way or another, artificial intelligence will change our everyday life as well as teaching and learning. Microsoft has already announced that it will bring ChatGPT into its services, and it is clear that other international actors are pursuing the same direction.

There are many possibilities, but perhaps one of the challenges is justifying the need for learning at all - why learn something that can be completed by someone or something else? The fear of individual plagiarism is perhaps no longer the biggest problem if the learner, regardless of the subject, is able to produce information suitable for the situation without processing it much themselves, let alone understanding the structuring and layering of information.

For now, the data protection of different artificial intelligence services may pose challenges for their use in teaching. This is not something individual providers of education can do much about on their own. Still, they, as well as individual teachers, need to ensure that the digital services used in education meet the requirements of the EU's General Data Protection Regulation. Managing one's own personal data is also an important future skill.

The article was written by Cloudpoint's Head of Training Sami Marttinen, in close consultation with the company's lawyers specializing in data protection.

Main image: Daniel K. Cheung/Unsplash Photos