Has Datatilsynet become an echo chamber, or are they afraid to take a stance in the world of AI and data? The story of Norway's regulatory body that does not regulate.
About Datatilsynet: Datatilsynet conducts supervision, case management, guidance and communication work at the intersection of law and technology.
In the wake of Datatilsynet’s recent report on GDPR compliance and Copilot, a sense of disillusionment has emerged among those who anticipated practical guidance rather than a list of seemingly disconnected points.
The report, which refrains from a definitive stance on Copilot, instead opts for a nuanced exploration of the system, leaving users to navigate the complexities of GDPR compliance largely on their own.
This approach, while perhaps well-intentioned, skirts the pressing issues at hand, offering little in the way of actionable insights for those seeking to reconcile the innovative potential of Copilot with the stringent demands of GDPR. The public’s expectation of Datatilsynet as a regulatory body is not merely to map the terrain but to lead the way through it. The nine points raised, while informative, fall short of this, leaving users to question the relevance of the guidance provided in the context of real-world application.
It also refers to the NTNU report, which we have responded to here: The Copilot Chronicles – A response to NTNU's report – Agder in the cloud
Let's look at the nine points listed in the report – what Datatilsynet says is marked with Datatilsynet, and what I believe could be done about it is marked with Response.
1. M365 Copilot assumes that business data is already in the Microsoft cloud solution
Datatilsynet: M365 Copilot sits on top of Microsoft’s M365 cloud solution. Before implementing M365 Copilot, it is a prerequisite that you have made all necessary security and privacy assessments related to the M365 platform itself. You must also have the necessary resources and expertise to manage service providers and the cloud solution in a responsible manner over time, especially due to frequent changes from the supplier side. The responsibility for the data used in the copilot lies with the companies that use the tool.
Response: Microsoft does recommend that you use all the security features, covering identity, devices, and information. But where are the recommendations from Datatilsynet? Besides, this goes for all software you use: everything has prerequisites, and those usually boil down to using the system as intended and recommended.
Additionally, has Datatilsynet actually looked at organizations in Norway, or the rest of the world? Only a very small percentage have everything under control… Does Datatilsynet know all the prerequisites from Microsoft?
I have a mantra, and it is the basis of information security (a small sketch of what the first step can look like in practice follows the list):
- Know your data
- Classify your data
- Manage your data
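To make the mantra concrete, here is a minimal, illustrative sketch of the "know your data" step: a naive scan for sensitive patterns in a folder of documents. The patterns and the folder path are my own placeholder assumptions; in a real M365 environment you would lean on Microsoft Purview's built-in sensitive info types rather than a homegrown regex.

```python
import re
from pathlib import Path

# Illustrative patterns only -- real classification should use Microsoft
# Purview's sensitive info types, not a homegrown regex like this.
PATTERNS = {
    "norwegian_national_id": re.compile(r"\b\d{11}\b"),  # fødselsnummer (naive match)
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def classify_file(path: Path) -> set[str]:
    """Return the set of sensitive-info labels detected in a text file."""
    text = path.read_text(encoding="utf-8", errors="ignore")
    return {name for name, pattern in PATTERNS.items() if pattern.search(text)}

if __name__ == "__main__":
    # "Know your data": walk a share and report what is found, so it can
    # then be classified (labelled) and managed (protected).
    for file in Path("./shared-documents").rglob("*.txt"):
        if labels := classify_file(file):
            print(f"{file}: {', '.join(sorted(labels))}")
```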
2. Keep order in your own house
Datatilsynet: The copilot will have access to all the same information that the user of the tool has. This means that challenges and weaknesses in the “digital foundation”, such as poor access management and control over personal data, will be visible and greatly reinforced by M365 Copilot. It is important to emphasize that Microsoft as a service provider also requires that “everything is in order” with the management of the underlying M365 platform, for the copilot to be used in a responsible and legal manner. Good order in one’s own house must therefore be in place first, and any introduction project will probably require thorough (re)assessments of one’s own information management. This requires effort and resources but is a critical and necessary step in the introduction of new technology.
Response: We (the security people of the world) have been preaching security in all areas of the platform for ages. This is also connected to the first point: Microsoft has always said that you need to lay the proper foundation that they recommend.
3. Identify and limit what the Copilot should be used for
Datatilsynet: Consider what tasks and associated processing of personal data M365 Copilot should and should not be used for. Some tasks are poorly suited for the use of generative AI, for example when it is important that answers are correct, and the user does not have the skills or time to control what is generated. Furthermore, the use of M365 Copilot in e.g. HR and personnel management will entail an extra high risk to privacy. This is because the handling of access to personal data is difficult to manage and control, or because the consequences for individuals can be very serious. Tasks involving special categories of (sensitive) personal data should also be carefully considered or avoided in connection with the use of M365 Copilot.
Map and describe the processing operations that occur if M365 Copilot is used for a specific purpose, i.e. from the time an instruction is given to the copilot until the response is released. The record of processing activities is an appropriate place to start, where you go through and assess each processing activity for each purpose. It provides a good starting point for assessing what tasks you want and can use the copilot for.
If M365 Copilot has access to information with (sensitive) personal data, the information must be classified, identified, and labelled, at least at document level. We emphasize that Microsoft recognizes that this is necessary for the responsible use of M365 Copilot.
Response: I do agree that organizations should have a set of use cases for Copilot, if you read "assess each processing activity" as meaning just that. That will make it easier for users to understand the purpose of Copilot and how to get value out of it.
As for the tasks you WANT and CAN use Copilot for… well, you can tell people to put their used coffee cup in the dishwasher, but will they?
They are given a tool, and the tool can be very good if they learn HOW to use it. So I would focus on making sure everything is set up to protect sensitive data, using Microsoft Purview for information protection and Entra ID for identity.
Make sure people have access only to what they should, and nothing more. Make sure sensitive information is handled the right way with sensitive info types, data loss prevention, and retention policies. We have several posts about this on the blog (and a small access-review sketch after the list):
- Automatiser labeling av sensitiv informasjon! – Agder in the cloud
- Hvordan finne kritiske data raskt og enkelt! – Agder in the cloud
- Get Copilot ready(-ish) – with retention policies – Agder in the cloud
- Get Copilot ready(-ish) – with labels – Agder in the cloud
- Get Copilot ready(-ish) – with Microsoft Purview – Agder in the cloud
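As a starting point for "access only to what they should", here is a hedged sketch of a periodic access review using Microsoft Graph: list who is in a group that grants access to sensitive material, so someone can confirm each member still belongs there. The tenant, client, and group IDs are placeholders, and it assumes an Entra ID app registration with the Group.Read.All application permission.

```python
import msal
import requests

TENANT_ID = "<tenant-id>"            # placeholder
CLIENT_ID = "<app-client-id>"        # placeholder
CLIENT_SECRET = "<app-secret>"       # placeholder
GROUP_ID = "<group-granting-access-to-sensitive-site>"  # placeholder

# Client-credentials flow against Entra ID.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

# Page through the group's members and print them for review:
# does every member still need this access?
url = f"https://graph.microsoft.com/v1.0/groups/{GROUP_ID}/members"
while url:
    page = requests.get(url, headers=headers).json()
    for member in page.get("value", []):
        print(member.get("displayName"), member.get("userPrincipalName"))
    url = page.get("@odata.nextLink")
```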
4. Consider the legal foundation
Datatilsynet: When tasks and associated processing of personal data are assessed as “M365 Copilot candidates”, the legal basis for the processing must be checked. For processing you already perform, you must consider whether the use of M365 Copilot leads to changes in the processing, such as which or whose personal data is being processed. If there are changes, you must consider whether the existing basis for processing can still be used, including whether the processing is still “necessary”. If the answer is no, M365 Copilot can’t be used for this.
Processing for new purposes requires that you identify an appropriate basis for processing. Where it is a question of reusing personal data for new purposes, which will often be the case, you must consider whether the new processing is compatible with the original purpose.
Response: Okay, this is a bit tricky.
Copilot has access to all the data you have access to; it can rewrite and produce “new data” from “old data”. It can therefore be considered to be processing sensitive or personal data. But when the data already exists and we are just turning it into something new, we should already have the legal basis and purpose for using the “old data”… with emphasis on should.
And if we do have sensitive data in M365 that falls under the GDPR, the legal basis for it needs to be in place already. Again, it comes back to keeping order in your own house: all the features for processing sensitive data should already be configured.
5. Assessing privacy implications
Datatilsynet: As a general rule, there will be a requirement to carry out a data protection impact assessment (DPIA) when using generative AI that processes personal data. This is because the law highlights “use of new technology” as a particularly important factor, and the understanding of risks associated with generative AI is still immature. A DPIA must be done per processing activity or set of similar processing activities. Tasks that do not in themselves require the processing of personal data may still end up processing it with M365 Copilot, because the copilot uses all the information the user has access to and can thus link it with personal data.
The DPIA process must identify technical and organizational measures that can reduce the risk to an acceptable level, and these must be in place before any use of M365 Copilot. Testing can be a measure to minimize risk. If the risk is still too high after measures have been tried, you should probably not use M365 Copilot for the processing in question. Alternatively, contact the Norwegian Data Protection Authority for a preliminary discussion.
Response: I must honestly admit that I hoped Datatilsynet would take responsibility and instead create frameworks/templates for DPIAs in connection with the use of both Copilot and M365. Asking all organizations to do this themselves, while at the same time asking them to create good processes for which tasks Copilot should be allowed to do, shows an inability to see what organizations are struggling with today and how Copilot actually works.
A lack of time, resources, and employees who understand both the legal side and the technology, combined with budgets deep in the red, means that organizations cannot prioritize these tasks.
How about a little help, dear Datatilsynet?
Provide basic templates for DPIAs; after all, most organizations in our country are very similar. We are not so unique that we can't use shared templates. For the most part, we handle the same types of data within M365. (A sketch of what a template skeleton could look like follows below.)
Ask the IT world, both IT professionals and the people who handle sensitive data, to discuss the possibilities and challenges.
So help us: take the responsibility you have been given, Datatilsynet, and guide us!
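To show how small such a shared starting point could be, here is a hypothetical DPIA record skeleton. The field names and example values are entirely my own assumptions, not an official structure from Datatilsynet or anyone else; the point is only that a common baseline is feasible.

```python
# A hypothetical skeleton for a reusable DPIA record -- illustrating the
# kind of shared template a regulator could publish. Field names and
# example values are assumptions, not an official structure.
from dataclasses import dataclass

@dataclass
class DPIARecord:
    processing_activity: str        # e.g. "Draft meeting summaries with M365 Copilot"
    purpose: str                    # why the personal data is processed
    legal_basis: str                # GDPR Art. 6 (and Art. 9 if sensitive data)
    data_categories: list[str]      # e.g. ["names", "health data"]
    data_subjects: list[str]        # e.g. ["employees"]
    risks: list[str]                # identified risks to the data subjects
    mitigations: list[str]          # technical/organizational measures
    residual_risk_acceptable: bool  # decision after mitigations
    notes: str = ""

record = DPIARecord(
    processing_activity="Draft meeting summaries with M365 Copilot",
    purpose="Internal documentation",
    legal_basis="GDPR Art. 6(1)(f) legitimate interest",
    data_categories=["names", "opinions"],
    data_subjects=["employees"],
    risks=["inaccurate summaries attributed to individuals"],
    mitigations=["human review before distribution", "sensitivity labels"],
    residual_risk_acceptable=True,
)
print(record)
```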
6. Will the use conflict with the E-mail Regulation?
Datatilsynet: M365 Copilot logs all interactions. The history is stored in the user’s personal area, and in NTNU’s case, is available to M365 administrators. Overall, we consider it likely that the interaction log could be affected by the prohibition on monitoring employees’ use of electronic equipment. However, we understand that the main purpose of the interaction log is to ensure that the quality of the service is as it should be. This purpose may fall under the exemption for managing the company’s computer network. Whether the second exception, to “uncover or clarify security breaches in the network”, may be relevant, must be assessed specifically in relation to the purpose of the interaction log.
Response: Well, Microsoft 365 and Azure already log everything, so how can that be okay, but not Copilot? M365 has a very detailed audit log, which can be accessed with the right permissions.
All logs must of course be configured, and some are only retained for a limited time, typically between 30 and 180 days by default.
In the digital age, data security and privacy are paramount, especially in the workplace. With M365 Copilot, interaction logs are stored securely in the user’s personal area and are accessible to M365 administrators, which is a standard practice within the realm of M365. The concern that such logs could infringe upon the prohibition of monitoring employees is valid. However, it’s essential to understand that the primary intent of these logs is not to monitor.
Administrators can view data within M365 through predefined security roles, which must be configured in a particular way, and access through these roles is itself logged.
And if we dive into the potential privacy concerns, they can be effectively managed through retention policies. These policies can be tailored so that logs are kept only as long as necessary and are disposed of securely, mitigating the risk of misuse; for example, a retention policy can delete interaction logs on a weekly schedule. In summary, while the interaction logs of M365 Copilot serve a critical function in maintaining service quality, the system's design inherently respects and upholds the privacy of its users. With the right policies in place, M365 Copilot can offer a secure, efficient, and compliant environment for all users. (A small log-retrieval sketch follows.)
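As an illustration of how auditable the platform already is, here is a hedged sketch that pulls recent events from the Entra ID directory audit log via Microsoft Graph. Note the scoping assumption: this is the directory audit, not the M365 unified audit log or the Copilot interaction history, which are separate stores with their own search tooling (Purview audit search). The token is acquired as in the earlier group sketch, and the AuditLog.Read.All application permission is assumed.

```python
import requests

def fetch_directory_audits(access_token: str, top: int = 25) -> list[dict]:
    """Return recent Entra ID directory audit events from Graph v1.0."""
    url = "https://graph.microsoft.com/v1.0/auditLogs/directoryAudits"
    headers = {"Authorization": f"Bearer {access_token}"}
    resp = requests.get(url, headers=headers, params={"$top": top})
    resp.raise_for_status()
    return resp.json()["value"]

# Print who did what, and when -- admin actions are traceable.
for event in fetch_directory_audits("<token-from-msal>"):
    actor = (event.get("initiatedBy", {}).get("user") or {}).get("userPrincipalName")
    print(event["activityDateTime"], event["activityDisplayName"], actor)
```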
7. The use of language models requires competence and awareness
Datatilsynet: Language models provide a new user experience for many people, with both possibilities and limitations that are unclear. It can be challenging to understand what information is included in the basis for formulations and what is not. It requires competence to formulate instructions or prompts that provide relevant and good answers. It is the company’s responsibility to ensure that users of the solution have sufficient knowledge, awareness and training in the use of M365 Copilot. This expertise not only ensures good quality in what is generated, but also that the solution is used in a way that safeguards privacy.
Response: YES. Just like any other type of software or new piece of technology.
This is nothing new; it's something we need to make a prerequisite for everything we handle. Honestly, the basic use of email deserves a larger focus when it comes to GDPR: we have used it for 25+ years, and people still misuse it and send all kinds of sensitive data.
And again, since Datatilsynet is the public authority on GDPR, they should guide us and give us the basic foundation for this.
8. Consider alternative solutions
Datatilsynet: M365 Copilot can be used for a lot of things. It is therefore extensive work to ensure that the system is used in a responsible and legal manner. Some of the copilot’s characteristics may challenge the purpose limitation principle and the data minimisation principle. Measures that in theory can reduce risk and consequences can in practice be very difficult to introduce. It is therefore important to consider whether other AI solutions with lower privacy risks can meet the specific needs. These can be solutions that transcribe audio recordings or specially adapted dialogue robots and support tools adapted to special purposes and limited to carefully selected and quality-assured internal sources of information.
Response: Well… always consider which tool to use for the task at hand. BUT, when it comes to the tasks Copilot can do within M365, there is no other tool as integrated into those systems as Copilot, and I may sound like a Microsoft über fan when I say that. However, considering the importance of GDPR compliance and the handling of sensitive information, it is prudent to keep the data within the same environment. Utilizing the features mentioned above will help ensure that the data remains secure, classified, and properly managed.
9. Implement in small and controlled steps
Datatilsynet: It is possible for Norwegian companies to use M365 Copilot, but not for everyone and not for everything. Our clear recommendation is that such solutions are introduced in a controlled manner, in small steps, with selected roles and for suitable processing activities in the company. Structured arrangements must also be made for follow-up control and follow-up of the quality of what the solution produces, both through organisational and technical measures.
Response: Yes, always run pilots, always test how this works for you, and use the time to make sure that – if you have done all the other security work – everything works as intended and that people don't get the wrong answers. This is a common recommendation for all systems and for everything you add to your organization. (A small sketch of a staged, group-based rollout follows.)
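One way to make "small and controlled steps" concrete is group-based licensing: put the pilot users in a single Entra ID group and assign the Copilot license to that group alone. This is a hedged sketch against Microsoft Graph; the group ID and SKU ID are placeholders (look up the real skuId with GET /subscribedSkus), the token is acquired as in the earlier sketch, and the Group.ReadWrite.All application permission is assumed.

```python
import requests

PILOT_GROUP_ID = "<pilot-group-id>"                   # placeholder
COPILOT_SKU_ID = "<copilot-skuId-from-subscribedSkus>"  # placeholder

def license_pilot_group(access_token: str) -> None:
    """Assign a license to a group; members inherit it automatically."""
    url = f"https://graph.microsoft.com/v1.0/groups/{PILOT_GROUP_ID}/assignLicense"
    body = {
        "addLicenses": [{"skuId": COPILOT_SKU_ID, "disabledPlans": []}],
        "removeLicenses": [],
    }
    headers = {"Authorization": f"Bearer {access_token}"}
    resp = requests.post(url, headers=headers, json=body)
    resp.raise_for_status()  # pilot group is now licensed; expand later in steps

license_pilot_group("<token-from-msal>")
```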
The webinar that kicked off the publication of the report showed that Copilot can be used to interpret people's state of mind, which raises concerns about its potential to give users a false impression or inaccurately suggest another person's feelings. This capability, while innovative, must be handled with caution to ensure ethical use and to prevent the misunderstandings or misinterpretations that could arise from such interactions.
It is essential that users of Copilot are aware of these limitations and that clear guidelines and safeguards are in place to mitigate any negative consequences.
While the report makes some good points about keeping order in your own house and the need for training and controlled implementation, I think it focuses on the wrong things.
They should shift the focus to providing guidelines on how to set up DPIAs and on how to handle and process sensitive data.
That would actually help organizations, and it could prevent GDPR issues if everyone followed the same baseline for data security.
This whole report feels like an echo of NTNU's report. It doesn't offer anything new, and that disappoints me.