In response to NTNU’s IT director Håkon Alstad’s statement in Kode24 that NTNU is unlikely to turn Copilot on, we dived into the concerns raised.
Our take is that there is nothing new to worry about. The worry is already there, but Copilot might give the impression that it’s a new worry.
We have said it over and over again: Copilot is just that, a copilot. You are still the one in charge, and therefore in charge of the data you make available to Copilot.
So, dear reader, read on as we address each finding to show you that Copilot is not the issue.
In the wake of technological advancements, the Norwegian University of Science and Technology (NTNU) has been at the forefront of integrating AI tools into its educational and administrative framework. However, the introduction of Copilot, a sophisticated chatbot and language model, has raised concerns over potential surveillance and privacy issues among the staff. In this post, we look at the key findings from the report and from the article in Kode24.
According to Kode24, these are the key points of the report:
- Pilot Project: NTNU IT tested Copilot for Microsoft 365 as part of a Data Protection Authority sandbox project.
- Report Findings: The report highlights 8 findings:
  - Copilot is absolutely brilliant when you already know what you want it to help you with
  - Copilot can affect the exercise of public authority
  - Copilot processes huge amounts of personal data in new and uncontrolled ways
  - Microsoft 365 is challenging to manage
  - Copilot is early in the development process
  - Copilot affects the organization
  - Copilot can be used to monitor and measure performance and behavior
  - Copilot works really well at times
- Usage Unlikely: IT director Håkon Alstad considers it unlikely that Copilot will be adopted at NTNU, except for some test users.
- Challenges and Solutions: The report discusses challenges in managing Microsoft 365 with Copilot and suggests controlled testing before full implementation.
The report also notes positive experiences with Copilot’s ability to quickly synthesize information from large files. However, it emphasizes the importance of monitoring work quality to prevent errors.
We’ll just skip right to finding number 2, because finding number 1 sort of gives itself away.
Finding number 2
In finding number 2, “Copilot can affect the exercise of public authority”, the report gives a couple of recommendations:
- Take an active position on which data Copilot should have access to (this applies to all tools, including Microsoft 365).
- Work systematically to ensure a good understanding of the regulations and administrative competence at the institution.
- Prioritise having “your own house in order”.
- Make sure that the human control works.
This is key – not only for the exercise of public authority, but for the overall work with data and how it’s used. And that is how we can address the concerns within the findings.
Know your data! Understand your data! Secure the data that needs securing!
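As a toy illustration of what “know your data” can mean in practice, here is a minimal Python sketch that sweeps a folder for strings that look like personal data. The patterns and names are deliberately simplistic and our own invention – real discovery tooling, such as Microsoft Purview, is far more thorough:

```python
import re
from pathlib import Path

# Deliberately simplistic patterns, for illustration only.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    # 11-digit Norwegian fødselsnummer, without checksum validation.
    "norwegian_national_id": re.compile(r"\b\d{11}\b"),
}

def scan_text(text: str) -> dict:
    """Count potential personal-data matches per pattern."""
    return {name: len(p.findall(text)) for name, p in PATTERNS.items()}

def scan_tree(root: str) -> dict:
    """Walk a folder and report files that may contain personal data."""
    report = {}
    for path in Path(root).rglob("*.txt"):
        hits = scan_text(path.read_text(errors="ignore"))
        if any(hits.values()):
            report[str(path)] = hits
    return report
```

An inventory like this is only a starting point, but the principle stands: know where the personal data lives before you let any tool index it.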
Finding number 3
The finding says that Copilot processes huge amounts of personal data in new and uncontrolled ways. The recommendations in the report say that you need to do a thorough Data Protection Impact Assessment (DPIA) and revise it more often than you had planned, and that you should have a high threshold for accepting residual risk when it comes to GDPR and privacy.
If we interpret this statement, we assume that organizations in general process huge amounts of personal data and fail to protect and control it appropriately and in accordance with regulations.
That is something you should already have in place, and if not, you need to do it yesterday. That is not Copilot’s fault; the personal data you have in M365 should already have been taken care of.
If an organization’s risk register is so poor that they have yet to control the data they collect and hold, we have clearly identified the problem. Further, if the data governance is not yet in place one would have to ask why not. What is it about an organization from the top down that is broken or missing that causes data security and privacy to be neglected?
Copilot is like a robot vacuum cleaner that gets under the bed and pulls out everything you say you should have “washed and cleaned”.
If you don’t know your data, understand your data, and secure the data that needs to be secured, the consequence will be that Copilot processes it in uncontrolled ways.
Finding number 4
NTNU states that Microsoft 365 is challenging to manage.
How is M365 challenging to manage? In what way do the authors document the management difficulties?
The authors of the report state that to understand M365 Copilot, you need to understand Microsoft 365. But without a proper understanding of the difference between organisational responsibilities and IT responsibilities, the statement leaves the reader in bewilderment.
Data governance is an organizational function stemming from a set of principles which guide decisions. Ergo, if your principles are not in place, we’ll know it from the condition of your data. And these deficiencies existed long before you licensed Copilot.
We could argue that systems in general are challenging to manage, and that’s why we have experts who advise. It is challenging, and you need the right people to manage it – people able to keep up with the literature and the changes. And you need to accept your limitations and use experts when needed. Again, this is not really a Copilot issue.
If a technology is challenging to manage, it’s essential to understand why it is challenging in order to build the how that alleviates this pressure. We did not stop chasing the moon because it was challenging.
The statement from NTNU here feels more like an old man yelling at the cloud.
Finding number 5
Copilot is early in the development process. Yes, it’s a baby; it’s not even a year old yet, and like all babies it needs to learn all the steps before becoming an adult.
And if we look at how other apps have developed – Word 2.0 has only rudimentary similarities with Word for M365 – we can expect this product to gain features, functionality and finesse as time goes on. This is hardly a bad thing. Products that mature in cooperation with large userbases are more likely to mature in ways that benefit the userbase. This makes sense; feedback loops and current computing conditions have time to influence development.
According to Microsoft, Copilot is an AI-powered productivity tool that coordinates large language models (LLMs), content in Microsoft Graph, and the Microsoft 365 productivity apps that you use every day, such as Word, Excel, PowerPoint, Outlook, Teams, and others. This integration provides real-time intelligent assistance, enabling users to enhance their creativity, productivity, and skills. From Microsoft Copilot for Microsoft 365 overview | Microsoft Learn.
Enables users and gives intelligent assistance. These are important key words.
By fostering an environment that values generational change, we can ensure that the benefits of technology are maximized and that all age groups are included in the digital transformation journey.
Finding number 6
As finding 6 indicated, this project is about organisational development and the influence on the organisation, not only IT.
Copilot can and will make a difference to an organisation. It comes down to a matter of time efficiency. When we can produce the routine parts of a task very quickly, we can choose to use our expertise on the time we save. The potential for positive impact to users (whether they are internal users in an organisation or citizen users in a municipality) is huge.
And it will lead to potentially daunting challenges in transforming our work methods and reassessing our practices, as all new systems, organisational changes and the passage of time have done before and will do again.
The project’s focus on organizational development emphasizes the strategic use of data to drive decision-making and growth. Recognizing the inherent value of data and leveraging it effectively can lead to significant improvements in efficiency, innovation, and competitive advantage. It is about fostering a culture where data is not just an IT asset but a core business resource that everyone in the organization understands and utilizes to its full potential.
If we may briefly mention Microsoft Purview: it has many of the tools needed to keep data safe and to understand the value of the data in the organization. This is a leadership task, and it needs to be deeply rooted within the organization. So, as we have been saying for a long time – start there!
Know your data! Understand your data! Secure the data that requires protection!
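To make “understand your data” a little more concrete, here is a minimal sketch that summarises sensitivity-label coverage from a hypothetical CSV export of files and their labels. The column names are our assumption for illustration; this is not a Purview API call:

```python
import csv
from collections import Counter
from io import StringIO

def label_coverage(csv_text: str) -> dict:
    """Summarise sensitivity-label coverage from a CSV export.

    Assumes columns "file" and "label"; an empty label means the
    file is unlabelled. Both column names are invented examples.
    """
    counts = Counter()
    for row in csv.DictReader(StringIO(csv_text)):
        counts[row["label"] or "UNLABELLED"] += 1
    return dict(counts)
```

A report like this answers a leadership question, not an IT question: how much of our data have we actually classified?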
Finding number 7
Copilot can be used to monitor and measure performance and behavior.
So if we look at finding number 7, the specific privacy concerns regarding the use of Copilot for Microsoft 365 at NTNU are:
- Employee Surveillance: Copilot’s access allows it to rapidly analyze employee behavior and mood based on written content, raising concerns about privacy and potential misuse for employee performance evaluations without their knowledge.
- Data Handling: There is a risk of Copilot processing and combining personal information in new, uncontrolled ways, especially if the input command is imprecise, leading to possible misinterpretations or misuse of sensitive data.
- Public Administration Impact: The use of Copilot in public administration could affect transparency and trust, as it may provide incorrect conclusions or actions based on insufficient information.
- Security Risks: The possibility of Copilot generating inaccurate or hallucinatory content if lacking sufficient data, including false assessments of employees’ emotions or states of mind.
Our take on this concern is that it’s not based on the tool itself, but more on the ethical part of using Copilot. Remember, Copilot has access to everything you already have access to. The way you interpret data will always be up to you, the user, and by that account it can always be faulty.
So when IT director Håkon Alstad says it’s unlikely for NTNU to start using Copilot because of monitoring and measurement of performance, we can hypothesize that there is still a gap between understanding the technology’s functionality and how technical solutions inherit an organization’s ethics and morals.
“The rifle itself has no moral stature, since it has no will of its own. Naturally, it may be used by evil men for evil purposes, but there are more good men than evil, and while the latter cannot be persuaded to the path of righteousness by propaganda, they can certainly be corrected by good men with rifles.”

— Jeff Cooper, “The Art of the Rifle”
Who in the organization would be so overpermissioned as to be able to perform this monitoring through the correlation of logs and prompt examination? And moreover, who would allow this? Technology can’t – and shouldn’t – solve all issues. If there is a risk of people “watching” people, then this should be mitigated with appropriate controls drawn from basic information management and security principles, such as separation of duties, audit logging and access reviews, and internal policies or guidelines.
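As a toy illustration of a separation-of-duties control, here is a minimal Python sketch that flags users holding a “toxic” combination of roles. The role names and the combination are invented examples, not taken from any real role model:

```python
# Invented example: nobody should both read audit logs and
# evaluate the performance of the people those logs describe.
TOXIC_COMBINATIONS = [
    {"audit-log-reader", "performance-evaluator"},
]

def sod_violations(assignments: dict) -> list:
    """Return users whose role set contains a toxic combination of duties."""
    return [
        user
        for user, roles in assignments.items()
        for combo in TOXIC_COMBINATIONS
        if combo <= set(roles)
    ]
```

The point is not the code but the control: the risk of people watching people is managed by limiting who can hold which combinations of access, and by auditing the auditors.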
There is a high need for user adoption when working with these kinds of tools – understanding that you are the pilot. You need to treat whatever you produce with Copilot as just a draft; it’s unfinished.
It might be wrong if you have the wrong data accessible to you, and yes, it might hallucinate – but we dare say so do the users every once in a while.
Finding number 8
NTNU states that Copilot works really well sometimes, and that we can certainly agree on. It’s not perfect; it is very reliant on how you prompt and on the data available to the user.
Again, back to user adoption and having your data in order.
Here NTNU gives the recommendation that you should not use their project as a showstopper – and we emphatically agree. But do use some of their project results to get to the core of your challenges more quickly when starting your own project. And lastly, use your time and use it well.
And don’t forget the users. The users are always your biggest variable, so make sure you have user adoption in place as well as the security. Then you can turn on Copilot, knowing that you have secured what needs securing.
We have many posts on securing the users as well; just securing the data will not get you over the finish line. Please make sure that you have MFA, and that you grant just enough access at the right time.
It feels like it’s coming full circle at this point:
All together now! Clean your house – Know your data! Understand your data! Secure the data that needs securing!