Independent journalism about VU Amsterdam | Since 1953
10 January 2025

Science & Education

Microsoft 365’s AI assistant compromises user privacy

IT organisation SURF has issued a warning against the use of Microsoft Copilot in higher education, claiming that the AI assistant compromises the privacy of students, teachers and researchers.

Programs like Word, Excel and Outlook now include Microsoft 365’s new AI tool, Copilot, a smart assistant that can analyse data and documents, create presentations or draft emails.

But users in higher education can only access this AI feature if they have permission from their institution. According to SURF, the IT partnership organisation set up by Dutch education and research institutions, this permission should be denied. Last month, SURF issued an advisory following extensive research as well as consultation with software giant Microsoft.

The lengthy document highlights two key issues: besides concerns about user privacy, Copilot doesn’t always provide accurate or complete information, especially when it comes to queries about people.

Incomplete and incomprehensible

The personal data Copilot collects may not be limited to basic information like the user’s name and employer, but may also include personal preferences that help the tool write emails and create presentations.

“It’s not clear what personal data Microsoft collects and stores about the use of Microsoft 365 Copilot”, says SURF. Efforts to gain insight into Microsoft’s data collection have yielded little success, as the information provided by the company is both “incomplete and incomprehensible”.

False information

Another problem is the inaccurate or incomplete personal data Copilot may provide in response to a question about, say, a professor or politician. There’s a risk that users will place too much trust in this type of misinformation, says SURF.

Microsoft is well aware of this problem. “Sometimes Copilot will be right, other times usefully wrong – but it will always put you further ahead”, according to a blog post on the company’s website. Evidently, SURF has serious doubts about this supposed ‘usefulness’.

There could also be problems if higher education institutions were to use AI in their admissions processes, or to assess job applicants. Even if educational institutions do decide to allow the use of Copilot, they might have to make exceptions for certain departments.

Big tech

Criticism of big tech and generative AI is certainly nothing new. Major tech companies are often accused of having little regard for intellectual property rights and privacy, while shirking responsibility for the disruption caused by AI. Digital pioneer Marleen Stikker has even argued that higher education institutions have grounds to sue the creators of AI software for damages.

But the conflict between digitalisation and privacy predates the latest technological advances. Over five years ago, well before the advent of generative AI, Dutch university presidents sounded the alarm about the influence of major US tech companies on education. “The student and the teacher are becoming the product; the data no longer belongs to them, or to their institution”, they argued.

Cloud-based data storage raises similar questions: if all our data is stored on American servers, who can access it? A group of researchers from five universities are now joining forces to create their own cloud over the next two years. Their goal is to become independent of companies like Google and Microsoft.
