1 November 2024

Science & Education

‘Should one always refuse to partner up with controversial companies?’

Huawei, a company VU Amsterdam works with, has tested software on behalf of the Chinese government for recognizing Uighurs – an oppressed Muslim minority in China – as The Washington Post recently uncovered. However, nothing has to change for VU Amsterdam, according to Frank van Harmelen, AI professor and one of the two Directors of Research at the DReaMS Lab, the research partnership between VU Amsterdam, the University of Amsterdam and Huawei.

“We are not surprised that Huawei has tested this software”, claims Van Harmelen. “We already knew that Huawei has developed technology the Chinese government uses for these types of purposes and we condemn this. But we are working with Huawei’s European branch on people-centred artificial intelligence that complies with European regulations. In this way, I believe we can make a contribution to positive developments within the company.”

A better world

Van Harmelen is well aware that a partnership worth no more than three million euros will have only a limited impact on a company with an annual turnover of over one hundred billion euros. “But should one always refuse to partner up with controversial companies? Partnerships exist with companies such as Shell, for example, in order to develop sustainable energy for a better world, while at the same time the company is on trial in Nigeria for the violation of human rights. ING has been convicted of money laundering and Google has been convicted of market distortion three times. You constantly have to weigh up what is ethically responsible and what is not.”

Striking a balance

Huawei’s work includes developing software that makes it possible to filter “toxic use of language” on social media, just as Facebook and Twitter do. Doesn’t this increase the risk that the software may be used to combat statements that are disagreeable to the Chinese government? Van Harmelen does not think so. “This software focuses on European languages, but there is always a discussion about the dual use of technology for good and bad purposes, and that requires striking a balance.”

You could choose not to cooperate with a company that has a track record of using technology to violate human rights. “We have intense discussions on this topic within our research groups”, says Van Harmelen. “And this topic frequently comes up in our talks with Huawei. We are of course aware of the risks. Colleagues in our field have refused to cooperate on the development of facial recognition software because it can be used directly for the violation of human rights. Our own field has no such direct applicability.”

Strict guidelines

“We have read the article in The Washington Post and feel strongly that all types and applications of ethnic profiling are inexcusable”, says VU Amsterdam’s Executive Board. “Huawei’s response in the article raises many questions. The conditions for contract research with Huawei’s European branch have been drawn up according to strict European guidelines, and the search technology based on European languages that we are working on cannot be used for these kinds of purposes.”
