Independent journalism about VU Amsterdam | Since 1953
27 November 2024

Campus & Culture

Marleen Stikker: ‘AI should not be developed with the purpose of contributing to The Netherlands Inc’

‘Build your own mail server, it’s really not rocket science’

Honorary doctor Marleen Stikker thinks we entrust our fate too easily to parties that don’t necessarily have our best interests at heart. ‘AI should not be developed with the purpose of contributing to The Netherlands Inc.’

During the Covid crisis we downloaded, under pressure, a whole range of new software onto our phones and computers: a Covid app, a digital Covid pass, various kinds of software to work from home, to identify ourselves and to be monitored online. Students were more or less forced to let Big Brother into their homes in order to take an online exam. Did this all happen too easily? In IT, does the fact that you can do something mean that you should?

But according to Marleen Stikker, an activist for public values in digitalization who was awarded an honorary doctorate by VU Amsterdam, there certainly is regard for the responsible application of digital technology. “Although decades late, the government is making laws to protect us”, Stikker says. “The initiative comes from the EU. There is already a privacy act, and a digital services act and a digital markets act are underway. It’s not perfect yet, but it’s a start. There are efforts to regulate behavioral advertising and the collection of biometric data in public space. A whole range of laws to regulate the digital domain is being designed, all instigated by citizens’ initiatives, by movements like People versus Big Tech and Bits of Freedom, and by scientists who have raised awareness. It’s a political struggle.”

“During the development of the Covid app and the Covid pass, a large coalition of civil rights organizations, scientists, technologists and hackers was involved”, Stikker remarks. “This coalition made privacy by design the leading principle in the design of the CoronaMelder, the Dutch Covid notification app. I personally think this is a great victory.”
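That privacy-by-design principle is quite concrete. CoronaMelder was built on a decentralized exposure notification model: phones broadcast short-lived random identifiers over Bluetooth, and all matching happens on the device itself, so no identity or location data is collected centrally. The Python sketch below only illustrates the idea; the real framework (the Google/Apple Exposure Notification system) uses different key derivation and key sizes, and the function names here are invented.

    import hmac
    import hashlib
    import os

    def daily_key() -> bytes:
        """A random key generated on the phone; it reveals nothing about its owner."""
        return os.urandom(16)

    def rolling_id(key: bytes, interval: int) -> bytes:
        """Derive a short-lived broadcast ID from the daily key. Phones
        broadcast these over Bluetooth and rotate them frequently, so an
        observer cannot link the broadcasts to a person or track them."""
        return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

    def exposed(observed: set, published_keys: list) -> bool:
        """Matching happens locally: the phone re-derives the rolling IDs of
        users who reported an infection and compares them with the IDs it
        overheard. Nothing the phone observed ever leaves the device."""
        return any(rolling_id(key, t) in observed
                   for key in published_keys for t in range(144))

    # Example: my phone overheard a contact's rolling ID at interval 42;
    # later, that contact reports an infection and publishes the daily key.
    contact = daily_key()
    print(exposed({rolling_id(contact, 42)}, [contact]))  # True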

Discriminatory algorithms

Stikker sees a turning point. “For decades hardly any attention was paid to the undermining effects of Big Tech, but by now it’s recognized that our strategic autonomy is at stake. How dependent are we on IT services we don’t have any control over? And how do we restrict the unbridled trade in data? But this alertness is still fragile. As soon as something urgent comes up, as with Covid, we quickly seem to forget certain principles. In the name of safety our government permits itself the use of discriminatory algorithms. We still have a long way to go.”

‘Donald Trump has ethics and so do the Dutch tax authorities’

The problem, according to Stikker, is that ethical considerations are made too late, when digital applications have already been developed and deployed. These considerations should be made before an application is developed. “Technology is not neutral”, Stikker says. “Ethics are embedded in algorithms, in the selection of the data being used, in the choice of certain applications. You should start by asking which values you want to make explicit and build into the application you are going to develop.”

What we should also consider: “The tricky thing when we talk about ethics is that, in popular usage, something ethical means something good and something unethical means something bad. But ethics simply means that you make certain values explicit and act according to them. Donald Trump has ethics, and so do the Dutch tax authorities involved in the current childcare benefits scandal. So when we talk about ethics, we should talk about what those values are exactly.”
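Her point that values hide inside seemingly technical choices can be made tangible with a small, entirely hypothetical fragment. Nothing below comes from a real system; the feature names and weights are invented to show how selecting data is itself an ethical decision.

    def risk_score(applicant: dict) -> float:
        """A made-up fraud 'risk score'. Every line is a value judgement
        disguised as engineering: deciding which data count, and how
        heavily, decides who gets flagged."""
        score = 0.0
        if applicant["second_nationality"]:   # proxy for origin: discriminatory
            score += 2.0
        if applicant["income"] < 20_000:      # penalizes poverty
            score += 1.5
        if applicant["form_error"]:           # treats a mistake as fraud
            score += 1.0
        return score

    print(risk_score({"second_nationality": True, "income": 18_000, "form_error": True}))  # 4.5

A developer who ships such a function has made ethical choices, whether or not anyone discussed them beforehand.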

Religious mystification

Stikker notes a certain defeatist attitude towards technology. “It is the Internet, so you should accept that you’re giving up your privacy. But that is a falsehood. You can apply privacy by design and opt for data minimization. It is the revenue models of the suppliers of digital services, and the data hunger of governments trying to make the world controllable and predictable, that turn us into fair game.”

“There is also a certain religious mystification of digital technology, a narrative in which man has no choice but to fuse with a higher form of intelligence, so-called artificial intelligence. But we don’t even know what intelligence is, what consciousness is.”

Reflection on the responsible application of AI does take place under the banner of the ELSA Labs (ELSA stands for Ethical, Legal and Societal Aspects, PB), as part of a larger AI research program financed by the Dutch National Growth Fund, with room for research by legal experts and social scientists.

‘Universities spend huge amounts of money on systems which do not meet our requirements and principles’

“But ELSA sits on the sidelines”, Stikker says, “instead of at the center. It should be the basis from which AI research is directed. AI should not be developed with the purpose of contributing to The Netherlands Inc. We should develop an AI that is in sync with our constitutional state and our fundamental rights.”

Tar baby

From a political perspective, IT has long been a tar baby, Stikker says. “Nobody wanted to be responsible for it, because you know budgets will always be exceeded. But by now the focus has shifted from automation to the digitalization of policy and the way we deal with data and algorithms. It’s about the relationship and the balance of power between citizens and the government, about the power of Big Tech over essential infrastructures, the role of platforms in, for instance, retail, mobility and tourism. It’s finally being recognized as a political issue.”

This imbalance of power has become so great that it’s uncertain whether it’s even possible to bring public values back into the digital domain, to establish digital safety. The task at hand is gigantic. “To make a comparison: in the food industry we have an intricate system of rules and supervision to guarantee food safety, all to ensure you’re not at risk when you buy something at the supermarket. But when we download something from the app store, we know by definition it’s bad news. So a lot of work will have to be done before this is fixed, and when you think about it, it’s rather strange that it took so long before a start was even made.”

Trade in data

As far as Stikker is concerned, the trade in data should be far better regulated. “For a long time we have put no limits whatsoever on the harvesting of data, which has resulted in the loss of our privacy and our sovereignty. And I think we are still far too tolerant of data harvesting.”

‘When we download something from the app store, we know by definition it’s bad news’

“Everyone agrees we should protect our bodily integrity, that something is terribly wrong when people are forced into a position where they have to market their bodies”, Stikker says. “The integrity of the person should likewise be protected; our personal data are not a commodity. People shouldn’t be forced to give up their personal data in return for digital services. People often give their consent to have their data exploited, but one might wonder whether there is a legal ground for that consent when you couldn’t have properly read the terms and conditions, or when you are excluded from a service if you don’t consent. Besides, the long-term consequences of giving consent can hardly be foreseen. So I think consent is simply not a solid legal ground for handing over your data, just as it would be illegal to ask for consent to give up parts of our bodies.”

“We can prevent people from getting into such a position, for instance with the way we shape our digital identities. Security, Privacy & Identity professor Bart Jacobs developed the IRMA (I Reveal My Attributes) pass, which allows you to share only those personal data that are relevant in a given situation.”
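The idea behind IRMA is selective disclosure: credentials carry attributes, and you reveal only the attribute a situation requires. The sketch below is a conceptual illustration only, not the real IRMA API; the actual system uses cryptographic attribute-based credentials, so the verifier cannot learn anything beyond the disclosed attribute.

    from dataclasses import dataclass

    @dataclass
    class Wallet:
        attributes: dict  # e.g. issued by a municipality or a university

        def disclose(self, requested: list) -> dict:
            """Reveal only the attributes a verifier asks for."""
            return {name: self.attributes[name] for name in requested}

    wallet = Wallet({
        "over_18": True,
        "name": "A. Student",
        "date_of_birth": "2002-05-01",
        "student_at": "VU Amsterdam",
    })

    # A liquor store only needs to know that you are over 18, not who you are:
    print(wallet.disclose(["over_18"]))  # {'over_18': True}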

The Dutch childcare benefits scandal, in which parents were falsely accused of benefits fraud, shows how dangerous this imbalance of power is, according to Stikker. “Every euro the government spends on data harvesting and algorithms enlarges this imbalance, because a labour union doesn’t have the means to gather data and build algorithms to analyse corporations or shareholders. The tax authorities sit on huge amounts of data, while the victims of the childcare benefits scandal don’t even have legal counsel. And thus we continue to build an ever greater imbalance of power.”

Digital education

In early 2020, when from one day to the next everybody had to work from home because of Covid, the universities’ IT departments built up, in a matter of weeks, a system of apps and tools for giving online classes, taking online exams and running a digital campus. “But how much knowledge do the universities have of the tools they are using?”, Stikker wonders. “How much control do they have over them, and is it all accountable? What happens with the data gathered by these tools? Ninety percent of all the software systems used by companies and organizations are impossible to check on those criteria. Yet it should be the standard in all procurement: the principle that you keep oversight over the things you are held accountable for.”

‘We continue to build an ever greater imbalance of power’

The public sector, including the universities, spends huge amounts of money on licenses for systems that do not meet our requirements and principles. “This is why we should rely more on open source software”, Stikker thinks. “Something like a mail server, for instance: it’s really not rocket science to build your own. Yet we outsource these things, meaning we lose not only our strategic autonomy but also our innovativeness, because you make yourself dependent on a third party that has no interest in improving a system or adding new features. Universities at least have the expertise to build their own systems, which would certainly benefit their students as well.”
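For readers tempted to take the mail server example literally: the core of a self-hosted setup is a standard open source mail transfer agent such as Postfix, plus a handful of DNS records (MX, SPF, DKIM, DMARC) for deliverability. A minimal, illustrative excerpt of a Postfix main.cf, with example.org and the certificate paths as placeholders:

    # /etc/postfix/main.cf (excerpt; example.org is a placeholder)
    myhostname = mail.example.org
    mydomain = example.org
    myorigin = $mydomain
    mydestination = $myhostname, $mydomain, localhost
    inet_interfaces = all
    # Opportunistic TLS for incoming and outgoing mail:
    smtpd_tls_cert_file = /etc/letsencrypt/live/mail.example.org/fullchain.pem
    smtpd_tls_key_file = /etc/letsencrypt/live/mail.example.org/privkey.pem
    smtpd_tls_security_level = may
    smtp_tls_security_level = may

The configuration itself is indeed not rocket science; in practice the work sits in operations, such as maintaining sender reputation and keeping the DNS records in order.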
