REGULATION OF ALGORITHMS IN THE NETHERLANDS

Reading time: 2 minutes

The Rathenau Institute, the leading Dutch think tank on the impact of technology on society, has identified the right to exercise control over technology as one of the public values to uphold in this era of digital transformation. In line with this thinking, the Dutch government is currently considering policies concerning the regulation of algorithms. Regulation should tackle the downsides of the use of algorithms, such as discrimination or opaque decision-making.

TOWARDS AN ALGORITHM IMPACT ASSESSMENT

In April 2020, the Dutch Ministry of the Interior and Kingdom Relations, which is responsible for monitoring digitization, informed Parliament that it would not establish an independent authority to regulate the use of algorithms. The cabinet noted that existing privacy law (based on the European GDPR) should sufficiently safeguard the interests of citizens. It also referred to a European consumer protection directive (2019/2161), which includes transparency obligations for personalized pricing. These rules still need to be translated into the day-to-day practice in which algorithms are used. The cabinet did convey, therefore, that it is developing an algorithm impact assessment. The idea is that this can be used as a tool for the ex-ante evaluation of the use of an algorithm. The tool should clarify whether risks are involved, for instance risks of unfair discrimination, while of course taking the existing privacy legislation into account.
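
To make the idea of such an ex-ante evaluation concrete, here is a minimal sketch of one fairness metric an impact assessment might compute: the ratio of positive decision rates between a protected group and a reference group. The function names, the example data, and the 0.8 threshold (borrowed from the American "four-fifths rule") are all illustrative assumptions, not part of any announced Dutch tool.

    # Illustrative sketch of one check an ex-ante algorithm impact
    # assessment might run: the "disparate impact" ratio between a
    # protected group and a reference group. All names and numbers
    # here are hypothetical examples.

    def selection_rate(outcomes: list[int]) -> float:
        """Share of positive decisions (1 = approved, 0 = rejected)."""
        return sum(outcomes) / len(outcomes)

    def disparate_impact(protected: list[int], reference: list[int]) -> float:
        """Ratio of selection rates; values far below 1.0 flag possible bias."""
        return selection_rate(protected) / selection_rate(reference)

    # Hypothetical decisions produced by an algorithm under review.
    group_a = [1, 0, 0, 0, 1, 0, 0, 0]   # protected group: 25% approved
    group_b = [1, 1, 0, 1, 1, 0, 1, 1]   # reference group: 75% approved

    ratio = disparate_impact(group_a, group_b)
    print(f"Disparate impact ratio: {ratio:.2f}")
    if ratio < 0.8:  # example threshold, the four-fifths rule
        print("Potential unfair discrimination: flag for further review.")

A real assessment would of course weigh many more factors, but a quantitative check of this kind is the sort of output that could feed the prior consultation discussed below.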

THE DUTCH DATA PROTECTION AUTHORITY

In addition to developing the algorithm impact assessment tool, the cabinet has designated the Dutch Data Protection Authority (Autoriteit Persoonsgegevens, AP) as the authority responsible for monitoring the application of algorithms. If necessary, the AP must be able to take enforcement action. The AP thus plays a vital role with regard to the use of algorithms in the Netherlands. According to its website, "the GDPR provides important tools for the supervision of algorithms. For example, a controller is required to maintain a register describing activities that process personal data, including the purposes of those processing operations. A controller must also assess the effect of a particular processing operation on the protection of personal data before carrying it out. This so-called data protection impact assessment (DPIA) is often mandatory when using algorithms. In it, a controller must properly substantiate why he or she uses certain data in an algorithm, the purpose of using the algorithm, and why it is necessary to work with that algorithm. If the risks identified cannot be sufficiently mitigated, it is mandatory to submit a prior consultation to the AP for advice."
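
As an illustration of the record-keeping duty the AP describes, the sketch below models a single entry in such a register of processing activities. The RegisterEntry class, its field names, and the credit-scoring example are hypothetical; the GDPR prescribes what a record must contain, not how it is stored.

    # A minimal, hypothetical sketch of one entry in a GDPR record of
    # processing activities. Field names are illustrative only.

    from dataclasses import dataclass

    @dataclass
    class RegisterEntry:
        activity: str                # what the algorithm does
        purposes: list[str]          # why the data is processed
        data_categories: list[str]   # which personal data is used
        necessity: str               # why this algorithm is needed
        dpia_required: bool          # high-risk processing triggers a DPIA

    # Example entry for a hypothetical automated credit-scoring system.
    credit_scoring = RegisterEntry(
        activity="Automated credit scoring",
        purposes=["Assess loan eligibility"],
        data_categories=["income", "payment history", "postal code"],
        necessity="Manual review of all applications is infeasible at scale",
        dpia_required=True,  # automated decisions with legal effects
    )
    print(credit_scoring)

Writing the register down in a structured form like this also makes it easier to reuse the same substantiation in the DPIA and, where needed, in a prior consultation with the AP.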

TOWARDS THE WILD WEST?

So is the AP on top of things, armed with an algorithm impact assessment tool? No, probably not even remotely. The AP is well known to be structurally and heavily understaffed. Chances are that the lead times of the so-called prior consultations will be immensely long, stretching businesses' go-to-market phases unpleasantly far. Moreover, this will probably open the door for competitors with, let's say, a bit less focus on regulatory compliance to enter the market without taking the proper steps first. Level playing field, anyone? We can only hope that the AP will control its technology just as much as it wants Dutch citizens to be in control of theirs. Only then can we avoid a wild west of checked and unchecked, discriminatory and non-discriminatory algorithms.

