From October 18, 14:45 until October 18, 17:45
Predictive policing systems, facial recognition, and emotion recognition are gaining in popularity. Facial recognition can help you unlock your phone, but it can also be used for surveillance, which can lead to various human rights violations. The latter is the case in Xinjiang, China, for example, where the Uyghur minority is suppressed with the help of facial recognition built into the region's surveillance cameras.
During this masterclass, we will examine the human rights violations that can arise from the indiscriminate deployment of these technologies. We will look at the Xinjiang example and the involvement of the global tech industry in these human rights violations. This masterclass focuses on the export of surveillance software from Europe to countries that use these technologies to violate human rights. What power do the EU and the Dutch government have to stop these exports? And what can be done to press the EU and national governments to respect human rights in their surveillance export policies? You will use your newly gained knowledge to discuss these issues with the other students in a case study.
What: Masterclass for students
What time: from 14:45 until 17:45
Where: Amnesty office, Keizersgracht 177 Amsterdam