Erasmus University Rotterdam (EUR) is an internationally oriented research university with a strong social orientation in its research and teaching.
With its research impact and the quality of its education, EUR can compete with the top European universities. Important values for Erasmus University Rotterdam are daring, curiosity, social engagement, working at the frontier, and striving for success.
Erasmus University Rotterdam hosts the Erasmus Research Centre for Media, Communication, and Culture (ERMeCC). The Centre’s mission is to operate as an international, national and local centre of expertise for high-quality research into the myriad relationships between media, society and culture.
Part of this centre is the MAPS research cluster, which addresses the social, political, and ethical issues connected to media practices, the proliferating use of artificial intelligence, privacy negotiations, various forms of surveillance, and (cyber)security approaches and challenges.
Erasmus University Rotterdam is involved in all aspects of the SPATIAL project's development but focuses on a social science-based analysis of its research and innovation activities.
Principal Investigator
Dr. Jason Pridmore is an Associate Professor in the Department of Media and Communication and the Vice Dean of Education of the Erasmus School of History, Culture and Communication at Erasmus University. His work focuses on practices of digital identification, mobile devices, security issues, and the use of new and social media and consumer data. Jason is the coordinator of the TRESCA project, the Project Exploitation Manager and Data Security Manager on the BIM-SPEED project, and the project lead at EUR for the Ashvin Project and the SPATIAL project; he was also the Principal Investigator on the Mobile Privacy Project.
- How did you join SPATIAL?
I was involved in the development of the SPATIAL project at an early stage, as a result of many conversations I had with SPATIAL project lead Aaron Ding (TUD). We wanted to develop a project focused on making the embedding of AI in cybersecurity and IoT solutions more trustworthy and transparent. As Aaron indicated in an earlier interview, this is a socio-technical challenge, and we defined the project in a way that combines technological development with social science understanding and experience to ensure the highest level of success.
- What are your expectations in a project of this nature?
The SPATIAL project partners will develop resilient accountability metrics, privacy-preserving methods, verification tools, and system solutions that will serve as critical building blocks for trustworthy AI. In addition, we will facilitate appropriate skill development and education for AI developers to strike a balance among technological complexity, societal complexity, and value conflicts in AI deployment.
- What can the research community expect from SPATIAL?
Aside from the project’s technical ambitions, SPATIAL has been designed to include a strong social science component. The complexity of creating algorithmic accountability as a technical process is compounded by the differing experiences and demands of designers and users, as well as by technological potentials and limitations. We want to draw on this experience so that these practices can be replicated elsewhere, something that will significantly benefit the research community. We also expect that our educational modules and technological tools, which are grounded in this approach, will support such replication.
- Where do you see SPATIAL results in 10 years?
I hope that the combination of technological and educational tools will form the basis of future AI development practices around IoT and security. We strive to make the findings, products, and solutions of the project available to AI developers in and beyond Europe.