Emotional AI research team awarded ESRC co-funded project

Tue 7 January 2020

Emotional Artificial Intelligence (AI) technologies sense, learn and interact with citizens’ emotions, moods, attention and intentions. Data-driven sensors, robots and pervasive networking are changing how citizens experience cities, but not always for the better. Citizen needs and perspectives are often ancillary in emerging smart city deployments. This generates mistrust in new civic infrastructure and its management, as seen in recent pushback against facial detection and recognition technologies. 

A team of researchers including Dr Lachlan Urquhart of the Edinburgh Law School has been awarded a large grant, co-funded by the ESRC and the Japan Science & Technology Agency, to study how to prevent these issues from recurring as Emotional AI is rolled out in smart cities. The project will take a citizen-led, multisectoral, anticipatory, interdisciplinary and cross-cultural approach. Japan and the UK are at a critical juncture where technological, social and governance structures can be appropriately prepared before mass adoption of Emotional AI.

The project, titled 'Emotional AI in Cities: Cross Cultural Lessons from UK and Japan on Designing for An Ethical Life', runs for three years from 1 January 2020. Over that period the team will work on "biometric and online technologies that sense, learn and interact with emotions, moods, attention and intentions", and will examine the societal implications of these technologies in cities in both the UK and Japan.

The research team aims to assess what it means to live ethically and well with Emotional AI in smart cities across commercial, security and media contexts in both cultures. As well as interviewing key stakeholders developing or deploying Emotional AI in smart cities, the team will examine governance approaches (laws, norms, values) for the collection and use of intimate data about emotions in public spaces, to understand how these guide Emotional AI technological developments.

Importantly, it will seek to understand diverse citizens’ attitudes to Emotional AI, and will co-design citizen-led, creative visions of what it means to live ethically and well with Emotional AI in cities. Ultimately, it aims to feed all the research insights, including citizens’ views, back to the diverse stakeholders shaping usage of Emotional AI in cities.  

A lasting legacy of the grant will be a think tank to provide impartial ethical advice to governments, industry, educators and other stakeholders on Emotional AI and cross-cultural factors. 

This project is jointly funded by UK and Japanese research councils as part of the UKRI-JST Joint Call on Artificial Intelligence and Society. It runs from 1 January 2020 to 31 December 2022. Its Full Economic Cost value is approximately £710,000 (comprising £497,710 from UKRI's Economic and Social Research Council and ¥29,645,000 from Japan Science & Technology Agency funds).

The UK team is led by Andrew McStay (expert on social impact of emotional AI, Bangor University). Other UK Co-Investigators are Vian Bakir (dataveillance and disinformation expert, Bangor University), Dr. Lachlan Urquhart (multidisciplinary expert in IT law, computing and smart cities, University of Edinburgh) and Dr. Diana Miranda (criminology and surveillance technology expert, Northumbria University). The Japan team is led by Prof. Peter Mantello (dataveillance and predictive policing expert, Ritsumeikan Asia Pacific University). Other Japan Co-Investigators are Dr. Hiromi Tanaka (digital media & gender expert, Meiji University), Prof. Nader Ghotbi (cross-cultural ethics and health expert, Ritsumeikan Asia Pacific University), and Prof. Hiroshi Miyashita (AI and data privacy expert, Chuo University). 

Dr Lachlan Urquhart, Edinburgh lead on the project, commented: "As emotional AI (EAI) emerges in cities, it will have profound impacts on the daily lives of citizens. By attempting to make internal emotional states visible, it immediately raises questions about data privacy in public spaces, emotional surveillance of everyday life and how governance mechanisms should best protect civic values and rights. However, we are particularly interested in learning from different stakeholder groups, from citizens and regulators to law enforcement and industry. We want to understand what they think should shape the design of ethical EAI systems that they actually want to live with. By doing this in both Japan and the UK, we have scope for truly novel cross cultural lessons on best practice in governance and system design."

For more on this and related projects, see the Emotional AI lab.
