Scientists Create Artificial Intelligence Agents to Decode Cause of Religious Conflict
London: Using Artificial Intelligence (AI), researchers at Oxford University have created psychologically realistic models of humans to test what causes religious conflict. The findings revealed that people are a peaceful species by nature. However, in a wide range of contexts they are willing to endorse violence — particularly when others go against the core beliefs that define their identity.
For the study, the researchers did not explicitly simulate violence, but instead focused on the conditions that enabled two specific periods of xenophobic social anxiety that then escalated into extreme physical violence. The researchers drew on the Northern Ireland Troubles, spanning three decades from 1968 to 1998, and the 2002 Gujarat riots in India.
“Most people are generally familiar with AI that uses machine learning to automate human tasks, such as classifying tweets as positive or negative,” said one of the study authors, Justin Lane from the University of Oxford. “But our study uses something called multi-agent AI to create a psychologically realistic model of a human, for example — how do they think, and particularly how do we identify with groups? Why would someone identify as Christian, Jewish or Muslim etc. Essentially how do our personal beliefs align with how a group defines itself?”
To create these psychologically realistic AI agents, the team used theories in cognitive psychology to mimic how a human being would naturally think and process information, to show how an individual’s beliefs match up with a group situation. They did this by looking at how humans process information against their own personal experiences. The researchers combined some AI models (mimicking people) that have had positive experiences with people from other faiths, and others that have had negative or neutral encounters. They did this to study the escalation and de-escalation of violence over time, and how it can, or cannot be managed. (IANS)
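The kind of multi-agent simulation described above can be illustrated with a minimal sketch. The code below is a hypothetical toy model, not the Oxford team's actual system: the `Agent` class, the single `anxiety` variable, the update rule, and the 0.7 threshold are all assumptions chosen for illustration. It shows the core idea of agents whose internal state shifts with the valence of out-group encounters, which is then summarized at the population level.

```python
import random


class Agent:
    """A minimal agent with a group identity and an anxiety level.

    Illustrative only: the real model in the study is far richer,
    drawing on cognitive psychology rather than a single scalar.
    """

    def __init__(self, group, anxiety=0.0):
        self.group = group
        self.anxiety = anxiety  # 0.0 (calm) .. 1.0 (primed to endorse violence)

    def encounter(self, other, valence):
        # Only out-group encounters move anxiety here:
        # negative encounters (valence=-1) raise it,
        # positive ones (valence=+1) lower it, neutral (0) leave it.
        if other.group != self.group:
            self.anxiety = min(1.0, max(0.0, self.anxiety - 0.1 * valence))


def simulate(agents, steps, rng):
    """Run random pairwise encounters and report the fraction of agents
    anxious enough (threshold 0.7, an assumed cutoff) to endorse violence."""
    for _ in range(steps):
        a, b = rng.sample(agents, 2)
        valence = rng.choice([-1, 0, 1])  # negative, neutral, or positive
        a.encounter(b, valence)
        b.encounter(a, valence)
    return sum(ag.anxiety > 0.7 for ag in agents) / len(agents)


rng = random.Random(42)  # fixed seed so runs are repeatable
population = [Agent("A") for _ in range(50)] + [Agent("B") for _ in range(50)]
print(simulate(population, 2000, rng))
```

Varying the mix of positive, negative and neutral encounters (the `valence` distribution) is the sketch's rough analogue of the researchers combining agents with different histories of inter-faith experience to study escalation and de-escalation over time.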