Following on from the two previous focus groups hosted by the Afghanistan and Central Asian Association (ACAA) in collaboration with Royal Holloway, University of London on the 11th and 23rd of April, the third and final focus group took place on the 25th of April, this time in an online format. As with the previous sessions, the focus was on the views and experiences of refugees and asylum seekers in relation to digital technology, particularly Artificial Intelligence (AI), as it affects their experience of seeking asylum in the UK. As a newly joined intern at the ACAA, I was able to observe this group discussion.
This online meeting focused heavily on the use of AI within the asylum system, reflecting on the experiences of asylum seekers who have witnessed first-hand how AI is used within the process. The group discussed both the negative and positive impacts of this technological development. While the discussion centred on the asylum process, the views and conclusions drawn apply equally to the wider use of AI across society and industry: although the technology is still new, it is already deployed in many sectors, so the takeaways from this discussion speak to the broader debate about how AI should be developed and used. I was also able to relate the experiences shared to my own life and my own use of AI outside the asylum process.
What struck me most from these discussions was the repeated concern that, unlike a human being, AI lacks emotions and therefore empathy. Using AI within the asylum process can pose serious challenges because AI is unable to truly understand and empathise with the asylum seeker and what they have been through; it cannot connect on an emotional level. Participants emphasised that this makes it difficult for them to get their feelings across, which can be crucial to their claim and to why they are seeking asylum in the first place. Many of them have endured great hardship to reach the point of claiming asylum, and they found it particularly frustrating that this cannot be conveyed through AI technology. AI cannot read body language, understand emotions or empathise with the person in front of it, and the asylum seekers were clear that these are precisely the things they need in order to get their case across.
Other technologies can pose similar challenges even when humans are still involved. Participants emphasised that interviews held over the phone also prevent them from connecting, despite the fact that they are speaking to a human being: body language cannot come across on the phone, so it is harder to convey emotion. There was a broad consensus that interviews should not only avoid AI but should be conducted in person. While I have no experience of the asylum process myself, this resonated with me strongly. Particularly since Covid, more and more services have shifted online or are now conducted over the phone. One example that has greatly affected my own life is the dramatic shift towards doctor's appointments being held over the phone. I have found it extremely difficult to get my views and feelings across when speaking solely over the phone, compared with in-person appointments. This has affected the medical treatment I have received and made the whole process of seeking medical attention far more stressful. I can only imagine what that experience is like in the context of seeking asylum.
Moreover, not only have medical appointments shifted online and over the phone, but some medical advice is now being provided by AI in the private sector. Private companies already offer AI mental health services, such as AI therapists, as well as AI systems that suggest diagnoses based on patients' symptoms. As the participants in the focus group pointed out, this sets a dangerous precedent: the party on the other side lacks human emotions and feelings. The emotional component of ill health, both mental and physical, can be vital in diagnosing and treating a patient, and I fear that the absence of human emotion in AI could threaten the health of individuals who turn to it for medical advice. Having myself been put off by the lack of in-person appointments, let alone the use of AI, I can see how the points raised in the focus group apply urgently to other industries.
A further danger, which I think is vital to emphasise, is that AI systems can contain inherent biases and can therefore produce biased answers and recommendations. These include, but are not limited to, racial and sexist biases, which pose a great risk when AI is used in the asylum process. If the growing use of AI is not properly monitored by human beings, it could shape real-world decisions on the basis of biased data and information. This threat endangers everyone, not just those seeking asylum: people increasingly turn to AI in their day-to-day lives for general advice, from medical guidance to feedback on their work. In my own use of AI, I have found that the advice offered can be biased. As a half-Pakistani person, when I used AI for feedback on job cover letters, I found that it had changed the wording to remove my Pakistani heritage from the letter. This in itself illustrates the biases AI has 'learnt' and how they could filter further into society and into work produced with AI involvement.
The overarching consensus I took away from the focus group was that, while this new technology offers potential positives, such as quicker access to resources through chatbots for basic information, there is a valid concern that AI will remove the humanity from the asylum-seeking process. Its use needs to be paired with greater guidance for those using AI, particularly older users, and the process must retain in-person, face-to-face contact with asylum seekers to preserve its humanity. This is something we can all relate to, whether it is wanting to keep human contact with one's doctor or wanting to ensure creative work is produced free of the biases inherent to AI. For all the positives of technological development, it is more important than ever to be aware of its downfalls and to ensure that human involvement remains central in all realms of our society.
Written by: Saara Bradley