After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie-detection tools tested at the border to systems for verifying documents and transcribing interviews, a wide range of technologies is being deployed in asylum applications. This article explores how these technologies have reshaped the way asylum procedures are conducted. It shows how asylum seekers are transformed into forced, hindered techno-users: they are asked to comply with a series of techno-bureaucratic steps and to keep up with unforeseen, minute changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their legal right to protection.
It also demonstrates how these technologies are embedded in refugee governance: they facilitate the 'circuits of financial-humanitarianism' that operate through a whirlwind of dispersed technological requirements. These requirements increase asylum seekers' socio-legal precarity by hindering them from accessing channels of protection. The article further argues that analyses of securitization and victimization should be coupled with insight into the disciplinary mechanisms of these technologies, through which migrants are turned into data-generating subjects who are disciplined by their dependence on technology.
Drawing on Foucault's notion of power/knowledge and territorial expertise, the article argues that these technologies have an inherent obstructiveness. They have a double effect: while they help to expedite the asylum process, they also make it difficult for refugees to navigate these systems. Refugees are positioned in a 'knowledge deficit' that leaves them vulnerable to illegitimate decisions made by non-governmental actors, and to ill-informed and unreliable narratives about their situations. Moreover, these technologies pose new risks of 'machine mistakes' that may result in incorrect or discriminatory outcomes.