Auditing social media: Portrayal of Migrants on YouTube
Auditing social media: (In)visibility of Political Content on Migration
What is the impact of social media on the representation and voice of migrants and refugees in Europe? What challenges and opportunities exist for avoiding their invisibilization and promoting fair representation?
Migrants and refugees are often misrepresented and denied agency, voice, and the right to be portrayed as complex human beings in European media. This report offers a comprehensive analysis of the causes of this misrepresentation and underrepresentation, and explores the role of social media in shaping public opinion, covering the following topics:
Social media platforms are developed and administered with little transparency and accountability, and their algorithms create bias, misrepresentation, discrimination, and suppression of voices.
Social media content moderation, shadowbanning, personalization, and targeted advertising all shape, and often restrict, the representation of marginalized groups.
Social media content often stereotypes migrants and refugees and flattens the complexity of their stories, while the platforms themselves enable new forms of digital surveillance that threaten their rights.
All these dynamics create and allow for new forms of manipulation of reality that affect us all, but hit the representation of marginalized social groups, such as migrants and refugees, hardest.
YouTube’s algorithm reinforces a dehumanizing view of migrants
Migrants and refugees are often depicted in a negative light on this platform, predominantly as non-white individuals crossing borders. This perpetuates their dehumanization and stereotyping, which can have real-world consequences for how they are perceived and treated by society.
TikTok wants to entertain, not talk about politics
As one of the most popular and influential social media platforms, TikTok can also shape the political discourse on migration. However, the audit found that TikTok's recommender system shows little variation in recommended content regardless of users' attitudes towards migration or the political leaning of their location.
The platform’s weak personalization for political content indicates that TikTok’s focus is primarily on entertainment rather than politics.
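To make the notion of "weak personalization" concrete, below is a minimal sketch of how an external audit might quantify it: simulated user profiles ("sock puppets") with opposing attitudes and different locations each collect a recommendation feed, and high overlap between the feeds of dissimilar profiles signals low personalization. The profile names, video IDs, and the jaccard helper are illustrative assumptions, not TikTok's API or the audit's actual pipeline.

```python
# Minimal sketch: quantifying how strongly a recommender personalizes
# content across simulated user profiles ("sock puppets"). Assumes the
# recommendation feeds have already been collected per profile; the
# sample data below is purely illustrative.
from itertools import combinations

def jaccard(a: set, b: set) -> float:
    """Overlap between two recommendation sets (1.0 = identical feeds)."""
    return len(a & b) / len(a | b) if a | b else 1.0

# Hypothetical feeds: video IDs recommended to profiles with different
# stances on migration and different (simulated) locations.
feeds = {
    "pro_migration_berlin":  {"v1", "v2", "v3", "v4", "v5"},
    "anti_migration_berlin": {"v1", "v2", "v3", "v4", "v6"},
    "neutral_warsaw":        {"v1", "v2", "v3", "v5", "v6"},
}

# High pairwise overlap across dissimilar profiles indicates weak
# personalization of this content category, as the TikTok audit reports.
for (name_a, feed_a), (name_b, feed_b) in combinations(feeds.items(), 2):
    print(f"{name_a} vs {name_b}: overlap = {jaccard(feed_a, feed_b):.2f}")
```

In a real audit, feeds would be collected repeatedly over time and compared against control profiles to filter out noise, but the underlying overlap measurement stays the same.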
The Digital Services Act (DSA) will require digital services companies, including social networks, to undergo independent audits and conduct risk assessments to ensure a safe digital space in which users' fundamental rights are protected.
Recommendations:
YouTube needs to further develop its recommendation system to better represent minorities such as migrants. The ability to independently audit the recommender systems of YouTube and other social networks is a high priority in addressing these issues.
YouTube and other social platforms should give researchers and research institutions access to the internal data needed to study the potential harms and risks posed by their recommender algorithms.
A collaborative effort by public institutions, experts, and YouTube itself should better define the set of biases that must not be present in YouTube's recommendation system, including those based on gender, race, ethnicity, and other factors.
Engaging migrant communities in all of these processes is another recommended step, as our study showed that migrants themselves are concerned about how they are represented, or rendered invisible, on social media.
Conclusion: 70% of videos viewed on YouTube are recommended by its algorithm, which, in the absence of regulation, remains opaque and unaccountable. In this audit, Eticas encourages the platform to commit to algorithmic transparency, improve its recommender system, and deepen its engagement with migrant communities in order to represent them more faithfully.