For Minorities and Indigenous Peoples, a Bleak Technological Future, Minority Rights Group Reports

This year's edition of Minority Rights Group International's (MRG) annual Minority and Indigenous Trends report focuses on technology.

In what is most likely the first global survey of its kind, the 2020 report provides a comprehensive look at the distinct and varied impacts technology has on minorities, indigenous peoples and other groups vulnerable to discrimination and exclusion.

MRG is a leading international human rights organisation working to secure the rights of ethnic, religious and linguistic minorities and indigenous peoples. It works with more than 150 partners in over 50 countries.

MRG’s online World Directory of Minorities and Indigenous Peoples is used globally and updated frequently, with more than 50 entries updated this year alone.

MRG's Director of Policy and Communications, and one of the editors of the volume, Carl Söderbergh, emphasises: "Let us be clear, the global technology sector is contributing to massive human rights violations, harming the lives of millions of people belonging to minorities and indigenous peoples.

"The damage caused by tech ranges from serious health effects for children as young as 12 working in the cobalt mines of the southern Democratic Republic of Congo, to social media spurring violence against Copts in Egypt, Dalits, Muslims and other minorities in India, and people with albinism in Tanzania, to AI and machine learning contributing to mass surveillance of Uyghurs in China."

Technological biases against minorities and indigenous peoples are widespread and detrimental to human rights, from racially discriminatory facial recognition systems to predictive policing and intrusive surveillance methods that target minority communities.

MRG's report covers a range of examples. The European Union (EU) has piloted an AI-driven facial recognition and lie-detector video surveillance system for border control. Called iBorderCtrl, it is intended to replace human border guards with an avatar that asks questions while it scans for facial anomalies. It raises urgent concerns, for instance the risk that it will be unable to distinguish the lingering effects of trauma on asylum-seekers from signs of deception.

In the UK, the Metropolitan Police's machine learning Gangs Matrix programme compiles a database of gang members. Based in part on social media use and music listening habits, it disproportionately targets young black men, reinforcing stigma. Eighty per cent of those listed on the Gangs Matrix in 2019 were recorded as ‘African-Caribbean’.

In the US, AI tools perform pre-trial risk assessments, for instance to estimate a defendant's risk of re-offending. While one such tool, COMPAS, has been found to be correct in 60 per cent of cases, when it is wrong it exhibits stark racial bias: it is twice as likely to wrongly assign a high risk of recidivism to non-reoffending black defendants as to white defendants.

As Söderbergh explains: "Machine learning and AI intrude into every corner of our lives, replicating the biases of those who develop them, while the companies behind these algorithms remain reluctant to subject them to wide scrutiny."

The ‘digital divide’ plays a crucial role. Minorities and indigenous peoples are far less likely to have adequate access to computers and the internet, receive university degrees in science and technology, or participate in tech use and development:

In 2016, over 40 per cent of Roma in Bulgaria could not afford internet access, a computer or a smartphone.

In China, Uyghurs are actively barred from pursuing university degrees in science and technology. If they travel abroad to study, they risk putting family members at home in danger.

For Syrian refugees in Lebanon, many of whom lack residency permits, testing for COVID-19 could put them at risk of harassment.

For indigenous persons with disabilities in Nepal, the barriers to accessing assistive technologies include administrative obstacles, physical distance, unaffordability and cultural inappropriateness.

"Yet as #BlackLivesMatter shows, technology also provides powerful tools for documenting human rights violations and mobilising millions united in calls for change", Söderbergh adds.

From the indigenous Kuy in Cambodia using a community-based monitoring app to document and prevent illegal logging, to activists in Ecuador employing social media to disseminate information on COVID-19 in indigenous languages, and mobile finance providing a vital lifeline to indigenous Turkana in Kenya in times of drought and food insecurity – the report also shows how the power of technology can be harnessed for the common good.

"Going forward, it is essential that governments and tech companies adopt a human rights-based approach. This means actively ensuring the involvement of all those who face discrimination and exclusion at every stage of tech development", says Söderbergh.

"Only then will technology be transformed from what is often a tool of oppression controlled by the few into an empowering force benefiting the whole of society and enabling everyone to reach their full potential."

The report includes recommendations focused on mainstreaming human rights in the design, development, production, dissemination and use of technology.

 
