Human Rights in the Digital Age: Data Literacy in Tackling the Big Data Divide
Most of us no longer view the world as something that exists purely in material form. Rather, we view it as an ongoing process of online-offline interactions, continuously navigating our lives between offline and online realities. The way we learn, communicate, shop and even form relationships has changed in the past decade. While digital technologies have improved and enhanced our world in multiple ways, they have also brought about new challenges. What does it mean to meaningfully and consciously participate in an interconnected and data-centric digital world? And how can we align the values that drive digital progress with human rights and the Sustainable Development Goals?
The Age of [Datafied] Interdependence
According to the UN, we are now entering the age of interdependence. We are interdependent in the way our data is shared, circulated and, in turn, fed back to and at us. Most of us produce data. Our individual data-selves follow us around the web and across our devices (e.g. laptops, smartphones, smart devices). Both our periods of activity and inactivity are monitored. We are analysed while reading news, scrolling through our social media feeds or, in fact, when we are not using our smartphones at all. The information we see online is algorithmically curated and personalised. Our data profiles often carry information about our political affiliation, sexual orientation and sometimes even current and possible future health conditions.
Among many other things, our data can and does provide tech companies with information about ourselves. This includes interpretations of how we see the world and what we think about ourselves and others. Subsequently, those interpretations influence how we think about the world (for example, through targeted advertising). As Shoshana Zuboff argues, in today's data-centric society “your body is reimagined as a behaving object to be tracked and calculated for indexing and search”, whereby “your inner life – your intentions and motives, meaning and needs, preferences and desires, moods and emotions, personality and disposition, truth-telling and deceit – is summoned into the light for others’ profit” (Zuboff, 2019:255).
Positive Data-Driven Change
We must be careful not to demonise all of the data collection processes that are out there. Indeed, we must not underestimate the importance of data-driven social change and innovation. In the context of global development, data has been used to combat child labour, improve global health supply chains and enrich community participation and citizenship.
“Digital technologies are rapidly transforming society, simultaneously allowing for unprecedented advances in the human condition and giving rise to profound new challenges. Growing opportunities created by the application of digital technologies are paralleled by stark abuses and unintended consequences. Digital dividends co-exist with digital divides. And, as technological change has accelerated, the mechanisms for cooperation and governance of this landscape have failed to keep pace.”(The High-level Panel on Digital Cooperation, 2019.)
Human Agency, or Lack Thereof, and Data: Political Persuasion
Political persuasion is one striking example of how data-driven technology companies might disrupt democracy and possibly violate human rights. In their extensive research on political persuasion, the Tactical Tech Collective identified 300 companies worldwide that use data to give political parties insights into who voters are, what they want to hear and how to persuade them. Tactical Tech’s report provides examples of how data has been used to influence voters’ preferences – and thus their human agency – in countries such as Argentina, Brazil, India and the United Kingdom. While using information about voters in election campaigns is not new, it is the opacity and secrecy of data-driven elections that pose new ethical challenges to privacy, data governance, freedom of speech, disinformation and democracy itself.
Political persuasion data collection techniques are often presented as fun and harmless elements of our digital participation: interactive personality quizzes, memes, videos and games. As reported by the Tactical Tech Collective:
“Voters are generally unaware of their participation in experiments; moreover, permission is often requested by privacy policies that users tend to accept without reading. As a result of this lack of awareness, there’s no way for participants to opt-out. Furthermore, many voters are unaware of the impacts that past experiments may have had on them” (Bashyakarla et al., 2019:41)
It may be unclear whether, how and to what extent data might impact voters’ decisions. However, it is clear that some targeted voters feel confused, misinformed and surveilled.
The Big Data Divide
This widespread use of data-driven political persuasion is a symptom of a wider problem: the big data divide. This refers to the power imbalance between those who store, analyse and understand data (e.g. tech companies) and those who produce it (Internet users). One may argue that the interconnected and datafied world has resulted in a division between the data oppressed and data oppressors. While the data oppressors are able to utilise data insights for their chosen purposes, the ordinary data oppressed are only able to see “an opaque algorithm as a black box”. Thus, some argue that the big data divide, algorithmic selection and surveillance have created new power structures and new forms of inequality which extend the traditional patterns of class, gender, wealth and education.
In 2019, we are somehow stuck in the existing data power structures – often keen to participate and play, but also worried and powerless. In this interrelated reality, our personal data has become both incredibly valuable and inaccessible. There is a paradox here: to exercise our human rights online, we need to hand some of them away. Users often fluctuate between the desire for online communication and participation, privacy concerns and the illusion of free online services. Therefore, we often agree to the never-ending “terms and conditions”. Considering this, it is hard to decide to what extent our digital participation provides us with new opportunities for self-determination, and to what extent our digital world is algorithmically manipulated.
Addressing Data Oppression: Nurturing Critical Data Citizenship
Opposing any oppression should begin by increasing one’s ability to critically assess and understand existing power imbalances and their consequences. In line with Freire’s idea of democratic education, the [data] oppressed need to develop the critical consciousness that will allow them to view the current data power structures not as “a closed world from which there is no exit, but as a limiting situation which they can transform” (1970:49). Recognizing the importance of a human-centred digital world is crucial.
The data oppressed must be supported in developing curiosity about, and an understanding of, who their data oppressors might be. This may be particularly difficult in the era of highly personalised, ‘free’ and ubiquitous digital media. The blurred offline-online realities make it difficult to identify who your data oppressors really are, let alone how to oppose them. As Zara Rahman points out, “arguing with data is harder than arguing with people”.
Addressing instances of data oppression might sound like a challenging task, but it is not impossible. We must not forget that human rights apply fully in the digital world. Our rights to education, freedom of expression, privacy and data protection are now more important than ever. Consider Article 19 of the United Nations’ Universal Declaration of Human Rights, which emphasises citizens’ right of access to information – and thus to our data.
Data Literacy as a Human Right
Digital citizens should receive access to their personal data. Moreover, they must be equipped with the critical abilities to understand how data power structures can influence their lives. Thus, we must not view data literacy merely as an additional element of the educational curriculum; we must view it as a human and civic right. Just as traditional literacy is often understood within a rights-based approach and among the principles of inclusion underpinning the Global Goals for Sustainable Development, so in the Age of [Datafied] Interdependence should data literacy provide citizens with the knowledge and skills for informed, conscious and meaningful digital participation.
Working alongside the Me and My Big Data research team at the University of Liverpool made me realise that data literacy should indeed play a crucial role in future efforts to tackle the big data divide. However, we must not consider it a magic solution. Creating a human-centred digital world is possible, but it requires providing both data oppressors and the data oppressed with opportunities to explore the meaning of human rights within their digital citizenship. Moreover, these opportunities must include those who do not fit into the simplistic categories of data oppressors and data oppressed. What are the universal values that unite us as digital citizens? What kind of ethics guides our participation in, and creation of, the digital world? And how do democratic values fit into this possible world?
The United Nations: Approaching Human Rights in the Digital Age
The UN argues that applying human rights in the digital age requires improved coordination and communication between governments, technology companies, civil society and other stakeholders. As argued by Melinda Gates, of the UN Secretary-General’s High-level Panel on Digital Cooperation, through improved collaboration the co-creation of a more humane, just and oppression-free digital world is possible. Perhaps Freire’s idea of conscientização – the process of stepping back from our digital realities and engaging in collaborative critical reflection on the values of our interdependent realities – can provide all involved with new ideas on how to move beyond the current data-oppressive reality.
*In relation to this article, see the UN’s recommendations 3A, 3B and 3C in the report of the UN Secretary-General’s High-level Panel on Digital Cooperation.
In the cover picture: The Human Face of Big Data. Photo Credit: Dell EMC.
EDITOR’S NOTE: The opinions expressed here by Impakter columnists are their own, not those of Impakter.com.