The Human and the Algorithm: Response to Council of Europe’s Draft Recommendation on Human Rights Impacts of Algorithmic Systems
17 September 2019
In July 2019, the Council of Europe invited public feedback on its draft recommendation on the human rights impacts of algorithmic systems. IFLA, through its Committee on Freedom of Access to Information and Freedom of Expression, has responded, highlighting the issues and priorities of libraries.
Algorithmic systems are becoming increasingly influential in our lives. These systems are applications that use one (or several) algorithms to gather and/or analyse data to solve a defined problem – e.g. to categorise, select, offer recommendations or make decisions. Technological advancement has seen the use of such systems grow in both private and public spheres.
Setting a Standard: the Draft Council of Europe Recommendation
The draft recommendation outlines how the development and use of such systems can affect human rights and fundamental freedoms – both in everyday life and in public service delivery in particular. It explains that the potential benefits, such as increased service efficiency, can come alongside significant human rights challenges.
These challenges can arise, for example, from data collection at scale, biases and errors in algorithmic decision-making, and uncertainty about a system’s possible impacts on users or their environment.
In light of such challenges, the draft recommendation offers two sets of guidelines – one for state actors and another for private actors – to safeguard and promote human rights when developing and using algorithmic systems. These recommendations include legislative and regulatory frameworks, literacy programmes, human rights impact assessments and continuous evaluations, data management practices, research and innovation, and more.
The Library Perspective: IFLA’s Response
This discussion has practical relevance for the library sector, since algorithmic systems have a number of potential uses in a library – for example, in search and discovery engines. In addition, broader use of algorithmic systems can have an impact on the values that many libraries throughout the world champion: intellectual freedom, access to information, empowerment and non-discrimination.
That is why IFLA has prepared a submission for this call for comments. Some of the key messages in IFLA’s response include the following:
- The impact of algorithmic systems on intellectual freedom can be broad, affecting the informational autonomy of individuals and their freedom of thought. It is important to understand and safeguard human rights in this area alongside the key issues relating to other rights (such as non-discrimination or privacy).
- The guidelines include a recommendation to maintain analogue alternatives for key public services. The library sector’s experiences show how digital (often algorithmic) and analogue services can co-exist. Based on these experiences, it is important to consider possible budget constraints and differences in quality of service when discussing the viability of maintaining analogue services. It is also important for people to have meaningful choice and access to both versions of a service.
- The recommendation calls on various stakeholders to provide digital, media and information literacy training for the broad population. This is crucial to ensure that people can make informed choices and exercise their human rights in algorithmic contexts. Libraries have substantial experience offering digital and media literacy initiatives, and can offer their expertise to the various digital literacy programmes and efforts that the recommendation discusses. The relative absence of entry barriers makes libraries well-suited to reach older learners and marginalised populations.
We would like to thank all the contributors who have provided feedback on IFLA’s submission to this call for comments.
Read the full text of IFLA’s submission: