Internet Governance Forum 2017 Day 2


A full day of #IGF2017 began with a content-packed session on ‘Emerging Challenges for Data Protection in Latin American Countries’. As Kenya continues to grapple with the development of a stand-alone legal framework for data protection, the experiences from Latin America may provide some insights to guide future work in this area.

Besides dealing with traditional data protection issues such as (i) the concept of personal and anonymous data; (ii) consent and other legal bases for data processing; (iii) international transfer of personal data; and (iv) data protection enforcement, the workshop-style session also touched on emerging topics such as (i) privacy by design; (ii) the right to be forgotten; (iii) algorithmic accountability; and (iv) the complexity inherent in data exchanges between private and public entities when processing personal data. The session featured experiences from Brazil, Peru, Argentina, Colombia and Chile, the latter being the first country in Latin America to adopt a data protection law back in the 1990s. The key take-away from the panel is that the constitutional and legal traditions of most of Latin America carry very strong influences from Europe as well as the US. However, there is a need for Latin American countries to define and re-define their legal frameworks to address the unique challenges of their developing economies and young democracies, at a time when data protection, and by extension privacy, has become an important policy and legal issue.
The next session, ‘Surveillance from the Margins’, looked at different experiences of surveillance, with a specific focus on mapping and understanding the particular kinds of surveillance experienced by marginalised groups. Surveillance is discriminatory per se, in that it is directed specifically at people because of their gender, race, class, disability, sexual orientation, opinion, belief or religion, among other grounds. David Kaye, UN Special Rapporteur on Freedom of Expression, observed that there has been a move from mass surveillance to the direct, targeted surveillance of particular persons. Going back to 2013 and the Snowden revelations, much of the conversation about surveillance concerned its broad scale and scope, particularly by the United States and its partners, whose programmes swept in vast amounts of data without regard to individual privacy, freedom of expression, freedom of opinion and other fundamental rights. According to Kaye, one of the things that has changed over the last couple of years is that, while mass surveillance remains the subject of an ongoing debate, attention has shifted from the mass surveillance that Snowden disclosed to the way individual companies around the world have made spyware and surveillance technology available to anyone who can afford it, with the price dropping. This has been seen in particular in repressive places, where surveillance tools purchased from companies around the world, but in particular from Western Europe, are being used to target the work of journalists, activists or ordinary individuals who might, for one reason or another, be in a position of dissent from government. This raises the need to focus on how the move from mass surveillance to very targeted surveillance has a direct impact on individual expression and privacy, and on what kind of regulatory mechanisms should be in place to deal with the spread of this kind of technology.
The final session before lunch was ‘Local Content in the Media’, organised by the World Intellectual Property Organization (WIPO) and the European Broadcasting Union (EBU). The lively panel discussion looked at how a balanced and effective copyright system and media regulation represent a powerful tool for implementing any policy goal on local content production and distribution. Sometimes in content discussions it is forgotten that the first step in allowing access is actually the creation and production of the content we want. Thus, as WIPO notes, creation remains a pre-condition of access, so any sustainable policy on local content must also look at how it affects the creation side of things. The EBU’s comments touched on the importance of an enabling regulatory environment and enforcement within the copyright system as crucial success factors for broadcasters as content producers and distributors. According to the EBU, there is an intrinsic link between copyright law and the protection of local content, since copyright law is territorial in nature.
The first session of the afternoon was ‘Governance Innovation in the Age of Sharing Economy’. The session was designed to tackle the governance challenges emerging with the increased coverage and complexity of digitalisation, with regard to the objects, tools and approaches of governance. It brought the sharing economy, a new and active pattern in the digital economy, into focus, to review how the Internet and society governance model is evolving from a one-way approach to multistakeholder as well as online and offline cooperation. Over the years, there has been a convergence between online services and traditional offline services that is generating significant regulatory issues, and this has played out in three phases. Thinking back to Napster and other file-sharing sites, that was the first sharing economy; its limitation was that people were illegally sharing other people’s music, and eventually the sharing sites were shut down. This history allowed Apple to introduce its iTunes platform, and other services such as Spotify came along and put traditional physical distributors of CDs and DVDs out of business. Similarly, the emergence of over-the-top (OTT) services such as Voice over IP and Video over IP disrupted traditional telecommunications operators, competing with them over the same equipment and lines through online services such as Skype, Netflix and YouTube. That created real regulatory issues, because the traditional sectors were regulated while the OTTs were ostensibly not.
The last session of the afternoon was ‘Big Data, Business and Respect for Human Rights’. The session organisers were the Swiss Federal Department of Foreign Affairs (FDFA) and the European Broadcasting Union (EBU). The EBU has made the challenge of big data a priority and has organised numerous events to discuss why big data is such a sensitive issue for the media, as well as the interplay between big data and human rights. The FDFA has been following the topic of big data because of its implications for the realisation and promotion of the UN Guiding Principles on Business and Human Rights. Meanwhile, the Council of Europe is also working on big data related issues from a standard-setting angle, having already addressed the data protection implications of big data and now examining the broader impact of algorithms on human rights. In a press release issued on 4 April 2016, the Council of Europe Commissioner for Human Rights underlined that the effects of business practices on human rights have become a central issue for human rights protection. He also referred to a survey carried out by The Economist which highlighted that many businesses have started to view themselves as important actors in respecting human rights. While it is the task of governments to secure for everyone within their jurisdiction the rights and freedoms enshrined in the European Convention on Human Rights, there is now wide recognition that businesses are key actors in the respect for human rights, as confirmed by the Committee of Ministers in a 2014 Declaration and a 2016 Recommendation. The protection of personal data and the right to privacy online are at odds with the very nature of the Internet, which is to facilitate the free flow of data in an open environment. There is a growing technological ability to collect, process and extract new and predictive knowledge from data of great volume, velocity and variety, and the main issue is the analysis of that data using software to extract predictive knowledge for decision-making purposes regarding individuals and groups. A key take-away from this session was the following intervention from the audience: ‘I am worried about the fact that we always look at human rights as if it was just about my rights. And a difficult situation we live through now, where big data is used on the Internet in ways that affect us all in daily life, is the suggestion economy. And that suggestion economy is driven by a massive invasion of privacy. But the tool is not the individual directly; mostly it is the Judas principle: some of your friends get a free gift. It is not my data that I am asked to give, it is other people’s data. And what is being done with that data seems to me not to match what we just heard, of somebody trying to analyse it and reach a good conclusion; behind this is a robot making decisions. It is not a human making decisions, it is a robot. Actually many of them, but let’s say they tend to be the same thing. And they make those suggestions. They make so many decisions that no human person, nobody, can understand what this thing does. It creates so many situations to be analysed, maybe hundreds in a million, but that’s not going to make a difference. We don’t know what the other decisions are that these robots make. There is actually already a total loss of control. But worse than that, it makes money.
And, of course, if a company like one of these big companies that work in social networks does not do that, does not use it, it is going to be eliminated in a competitive environment. So even without desiring to do so, it is going to go on and do as much as it can in collecting data and making suggestions.’
