Automatic de-identification of data download packages

Data Science ◽  
2021 ◽  
pp. 1-20
Author(s):  
Laura Boeschoten ◽  
Roos Voorvaart ◽  
Ruben Van Den Goorbergh ◽  
Casper Kaandorp ◽  
Martine De Vos

The General Data Protection Regulation (GDPR) grants all natural persons the right to access their personal data if these are being processed by data controllers. Data controllers are obliged to share the data in an electronic format and often provide them in a so-called Data Download Package (DDP). These DDPs contain all data collected by public and private entities during the course of a citizen's digital life and form a treasure trove for social scientists. However, the data can be deeply private. To protect the privacy of research participants while using their DDPs for scientific research, we developed a de-identification algorithm that is able to handle typical characteristics of DDPs: regularly changing file structures, visual and textual content, differing file formats and structures, and private information such as usernames. We investigate the performance of the algorithm and illustrate how it can be tailored towards specific DDP structures.
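The published algorithm is not reproduced here, but the core idea of replacing private information such as usernames with labelled placeholders can be sketched as follows. This is a minimal illustration; the function name, placeholder labels and regular expressions are ours, not the paper's, and the real algorithm also handles visual content and varying file structures.

```python
import re

# Illustrative patterns only; a production de-identifier would need far more
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\+?\d[\d \-]{7,}\d"),
}

def de_identify(text: str, usernames: list[str]) -> str:
    """Replace known usernames and pattern matches with labelled placeholders."""
    # Known usernames first, so they are masked even inside other strings
    for name in usernames:
        text = re.sub(re.escape(name), "__username__", text, flags=re.IGNORECASE)
    # Then generic patterns such as e-mail addresses and phone numbers
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"__{label}__", text)
    return text
```

For example, `de_identify("Contact jane_doe at jane@example.com", ["jane_doe"])` masks both the username and the address while leaving the surrounding text intact.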

2019 ◽  
pp. 358-390
Author(s):  
Patrick Birkinshaw

The Freedom of Information Act is a statute of great constitutional significance. The Act heralded a right to publicly held information which government had attempted to keep private. FOIA laws have their origins in the pre-digital age and any discussion of information rights must take on board the contemporary reality of the global digitization of communications via social media networks and the enhanced capabilities of state intelligence agencies to conduct surveillance over electronic communications. The General Data Protection Regulation seeks to give greater security to personal data. However, private information is harvested by private tech companies which they have obtained often ‘voluntarily’ and used by intermediaries to influence public events, public power and elections—as illustrated by recent scandals involving the practice of ‘data farming’ by social media networks and the sale of personal data to political campaign consultants seeking to pinpoint electors and thereby affect the outcomes of national elections and referenda. Government surveillance is age-old, but the emergence of digital power has enabled public authority to invade our private lives far more intrusively and effectively. The most recent example is the Investigatory Powers Act 2016. All this poses substantial challenges for the public regulation of information access in a growing confusion of public and private in the constitution. Courts, meanwhile, have to balance demands for privacy protection, open justice and secrecy.


2019 ◽  
pp. 245-259
Author(s):  
Bernard Łukanko

The study is concerned with the mutual relationship between failure to comply with the laws on personal data protection and the regulations relating to the protection of personal interests, in particular the right to privacy. The article presents the views held by the Supreme Court with respect to the possibility of considering acts infringing the provisions of the Personal Data Protection Act of 1997 (until 24 May 2018) and of the General Data Protection Regulation (after 25 May 2018) as violations of personal interests, such as the right to privacy. The author shares the view of the case law that, if in specific circumstances the processing of personal data violates the right to privacy, the party concerned may seek a remedy on the grounds of Articles 23 and 24 of the Polish Civil Code. This position is also relevant after the entry into force of the GDPR, which, in a comprehensive and exhaustive manner, directly applicable in all Member States, regulates civil liability for infringements of the provisions of the Regulation; according to the position expressed in professional literature, however, it does not exclude the concurrence of claims arising from a violation of the provisions on the protection of personal interests caused by a specific event. In the case of improper processing of personal data, the remedies available under domestic law on the protection of personal interests may be of particular importance outside the subject-matter scope of the GDPR's applicability.


2020 ◽  
pp. 34-45

The right of transgender athletes to participate in sports competitions no longer seems to be in question, even if this right was only recently established. DSDs (Disorders of Sexual Development), having a genetic nature, are more widespread than commonly perceived (about one person affected in every 2,500 births). To these, we have to add all individuals whose sexual identification arises for psychological reasons. Given that, it is obvious that the question is much more significant (in numerical terms) than currently appears. We want to focus on the difficult balance between personal data protection and the fair competition principle after the entry into force of the EU General Data Protection Regulation (GDPR) on 25 May 2018. Under the GDPR, processing data concerning health or sex life is prohibited. Thus, data regarding sexual identity and/or any change of gender fall under special protection. In terms of sports law, the IOC Consensus Meeting on Sex Reassignment and Hyperandrogenism (November 2015) reformed the previous Stockholm Consensus on Sex Reassignment in Sports (2003). Under the new consensus, the completion of surgical anatomical changes is no longer a sine qua non condition; a declaration of gender by the athlete is sufficient. Recalling that athletes have to compete according to the fair competition principle, we wonder whether the European regulation collides with respect for this principle. How can we balance them? How can we solve this conflict under the GDPR rules, coordinated with the norms of legal sports systems?


Author(s):  
Sophie Kuebler-Wachendorff ◽  
Robert Luzsa ◽  
Johann Kranz ◽  
Stefan Mager ◽  
Emmanuel Syrmoudis ◽  
...  

For almost three years, the General Data Protection Regulation (GDPR) has been granting citizens of the European Union the right to obtain personal data from companies and to transfer these data to another company. The so-called Right to Data Portability (RtDP) promises to significantly reduce switching costs for consumers in digital service markets, provided that its potential is effectively translated into reality. Thus, of all the consumer rights in the GDPR, the RtDP has the potential to be the one with the most significant implications for digital markets and privacy. However, our research shows that the RtDP is barely known among consumers and can currently only be implemented in a fragmented manner, especially with regard to the direct transfer of data between online service providers. We discuss several ways to improve the implementation of this right in the present article.


Author(s):  
Mónica Correia ◽  
Guilhermina Rêgo ◽  
Rui Nunes

The European Union (EU) has faced high risks to individuals' privacy from the proliferation of personal data. Legislation has emerged that seeks to articulate all interests at stake, balancing the need for data flows among EU countries with the protection of personal data: the General Data Protection Regulation. One of the mechanisms established by this new law to strengthen individuals' control over their data is the so-called "right to be forgotten", the right to obtain from the controller the erasure of records. In gender transition, this right represents a powerful form of control over personal data, especially health data that may reveal a gender with which the person does not identify and which they reject. Therefore, it is pertinent to discern whether the right to have personal data, in particular health data, deleted is ethically acceptable in gender transition. Towards addressing the ethical dimensions of the right to be forgotten in this case, this study presents relevant concepts; briefly outlines the history, ethics and law of records, considering the evolution from paper to electronic format; describes the main aspects of identity construction and gender identity; and explores the relationship between privacy, data protection/information control and identity projection. It also discusses, in the context of gender transition, the relation between "the right to self-determination", "the right to delete", and "the right to identity and individuality". Conclusions on the ethical admissibility of the "right to be forgotten" as a means to control gender-affirming information are presented.


2020 ◽  
Vol 9 (1) ◽  
pp. 86-101
Author(s):  
Aleksandra Gebuza

The main aim of the article is to analyse the notion of the right to be forgotten as developed by the CJEU in Google v. AEPD & Gonzalez and by the General Data Protection Regulation, in the context of the processing of personal data on the Internet. The analysis compares the European and American jurisprudence and doctrine on this notion, in order to demonstrate the scale of the difficulty in applying the concept in practice.


2021 ◽  
Vol 16 (2) ◽  
pp. 63-75
Author(s):  
Denitza Toptchiyska

During the COVID-19 pandemic, in April 2020, the Ministry of Health in Bulgaria began administering the Virusafe contact tracing application. With the Law on Emergency Measures and Actions, declared by a decision of the National Assembly of 13 March 2020, amendments to the Electronic Communications Act were adopted. The purpose of the legislative amendments was to provide the competent authorities with access to localization data from the public electronic communication networks of individuals who have refused or do not fulfil the obligatory isolation or treatment under Art. 61 of the Health Act. This publication aims to analyse the main features of mobile applications for tracing the contacts of infected persons, as well as the adopted legislative changes, comparing them with the standards of personal data protection provided in the EU General Data Protection Regulation 2016/679 and Directive 2002/58/EC on privacy and electronic communications.


2021 ◽  
pp. 151-166
Author(s):  
Sonja Lučić

By participating in social networks such as Facebook, Twitter and Instagram, users are increasingly revealing private information on the Internet. Once published, data, whether images or other personal data, can be accessed with virtually no time limit. The idea of developing a "right to be forgotten" for the online sphere came from the French government. The European Commission subsequently took up this idea and proposed that, in the context of the revision of Data Protection Directive 95/46, the "right to be forgotten" be considered in more detail. Although representatives of the European Commission increasingly pointed out the importance of this right at public hearings, there were obstacles and serious resistance to its introduction, i.e. its legal regulation. It was only with Edward Snowden's revelations about the widespread surveillance of the Internet by the American National Security Agency (NSA), in connection with the increasingly widespread use of the Internet, that the question of the need for a "right to be forgotten" became topical again. The author points out the specifics of the "right to be forgotten", offers a comparative legal analysis of this institute, and gives a special review of current case law on the subject. The judgment of the European Court of Human Rights in Hurbain v Belgium provides further clarification of the "right to be forgotten" and a broader approach than that taken in the case law of other courts to balancing conflicting legal interests. Recognition of an individual's right to request a change in the digital archive of a newspaper publisher has expanded the tools available to individuals seeking the "right to be forgotten".


2018 ◽  
Vol 1 (XVIII) ◽  
pp. 335-353
Author(s):  
Weronika Kupny

The protection of the right to privacy is one of the basic human rights and a fundamental subject of most modern legal systems. Legal systems extend privacy protection instruments to a significant extent, but at the same time they find reasons to interfere strongly in this area. Certainly, the dynamic development of modern technologies does not help the legislator to find a comprehensive solution. The article deals with the subject of privacy protection in the employment relationship in the context of innovation and technological development. In this study, the author also compares the impact of the use of modern technologies in the workplace today, in the light of the applicable regulations, and tomorrow, taking into account the enactment of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).


2020 ◽  
Author(s):  
Robert Huber ◽  
Jens Klump

<p>“<em>We kill people based on metadata.</em>” (Gen. Michael V. Hayden, 2014) [1]</p><p>Over the past fifteen years, a number of persistent identifier (PID) systems have been built to help identify the stakeholders and their outputs in the research process and scholarly communication. Transparency is a fundamental principle of science, but it can conflict with the right to privacy. The development of Knowledge Graphs (KGs), however, introduces completely new, and possibly unintended, uses of publication metadata that require critical discussion. In particular, when personal data, such as those linked with ORCID identifiers, are used and connected with research artefacts and personal information, KGs make it possible to identify personal as well as collaborative networks of individuals. This ability to analyse KGs may be used in a harmful way. It is a sad fact that in some countries, personal relationships or research in certain subject areas can lead to discrimination, persecution or prison. We must, therefore, become aware of the risks and responsibilities that come with networked data in KGs.</p><p>The trustworthiness of PID systems and KGs has so far been discussed in technical and organisational terms. The inclusion of personal data requires a new definition of ‘trust’ in the context of PID systems and Knowledge Graphs, one which should also include ethical aspects and consider the principles of the General Data Protection Regulation.</p><p>New, trustworthy technological approaches are required to ensure proper maintenance of privacy. As a prerequisite, the level of interoperability between PID systems needs to be enhanced. Further, new methods and protocols need to be defined that enable secure and prompt cascading update or delete actions on personal data across PID systems and knowledge graphs.</p><p>Finally, new trustworthiness criteria must be defined which allow the identification of trusted clients for the exchange of personal data, instead of the currently practised open data policy, which can conflict with legislation protecting privacy and personal data.</p><p>[1] https://www.nybooks.com/daily/2014/05/10/we-kill-people-based-metadata/</p>
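The linkage risk described in the abstract can be illustrated with a toy sketch: joining openly published (ORCID iD, work identifier) pairs on the shared work is already enough to derive a person-to-person collaboration network. All identifiers below are made up for illustration; real PID metadata would come from services such as the ORCID or DataCite public APIs.

```python
from itertools import combinations
from collections import defaultdict

# Illustrative (ORCID iD, work DOI) pairs, as exposed by open PID metadata
records = [
    ("0000-0001-0000-0001", "10.5555/paper-a"),
    ("0000-0001-0000-0002", "10.5555/paper-a"),
    ("0000-0001-0000-0002", "10.5555/paper-b"),
    ("0000-0001-0000-0003", "10.5555/paper-b"),
]

# Group contributors by work
works = defaultdict(set)
for orcid, doi in records:
    works[doi].add(orcid)

# Co-contribution edges fall out of the join: each pair of ORCID iDs
# attached to the same work becomes a person-to-person link
edges = set()
for authors in works.values():
    for a, b in combinations(sorted(authors), 2):
        edges.add((a, b))
```

Two records per person suffice to place that person in a network, which is why the abstract argues that deleting or restricting personal data must cascade across PID systems and the knowledge graphs built from them.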

