Medical AI: Dancing between Data Utilization and Privacy Protection (醫療AI: 在資料利用與隱私保護之間起舞)

Author(s):  
Yue WANG

LANGUAGE NOTE | Document text in Chinese; abstract in English only.

At present, the development of AI depends on three core elements: high-quality data, accurate algorithms and sufficient computing power. New technologies represented by big data, cloud computing and AI are exerting a significant impact on traditional data protection. Individuals' control over their personal data is weakening, data protection is becoming more difficult, and traditional privacy-protection measures are at risk of failure. These are the most representative problems in the conflict between the development of new technology and privacy protection. A new legal and ethical framework that values humans' physical safety, health and dignity should be established and deeply integrated into the entire life cycle of the design, production and application of medical AI. On this premise, efforts should be made to promote the development of medical AI for the benefit of mankind.

Author(s):  
Cristina Contartese

The aim of this work is to examine the European Court of Human Rights' (ECtHR) balancing exercise between genetic data protection and national security under Article 8 of the European Convention on Human Rights (ECHR). More specifically, it analyzes the core principles of the Strasbourg Court that the Council of Europe's Contracting States are required to apply when they collect and store genetic data for specific public-security purposes, such as the fight against crime. It will emerge that the Court, in consideration of the risks new technologies pose to an individual's data safeguards, pays special attention to strict limits on the period for which such data may be stored, and requires that their collection be justified by the existence of a pressing social need and a "careful scrutiny" of the principle of proportionality between the intrusive measure and the aim pursued. This work is divided into three main parts. The first part provides a general overview of personal data protection under Article 8, while the second and third parts concentrate, respectively, on the collection of genetic data and on their storage for police purposes.


Author(s):  
Paul Nemitz

Given the foreseeable pervasiveness of artificial intelligence (AI) in modern societies, it is legitimate and necessary to ask how this new technology must be shaped to support the maintenance and strengthening of constitutional democracy. This paper first describes the four core elements of today's digital power concentration, which need to be seen in cumulation and which, taken together, are a threat both to democracy and to functioning markets. It then recalls the experience with the lawless Internet, the relationship between technology and the law as it has developed in the Internet economy, and the experience with the GDPR, before moving on to the key question for AI in democracy: which of the challenges of AI can safely and in good conscience be left to ethics, and which need to be addressed by rules that are enforceable and carry the legitimacy of the democratic process, that is, by laws. The paper closes with a call for a new culture of incorporating the principles of democracy, rule of law and human rights by design in AI, and for a three-level technological impact assessment for new technologies like AI as a practical way forward. This article is part of a theme issue 'Governing artificial intelligence: ethical, legal, and technical opportunities and challenges'.


2016 ◽  
Vol 3 (1) ◽  
Author(s):  
Andrew Nicholas Cormack

Most studies on the use of digital student data adopt an ethical framework derived from human-subjects research, based on the informed consent of the experimental subject. However, consent gives universities little guidance on the use of learning analytics as a routine part of educational provision: which purposes are legitimate and which analyses involve an unacceptable risk of harm. Obtaining consent when students join a course will not give them meaningful control over their personal data three or more years later, and relying on consent may exclude those most likely to benefit from early interventions. This paper proposes an alternative framework based on European data protection law. Separating the processes of analysis (pattern-finding) and intervention (pattern-matching) gives students and staff continuing protection from inadvertent harm during data analysis; students have a fully informed choice whether or not to accept individual interventions; and organisations obtain clear guidance on how to conduct analysis, which analyses should not proceed, and when and how interventions should be offered. The framework provides formal support for practices that are already being adopted and helps with several open questions in learning analytics, including its application to small groups and alumni, automated processing, and privacy-sensitive data.


2014 ◽  
Vol 2 (2) ◽  
pp. 72 ◽  
Author(s):  
Joanna Kulesza

The paper covers the political and legal consequences of the extensive cyber-surveillance program deployed by the US, usually referred to by the codename PRISM. The author identifies the significant transnational legal challenges for privacy protection originating in US cybersecurity policy, and the steps taken by other states to limit its consequences harmful to individual privacy. The author covers the varying reactions to US-imposed privacy intrusions, from Brazil's plans to withdraw from the global network to some states' suggestions of holding Washington internationally responsible for violating the International Covenant on Civil and Political Rights. The paper's focus, however, is on European personal data protection, which has thus far failed to provide effective transnational protection of privacy, primarily through the strongly criticised and ineffective EU-US Safe Harbor arrangement. The EU personal data reform, approved by the European Parliament in March 2014, seems the most significant consequence of the mass privacy violations committed by the US National Security Agency and its agents. The Data Protection Regulation proposed in 2012, which, together with a new personal data Directive, is to replace the 1995 Data Protection Directive 95/46/EC, puts strong emphasis on the effectiveness of transboundary privacy protection, although it also covers many other significant changes, such as introducing the right to be forgotten and centralising personal data protection decisions thus far distributed among national Data Protection Authorities, which often vary in their interpretations of community law. The reform is to oblige all companies, regardless of their country of incorporation, to meet EU privacy laws, and it introduces heavy financial liability for those who fail to do so, making it a trigger for a significant change in the way online markets operate.
The European approach seems significant for the entire international community, not only because European citizens are an important element of the online markets, but also because personal data protection as a tool for safeguarding individual privacy has been adopted in over 100 of the world's roughly 190 countries. Including an element of transnational data protection in EU law is therefore certain to influence the approach to privacy on other continents.


FIAT JUSTISIA ◽  
2018 ◽  
Vol 12 (3) ◽  
pp. 206
Author(s):  
Rudi Natamiharja

The right to privacy is a fundamental individual right that should be protected. Ironically, this right is deliberately given up in public on social media, and Facebook, the largest social media platform, holds the personal data of more than 2.2 billion users worldwide. In early April 2018, the personal data of one million Indonesian Facebook users was taken by other parties. Mark Zuckerberg, as founder and CEO, acknowledged that Facebook data consisting of customers' personal data had been stolen and used by third parties. It is one of the weaknesses and instances of negligence on Facebook's part that needs to be addressed in the future. The Indonesian government issued a warning letter to Facebook and required a formal explanation of the recent cases. However, the government's seriousness about protecting the personal data of its citizens is still in question. How do Indonesian regulations cover the protection of citizens' private data, and what steps should be taken to protect personal data in Indonesia? Drawing on international instruments and Indonesian legal instruments on the protection of the right to privacy, this article answers what the Indonesian government should do to address this situation. The research found that the regulation of privacy protection is sufficient, yet the government lacks the determination to take privacy protection seriously, and no sanctions were imposed on the parties involved. Socialization of the importance of personal data should be carried out across Indonesian society, from the grassroots to the top level. Keywords: Right to Privacy, International Law, Fundamental Rights


Author(s):  
Federica Casarosa

The achievement of an adequate level of privacy protection is a demanding objective, especially for new technologies. One relatively new but growing class of users of Internet-related services consists of children and young people. However, while Internet services can improve minors' social skills and widen their knowledge, they can also open the door to privacy abuse and misuse. As it would not be feasible to address all the legal and technical tools available within the privacy protection process, this chapter focuses on a specific element required by regulation and applicable both in Europe and in the US: the inclusion of a privacy policy on any website that collects personal data from users. The paper provides an analysis of a selection of privacy policies available online from companies that focus specifically on children and from social networking sites. The analysis couples this descriptive part with suggestions to improve the level of compliance and, consequently, the level of protection for minors' privacy.


2021 ◽  
pp. 201-222
Author(s):  
Omri Ben-Shahar ◽  
Ariel Porat

Personalized law requires massive information, and this chapter examines some of the problems relating to the accumulation of personal data in the hands of the government. It first surveys what kinds of information would be needed and how lawmakers might hope to acquire that necessary data. While much information is already available in government databases, is it realistic to expect commercial databases to share the data with the government? The chapter then shifts to asking how the personalized commands would be communicated to actors. It argues, counterintuitively, that in important areas, private actors may often find it easier to know their personalized command than figure out the uniform command. Finally, the chapter examines problems of privacy and data protection, arising from the accumulation of data in the hands of governments. It argues that privacy interests vary across people, and thus privacy protection—like other aspects of personalized law—could itself be personalized, allowing people to opt out of some privacy-sensitive personalized treatments.


2018 ◽  
Vol 42 (3) ◽  
pp. 290-303 ◽  
Author(s):  
Montserrat Batet ◽  
David Sánchez

Purpose: To overcome the limitations of purely statistical approaches to data protection, the purpose of this paper is to propose Semantic Disclosure Control (SeDC): an inherently semantic privacy-protection paradigm that, by relying on state-of-the-art semantic technologies, rethinks privacy and data protection in terms of the meaning of the data.
Design/methodology/approach: The need for data-protection mechanisms able to manage data from a semantic perspective is discussed and the limitations of statistical approaches are highlighted. Then, SeDC is presented by detailing how it can be enforced to detect and protect sensitive data.
Findings: So far, data privacy has been tackled from a statistical perspective; that is, available solutions focus only on the distribution of data values. This contrasts with the semantic way in which humans understand and manage (sensitive) data. As a result, current solutions have limitations both in preventing disclosure risks and in preserving the semantics (utility) of the protected data.
Practical implications: SeDC captures more general, realistic and intuitive notions of privacy and information disclosure than purely statistical methods. As a result, it is better suited to protecting heterogeneous and unstructured data, which are the most common in current data-release scenarios. Moreover, SeDC preserves the semantics of the protected data better than statistical approaches, which is crucial when protected data are used for research.
Social implications: Individuals are increasingly aware of the privacy threats that the uncontrolled collection and exploitation of their personal data may produce. In this respect, SeDC offers an intuitive notion of privacy protection that users can easily understand. It also naturally captures the (non-quantitative) privacy notions stated in current legislation on personal data protection.
Originality/value: In contrast to statistical approaches to data protection, SeDC assesses disclosure risks and enforces data protection from a semantic perspective. As a result, it offers more general, intuitive, robust and utility-preserving protection of data, regardless of their type and structure.


Khazanah ◽  
2020 ◽  
Vol 12 (2) ◽  
Author(s):  
Hidayatun Nafi'ah ◽  
Athifah Nur Hasna

Background: Personal data is a fundamental right for everyone, including children. Children are the subjects most vulnerable to the processing of personal data, because they lack awareness and understanding of the risks of misuse of personal data. Regulations on the protection of children's personal data in Indonesia are already contained in the draft personal data protection law, but with very limited guidance. Through this comparative study, the researchers compare the United States' COPPA (Children's Online Privacy Protection Act) with the United Kingdom's "Children and the GDPR" guidance. Both of these regulations regulate the protection of children's personal data in great detail. This study provides a clearer picture of children's privacy-protection regulations so that it can serve as a reference for Indonesia's draft personal data protection law with regard to children's privacy rights. Method: This comparative research uses a qualitative descriptive method with a library research approach. Result: There are fundamental differences regarding the form of guidance, the definition of a child, the party processing the child's personal data, and what is included in a child's personal data. Conclusion: The application of children's personal data protection is adjusted to the values and culture of the country.

