A review of the implementation of the personal data (privacy) ordinance in the Hong Kong Correctional Services Department

1998 ◽  
Author(s):  
Chi-keung Kan
Data & Policy ◽  
2021 ◽  
Vol 3 ◽  
Author(s):  
Veronica Qin Ting Li ◽  
Masaru Yarime

Abstract Contemporary data tools such as online dashboards have been instrumental in monitoring the spread of the COVID-19 pandemic. These real-time interactive platforms allow citizens to understand the local, regional, and global spread of COVID-19 in a consolidated and intuitive manner. Despite this, little research has been conducted on how citizens respond to the data on these dashboards, with respect both to the pandemic and to data governance issues such as privacy. In this paper, we seek to answer the research question: how can governments use data tools, such as dashboards, to balance the trade-offs between safeguarding public health and protecting data privacy during a public health crisis? This study used surveys and semi-structured interviews to understand the perspectives of the developers and users of COVID-19 dashboards in Hong Kong. A typology was also developed to assess how Hong Kong’s dashboards navigated trade-offs between data disclosure and privacy at a time of crisis, compared to dashboards in other jurisdictions. Results reveal that two key factors were present in the design and improvement of COVID-19 dashboards in Hong Kong: informed actions based on open COVID-19 case data, and significant public trust built on data transparency. Finally, this study argues that norms surrounding the reporting of COVID-19 cases, as well as of cases in future pandemics, should be co-constructed among citizens and governments so that policies founded on such norms can be acknowledged as salient, credible, and legitimate.


2020 ◽  
Vol 7 (2) ◽  
pp. 325-343
Author(s):  
Robin Hui HUANG ◽  
Cynthia Sze Wai CHEUNG ◽  
Christine Meng Lu WANG

Abstract Mobile payment generally refers to transactions made through applications on a portable electronic device without the transfer of cash. As one of the most disruptive technologies in finance, mobile payment has been rapidly transforming the traditional financial industry. While it brings important benefits, it also carries various risks, in terms of liquidity, security, and data privacy, that call for adequate regulatory responses. As a global financial centre, Hong Kong has gradually established a regulatory framework for mobile payment, addressing the relevant risks with rules on payment and privacy. However, there is still room for improvement, particularly in measures to deal with cybersecurity issues and to strengthen the protection of personal data. The Hong Kong experience suggests that, to regulate a new and fast-growing industry such as mobile payment, the regulatory regime needs to be improved continuously to alleviate risk concerns, so as to enhance the protection of financial consumers and society at large.


Author(s):  
Daniel Amo ◽  
David Fonseca ◽  
Marc Alier ◽  
Francisco José García-Peñalvo ◽  
María José Casañ ◽  
...  

2021 ◽  
Vol 4 ◽  
Author(s):  
Vibhushinie Bentotahewa ◽  
Chaminda Hewage ◽  
Jason Williams

The growing dependency on digital technologies is becoming a way of life, and at the same time, the collection of data through those technologies for surveillance operations has raised concerns. Notably, some countries use digital surveillance technologies to track and monitor individuals and populations in order to prevent the transmission of the new coronavirus. The technology has the capacity to contribute to tackling the pandemic effectively, but that success comes at the expense of privacy rights. The crucial point is that, regardless of who uses these technologies and through which mechanism, they will in one way or another infringe on personal privacy. Therefore, when considering the use of technologies to combat the pandemic, the focus should also be on the impact that facial recognition cameras, police surveillance drones, and other digital surveillance devices have on the privacy rights of those under surveillance. The GDPR was established to ensure that information can be shared without infringing on personal data or harming businesses; therefore, in generating Big Data, it is important to ensure that the information is securely collected, processed, transmitted, stored, and accessed in accordance with established rules. This paper focuses on the Big Data challenges associated with the surveillance methods used in the context of COVID-19. The aim of this research is to propose practical solutions to the Big Data challenges associated with COVID-19 pandemic surveillance approaches. To that end, the researchers identify the surveillance measures being used by countries in different regions, the sensitivity of the data generated, and the issues associated with collecting large volumes of data, and finally propose feasible solutions to protect the privacy rights of the people during the post-COVID-19 era.


2019 ◽  
Vol 22 (1) ◽  
Author(s):  
Miguel Ehecatl Morales-Trujillo ◽  
Gabriel Alberto García-Mireles ◽  
Erick Orlando Matla-Cruz ◽  
Mario Piattini

Protecting personal data in current software systems is a complex issue that requires both legal regulations and constraints on how personal data are managed, and methodological support for developing software systems that safeguard the data privacy of their users. The Privacy by Design (PbD) approach has been proposed to address this issue and has been applied to systems development in a variety of application domains. The aim of this work is to determine the presence of PbD, and its extent, in software development efforts. A systematic mapping study was conducted to identify relevant literature that collects PbD principles and goals in software development, as well as methods and/or practices that support privacy-aware software development. The 53 selected papers address PbD mostly from a theoretical perspective, with proposals validated primarily through experiences or examples. The findings suggest a need to develop privacy-aware methods that are integrated at all stages of the software development life cycle and to validate them in industrial settings.
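To make the idea concrete, one commonly cited PbD practice is pseudonymizing direct identifiers at the point of collection, so that downstream components never handle raw personal data. The sketch below is illustrative only and is not taken from the mapped studies; the key, field names, and record shape are all hypothetical.

```python
import hashlib
import hmac

# Hypothetical secret key; in practice this would live in a secrets vault
# and be rotated, since whoever holds it can link tokens to identifiers.
SECRET_KEY = b"rotate-me-regularly"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed, deterministic token.

    HMAC-SHA256 yields the same token for the same input (so records can
    still be joined) without exposing the identifier itself.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Illustrative record: the identifier is tokenized before storage,
# while non-identifying fields pass through unchanged.
record = {"user_id": "alice@example.com", "visits": 12}
safe_record = {**record, "user_id": pseudonymize(record["user_id"])}
```

A design such as this addresses only one PbD goal (data minimization through pseudonymization); a full privacy-aware life cycle would also cover consent, retention, and access control.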


Author(s):  
Anastasia Kozyreva ◽  
Philipp Lorenz-Spreen ◽  
Ralph Hertwig ◽  
Stephan Lewandowsky ◽  
Stefan M. Herzog

Abstract People rely on data-driven AI technologies nearly every time they go online, whether they are shopping, scrolling through news feeds, or looking for entertainment. Yet despite their ubiquity, personalization algorithms and the associated large-scale collection of personal data have largely escaped public scrutiny. Policy makers who wish to introduce regulations that respect people’s attitudes towards privacy and algorithmic personalization on the Internet would greatly benefit from knowing how people perceive personalization and personal data collection. To contribute to an empirical foundation for this knowledge, we surveyed public attitudes towards key aspects of algorithmic personalization, and people’s data privacy concerns and behavior, using representative online samples in Germany (N = 1065), Great Britain (N = 1092), and the United States (N = 1059). Our findings show that people object to the collection and use of sensitive personal information and to the personalization of political campaigning and, in Germany and Great Britain, to the personalization of news sources. Encouragingly, these attitudes are independent of political preferences: people across the political spectrum share the same concerns about their data privacy and show similar levels of acceptance of personalized digital services and of the use of private data for personalization. We also found an acceptability gap: people are more accepting of personalized services than of the collection of the personal data and information required for those services. A large majority of respondents rated, on average, personalized services as more acceptable than the collection of personal information or data. The acceptability gap can be observed at both the aggregate and the individual level: across countries, between 64% and 75% of respondents showed an acceptability gap.
Our findings suggest a need for transparent algorithmic personalization that minimizes the use of personal data, respects people’s preferences on personalization, is easy to adjust, and does not extend to political advertising.
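At the individual level, the acceptability gap can be read as a simple per-respondent comparison: a respondent exhibits the gap when they rate a personalized service as more acceptable than the collection of the data that powers it. The sketch below illustrates that computation on made-up ratings; the scale, field names, and values are hypothetical, not the study's data.

```python
# Hypothetical ratings on a scale where higher = more acceptable.
# A respondent shows an "acceptability gap" when they rate the
# personalized service above the underlying data collection.
respondents = [
    {"service": 4, "data_collection": 2},
    {"service": 3, "data_collection": 3},
    {"service": 5, "data_collection": 1},
    {"service": 2, "data_collection": 4},
]

# Share of respondents whose service rating exceeds their
# data-collection rating (the individual-level gap measure).
gap_share = sum(r["service"] > r["data_collection"] for r in respondents) / len(respondents)
print(f"{gap_share:.0%} of respondents show an acceptability gap")
```

With this measure, the study's reported 64–75% would correspond to `gap_share` computed per country sample.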

