80 – Empowering users of online services

Using online services means ticking a lot of boxes in consent forms, but do we always know what we are saying yes to? In this podcast we talk to Farzaneh Karegar, PhD in Computer Science. In her research she has proposed, designed, and tested usable, legally compliant tools and solutions that empower users to take control of their data when using online services.

We talk about the trade-offs between convenience and privacy when making choices online; tools that both users and service providers can benefit from; and consent form designs that motivate users to pay more attention to what they are disclosing and for what purposes. Farzaneh also introduces us to dark patterns, which are prominent in, for example, cookie consent banners, and explains why they can be a pitfall even for policy designers with good intentions.

Farzaneh Karegar’s doctoral thesis can be downloaded from DiVA: The Lord of Their Data Under the GDPR?: Empowering Users Through Usable Transparency, Intervenability, and Consent

79 – The Public Interest in the Data Society

The public interest, in its ideal form, offers the possibility for all to exercise individual rights and freedoms, such as freedom of expression and information or the right to personal data protection. However, in practice the definition of public interest can vary depending on the context.

In her doctoral thesis in Media and Communication Studies, Maud Bernisson investigates how the notion of public interest was constructed in relation to digital media during the policymaking process for the GDPR (General Data Protection Regulation). Through interviews with key actors in the process, combined with extensive in-depth document analyses, Maud shows that the GDPR redefines the public interest in a way that diverges from its ideal form. In our conversation, Maud explains the reasons for this divergence and how it has affected how the GDPR works for EU citizens.

Maud Bernisson’s doctoral thesis can be downloaded from DiVA: The Public Interest in the Data Society: Deconstructing the Policy Network Imaginary of the GDPR

41 – Improving transparency in personal data processing

The European Union’s General Data Protection Regulation has been implemented to protect citizens’ data privacy by, for example, increasing their control over their personal data. However, computerized systems and web services are not always designed to give users the control they are legally entitledled to in a usable way. In her thesis, Farzaneh Karegar, Ph.D. student in computer science, develops new solutions that enhance transparency and make it easier for users to give better-informed consent to service providers handling their personal data. In our conversation Farzaneh tells us more about these solutions, and why it is important to keep working to narrow the gap between legally compliant and usable services.

Farzaneh Karegar’s licentiate thesis can be retrieved from DiVA: Towards Improving Transparency, Intervenability, and Consent in HCI

32 – Personal data privacy

In our digitalised world, more and more of our personal information is stored on networked computers and servers. Stakeholders handling personal information thus need to make sure their systems are secure and protect the privacy of individuals. Automated privacy audits are one approach to ensuring that stakeholders do in fact maintain the privacy of personal data. But as Jenni Reuben shows in her research, these audits can themselves be subject to privacy risks. In our conversation Jenni, Ph.D. student in computer science, tells us more about these risks and about the model she proposes to prevent illegitimate access to personal data.

To read Jenni Reuben’s dissertation, please follow this link: Privacy-aware Use of Accountability Evidence