Developing methods for the analysis of large amounts of data without compromising data protection.
Effects of increasing data protection efforts on research
Increasing digitization in nearly every area of life creates not only new opportunities for products and services but also problems, especially where privacy and the protection of sensitive information are concerned. While personalized services have become more common and the number of businesses trading in data has grown, data protection has also been strengthened in recent years, most notably with the European General Data Protection Regulation (GDPR), which came into effect in May 2018.
As a result, researchers and businesses that rely on often sensitive personal data come into conflict with these data protection efforts. Data quality is usually essential for analysis, yet classic anonymization methods tend to distort information heavily and therefore degrade that quality. Pseudonymization has typically served as an alternative approach, but since the GDPR still treats pseudonymized data as personal data, this option no longer suffices.
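To illustrate why classic anonymization degrades data quality, the following minimal Python sketch (a toy example with invented records, not a method from this project) generalizes an age attribute into coarse ranges, a typical k-anonymity-style step, and shows how the precision of a simple analysis drops.

```python
# Toy illustration (not the project's method): generalizing a quasi-identifier
# for k-anonymity-style anonymization and measuring the resulting loss of precision.
from statistics import mean

# Hypothetical records: (age, systolic blood pressure)
records = [(23, 118), (27, 121), (34, 130), (38, 128), (52, 141), (57, 145)]

def generalize_age(age, bucket_size=20):
    """Replace the exact age by a coarse range, e.g. 23 -> '20-39'."""
    low = (age // bucket_size) * bucket_size
    return f"{low}-{low + bucket_size - 1}"

# Original analysis: mean blood pressure of people in their twenties.
exact = mean(bp for age, bp in records if 20 <= age < 30)

# After generalization only the coarse bucket '20-39' remains,
# so the same question can only be answered for the wider group.
anonymized = [(generalize_age(age), bp) for age, bp in records]
coarse = mean(bp for bucket, bp in anonymized if bucket == "20-39")

print(f"exact mean (age 20-29): {exact:.1f}")
print(f"best available after anonymization (age 20-39): {coarse:.1f}")
```

The generalized data still supports broad statistics, but the finer-grained question can no longer be answered exactly; this is the kind of quality loss the project aims to reduce and make assessable.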
Methods compatible with data protection requirements
This project investigates methods that reduce the negative effects of anonymization on the results of big data analyses and that make these effects assessable.
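One common way to make such effects assessable, shown here purely as an illustration since the project's own methods are not specified, is to compare an analysis result before and after a privacy-preserving perturbation, for example Laplace noise as used in differential privacy:

```python
# Illustrative sketch only: measuring how much a Laplace-noise perturbation
# (as used in differential privacy) distorts a simple aggregate.
import random

random.seed(0)
incomes = [random.randint(20_000, 80_000) for _ in range(10_000)]

def dp_mean(values, epsilon, lower, upper):
    """Differentially private mean via the Laplace mechanism (toy version)."""
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / len(clipped)
    # Sensitivity of the mean of n values bounded in [lower, upper].
    sensitivity = (upper - lower) / len(clipped)
    # A Laplace sample with scale sensitivity/epsilon, built as the
    # difference of two exponential samples.
    rate = epsilon / sensitivity
    noise = random.expovariate(rate) - random.expovariate(rate)
    return true_mean + noise

true_mean = sum(incomes) / len(incomes)
for eps in (0.01, 0.1, 1.0):
    noisy = dp_mean(incomes, eps, 20_000, 80_000)
    print(f"epsilon={eps:>5}: error = {abs(noisy - true_mean):8.2f}")
```

Smaller epsilon means stronger privacy but a larger distortion of the result; reporting such error bounds is one way the impact of anonymization on analysis results can be quantified.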
An important aspect of the GDPR is informational self-determination, including the right to withdraw consent retroactively, the right to transparency, and the right to have data deleted. These rights have to be taken into account: newly developed methods must therefore guarantee transparency without creating new conflicts with data protection, and must allow data to be erased from complex data-processing systems.
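For a simple pipeline, erasure can be sketched as follows; the record layout and helper functions are hypothetical, and real systems with derived artefacts such as trained models, caches and backups are exactly the complex data-processing systems where this becomes difficult.

```python
# Hypothetical sketch of the right to erasure in a simple pipeline:
# remove one data subject's raw records and recompute every derived result.
from collections import defaultdict

records = [
    {"subject_id": "A", "city": "Salzburg", "purchases": 3},
    {"subject_id": "B", "city": "Vienna",   "purchases": 5},
    {"subject_id": "A", "city": "Salzburg", "purchases": 2},
]

def purchases_per_city(rows):
    """Derived result that must be recomputed after any erasure."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["city"]] += row["purchases"]
    return dict(totals)

def erase_subject(rows, subject_id):
    """Drop all raw records belonging to one data subject."""
    return [row for row in rows if row["subject_id"] != subject_id]

print("before erasure:", purchases_per_city(records))
records = erase_subject(records, "A")   # subject 'A' withdraws consent
print("after erasure: ", purchases_per_city(records))
```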
Relevance of the results
The goal is to close the gap between theoretical data protection requirements, rooted in the new rules of the GDPR, and the practical requirements of data processing, especially where a certain quality of results is needed, for example in the medical field.
As minimally distorting anonymization is vital for most parties who analyze personal data, a large number of follow-up projects and research collaborations is expected. The deletion of data is not only relevant to data protection but also concerns data retrieval: it is important to simultaneously build up know-how in data forensics, a subfield that opens up promising areas of application, for example in fighting economic crime.
Dr. Dipl.-Ing. Thomas Baumhauer BSc