PRIVACY OF SOCIAL RELATIONS AND PRIVACY-UTILITY TRADE-OFFS.
Social networks raise severe privacy concerns because diverse personal information is shared. This information is further propagated and potentially aggregated, which can reveal detailed information about an individual. Unfortunately, these threats often involve complex statistical inference and are therefore far from transparent to the end user. We improve the understanding of the types of information that can be leaked, such as insights into social relations, and of how inherent privacy-utility trade-offs can be formalized and implemented in such networks.
Privacy of Social Relations. We investigate novel social relation inference attacks based on multi-modal input such as users’ mobility features and visual information. To counter these threats, we research novel and effective mitigation strategies against social link privacy risks. We draw on models from social psychology to arrive at a holistic categorization of social relations, which allows us to quantify risks and the effectiveness of mitigation strategies at different levels of granularity.
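To make the notion of a mobility-based inference signal concrete, the following minimal sketch scores candidate social links by counting co-location events in check-in data. It is purely illustrative and not the attack studied in this line of work; the data layout, time window, and scoring are assumptions.

```python
from collections import defaultdict
from itertools import combinations

# Illustrative sketch (not the project's actual attack): two users who
# repeatedly appear at the same place within the same time window are
# scored as more likely to share a social tie.

def colocation_scores(checkins, time_window=3600):
    """checkins: list of (user, place_id, unix_timestamp) tuples (assumed format)."""
    # Bucket check-ins by (place, coarse time slot).
    buckets = defaultdict(set)
    for user, place, ts in checkins:
        buckets[(place, ts // time_window)].add(user)

    # Count how often each user pair co-occurs in a bucket.
    scores = defaultdict(int)
    for users in buckets.values():
        for u, v in combinations(sorted(users), 2):
            scores[(u, v)] += 1
    return scores

# Example usage: the highest-scoring pairs are candidate social links.
checkins = [
    ("alice", "cafe_1", 1000), ("bob", "cafe_1", 1200),
    ("alice", "gym_7", 90000), ("bob", "gym_7", 90500),
    ("carol", "cafe_1", 500000),
]
print(sorted(colocation_scores(checkins).items(), key=lambda kv: -kv[1]))
```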
Privacy-Utility Trade-Offs. Privacy and utility (e.g., communication) are often partially conflicting objectives. We seek a rigorous understanding of how information propagates online and of the resulting privacy-utility trade-offs. This enables trade-offs that best align with a particular application domain or with personal preferences. We research methods that systematically suggest alternative actions on social networks that implement an optimal trade-off between privacy and utility. Efficiency is a key concern in making these methods practical for real-world deployment.
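As a toy illustration of suggesting alternative actions under a privacy-utility trade-off, the sketch below selects, among hypothetical sharing actions, the one maximizing a weighted objective utility(a) - lam * risk(a). The scoring functions, action set, and weight lam are assumptions for illustration only, not the methods developed here.

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str       # e.g. "post photo publicly", "post to friends only"
    utility: float  # hypothetical communication value
    risk: float     # hypothetical social-relation inference risk

def best_tradeoff(actions, lam=1.0):
    """Return the action maximizing utility minus lam-weighted privacy risk."""
    return max(actions, key=lambda a: a.utility - lam * a.risk)

candidates = [
    Action("post publicly with location", utility=1.0, risk=0.9),
    Action("post publicly without location", utility=0.8, risk=0.4),
    Action("post to close friends only", utility=0.5, risk=0.1),
]

# A larger lam expresses a stronger personal preference for privacy,
# shifting the suggested action toward more restrictive sharing.
print(best_tradeoff(candidates, lam=0.5).name)
print(best_tradeoff(candidates, lam=2.0).name)
```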