Carmela Troncoso wins Google Security and Privacy Research Award
Troncoso, a tenure-track assistant professor in the EPFL School of Computer and Communication Sciences (IC), was honored by Google for her work on privacy and security in machine learning.
Carmela Troncoso joined EPFL in 2017 as head of the Security and Privacy Engineering Laboratory (SPRING). She is among the first winners of the Google Security and Privacy Research Awards, which were announced last year as a way to “recognize academics who have made major contributions to the field”. A committee of senior Google researchers selects the winners, of whom there were seven in 2018, including Troncoso.
The SPRING lab develops novel approaches to protecting users from the downsides of machine learning.
“We live in a world where machine learning dominates us – there are a lot of systems designed to learn from data and make decisions that affect our lives, and we have few means to fight back,” Troncoso explains.
The lab is studying the introduction of small pieces of modified data into systems like social media platforms, to prevent machine learning algorithms from learning too much.
“Machine learning is very good at inferring information from data, but this can be a problem for privacy when the algorithm can infer your gender, preferences, or your plans for next week,” she says.
She and her colleagues are also developing what they call protective optimization technologies for situations when machine learning algorithms cause problems instead of solving them. For example, a route-planning app might optimize a journey by suggesting that drivers take a detour through a village to avoid a highway traffic jam. But the app won’t consider the residents of that village, who could be seriously affected if hundreds of drivers take the suggested detour. One such protective technology might be a tool to help city planners determine the minimum amount of traffic that must be diverted to ease congestion.
Open solutions
The Google award comes with $75,000 in research funding. Troncoso says she hopes to use the funds to develop open-access protective technologies, as well as privacy evaluation tools that provide a framework for testing and improving the security of systems based on machine learning.
“Privacy and decision-making are two sides of the same problem, because decisions and optimizations are based on how much a system can learn about you. We want people to live in a digital world without giving up their rights and individualism, so we build technologies to achieve that goal,” Troncoso summarizes.