
2023-09-05
Eva Michely

Artificial neural networks can be more efficient: CISPA researcher Dr. Rebekka Burkholz receives ERC Starting Grant

Rebekka Burkholz wants to democratize machine learning. Her starting point: making artificial neural networks smaller and at the same time more efficient, so that they can eventually be developed on all kinds of devices and become available to more users. The European Research Council (ERC) is now funding her research project, called SPARSE-ML, for five years with an ERC Starting Grant totaling 1.5 million euros.

Machine learning is a rapidly growing area of AI that drives the knowledge acquisition of technological systems. In the training phase, these systems are fed data. From this so-called learning material, they derive regularities and correlations that they can then apply to new data. "In this way, very large and otherwise unmanageable amounts of data can be evaluated and analyzed for correlations," CISPA researcher Rebekka Burkholz explains. "I see great potential here for the field of biomedicine, for example for cancer research."

For her research on artificial neural networks, Burkholz has now received the support of the European Research Council (ERC). Neural networks are one of the possible foundations for machine learning. They are modeled on biological nervous systems and are designed to acquire and process information independently. The problem is that, in order to achieve the desired learning effect, they have to be fed huge amounts of data. At the same time, they grow in proportion to the amounts of data they must digest. "These large networks devour vast amounts of resources and computing power," Burkholz explains.

Artificial neural networks: Smaller is (likely to be) more efficient

In her project SPARSE-ML, Rebekka Burkholz tackles this challenge. Her goal is to make machine learning models smaller and more efficient. She assumes that leaner, more selectively trained neural networks can also achieve greater domain-specific performance. "Simply pruning existing neural network architectures is not enough to achieve this goal," Burkholz argues. In her project, she plans to use methods from statistical physics to scale neural networks down to smaller models that require fewer resources and less computing power.
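To illustrate the kind of standard pruning baseline Burkholz contrasts her approach with, the sketch below zeroes out the smallest-magnitude weights of a small PyTorch network. The toy architecture, the layer sizes and the 50% sparsity target are illustrative assumptions chosen for this example; they are not taken from the SPARSE-ML project.

    # Minimal illustration: magnitude-based (L1) pruning of a small network in PyTorch.
    # The architecture and the 50% sparsity target are assumptions for this sketch only.
    import torch.nn as nn
    import torch.nn.utils.prune as prune

    model = nn.Sequential(
        nn.Linear(784, 256),
        nn.ReLU(),
        nn.Linear(256, 10),
    )

    # Zero out the 50% of weights with the smallest magnitude in each Linear layer.
    for module in model.modules():
        if isinstance(module, nn.Linear):
            prune.l1_unstructured(module, name="weight", amount=0.5)
            prune.remove(module, "weight")  # bake the pruning mask into the weights

    # Report the resulting sparsity.
    total = sum(p.numel() for p in model.parameters())
    zeros = sum((p == 0).sum().item() for p in model.parameters())
    print(f"{zeros / total:.1%} of {total} parameters are now zero")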

Burkholz draws, among other things, on research results she obtained during her PhD at ETH Zurich. "Towards the end of my PhD in theoretical physics, I discovered something really amazing," Burkholz says. "My research was on cascade models, like those used in machine learning. I succeeded in making them analytically tractable using random graphs. This insight can now help me design compressed and efficient neural networks."

Prior research has already shown that compressing models can yield a number of advantages for machine learning. "For example, it increases their ability to generalize findings and apply them to new domains. They can better handle noise in the data and process data more efficiently in the learning process," Burkholz explains. Although it may not be possible to combine all of these advantages in a single model, she hopes to combine at least some of them. As part of the SPARSE-ML project, she will attempt to prove this in the application area of biomedicine.

Democratizing machine learning

Designing and training neural networks requires huge server capacities and large amounts of resources, which is why the development of the technology has been in the hands of the tech giants. "Research can't keep up here in terms of resources," Burkholz laments. "It's totally impressive what has been achieved so far with gigantic models. But it's mainly big corporations that can afford to do this at the moment. It's important to me that everyone has a chance to contribute to such an important technology. After all, it is changing the way we live." With SPARSE-ML, she hopes to help democratize the technology and lay the theoretical groundwork for leaner models that can be trained with fewer resources. And she hopes that in the future, these models can also be developed on the devices of ordinary users.

About Rebekka Burkholz

Dr. Rebekka Burkholz has been a CISPA Faculty member since 2021. Her primary research interest is machine learning, with a focus on the optimization of complex networks and their algorithms. She is particularly interested in advancing the benefits of machine learning in biomedicine and molecular biology. Burkholz received her PhD from ETH Zurich and conducted postdoctoral research in the Department of Biostatistics at Harvard University. Her awards include the 2017 Zurich Dissertation Prize and the 2015 CSF Best Contribution Award.

About the ERC

The ERC, set up by the European Union in 2007, is the premier European funding organisation for excellent frontier research. It funds creative researchers of any nationality and age to run projects based across Europe. The ERC offers four core grant schemes: Starting Grants, Consolidator Grants, Advanced Grants and Synergy Grants. With its additional Proof of Concept Grant scheme, the ERC helps grantees explore the innovation potential of their ideas or research results. The ERC is led by an independent governing body, the Scientific Council. Maria Leptin has been President of the ERC since 1 November 2021. The overall ERC budget for 2021 to 2027 is more than €16 billion, as part of the Horizon Europe programme, under the responsibility of the European Commissioner for Innovation, Research, Culture, Education and Youth, Mariya Gabriel.

Visit Project Page