Pioneering Deep Learning Innovations: EPFL’s Novel Algorithm for Power-Saving Neural Networks

by Manuel Costa

Researchers at EPFL have created a novel algorithm that trains analog neural networks as accurately as digital ones, offering an energy-saving alternative to conventional deep learning hardware. The approach is more closely aligned with how humans learn and has shown encouraging results in physical systems that compute with waves, with the aim of reducing the environmental footprint of deep neural networks. (Conceptual AI-generated DALL-E 3 image showing light waves interacting with a physical system.) Credit: © LWE/EPFL

The team at EPFL has introduced an algorithm designed for training analog neural networks, providing a more power-efficient option compared to the energy-demanding hardware used in deep learning.

Deep neural networks such as the ones behind ChatGPT, which learn from vast amounts of data rather than being explicitly programmed, seem to have boundless potential. However, as these systems have expanded in scope and influence, their size, complexity, and power consumption have also surged, raising concerns about their contribution to global carbon emissions.

Bucking the usual trend of moving from analog to digital, researchers are now exploring physical alternatives to digital deep neural networks to address these issues. In a paper published in the journal Science, Romain Fleury of EPFL's Laboratory of Wave Engineering (LWE) in the School of Engineering, together with his colleagues, describes an algorithm for training physical systems that is faster, more robust, and less power-hungry than existing techniques.

Ali Momeni, the lead author and a researcher at LWE, explains, “We applied our training algorithm on three different wave-based physical systems that utilize sound waves, light waves, and microwaves for information transmission instead of electrons. Our adaptable methodology is applicable for training any physical system.”

A “More Biologically Plausible” Method

Training a neural network means guiding the system toward the optimal parameter values for tasks like image or speech recognition. Training typically involves a forward pass, in which data flows through the network and an error function is calculated from the output, and a backward pass (also known as backpropagation, or BP), in which the gradient of the error function with respect to all network parameters is computed.

Through repeated cycles, the system self-adjusts based on these calculations to yield more precise values. However, this process is not only energy-demanding but also ill-suited for physical systems. Training physical systems often requires a digital counterpart for the BP step, leading to inefficiency and potential discrepancies between reality and simulation.
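To make the contrast concrete, here is a minimal sketch in plain Python/NumPy of one conventional digital training cycle: a forward pass produces an output, an error is measured, and backpropagation sends the gradient of that error back through every layer. The tiny two-layer network, its sizes, and the learning rate are illustrative assumptions, not the architecture used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer network; the sizes are illustrative only.
W1 = rng.normal(scale=0.1, size=(16, 4))   # first-layer weights
W2 = rng.normal(scale=0.1, size=(3, 16))   # second-layer weights

def forward(x):
    h = np.tanh(W1 @ x)      # forward pass through layer 1
    y = W2 @ h               # forward pass through layer 2
    return h, y

def train_step(x, target, lr=0.01):
    """One conventional training cycle: forward pass, error, backward pass."""
    global W1, W2
    h, y = forward(x)
    err = y - target                       # output error
    loss = 0.5 * np.sum(err ** 2)

    # Backward pass (backpropagation): the error gradient is propagated
    # back through every layer to update all parameters.
    grad_W2 = np.outer(err, h)
    grad_h = W2.T @ err
    grad_W1 = np.outer(grad_h * (1.0 - h ** 2), x)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1
    return loss

# Repeated cycles gradually reduce the error on a toy example.
x, target = rng.normal(size=4), np.array([1.0, 0.0, 0.0])
for _ in range(100):
    loss = train_step(x, target)
```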

The scientists proposed replacing the BP step with a second forward pass through the physical system, which updates each network layer locally. This reduces power consumption, eliminates the need for a digital twin, and more closely mirrors how humans learn.

Momeni adds, “Although neural networks are brain-inspired, it’s improbable that the brain learns through BP. Our concept is to train each physical layer locally with our actual system instead of constructing a digital version. Our approach is thus closer to biological plausibility.”
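The sketch below illustrates the general idea of forward-only, layer-local training described above. It is not EPFL's PhyLL implementation: the particular layer-local objective (a "goodness" score taken as mean squared activity) and the positive/negative data construction are assumptions borrowed from forward-forward-style local learning, chosen only to show how each layer can be updated without any backward pass.

```python
import numpy as np

rng = np.random.default_rng(1)
sizes = [8, 16, 16, 16]                      # illustrative layer widths
layers = [rng.normal(scale=0.1, size=(m, n))
          for n, m in zip(sizes[:-1], sizes[1:])]

def forward_and_grad(W, x):
    """One forward pass through a layer, plus the gradient of a simple
    layer-local objective (mean squared activity) with respect to W."""
    h = np.tanh(W @ x)
    grad = np.outer((2.0 / h.size) * h * (1.0 - h ** 2), x)
    return h, grad

def train_step(x_pos, x_neg, lr=0.05):
    """Two forward passes (one per data type) and a purely local update at
    each layer; no error signal is ever propagated backwards."""
    h_pos, h_neg = x_pos, x_neg
    for i, W in enumerate(layers):
        h_pos_next, g_pos = forward_and_grad(W, h_pos)
        h_neg_next, g_neg = forward_and_grad(W, h_neg)
        # Raise the local objective for "positive" data, lower it for
        # "negative" data, using only signals available at this layer.
        layers[i] = W + lr * (g_pos - g_neg)
        h_pos, h_neg = h_pos_next, h_neg_next

# Illustrative use: a correctly labelled ("positive") and a wrongly
# labelled ("negative") input; the exact construction is assumed here.
x_pos, x_neg = rng.normal(size=8), rng.normal(size=8)
for _ in range(200):
    train_step(x_pos, x_neg)
```

Because every update relies only on quantities produced by that layer's own forward pass, the backward pass, and with it the need for a differentiable digital twin of the hardware, disappears.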

The EPFL team, in collaboration with Philipp del Hougne of CNRS IETR and Babak Rahmani of Microsoft Research, applied their physical local learning algorithm (PhyLL) to experimental acoustic and microwave systems and to a simulated optical system, on classification tasks such as vowel-sound and image recognition. Their method matched the accuracy of BP-based training, proved adaptable and robust even under unpredictable external disturbances, and outperformed existing approaches.

Envisioning an Analog Future?

Although the LWE method is the first to train deep physical neural networks without BP, some parameter updates must still be performed digitally. Momeni notes, “It’s a hybrid training technique, but our goal is to minimize digital computation as much as possible.”

The researchers are now aiming to apply their algorithm to a compact optical system, aspiring to enhance network scalability.

“For our experiments, we used neural networks with up to 10 layers. The challenge is to see if this works with networks having 100 layers with billions of parameters. The next phase involves addressing the technical constraints of physical systems.”

Reference: “Backpropagation-free training of deep physical neural networks” by Ali Momeni, Babak Rahmani, Matthieu Malléjac, Philipp del Hougne, and Romain Fleury, 23 November 2023, Science.
DOI: 10.1126/science.adi8474

Frequently Asked Questions (FAQs) about Analog Neural Networks

What is the new algorithm developed by EPFL researchers?

EPFL researchers have created a groundbreaking algorithm that trains analog neural networks efficiently, offering a power-saving alternative to traditional digital networks. This method aligns more closely with human learning processes and is particularly effective in wave-based physical systems.

How does the new EPFL algorithm impact the environment?

The new algorithm developed by EPFL researchers aims to reduce the environmental impact of deep neural networks. By training analog neural networks more efficiently, it presents an energy-efficient alternative to digital networks, thereby potentially lowering global carbon emissions associated with large-scale AI systems.

What are the advantages of the EPFL algorithm over traditional methods?

The EPFL algorithm for training analog neural networks offers several advantages over traditional methods, including improved speed, enhanced robustness, and reduced power consumption. It also eliminates the need for a digital twin, making the process more efficient and reducing the risk of discrepancies between reality and simulation.

What does the research suggest about future neural network training?

The research by EPFL suggests a shift towards more biologically plausible methods of neural network training. The new algorithm trains each physical layer locally, reflecting natural human learning more closely and offering a more sustainable and efficient approach to deep learning.

Who collaborated on this research and where was it published?

The research was a collaborative effort involving scientists from EPFL, including Romain Fleury, Ali Momeni, and partners like Philipp del Hougne from CNRS IETR and Babak Rahmani from Microsoft Research. The findings were published in the journal Science.


5 comments

Sarah_84 December 21, 2023 - 10:27 pm

Interesting read. But i wonder how soon we’ll actually see these analog networks in action? seems like theres still a lot to do.

Mike Johnson December 22, 2023 - 1:47 am

wow this is big news! energy efficiency in AI is so crucial right now, glad to see EPFL is leading the way.

Emily Green December 22, 2023 - 10:46 am

This article is a bit hard to follow for a layperson? needs more simplification, especially when explaining the technical aspects.

TechGeek101 December 22, 2023 - 2:38 pm

gotta say, im impressed with the collab between EPFL and Microsoft Research. shows how academia and industry can work together for innovation

Raj Patel December 22, 2023 - 2:38 pm

its all about reducing carbon footprint these days, good on EPFL for pushing this in AI sector. But, is it really scalable, thats the question…

