Revolutionizing AI Training: Efficient Method Utilizing Physical Processes Instead of Digital Networks

Artificial intelligence (AI) is a powerful tool, but it also consumes a lot of energy. The more complex the tasks AI is trained for, the more energy it requires. Víctor López-Pastor and Florian Marquardt, scientists from the Max Planck Institute for the Science of Light in Germany, have come up with a more efficient way to train AI. They propose using physical processes instead of traditional digital neural networks.

OpenAI has not disclosed how much energy it took to train GPT-3, the model behind ChatGPT. According to an estimate by Statista, it would require as much energy as 200 German households with three or more people consume in a year. And while GPT-3 can learn patterns in its training data, it doesn't truly understand what they mean.

Neural Networks on Neuromorphic Computers

In recent years, research institutions have been exploring a concept called neuromorphic computing as a way to cut the energy consumption of computers, especially for AI. Unlike traditional neural networks, which run on conventional digital computers, neuromorphic computing aims to mimic the brain's way of processing information. Digital computers perform calculations step by step, while neuromorphic computers process data in parallel, combining processing and memory in one place.

According to Florian Marquardt, director at the Max Planck Institute for the Science of Light and professor at the University of Erlangen, traditional neural networks consume a lot of energy simply transferring data back and forth between the processor and memory. If the human brain worked the same way, it would be far less efficient and would overheat.

Neuromorphic computers aim to replicate the brain’s parallel processing by using different systems, such as photonic circuits that use light instead of electrons to perform calculations. These systems function as both switches and memory cells.

A Self-Learning Physical Machine

Víctor López-Pastor and Florian Marquardt have developed an efficient training method for neuromorphic computers called a self-learning physical machine. Unlike traditional AI training, this method doesn’t require external feedback to adjust synaptic connections. The machine optimizes its own parameters through a physical process, making the training more efficient and saving both energy and computing time.
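
The article doesn't spell out how such a machine would be implemented, but the flavour of this self-optimization can be sketched with equilibrium propagation (Scellier & Bengio, 2017), a related physics-inspired learning rule in which the same relaxation dynamics that compute the answer also supply the learning signal. The toy model below, with a single tunable parameter w and hypothetical constants throughout, illustrates that idea; it is not the Max Planck team's actual scheme:

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal()                    # trainable "synaptic" parameter
beta, lr = 0.5, 0.1                 # nudge strength and learning rate

def relax(x, y=0.0, nudge=0.0):
    """Let the internal state s settle into a minimum of the energy
    E = 0.5 * (s - w*x)**2, optionally nudged toward the target y."""
    s = 0.0
    for _ in range(30):
        grad = (s - w * x) + nudge * (s - y)
        s -= 0.2 * grad             # overdamped relaxation dynamics
    return s

for step in range(300):             # learn the relation y = 2*x
    x = rng.uniform(-1.0, 1.0)
    y = 2.0 * x
    s_free = relax(x)                    # free phase: system settles into its prediction
    s_nudged = relax(x, y, nudge=beta)   # nudged phase: output weakly pulled toward target
    # contrasting the two equilibria yields the learning signal
    dE_dw_free = -(s_free - w * x) * x
    dE_dw_nudged = -(s_nudged - w * x) * x
    w -= lr * (dE_dw_nudged - dE_dw_free) / beta

print(f"learned w = {w:.3f} (target 2.0)")
```

The point of the sketch is that no external gradient computation ever touches w: the update is read off by letting the system relax twice and comparing the two settled states.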

The specific physical process used doesn't matter, as long as it meets two criteria. It must be reversible, meaning it can run forwards or backwards without losing much energy. And it must be non-linear, so that it can perform complex transformations between input data and results. Linear processes, such as a pinball rolling across a plate without colliding with anything, fail the second requirement.
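
To make the two criteria concrete, here is a minimal, self-contained sketch (not the authors' setup): a pendulum integrated with a time-reversible leapfrog scheme. The dynamics are non-linear because of the sin term, and running the integrator backwards recovers the initial state almost exactly:

```python
import numpy as np

def leapfrog(theta, omega, dt, n):
    """Time-reversible leapfrog integration of a pendulum:
    d^2(theta)/dt^2 = -sin(theta), a non-linear equation of motion."""
    for _ in range(n):
        omega -= 0.5 * dt * np.sin(theta)   # half kick
        theta += dt * omega                 # drift
        omega -= 0.5 * dt * np.sin(theta)   # half kick
    return theta, omega

theta0, omega0 = 1.2, 0.0
t1, w1 = leapfrog(theta0, omega0, dt=0.01, n=1000)   # run forwards
t2, w2 = leapfrog(t1, -w1, dt=0.01, n=1000)          # flip velocity, run back
print(f"recovered angle {t2:.6f} vs initial {theta0}")  # agrees to ~1e-13
```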

Practical Test in an Optical Neuromorphic Computer

One example of a reversible, non-linear process is optics. López-Pastor and Marquardt are already working with an experimental team to develop an optical neuromorphic computer, which processes information in the form of superimposed light waves, with suitable components regulating how those waves interact. The researchers hope to have a working self-learning physical machine within three years, one capable of handling larger neural networks and datasets.
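
As a back-of-the-envelope illustration of the superposition at work (the numbers are purely illustrative), the few lines below show how a phase shift on one of two coherent waves steers the combined intensity between constructive and destructive interference, the kind of tunable operation such optical hardware exploits:

```python
import numpy as np

a = 1.0 + 0.0j                        # complex amplitude of the first wave
for phase in (0.0, np.pi / 2, np.pi):
    b = np.exp(1j * phase)            # second wave, phase-shifted
    intensity = abs(a + b) ** 2       # a detector measures |sum|^2
    print(f"phase {phase:.2f} rad -> intensity {intensity:.2f}")
# phase 0.00 -> 4.00 (constructive), phase 3.14 -> 0.00 (destructive)
```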

As neural networks become more complex, there will be a growing demand for efficient neuromorphic computers. López-Pastor and Marquardt believe self-learning physical machines have a strong future in the development of artificial intelligence.
