Science - A new computing approach to forecasting the butterfly effect

22-09-2021. Artificial neural networks, the heart of reservoir computing, have been greatly simplified. (Credit: Ohio State University)

MADRID, 22 (EUROPA PRESS)

A relatively new type of computing that mimics the way the human brain works is giving researchers a fresh way to tackle the most difficult computing problems.

Researchers at Ohio State University have made reservoir computing run between 33 and a million times faster, while requiring significantly fewer computing resources and less input data.

In a test of this next-generation reservoir computing, the researchers solved a complex computing problem in less than a second on a desktop computer.

Using today’s state-of-the-art technology, the same problem requires a supercomputer and still takes much longer to solve, Daniel Gauthier, lead author of the study and a professor of physics at Ohio State, said in a statement.

“We can perform very complex information processing tasks in a fraction of the time using far fewer computing resources compared to what reservoir computing can do today,” Gauthier said. “And reservoir computing was already a significant improvement on what was previously possible.”

The study was published in the journal Nature Communications.

Reservoir computing is a machine learning algorithm developed in the early 2000s and used to solve the “hardest of the hard” computing problems, such as forecasting the evolution of dynamical systems that change over time, Gauthier said.

PREDICTING DYNAMICAL SYSTEMS

Dynamical systems such as the weather are difficult to predict because a small change in one condition can have massive effects down the road, he said.

A famous example is the “butterfly effect,” the metaphorical illustration in which the air stirred by a butterfly flapping its wings can eventually influence the weather weeks later.
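That sensitivity is easy to demonstrate in a few lines of code. The sketch below is our illustration, not code from the study: it uses numpy and scipy to integrate two copies of Lorenz's famous 1963 weather model whose starting points differ by one part in a million, and measures how quickly the trajectories diverge.

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Edward Lorenz's 1963 convection model, the textbook chaotic system."""
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_eval = np.linspace(0.0, 25.0, 2501)

# Two initial conditions that differ by one part in a million in x.
a = solve_ivp(lorenz, (0.0, 25.0), [1.0, 1.0, 1.0],
              t_eval=t_eval, rtol=1e-10, atol=1e-10)
b = solve_ivp(lorenz, (0.0, 25.0), [1.0 + 1e-6, 1.0, 1.0],
              t_eval=t_eval, rtol=1e-10, atol=1e-10)

# The microscopic gap grows exponentially until the two "forecasts"
# no longer resemble each other at all.
gap = np.linalg.norm(a.y - b.y, axis=0)
print(f"separation at t=0: {gap[0]:.1e}, at t=25: {gap[-1]:.1f}")
```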

Previous research has shown that reservoir computing is well suited to learning dynamical systems and can provide accurate predictions of how they will behave in the future, Gauthier said.

It does this by using an artificial neural network, something like a human brain. Scientists feed data about a dynamical system into a “reservoir” of randomly connected artificial neurons. The network produces useful output, which scientists can interpret and feed back into the network, building an increasingly accurate forecast of how the system will evolve in the future.
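As an illustration of that architecture, here is a minimal sketch of a classic reservoir computer (an echo state network) in Python. It is our simplification, not the study's code, and the reservoir size, spectral radius, and ridge parameter are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n_in, n_res = 3, 300                    # input size and reservoir size (illustrative)

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))     # fixed random input weights
W = rng.normal(0.0, 1.0, (n_res, n_res))         # fixed random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep the spectral radius below 1

def run_reservoir(inputs):
    """Drive the untrained, randomly connected reservoir with a time series."""
    r = np.zeros(n_res)
    states = np.empty((len(inputs), n_res))
    for t, u in enumerate(inputs):
        r = np.tanh(W @ r + W_in @ u)   # the "black box" recurrent dynamics
        states[t] = r
    return states

def fit_readout(states, targets, ridge=1e-6):
    """Only this linear readout is trained, via ridge regression."""
    A = states.T @ states + ridge * np.eye(n_res)
    return np.linalg.solve(A, states.T @ targets)

# Usage, with X a (time, 3) array sampled from a dynamical system:
#   S = run_reservoir(X[:-1]); W_out = fit_readout(S, X[1:])
#   one-step forecast from the latest state: x_next = S[-1] @ W_out
```

Only the readout weights are learned; the reservoir itself stays random and fixed, which is part of what makes its inner workings hard to interpret.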

The larger and more complex the system, and the more accurate the scientists want the forecast to be, the bigger the network of artificial neurons has to be, and the more computing resources and time the task requires.

One problem has been that the reservoir of artificial neurons is a “black box,” Gauthier said; scientists have not known exactly what goes on inside it, only that it works.

The artificial neural networks at the heart of reservoir computing are based on mathematics, Gauthier explained. “We had mathematicians look at these networks and ask, ‘To what extent are all these pieces of machinery really necessary?’” he said.

In this study, Gauthier and his colleagues investigated that question and found that the entire reservoir computing system could be greatly simplified, dramatically reducing the need for computing resources and saving significant time.

They tested their concept in a forecasting task involving a meteorological system developed by Edward Lorenz, whose work led to an understanding of the butterfly effect.

Their next-generation reservoir computing was a clear winner over the current state of the art in this Lorenz forecasting task. In a relatively simple simulation performed on a desktop computer, the new system was 33 to 163 times faster than the current model.

But when high forecasting precision was the goal, next-generation reservoir computing was about 1 million times faster. And next-generation computing achieved the same precision with the equivalent of just 28 neurons, compared to the 4,000 required by the current-generation model, Gauthier said.

An important reason for the acceleration is that the “brain” behind this next generation of reservoir computing needs much less training compared to the current generation to produce the same results.
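The Nature Communications paper shows that the simplified scheme is equivalent to a nonlinear vector autoregression: instead of a large random reservoir, the feature vector is built directly from a few time-delayed copies of the input and their quadratic products, and, as before, only a linear readout is trained. The sketch below follows that recipe; the delay depth, polynomial degree, and ridge value are illustrative assumptions, not values taken from the paper:

```python
import numpy as np
from itertools import combinations_with_replacement

def ngrc_features(X, k=2):
    """Next-generation RC features: k time-delayed copies of the input,
    all of their unique quadratic products, and a constant term."""
    n, d = X.shape
    # Linear part: the sample at time t stacked with the k-1 previous samples.
    lin = np.hstack([X[k - 1 - j : n - j] for j in range(k)])
    # Nonlinear part: unique quadratic monomials of the linear features.
    pairs = combinations_with_replacement(range(lin.shape[1]), 2)
    quad = np.column_stack([lin[:, i] * lin[:, j] for i, j in pairs])
    const = np.ones((lin.shape[0], 1))
    return np.hstack([const, lin, quad])

def fit_ridge(F, Y, ridge=1e-4):
    """As in classic reservoir computing, only a linear readout is trained."""
    return np.linalg.solve(F.T @ F + ridge * np.eye(F.shape[1]), F.T @ Y)

# Usage, with X a (time, 3) array of Lorenz samples and k = 2:
#   F = ngrc_features(X)              # row t of F corresponds to X[k-1+t]
#   W_out = fit_ridge(F[:-1], X[2:])  # train to predict the next sample
```

With three input variables and two delays, this construction produces 1 + 6 + 21 = 28 features in total, which lines up with the “28 neurons” figure quoted above; there is no large random network left to build, train, or tune.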
