Reservoir computing is one of the most powerful machine learning approaches currently available to scientists.
A recent study revealed a new way of making it up to a million times faster at certain tasks.
That is exciting news for demanding computational problems, from forecasting the weather to simulating how fluid flows through a particular object.
Problems like these require resource-intensive computing, and the new approach promises to make the process considerably more efficient.
The team behind the study has dubbed the discovery next-generation reservoir computing.
Daniel Gauthier, a physicist at The Ohio State University, stated:
“We can perform very complex information processing tasks in a fraction of the time using much less computer resources compared to what reservoir computing can currently do. And reservoir computing was already a significant improvement on what was previously possible.”
Reservoir computing works via neural networks – machine learning systems that mimic how brains work, in the sense that they are trained to identify patterns in large amounts of data.
Show a neural network a thousand pictures of a dog, and it will learn what a dog looks like, so it can identify the animal the next time it sees one.
Reservoir computing adds a technical twist to that process. It sends data into a “reservoir” where the data points get linked together in various ways. The reservoir’s output is then processed and fed back into the learning algorithm.
That twist makes the entire process faster and easier to adapt to new learning tasks.
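To make the “reservoir” idea concrete, here is a minimal sketch of an echo state network, one common form of reservoir computing, written in Python with NumPy. The network sizes, the sine-wave prediction task, and the ridge-regression readout are illustrative assumptions on our part, not details from the study.

```python
# Minimal echo state network sketch (one common form of reservoir computing).
# Sizes, scaling, and the toy task are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_reservoir = 1, 200

# Fixed random input and reservoir weights -- these are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence and collect its states."""
    state = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        state = np.tanh(W_in @ u + W @ state)  # data points mix inside the reservoir
        states.append(state.copy())
    return np.array(states)

# Toy task: predict the next step of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t).reshape(-1, 1)
X = run_reservoir(series[:-1])  # reservoir states
Y = series[1:]                  # next-step targets

# Only this linear readout is trained (ridge regression) -- the cheap step
# that makes reservoir computing fast to train.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ Y)

prediction = X @ W_out
print("train MSE:", np.mean((prediction - Y) ** 2))
```

The key design point the sketch shows is that the reservoir’s internal weights are random and fixed; only the simple output layer is fitted to the data.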
However, it also involves a great deal of random processing, which means that what goes on inside the reservoir is not entirely clear.
The reservoir is what engineers often call a “black box” – it works, but nobody knows for sure how or why.