The result is an algorithm that expands the storage capacity of artificial networks by forcing them into an off-line sleep phase during which they reinforce relevant memories (pure states) and erase irrelevant ones (spurious states).
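For the fellow nerds, here's a rough toy sketch of that idea in the spirit of classic Hopfield "unlearning" — this is my own back-of-the-envelope version, not the authors' exact update rule, and the sizes, sleep rate, and iteration counts are just numbers I picked:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 10                      # neurons, stored patterns (my choice)
xi = rng.choice([-1, 1], size=(P, N))

# Wake phase: plain Hebbian storage, J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0)

def relax(J, s, sweeps=50):
    """Asynchronous sign updates until the state settles into an attractor."""
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Sleep phase: start from random states, let the network fall into whatever
# attractor it finds, then weaken that attractor a little. The hope is that
# spurious mixture states get sampled and eroded more than the pure patterns
# (the paper's scheme goes further and actively reinforces the pure ones,
# which this toy version skips).
eps = 0.01                           # "sleep rate" (my pick)
for _ in range(200):
    s = relax(J, rng.choice([-1, 1], size=N))
    J -= eps * np.outer(s, s) / N
    np.fill_diagonal(J, 0)
```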
The article's intended audience isn't computer scientists. If you're a huge nerd like me, you're probably asking yourself, "What's so novel about a neural network that trains itself and then performs on test data? That's like every neural net I've ever written..."
Fear not, my geeky cohorts! Here's the paper, and a choice excerpt I pulled from the abstract:
Beyond obtaining a phase diagram for neural dynamics, we focus on synaptic plasticity and we give explicit prescriptions on the temporal evolution of the synaptic matrix. We analytically prove that our algorithm makes the Hebbian kernel converge with high probability to the projection matrix built over the pure stored patterns. Furthermore, we obtain a sharp and explicit estimate for the “sleep rate” in order to ensure such a convergence. Finally, we run extensive numerical simulations (mainly Monte Carlo sampling) to check the approximations underlying the analytical investigations and possible finite-size effects, finding overall full agreement with the theory.
WOW. If you followed every single word of that, you're better at this than I am. Either way, I thought this was a fun article to enjoy with my morning coffee!
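One bit worth unpacking: the "projection matrix built over the pure stored patterns" is, as I read it, the pseudo-inverse (Kohonen-style) kernel, which makes every stored pattern an exact fixed point of the dynamics. A quick numerical sanity check of that claim — again my own sketch, assuming standard Hopfield conventions:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 10
xi = rng.choice([-1, 1], size=(P, N))

# Projection kernel over the stored patterns:
# J = (1/N) * xi^T C^{-1} xi, with overlaps C_{mu,nu} = (1/N) xi^mu . xi^nu
C = (xi @ xi.T) / N
J_proj = xi.T @ np.linalg.inv(C) @ xi / N

# Every stored pattern should be an exact fixed point of sign(J s)
for mu in range(P):
    assert np.array_equal(np.sign(J_proj @ xi[mu]), xi[mu])
print("all", P, "patterns are fixed points of the projection kernel")
```

That's the end state the paper proves the sleep dynamics converge to (with high probability, at a sufficient sleep rate); the sketch just shows why that end state is a nice place to land.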