
Thursday, April 7, 2016

A New Oscillating-Error Technique for Classifiers. (arXiv:1505.05312v4 [cs.AI] UPDATED)

This paper describes a new method for reducing the error in a classifier. It uses an error-correction update, but adds a very simple rule: the adjustment is either added or subtracted, depending on whether the variable's value is currently larger or smaller than the desired value. The new neuron can take an input from each variable or column and adjust it by adding or subtracting the difference, on a variable-by-variable basis. Whereas a traditional neuron sums its inputs together and applies the same function in every instance, this new neuron can change the function for each input variable. This gives added flexibility to the convergence procedure: through a series of transpositions, variables that are far from the desired value can keep moving towards it, while variables that start much closer oscillate from one side of it to the other. Tests show that the method can successfully classify some benchmark datasets. It can also work in batch mode, with reduced training times, and can be used as part of a neural network architecture. There are also some updates to an earlier wave-shape paper.
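
As a rough illustration of the sign rule described above, here is a minimal Python sketch. It is not the paper's implementation: the function name, the fixed step size, and the toy values are assumptions, chosen so the oscillation is easy to see.

import numpy as np

def oscillating_update(values, targets, step):
    """One oscillating-error correction: each variable moves toward its
    target by `step`, which is added when the value is below the target
    and subtracted when above (a per-variable sign, not a shared sum)."""
    return values + np.sign(targets - values) * step

# Toy convergence loop: the far-off variables keep moving toward the
# target, while the near one overshoots and bounces from side to side.
x = np.array([0.10, 0.48, 0.90])
t = np.array([0.50, 0.50, 0.50])
for _ in range(5):
    x = oscillating_update(x, t, step=0.05)
print(x)  # outer values have closed in; the middle one oscillates

In this sketch, any variable that starts within one step of its target alternates between 0.48 and 0.53 on successive updates, which mirrors the oscillation the abstract describes for variables that are already close to the desired value.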



from cs.AI updates on arXiv.org http://ift.tt/1Bd2M5e
via IFTTT
