Thursday, August 18, 2016

On the expressive power of deep neural networks. (arXiv:1606.05336v3 [stat.ML] UPDATED)

We study the effects of depth and width on the expressive power of neural networks. Precise theoretical and experimental results are derived in the generic setting of networks after random initialization. We find that three different measures of functional expressivity show an exponential dependence on depth but not width: the number of transitions (a measure of non-linearity/complexity), network activation patterns (a new definition with an intrinsic link to hyperplane arrangements in input space), and the number of dichotomies. These three measures are related to each other and are directly proportional to a fourth quantity, trajectory length. Most crucially, we show, both theoretically and experimentally, that trajectory length grows exponentially with depth, which is why all three measures display an exponential dependence on depth.
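
As a rough illustration of the trajectory-length measure (a sketch, not the paper's exact experimental protocol), the code below pushes a densely sampled circular trajectory through a randomly initialized ReLU network and reports its Euclidean length after each layer. The width, depth, and weight scale are assumed, illustrative values chosen to sit in the expansive regime, where the length should grow with depth.

    import numpy as np

    def trajectory_length(points):
        # Sum of Euclidean distances between consecutive points on the trajectory.
        return np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1))

    width, depth, sigma_w = 100, 10, 2.0  # assumed, illustrative settings
    rng = np.random.default_rng(0)

    # A circle embedded in the first two input coordinates, sampled densely.
    t = np.linspace(0.0, 2.0 * np.pi, 1000)
    x = np.zeros((t.size, width))
    x[:, 0], x[:, 1] = np.cos(t), np.sin(t)

    print(f"input length: {trajectory_length(x):.2f}")
    for layer in range(depth):
        W = rng.normal(0.0, sigma_w / np.sqrt(width), size=(width, width))
        x = np.maximum(x @ W, 0.0)  # ReLU hidden layer; biases omitted for simplicity
        print(f"after layer {layer + 1}: {trajectory_length(x):.2f}")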

These results also suggest that parameters earlier in the network have greater influence over its expressive power: for any layer, its influence on expressivity is determined by the remaining depth of the network after that layer. This prediction is supported by experiments on fully connected and convolutional networks on MNIST and CIFAR-10.
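
One simple way to probe this claim (again a sketch with assumed sizes and initialization, not the paper's experimental setup): perturb the weights of one layer at a time and compare the relative change in the network's output. If influence is governed by remaining depth, perturbations to earlier layers should tend to produce the larger change.

    import numpy as np

    rng = np.random.default_rng(1)
    width, depth, eps = 100, 8, 1e-3  # assumed, illustrative settings

    # A fixed random ReLU network and a batch of random inputs.
    weights = [rng.normal(0.0, 2.0 / np.sqrt(width), size=(width, width))
               for _ in range(depth)]
    x0 = rng.normal(size=(256, width))

    def forward(ws, x):
        for W in ws:
            x = np.maximum(x @ W, 0.0)
        return x

    base = forward(weights, x0)
    for k in range(depth):
        ws = [W.copy() for W in weights]
        ws[k] = ws[k] + eps * rng.normal(size=ws[k].shape)  # perturb layer k only
        delta = np.linalg.norm(forward(ws, x0) - base) / np.linalg.norm(base)
        print(f"perturb layer {k + 1}: relative output change {delta:.2e}")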



from cs.AI updates on arXiv.org http://ift.tt/1tznj7m
via IFTTT