Center for Algorithms and Theory of Computation

CS 269S, Spring 2019: Theory Seminar
Bren Hall, Room 1423, 1pm


May 24, 2019:

On the Capacity of Feedforward Neural Networks

Ceasar Aguma

While Artificial Neural Networks have become a popular problem-solving tool, much of their theory remains unknown or ambiguous, and there has been no clear framework for quantifying their theoretical capabilities. This paper by Baldi and Vershynin provides such a framework for theoretically analyzing the capacity of feedforward neural networks. My talk will summarize the core result presented in the paper, that is, a definition of "the capacity of an architecture by the binary logarithm of the number of functions it can compute, as the synaptic weights are varied." While the topic of Artificial Neural Networks has become nearly synonymous with Deep Learning, I hope this talk can justify the need to look at Artificial Neural Networks from a theoretical perspective.
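In symbols, the definition measures C(A) = log2 |F(A)|, where F(A) is the set of functions the architecture A can compute as its synaptic weights are varied. The Python sketch below makes this concrete for a hypothetical tiny threshold architecture A(3, 2, 1), chosen here purely for illustration rather than taken from the paper; sampling random weights only lower-bounds the true number of computable Boolean functions, but it shows exactly what quantity is being counted.

    import itertools
    import numpy as np

    def net_function(w1, w2, inputs):
        # Truth table of the Boolean function computed by a 2-layer threshold net.
        hidden = (inputs @ w1 > 0).astype(int)        # hidden threshold units (biases omitted for simplicity)
        return tuple((hidden @ w2 > 0).astype(int))   # single output threshold unit

    n, m = 3, 2  # hypothetical architecture A(3, 2, 1): 3 inputs, 2 hidden units, 1 output
    inputs = np.array(list(itertools.product([0, 1], repeat=n)))  # all 2^n binary input vectors

    rng = np.random.default_rng(0)
    functions = set()
    for _ in range(100_000):                 # sample weights; distinct truth tables = distinct functions
        w1 = rng.standard_normal((n, m))
        w2 = rng.standard_normal(m)
        functions.add(net_function(w1, w2, inputs))

    # Capacity estimate: binary logarithm of the number of computable functions.
    print(f"distinct functions: {len(functions)}, capacity >= {np.log2(len(functions)):.2f} bits")

Exhaustive enumeration of weight configurations is infeasible beyond toy sizes, which is precisely why a theoretical framework for computing capacity, as developed in the paper, is needed.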

Paper by Pierre Baldi and Roman Vershynin