
Refreshments: 10:30am
Presentation: 11am-12pm

UMass Student Chapter of INFORMS presents:

One way to characterize the expressiveness of a piecewise linear neural network is by the number of linear regions, or pieces, of the function it models. In this talk, we investigate the number of linear regions that these networks can attain, both theoretically and empirically. We present upper and lower bounds on the maximum number of linear regions of rectifier and maxout networks, as well as a method for exactly counting the regions by modeling the DNN as a mixed-integer linear program. These bounds come from leveraging the dimension of the space defining each linear region, and they indicate that a shallow network may define more linear regions than any deep network. We also show how to approximate the number of linear regions of a rectifier network with an algorithm that computes probabilistic lower bounds on the number of solutions of mixed-integer linear programs. This algorithm is several orders of magnitude faster than exact counting and produces values of similar orders of magnitude, making it a viable method for comparing the expressiveness of such networks.

This talk is based on joint work with Christian Tjandraatmadja (Google) and Srikumar Ramalingam (The University of Utah). The papers can be found on arXiv: https://arxiv.org/abs/1711.02114 and https://arxiv.org/abs/1810.03370
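
To give a concrete sense of what "counting linear regions" means, the sketch below enumerates the activation patterns of a tiny one-hidden-layer rectifier network and uses a linear program to check which patterns are realized on a full-dimensional piece of a bounded input box. This is only an illustrative brute-force sketch, not the MILP formulation or the probabilistic counting algorithm presented in the talk; the network size, box bound, and tolerance are arbitrary choices made for the example.

import itertools
import numpy as np
from scipy.optimize import linprog

def count_linear_regions(W, b, box=10.0, eps=1e-4):
    """Count linear regions of x -> ReLU(W x + b) inside [-box, box]^d.

    Each hidden unit i contributes the hyperplane w_i . x + b_i = 0; a linear
    region corresponds to a sign pattern s in {+1, -1}^h for which some x in
    the box satisfies s_i * (w_i . x + b_i) >= eps for every unit i.
    """
    h, d = W.shape
    count = 0
    for signs in itertools.product([1, -1], repeat=h):
        s = np.array(signs)
        # Rewrite s_i * (w_i . x + b_i) >= eps as -s_i * w_i . x <= s_i * b_i - eps.
        A_ub = -s[:, None] * W
        b_ub = s * b - eps
        # Feasibility check only: the objective is zero.
        res = linprog(c=np.zeros(d), A_ub=A_ub, b_ub=b_ub,
                      bounds=[(-box, box)] * d, method="highs")
        if res.success:
            count += 1
    return count

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((4, 2))   # 4 hidden ReLUs, 2-dimensional input
    b = rng.standard_normal(4)
    print("linear regions:", count_linear_regions(W, b))

Note that this enumeration grows as 2^h in the number of hidden units and handles only one hidden layer, which is exactly why the exact-counting and probabilistic approaches described in the abstract are needed for realistic networks.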