Do neural network parameters form a low-dimensional subspace?

by moreblue   Last Updated July 12, 2019 02:19 AM


Recently I heard somebody say that, effectively, neural network parameters have a lower dimensionality than their raw count.

For instance, AlexNet has 62M parameters, and it seems reasonable to suspect that the number of degrees of freedom (the effective dimensionality) of the parameter space is smaller than that.
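One way this intuition is made concrete is "random subspace" training (studied empirically as the "intrinsic dimension" of objective landscapes): freeze the full parameter vector and optimize only a small number of coordinates through a fixed random projection. Below is a minimal NumPy sketch of the idea on a linear toy model; the sizes `D`, `d`, `n` and all variable names are illustrative, not from any actual network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy over-parameterized model: D = 1000 parameters, only n = 20 data points.
# (A real network would be nonlinear; this is just to show the mechanism.)
D, d, n = 1000, 50, 20
X = rng.normal(size=(n, D))
theta_true = rng.normal(size=D)
y = X @ theta_true

# Random-subspace reparameterization: theta = theta0 + P @ z,
# with theta0 and the projection P frozen; only z in R^d is trained.
theta0 = np.zeros(D)
P = rng.normal(size=(D, d)) / np.sqrt(d)

def loss(z):
    r = X @ (theta0 + P @ z) - y
    return float(r @ r) / n

# In the subspace the problem is an n-by-d least squares, so solve it directly
# instead of running gradient descent.
A = X @ P
z, *_ = np.linalg.lstsq(A, y - X @ theta0, rcond=None)

print(loss(np.zeros(d)), loss(z))
```

Tuning just `d = 50` numbers out of `D = 1000` already drives the loss to essentially zero here, because the residual lives in an `n`-dimensional space and `d > n`; the interesting empirical question for real networks is how small `d` can be made before the achievable loss degrades.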


Is there any theoretical/empirical reference on the dimensionality of neural network parameters?

Any help will be appreciated.
