By Eli

2019-03-13 03:03:21 8 Comments

I am looking for a book that covers the mathematical aspects of neural networks, from the forward pass of a multilayer perceptron in matrix form and the differentiation of activation functions, to backpropagation in CNNs and RNNs (to mention some of the topics).
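For concreteness, the matrix-form forward pass mentioned above can be sketched in a few lines of NumPy. The layer sizes and the sigmoid activation here are arbitrary illustrative choices, not taken from any particular book:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))   # 4 samples, 3 input features
W1 = rng.standard_normal((3, 5))  # input -> hidden weights
b1 = np.zeros(5)
W2 = rng.standard_normal((5, 2))  # hidden -> output weights
b2 = np.zeros(2)

# Forward pass of a two-layer perceptron, written entirely
# as matrix products plus elementwise activations.
H = sigmoid(X @ W1 + b1)          # hidden activations, shape (4, 5)
Y = sigmoid(H @ W2 + b2)          # outputs, shape (4, 2)
print(Y.shape)                    # (4, 2)
```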

Do you know of any book that goes in depth into this theory? I've had a look at a couple (such as Pattern Recognition and Machine Learning by Bishop) but still have not found a rigorous one (exercises would be a plus). Do you have any suggestions?


@Jim Stuttard 2019-03-24 11:41:20

Not a book, but maybe of some interest for a current perspective:

"Backprop as Functor: A compositional perspective on supervised learning" by Brendan Fong, David I. Spivak, and Remy Tuyeras (2018) gives a category-theoretic structural framework based on the algorithm.

This is further discussed by David Spivak (2019).

@pcp 2019-03-14 09:56:38

One of my favorite books on the theoretical aspects of neural networks is Anthony and Bartlett's book: "Neural Network Learning: Theoretical Foundations".

This book studies neural networks in the context of statistical learning theory. You will find loads of estimates of VC dimensions of sets of networks and all that fun stuff.

I should say that this book does not go into detail on CNNs and RNNs.

@Josef Knecht 2019-03-14 19:43:35

Gilbert Strang (of MIT OCW Linear Algebra lectures and Introduction to Linear Algebra fame) has a new textbook on linear algebra for deep learning, Linear Algebra and Learning from Data.

It's got a decent course in linear algebra, some statistics & optimization, the calculus needed for stochastic gradient descent, and then applies them all to neural network models.

@Shamisen Expert 2019-03-13 03:33:59

For MLPs, there is a rigorous derivation in the optimization textbook by Edwin Chong and Stanislaw Zak (An Introduction to Optimization), although it is notation-heavy, as all things related to neural networks must be.

This book is for some reason freely available online; see page 219.

I think there is essentially no good mathematical textbook on convolutional or recurrent neural networks in existence; people essentially base their intuition on MLPs. But it is not hard to create a mathematically rigorous derivation of the forward and backward propagation of a CNN or RNN.
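To illustrate that such a derivation is mechanical, here is a hedged sketch of backpropagation for a one-hidden-layer MLP with squared-error loss, checked against a numerical gradient. All sizes, the sigmoid activation, and the names (W1, W2, etc.) are illustrative assumptions, not from the book cited above:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grads(W1, W2, x, t):
    # forward pass
    h = sigmoid(W1 @ x)                  # hidden activations
    y = W2 @ h                           # linear output layer
    L = 0.5 * np.sum((y - t) ** 2)       # squared-error loss
    # backward pass (chain rule in matrix form)
    dy = y - t                           # dL/dy
    dW2 = np.outer(dy, h)                # dL/dW2
    dh = W2.T @ dy                       # dL/dh
    dW1 = np.outer(dh * h * (1 - h), x)  # sigmoid'(z) = h * (1 - h)
    return L, dW1, dW2

rng = np.random.default_rng(1)
W1 = rng.standard_normal((4, 3))
W2 = rng.standard_normal((2, 4))
x = rng.standard_normal(3)
t = rng.standard_normal(2)

L, dW1, dW2 = loss_and_grads(W1, W2, x, t)

# Finite-difference check of one entry of dW1 against the
# analytic gradient from the backward pass above.
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
Lp, _, _ = loss_and_grads(W1p, W2, x, t)
print(abs((Lp - L) / eps - dW1[0, 0]) < 1e-4)  # True
```

The same exercise goes through for a convolution layer once it is written as a (structured) matrix product.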

@Rahul 2019-03-13 12:11:21

"This book is for some reason freely available online." That is probably a copyright violation by the webpage owner But I won't tell anyone if you won't ;)

@Jair Taylor 2019-03-13 03:11:25

I'd recommend Deep Learning by Goodfellow, Bengio and Courville. I don't know if I'd call it "purely mathematical", but it covers a good amount of math background in the first few chapters. No exercises, though.

@Eli 2019-03-13 03:17:54

Thank you - I've actually had a look at that one too, but while it is good at introducing the main mathematical tools needed for NNs, I found it a bit lacking when it came to properly developing the models mathematically.
