Hey Jason, just dropping a couple of thoughts here for ya:
- Not sure how much time you want to spend on
1-machine-learning, but there's always Andrew Ng's canonical Coursera course; https://docs.google.com/document/d/1AISQIb2LVlMzN2tmTbMg3zsoBg9n4xk7HO12hEYgoIw/edit is the best collection of notes I've seen on it, if you want to skim
- Again unsure of your intended scope, but for
2-neural-nets I might generalize and add a few fundamentals to the mix: backprop, SGD, optimizers, activation functions, loss functions, techniques to combat overfitting. I think a solid basis in each of these things (which I hope to one day have) would go a long way towards making current research easier to grok.
- 5-rnns-cnns looks scary and intense; since I don't understand a lot of the verbiage here, I just wanted to ask whether your intent is to study the intersection and combination of RNNs and CNNs specifically, or to survey more advanced NN techniques in general.
Cheers!