- [ ] Factorize `back(float error)` and `train()` between `SimpleNeuron` and `RecurrentNeuron` (see the first sketch after this list)
- [ ] Maybe rename `computeBackOutput` and `computeTrain`
- [x] What about `this->errors` in `operator==`?
- [x] Default `operator==`
- [x] Add `ResetLearningVars`
- [ ] Remove learning variables from `Archive`
- [ ] Add a real `batchSize()`
- [ ] Make the literal operators `consteval` (see the second sketch below)
- [ ] `previousOutput` is not used in GRU
- [ ] `Tensor` class could have a private default constructor, with `friend` access granted to Boost serialization or something like that (see the last sketch below)
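A minimal sketch of one way the factorization could look, assuming a shared base class; `BaseNeuron` and the member layouts are invented here, only `back`/`train`/`computeBackOutput`/`computeTrain` come from the items above:

```cpp
#include <vector>

// Hypothetical base class: shared bookkeeping lives in back()/train() once,
// instead of being duplicated in SimpleNeuron and RecurrentNeuron.
class BaseNeuron
{
public:
    virtual ~BaseNeuron() = default;

    std::vector<float> back(float error)
    {
        // common pre/post work would go here
        return computeBackOutput(error);
    }

    void train(float error)
    {
        computeTrain(error);
    }

protected:
    // Subclass-specific parts; these are the candidates for the rename item.
    virtual std::vector<float> computeBackOutput(float error) = 0;
    virtual void computeTrain(float error) = 0;
};

class SimpleNeuron : public BaseNeuron
{
protected:
    std::vector<float> computeBackOutput(float error) override
    {
        return {error}; // placeholder body
    }

    void computeTrain(float /*error*/) override
    {
        // placeholder body
    }
};
```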
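For the `consteval` item, a sketch of the pattern (C++20): marking a user-defined literal operator `consteval` guarantees compile-time evaluation. The `_kb` literal is an invented example, not an operator from this codebase:

```cpp
// consteval: this literal can only be evaluated at compile time.
consteval unsigned long long operator""_kb(unsigned long long value)
{
    return value * 1024;
}

static_assert(4_kb == 4096); // evaluated entirely at compile time
```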
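For the `Tensor` item, a sketch of the private-constructor-plus-friend pattern with Boost.Serialization; the `shape`/`data` members are assumptions, only the access pattern is the point:

```cpp
#include <cstddef>
#include <utility>
#include <vector>
#include <boost/serialization/access.hpp>
#include <boost/serialization/vector.hpp>

class Tensor
{
public:
    explicit Tensor(std::vector<std::size_t> shape)
        : shape(std::move(shape)) {}

private:
    Tensor() = default; // no public default construction

    // Friendship lets Boost.Serialization call the private default
    // constructor and the private serialize() when loading.
    friend class boost::serialization::access;

    template <class Archive>
    void serialize(Archive& ar, const unsigned int /*version*/)
    {
        ar & shape;
        ar & data;
    }

    std::vector<std::size_t> shape; // assumed members
    std::vector<float> data;
};
```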