This paper introduces a novel adaptive activation function with an even cubic nonlinearity that improves neural-network accuracy without requiring substantial additional computational resources, while exhibiting a tradeoff between convergence speed and accuracy.
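A minimal sketch of what such an adaptive activation might look like: a linear pass-through augmented with learnable even (quadratic) and cubic correction terms whose coefficients are trained alongside the network weights. The exact functional form and parameterization used in the paper are not reproduced here; `a` and `b` are hypothetical trainable coefficients.

```python
import numpy as np

def adaptive_cubic_activation(x, a, b):
    # Hypothetical form: identity plus learnable even (x^2) and
    # cubic (x^3) nonlinear terms. In training, a and b would be
    # updated by backpropagation like any other network parameter.
    return x + a * x**2 + b * x**3

# Example: with a=0.1, b=0.05 the function perturbs the identity.
y = adaptive_cubic_activation(np.array([0.0, 1.0, -1.0]), a=0.1, b=0.05)
```

Because the extra terms are elementwise polynomials, the added cost per activation is a handful of multiply-adds, consistent with the claim of negligible computational overhead.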
This paper proposes a novel feature-based echo-state network (Feat-ESN) architecture that uses smaller parallel reservoirs driven by different input feature combinations to significantly reduce the computational complexity of traditional ESNs while maintaining predictive performance.
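The parallel-reservoir idea can be sketched as follows: each small reservoir is driven by one feature subset, the reservoir states are concatenated, and a single ridge-regression readout is fit on top. Reservoir sizes, leak rate, spectral radius, and the ridge penalty below are illustrative defaults, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

class SmallReservoir:
    """One small echo-state reservoir driven by a subset of input features."""
    def __init__(self, n_in, n_res=30, spectral_radius=0.9, leak=0.3):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # Rescale so the spectral radius is below 1 (echo-state property).
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
        self.W, self.leak = W, leak

    def run(self, U):
        # U: (T, n_in) -> reservoir states (T, n_res), leaky-tanh update.
        x = np.zeros(self.W.shape[0])
        states = np.empty((len(U), len(x)))
        for t, u in enumerate(U):
            x = (1 - self.leak) * x + self.leak * np.tanh(self.W_in @ u + self.W @ x)
            states[t] = x
        return states

def feat_esn_predict(U, y, feature_groups, washout=20, ridge=1e-6):
    # Drive one small reservoir per feature group, concatenate the
    # states, and fit a single ridge-regression readout on them.
    states = np.hstack([SmallReservoir(len(g)).run(U[:, g]) for g in feature_groups])
    S, target = states[washout:], y[washout:]
    W_out = np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ target)
    return states @ W_out

# Example: one-step-ahead prediction on a toy two-feature signal,
# with each feature routed to its own small reservoir.
T = 200
U = np.column_stack([np.sin(0.1 * np.arange(T)), np.cos(0.1 * np.arange(T))])
y = np.roll(U[:, 0], -1)
pred = feat_esn_predict(U, y, feature_groups=[[0], [1]])
```

Because the cost of the reservoir update is quadratic in reservoir size, several small reservoirs are cheaper than one large one with the same total number of units, which is the source of the claimed complexity reduction.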