Combining linear RNNs with MLPs enables universal approximation of sequence-to-sequence mappings.
Linear RNNs combined with MLPs provide a powerful architecture for sequence modeling: the linear RNN encodes the input sequence into its hidden state without loss of information, and the MLP then applies non-linear processing position-wise. Using complex eigenvalues in the recurrence, with magnitudes close to one, lets the state carry oscillatory dynamics that improve how long information is retained.
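Below is a minimal sketch of this architecture, assuming a diagonal complex recurrence and a small position-wise MLP; all dimensions, parameter names, and initializations are illustrative assumptions, not the original implementation.

```python
# Sketch: diagonal linear RNN with complex eigenvalues, followed by a
# position-wise MLP. Hypothetical sizes and random weights for illustration.
import numpy as np

rng = np.random.default_rng(0)

d_in, d_state, d_hidden, d_out, seq_len = 4, 16, 32, 4, 10  # assumed dimensions

# Complex eigenvalues lambda_j = nu_j * exp(i * theta_j), |lambda_j| < 1 for
# stability; magnitudes near 1 help the state retain information longer.
nu = rng.uniform(0.9, 0.999, size=d_state)
theta = rng.uniform(0.0, 2.0 * np.pi, size=d_state)
lam = nu * np.exp(1j * theta)

# Complex input projection into the recurrent state.
B = (rng.standard_normal((d_state, d_in))
     + 1j * rng.standard_normal((d_state, d_in))) / np.sqrt(d_in)

# Position-wise MLP acting on the concatenated real and imaginary parts.
W1 = rng.standard_normal((d_hidden, 2 * d_state)) / np.sqrt(2 * d_state)
W2 = rng.standard_normal((d_out, d_hidden)) / np.sqrt(d_hidden)

def forward(u):
    """Map a real input sequence (seq_len, d_in) to outputs (seq_len, d_out)."""
    x = np.zeros(d_state, dtype=np.complex128)
    outputs = []
    for u_t in u:
        # Purely linear recurrence: x_t = Lambda * x_{t-1} + B * u_t
        x = lam * x + B @ u_t
        # Non-linearity lives only in the MLP, applied at each time step.
        h = np.maximum(0.0, W1 @ np.concatenate([x.real, x.imag]))
        outputs.append(W2 @ h)
    return np.stack(outputs)

y = forward(rng.standard_normal((seq_len, d_in)))
print(y.shape)  # (10, 4)
```

Because the recurrence is linear and diagonal, it can also be evaluated in parallel over the sequence (e.g. with a scan); the sequential loop above is kept only for clarity.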