
Transformer to RNN (T2RNN)

Several research efforts aim to convert or compress large-scale pretrained transformer models into efficient inference models that facilitate downstream applications. This task has become important as a variety of autoregressive transformers have substantially raised performance baselines across NLP applications. Similar to recurrent neural networks (RNNs), the converted models represent the context by a fixed-size recurrent state, so per-token inference cost stays constant rather than growing with sequence length.
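To make the idea concrete, here is a minimal sketch of how attention can be rewritten as an RNN-style recurrence with a constant-size state. It assumes a linear-attention formulation in which the softmax is replaced by a positive feature map `phi`; the function names and the specific choice of feature map are illustrative, not taken from any particular T2RNN method.

```python
import numpy as np

def phi(x):
    # Hypothetical feature map (relu(x) + 1, kept positive); T2R-style
    # methods choose or learn a kernel so that phi(q) @ phi(k)
    # approximates softmax attention weights.
    return np.maximum(x, 0.0) + 1.0

def rnn_style_attention(queries, keys, values):
    """Process a sequence autoregressively with a constant-size state,
    instead of attending over a growing cache of past keys and values."""
    d_k, d_v = keys.shape[1], values.shape[1]
    S = np.zeros((d_k, d_v))   # running sum of outer(phi(k), v)
    z = np.zeros(d_k)          # running sum of phi(k), for normalization
    outputs = []
    for q, k, v in zip(queries, keys, values):
        S += np.outer(phi(k), v)   # state update: constant memory per step
        z += phi(k)
        outputs.append(phi(q) @ S / (phi(q) @ z + 1e-9))
    return np.array(outputs)
```

Because the state `(S, z)` has a fixed size regardless of how many tokens have been seen, each decoding step costs the same, which is the efficiency property motivating transformer-to-RNN conversion.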
