Mamatjan

@Mamatjan1920

Mamatjan has not added a bio yet.

All starred repositories are collected here. You can create different star collections to organize them as needed.

    1. Xiang/Linear-Attention-Recurrent-Neural-Network

    A recurrent attention module (LARNN) consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop on the cell state, just like any other RNN cell; a sketch of the idea follows this entry.

    Last updated: over 5 years ago
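
The description is concrete enough to sketch: an LSTM cell whose cell state is augmented by multi-head attention over a window of its own recent cell states. Below is a minimal PyTorch sketch of that idea, not the repository's actual implementation; the class name LARNNCellSketch, the window_size parameter, and the residual way the attended summary is mixed back into the cell state are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class LARNNCellSketch(nn.Module):
    """Illustrative LARNN-style cell (assumed API, not the repo's):
    an LSTMCell whose cell state attends over a window of its own
    past cell states via multi-head attention."""

    def __init__(self, input_size, hidden_size, num_heads=4, window_size=16):
        super().__init__()
        self.lstm_cell = nn.LSTMCell(input_size, hidden_size)
        self.attention = nn.MultiheadAttention(
            hidden_size, num_heads, batch_first=True)
        self.window_size = window_size

    def forward(self, x, state, past_cells):
        # Standard LSTM step on the current input.
        h, c = self.lstm_cell(x, state)
        if past_cells:
            # Keys/values: the last `window_size` past cell states.
            memory = torch.stack(past_cells[-self.window_size:], dim=1)  # (B, W, H)
            # Query: the freshly computed cell state.
            query = c.unsqueeze(1)                                       # (B, 1, H)
            attended, _ = self.attention(query, memory, memory)
            # Fold the attended summary back into the cell state; a
            # residual sum is the simplest stand-in for whatever learned
            # combination the original model uses.
            c = c + attended.squeeze(1)
        past_cells.append(c)
        return h, (h, c), past_cells


# Usage: loop the cell over a sequence, like any other RNN cell.
cell = LARNNCellSketch(input_size=8, hidden_size=32)
x_seq = torch.randn(4, 10, 8)                    # (batch, time, features)
state = (torch.zeros(4, 32), torch.zeros(4, 32))
past = []
for t in range(x_seq.size(1)):
    out, state, past = cell(x_seq[:, t], state, past)
```

Truncating the attention to a fixed window keeps each step's cost bounded by window_size instead of growing with sequence length, which is what lets the cell be driven by a plain RNN loop over the cell state.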
