# seq2seq_with_attention

**Repository Path**: glueo/seq2seq_with_attention

## Basic Information

- **Project Name**: seq2seq_with_attention
- **Description**: Machine translation based on the seq2seq principle
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2025-03-09
- **Last Updated**: 2025-04-29

## Categories & Tags

**Categories**: Uncategorized
**Tags**: Seq2Seq, Python, Deep-learning

## README

# seq2seq_with_attention

This is a seq2seq model with an attention mechanism for machine translation.

Creator:
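
The repository page does not include usage examples in this section. As an illustration only, below is a minimal sketch of the attention step the README refers to (Bahdanau-style additive attention over encoder states), written in PyTorch. All class, function, and variable names here are hypothetical and are not taken from this repository.

```python
# Hypothetical sketch: additive (Bahdanau-style) attention over encoder states.
# Names and shapes are illustrative only and do not come from this repository.
import torch
import torch.nn as nn


class AdditiveAttention(nn.Module):
    """Scores each encoder state against the current decoder state."""

    def __init__(self, enc_dim: int, dec_dim: int, attn_dim: int):
        super().__init__()
        self.w_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.w_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_state, enc_outputs):
        # dec_state:   (batch, dec_dim)          current decoder hidden state
        # enc_outputs: (batch, src_len, enc_dim) all encoder hidden states
        scores = self.v(torch.tanh(
            self.w_enc(enc_outputs) + self.w_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                           # (batch, src_len)
        weights = torch.softmax(scores, dim=-1)  # attention distribution over source tokens
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights                  # context: (batch, enc_dim)


if __name__ == "__main__":
    # Toy usage with random tensors, just to show the expected shapes.
    attn = AdditiveAttention(enc_dim=256, dec_dim=256, attn_dim=128)
    enc = torch.randn(4, 10, 256)   # fake encoder outputs (batch=4, src_len=10)
    dec = torch.randn(4, 256)       # fake decoder hidden state
    ctx, w = attn(dec, enc)
    print(ctx.shape, w.shape)       # torch.Size([4, 256]) torch.Size([4, 10])
```

In a full seq2seq translator, the context vector produced here would typically be concatenated with the decoder input (or hidden state) at each decoding step; the actual wiring used in this repository may differ.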