# tf-reverse

**Repository Path**: xusun000/tf-reverse

## Basic Information

- **Project Name**: tf-reverse
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2026-02-08
- **Last Updated**: 2026-02-08

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# String Reversal with Transformer

This project implements a Transformer model that reverses strings character by character.

## Overview

The model learns to reverse input strings by attending to each character position and predicting the reversed sequence, using positional encodings and attention mechanisms.

## Architecture

- Input embedding layer
- Positional encoding
- Multi-head self-attention layers
- Feed-forward networks
- Output projection layer

## Results

The model shows partial success on the reversal task. Sequence-to-sequence tasks such as string reversal can be difficult for Transformers without the explicit decoder found in encoder-decoder architectures. This implementation uses only an encoder, which makes the task harder: the model must learn the positional transformation within a single sequence, with each output position attending to its mirrored input position.

## Usage

```bash
python train.py
python test.py
```

## Files

- `model.py`: Transformer model implementation
- `data.py`: Dataset class and utilities
- `train.py`: Training script
- `test.py`: Testing script
- `requirements.txt`: Python dependencies
- `pyproject.toml`: Poetry configuration
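## Example: Encoder-Only Model Sketch

The architecture bullets above (embedding, positional encoding, self-attention, feed-forward, output projection) can be sketched as an encoder-only model in PyTorch. This is a hypothetical re-creation, not the contents of `model.py`: the class name `ReverseTransformer`, the learned positional embedding, and all hyperparameter values are assumptions for illustration.

```python
import torch
import torch.nn as nn

class ReverseTransformer(nn.Module):
    """Encoder-only Transformer sketch for character-level string reversal.

    Hypothetical reconstruction of the architecture described in this
    README; the actual model.py may differ in layers and hyperparameters.
    """

    def __init__(self, vocab_size: int, d_model: int = 64, nhead: int = 4,
                 num_layers: int = 2, max_len: int = 32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)    # input embedding layer
        self.pos = nn.Embedding(max_len, d_model)         # learned positional encoding
        layer = nn.TransformerEncoderLayer(
            d_model, nhead,
            dim_feedforward=4 * d_model,                  # feed-forward network
            batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)  # self-attention stack
        self.out = nn.Linear(d_model, vocab_size)         # output projection layer

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len) integer ids -> logits (batch, seq_len, vocab_size)
        positions = torch.arange(tokens.size(1), device=tokens.device)
        h = self.embed(tokens) + self.pos(positions)
        return self.out(self.encoder(h))
```

With this setup, each output position is trained (per-position cross-entropy) to predict the character at the mirrored input position, which is exactly the positional transformation the Results section describes as hard for an encoder without a decoder.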
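## Example: Sinusoidal Positional Encoding

The README lists positional encoding as an architecture component but does not say which form is used. A common choice is the fixed sinusoidal encoding; the sketch below assumes that variant (the repository may instead use learned position embeddings).

```python
import numpy as np

def positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Fixed sinusoidal positional encodings.

    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(max_len)[:, np.newaxis]                    # (max_len, 1)
    div = np.exp(np.arange(0, d_model, 2) * (-np.log(10000.0) / d_model))
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(positions * div)   # even dimensions: sine
    pe[:, 1::2] = np.cos(positions * div)   # odd dimensions: cosine
    return pe
```

Because the encoding assigns each position a unique pattern of phases, it is what lets self-attention distinguish position `i` from position `len - 1 - i`, the key requirement of the reversal task.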
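## Example: Generating Reversal Training Pairs

The README says `data.py` contains the dataset class and utilities but does not show them. A minimal sketch of how reversal training pairs might be produced, assuming a lowercase-letter vocabulary and index encoding (the names `make_example`, `VOCAB`, and the vocabulary choice are illustrative, not taken from the repository):

```python
import random
import string

# Hypothetical character vocabulary; the real data.py may use a
# different character set or add padding/special tokens.
VOCAB = string.ascii_lowercase
CHAR_TO_IDX = {ch: i for i, ch in enumerate(VOCAB)}
IDX_TO_CHAR = {i: ch for ch, i in CHAR_TO_IDX.items()}

def make_example(length: int, rng: random.Random):
    """Return (input_ids, target_ids) for one string-reversal example."""
    s = "".join(rng.choice(VOCAB) for _ in range(length))
    input_ids = [CHAR_TO_IDX[c] for c in s]
    target_ids = [CHAR_TO_IDX[c] for c in reversed(s)]
    return input_ids, target_ids

def decode(ids):
    """Map index sequence back to a string (for inspecting predictions)."""
    return "".join(IDX_TO_CHAR[i] for i in ids)
```

Since input and target always have the same length, every position has a well-defined target, which is what allows an encoder-only model to be trained with a simple per-position classification loss.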