# SpaFL - NeurIPS 2024

This is the repository for the paper "SpaFL: Communication-Efficient Federated Learning with Sparse Models and Low Computational Overhead", published at NeurIPS 2024.

# Install #

1. Create a conda environment with Python 3.10.14 (a minimal setup sketch is given at the end of this README).
2. `pip install -r requirements.txt`
3. `pip install --upgrade protobuf wandb`

# Run #

1. FMNIST: `python train.py --learning_rate 0.001 --th_coeff 0.002 --local_epoch 5 --seed 1`
2. CIFAR-10: `python train.py --affix grad_clip_3_seed_1_conv4 --model conv4 --learning_rate 0.01 --th_coeff 0.00015 --batch_size 16 --alpha 0.1 --local_ep 5 --clip 3 --seed 1`
3. CIFAR-10 with ViT: `python train.py --model vit --batch_size 16 --comm_rounds 500 --learning_rate 0.01 --alpha 0.1 --th_coeff 0.0001 --local_epoch 1 --clip 3 --seed 1`
4. CIFAR-100: `python train.py --model resnet18 --comm_rounds 1500 --learning_rate 0.01 --lr_decay 0.993 --alpha 0.1 --th_coeff 0.0007 --local_epoch 7 --clip 15 --seed 1`
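For convenience, here is a minimal end-to-end sketch combining the Install and Run steps above, assuming an interactive POSIX shell where `conda activate` works. The environment name `spafl` and the seed values beyond 1 are illustrative choices, not part of this repository; the training flags are taken verbatim from the FMNIST command above.

```bash
# Create and activate the environment (Python version from the Install section;
# the env name "spafl" is an arbitrary choice).
conda create -n spafl python=3.10.14 -y
conda activate spafl

# Install dependencies as described above.
pip install -r requirements.txt
pip install --upgrade protobuf wandb

# Run the FMNIST experiment over several seeds (seeds 2 and 3 are
# illustrative; the README only documents seed 1).
for seed in 1 2 3; do
    python train.py --learning_rate 0.001 --th_coeff 0.002 --local_epoch 5 --seed "$seed"
done
```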