# EMP-SSL-Eval

**Repository Path**: askuasen/emp-ssl-eval

## Basic Information

- **Project Name**: EMP-SSL-Eval
- **Description**: An evaluation on oil images, based on the work "EMP-SSL: Towards Self-Supervised Learning in One Training Epoch".
- **Primary Language**: Python
- **License**: MIT
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2025-03-31
- **Last Updated**: 2025-03-31

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# EMP-SSL Eval

A self-supervised learning framework for oil-image classification, based on the work "EMP-SSL: Towards Self-Supervised Learning in One Training Epoch" (paper authors: Shengbang Tong*, Yubei Chen*, Yi Ma, Yann LeCun).

[![arXiv](https://img.shields.io/badge/arXiv-2304.03977-b31b1b.svg)](https://arxiv.org/abs/2304.03977)

![Training Pipeline](pipeline.png)

Repo author: Binsen XU

## Introduction

This repository contains the implementation for the paper "EMP-SSL: Towards Self-Supervised Learning in One Training Epoch." The paper introduces a simple but efficient self-supervised learning method called Extreme-Multi-Patch Self-Supervised Learning (EMP-SSL). EMP-SSL significantly reduces the number of training epochs required for convergence by increasing the number of fixed-size image patches taken from each image instance.

## Preparing Training Data

The Oilimage dataset follows the CIFAR-10 format. Details can be found at this [link](https://pan.quark.cn/s/dffdb9d5cfd4); the download password is WDBq.

## Getting Started

The current code implementation supports CIFAR-10, CIFAR-100, ImageNet-100, and Oilimages. To get started with the EMP-SSL implementation, follow these instructions:

### 1. Clone this repository

```bash
git clone https://gitee.com/askuasen/emp-ssl-eval
cd emp-ssl-eval
```

### 2. Install required packages

```
pip install -r requirements.txt
```

### 3. Training

#### Reproducing 1-epoch results

```
python main.py --data cifar10 --epoch 2 --patch_sim 200 --arch 'resnet18-cifar' --num_patches 20 --lr 0.3
```

For ImageNet-100, please follow the instructions in [this repository](https://github.com/zhirongw/lemniscate.pytorch) to download the ImageNet dataset. For small datasets like SVHN, you can either download them manually or set the **download** parameter in torchvision.datasets to **True** to download them automatically. After downloading, please put them in data/; an SSD is highly recommended for training on ImageNet.

```
python main.py --data imagenet100 --epoch 2 --patch_sim 200 --arch 'resnet18-imagenet' --num_patches 20 --lr 0.3
```

For SmallOil:

```
python main.py --data smalloil --epoch 2 --patch_sim 200 --arch 'conv-encoder' --num_patches 20 --lr 0.3
```

#### Reproducing multi-epoch results

Change `num_patches` to change the number of patches used in EMP-SSL training.

```
python main.py --data cifar10 --epoch 30 --patch_sim 200 --arch 'resnet18-cifar' --num_patches 20 --lr 0.3
```

### 4. Evaluating

Because our model is trained only with fixed-size image patches, we adopt the bag-of-features model from the intra-instance VICReg paper to evaluate its performance. Change `test_patches` to adjust the number of patches used in the bag-of-features model for different GPUs.

```
python evaluate.py --model_path 'path to your evaluated model' --test_patches 128
```

```
python evaluate.py --test_patches 128 --data smalloil --arch conv-encoder --knn --model_path logs/EMP-SSL-Training/patchsim200_numpatch20_bs100_lr0.3_NONE/save_models/1.pt
```

## Acknowledgment

This repo is inspired by the [MCR2](https://github.com/Ma-Lab-Berkeley/MCR2), [solo-learn](https://github.com/vturrisi/solo-learn), and [NMCE](https://github.com/zengyi-li/NMCE-release) repos.
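To make the patch-based evaluation concrete, the following is a minimal, framework-agnostic sketch of the two ideas involved: sampling multiple fixed-size patches from one image, and averaging their encoded features into a single bag-of-features representation. It uses NumPy only; `extract_patches`, `bag_of_features`, and the toy flatten "encoder" are hypothetical names for illustration, not functions from this repository (the real pipeline uses the trained PyTorch backbone and the `--test_patches` flag).

```python
import numpy as np

def extract_patches(image, num_patches=20, patch_size=8, rng=None):
    """Sample `num_patches` random fixed-size crops from an HxWxC image."""
    if rng is None:
        rng = np.random.default_rng(0)
    h, w, _ = image.shape
    patches = []
    for _ in range(num_patches):
        y = rng.integers(0, h - patch_size + 1)
        x = rng.integers(0, w - patch_size + 1)
        patches.append(image[y:y + patch_size, x:x + patch_size])
    return np.stack(patches)  # shape: (num_patches, patch_size, patch_size, C)

def bag_of_features(patches, encoder):
    """Encode each patch and average the features (bag-of-features pooling)."""
    feats = np.stack([encoder(p) for p in patches])
    return feats.mean(axis=0)

# Toy stand-in for the trained backbone: flatten the patch into a vector.
toy_encoder = lambda p: p.reshape(-1)

image = np.random.default_rng(42).random((32, 32, 3))  # CIFAR-sized dummy image
patches = extract_patches(image, num_patches=20, patch_size=8)
rep = bag_of_features(patches, toy_encoder)
print(patches.shape, rep.shape)  # (20, 8, 8, 3) (192,)
```

The averaged vector `rep` is what a downstream linear or k-NN classifier would consume; increasing `num_patches` (or `--test_patches` in `evaluate.py`) trades GPU memory for a more stable averaged representation.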
## Citation

If you find this repository useful, please consider giving a star :star: and a citation:

```
@article{tong2023empssl,
  title={EMP-SSL: Towards Self-Supervised Learning in One Training Epoch},
  author={Shengbang Tong and Yubei Chen and Yi Ma and Yann LeCun},
  journal={arXiv preprint arXiv:2304.03977},
  year={2023}
}
```

# E-mail

xbs150@163.com