# Efficient-ResNet-18-CapsuleNet-EMP-SSL

**Repository Path**: askuasen/efficient-res-net-18-capsule-net-emp-ssl

## Basic Information

- **Project Name**: Efficient-ResNet-18-CapsuleNet-EMP-SSL
- **Description**: A modified Efficient-CapsuleNet for well logging image classification
- **Primary Language**: Python
- **License**: BSD-4-Clause
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2024-11-04
- **Last Updated**: 2024-11-04

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Towards Self-Supervised Learning Based on Capsule Networks

This work is based on "EMP-SSL: Towards Self-Supervised Learning in One Training Epoch" (https://arxiv.org/abs/2304.03977) and "Efficient-CapsNet: Capsule Network with Self-Attention Routing" (https://arxiv.org/abs/2101.12491).

## Introduction

This repository contains a simple but efficient self-supervised learning method, Extreme-Multi-Patch Self-Supervised Learning (EMP-SSL). EMP-SSL significantly reduces the number of training epochs required for convergence by increasing the number of fixed-size image patches taken from each image instance.

## Preparing Training Data

CIFAR10 and CIFAR100 are downloaded automatically by the training script. ImageNet100 is a special subset of ImageNet; see the ImageNet download instructions under Training below. If you use the oil-facies image data (https://pan.quark.cn/s/f2555c816534), please run dataset/split_train_test_data.py to split it into training and test sets.

## Getting Started

The current implementation supports CIFAR10, CIFAR100, ImageNet100, and the OilFacies dataset. To get started with the EMP-SSL implementation, follow these instructions:

### 1. Clone this repository

```bash
git clone https://gitee.com/askuasen/efficient-res-net-18-capsule-net-emp-ssl.git
cd efficient-res-net-18-capsule-net-emp-ssl
```

### 2. Install required packages

```
pip install -r requirements.txt
```

### 3. Training

#### Reproducing 1-epoch results

For CIFAR10 or CIFAR100:

```
python main.py --data cifar10 --epoch 2 --patch_sim 200 --arch 'resnet18-cifar' --num_patches 20 --lr 0.3
```

For ImageNet100, please follow the instructions in [this repository](https://github.com/zhirongw/lemniscate.pytorch) to download the ImageNet dataset. For small datasets such as SVHN, you can either download them manually or set the **download** parameter of the torchvision dataset to **True** to download them automatically. After downloading, place the data in data/; an SSD is highly recommended for training on ImageNet.

```
python main.py --data imagenet100 --epoch 2 --patch_sim 200 --arch 'resnet18-imagenet' --num_patches 20 --lr 0.3
```

For SmallOil:

```
python main.py --data smalloil --epoch 2 --patch_sim 200 --arch efficient-resnet18-capsule --num_patches 20 --lr 0.3
```

To train for more epochs or change the number of patches used in EMP-SSL training, adjust --epoch and --num_patches, for example:

```
python main.py --data cifar10 --epoch 30 --patch_sim 200 --arch 'resnet18-cifar' --num_patches 20 --lr 0.3
```

### 4. Evaluating

Because our model is trained only on fixed-size image patches, we adopt the bag-of-features evaluation from the intra-instance VICReg paper to evaluate its performance. Change --test_patches to adjust the number of patches used in the bag-of-features model for different GPUs.
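As a rough illustration of the bag-of-features idea, the sketch below averages the encoder's embeddings over several fixed-size patches taken from one test image (a minimal sketch only; `encoder` and the crop settings are illustrative assumptions, not this repository's exact API):

```python
# Minimal sketch of bag-of-features evaluation (illustrative; `encoder` and the
# crop settings are assumptions, not this repository's exact API).
import torch
from torchvision import transforms

def bag_of_features(encoder, image, test_patches=128, patch_size=32):
    """Average the encoder's embeddings over random fixed-size patches of one image."""
    crop = transforms.RandomResizedCrop(patch_size, scale=(0.25, 0.25), ratio=(1.0, 1.0))
    patches = torch.stack([crop(image) for _ in range(test_patches)])  # (N, C, h, w)
    with torch.no_grad():
        feats = encoder(patches)                                       # (N, D) patch embeddings
    return feats.mean(dim=0)                                           # (D,) image-level feature
```

The resulting image-level feature is what the kNN or linear evaluation operates on; `--test_patches` in the commands below controls how many patches are averaged.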
```
python evaluate.py --model_path 'path to your evaluated model' --test_patches 128
```

For the OilFacies (smalloil) dataset with kNN evaluation, for example:

```
python evaluate.py --test_patches 128 --data smalloil --arch conv-encoder --knn --model_path logs/EMP-SSL-Training/patchsim200_numpatch20_bs100_lr0.3_NONE/save_models/1.pt
```

## Acknowledgment

This repo is inspired by the [MCR2](https://github.com/Ma-Lab-Berkeley/MCR2), [solo-learn](https://github.com/vturrisi/solo-learn) and [NMCE](https://github.com/zengyi-li/NMCE-release) repositories.

## Reference

```
@article{tong2023empssl,
  title={EMP-SSL: Towards Self-Supervised Learning in One Training Epoch},
  author={Shengbang Tong and Yubei Chen and Yi Ma and Yann Lecun},
  journal={arXiv preprint arXiv:2304.03977},
  year={2023}
}
```

### Cite
1. BinSen Xu, Ning Li, LiZhi Xiao, et al. Serial structure multi-task learning method for predicting reservoir parameters[J]. Applied Geophysics, 2022, 19(4): 513-527. DOI: 10.1007/s11770-022-0931-9
2. BinSen Xu, LiZhi Xiao. Comparison of well logging formation evaluation using serial and parallel multi-task learning networks[J]. Chinese Journal of Geophysics (in Chinese), 2024, 67(4): 1613-1626. DOI: 10.6038/cjg2023R0584
## Contact

For any issues or questions regarding reproducing the results, please contact me.

Ethan Askua
School of Geophysics, China University of Petroleum (Beijing) (CUPB), Beijing.
Email: xbs150@163.com