# Variational-HyperAdam

**Repository Path**: frontxiang/Variational-HyperAdam

## Basic Information

- **Project Name**: Variational-HyperAdam
- **Description**: No description available
- **Primary Language**: Python
- **License**: GPL-3.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2024-06-04
- **Last Updated**: 2024-06-04

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Variational-HyperAdam

This is a PyTorch implementation of the Variational-HyperAdam algorithm from our TPAMI paper:

**Title**: Variational-HyperAdam: A Meta-learning Approach to Network Training

**Authors**: Shipeng Wang, Yan Yang, Jian Sun, Zongben Xu

**Email**: wangshipeng8128@stu.xjtu.edu.cn; wangshipeng8128@gmail.com

**Institution**: School of Mathematics and Statistics, Xi'an Jiaotong University

**Link**: https://ieeexplore.ieee.org/document/9361276

## Usage

To replicate the experiments, run from a terminal:

```
cd HyperAdam
sh batch_process.sh
```

**Requirements**: PyTorch >= 1.0, Python 3.7

## Citation

If the code is useful in your research, please cite our paper:

```
@ARTICLE{vrhyperadam2021wang,
  author={S. {Wang} and Y. {Yang} and J. {Sun} and Z. {Xu}},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  title={Variational HyperAdam: A Meta-learning Approach to Network Training},
  year={2021},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/TPAMI.2021.3061581}
}
```
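## Background: the per-step update loop

For context, the sketch below shows the generic per-parameter update loop on which learned optimizers such as Variational-HyperAdam operate. It is a minimal, self-contained illustration only: the hand-written Adam-style update in the last line is a stand-in for the meta-learned, variational update rule produced by the optimizer network in this repository, and none of the names below are assumed to match the actual classes or files shipped in `HyperAdam/`.

```python
import torch
import torch.nn as nn

# Small "optimizee" network, as is typical in learning-to-optimize experiments.
model = nn.Sequential(nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2))
loss_fn = nn.CrossEntropyLoss()

# Hand-maintained Adam-style state. In Variational-HyperAdam, the update rule
# applied below is instead predicted by the meta-learned optimizer network.
state = {p: {"m": torch.zeros_like(p), "v": torch.zeros_like(p)}
         for p in model.parameters()}
lr, beta1, beta2, eps = 1e-3, 0.9, 0.999, 1e-8

for step in range(1, 101):
    # Synthetic mini-batch standing in for real training data.
    x = torch.randn(64, 20)
    y = torch.randint(0, 2, (64,))

    loss = loss_fn(model(x), y)
    grads = torch.autograd.grad(loss, model.parameters())

    with torch.no_grad():
        for p, g in zip(model.parameters(), grads):
            s = state[p]
            s["m"] = beta1 * s["m"] + (1 - beta1) * g
            s["v"] = beta2 * s["v"] + (1 - beta2) * g * g
            m_hat = s["m"] / (1 - beta1 ** step)
            v_hat = s["v"] / (1 - beta2 ** step)
            # A learned optimizer would replace this fixed rule with the
            # update produced by its network, given g and its internal state.
            p -= lr * m_hat / (v_hat.sqrt() + eps)
```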