# Attention_GAN

**Repository Path**: shineboy/attention_gan

## Basic Information

- **Project Name**: Attention_GAN
- **Description**: Multi-source time-series data imputation combining an attention mechanism with a GAN.
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 1
- **Created**: 2024-02-21
- **Last Updated**: 2024-02-21

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Attention_GAN

## Introduction

Multi-source time-series data imputation combining an attention mechanism with a GAN.

1. Merge the training and validation sets to improve the accuracy evaluation
2. ~~Model complexity: number of attention heads, number of encoder layers~~
3. ~~Combining and re-weighting the loss terms~~
4. Tricks such as EMA (exponential moving average of weights)
5. ~~Adjusting the discriminator's loss function~~
6. Run on other datasets
7. Spatio-temporal attention mechanism

## Comparison

### D's loss is the mean difference between the discriminator's output and the mask M

#### G's loss has two parts:

1. Make D's judgment on the generated (imputed) missing values worse
2. Reduce the MSE between the generated values and the ground truth at the observed (non-missing) positions

### GAIN

### attention_gan_3d

### self_attention_gan_3d_m_h_delta

> [D loss: 0.213846] [G loss: 90.828865] [training MAE loss: 0.652964] Attention_GAN
> [D loss: 0.074355] [G loss: 27.752979] [training MAE loss: 0.233518] Encoder_GAN hint=0.1
> [validating MAE loss: 0.223372]
> [testing MAE loss: 0.224183]
> [Epoch 44/100] [Batch 9] Encoder_GAN hint=0.0
> [D loss: 0.082321] [G loss: 28.860378] [training MAE loss: 0.239957]
> [validating MAE loss: 0.224713]
> [Epoch 81/100] [Batch 59] Encoder_GAN hint=0.9
> [D loss: 0.029881] [G loss: 26.908133] [training MAE loss: 0.224450]
> [validating MAE loss: 0.217894]
> [testing MAE loss: 0.219222]
>
> [D loss: 0.131459] [G loss: 20.790909] [training MAE loss: 0.220328] 5 G updates per iteration, hint=0.1
> [validating MAE loss: 0.237275]

A higher hint_rate improves the discriminator's performance substantially, so training needs to focus on the generator.

## TODO

### Training scheme and the critic's loss function…

### Attention scoring function (computing the similarity between two time series with missing values)

A modified DTW, or a neural network (e.g. a GRU).
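The scoring function is still open; as one possible sketch (not this repository's implementation), a DTW variant can charge a fixed gap cost whenever either series is missing a value, so unobserved entries neither contribute spurious differences nor dominate the score. The function name, the `gap_cost` parameter, and the mask convention (1 = observed) are all illustrative assumptions:

```python
import numpy as np

def masked_dtw(x, y, mx, my, gap_cost=1.0):
    """DTW-style alignment cost between two 1-D series with missing values.

    x, y: value arrays; mx, my: binary masks (1 = observed, 0 = missing).
    A step where either value is missing pays a fixed gap_cost instead of
    a value difference (illustrative choice, not the repo's actual scorer).
    """
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if mx[i - 1] and my[j - 1]:
                cost = abs(x[i - 1] - y[j - 1])  # both observed: value distance
            else:
                cost = gap_cost                   # at least one missing: gap penalty
            # standard DTW recurrence over the three predecessor cells
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Distances from such a scorer could then be turned into attention weights, e.g. with a softmax over negative distances; a learned alternative would replace the whole function with a GRU encoder and score the hidden states.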
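The loss design described in the Comparison section can be sketched in a GAIN-style form. This is a minimal illustration with assumed conventions, not the repository's code: `m` is the observed-value mask, `x_hat` the generator output, `d_prob` the discriminator's per-element probability that an entry is observed; the absolute difference for D, the log-form adversarial term, and the `alpha` reconstruction weight are all assumptions:

```python
import numpy as np

def d_loss(d_prob, m):
    """Discriminator loss: mean (absolute) difference between D's
    per-element output and the true mask M, as described above."""
    return np.mean(np.abs(d_prob - m))

def g_loss(d_prob_fake, x, x_hat, m, alpha=10.0):
    """Generator loss, two parts as described above:
    1) adversarial: push D toward scoring imputed entries (m == 0)
       as observed;
    2) reconstruction: MSE between generated and true values at the
       observed entries (m == 1), weighted by alpha (assumed)."""
    adv = -np.mean((1 - m) * np.log(d_prob_fake + 1e-8))
    rec = np.sum(m * (x - x_hat) ** 2) / max(np.sum(m), 1)
    return adv + alpha * rec
```

With perfect D output (`d_prob == m`) the discriminator loss is zero, and a generator that reconstructs observed values exactly and fully fools D on the imputed ones drives both G terms toward zero.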
| Method | PhysioNet-2012 | Air-Quality | Electricity | ETT |
|--- |--- |--- |--- |--- |
| Median | 0.726 / 0.988 / 103.5% | 0.763 / 1.175 / 107.4% | 2.056 / 2.732 / 110.1% | 1.145 / 1.847 / 139.1% |
| Last | 0.862 / 1.207 / 123.0% | 0.967 / 1.408 / 136.3% | 1.006 / 1.533 / 53.9% | 1.007 / 1.365 / 96.4% |
| GRUI-GAN | 0.765 / 1.040 / 109.1% | 0.788 / 1.179 / 111.0% | / | 0.612 / 0.729 / 95.1% |
| E2GAN | 0.702 / 0.964 / 100.1% | 0.750 / 1.126 / 105.6% | / | 0.584 / 0.703 / 89.0% |
| M-RNN | 0.533 / 0.776 / 76.0% | 0.294 / 0.643 / 41.4% | 1.244 / 1.867 / 66.6% | 0.376 / 0.428 / 31.6% |
| GP-VAE | 0.398 / 0.630 / 56.7% | 0.268 / 0.614 / 37.7% | 1.094 / 1.565 / 58.6% | 0.274 / 0.307 / 15.5% |
| BRITS | 0.256 / 0.767 / 36.5% | 0.153 / 0.525 / 21.6% | 0.847 / 1.322 / 45.3% | 0.130 / 0.259 / 12.5% |
| Transformer | 0.190 / 0.445 / 26.9% | 0.158 / 0.521 / 22.3% | 0.823 / 1.301 / 44.0% | 0.114 / 0.173 / 10.9% |
| SAITS-base | 0.192 / 0.439 / 27.3% | 0.146 / 0.521 / 20.6% | 0.822 / 1.221 / 44.0% | 0.121 / 0.197 / 11.6% |
| SAITS | 0.186 / 0.431 / 26.6% | 0.137 / 0.518 / 19.3% | 0.735 / 1.162 / 39.4% | 0.092 / 0.139 / 8.8% |

> Each cell: MAE / RMSE / MRE

> [testing MAE loss: 0.139566 ; MSE loss: 0.080162 ; RMSE loss: 0.283129 ; MRE loss: 0.197731]
> [testing MAE loss: 0.140062 ; MSE loss: 0.081934 ; RMSE loss: 0.286241 ; MRE loss: 0.198433]
> [testing MAE loss: 0.141683 ; MSE loss: 0.083693 ; RMSE loss: 0.289298 ; MRE loss: 0.200729]
> [testing MAE loss: 0.141756 ; MSE loss: 0.080285 ; RMSE loss: 0.283347 ; MRE loss: 0.200834]
> [testing MAE loss: 0.142518 ; MSE loss: 0.082500 ; RMSE loss: 0.287228 ; MRE loss: 0.201913]
> [testing MAE loss: 0.143644 ; MSE loss: 0.081907 ; RMSE loss: 0.286194 ; MRE loss: 0.203508]
> [testing MAE loss: 0.143553 ; MSE loss: 0.083695 ; RMSE loss: 0.289301 ; MRE loss: 0.203380]
> [testing MAE loss: 0.141972 ; MSE loss: 0.082953 ; RMSE loss: 0.288015 ; MRE loss: 0.201139]
> [testing MAE loss: 0.143800 ; MSE loss: 0.087774 ; RMSE loss: 0.296267 ; MRE loss: 0.203730]
> [testing MAE loss: 0.145582 ; MSE loss: 0.088590 ; RMSE loss: 0.297641 ; MRE loss: 0.206254]
> [testing MAE loss: 0.144881 ; MSE loss: 0.083489 ; RMSE loss: 0.288944 ; MRE loss: 0.205261]
> [testing MAE loss: 0.146491 ; MSE loss: 0.085480 ; RMSE loss: 0.292370 ; MRE loss: 0.207542]
> [testing MAE loss: 0.146853 ; MSE loss: 0.085841 ; RMSE loss: 0.292986 ; MRE loss: 0.208055]
> [testing MAE loss: 0.147741 ; MSE loss: 0.084064 ; RMSE loss: 0.289937 ; MRE loss: 0.209313]
> [testing MAE loss: 0.147181 ; MSE loss: 0.085427 ; RMSE loss: 0.292279 ; MRE loss: 0.208519]
> [testing MAE loss: 0.149037 ; MSE loss: 0.083623 ; RMSE loss: 0.289176 ; MRE loss: 0.211148]
> [testing MAE loss: 0.149110 ; MSE loss: 0.087002 ; RMSE loss: 0.294961 ; MRE loss: 0.211252]
> [testing MAE loss: 0.148673 ; MSE loss: 0.088598 ; RMSE loss: 0.297654 ; MRE loss: 0.210634]
> [testing MAE loss: 0.149562 ; MSE loss: 0.089930 ; RMSE loss: 0.299884 ; MRE loss: 0.211892]
> [testing MAE loss: 0.149409 ; MSE loss: 0.085663 ; RMSE loss: 0.292683 ; MRE loss: 0.211676]
> [testing MAE loss: 0.149726 ; MSE loss: 0.092834 ; RMSE loss: 0.304687 ; MRE loss: 0.212124]
> [testing MAE loss: 0.151826 ; MSE loss: 0.085923 ; RMSE loss: 0.293125 ; MRE loss: 0.215100]
> [testing MAE loss: 0.152344 ; MSE loss: 0.086221 ; RMSE loss: 0.293634 ; MRE loss: 0.215834]
> [testing MAE loss: 0.154178 ; MSE loss: 0.092462 ; RMSE loss: 0.304076 ; MRE loss: 0.218432]
> [testing MAE loss: 0.155093 ; MSE loss: 0.088674 ; RMSE loss: 0.297783 ; MRE loss: 0.219729]
> [testing MAE loss: 0.213716 ; MSE loss: 0.186666 ; RMSE loss: 0.432048 ; MRE loss: 0.308431]
> [testing MAE loss: 0.214701 ; MSE loss: 0.188496 ; RMSE loss: 0.434162 ; MRE loss: 0.309852]
> [testing MAE loss: 0.213655 ; MSE loss: 0.189101 ; RMSE loss: 0.434858 ; MRE loss: 0.308343]
> [testing MAE loss: 0.214932 ; MSE loss: 0.185648 ; RMSE loss: 0.430869 ; MRE loss: 0.310186]
> [testing MAE loss: 0.216079 ; MSE loss: 0.186633 ; RMSE loss: 0.432010 ; MRE loss: 0.311841]
> [testing MAE loss: 0.216272 ; MSE loss: 0.188991 ; RMSE loss: 0.434731 ; MRE loss: 0.312120]
> [testing MAE loss: 0.215513 ; MSE loss: 0.190004 ; RMSE loss: 0.435895 ; MRE loss: 0.311024]
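The MAE / RMSE / MRE figures logged above are presumably computed only over the held-out evaluation entries (values hidden from the model during imputation). A generic sketch follows; taking MRE as the sum of absolute errors over the sum of absolute ground-truth values is a common convention (e.g. in the SAITS benchmark), but an assumption here, since the repository's metric code is not shown:

```python
import numpy as np

def imputation_metrics(x_true, x_hat, eval_mask):
    """MAE / RMSE / MRE over evaluation entries (eval_mask == 1),
    i.e. values hidden from the model and used only for scoring."""
    err = (x_hat - x_true) * eval_mask   # zero out non-evaluated entries
    n = eval_mask.sum()
    mae = np.abs(err).sum() / n
    rmse = np.sqrt((err ** 2).sum() / n)
    # assumed MRE convention: total abs error relative to total abs truth
    mre = np.abs(err).sum() / np.abs(x_true * eval_mask).sum()
    return mae, rmse, mre
```

Restricting the sums with `eval_mask` matters: averaging over all entries, most of which are observed and trivially correct, would understate the imputation error.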