# Pose2ID
🔥 A simple yet effective framework for ReID tasks and models. 🔥
🔥 A powerful pedestrian generation model (IPG) covering RGB, infrared, and occlusion scenarios. 🔥
[2025-12-03 NEWS!!!] 🔥 We launched OmniPerson, an omni-modal and powerful Pedestrian Generation Model. 🔥
We proposed:

- _**Training-Free Feature Centralization framework (Pose2ID)**_, which can be directly applied to different ReID tasks and models, even an ImageNet pre-trained model without any ReID training;
- _**I**dentity-Guided **P**edestrian **G**eneration (**IPG**)_, a paradigm that leverages identity features to generate high-quality images of the same identity in different poses, achieving feature centralization;
- _**N**eighbor **F**eature **C**entralization (**NFC**)_, which discovers hidden positive samples in the gallery/query set from each sample's neighborhood to achieve feature centralization.



## 📣 Updates

* [2025.12.03] 🔥🔥🔥 We launched [OmniPerson](http://arxiv.org/abs/2512.02554), a powerful Pedestrian Generation Model (images/videos/infrared/multi-reference). Code is available [here](https://github.com/maxiaoxsi/OmniPerson).
* [2025.03.19] 🔥 A demo of TransReID on Market1501 is available!
* [2025.03.06] 🔥 Pretrained weights are available on [HuggingFace](https://huggingface.co/yuanc3/Pose2ID)!
* [2025.03.04] 🔥 The paper is available on [arXiv](https://arxiv.org/abs/2503.00938)!
* [2025.03.03] 🔥 The official code has been released!
* [2025.02.27] 🔥🔥🔥 **Pose2ID** is accepted to CVPR 2025!

## ⚒️ Quick Start

Our project has two parts: **Identity-Guided Pedestrian Generation (IPG)** and **Neighbor Feature Centralization (NFC)**. **IPG** uses generated pedestrian images to centralize features.
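For intuition, the effect of this centralization can be demonstrated end to end on dummy tensors. This is a toy illustration of the idea, not the repository's code: `feats_pose` fakes the features of generated same-identity images as noisy copies of the originals, standing in for running a ReID model on IPG outputs.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
num_imgs, num_poses, dim = 4, 3, 8

# pretend these are L2-normalized ReID features of the original images
feats = F.normalize(torch.randn(num_imgs, dim), dim=1)

# fake features of generated same-identity images, one tensor per pose
# (stands in for reid_model(...) applied to IPG-generated images)
feats_pose = [F.normalize(feats + 0.1 * torch.randn(num_imgs, dim), dim=1)
              for _ in range(num_poses)]

# fuse the generated positive samples' features
feats_ipg = torch.zeros_like(feats)
for i in range(num_poses):
    feats_ipg += feats_pose[i]

eta = 1  # weight of the generated features
# centralize and project back onto the unit hypersphere
feats_c = F.normalize(feats + eta * feats_ipg, dim=1, p=2)

norms = feats_c.norm(dim=1)  # all ones: the original feature scale is preserved
```

Because the fused term averages out per-pose noise, each centralized feature stays close to its original direction while the distribution is smoothed, which is exactly what makes the subsequent distance computation more reliable.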
It can be implemented with a few lines of code:

```python
# normal ReID feature extraction to get `feats`
feats_ipg = torch.zeros_like(feats)

# fuse features of the generated positive samples with different poses
for i in range(num_poses):
    feats_ipg += reid_model(feats_pose[i])  # any ReID model

eta = 1  # controls the impact of the generated images (considering their quality)

# centralize the features and normalize back to the original distribution
feats = torch.nn.functional.normalize(feats + eta * feats_ipg, dim=1, p=2)  # L2 normalization

# compute the distance matrix or apply post-processing such as re-ranking
```

**NFC** explores each sample's potential positive samples from its neighborhood. It can also be implemented in a few lines:

```python
from NFC import NFC

feats = NFC(feats, k1=2, k2=2)
```

## Demo for TransReID on Market1501 dataset

0. Follow the official instructions of [TransReID](https://github.com/damo-cv/TransReID) to install the environment, and run their test script. If it succeeds, our demo will run in the same environment.

1. Modify the configuration file `configs/Market/vit_transreid_stride.yml` to enable or disable NFC and IPG feature centralization:

   ```yaml
   TEST:
     NFC: True
     IPG: True
   ```

2. To test IPG feature centralization, download the generated images ([Gallery](https://drive.google.com/file/d/1QdH0CctiUrZTCE3nPzc_kPmgAaxhhWzd/view?usp=sharing) & [Query](https://drive.google.com/file/d/1oiOutY64FQn9RTF2l_T0A8iPCWMkJi3a/view?usp=sharing)) and put them in the Market1501 folder. The folder structure should look like:

   ```shell
   Market1501
   ├── bounding_box_test      # original gallery images
   ├── bounding_box_test_gen  # generated gallery images
   ├── bounding_box_train     # original training images
   ├── query                  # original query images
   └── query_gen              # generated query images
   ```

3. Run the test script with the official pretrained model, or with our pretrained model trained without camera IDs, available on [HuggingFace](https://huggingface.co/yuanc3/Pose2ID) (`transformer_20.pth`).
   If you use the model without camera IDs, set `camera_num` on line 45 of `test.py` to 0.

   ```bash
   cd demo/TransReID
   # same as in the official repository
   python test.py --config_file configs/Market/vit_transreid_stride.yml MODEL.DEVICE_ID "('0')" TEST.WEIGHT 'path/to/your/pretrained/model'
   ```

   **NOTE:** If all goes well, you will get the **same** results as the first two rows of Table 1.

## 📊 Experiments

### ID² Metric

We proposed a quantitative metric for **Id**entity **D**ensity (ID²) to replace visualization tools such as t-SNE, which are stochastic and focus on only a few samples. It can be used in one line:

```python
from ID2 import ID2

density = ID2(feats, pids)  # each ID's density
density.mean(0)             # global density
```

where `feats` are the features extracted by the ReID model and `pids` are the corresponding person IDs.

### Improvements on Person ReID tasks



All experiments are conducted with the **official code** and **pretrained models**. We appreciate their official repositories and great works:

- TransReID
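As a rough intuition for what a per-identity density score measures, here is an illustrative stand-in that scores each identity by the mean pairwise cosine similarity among its samples. This is our own sketch for illustration; the exact ID² definition is in the paper and the official `ID2` module, so `id_density_sketch` should not be read as the official metric.

```python
import torch
import torch.nn.functional as F

def id_density_sketch(feats: torch.Tensor, pids: torch.Tensor) -> torch.Tensor:
    """Toy per-identity density: mean within-ID pairwise cosine similarity.

    A hypothetical stand-in for the official ID2 metric, shown only to
    convey the idea of measuring how tightly each identity clusters.
    """
    feats = F.normalize(feats, dim=1)
    out = []
    for pid in pids.unique():
        f = feats[pids == pid]
        n = f.size(0)
        if n < 2:
            out.append(torch.tensor(1.0))  # treat singleton IDs as perfectly dense
            continue
        sim = f @ f.t()
        # average off-diagonal similarity within the identity
        out.append((sim.sum() - n) / (n * (n - 1)))
    return torch.stack(out)

# two identities: one slightly spread out, one perfectly collapsed
feats = torch.tensor([[1.0, 0.0], [1.0, 0.1], [0.0, 1.0], [0.0, 1.0]])
pids = torch.tensor([0, 0, 1, 1])
density = id_density_sketch(feats, pids)  # per-ID density; density.mean(0) is global
```

In this toy example the second identity, whose two features coincide, receives a higher density than the first, which is the behavior a centralization method should improve.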