# OpenTraj
### Human Trajectory Prediction Dataset Benchmark
We introduce existing datasets for the Human Trajectory Prediction (HTP) task and provide tools to load, visualize, and analyze them. Multiple datasets are supported so far.
### Publicly Available Datasets
| Name | Description | Ref |
|----|----|----|
| [ETH](datasets/ETH) | 2 top-view scenes containing walking pedestrians #Traj:[Peds=750] Coord=world-2D FPS=2.5 | [website](http://www.vision.ee.ethz.ch/en/datasets/) [paper](https://ethz.ch/content/dam/ethz/special-interest/baug/igp/photogrammetry-remote-sensing-dam/documents/pdf/pellegrini09iccv.pdf) |
| [UCY](datasets/UCY) | 3 scenes (Zara/Arxiepiskopi/University). Zara and University are close to top view; Arxiepiskopi is more inclined #Traj:[Peds=786] Coord=world-2D FPS=2.5 | [website](https://graphics.cs.ucy.ac.cy/research/downloads/crowd-data) [paper](https://onlinelibrary.wiley.com/doi/full/10.1111/j.1467-8659.2007.01089.x) |
| [PETS 2009](datasets/PETS-2009) | Different crowd activities #Traj:[?] Coord=image-2D FPS=7 | [website](http://www.cvg.reading.ac.uk/PETS2009/data.html) [paper](https://projet.liris.cnrs.fr/imagine/pub/proceedings/AVSS-2010/data/4264a143.pdf) |
| [SDD](datasets/SDD) | 8 top-view scenes recorded by drone, containing various types of agents #Traj:[Bikes=4,210 Peds=5,232 Skates=292 Carts=174 Cars=316 Buses=76 Total=10,300] Coord=image-2D FPS=30 | [website](http://cvgl.stanford.edu/projects/uav_data) [paper](http://svl.stanford.edu/assets/papers/ECCV16social.pdf) [dropbox](https://www.dropbox.com/s/v9jvt4ln7t42m6m/StanfordDroneDataset.zip) |
| [GC](datasets/GC) | Grand Central Train Station Dataset: 1 scene of 33:20 minutes of crowd trajectories #Traj:[Peds=12,684] Coord=image-2D FPS=25 | [dropbox](https://www.dropbox.com/s/7y90xsxq0l0yv8d/cvpr2015_pedestrianWalkingPathDataset.rar) [paper](http://openaccess.thecvf.com/content_cvpr_2015/html/Yi_Understanding_Pedestrian_Behaviors_2015_CVPR_paper.html) |
| [HERMES](datasets/HERMES) | Controlled experiments of pedestrian dynamics (unidirectional and bidirectional flows) #Traj:[?] Coord=world-2D FPS=16 | [website](https://www.fz-juelich.de/ias/ias-7/EN/AboutUs/Projects/Hermes/_node.html) [data](https://www.fz-juelich.de/ias/ias-7/EN/Research/Pedestrian_Dynamics-Empiricism/_node.html) |
| [Waymo](datasets/Waymo) | High-resolution sensor data collected by Waymo self-driving cars #Traj:[?] Coord=2D and 3D FPS=? | [website](https://waymo.com/open/) [github](https://github.com/waymo-research/waymo-open-dataset) |
| [KITTI](datasets/KITTI) | 6 hours of traffic scenarios recorded with various sensors #Traj:[?] Coord=image-3D + Calib FPS=10 | [website](http://www.cvlibs.net/datasets/kitti/) |
| [inD](datasets/InD) | Naturalistic trajectories of vehicles and vulnerable road users recorded at German intersections #Traj:[Total=11,500] Coord=world-2D FPS=25 | [website](https://www.ind-dataset.com/) [paper](https://arxiv.org/pdf/1911.07602.pdf) |
| [L-CAS](datasets/L-CAS) | Multisensor people dataset collected by a Pioneer 3-AT robot #Traj:[?] Coord=? FPS=? | [website](https://lcas.lincoln.ac.uk/wp/research/data-sets-software/l-cas-multisensor-people-dataset/) |
| [VIRAT](datasets/VIRAT) | Natural scenes showing people performing normal actions #Traj:[?] Coord=? FPS=? | [website](http://viratdata.org/) |
| [VRU](datasets/VRU) | Pedestrian and cyclist trajectories recorded at an urban intersection using cameras and LiDARs #Traj:[Peds=1,068 Bikes=464] Coord=world-2D (meters) FPS=25 | [website](https://www.th-ab.de/ueber-uns/organisation/labor/kooperative-automatisierte-verkehrssysteme/trajectory-dataset) |
| [Edinburgh](datasets/Edinburgh) | People walking through the Informatics Forum (University of Edinburgh) #Traj:[Peds=92,000+] FPS=? | [website](http://homepages.inf.ed.ac.uk/rbf/FORUMTRACKING/) |
| [Town Center](datasets/Town-Center) | CCTV video of pedestrians in a busy downtown area in Oxford #Traj:[Peds=2,200] Coord=? FPS=? | [website](https://megapixels.cc/datasets/oxford_town_centre/) |
| [ATC](datasets/ATC) | 92 days of pedestrian trajectories in a shopping center in Osaka, Japan #Traj:[?] Coord=world-2D + range data | [website](https://irc.atr.jp/crest2010_HRI/ATC_dataset) |
| [City Scapes](datasets/City-Scapes) | 25,000 annotated images (semantic / instance-wise / dense pixel annotations) #Traj:[?] | [website](https://www.cityscapes-dataset.com/dataset-overview/) |
| [Forking Paths Garden](datasets/Forking-Paths-Garden) | **Multi-modal** _synthetic_ dataset, created in [CARLA](https://carla.org) (3D simulator) based on real-world trajectory data, extrapolated by human annotators #Traj:[?] | [website](https://next.cs.cmu.edu/multiverse/index.html) [github](https://github.com/JunweiLiang/Multiverse) [paper](https://arxiv.org/abs/1912.06445) |
| [nuScenes](datasets/NuScenes) | Large-scale autonomous driving dataset #Traj:[Peds=222,164 Vehicles=662,856] Coord=world + 3D range data FPS=2 | [website](https://www.nuscenes.org) |
| [Argoverse](datasets/Argoverse) | 320 hours of self-driving data #Traj:[objects=11,052] Coord=3D FPS=10 | [website](https://www.argoverse.org) |
| [Wild Track](datasets/Wild-Track) | Surveillance video dataset of students recorded outside the ETH main building in Zurich #Traj:[Peds=1,200] | [website](https://megapixels.cc/wildtrack/) |
| [DUT](datasets/DUT) | Natural vehicle-crowd interactions on a crowded university campus #Traj:[Peds=1,739 Vehicles=123 Total=1,862] Coord=world-2D FPS=23.98 | [github](https://github.com/dongfang-steven-yang/vci-dataset-dut) [paper](https://arxiv.org/pdf/1902.00487.pdf) |
| [CITR](datasets/CITR) | Fundamental vehicle-crowd interaction scenarios in controlled experiments #Traj:[Peds=340] Coord=world-2D FPS=29.97 | [github](https://github.com/dongfang-steven-yang/vci-dataset-citr) [paper](https://arxiv.org/pdf/1902.00487.pdf) |
| [Ko-PER](datasets/Ko-PER) | Trajectories of people and vehicles at urban intersections (laser scanner + video) #Traj:[Peds=350] Coord=world-2D | [paper](https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.inst.110/Bilder/Forschung/Datensaetze/20141010_DatasetDocumentation.pdf) |
| [TRAF](datasets/TRAF) | Small dataset of dense and heterogeneous traffic videos in India (22 footages) #Traj:[Cars=33 Bikes=20 Peds=11] Coord=image-2D FPS=10 | [website](https://gamma.umd.edu/researchdirections/autonomousdriving/trafdataset/) [gDrive](https://drive.google.com/drive/folders/1zKaeboslkqoLdTJbRMyQ0Y9JL3007LRr) [paper](https://arxiv.org/pdf/1812.04767.pdf) |
| [ETH-Person](datasets/ETH-Person) | Multi-person data collected from mobile platforms | [website](https://data.vision.ee.ethz.ch/cvl/aess/) |
#### Human Trajectory Prediction Benchmarks
- [MOT-Challenge](https://motchallenge.net): Multiple Object Tracking Benchmark
- [Trajnet](http://trajnet.stanford.edu/): Trajectory Forecasting Challenge
- [Trajnet++](https://www.aicrowd.com/challenges/trajnet-a-trajectory-forecasting-challenge): Trajectory Forecasting Challenge
- [JackRabbot](https://jrdb.stanford.edu/): Detection and Tracking Dataset and Benchmark
## Toolkit
To download the toolkit separately as a zip file, click [here](https://downgit.github.io/#/home?url=https://github.com/amiryanj/OpenTraj/tree/master/toolkit).
#### 1. Benchmarks
Using the Python scripts in the [benchmarking/indicators](toolkit/benchmarking/indicators) directory, you can generate the results for each of the indicators presented in the article; see the example invocation below. For more information about each script, see [toolkit](toolkit).
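Each indicator is meant to be run as a standalone script. The invocation below is only a placeholder: substitute a real file name from the indicators directory, and check that script for the arguments it actually expects.

```bash
# Placeholder invocation: replace <indicator>.py with an actual script from
# toolkit/benchmarking/indicators; the --data-root flag is an assumption, so
# check the script's own argument handling before running.
python toolkit/benchmarking/indicators/<indicator>.py --data-root ./datasets
```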
#### 2. Loaders
Using the Python files in the [loaders](toolkit/loaders) directory, you can load a dataset into a dataset object that stores its data in Pandas DataFrames. This makes it easy to retrieve trajectories with different queries (by agent_id, timestamp, ...), as sketched below.
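To illustrate, here is a minimal sketch of that workflow. The loader import (`loader_eth.load_eth`), the sample file path, and the column names (`agent_id`, `frame_id`) are assumptions about the toolkit's conventions; check the modules in [loaders](toolkit/loaders) for the exact API.

```python
# Hypothetical loader import and sample path; verify against toolkit/loaders.
from toolkit.loaders.loader_eth import load_eth

dataset = load_eth("datasets/ETH/seq_eth/obsmat.txt")
df = dataset.data  # pandas DataFrame: one row per (agent, frame) observation

# Query by agent: all points belonging to pedestrian 5
ped5_traj = df[df["agent_id"] == 5]

# Query by time: all agents present at a given frame
frame_780 = df[df["frame_id"] == 780]

# Split the whole dataset into per-agent trajectories
trajectories = {agent_id: group for agent_id, group in df.groupby("agent_id")}
```

Keeping everything in a single DataFrame means standard Pandas filtering and grouping cover most retrieval needs without a custom query API.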
#### 3. Visualization
A simple script, [play.py](toolkit/ui/play.py), can be used to visualize a given dataset.
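A hypothetical invocation might look like the line below; the flags shown are assumptions, and the actual command-line arguments are defined by the script itself, so check [play.py](toolkit/ui/play.py) before running:

```bash
# Hypothetical flags: consult play.py's argument parser for the real options.
python toolkit/ui/play.py --data-root ./datasets --dataset eth
```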