
# End-to-End Learning to Grasp from Object Point Clouds

This is the PyTorch implementation of the paper *Learning to Grasp* (L2G).

Learning to Grasp (L2G) is an efficient end-to-end learning strategy for generating 6-DOF parallel-jaw grasps from a partial point cloud of an object. Our approach does not rely on any geometric assumption; instead, it is guided by a principled multi-task optimization objective that generates a diverse set of grasps by combining contact point sampling, grasp regression, and grasp evaluation.
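The combination of the three terms can be pictured as a weighted multi-task loss. The sketch below is purely illustrative: the function name and the weights are hypothetical placeholders, not the paper's actual formulation.

```python
# Illustrative sketch of a multi-task grasp objective (NOT the paper's
# exact formulation). The three losses stand in for the contact point
# sampling, grasp regression, and grasp evaluation terms named above;
# the weights are hypothetical placeholders.
def l2g_objective(sampling_loss, regression_loss, evaluation_loss,
                  w_sample=1.0, w_reg=1.0, w_eval=1.0):
    """Jointly optimized weighted sum of the three task losses."""
    return (w_sample * sampling_loss
            + w_reg * regression_loss
            + w_eval * evaluation_loss)
```

Because the terms are optimized jointly rather than in separate stages, the sampler, regressor, and evaluator can influence each other during training.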

*(Grasp trials GIF; a higher-resolution, full-length version is linked from the original repository.)*

## Environment

This code has been tested with Python 3.7.12, PyTorch 1.8.0, and CUDA 11.1 (system-wide).
Make sure the environment variables `CUDA_HOME`, `CUDA_ROOT`, `LD_LIBRARY_PATH`, and `PATH` are set correctly before installing the pointnet2 modules. GCC and CMake are also required.

```
SYSTEM:
    CUDA 11.1 (system-wide)
    gcc-7.3.1
    cmake-3.18.4

CONDA ENV:
    torch: 1.8.0    # torch.__version__
    CUDA: 11.1      # torch.version.cuda
    CuDNN: 8005     # torch.backends.cudnn.version()
```
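The version checks listed as comments above can be scripted. The helper below is our own sketch (not part of the repo); it guards the `torch` import so it fails gracefully outside the conda environment.

```python
def parse_version(v):
    """'1.8.0' or '1.8.0+cu111' -> (1, 8, 0), for easy comparison."""
    return tuple(int(x) for x in v.split("+")[0].split(".")[:3])

def check_env(min_torch=(1, 8, 0), want_cuda="11.1"):
    """Compare the running interpreter against the versions listed above."""
    try:
        import torch
    except ImportError:
        return "torch not installed -- activate the conda env first"
    if parse_version(torch.__version__) < min_torch:
        return "torch too old: " + torch.__version__
    if torch.version.cuda != want_cuda:
        return "unexpected CUDA build: " + str(torch.version.cuda)
    return "OK: torch %s / CUDA %s / CuDNN %s" % (
        torch.__version__, torch.version.cuda,
        torch.backends.cudnn.version())

if __name__ == "__main__":
    print(check_env())
```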
- Use miniconda and the provided `requirements.yml` file to build a virtual environment. This installs almost all required packages, including `pytorch=1.8.0` and `cudatoolkit=11.1`:

```shell
conda env create --name L2G_env --file=requirements.yml
```
- Install KNN CUDA:

```shell
pip install --upgrade https://github.com/unlimblue/KNN_CUDA/releases/download/0.2/KNN_CUDA-0.2-py3-none-any.whl
```
- Install PointNet++:

```shell
cd pointnet2
python setup.py install
cd ..
```
- Install the GPNet-simulator and grasp-evaluator submodules:

```shell
git submodule init
git submodule update
cd GPNet-simulator; pip install -e .; cd ..
cd grasp-evaluator; python setup.py install; cd ..
```

## Datasets

- **ShapeNetSem-8**: originally created by Wu et al. (2020) and first provided here. Used for training and evaluation.
- **YCB-8**: grasping scenarios with 8 YCB objects, including grasp annotations for both simulation-based and rule-based evaluation.
- **YCB-76**: grasping scenarios with 76 YCB objects, for simulation-based evaluation only.

Use the `download_data.sh` script to download the datasets (they require ~11 GB of disk space once unzipped). For a more detailed description of each dataset, refer to its individual README file.

```shell
sh download_data.sh
```

- Directory tree:

```
.
├── README.md
└── data
    ├── ShapeNetSem-8
    │   └── ...
    ├── YCB-76
    │   └── ...
    ├── YCB-8
    │   └── ...
    └── download_script.sh
```
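After downloading, the layout above can be verified with a small check before launching any training. The function below is our own sketch, not part of the repo:

```python
import os

# Dataset folders expected under data/ (per the directory tree above)
EXPECTED = ["ShapeNetSem-8", "YCB-8", "YCB-76"]

def missing_datasets(data_root="data", expected=tuple(EXPECTED)):
    """Return the expected dataset folders that are absent under data_root."""
    return [name for name in expected
            if not os.path.isdir(os.path.join(data_root, name))]

if __name__ == "__main__":
    missing = missing_datasets()
    if missing:
        print("Missing datasets:", ", ".join(missing))
    else:
        print("All datasets in place.")
```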

## Experiments

Use the provided bash scripts to train and evaluate L2G models. Training is performed on ShapeNetSem-8; evaluation (rule-based and simulation-based) is performed on the ShapeNetSem-8, YCB-8, and YCB-76 test sets.

- Train and evaluate L2G with the PointNet++ encoder:

```shell
conda activate L2G_env
source run_l2g_pn2.sh
```

- Train and evaluate L2G with the DeCo encoder:

```shell
conda activate L2G_env
source run_l2g_deco.sh
```

## Acknowledgement

Our code release is based on the following works.

This work is partially funded by CHIST-ERA under EPSRC grant no. EP/S032487/1.
