

Projects that are alternatives of or similar to Point2Sequence

Dgcnn.pytorch
A PyTorch implementation of Dynamic Graph CNN for Learning on Point Clouds (DGCNN)
Stars: ✭ 153 (+350%)
Mutual labels:  point-cloud, classification, segmentation
Pointnet Keras
Keras implementation of PointNet
Stars: ✭ 110 (+223.53%)
Mutual labels:  point-cloud, classification, segmentation
Pointcnn
PointCNN: Convolution On X-Transformed Points (NeurIPS 2018)
Stars: ✭ 1,120 (+3194.12%)
Mutual labels:  point-cloud, classification, segmentation
Pointclouddatasets
3D point cloud datasets in HDF5 format, containing uniformly sampled 2048 points per shape.
Stars: ✭ 80 (+135.29%)
Mutual labels:  point-cloud, classification, segmentation
3d Pointcloud
Papers and Datasets about Point Cloud.
Stars: ✭ 179 (+426.47%)
Mutual labels:  point-cloud, classification, segmentation
Pointasnl
PointASNL: Robust Point Clouds Processing using Nonlocal Neural Networks with Adaptive Sampling (CVPR 2020)
Stars: ✭ 159 (+367.65%)
Mutual labels:  point-cloud, classification, segmentation
Pointnet
PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation
Stars: ✭ 3,517 (+10244.12%)
Mutual labels:  point-cloud, classification, segmentation
Grid Gcn
Grid-GCN for Fast and Scalable Point Cloud Learning
Stars: ✭ 143 (+320.59%)
Mutual labels:  point-cloud, classification, segmentation
Pointnet2
PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space
Stars: ✭ 2,197 (+6361.76%)
Mutual labels:  point-cloud, classification, segmentation
Cilantro
A lean C++ library for working with point cloud data
Stars: ✭ 577 (+1597.06%)
Mutual labels:  point-cloud, segmentation
Depth clustering
🚕 Fast and robust clustering of point clouds generated with a Velodyne sensor.
Stars: ✭ 657 (+1832.35%)
Mutual labels:  point-cloud, segmentation
Superpoint graph
Large-scale Point Cloud Semantic Segmentation with Superpoint Graphs
Stars: ✭ 533 (+1467.65%)
Mutual labels:  point-cloud, segmentation
Open3d Pointnet2 Semantic3d
Semantic3D segmentation with Open3D and PointNet++
Stars: ✭ 342 (+905.88%)
Mutual labels:  point-cloud, classification
Torch Points3d
Pytorch framework for doing deep learning on point clouds.
Stars: ✭ 1,135 (+3238.24%)
Mutual labels:  point-cloud, segmentation
Gacnet
Pytorch implementation of 'Graph Attention Convolution for Point Cloud Segmentation'
Stars: ✭ 103 (+202.94%)
Mutual labels:  point-cloud, segmentation
3dgnn pytorch
3D Graph Neural Networks for RGBD Semantic Segmentation
Stars: ✭ 187 (+450%)
Mutual labels:  point-cloud, segmentation
point-cloud-segmentation
TF2 implementation of PointNet for segmenting point clouds
Stars: ✭ 33 (-2.94%)
Mutual labels:  point-cloud, segmentation
pointnet2-pytorch
A clean PointNet++ segmentation model implementation. Support batch of samples with different number of points.
Stars: ✭ 45 (+32.35%)
Mutual labels:  point-cloud, segmentation
pyRANSAC-3D
A python tool for fitting primitives 3D shapes in point clouds using RANSAC algorithm
Stars: ✭ 253 (+644.12%)
Mutual labels:  point-cloud, segmentation
Awesome-Tensorflow2
Excellent extension packages and projects built on TensorFlow 2
Stars: ✭ 45 (+32.35%)
Mutual labels:  classification, segmentation

Point2Sequence: Learning the Shape Representation of 3D Point Clouds with an Attention-based Sequence to Sequence Network

Created by Xinhai Liu, Zhizhong Han, Yu-Shen Liu, Matthias Zwicker.

(figure: prediction example)

Citation

If you find our work useful in your research, please consider citing:

    @inproceedings{liu2019point2sequence,
      title={Point2Sequence: Learning the Shape Representation of 3D Point Clouds with an Attention-based Sequence to Sequence Network},
      author={Liu, Xinhai and Han, Zhizhong and Liu, Yu-Shen and Zwicker, Matthias},
      booktitle={Thirty-Third AAAI Conference on Artificial Intelligence},
      year={2019}
    }

Introduction

In Point2Sequence, we build multi-scale areas within each local region of a point set in a sequential manner. To explore the correlation between areas of different scales, an RNN-based sequence-to-sequence model is employed to capture the contextual information inside each local region. In addition, we introduce an attention mechanism to highlight the importance of areas at different scales.
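As a toy illustration of the attention idea only (this is not the paper's implementation: feature extraction and the RNN encoder-decoder are omitted, and the function name `attention_over_scales`, the dot-product scoring, and all sizes below are hypothetical), the following NumPy sketch weights per-scale area features by a softmax over similarity scores and aggregates them into one summary vector:

```python
import numpy as np

def attention_over_scales(scale_features, query):
    """Weight per-scale area features by softmax similarity to a query vector."""
    # scale_features: (T, D), one feature vector per scale area; query: (D,)
    scores = scale_features @ query                   # (T,) raw similarity scores
    scores -= scores.max()                            # subtract max for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()   # softmax over the T scales
    context = weights @ scale_features                # (D,) attended summary of the region
    return weights, context

rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))   # T=4 scale areas, D=8 features (arbitrary toy sizes)
w, ctx = attention_over_scales(feats, feats[2])
```

The softmax weights sum to one, so `ctx` is a convex combination of the per-scale features, with larger weight on scales whose features align with the query.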

In this repository we release the code of our Point2Sequence classification and segmentation networks, as well as a few utility scripts for training, testing, and data processing.

Installation

Install TensorFlow. The code is tested with the GPU version of TF 1.4 and Python 2.7 on Ubuntu 16.04. It also depends on a few Python libraries for data processing, such as cv2 and h5py. Access to GPUs is highly recommended. Before running the code, you need to compile the customized TF operators as described in PointNet++.
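Since the customized TF operators come from PointNet++, compiling them typically follows that project's pattern. A hedged sketch of the build fragment (the directory `tf_ops/sampling`, the file names, and the compiler flags are assumptions based on the PointNet++ layout and may differ in your checkout; adjust paths to your CUDA and TensorFlow installation):

```shell
# Locate TensorFlow's headers and libraries from the active Python environment.
TF_INC=$(python -c 'import tensorflow as tf; print(tf.sysconfig.get_include())')
TF_LIB=$(python -c 'import tensorflow as tf; print(tf.sysconfig.get_lib())')

# Example for one op directory (repeat for each customized op).
cd tf_ops/sampling
nvcc tf_sampling_g.cu -o tf_sampling_g.cu.o -c -O2 -DGOOGLE_CUDA=1 \
    -x cu -Xcompiler -fPIC
g++ -std=c++11 tf_sampling.cpp tf_sampling_g.cu.o -o tf_sampling_so.so \
    -shared -fPIC -I "$TF_INC" -L "$TF_LIB" -ltensorflow_framework -O2
```

If a compiled `.so` fails to load at runtime, the usual culprits are a CUDA/TF version mismatch or a missing `-ltensorflow_framework` link flag.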

Usage

Shape Classification

To train a Point2Sequence model to classify ModelNet40 shapes (using point clouds with XYZ coordinates):

    python train.py

To see all optional arguments for training:

    python train.py -h

During training, we also evaluate the performance of the model.

Shape Part Segmentation

To train a model to segment object parts for ShapeNet models:

    cd part_seg
    python train.py

Prepare Your Own Data

Following the dataset format of PointNet++, you can refer to here for how to prepare your own HDF5 files for either classification or segmentation. Alternatively, you can refer to modelnet_dataset.py for how to read raw data files and prepare mini-batches from them.
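As a hedged sketch of what a PointNet++-style HDF5 file can look like (the file name `my_dataset.h5`, the dataset keys `data`/`label`, and the sizes below are illustrative assumptions; check the reference files for the exact layout your loader expects), one can write point clouds and labels with h5py like this:

```python
import h5py
import numpy as np

# Toy data: 16 shapes, 2048 points each, XYZ coordinates, 40 candidate classes.
num_shapes, num_points = 16, 2048
points = np.random.rand(num_shapes, num_points, 3).astype('float32')
labels = np.random.randint(0, 40, size=(num_shapes, 1)).astype('uint8')

# Write one HDF5 file holding the point clouds and their class labels.
with h5py.File('my_dataset.h5', 'w') as f:
    f.create_dataset('data', data=points, compression='gzip')
    f.create_dataset('label', data=labels, compression='gzip')

# Read it back the way a data loader would.
with h5py.File('my_dataset.h5', 'r') as f:
    data, label = f['data'][:], f['label'][:]
```

Storing float32 coordinates and small integer labels keeps the files compact, and gzip compression is optional.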

License

Our code is released under MIT License (see LICENSE file for details).
