
gengyanlei / ssgan

Licence: other
Semi Supervised Semantic Segmentation Using Generative Adversarial Network; PyTorch


Projects that are alternatives of or similar to ssgan

Unet Pytorch
U-Net implementation for PyTorch based on https://arxiv.org/abs/1505.04597
Stars: ✭ 229 (+816%)
Mutual labels:  semantic-segmentation
image-segmentation
Mask R-CNN, FPN, LinkNet, PSPNet, and UNet with ready-made support for multiple backbone architectures
Stars: ✭ 62 (+148%)
Mutual labels:  semantic-segmentation
multiclass-semantic-segmentation
Experiments with UNET/FPN models and cityscapes/kitti datasets [Pytorch]
Stars: ✭ 96 (+284%)
Mutual labels:  semantic-segmentation
Fast Scnn Pytorch
A PyTorch Implementation of Fast-SCNN: Fast Semantic Segmentation Network
Stars: ✭ 239 (+856%)
Mutual labels:  semantic-segmentation
Clan
( CVPR2019 Oral ) Taking A Closer Look at Domain Shift: Category-level Adversaries for Semantics Consistent Domain Adaptation
Stars: ✭ 248 (+892%)
Mutual labels:  semantic-segmentation
PixelPick
[ICCVW'21] All you need are a few pixels: semantic segmentation with PixelPick
Stars: ✭ 59 (+136%)
Mutual labels:  semantic-segmentation
Deep Learning In Production
Develop production ready deep learning code, deploy it and scale it
Stars: ✭ 216 (+764%)
Mutual labels:  semantic-segmentation
RGBD-semantic-segmentation
A paper list for RGBD semantic segmentation (work in progress)
Stars: ✭ 264 (+956%)
Mutual labels:  semantic-segmentation
VT-UNet
[MICCAI2022] An official PyTorch implementation of "A Robust Volumetric Transformer for Accurate 3D Tumor Segmentation"
Stars: ✭ 151 (+504%)
Mutual labels:  semantic-segmentation
MCIS wsss
Code for ECCV 2020 paper (oral): Mining Cross-Image Semantics for Weakly Supervised Semantic Segmentation
Stars: ✭ 151 (+504%)
Mutual labels:  semantic-segmentation
Adaptive affinity fields
Adaptive Affinity Fields for Semantic Segmentation
Stars: ✭ 240 (+860%)
Mutual labels:  semantic-segmentation
Cocostuff10k
The official homepage of the (outdated) COCO-Stuff 10K dataset.
Stars: ✭ 248 (+892%)
Mutual labels:  semantic-segmentation
nobrainer
A framework for developing neural network models for 3D image processing.
Stars: ✭ 123 (+392%)
Mutual labels:  semantic-segmentation
Decouplesegnets
Implementation of Our ECCV2020-work: Improving Semantic Segmentation via Decoupled Body and Edge Supervision
Stars: ✭ 232 (+828%)
Mutual labels:  semantic-segmentation
etos-deepcut
Deep Extreme Cut (http://www.vision.ee.ethz.ch/~cvlsegmentation/dextr): a tool for automatic object segmentation from extreme points.
Stars: ✭ 24 (-4%)
Mutual labels:  semantic-segmentation
Asis
Associatively Segmenting Instances and Semantics in Point Clouds, CVPR 2019
Stars: ✭ 228 (+812%)
Mutual labels:  semantic-segmentation
ResUNetPlusPlus-with-CRF-and-TTA
ResUNet++, CRF, and TTA for segmentation of medical images (IEEE JBIHI)
Stars: ✭ 98 (+292%)
Mutual labels:  semantic-segmentation
tf-semantic-segmentation-FCN-VGG16
Semantic segmentation for road classification; "Fully Convolutional Networks for Semantic Segmentation" (2015) implemented in TensorFlow
Stars: ✭ 30 (+20%)
Mutual labels:  semantic-segmentation
K-Net
[NeurIPS2021] Code Release of K-Net: Towards Unified Image Segmentation
Stars: ✭ 434 (+1636%)
Mutual labels:  semantic-segmentation
Semantic Segmentation
Semantic segmentation using a fully convolutional neural network.
Stars: ✭ 60 (+140%)
Mutual labels:  semantic-segmentation

ssgan

Semi Supervised Semantic Segmentation Using Generative Adversarial Network; PyTorch

Environment

    Python: 3.5
    PyTorch: 0.4.0

Note

    Since the paper does not release its code, and it describes a SEMI-GAN for
    "segmentation" (similar in places to classification SEMI-GANs, but still
    very different from them), this repository reimplements the paper's
    semi-supervised segmentation GAN after studying several classification
    SEMI-GAN implementations; a rough sketch of the idea follows below.
    If you find any problems, please point them out. Thank you.
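
At a high level (my reading of the paper's approach, not necessarily this repository's exact code), the discriminator is a pixel-wise classifier with K+1 output channels: K semantic classes plus one extra "fake" channel. Below is a minimal sketch of such a semi-supervised discriminator loss; all names in it are hypothetical:

    import torch
    import torch.nn.functional as F

    def discriminator_loss(logits_labeled, labels,
                           logits_unlabeled, logits_fake, num_classes):
        # Each logits tensor has shape (N, K+1, H, W); channel K means "fake".
        # Labeled images: ordinary pixel-wise cross-entropy on the K real classes.
        loss_labeled = F.cross_entropy(logits_labeled, labels, ignore_index=255)

        # Unlabeled images: each pixel should belong to *some* real class,
        # so push P(fake) down, i.e. maximize log(1 - P(fake)).
        p_fake = F.softmax(logits_unlabeled, dim=1)[:, num_classes]
        loss_unlabeled = -torch.log(1.0 - p_fake + 1e-8).mean()

        # Generated images: every pixel should be classified as fake (class K).
        n, _, h, w = logits_fake.shape
        fake_target = torch.full((n, h, w), num_classes,
                                 dtype=torch.long, device=logits_fake.device)
        loss_fake = F.cross_entropy(logits_fake, fake_target)

        return loss_labeled + loss_unlabeled + loss_fake

The generator is trained with the opposite objective, pushing the pixels of its samples toward the real classes rather than the fake channel.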
    
    Note: when testing, remember to call model.eval() and wrap inference in
    with torch.no_grad(): (see the sketch below).
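
In other words, evaluation should look roughly like the sketch below; the model and input here are hypothetical stand-ins for the repository's network and data:

    import torch
    import torch.nn as nn

    # Hypothetical stand-ins; replace with this repository's segmentor and data.
    model = nn.Conv2d(3, 21, kernel_size=1)   # toy "segmentor" with 21 classes
    image = torch.randn(1, 3, 256, 256)       # one RGB input image

    model.eval()              # switch BatchNorm/Dropout to inference behaviour
    with torch.no_grad():     # disable gradient tracking during testing
        logits = model(image)          # (N, K, H, W) per-pixel class scores
        pred = logits.argmax(dim=1)    # (N, H, W) predicted label map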

Refer

    Paper: Semi Supervised Semantic Segmentation Using Generative Adversarial Network (N. Souly, C. Spampinato, M. Shah; ICCV 2017)
