Distiller: Neural Network Distiller by Intel AI Lab, a Python package for neural network compression research. https://intellabs.github.io/distiller
Deeplearning: Python code for the book Deep Learning (the "flower book"), with mathematical derivations, principle analysis, and source-code-level implementations.
AMP-Regularizer: Code for our paper "Regularizing Neural Networks via Adversarial Model Perturbation", CVPR 2021.
FSCNMF: An implementation of "Fusing Structure and Content via Non-negative Matrix Factorization for Embedding Information Networks".
tulip: Scalable input gradient regularization.
traj-pred-irl: Official implementation code for "Regularizing Neural Networks for Future Trajectory Prediction via IRL Framework".
consistency: Implementation of models in our EMNLP 2019 paper "A Logic-Driven Framework for Consistency of Neural Models".
AI Learning Hub: AI Learning Hub for Machine Learning, Deep Learning, Computer Vision, and Statistics.
Deep-Learning-Specialization-Coursera: Deep Learning Specialization course on Coursera, covering neural networks, deep learning, hyperparameter tuning, regularization, optimization, data processing, convolutional neural networks, and sequence models.
manifold mixup: TensorFlow implementation of the Manifold Mixup machine learning research paper.
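Manifold Mixup regularizes a network by linearly interpolating hidden-layer activations (and the corresponding labels) of random example pairs. A minimal NumPy sketch of the mixing step, assuming one-hot labels; the function name is illustrative, not this repo's API:

```python
import numpy as np

def manifold_mixup(h_a, h_b, y_a, y_b, lam):
    """Interpolate two hidden representations and their one-hot labels.

    lam is the mixing coefficient; in training it is typically drawn
    per batch from a Beta(alpha, alpha) distribution.
    """
    h_mix = lam * h_a + (1.0 - lam) * h_b
    y_mix = lam * y_a + (1.0 - lam) * y_b
    return h_mix, y_mix

# Per-batch usage (alpha is a hyperparameter):
# lam = np.random.beta(alpha, alpha)
# h_mix, y_mix = manifold_mixup(h[i], h[j], y[i], y[j], lam)
```

The forward pass then continues from `h_mix`, and the loss is computed against the soft label `y_mix`.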
L0Learn: Efficient algorithms for L0-regularized learning.
pyowl: Ordered Weighted L1 (OWL) regularization for classification and regression in Python.
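The OWL penalty pairs the coefficient magnitudes, sorted in decreasing order, with a non-increasing weight vector, so the largest weight always multiplies the largest coefficient. A minimal NumPy sketch of evaluating the penalty; the function name is illustrative, not pyowl's API:

```python
import numpy as np

def owl_penalty(beta, w):
    """Ordered Weighted L1 penalty: sum_i w_i * |beta|_(i),
    where |beta|_(1) >= |beta|_(2) >= ... are the sorted magnitudes
    and w is a non-negative, non-increasing weight vector."""
    abs_sorted = np.sort(np.abs(beta))[::-1]  # magnitudes, decreasing
    return float(np.dot(w, abs_sorted))
```

With constant weights this reduces to the ordinary L1 penalty; decreasing weights additionally encourage groups of correlated features to take coefficients of equal magnitude.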
sparsebn: Software for learning sparse Bayesian networks.
Statistical-Learning-using-R: A statistical learning project implementing various machine learning algorithms in R, with in-depth interpretation. Documents and reports on the techniques covered are available on my RPubs profile.
hyperstar: Hyperstar: Negative Sampling Improves Hypernymy Extraction Based on Projection Learning.
SSE-PT: Code and datasets for the RecSys 2020 paper "SSE-PT: Sequential Recommendation via Personalized Transformer" and the NeurIPS 2019 paper "Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers".