
caoyue10 / aaai17-cdq

Licence: other
The implementation of the AAAI-17 paper "Collective Deep Quantization for Efficient Cross-Modal Retrieval"

Programming Languages

  • python: 139,335 projects (#7 most used programming language)
  • shell: 77,523 projects

Projects that are alternatives to or similar to aaai17-cdq

Stochastic-Quantization
Training Low-bits DNNs with Stochastic Quantization
Stars: ✭ 70 (+112.12%)
Mutual labels:  quantization
lixinger-openapi
Lixinger open platform Python API (unofficial)
Stars: ✭ 43 (+30.3%)
Mutual labels:  quantization
autoencoder based image compression
Autoencoder based image compression: can the learning be quantization independent? https://arxiv.org/abs/1802.09371
Stars: ✭ 21 (-36.36%)
Mutual labels:  quantization
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (+24.24%)
Mutual labels:  quantization
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (+69.7%)
Mutual labels:  quantization
image-classification
A collection of SOTA Image Classification Models in PyTorch
Stars: ✭ 70 (+112.12%)
Mutual labels:  quantization
neural-compressor
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression technologies, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks, in pursuit of optimal inference performance.
Stars: ✭ 666 (+1918.18%)
Mutual labels:  quantization
U-Net-Fixed-Point-Quantization-for-Medical-Image-Segmentation
Repository containing code for the "U-Net Fixed-Point Quantization for Medical Image Segmentation" paper at MICCAI 2019
Stars: ✭ 62 (+87.88%)
Mutual labels:  quantization
navec
Compact high quality word embeddings for Russian language
Stars: ✭ 118 (+257.58%)
Mutual labels:  quantization
EfficientIR
An EfficientNet-based local image retrieval tool
Stars: ✭ 64 (+93.94%)
Mutual labels:  similarity-search
pngquant
A Python Wrapper of Pngquant
Stars: ✭ 27 (-18.18%)
Mutual labels:  quantization
camalian
Library used to deal with colors and images. You can extract colors from images.
Stars: ✭ 45 (+36.36%)
Mutual labels:  quantization
ppq
PPL Quantization Tool (PPQ) is a powerful offline neural network quantization tool.
Stars: ✭ 281 (+751.52%)
Mutual labels:  quantization
MoTIS
Mobile (iOS) text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (+81.82%)
Mutual labels:  cross-modal
tensorflow-quantization-example
TensorFlow Quantization Example, for TensorFlow Lite
Stars: ✭ 19 (-42.42%)
Mutual labels:  quantization
cross-modal-hasing-playground
Python implementation of cross-modal hashing algorithms
Stars: ✭ 19 (-42.42%)
Mutual labels:  cross-modal
pause
🍊 PAUSE (Positive and Annealed Unlabeled Sentence Embedding), accepted by EMNLP'2021 🌴
Stars: ✭ 19 (-42.42%)
Mutual labels:  similarity-search
KGySoft.Drawing
KGy SOFT Drawing is a library for advanced image, icon and graphics handling.
Stars: ✭ 27 (-18.18%)
Mutual labels:  quantization
optimum
🏎️ Accelerate training and inference of 🤗 Transformers with easy-to-use hardware optimization tools
Stars: ✭ 567 (+1618.18%)
Mutual labels:  quantization
apollo
Proof of concept for advanced source-code similarity and duplicate detection, built for our research efforts.
Stars: ✭ 49 (+48.48%)
Mutual labels:  similarity-search

aaai17-cdq

This is the TensorFlow (version 0.11) implementation of the AAAI-17 paper "Collective Deep Quantization for Efficient Cross-Modal Retrieval". The files in this directory are described below; a toy sketch of the quantized retrieval idea follows the list.

  • cdq.py: the main implementation of the proposed CDQ approach.
  • train_script.py: an example showing how to train a CDQ model.
  • validation_script.py: an example showing how to evaluate a trained quantization model.
  • run_cdq.sh: an example showing the full procedure of training and evaluating CDQ.
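
To make the retrieval idea concrete, here is a toy, self-contained sketch of quantized cross-modal search with a shared codebook, in the spirit of CDQ but not the repository's actual code; the single codebook, hard assignments, and all names and shapes are simplifying assumptions for illustration.

    # Toy sketch of shared-codebook quantized retrieval (NOT the repo's code).
    # One codebook with hard assignments is a simplification; CDQ itself learns
    # deep representations and compositional codes end-to-end.
    import numpy as np

    rng = np.random.default_rng(0)
    codebook = rng.normal(size=(256, 64))    # 256 codewords, 64-d features
    image_db = rng.normal(size=(200, 64))    # stand-in image embeddings
    text_query = rng.normal(size=(64,))      # stand-in text query embedding

    # Encode each database item as the index of its nearest codeword.
    dists = np.linalg.norm(image_db[:, None, :] - codebook[None, :, :], axis=2)
    codes = dists.argmin(axis=1)

    # Asymmetric search: raw query vs. reconstructed (quantized) database.
    scores = codebook[codes] @ text_query
    top10 = np.argsort(-scores)[:10]

In the actual method, multiple codebooks are learned jointly with the deep networks; the sketch only shows why compact codes still support inner-product search across modalities.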

Data Preparation

In data/nuswide/train.txt and data/nuswide/text_train.txt, we give examples of how to prepare the image and text training data. The corresponding test and database lists in data/nuswide/test.txt, data/nuswide/text_test.txt, data/nuswide/database.txt, and data/nuswide/text_database.txt are processed during the prediction procedure.
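
The exact line format is defined by the repository's data loaders; purely as an illustration, assuming one sample per line with a path (or feature file) followed by multi-hot labels, such a list file could be parsed like this:

    # Hypothetical parser for a list file of the assumed form
    #   <path> <l_1> <l_2> ... <l_C>
    # (one sample per line, multi-hot labels). This format is an assumption,
    # not taken from the repository; adapt it to the actual files.
    def read_list(list_path):
        samples = []
        with open(list_path) as f:
            for line in f:
                parts = line.split()
                if parts:  # skip blank lines
                    samples.append((parts[0], [int(x) for x in parts[1:]]))
        return samples

    train = read_list("data/nuswide/train.txt")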

Training the Model and Predicting

AlexNet is used as the pre-trained model. Once the NUS-WIDE dataset and the pre-trained caffemodel are prepared, the example can be run with the following command:

    ./run_cdq.sh

Citation

@inproceedings{DBLP:conf/aaai/CaoL0L17,
  author    = {Yue Cao and
               Mingsheng Long and
               Jianmin Wang and
               Shichen Liu},
  title     = {Collective Deep Quantization for Efficient Cross-Modal Retrieval},
  booktitle = {Proceedings of the Thirty-First {AAAI} Conference on Artificial Intelligence,
               February 4-9, 2017, San Francisco, California, {USA.}},
  pages     = {3974--3980},
  year      = {2017},
  crossref  = {DBLP:conf/aaai/2017},
  url       = {http://aaai.org/ocs/index.php/AAAI/AAAI17/paper/view/14499},
  timestamp = {Mon, 06 Mar 2017 11:36:24 +0100},
  biburl    = {http://dblp2.uni-trier.de/rec/bib/conf/aaai/CaoL0L17},
  bibsource = {dblp computer science bibliography, http://dblp.org}
}