
rentainhe / visualization

License: MIT
A collection of visualization functions

Programming Languages

python

Projects that are alternatives to or similar to visualization

Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+115.87%)
Mutual labels:  transformer, attention, attention-mechanism
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-35.98%)
Mutual labels:  transformer, attention, attention-mechanism
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+117.46%)
Mutual labels:  transformer, attention, attention-mechanism
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-69.84%)
Mutual labels:  transformer, attention, attention-mechanism
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+10.58%)
Mutual labels:  transformer, attention, attention-mechanism
semantic-segmentation
SOTA Semantic Segmentation Models in PyTorch
Stars: ✭ 464 (+145.5%)
Mutual labels:  transformer, vision-transformer
NLP-paper
🎨 An NLP (natural language processing) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-87.83%)
Mutual labels:  transformer, attention-mechanism
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using Word Specific models, All word models and Hierarchical models in Tensorflow
Stars: ✭ 33 (-82.54%)
Mutual labels:  attention, attention-mechanism
TransMorph Transformer for Medical Image Registration
TransMorph: Transformer for Unsupervised Medical Image Registration (PyTorch)
Stars: ✭ 130 (-31.22%)
Mutual labels:  transformer, vision-transformer
En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (-30.69%)
Mutual labels:  transformer, attention-mechanism
Linear-Attention-Mechanism
Attention mechanism
Stars: ✭ 27 (-85.71%)
Mutual labels:  attention, attention-mechanism
YOLOS
You Only Look at One Sequence (NeurIPS 2021)
Stars: ✭ 612 (+223.81%)
Mutual labels:  transformer, vision-transformer
learningspoons
nlp lecture-notes and source code
Stars: ✭ 29 (-84.66%)
Mutual labels:  transformer, attention
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-74.07%)
Mutual labels:  transformer, attention
towhee
Towhee is a framework that is dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+334.39%)
Mutual labels:  transformer, vision-transformer
dodrio
Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+23.28%)
Mutual labels:  transformer, attention-mechanism
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-85.19%)
Mutual labels:  transformer, attention
halonet-pytorch
Implementation of the 😇 Attention layer from the paper, Scaling Local Self-Attention For Parameter Efficient Visual Backbones
Stars: ✭ 181 (-4.23%)
Mutual labels:  vision, attention-mechanism
FNet-pytorch
Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms
Stars: ✭ 204 (+7.94%)
Mutual labels:  transformer, vision
TokenLabeling
Pytorch implementation of "All Tokens Matter: Token Labeling for Training Better Vision Transformers"
Stars: ✭ 385 (+103.7%)
Mutual labels:  transformer, vision

visualization

A collection of visualization functions for easier usage; check Usage for a quick start.

New Features

2021/10/4

  • Added the draw_line_chart function; see drawer.py for details

2021/09/29

  • Added pip installation
  • Rebuilt the repo with a cleaner structure

Contents

Visualization Functions

Learning Notes Sharing

Related Blogs

Installation

pip install visualize==0.5.1
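
Note that the package installs as visualize even though the repo is named visualization. A quick import of the two attention functions documented below confirms the install:

python -c "from visualize import visualize_region_attention, visualize_grid_attention_v2"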

Usage

Run Example

For a quick start, clone this repo and run example.py:

git clone https://github.com/rentainhe/visualization.git
cd visualization
python example.py

Results will be saved to ./test_grid_attention and ./test_region_attention.

Region Attention Visualization

Download example.jpg to any folder you like:

$ wget https://raw.githubusercontent.com/rentainhe/visualization/master/visualize/test_data/example.jpg

Build the following Python script for a quick start:

import numpy as np
from visualize import visualize_region_attention

img_path = "path/to/example.jpg"
save_path = "example"
attention_ratio = 1.0
boxes = np.array([[14, 25, 100, 200], [56, 75, 245, 300]], dtype='int')  # one bounding box per row
box_attentions = [0.36, 0.64]  # one attention score per box
visualize_region_attention(img_path,
                           save_path=save_path,
                           boxes=boxes,
                           box_attentions=box_attentions,
                           attention_ratio=attention_ratio,
                           save_image=True,
                           save_origin_image=True,
                           quality=100)
  • img_path: the path of the original image to load
  • boxes: a list of coordinates for the bounding boxes
  • box_attentions: a list of attention scores, one for each bounding box (see the sketch after this list)
  • attention_ratio: a scaling parameter; the larger you set it, the fainter the attention map looks. Just try it!
  • save_image: whether to save the image with the attention map (default: True)
  • save_origin_image: whether to also save the original image (default: True)
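
The attention scores in the example above sum to 1. If your model produces unnormalized per-box scores, a softmax is one common way to build such a list (a minimal sketch; scores_to_attentions is a hypothetical helper, not part of this library):

import numpy as np

def scores_to_attentions(raw_scores):
    # softmax-normalize raw per-box scores so they sum to 1
    scores = np.asarray(raw_scores, dtype="float64")
    exp = np.exp(scores - scores.max())  # subtract the max for numerical stability
    return (exp / exp.sum()).tolist()

box_attentions = scores_to_attentions([0.2, 0.8])  # roughly [0.35, 0.65]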

Note that you can check Region Attention Visualization for more details.

Grid Attention Visualization

Download example.jpg to any folder you like:

$ wget https://raw.githubusercontent.com/rentainhe/visualization/master/visualize/test_data/example.jpg

Build the following Python script for a quick start:

from visualize import visualize_grid_attention_v2
import numpy as np

img_path = "./example.jpg"
save_path = "test"
attention_mask = np.random.randn(14, 14)  # a random mask, just for the demo
visualize_grid_attention_v2(img_path,
                            save_path=save_path,
                            attention_mask=attention_mask,
                            save_image=True,
                            save_original_image=True,
                            quality=100)
  • img_path: the path of the image you want to put an attention mask on
  • save_path: where to save the image
  • attention_mask: the attention mask as a numpy.ndarray of shape (H, W) (see the sketch after this list)
  • save_image: whether to save the image with the attention map (default: True)
  • save_original_image: whether to also save the original image (default: True)
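
In practice the mask comes from a model rather than np.random. A minimal sketch, assuming your model yields flat per-patch attention weights over a 14x14 grid (ViT-style); the reshape and min-max scaling here are illustrative preprocessing, not part of this library:

import numpy as np

patch_attention = np.random.rand(196)  # stand-in for your model's per-patch weights
attention_mask = patch_attention.reshape(14, 14)
# min-max scale to [0, 1] so the heatmap spans the full color range
attention_mask = (attention_mask - attention_mask.min()) / (attention_mask.max() - attention_mask.min() + 1e-8)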

Note that you can check Grid Attention Visualization for more details.

Draw Line Chart

Build the following Python script for a quick start:

from visualize import draw_line_chart

# test data
data1 = {"data": [13.15, 14.64, 15.83, 17.99], "name": "data 1"}
data2 = {"data": [14.16, 14.81, 16.11, 18.62], "name": "data 2"}
data_list = []
data_list.append(data1["data"])
data_list.append(data2["data"])
name_list = []
name_list.append(data1["name"])
name_list.append(data2["name"])
draw_line_chart(data_list=data_list,
                labels=name_list,
                xlabel="test_x",
                ylabel="test_y",
                save_path="./test.jpg",
                legend={"loc": "upper left", "frameon": True, "fontsize": 12},
                title="example")
  • data_list: a list of data series to draw
  • labels: one label for each data series in data_list
  • xlabel: the label of the x-axis
  • ylabel: the label of the y-axis
  • save_path: the path to save the image to
  • legend: the parameters of the legend
  • title: the title of the saved image
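
As a usage example, the same call can compare two training runs side by side (a minimal sketch; the loss values are made up for illustration):

from visualize import draw_line_chart

run_a = {"data": [2.31, 1.87, 1.52, 1.30, 1.18], "name": "baseline"}
run_b = {"data": [2.28, 1.74, 1.39, 1.21, 1.07], "name": "with warmup"}
draw_line_chart(data_list=[run_a["data"], run_b["data"]],
                labels=[run_a["name"], run_b["name"]],
                xlabel="epoch",
                ylabel="training loss",
                save_path="./loss_curves.jpg",
                legend={"loc": "upper right", "frameon": True, "fontsize": 12},
                title="loss comparison")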

You will get a line chart saved to ./test.jpg.
