
chuanqi305 / Focalloss

License: MIT
Caffe implementation of FAIR paper "Focal Loss for Dense Object Detection" for SSD.


Projects that are alternatives of or similar to Focalloss

Openseachest
Cross platform utilities useful for performing various operations on SATA, SAS, NVMe, and USB storage devices.
Stars: ✭ 98 (-43.02%)
Mutual labels:  ssd
Crossplatformdisktest
Windows, macOS and Android storage (HDD, SSD, RAM) speed testing/performance benchmarking app
Stars: ✭ 123 (-28.49%)
Mutual labels:  ssd
Kdiskmark
A simple open-source disk benchmark tool for Linux distros
Stars: ✭ 152 (-11.63%)
Mutual labels:  ssd
Rootonnvme
Switch the rootfs to an NVMe SSD on the Jetson Xavier NX and Jetson AGX Xavier
Stars: ✭ 103 (-40.12%)
Mutual labels:  ssd
Subscribe2clash
Converts v2ray/trojan/ss/ssr/ssd subscriptions into Clash rule configurations, with automatic updates of the ACL4SSR routing rules
Stars: ✭ 112 (-34.88%)
Mutual labels:  ssd
Vehicle Detection
Compare FasterRCNN,Yolo,SSD model with the same dataset
Stars: ✭ 130 (-24.42%)
Mutual labels:  ssd
Tf Object Detection
A simpler app for the TensorFlow Object Detection API
Stars: ✭ 91 (-47.09%)
Mutual labels:  ssd
A Pytorch Tutorial To Object Detection
SSD: Single Shot MultiBox Detector | a PyTorch Tutorial to Object Detection
Stars: ✭ 2,398 (+1294.19%)
Mutual labels:  ssd
Linux
The linux kernel source repository for Open-Channel SSDs
Stars: ✭ 119 (-30.81%)
Mutual labels:  ssd
Ssd keras
A Keras port of Single Shot MultiBox Detector
Stars: ✭ 1,763 (+925%)
Mutual labels:  ssd
Handtracking
Building a Real-time Hand-Detector using Neural Networks (SSD) on Tensorflow
Stars: ✭ 1,506 (+775.58%)
Mutual labels:  ssd
Tabulo
Table Detection and Extraction Using Deep Learning ( It is built in Python, using Luminoth, TensorFlow<2.0 and Sonnet.)
Stars: ✭ 110 (-36.05%)
Mutual labels:  ssd
Ssd pytorch
Supports multiple SSD variants and multi-scale testing; also supports RefineDet.
Stars: ✭ 139 (-19.19%)
Mutual labels:  ssd
Pytorch Ssd
Single Shot MultiBox Detector in PyTorch [deprecated]
Stars: ✭ 100 (-41.86%)
Mutual labels:  ssd
Ssd keras
A concise Keras SSD object detection model (traffic-sign recognition; see the dev branch for the training code)
Stars: ✭ 152 (-11.63%)
Mutual labels:  ssd
Ezfio
Simple NVME/SAS/SATA SSD test framework for Linux and Windows
Stars: ✭ 91 (-47.09%)
Mutual labels:  ssd
Ios tensorflow objectdetection example
An iOS application of Tensorflow Object Detection with different models: SSD with Mobilenet, SSD with InceptionV2, Faster-RCNN-resnet101
Stars: ✭ 126 (-26.74%)
Mutual labels:  ssd
Handpose
A python program to detect and classify hand pose using deep learning techniques
Stars: ✭ 168 (-2.33%)
Mutual labels:  ssd
Proctoring Ai
Software for automatic monitoring in online proctoring
Stars: ✭ 155 (-9.88%)
Mutual labels:  ssd
Mobilenet Ssd
Caffe implementation of Google MobileNet SSD detection network, with pretrained weights on VOC0712 and mAP=0.727.
Stars: ✭ 1,805 (+949.42%)
Mutual labels:  ssd

FocalLoss

Caffe implementation of the FAIR paper "Focal Loss for Dense Object Detection" for SSD. To use it, replace the MultiBoxLoss layer in your SSD training prototxt with a MultiBoxFocalLoss layer:

layer {
  name: "mbox_loss"
  type: "MultiBoxFocalLoss" # change the layer type from "MultiBoxLoss"
  bottom: "mbox_loc"
  bottom: "mbox_conf"
  bottom: "mbox_priorbox"
  bottom: "label"
  top: "mbox_loss"
  include {
    phase: TRAIN
  }
  propagate_down: true
  propagate_down: true
  propagate_down: false
  propagate_down: false
  loss_param {
    normalization: VALID
  }
  focal_loss_param { # set alpha and gamma (defaults: alpha = 0.25, gamma = 2.0)
    alpha: 0.25
    gamma: 2.0
  }
  multibox_loss_param {
    loc_loss_type: SMOOTH_L1
    conf_loss_type: SOFTMAX
    loc_weight: 1.0
    num_classes: 21
    share_location: true
    match_type: PER_PREDICTION
    overlap_threshold: 0.5
    use_prior_for_matching: true
    background_label_id: 0
    use_difficult_gt: true
    neg_pos_ratio: 3.0
    neg_overlap: 0.5
    code_type: CENTER_SIZE
    ignore_cross_boundary_bbox: false
    mining_type: NONE # disable OHEM; focal loss replaces hard example mining
  }
}
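For reference, the loss this layer applies to the classification term is the focal loss from the paper, FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t). A minimal Python sketch (a hypothetical helper for illustration, not part of this repository's Caffe code) shows how the (1 - p_t)^gamma factor down-weights well-classified examples:

```python
import math

def focal_loss(p_t, alpha=0.25, gamma=2.0):
    """Focal loss for a single example, where p_t is the predicted
    probability of the true class: FL = -alpha * (1 - p_t)**gamma * log(p_t)."""
    return -alpha * (1.0 - p_t) ** gamma * math.log(p_t)

# An easy example (p_t = 0.9) contributes far less loss than a hard
# one (p_t = 0.1), so training focuses on the hard, rare positives.
easy = focal_loss(0.9)
hard = focal_loss(0.1)
```

With gamma = 0 and alpha = 1 this reduces to ordinary cross-entropy; raising gamma suppresses the easy-negative contribution further, which is why the layer above can run with mining_type: NONE.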