
Liu233w / Acm Statistics

License: AGPL-3.0
An online tool (crawler) that analyzes users' performance on online judges (competitive programming websites). Supported OJs: POJ, HDU, ZOJ, HYSBZ, CodeForces, UVA, ICPC Live Archive, FZU, SPOJ, Timus (URAL), LeetCode_CN, CSU, LibreOJ, 洛谷, 牛客OJ, Lutece (UESTC), AtCoder, AIZU, CodeChef, El Judge, BNUOJ, Codewars, UOJ, NBUT, 51Nod, DMOJ, VJudge



This repo contains the source code of OJ Hunt

Simplified Chinese version: README_zh-hans.md


Features

  • Querying accepted problems and submissions from online judges
  • Storing query history

Under development

  • Email support
  • Ranks
  • ……

Directory structure

  • frontend: The front end
  • crawler: Crawlers that query the OJs, used by both the frontend and the backend
  • crawler-api-backend: A microservice that provides the querying API
  • e2e: End-to-end tests
  • backend: The back end, a monolithic service
  • captcha-service: A microservice that provides captcha support
  • ohunt: A stateful, standalone crawler microservice used to support certain OJs such as ZOJ
  • build: Code to build and deploy the project. Toolchain: Docker, docker-compose, GNU Make
  • tools: Utility scripts and config files used in operations

See the README file in each module for detailed documentation.

Developing and deploying in docker

  • The project requires Docker and docker-compose to function correctly.

Development

  • This project uses a Makefile to manage dependencies between modules. Run make help in the repository root to view the documentation.
  • GNU Make is required.

Deploy

There are two ways to deploy this project on a server.

One-liner

Execute the following command in a shell to deploy the project on port 3000.

curl -s https://raw.githubusercontent.com/Liu233w/acm-statistics/master/tools/remote-docker-up.sh | bash

The VJudge crawler is not available with this method.

Config file version

This method lets you customise the configuration, enabling all features.

# Create a folder to store the config files
mkdir -p ~/www/acm-statistics
cd ~/www/acm-statistics
# Download the runner script and make it executable
curl https://raw.githubusercontent.com/Liu233w/acm-statistics/master/tools/remote-docker-up.sh -o run.sh
chmod +x run.sh
# Run the script once to generate the configuration file. It will exit after printing `.env file created, remember to edit it`.
./run.sh
# Edit the config file following the descriptions in it
vim .env
# Now run the project with the script
./run.sh

Then you can use a tool such as systemd to run ./run.sh.

./tools/acm-statistics.service is a template systemd unit file.
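As a rough illustration, a minimal unit file in the spirit of that template might look like the one below. The unit name, paths, and the dependency on docker.service are all assumptions made for the sketch; prefer the shipped ./tools/acm-statistics.service.

```shell
# Hypothetical, minimal systemd unit for run.sh. Everything below (paths,
# Description, dependency on docker.service) is an assumption for illustration;
# the real template is ./tools/acm-statistics.service.
cat > /tmp/acm-statistics.service.example <<'EOF'
[Unit]
Description=acm-statistics (OJ Hunt)
After=docker.service
Requires=docker.service

[Service]
Type=simple
WorkingDirectory=/home/someuser/www/acm-statistics
ExecStart=/home/someuser/www/acm-statistics/run.sh
Restart=on-failure

[Install]
WantedBy=multi-user.target
EOF
# On a real server the unit would be copied to /etc/systemd/system and
# enabled with `systemctl enable --now acm-statistics` (not run here).
grep -c '^\[' /tmp/acm-statistics.service.example  # counts the three sections
```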

run.sh checks for updates when it starts. If template.env has been updated, run.sh exits and asks you to compare the two files. The script detects updates by comparing the line counts of the two files, so make sure the line counts stay identical when editing.
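The line-count check can be sketched as follows. This is only an illustration of the idea, not the actual code in run.sh, and the variable names in the demo files are invented:

```shell
# Stand-in files for template.env and .env (contents invented for the demo)
printf 'DB_PASSWORD=\nADMINER_URL=/adminer\n' > /tmp/template.env.demo
printf 'DB_PASSWORD=secret\nADMINER_URL=/adminer\n' > /tmp/env.demo
# run.sh refuses to start when the line counts differ, roughly like this:
if [ "$(wc -l < /tmp/template.env.demo)" -eq "$(wc -l < /tmp/env.demo)" ]; then
  echo "env file is up to date"
else
  echo "template.env has changed, please compare it with .env" >&2
fi
```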

Management

  • Set the URL of Adminer in the .env file. It is /adminer by default.
    • You can view and edit the database via Adminer.
    • The database name is acm_statistics and the username is root. The password can be set in .env.
  • Backups are created automatically at 3:00 AM each day and stored in the db-backup folder, which sits inside the folder that contains the config files.
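For example, the newest daily backup can be located with standard shell tools. The directory layout and the date-stamped file names below are assumptions made for the sketch; check the db-backup folder on your server for the actual naming scheme.

```shell
# Demo directory standing in for <config folder>/db-backup; the file names
# are invented for illustration.
demo=/tmp/db-backup-demo
mkdir -p "$demo"
touch "$demo/2024-05-01.sql.gz" "$demo/2024-05-02.sql.gz"
# Date-stamped names sort lexicographically, so the last entry is the newest
latest="$(ls -1 "$demo" | sort | tail -n 1)"
echo "newest backup: $latest"
```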

License

  • All source code except the code in crawler/crawlers is under the AGPL-3.0 license.
  • The code in crawler/crawlers is under the BSD 2-Clause license.

Contribution

  • All contributions, especially crawlers, are welcome.
  • Please follow the Commit Message Conventions when writing git commit messages.
  • You may use cz-cli to help write commit messages.

Contributors ✨

Thanks go to these wonderful people (emoji key):


Adelard Collins
🐛

BackSlashDelta
🐛

Bodhisatan_Yao
🐛

Geekxiong
🤔

Halorv
🤔

Kido Zhang
🚇 🤔

Liu233w
💻 🤔 🚇 ⚠️

Meulsama
🤔

Michael Xiang
🐛

Zhao
🐛

bluebear4
🐛

ct
🐛

flylai
💻 🐛

fzu-h4cky
🐛

zby
🤔 🐛

This project follows the all-contributors specification. Contributions of any kind are welcome!
