mazzzystar / Proxy

A simple tool for fetching usable proxies from several websites.

Programming Languages

python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to Proxy

Proxybroker
Proxy [Finder | Checker | Server]. HTTP(S) & SOCKS 🎭
Stars: ✭ 2,767 (+2131.45%)
Mutual labels:  proxies, proxy-list, proxypool
proxi
Proxy pool. Finds and checks proxies, with a REST API for querying results. Can find over 25k proxies in under 5 minutes.
Stars: ✭ 32 (-74.19%)
Mutual labels:  web-crawler, proxypool, proxy-list
torchestrator
Spin up Tor containers and then proxy HTTP requests via these Tor instances
Stars: ✭ 32 (-74.19%)
Mutual labels:  proxies, proxypool, proxy-list
Free Proxy List
Free proxy list, free proxy world - visit our website
Stars: ✭ 133 (+7.26%)
Mutual labels:  proxy-list, proxypool
Spoon
🥄 A package for building specific Proxy Pool for different Sites.
Stars: ✭ 173 (+39.52%)
Mutual labels:  proxies, proxypool
proxy fetcher
💪 Ruby / JRuby / TruffleRuby gem & CLI for dealing with proxy lists from various sources
Stars: ✭ 119 (-4.03%)
Mutual labels:  proxies, proxy-list
Proxy Scraper
Proxy-Scraper is a simple Perl script for scraping proxies from multiple websites.
Stars: ✭ 24 (-80.65%)
Mutual labels:  proxies, proxy-list
proxy-scrape
Scrapin' proxies with OCR
Stars: ✭ 20 (-83.87%)
Mutual labels:  proxies, proxy-list
proxy-list
A curated list of free public proxy servers
Stars: ✭ 70 (-43.55%)
Mutual labels:  proxies, proxy-list
ProxyGrab
An asynchronous library built with Python and aiohttp to get proxies from multiple services!
Stars: ✭ 17 (-86.29%)
Mutual labels:  proxies, proxy-list
socks5 list
Auto-updated SOCKS5 proxy list + proxies for Telegram
Stars: ✭ 210 (+69.35%)
Mutual labels:  proxies, proxy-list
Pspider
A simple and easy-to-use Python web crawler framework. QQ group: 597510560
Stars: ✭ 1,611 (+1199.19%)
Mutual labels:  web-crawler, proxies
Proxy Scraper
Library for scraping free proxy lists
Stars: ✭ 78 (-37.1%)
Mutual labels:  proxy-list
Edge
A set of useful libraries for Edge Apps. Run locally, write tests, and integrate it into your deployment process. Move fast and maybe don't break things? Because, gosh darnit, you're an adult.
Stars: ✭ 105 (-15.32%)
Mutual labels:  proxies
Ospider
An open-source tool for acquiring and preprocessing vector geographic data (POI / AOI / administrative districts / road networks / land use)
Stars: ✭ 74 (-40.32%)
Mutual labels:  web-crawler
Cvpr2019
Displays all the 2019 CVPR accepted papers in a way that makes them easy to parse.
Stars: ✭ 65 (-47.58%)
Mutual labels:  web-crawler
Baiducrawler
A sample of using proxies to crawl Baidu search results.
Stars: ✭ 116 (-6.45%)
Mutual labels:  proxies
Freeproxy
Free, high-speed V2Ray proxies and subscriptions.
Stars: ✭ 104 (-16.13%)
Mutual labels:  proxy-list
Proxypool
An IP proxy pool implemented in Golang
Stars: ✭ 1,134 (+814.52%)
Mutual labels:  proxypool
Abotx
Cross-platform C# web crawler framework with a headless browser and parallel crawling. Please star this project! +1.
Stars: ✭ 63 (-49.19%)
Mutual labels:  web-crawler

Proxy

A tiny tool for crawling, assessing, and storing useful proxies. A Chinese version of this README (中文版) is available.

Construct your IP pool

Install the Python dependencies (a running MySQL server is also required):

pip install pymysql requests

Modify the database connection information in config.py.
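
For reference, here is a minimal sketch of what config.py might contain. Only TABLE_NAME is confirmed by the demo code below; every other name is an assumed placeholder, so match it against the actual file:

# config.py -- hypothetical sketch. Only TABLE_NAME appears in the demo
# code; the remaining names are assumed placeholders.
DB_HOST = '127.0.0.1'          # MySQL host
DB_PORT = 3306                 # MySQL port
DB_USER = 'root'               # database user
DB_PASSWORD = 'your_password'  # database password
DB_NAME = 'proxy_pool'         # database holding the proxy table
TABLE_NAME = 'ip_pool'         # table the crawler stores proxies in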

# crawl, assess, and store proxies
python ip_pool.py

# periodically re-assess the quality of proxies already in the db
python assess_quality.py
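
As an illustration of what such a periodic quality check can look like (a hedged sketch of the idea, not the actual logic in assess_quality.py): route a test request through each proxy and record its latency.

import time
import requests

def assess_proxy(ip_port, test_url='https://www.baidu.com', timeout=4):
    """Return the proxy's latency in seconds, or None if it failed.

    Illustrative sketch only: the test URL, timeout, and scoring are
    assumptions, not the project's actual assessment logic.
    """
    proxy = {'http': 'http://' + ip_port, 'https': 'http://' + ip_port}
    start = time.time()
    try:
        r = requests.get(test_url, proxies=proxy, timeout=timeout)
        if r.status_code == 200:
            return time.time() - start
    except requests.RequestException:
        pass
    return None

Proxies that repeatedly return None can then be deleted from the table.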

Demo of how to use these proxies

First construct your IP pool as described above.

Crawl the GitHub homepage:

import pymysql
import requests

import config as cfg

# open the database connection configured in config.py
# (the DB_* field names here are assumptions; adjust to the actual file)
conn = pymysql.connect(host=cfg.DB_HOST, port=cfg.DB_PORT, user=cfg.DB_USER,
                       password=cfg.DB_PASSWORD, database=cfg.DB_NAME)
cursor = conn.cursor()

# query the database for all stored proxies
ip_list = []
try:
    cursor.execute('SELECT content FROM %s' % cfg.TABLE_NAME)
    for (content,) in cursor.fetchall():
        ip_list.append(content)
except Exception as e:
    print(e)
finally:
    cursor.close()
    conn.close()

# use these proxies to crawl the website
for ip in ip_list:
    proxy = {'http': 'http://' + ip}
    url = "https://www.github.com/"
    r = requests.get(url, proxies=proxy, timeout=4)
    print(r.text)

More detail can be found in crawl_demo.py.
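
Free proxies go stale quickly, so in practice it helps to skip dead entries instead of letting the first timeout raise. A minimal sketch of that pattern, reusing ip_list from the demo above (the skip-and-continue policy is an illustration, not part of crawl_demo.py):

import requests

url = "https://www.github.com/"
for ip in ip_list:  # ip_list loaded from the database as shown above
    proxy = {'http': 'http://' + ip}
    try:
        r = requests.get(url, proxies=proxy, timeout=4)
        r.raise_for_status()
        print('fetched %d bytes via %s' % (len(r.text), ip))
        break  # stop at the first working proxy
    except requests.RequestException as e:
        print('proxy %s failed: %s' % (ip, e))  # dead proxy, try the next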

Contact

[email protected]
