
tampe125 / mongodb-scraper

Licence: other
Scrapes publicly accessible MongoDB instances and dumps user passwords

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to mongodb-scraper

Spydan
A web spider for shodan.io without using the Developer API.
Stars: ✭ 30 (-9.09%)
Mutual labels:  scraper, shodan
Gosint
OSINT Swiss Army Knife
Stars: ✭ 401 (+1115.15%)
Mutual labels:  scraper, shodan
pysoundcloud
Scraping the Un-scrapable™
Stars: ✭ 63 (+90.91%)
Mutual labels:  scraper
extract-css
Extract all CSS from a webpage, packaged as a Now V2 Lambda
Stars: ✭ 23 (-30.3%)
Mutual labels:  scraper
Android-Apps-Downloader
📱 A tool to download android apps from Google Play Store and Xiaomi App Store (the famous Chinese Store).
Stars: ✭ 16 (-51.52%)
Mutual labels:  scraper
ogcheckr-api
An api to check social media username availability on a variety of services
Stars: ✭ 18 (-45.45%)
Mutual labels:  scraper
the-weather-scraper
A Lightweight Weather Scraper
Stars: ✭ 56 (+69.7%)
Mutual labels:  scraper
scoopi-scraper
Scoopi Web Scraper is a heavy duty tool to extract data from HTML pages.
Stars: ✭ 18 (-45.45%)
Mutual labels:  scraper
bot
Completely free and open-source human-like Instagram bot. Powered by UIAutomator2 and compatible with basically any Android device 5.0+ that can run Instagram - real or emulated.
Stars: ✭ 321 (+872.73%)
Mutual labels:  scraper
discord-music
Discord music bot written in Typescript
Stars: ✭ 12 (-63.64%)
Mutual labels:  scraper
netsploit
📡 A security research tool with shodan integration
Stars: ✭ 25 (-24.24%)
Mutual labels:  shodan
whatsapp-tracking
Scraping the status of WhatsApp contacts
Stars: ✭ 49 (+48.48%)
Mutual labels:  scraper
linkedinscraper
LinkedinScraper is another information-gathering tool written in Python. You can scrape a company's employees on Linkedin.com and collect their names, titles, and emails.
Stars: ✭ 22 (-33.33%)
Mutual labels:  scraper
papercut
Papercut is a scraping/crawling library for Node.js built on top of JSDOM. It provides basic selector features together with features like Page Caching and Geosearch.
Stars: ✭ 15 (-54.55%)
Mutual labels:  scraper
sp-subway-scraper
🚆 This web scraper builds a dataset for São Paulo subway operation status
Stars: ✭ 24 (-27.27%)
Mutual labels:  scraper
Episode-ReName
Automated renaming tool for TV series and anime episodes
Stars: ✭ 89 (+169.7%)
Mutual labels:  scraper
bing-ip2hosts
bingip2hosts is a Bing.com web scraper that discovers websites by IP address
Stars: ✭ 99 (+200%)
Mutual labels:  scraper
linky
Yet Another LInkedIn Scraper...
Stars: ✭ 44 (+33.33%)
Mutual labels:  scraper
CamHell
Ingenic T10 IP camera crawler
Stars: ✭ 53 (+60.61%)
Mutual labels:  shodan
naver movie scraper
Scraper for Naver movie information and user-written reviews/ratings
Stars: ✭ 27 (-18.18%)
Mutual labels:  scraper

MongoDB Scraper

So, according to Shodan, there are more than 30k MongoDB instances publicly reachable on the standard port. Many of them run with default settings (i.e. no authentication required).

What if we start scraping them all and dumping the passwords?
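
The idea is simple enough to sketch in a few lines of pymongo. The helper below is purely illustrative and not part of this project: an unauthenticated instance will happily list its databases, while a secured one raises an error.

# Hypothetical helper, not part of mongodb-scraper: probe one host for an
# open, unauthenticated MongoDB instance and list its databases.
from pymongo import MongoClient
from pymongo.errors import PyMongoError

def probe(ip, port=27017, timeout_ms=5000):
    try:
        client = MongoClient(ip, port, serverSelectionTimeoutMS=timeout_ms)
        # Listing databases forces a round trip; a secured server fails here
        return client.list_database_names()
    except PyMongoError:
        return None

print(probe("127.0.0.1"))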

Requirements

pip install pymongo
pip install colorlog

Usage

First of all, create a data.json file containing a JSON-encoded array of IPs:

["123.456.789", "987.654.321"]

If you have downloaded a report from Shodan, you can easily parse it using the bundled parse_data.py script.
Then simply run the scraper with the following command:

python mongodb-scraper.py

You can supply a comma-separated list of IPs with the additional --skip argument to manually mark some IPs as processed and thus exclude them from the stack:

python mongodb-scraper.py --skip "123.123.123,123.456.789"
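
For reference, this is roughly how such a flag could be handled; it is only a sketch and not necessarily how mongodb-scraper.py implements it:

# Illustrative only: filter the IP stack loaded from data.json using a
# comma-separated --skip flag.
import argparse
import json

parser = argparse.ArgumentParser()
parser.add_argument("--skip", default="", help="comma-separated IPs to mark as processed")
args = parser.parse_args()

skipped = {ip.strip() for ip in args.skip.split(",") if ip.strip()}

with open("data.json") as handle:
    stack = [ip for ip in json.load(handle) if ip not in skipped]

print("%d IPs left to process" % len(stack))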

Get alerts on juicy results

If you want to get an email when you find a BIG DUMP (by default, more than 1M rows), simply copy the settings-dist.json file, rename it to settings.json, and fill in all the fields.
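
For illustration only, the alert mechanism could be reduced to something like the sketch below; the settings keys shown are assumptions, so check settings-dist.json for the real field names.

# Illustrative sketch of the alert idea: send an email once a dump exceeds
# the row threshold. The settings keys used here (sender, recipient,
# smtp_host, ...) are assumptions, not the project's actual configuration.
import json
import smtplib
from email.mime.text import MIMEText

THRESHOLD = 1000000  # "BIG DUMP" threshold mentioned above

def notify(ip, rows):
    if rows <= THRESHOLD:
        return
    with open("settings.json") as handle:
        settings = json.load(handle)

    msg = MIMEText("Found %d rows of credentials on %s" % (rows, ip))
    msg["Subject"] = "mongodb-scraper: big dump found"
    msg["From"] = settings["sender"]
    msg["To"] = settings["recipient"]

    with smtplib.SMTP(settings["smtp_host"], settings.get("smtp_port", 587)) as server:
        server.starttls()
        server.login(settings["username"], settings["password"])
        server.send_message(msg)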
