OWASP / D4n155
License: GPL-3.0
OWASP D4N155 - Intelligent and dynamic wordlist using OSINT
Stars: ✭ 105
Programming Languages
shell
77523 projects
Projects that are alternatives of or similar to D4n155
Sitedorks
Search Google/Bing/Ecosia/DuckDuckGo/Yandex/Yahoo for a search term with a default set of websites, bug bounty programs or a custom collection.
Stars: ✭ 221 (+110.48%)
Mutual labels: google, duckduckgo, osint
Search Deflector
A small program that forwards searches from Cortana to your preferred browser and search engine.
Stars: ✭ 620 (+490.48%)
Mutual labels: google, duckduckgo, tool
Lulu
[Unmaintained] A simple and clean video/music/image downloader 👾
Stars: ✭ 789 (+651.43%)
Mutual labels: crawler, scraping
Pagodo
pagodo (Passive Google Dork) - Automate Google Hacking Database scraping and searching
Stars: ✭ 603 (+474.29%)
Mutual labels: google, osint
Photon
Incredibly fast crawler designed for OSINT.
Stars: ✭ 8,332 (+7835.24%)
Mutual labels: crawler, osint
Awesome Python Primer
A curated index of high-quality Chinese resources for learning Python from scratch, including books, documentation, and videos, covering crawling, web development, data analysis, and machine learning.
Stars: ✭ 57 (-45.71%)
Mutual labels: crawler, scraping
Is Google
Verify that a request is from Google crawlers using Google's DNS verification steps
Stars: ✭ 82 (-21.9%)
Mutual labels: google, crawler
Geziyor
Geziyor, a fast web crawling & scraping framework for Go. Supports JS rendering.
Stars: ✭ 1,246 (+1086.67%)
Mutual labels: crawler, scraping
Dotnetcrawler
DotnetCrawler is a straightforward, lightweight web crawling/scraping library built on .NET Core that writes its output through Entity Framework Core. It is designed along the lines of established crawler libraries such as WebMagic and Scrapy, while remaining extensible for custom requirements. Medium article: https://medium.com/@mehmetozkaya/creating-custom-web-crawler-with-dotnet-core-using-entity-framework-core-ec8d23f0ca7c
Stars: ✭ 100 (-4.76%)
Mutual labels: crawler, scraping
Easy Scraping Tutorial
Simple but useful Python web scraping tutorial code.
Stars: ✭ 583 (+455.24%)
Mutual labels: crawler, scraping
Headless Chrome Crawler
Distributed crawler powered by Headless Chrome
Stars: ✭ 5,129 (+4784.76%)
Mutual labels: crawler, scraping
Autocrawler
Google, Naver multiprocess image web crawler (Selenium)
Stars: ✭ 957 (+811.43%)
Mutual labels: google, crawler
Ghunt
🕵️‍♂️ Investigate Google emails and documents.
Stars: ✭ 10,489 (+9889.52%)
Mutual labels: google, osint
Crawly
Crawly, a high-level web crawling & scraping framework for Elixir.
Stars: ✭ 440 (+319.05%)
Mutual labels: crawler, scraping
Scrapple
A framework for creating semi-automatic web content extractors
Stars: ✭ 464 (+341.9%)
Mutual labels: crawler, scraping
Googledriveuploadtool
A tool for Windows to upload and manage files in Google Drive. It resumes uploads in case of an error or failure. Perfect for uploading large files or if your connection is unstable.
Stars: ✭ 58 (-44.76%)
Mutual labels: google, tool
OWASP D4N155
It's an information security audit tool that creates intelligent wordlists based on the content of the target page.
Help us | See some calculations used
Ongoing projects 👷: D4N155 in docker 🎁, Web API D4N155 ☁️, Telegram bot 🤖
Install
Required: Python 3.6, Bash (GNU Bourne-Again SHell), Go
Optional: Git
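Before installing, it may help to confirm the prerequisites are available. A minimal sketch (assuming the interpreters are named python3, bash, and go on PATH; this check is not part of the project itself):

```shell
# Collect any prerequisite commands that are not installed.
missing=""
for cmd in python3 bash go; do
  command -v "$cmd" >/dev/null 2>&1 || missing="$missing $cmd"
done

if [ -n "$missing" ]; then
  echo "missing:$missing"
else
  echo "all prerequisites found"
fi
```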
Source
git clone https://github.com/owasp/D4N155.git
cd D4N155
pip3 install -r requirements.txt
bash main
Or, without Git:
wget -qO- https://github.com/owasp/D4N155/archive/master.zip | bsdtar -xf-
cd D4N155-master
pip3 install -r requirements.txt
bash main
Docker
In a Dockerfile:
FROM docker.pkg.github.com/owasp/d4n155/d4n155:latest
CLI (tag the pulled image so the short name d4n155 works):
docker pull docker.pkg.github.com/owasp/d4n155/d4n155:latest
docker tag docker.pkg.github.com/owasp/d4n155/d4n155:latest d4n155
docker run -it d4n155
Manual
D4N155: Tool for smart security audits
Usage: bash main <option> <value>
All options are optional
Options:
-w, --wordlist <url|ip> Build the smart wordlist from information
found on the website.
-t, --targets <file> Build the smart wordlist from the target
URLs listed in the given file.
-b, --based <file> Analyze texts to generate a custom
wordlist
-r, --rate <time> Set the time interval between requests
-o, --output <file> File in which to store the whole wordlist
-a, --aggressive Aggressive reading with a headless browser
-h, --help Show this message
Value: <url | ip | source | file | time>
URL Target URL, for example: scanme.nmap.org
IP IP address
TIME Interval in minutes, for example: 2.5, i.e. 00:00:02:30. Default: 0
FILE File used to save the result, to read target URLs from, or as
input for the wordlist
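Putting the options together, a few illustrative invocations (the target host, file names, and interval below are placeholders chosen for the example, not taken from the project docs):

```shell
# Echo-only stub so the examples can be shown without the tool
# installed; drop the leading 'run' to execute them for real.
run() { printf '%s\n' "$*"; }

# Build a smart wordlist from a single target site:
run bash main -w scanme.nmap.org

# Build one wordlist from the URLs in targets.txt, waiting 2.5
# (i.e. 00:00:02:30) between requests, and save it to wordlist.txt:
run bash main -t targets.txt -r 2.5 -o wordlist.txt

# Generate a custom wordlist from a local text corpus:
run bash main -b notes.txt -o wordlist.txt
```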
Note that the project description data, including the texts, logos, images, and/or trademarks,
for each open source project belongs to its rightful owner.
If you wish to add or remove any projects, please contact us at [email protected].