
nmalcolm / Inventus

License: MIT
Inventus is a spider designed to find subdomains of a specific domain by crawling it and any subdomains it discovers.


Projects that are alternatives of or similar to Inventus

flydns
Related subdomains finder
Stars: ✭ 29 (-63.75%)
Mutual labels:  bugbounty, subdomains
tugarecon
Pentest: Subdomains enumeration tool for penetration testers.
Stars: ✭ 142 (+77.5%)
Mutual labels:  bugbounty, subdomains
fdnssearch
Swiftly search FDNS datasets from Rapid7 Open Data
Stars: ✭ 19 (-76.25%)
Mutual labels:  bugbounty, subdomains
ICU
An extended, modular host discovery framework
Stars: ✭ 40 (-50%)
Mutual labels:  bugbounty, subdomains
sqip
SQIP is a tool for SVG-based LQIP image creation written in go
Stars: ✭ 46 (-42.5%)
Mutual labels:  mit-license
ANODA-Turn-Timer
ANODA Open Source iOS Swift example app
Stars: ✭ 19 (-76.25%)
Mutual labels:  mit-license
ArticleSpider
Crawling zhihu, jobbole, lagou by Scrapy, and using Elasticsearch+Django to build a Search Engine website --- README_zh.md (including: implementation roadmap, distributed-crawler and coping with anti-crawling strategies).
Stars: ✭ 34 (-57.5%)
Mutual labels:  scrapy
Awesome-HTTPRequestSmuggling
A curated list of awesome blogs and tools about HTTP request smuggling attacks. Feel free to contribute! 🍻
Stars: ✭ 97 (+21.25%)
Mutual labels:  bugbounty
scrapy-mysql-pipeline
scrapy mysql pipeline
Stars: ✭ 47 (-41.25%)
Mutual labels:  scrapy
fight-for-artistic-creativity
What we can do to keep Twitter from becoming a dystopia.
Stars: ✭ 19 (-76.25%)
Mutual labels:  mit-license
VPS-Bug-Bounty-Tools
Script that automates the installation of the main tools used for web application penetration testing and Bug Bounty.
Stars: ✭ 44 (-45%)
Mutual labels:  bugbounty
scrapy-wayback-machine
A Scrapy middleware for scraping time series data from Archive.org's Wayback Machine.
Stars: ✭ 92 (+15%)
Mutual labels:  scrapy
KaliIntelligenceSuite
Kali Intelligence Suite (KIS) shall aid in the fast, autonomous, central, and comprehensive collection of intelligence by executing standard penetration testing tools. The collected data is internally stored in a structured manner to allow the fast identification and visualisation of the collected information.
Stars: ✭ 58 (-27.5%)
Mutual labels:  bugbounty
seedpress-cms
A headless CMS built in Express for PostgreSQL using Sequelize. Generally follows the WordPress post and term schema.
Stars: ✭ 71 (-11.25%)
Mutual labels:  mit-license
magicRecon
MagicRecon is a powerful shell script that maximizes the recon and data-collection process for a target and finds common vulnerabilities, saving the results in organized directories in various formats.
Stars: ✭ 478 (+497.5%)
Mutual labels:  bugbounty
Emissary
Send notifications on different channels such as Slack, Telegram, Discord etc.
Stars: ✭ 33 (-58.75%)
Mutual labels:  bugbounty
pre-commit-golang
Pre-commit hooks for Golang with support for monorepos, the ability to pass arguments and environment variables to all hooks, and the ability to invoke custom go tools.
Stars: ✭ 208 (+160%)
Mutual labels:  mit-license
powerslaves
Taking PowerSaves as a slave to your will.
Stars: ✭ 28 (-65%)
Mutual labels:  mit-license
Jira-Lens
Fast and customizable vulnerability scanner For JIRA written in Python
Stars: ✭ 185 (+131.25%)
Mutual labels:  bugbounty
itemadapter
Common interface for data container classes
Stars: ✭ 47 (-41.25%)
Mutual labels:  scrapy

Inventus

Inventus is a spider designed to find subdomains of a specific domain by crawling it and any subdomains it discovers. It's a Scrapy spider, meaning it's easily modified and extendable to your needs.
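Conceptually, the spider pulls hostnames out of the links it crawls and keeps those that fall under the target domain. That core idea can be sketched without Scrapy at all (the `extract_subdomains` helper below is illustrative only, not part of Inventus's actual code):

```python
from urllib.parse import urlparse

def extract_subdomains(links, domain):
    """Return the unique hostnames among `links` that fall under `domain`.

    Illustrative helper only; Inventus's real logic lives in its Scrapy
    spider, which also follows the discovered links to crawl further.
    """
    found = set()
    for link in links:
        host = urlparse(link).hostname or ""
        if host == domain or host.endswith("." + domain):
            found.add(host)
    return sorted(found)

links = [
    "https://www.facebook.com/about",
    "https://m.facebook.com/",
    "https://example.org/",
    "https://www.facebook.com/help",
]
print(extract_subdomains(links, "facebook.com"))
# ['m.facebook.com', 'www.facebook.com']
```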


Demo

[asciicast demo]

Requirements

  • Linux -- I haven't tested this on Windows.
  • Python 2.7 or Python 3.3+
  • Scrapy 1.4.0 or above.

Installation

Inventus requires Scrapy to be installed before it can be run. First, clone the repository and enter it.

$ git clone https://github.com/nmalcolm/Inventus
$ cd Inventus

Now install the required dependencies using pip.

$ pip install -r requirements.txt

Assuming the installation succeeded, Inventus should be ready to use.

Usage

The most basic usage of Inventus is as follows:

$ cd Inventus
$ scrapy crawl inventus -a domain=facebook.com

This tells Scrapy which spider to use ("inventus" in this case), and passes the domain to the spider. Any subdomains found will be sent to STDOUT.

The other custom parameter is subdomain_limit, which sets the maximum number of subdomains to discover before quitting. The default is 10000, though it isn't a hard limit.

$ scrapy crawl inventus -a domain=facebook.com -a subdomain_limit=100

Exporting

Exporting data can be done in multiple ways. The easiest way is redirecting STDOUT to a file.

$ scrapy crawl inventus -a domain=facebook.com > facebook.txt

Scrapy has a built-in feature for exporting items in various formats, including CSV, JSON, and XML. Currently only subdomains are exported, though this may change in the future.

$ scrapy crawl inventus -a domain=facebook.com -t csv -o Facebook.csv

Configuration

Inventus's behaviour can be configured. By default, Inventus ignores robots.txt, uses a 30-second timeout, caches crawl data for 24 hours, limits crawl depth to 5, and uses Scrapy's AutoThrottle extension. These and more can be changed by editing the inventus_spider/settings.py file. Scrapy's settings are well documented, too.
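The setting names below are standard Scrapy settings; the values mirror the defaults described above, but whether Inventus's shipped settings.py uses exactly these lines is an assumption, so treat this as a sketch of the kind of overrides you might place there:

```python
# inventus_spider/settings.py -- example overrides (illustrative values,
# not necessarily Inventus's shipped configuration)

ROBOTSTXT_OBEY = False             # ignore robots.txt
DOWNLOAD_TIMEOUT = 30              # seconds before a request times out
HTTPCACHE_ENABLED = True           # cache crawl data on disk...
HTTPCACHE_EXPIRATION_SECS = 86400  # ...and expire it after 24 hours
DEPTH_LIMIT = 5                    # maximum crawl depth
AUTOTHROTTLE_ENABLED = True        # let Scrapy's AutoThrottle pace requests
```

Any standard Scrapy setting can be added here in the same way.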

Bugs / Suggestions / Feedback

Feel free to open a new issue for any of the above. Inventus was built in only a few hours and will likely contain bugs. You can also connect with me on Twitter.

License

Released under the MIT License. See LICENSE.
