stashapp / Communityscrapers

License: AGPL-3.0
This is a public repository containing scrapers created by the Stash Community.

Projects that are alternatives to or similar to CommunityScrapers

Packagedev
Tools to ease the creation of snippets, syntax definitions, etc. for Sublime Text.
Stars: ✭ 378 (+641.18%)
Mutual labels:  hacktoberfest, yaml
Dynaconf
Configuration Management for Python ⚙
Stars: ✭ 2,082 (+3982.35%)
Mutual labels:  hacktoberfest, yaml
Circe Yaml
YAML parser for circe using SnakeYAML
Stars: ✭ 102 (+100%)
Mutual labels:  hacktoberfest, yaml
Configurate
A simple configuration library for Java applications providing a node structure, a variety of formats, and tools for transformation
Stars: ✭ 148 (+190.2%)
Mutual labels:  hacktoberfest, yaml
Esphome
ESPHome is a system to control your ESP8266/ESP32 by simple yet powerful configuration files and control them remotely through Home Automation systems.
Stars: ✭ 4,324 (+8378.43%)
Mutual labels:  hacktoberfest, yaml
Dark
(grafana) Dashboards As Resources in Kubernetes
Stars: ✭ 190 (+272.55%)
Mutual labels:  hacktoberfest, yaml
Config Lint
Command line tool to validate configuration files
Stars: ✭ 118 (+131.37%)
Mutual labels:  hacktoberfest, yaml
Grabana
User-friendly Go library for building Grafana dashboards
Stars: ✭ 313 (+513.73%)
Mutual labels:  hacktoberfest, yaml
Kubernetes Examples
Minimal self-contained examples of standard Kubernetes features and patterns in YAML
Stars: ✭ 811 (+1490.2%)
Mutual labels:  hacktoberfest, yaml
Logidze
Database changes log for Rails
Stars: ✭ 1,060 (+1978.43%)
Mutual labels:  hacktoberfest
Emotecollector
Collects emotes from other servers for use by people who don't have Nitro
Stars: ✭ 51 (+0%)
Mutual labels:  hacktoberfest
Siler
⚡ Flat-files and plain-old PHP functions rockin'on as a set of general purpose high-level abstractions.
Stars: ✭ 1,056 (+1970.59%)
Mutual labels:  hacktoberfest
Flutter Guide
📚 Flutter Guide on becoming a Master Flutterista
Stars: ✭ 51 (+0%)
Mutual labels:  hacktoberfest
Gau
Fetch known URLs from AlienVault's Open Threat Exchange, the Wayback Machine, and Common Crawl.
Stars: ✭ 1,060 (+1978.43%)
Mutual labels:  hacktoberfest
Cliwrap
Library for running command line processes
Stars: ✭ 1,057 (+1972.55%)
Mutual labels:  hacktoberfest
Puppet Selinux
Puppet Module to manage SELinux on RHEL machines
Stars: ✭ 51 (+0%)
Mutual labels:  hacktoberfest
Cloudsplaining
Cloudsplaining is an AWS IAM Security Assessment tool that identifies violations of least privilege and generates a risk-prioritized report.
Stars: ✭ 1,057 (+1972.55%)
Mutual labels:  hacktoberfest
Redux Query
A library for managing network state in Redux
Stars: ✭ 1,055 (+1968.63%)
Mutual labels:  hacktoberfest
Doctest Js
Run JSDoc style doc examples as tests within your test suite
Stars: ✭ 52 (+1.96%)
Mutual labels:  hacktoberfest
Lvm
Development repository for lvm Chef cookbook
Stars: ✭ 51 (+0%)
Mutual labels:  hacktoberfest

CommunityScrapers

This is a public repository containing scrapers created by the Stash Community.

To download the scrapers, you can clone the git repo or download any of the scrapers directly.

When downloading directly, click the scraper.yml you want, then make sure to click the Raw button, and save the page as a file from the browser to preserve the correct format of the .yml file.

Every scraper file has to be stored in the ~/.stash/scrapers directory (~/.stash is where the config and database files are located). If the scrapers directory is not there, it needs to be created.
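The steps above can be sketched as follows. The repository URL is assumed from the project name, and SomeSite.yml is a placeholder for whichever scraper you actually want:

```shell
# Create the scrapers directory if it does not already exist
mkdir -p "$HOME/.stash/scrapers"

# Option 1: clone the whole repository, then copy the scrapers you want
# git clone https://github.com/stashapp/CommunityScrapers.git
# cp CommunityScrapers/scrapers/SomeSite.yml "$HOME/.stash/scrapers/"

# Option 2: save a single raw scraper.yml directly into the directory
# curl -o "$HOME/.stash/scrapers/SomeSite.yml" <raw URL of the scraper>

# Confirm the directory exists
ls -d "$HOME/.stash/scrapers"
```

Either way, restart stash (or reload scrapers) afterwards so the new files are picked up.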

After updating the contents of the scrapers directory or editing a scraper file, a restart of stash is needed, followed by a refresh of the edit scene/performer page. (In recent stash builds, using Scrape with -> Reload scrapers is enough instead of restarting.)

Some sites block content if the user agent is not valid. If you get some kind of blocked or denied message, make sure to configure the Scraping -> Scraper User Agent setting in stash. Valid strings for Firefox, for example, can be found at https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/User-Agent/Firefox . Scrapers for those sites should have a comment mentioning this, along with a tested and working user agent string.
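For illustration, a Firefox user agent string has the following shape (the version numbers here are only an example; take a current string from the page linked above):

```
Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/115.0
```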

Scrapers with useCDP set to true require that you have properly configured the Chrome CDP path setting in stash. If you decide to use a remote instance, the headless Chromium Docker image from https://hub.docker.com/r/chromedp/headless-shell/ is highly recommended.
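As a sketch, a CDP-based scraper declares this in its driver section; the scraper name below is illustrative and the rest of the file is omitted:

```yaml
# Fragment of a scraper.yml that needs Chrome CDP (illustrative)
name: Example Site
driver:
  useCDP: true  # render pages through the Chrome CDP instance configured in stash
```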

Scrapers

You can find a list of sites currently supported by community scrapers in SCRAPERS-LIST.md

For most scrapers you have to edit the URL field. Once you populate that field with a specific URL, a button will appear.

Clicking on that button brings up a popup that lets you select which fields to update.

Some scrapers support the Scrape with function, so you can use that instead of adding a URL.

Contributing

Contributions are always welcome! Use the Scraping Configuration wiki entry to get started and stop by the Discord #the-scraping-initiative channel with any questions.

Validation

The scrapers in this repository can be validated against a schema and checked for common errors.

First, install the validator's dependencies: inside the ./validator folder, run yarn.

Then, to run the validator, use node validate.js in the root of the repository.
Specific scrapers can be checked using: node validate.js scrapers/foo.yml scrapers/bar.yml

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].