
dcts / opensea-scraper

Licence: other
Scrapes NFT floor prices and additional information from OpenSea. Used for https://nftfloorprice.info

Programming Languages

javascript
184084 projects - #8 most used programming language

Projects that are alternatives of or similar to opensea-scraper

Awesome NFTs
A curated collection about NFTs - by bt3gl
Stars: ✭ 42 (-67.44%)
Mutual labels:  nft, erc721, nfts
opensea-arb-nft20
🧸 💸 Detects arbitrage opportunities for ERC-721 tokens between OpenSea and NFT20.
Stars: ✭ 42 (-67.44%)
Mutual labels:  nft, erc721, opensea
opensea automatic uploader
(Bypass reCAPTCHAs) A Selenium Python bot to automatically bulk upload and list your NFTs on OpenSea (all metadata integrated - Ethereum and Polygon supported); reCAPTCHA solver & bypasser included.
Stars: ✭ 205 (+58.91%)
Mutual labels:  nft, opensea, nfts
Simple-Game-ERC-721-Token-Template
🔮 Very Simple ERC-721 Smart Contract Template to create your own ERC-721 Tokens on the Ethereum Blockchain, with many customizable Options 🔮
Stars: ✭ 83 (-35.66%)
Mutual labels:  nft, erc721
fnd-docs
Foundation developer docs
Stars: ✭ 33 (-74.42%)
Mutual labels:  nft, nfts
Artion-Server
Artion API Server
Stars: ✭ 26 (-79.84%)
Mutual labels:  nft, erc721
opensea
python wrapper for opensea api
Stars: ✭ 38 (-70.54%)
Mutual labels:  nft, opensea
rarity-analyser
Cool Rarity is an open source package for easy rarity score calculation with ERC721 NFT metadata collection. It was born in punkscape 01 rarity analyser hackathon.
Stars: ✭ 82 (-36.43%)
Mutual labels:  nft, erc721
nft-collection-generator
Generates images and metadata for a collection of NFTs.
Stars: ✭ 77 (-40.31%)
Mutual labels:  nft, nfts
samila
Generative Art Generator
Stars: ✭ 750 (+481.4%)
Mutual labels:  nft, nfts
instagram-get-images
Instagram get images 🌄 (hashtags, account, locations) with puppeteer
Stars: ✭ 69 (-46.51%)
Mutual labels:  scraper, puppeteer
barclayscrape
A small app to programmatically manipulate Barclays online banking
Stars: ✭ 57 (-55.81%)
Mutual labels:  scraper, puppeteer
erc721-extensions
A set of composable extension contracts for the OpenZeppelin ERC721 base contracts.
Stars: ✭ 157 (+21.71%)
Mutual labels:  erc721, nfts
opensea-images-downloader
Script to download all the images from an opensea collection using the OpenSea API
Stars: ✭ 42 (-67.44%)
Mutual labels:  nft, opensea
nftool
A suite of tools for NFT generative art.
Stars: ✭ 145 (+12.4%)
Mutual labels:  nft, opensea
leumi-leumicard-bank-data-scraper
Open bank data for Leumi bank and Leumi card credit card
Stars: ✭ 28 (-78.29%)
Mutual labels:  scraper, puppeteer
enjin-cpp-sdk
Enjin Platform SDK for C++.
Stars: ✭ 15 (-88.37%)
Mutual labels:  nft, nfts
Jvppeteer
Headless Chrome For Java (Java 爬虫)
Stars: ✭ 193 (+49.61%)
Mutual labels:  scraper, puppeteer
nft-swap-sdk
Ethereum's missing p2p NFT and token swap library for web3 developers. Written in TypeScript. Powered by 0x.
Stars: ✭ 200 (+55.04%)
Mutual labels:  nft, erc721
niftygate
Drop-in Access Control via NFT Ownership
Stars: ✭ 61 (-52.71%)
Mutual labels:  nft, erc721

Opensea Scraper

🎉 UPDATE 2021-Nov-3: Opensea officially updated their API. You can get accurate realtime floor prices from this endpoint: https://api.opensea.io/api/v1/collection/{slug}/stats:

const axios = require("axios");

async function getFloorPrice(slug) {
  try {
    // note the /api/v1/ prefix, matching the endpoint above
    const url = `https://api.opensea.io/api/v1/collection/${slug}/stats`;
    const response = await axios.get(url);
    return response.data.stats.floor_price;
  } catch (err) {
    console.log(err);
    return undefined;
  }
}

// e.g. (inside an async function, or a REPL with top-level await):
console.log(await getFloorPrice("lostpoets"));
console.log(await getFloorPrice("treeverse"));
console.log(await getFloorPrice("cool-cats-nft"));

If you need floor prices, please use the official API (see above 👆👆👆). This scraper can still be used to scrape additional information about offers (tokenId, name, tokenContractAddress and offerUrl) as well as the rankings.
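
For orientation, the result of offers() has roughly the following shape. This is only a sketch based on the fields named above; the exact keys and value formats are an assumption and may differ:

// rough shape of the object returned by OpenseaScraper.offers()
// (illustrative only — field names taken from the description above,
// values and formats are hypothetical)
const exampleResult = {
  stats: {}, // collection stats (contents omitted here)
  offers: [
    {
      floorPrice: 8.5,               // price of the offer (format is an assumption)
      name: "Cool Cat #1234",        // token name
      tokenId: "1234",
      tokenContractAddress: "0x...", // placeholder address
      offerUrl: "https://opensea.io/assets/...",
    },
    // ...more offers
  ],
};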

Install

npm install opensea-scraper

Usage

slug is the human-readable identifier that OpenSea uses to identify a collection. It can be extracted from the URL: https://opensea.io/collection/{slug}
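
For example, the slug can be pulled out of a collection URL like this (a minimal sketch, assuming the standard https://opensea.io/collection/{slug} format):

// extract the slug ("cool-cats-nft") from a collection URL
const collectionUrl = "https://opensea.io/collection/cool-cats-nft";
const slug = new URL(collectionUrl).pathname.split("/")[2];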

options is an object with the following keys

  • debug [Boolean]: launches chromium locally, omits headless mode (default: false)
  • logs [Boolean]: display logs in the console (default: false)
  • sort [Boolean]: sorts the offers by lowest to highest price (default: true)
  • browserInstance [PuppeteerBrowser]: bring your own browser instance for more control

const OpenseaScraper = require("opensea-scraper");

// which nft project to scrape?
const slug = "cool-cats-nft";

// options
const options = {
  debug: false,
  logs: false,
  sort: true,
  browserInstance: undefined,
}

// get basic info (from the opensea API)
const basicInfo = await OpenseaScraper.basicInfo(slug);

// get offers from opensea. Each offer includes the floor price, tokenName,
// tokenId, tokenContractAddress and offerUrl
let result = await OpenseaScraper.offers(slug, options);
console.dir(result, {depth: null}); // result object contains keys `stats` and `offers`

// get offers from opensea using a custom link
// Opensea supports encoding filtering in the URL so this method is helpful for getting
// a specific asset (for example floor price for a LAND token from the sandbox collection)
let url = "https://opensea.io/collection/sandbox?search[sortAscending]=true&search[sortBy]=PRICE&search[stringTraits][0][name]=Type&search[stringTraits][0][values][0]=Land&search[toggles][0]=BUY_NOW";
result = await OpenseaScraper.offersByUrl(url, options);
console.dir(result, {depth: null}); // result object contains keys `stats` and `offers`

// get offersByScrolling from opensea. This is an alternative method to get the same
// data as in the function `offers`, with the only difference that the data is here
// scraped actively by scrolling through the page. This method is not as efficient
// as the `offers` method, but it can scrape more than 32 offers. You could even scrape
// a whole collection with ~10k spots (this is not recommended though).
let resultSize = 40; // if you need less than 32 offers, please use the function `offers()` instead
result = await OpenseaScraper.offersByScrolling(slug, resultSize, options);
console.dir(result, {depth: null}); // result object contains keys `stats` and `offers`

// get offersByScrollingByUrl from opensea using a custom link instead of the slug
// the same logic applies as in `offersByScrolling()`
// Opensea supports encoding filtering in the URL so this method is helpful for getting
// a specific asset (for example floor price for a LAND token from the sandbox collection)
url = "https://opensea.io/collection/sandbox?search[sortAscending]=true&search[sortBy]=PRICE&search[stringTraits][0][name]=Type&search[stringTraits][0][values][0]=Land&search[toggles][0]=BUY_NOW";
resultSize = 40; // if you need less than 32 offers, please use the function `offers()` instead
result = await OpenseaScraper.offersByScrollingByUrl(url, resultSize, options);
console.dir(result, {depth: null}); // result object contains keys `stats` and `offers`

// scrape all slugs, names and ranks from the top collections from the rankings page
// "type" is one of the following:
//   "24h": ranking of last 24 hours: https://opensea.io/rankings?sortBy=one_day_volume
//   "7d": ranking of last 7 days: https://opensea.io/rankings?sortBy=seven_day_volume
//   "30d": ranking of last 30 days: https://opensea.io/rankings?sortBy=thirty_day_volume
//   "total": scrapes all time ranking: https://opensea.io/rankings?sortBy=total_volume
// "chain" is one of the following: "ethereum", "matic", "klaytn", "solana"
//    if chain is unset, all chains will be selected by default
const type = "24h"; // possible values: "24h", "7d", "30d", "total"
const chain = "solana";
const ranking = await OpenseaScraper.rankings(type, options, chain);
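
Side note (not part of the library): the scraped ranking can be combined with the official stats endpoint from the top of this README to build a simple floor price list. The sketch below assumes each ranking entry contains a slug property — adjust if the actual shape differs:

// hypothetical helper: fetch the official floor price for every collection
// in the scraped ranking (assumes each entry has a `slug` property)
const axios = require("axios");

async function floorPricesForRanking(ranking) {
  const result = [];
  for (const collection of ranking) {
    const url = `https://api.opensea.io/api/v1/collection/${collection.slug}/stats`;
    const response = await axios.get(url);
    result.push({ slug: collection.slug, floorPrice: response.data.stats.floor_price });
  }
  return result;
}

// const floorPrices = await floorPricesForRanking(ranking);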

Debugging

To investigate an issue, turn on logs and debug mode (debug: true and logs: true):

const result = await OpenseaScraper.offers("treeverse", {
  debug: true,
  logs: true
});

Bring your own puppeteer

If you want to customize the settings for your puppeteer instance, you can pass your own puppeteer browser instance in the options. 🚧 IMPORTANT: I recommend using the stealth plugin, as otherwise you most likely won't be able to scrape OpenSea. If you find a way without using the stealth plugin, please report it in the form of an issue!

const puppeteer = require('puppeteer-extra');
// add stealth plugin and use defaults (all evasion techniques)
const StealthPlugin = require('puppeteer-extra-plugin-stealth');
puppeteer.use(StealthPlugin());

const myPuppeteerInstance = await puppeteer.launch(myCustomSettings);

const result = await OpenseaScraper.offers("cool-cats-nft", {
  browserInstance: myPuppeteerInstance
});
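
If you bring your own browser instance, remember to close it yourself once you are done scraping (a minimal sketch — since you created the instance outside the scraper, it is presumably not closed for you):

// clean up the browser you launched above
await myPuppeteerInstance.close();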

Demo

npm run demo

Run local console / REPL

To test the functions in a REPL node environment that has the OpenseaScraper service preloaded, simply run:

node --experimental-repl-await -i -e "$(< init-dev-env.js)"

I recommend saving an alias:

alias consl='node --experimental-repl-await -i -e "$(< init-dev-env.js)"';
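
Inside that REPL you can then call the scraper functions directly, for example (assuming init-dev-env.js exposes the service under the name OpenseaScraper):

// top-level await works thanks to --experimental-repl-await
await OpenseaScraper.basicInfo("cool-cats-nft");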

Contribute

Open a PR or an issue if you would like to have more features added.

Donations 🙏

Thanks for your support!
BTC: bc1qq5qn96ahlqjxfxz2n9l20kem8p9nsz5yzz93f7
ETH: 0x3e4503720Fb8f4559Ecf64BE792b3100722dE940

nftfloorprice.info 🔔

Simple NFT floor price alerts. Easily track all your NFTs and receive realtime email alerts with: https://nftfloorprice.info
