
ChrisMuir / Zillow

Zillow Scraper for Python using Selenium

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Zillow

Sillynium
Automate the creation of Python Selenium Scripts by drawing coloured boxes on webpage elements
Stars: ✭ 100 (-29.08%)
Mutual labels:  scraper, web-scraping, selenium, chromedriver
TikTok
Download public videos on TikTok using Python with Selenium
Stars: ✭ 37 (-73.76%)
Mutual labels:  scraper, selenium, chromedriver
Scrape Linkedin Selenium
`scrape_linkedin` is a Python package that allows you to scrape personal LinkedIn profiles & company pages, turning the data into structured JSON.
Stars: ✭ 239 (+69.5%)
Mutual labels:  scraper, web-scraping, selenium
yt-videos-list
Create and **automatically** update a list of all videos on a YouTube channel (in txt/csv/md form) via YouTube bot with end-to-end web scraping - no API tokens required. Multi-threaded support for YouTube videos list updates.
Stars: ✭ 64 (-54.61%)
Mutual labels:  scraper, selenium, chromedriver
Scrapstagram
An Instagram Scraper
Stars: ✭ 50 (-64.54%)
Mutual labels:  scraper, selenium
Botvid 19
Messenger Bot that scrapes for COVID-19 data and periodically updates subscribers via Facebook Messages. Created using Python/Flask, MySQL, HTML, Heroku
Stars: ✭ 34 (-75.89%)
Mutual labels:  scraper, selenium
Instaloctrack
An Instagram OSINT tool to collect all the geotagged locations available on an Instagram profile in order to plot them on a map, and dump them in a JSON.
Stars: ✭ 85 (-39.72%)
Mutual labels:  scraper, selenium
Splashr
💦 Tools to Work with the 'Splash' JavaScript Rendering Service in R
Stars: ✭ 93 (-34.04%)
Mutual labels:  web-scraping, selenium
Docker Python Chromedriver
Dockerfile for running Python Selenium in headless Chrome (Python 2.7 / 3.6 / 3.7 / 3.8 / Alpine based Python / Chromedriver / Selenium / Xvfb included in different versions)
Stars: ✭ 385 (+173.05%)
Mutual labels:  selenium, chromedriver
Hockey Scraper
Python Package for scraping NHL Play-by-Play and Shift data
Stars: ✭ 93 (-34.04%)
Mutual labels:  scraper, web-scraping
Pulsar
Turn large web sites into tables and charts using simple SQL queries.
Stars: ✭ 100 (-29.08%)
Mutual labels:  web-scraping, selenium
Autocrawler
Google, Naver multiprocess image web crawler (Selenium)
Stars: ✭ 957 (+578.72%)
Mutual labels:  selenium, chromedriver
Gisaid Scrapper
Scraping tool for GISAID data regarding SARS-CoV-2
Stars: ✭ 25 (-82.27%)
Mutual labels:  scraper, selenium
Spam Bot 3000
Social media research and promotion, semi-autonomous CLI bot
Stars: ✭ 79 (-43.97%)
Mutual labels:  scraper, selenium
Spidr
A versatile Ruby web spidering library that can spider a site, multiple domains, certain links or infinitely. Spidr is designed to be fast and easy to use.
Stars: ✭ 656 (+365.25%)
Mutual labels:  scraper, web-scraping
Seleniumcrawler
An example using Selenium webdrivers for Python and the Scrapy framework to create a web scraper that crawls an ASP site
Stars: ✭ 117 (-17.02%)
Mutual labels:  scraper, selenium
Instagram Profilecrawl
💻 Quickly crawl the information (e.g. followers, tags, etc...) of an instagram profile. No login required!
Stars: ✭ 110 (-21.99%)
Mutual labels:  selenium, chromedriver
Udemycoursegrabber
Your will to enroll in a Udemy course is here, but the money isn't? Search no more! This Python program searches for your desired course on more than [insert big number here] websites, compares the last-updated dates, and gives you back the download link for the latest one, though you also have the choice to see the other ones as well!
Stars: ✭ 137 (-2.84%)
Mutual labels:  scraper, selenium
30 Days Of Python
Learn Python for the next 30 (or so) Days.
Stars: ✭ 1,748 (+1139.72%)
Mutual labels:  web-scraping, selenium
Undetected Chromedriver
Custom Selenium Chromedriver | Zero-Config | Passes ALL bot mitigation systems (like Distil / Imperva / Datadome / Cloudflare IUAM)
Stars: ✭ 365 (+158.87%)
Mutual labels:  selenium, chromedriver

Zillow Scraping with Python

ATTENTION:

As of 2019, this code no longer works for most users. Zillow is now able to detect the use of all or most automated webdrivers, and will display an unlimited number of CAPTCHAs when the site is launched in a webdriver. I have no interest in putting more work into this project, but am leaving it up to serve as an example of how to web scrape using Selenium with Python.

WARNING: Use this code at your own risk; scraping is against Zillow's terms of service.

Basic tool for scraping current home listings from Zillow, written in Python using Selenium. The code takes as input search terms that would normally be entered on the Zillow home page. It extracts 11 variables for each home listing, saves them to a dataframe, and then writes the dataframe to a CSV file in your working directory. Using zip codes as search terms seems to yield the best results; the scraper works at a rate of about 75 zip codes per hour (compared to the Zillow API limit of 1,000 homes per 24 hours).

There are two files, zillow_runfile.py and zillow_functions.py. Clone this repo to your working directory, open the runfile, and step through the code line by line. The Zillow functions are imported at the top of the runfile.

This tool uses a for loop to iterate over a list of input search terms, scrape the listings for each, and append the results to a dataframe. The function zipcodes_list() allows the user to compile a large list of zip codes to use as search terms, using the zipcode package. For example, st = zipcodes_list(['10', '11', '770']) will yield every US zip code that begins with '10', '11', or '770' as a single list. The object st could then be passed to the scraper.
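
For illustration, here is a minimal sketch of that flow. The import style, the placeholder loop body, and the output file name are assumptions; the real per-listing scraping logic lives in zillow_functions.py, and the 11 column names are taken from the example output shown further down.

# A minimal sketch, not the actual runfile.
import pandas as pd
import zillow_functions as zl  # assumed import style for the cloned repo

# Every US zip code beginning with '10', '11', or '770', as one flat list.
st = zl.zipcodes_list(['10', '11', '770'])

rows = []
for zc in st:
    # Placeholder: the real runfile enters zc in the Zillow search box,
    # pages through the results, and parses 11 fields per listing.
    pass

cols = ['address', 'city', 'state', 'zip', 'price', 'sqft', 'bedrooms',
        'bathrooms', 'days_on_zillow', 'sale_type', 'url']
pd.DataFrame(rows, columns=cols).to_csv('zillow_output.csv', index=False)  # hypothetical file name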

Some things to keep in mind

  • You will need to edit the input parameter of the init_driver function within zillow_runfile.py to point to the local path of your webdriver program (required by Selenium); see the sketch after this list.
  • The max return for each search term (i.e. each zip code) is 520 home listings.
  • Zillow will periodically throw up a CAPTCHA page. The script is designed to pause scraping indefinitely until the user has manually completed the CAPTCHA requirements (at which point it should resume scraping).
  • There tend to be a small number of NAs in every search; foreclosure properties seem more likely to return NAs, so the more foreclosures there are in a search, the more NAs there will be.
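
As a concrete illustration of the first point above, the edit looks roughly like the line below; the exact signature of init_driver is an assumption based on the description in that bullet, and the path is only a placeholder for wherever your webdriver binary actually lives.

# In zillow_runfile.py: point init_driver at your local webdriver program.
driver = init_driver('/path/to/chromedriver')  # placeholder path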

Software Requirements/Info

  • This code was written using Python 3.5.
  • Selenium (this can be pip installed; written using v3.0.2).
  • The Selenium package requires a webdriver program. This code was written using Chromedriver v2.25.
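
For reference, with Selenium 3.x the Chromedriver path can be passed straight to the Chrome driver constructor; a sketch along these lines (not the project's actual code) is presumably what init_driver wraps.

# Selenium 3.x: hand the Chromedriver binary's path to the Chrome constructor.
from selenium import webdriver

driver = webdriver.Chrome(executable_path='/path/to/chromedriver')
driver.get('http://www.zillow.com/homes/')  # Zillow search page (URL assumed)
driver.quit()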

Example of the output dataframe

df.head(n = 6)
                 address     city state    zip    price  sqft bedrooms  \
0      3011 Bissonnet St  Houston    TX  77005   575000  1820        3   
1          4229 Drake St  Houston    TX  77005   615000  2611        3   
2        2237 Wroxton Rd  HOUSTON    TX  77005  2095000  5492        4   
3      4318 Childress St  Houston    TX  77005   540000  2438        4   
4       2708 Werlein Ave  Houston    TX  77005  1449000  3905        4   
5  5402 Buffalo Speedway  Houston    TX  77005  1995000  4658        3   

  bathrooms days_on_zillow           sale_type  \
0         2             NA      House For Sale   
1         3             NA   For Sale by Owner   
2         5             NA      House For Sale   
3         4              2  Townhouse For Sale   
4         5              1      House For Sale   
5         4              5      House For Sale   

                                                 url  
0  http://www.zillow.com/homes/for_sale//homedeta...  
1  http://www.zillow.com/homes/for_sale//homedeta...  
2  http://www.zillow.com/homes/for_sale//homedeta...  
3  http://www.zillow.com/homes/for_sale//homedeta...  
4  http://www.zillow.com/homes/for_sale//homedeta...  
5  http://www.zillow.com/homes/for_sale//homedeta...  