USAspending API


This API is used by USAspending.gov to surface all federal spending data, which is open and provided to the public as part of the DATA Act.

USAspending Landing Page

Creating a Development Environment

Ensure the following dependencies are installed and working prior to continuing:

Requirements

Using Docker is recommended, since it provides a clean environment. Setting up your own local environment requires some technical ability and experience with modern software tools.

If not using Docker, ensure the following are installed and working:

  • Command line package manager
    • Windows' WSL bash uses apt-get
    • macOS users will use Homebrew
    • Linux users already know their package manager (yum, apt, pacman, etc.)
  • PostgreSQL version 10.x (with a dedicated data_store_api database)
  • Elasticsearch version 7.1
  • Python 3.7 environment
    • A virtual environment is highly recommended; there are various tools and associated instructions depending on preference
    • See Required Python Libraries below for an example using pyenv
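
For a local (non-Docker) setup, you can confirm these are present with a quick sanity-check sketch (version output formats vary):

$ psql --version        # expect 10.x
$ python3 --version     # expect 3.7.x
$ curl -s http://localhost:9200   # Elasticsearch, once the cluster is running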

Cloning the Repository

Now, navigate to the base directory where you will store the USAspending repositories:

$ mkdir -p usaspending && cd usaspending
$ git clone https://github.com/fedspendingtransparency/usaspending-api.git
$ cd usaspending-api

Database Setup

There are three documented options for setting up a local database in order to run the API:

  1. Local Empty DB. Use your own local postgres database for the API to use.
  2. Containerized Empty DB. Create an empty directory on your localhost where all the database files will persist and use the docker-compose file to bring up a containerized postgres database.
  3. Local Populated DB. Download either the whole database or a database subset from the USAspending website.

Option 1: Using a Locally Hosted Postgres Database

Create a local PostgreSQL database called 'data_store_api', and either create a new username and password for it or use the defaults. For help, consult the PostgreSQL documentation.

Make sure to grant the user you created for the data_store_api database superuser permissions, or some scripts will not work:

postgres=# ALTER ROLE <<role/user you created>> WITH SUPERUSER;
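
For example, a minimal psql session (the role name and password here are hypothetical placeholders):

$ psql postgres
postgres=# CREATE ROLE usaspending WITH LOGIN PASSWORD 'usaspender';
postgres=# CREATE DATABASE data_store_api OWNER usaspending;
postgres=# ALTER ROLE usaspending WITH SUPERUSER;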

Option 2: Using the Docker Compose Postgres Database

See below for basic setup instructions. For help with Docker Compose, consult the Docker Compose documentation.

Database Setup and Initialization with Docker Compose
  • None of these commands will rebuild a Docker image! Use --build if you make changes to the code or want to rebuild the image before running the up steps.

  • If you run a local database, set POSTGRES_HOST in .env to host.docker.internal; POSTGRES_PORT should be changed if it isn't 5432 (see the sample .env sketch after this list).

    • docker-compose up usaspending-db will create and run a Postgres database.

    • docker-compose run --rm usaspending-manage python3 -u manage.py migrate will run Django migrations: https://docs.djangoproject.com/en/2.2/topics/migrations/.

    • docker-compose run --rm usaspending-manage python3 -u manage.py load_reference_data will load essential reference data (agencies, program activity codes, CFDA program data, country codes, and others).

    • docker-compose run --rm usaspending-manage python3 -u manage.py matview_runner --dependencies will provision the materialized views which are required by certain API endpoints.
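
For reference, a minimal .env sketch when pointing at a local (non-container) database (values are illustrative placeholders; ES_CLUSTER_DIR is covered in Elasticsearch Setup below):

POSTGRES_HOST=host.docker.internal
POSTGRES_PORT=5432
ES_CLUSTER_DIR=/full/path/outside/the/repo/es_data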

Manual Database Setup
  • docker-compose.yaml contains the shell commands necessary to set up the database manually, if you prefer to have a more custom environment.

Option 3: Downloading the database or a subset of the database and loading it into PostgreSQL

For further instructions on how to download, use, and set up the database using a subset of our data, please go to:

USAspending Database Download

Elasticsearch Setup

Some of the API endpoints reach into Elasticsearch for data.

  • docker-compose up usaspending-es will create and start a single-node Elasticsearch cluster, using the ES_CLUSTER_DIR specified in the .env configuration file. We recommend using a folder outside of the usaspending-api project directory so it does not get copied to other containers.

  • The cluster should be reachable at http://localhost:9200 ("You Know, for Search").

  • Optionally, to see log output, use docker-compose logs usaspending-es (these logs are stored by Docker even if you don't view them).
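
To verify the node is reachable, query it directly (output abbreviated; fields vary by Elasticsearch version):

$ curl http://localhost:9200
{
  "name" : "...",
  "cluster_name" : "...",
  "tagline" : "You Know, for Search"
}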

Running the API

docker-compose up usaspending-api

  • You can update environment variables used by settings.py (buckets, Elasticsearch, local paths), and they will be mounted and used when you run this.

The application will now be available at http://localhost:8000.

Note: if the code was run outside of Docker, compiled Python files can trip up the Docker environment. A useful command for clearing them out on your host is:

find . | grep -E "(__pycache__|\.pyc|\.pyo$)" | xargs rm -rf
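
An equivalent cleanup using find alone (an alternative form, not from the original docs):

$ find . -type d -name "__pycache__" -prune -exec rm -rf {} +
$ find . \( -name "*.pyc" -o -name "*.pyo" \) -delete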

Using the API

In your local development environment, available API endpoints may be found at http://localhost:8000/docs/endpoints

Deployed production API endpoints and docs are found by following links here: https://api.usaspending.gov
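
For example, a simple request against a local instance (this endpoint path is one illustration; confirm available paths in the endpoint docs above):

$ curl http://localhost:8000/api/v2/references/toptier_agencies/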

Loading Data

Note: it is possible to run ad-hoc commands out of a Docker container once you get the hang of it; see the comments in the Dockerfile.
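
For example, any Django management command can be run in a one-off container using the usaspending-manage service from docker-compose.yaml (a sketch):

$ docker-compose run --rm usaspending-manage python3 -u manage.py help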

For details on loading reference data, DATA Act Broker submissions, and current USAspending data into the API, see loading_data.md.

For details on how our data loaders modify incoming data, see data_reformatting.md.

Running Tests

Test Setup

To run all tests in the Docker services, run:

docker-compose run --rm usaspending-test

To run tests locally and not in the docker services, you need:

  1. Postgres: a running PostgreSQL database server (see Database Setup above)
  2. Elasticsearch: a running Elasticsearch cluster (see Elasticsearch Setup above)
  3. Required Python Libraries: Python package dependencies downloaded and discoverable (see below)
  4. Environment Variables: telling Python where to connect to the various data stores (see below)

Once these are satisfied, run:

(usaspending-api) $ pytest
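
Standard pytest options apply; for example, to report why tests were skipped, or to limit the run to one package (the path shown is illustrative):

(usaspending-api) $ pytest -rs
(usaspending-api) $ pytest usaspending_api/common/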

Required Python Libraries

Create and activate the virtual environment using venv, and ensure the right version of Python 3.7.x is being used:

$ pyenv install 3.7.2
$ pyenv local 3.7.2
$ python -m venv .venv/usaspending-api
$ source .venv/usaspending-api/bin/activate

Your prompt should then look as below to show you are in the virtual environment named usaspending-api (to exit that virtual environment, simply type deactivate at the prompt).

(usaspending-api) $

Install application dependencies with pip:

(usaspending-api) $ pip install -r requirements/requirements.txt

Environment Variables

Create a .envrc file in the repo root; it will be ignored by git. Change credentials and ports as needed for your local dev environment.

export DATABASE_URL=postgres://usaspending:<password>@localhost:5432/data_store_api
export ES_HOSTNAME=http://localhost:9200
export DATA_BROKER_DATABASE_URL=postgres://admin:<password>@localhost:5435/data_broker

If direnv does not pick this up after saving the file, type

$ direnv allow

Alternatively, you could skip using direnv and just export these variables in your shell environment.

Including Broker Integration Tests

Some automated integration tests run against a Broker database. If the dependencies to run such integration tests are not satisfied, those tests will bail out and be marked as Skipped. (You can see messages about those skipped tests by adding the -rs flag to pytest, like: pytest -rs)

To satisfy these dependencies and include execution of these tests, do the following:

  1. Ensure you have Docker installed and running on your machine
  2. Ensure the Broker source code is checked out alongside this repo at ../data-act-broker-backend
  3. Ensure you have the DATA_BROKER_DATABASE_URL environment variable set, and pointing to a live PostgreSQL server (no database required)
  4. Ensure you have built the Broker backend Docker image by running:
    (usaspending-api) $ docker build -t dataact-broker-backend ../data-act-broker-backend

NOTE: Broker source code should be re-fetched and the image rebuilt to ensure the latest integration is tested.

Re-running the test suite using pytest -rs with these dependencies satisfied should yield no more skips of the broker integration tests.

Contributing

To submit fixes or enhancements, or to suggest changes, see CONTRIBUTING.md
