# Setting up consumerfinance.gov
This quickstart requires a working Docker Desktop installation and git:

```
git clone https://github.com/cfpb/consumerfinance.gov.git
cd consumerfinance.gov
docker-compose up
```

consumerfinance.gov should now be available at http://localhost:8000.

This documentation will be available at http://localhost:8888 (docker-compose only).

The Wagtail admin area will be available at http://localhost:8000/admin/, which you can log into with the credentials `admin`/`admin`.
Please see our running consumerfinance.gov documentation for next steps.
There are also optional steps described below, as well as alternative setup options.
The quickstart above should get you started. Each step is described in more detail below.
## Clone the repository
Using the console, navigate to the root directory in which your projects live and clone this project's repository:

```
git clone git@github.com:cfpb/consumerfinance.gov.git
cd consumerfinance.gov
```
You can configure `git blame` to ignore the bulk formatting commits listed in `.git-blame-ignore-revs` by running the following command within the repository:

```
git config blame.ignoreRevsFile .git-blame-ignore-revs
```
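The effect of this setting can be sanity-checked in a scratch repository; the steps below are a self-contained demonstration (the real repository already ships its own `.git-blame-ignore-revs`):

```shell
# Demonstration in a throwaway repository: configure an ignore-revs file
# and confirm git picks it up.
demo="$(mktemp -d)"
cd "$demo"
git init -q .
touch .git-blame-ignore-revs
git config blame.ignoreRevsFile .git-blame-ignore-revs
# Print the configured value back; outputs: .git-blame-ignore-revs
git config blame.ignoreRevsFile
```

The same file can also be applied to a single invocation with `git blame --ignore-revs-file .git-blame-ignore-revs <path>`.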
## Set up the environment (optional)
The consumerfinance.gov Django site relies on environment variables defined in an `.env` file. If this is your first time setting up the project, copy the provided sample file:

```
cp -a .env_SAMPLE .env
```
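As a sketch of what this file holds, here is a fragment using variable names that appear elsewhere in this guide; the values are illustrative only, and `.env_SAMPLE` is the authoritative reference:

```shell
# .env — illustrative values only; copy .env_SAMPLE for the real defaults.

# Port used by the local Django server and the default Wagtail site.
DJANGO_HTTP_PORT=8000

# Optional: URL of a production database dump, used by refresh-data.sh.
# CFGOV_PROD_DB_LOCATION=
```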
## Set up a local Python environment (optional)
For running our Python unit tests, linting, etc. outside of the Docker container, we rely on a local Python environment managed with pyenv:

```
brew install pyenv pyenv-virtualenv
```

Once pyenv is installed, install Python 3.8:

```
pyenv install 3.8.12
```

First we need to create a Python virtualenv for consumerfinance.gov:

```
pyenv virtualenv 3.8.12 consumerfinance.gov
```

Then we'll need to activate it. Activating the virtualenv is also necessary before using it in the future:

```
pyenv activate consumerfinance.gov
```

Once activated, our Python CI requirements can be installed in the virtualenv:

```
pip install -r requirements/ci.txt
```
We use pre-commit to automatically run our linting tools before a commit
takes place. To install and enable pre-commit, run the following command
from within the repository:

```
pip install -U pre-commit && pre-commit install
```

Before each commit, pre-commit will execute and run our linting tools.
If any task fails, it will attempt to resolve the issue automatically, notify
you of the changes (if any), and ask you to re-stage the changed files. If
all checks pass, the commit will take place as expected, allowing you to then
push to GitHub. This reduces the number of commits with failed lints and
helps developers lint without having to think about it.
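For reference, pre-commit reads its hook list from a `.pre-commit-config.yaml` file in the repository root. The repository ships its own configuration; the fragment below is only a hypothetical illustration of that file's general shape, not our actual hook list:

```yaml
# Hypothetical example of the .pre-commit-config.yaml format; see the file
# in the repository root for the hooks we actually run.
repos:
  - repo: https://github.com/psf/black
    rev: 22.3.0
    hooks:
      - id: black
```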
## Build the frontend

First, install nvm, which we use to manage Node.js versions:

```
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.0/install.sh | bash
```

Once nvm is installed, install Node.js 16:

```
nvm install 16
```

Node.js 16 can then be used in any shell using:

```
nvm use 16
```

Yarn must then be installed:

```
curl -o- -L https://yarnpkg.com/install.sh | bash
```
We have a single script, `./frontend.sh`, that will install our frontend dependencies for both building and unit testing/linting/etc. After the initial setup, Yarn can be used to rebuild our assets, e.g. with `yarn build`.
## Set up and run the Docker containers

To build and run our Docker containers for the first time, run:

Docker Compose:

```
docker-compose up
```

Kubernetes via Helm:

```
./build-images.sh && ./helm-install.sh
```

Either approach will build and start our PostgreSQL, Elasticsearch, and Python services.
## Load initial data

The `initial-data.sh` script can be used to initialize a new database to make
it easy to get started working on consumerfinance.gov.

This script ensures that all migrations are applied to the database
and then does the following:

- Creates an `admin` superuser with password `admin`.
- If it doesn't already exist, creates a new Wagtail home page named CFGOV, with a slug of `cfgov`.
- Updates the default Wagtail site to use the port defined by the `DJANGO_HTTP_PORT` environment variable, if defined; otherwise this port is set to 80.
- If it doesn't already exist, creates a new `SharingSite` with a hostname and port defined by the `DJANGO_STAGING_HOSTNAME` and `DJANGO_HTTP_PORT` environment variables.

This script must be run inside the Docker `python` container:

```
docker-compose exec python sh ./initial-data.sh
```
## Load a database dump

Alternatively, one of our database dumps can be installed using our
`refresh-data.sh` script. You can get a database dump by defining
`CFGOV_PROD_DB_LOCATION` in your `.env` file as described in
our documentation on GitHub Enterprise.

Like `initial-data.sh`, this script must be run inside the Docker
`python` container. Open a shell in the container and set
`CFGOV_PROD_DB_LOCATION` immediately before running the script:

```
docker-compose exec python sh
CFGOV_PROD_DB_LOCATION=http://(rest of the URL) ./refresh-data.sh
```

`refresh-data.sh` can also be given a path to a gzipped database dump, e.g. `./refresh-data.sh path/to/dump.sql.gz`.

Loading a dump automatically (re)builds the Elasticsearch index,
unless you run the `refresh-data.sh` script with its option to skip indexing.
consumerfinance.gov requires a Python environment, PostgreSQL, and Elasticsearch to run. None of this requires Docker; Docker is simply a convenient way to ensure consistent versioning and running of these services alongside the consumerfinance.gov Django site.

The consumerfinance.gov Django site can be run locally in a virtualenv and can
use PostgreSQL and Elasticsearch from either Docker or Homebrew.
## PostgreSQL and Elasticsearch from Docker

To build and start only the PostgreSQL (`postgres`) and Elasticsearch (`elasticsearch`) containers from our Docker Compose setup, explicitly specify them as arguments to `docker-compose up`:

```
docker-compose up postgres elasticsearch
```

This will expose PostgreSQL on port 5432 and Elasticsearch on port 9200.
## PostgreSQL and Elasticsearch from Homebrew

You can install PostgreSQL and Elasticsearch from Homebrew if you're on a Mac:

```
brew install postgresql
brew install elasticsearch
```

Once they're installed, you can configure them to run as services:

```
brew services start postgresql
brew services start elasticsearch
```
Our recommended Postgres configuration is a database named `cfgov` and a user named `cfpb`, with data stored in a schema named `cfpb`. This can be created with the following commands:

```
dropdb --if-exists cfgov && dropuser --if-exists cfpb
psql postgres -c "CREATE USER cfpb WITH LOGIN PASSWORD 'cfpb' CREATEDB"
psql postgres -c "CREATE DATABASE cfgov OWNER cfpb"
psql postgres://cfpb:cfpb@localhost/cfgov -c "CREATE SCHEMA cfpb"
```

We don't support using a SQLite database because we use database fields
that are specific to Postgres. The `CREATEDB` keyword above allows the
`cfpb` user to create a temporary Django database when running unit tests.
## Set up the site
After you have chosen a means to run PostgreSQL and Elasticsearch, set up the environment, set up a local Python environment, and built the frontend, all the Python dependencies for running locally can be installed:

```
pyenv activate consumerfinance.gov
pip install -r requirements/local.txt
```

Once complete, our `runserver.sh` script will bring up the site at http://localhost:8000.
You can optionally use our private fonts from a CDN as well.
## Sync local image and document storage (optional)
If using a database dump, pages will contain links to images or documents that exist in the database but don't exist on your local disk. This will cause broken or missing images or links when browsing the site locally.
This is because in production images and documents are stored on S3, but when running locally they are stored on disk.
This project includes two Django management commands that can be used to download any remote images or documents referenced in the database so that they can be served when running locally.
This command downloads all remote images (and image renditions) referenced in the database, retrieving them from the specified URL and storing them in the specified local directory:

```
cfgov/manage.py sync_image_storage https://files.consumerfinance.gov/f/ ./cfgov/f/
```
This command does the same, but for documents:

```
cfgov/manage.py sync_document_storage https://files.consumerfinance.gov/f/ ./cfgov/f/
```
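Conceptually, each command walks the relevant database records and downloads any referenced file into the local storage directory. A minimal sketch of that download step is below; the `sync_file` function and its signature are illustrative only, not the actual implementation of these management commands:

```python
# Illustrative sketch: fetch one remote file and store it locally, the way
# the sync commands do for every image, rendition, and document they find.
import os
import shutil
from urllib.request import urlopen


def sync_file(base_url, filename, dest_dir):
    """Fetch base_url + filename and store it at dest_dir/filename."""
    os.makedirs(dest_dir, exist_ok=True)
    dest_path = os.path.join(dest_dir, filename)
    with urlopen(base_url + filename) as response, open(dest_path, "wb") as out:
        # Stream the response body to disk without loading it all into memory.
        shutil.copyfileobj(response, out)
    return dest_path
```

For example, `sync_file("https://files.consumerfinance.gov/f/", "header.png", "./cfgov/f/")` would mirror a single (hypothetical) remote file locally.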
## Install GNU gettext for Django translation support (optional)

On macOS, GNU gettext is available via Homebrew:

```
brew install gettext
```

but it gets installed as "keg-only" due to conflicts with the default installation of BSD gettext. This means that GNU gettext won't be loaded in your PATH by default. To fix this, you can run

```
brew link --force gettext
```

to force its installation, or see `brew info gettext` for alternate approaches.

If installed locally, you should be able to run this command successfully:

```
$ gettext --version
```
GNU gettext is also required to run our translation-related unit tests locally.