This repository contains a tool to make White House Office of Management and Budget (OMB) policy requirements easier to view, comply with, and maintain.
The repository contains a Django application in which the OMB team maintains the requirements; this application also serves an API. The repo also contains an agency- and public-facing tool to view and filter the requirements and see related information about them; this tool is an isomorphic Node and React application.
Brought to you by the 18F eRegulations team.
See our bi-weekly demos on YouTube.
The OMB Policy Library project grew out of the team that launched eRegulations for multiple agencies. The history of eRegs, along with more documentation, previous agency work, and examples, is available here.
We are creating software that simplifies large, complex, hard-to-navigate policies for agency implementers so they can comply quickly and easily. This saves them time, money, and frustration, so that they can get back to fulfilling their agency's mission.
While the long-term results of the project will affect many layers of government, the public, and industry, the current project's stakeholders are primarily those who are affected by policy guidance, charged with writing it, or responsible for enforcing it. This includes Federal agency policy analysts, CIOs, agency policy writers, and agency leadership. Our MVP pilot is working with the OMB OFCIO office, which serves as the initial stakeholder.
The primary risk we face is updating antiquated documentation processes and tools, and trying to improve them without making the conversion too onerous for agencies. By focusing intensely on user-centered design, we are working with agency stakeholders and users to learn how to most effectively relieve the burden they face without adding to their workload.
Besides the financial and time savings of offering a more effective and organized user experience, these are some of our current draft goals for the MVP launch of the product:
Goal: Users are less confused about what policy requires of them. Metric: Help "tickets" (calls, emails, etc.) to OMB desk officers go down in volume and length.
Goal: Users can quickly and easily find the parts of policy that they need. Metric: Time spent searching for policy is reduced.
Goal: Convert the policy library to pure digital data. Metric: 100% of policies will be in real text, as opposed to PDF, by the time we're done.
Goal: Reduce the time it takes to update a policy and publish it. Metric: The editing and approval process is significantly shorter.
Goal: Create a policy input parser that deals accurately with messy source docs. Metric: More than 80% of imported policies require minimal editing/fixes.
Goal: Sharing of policy information is valuable and used frequently. Metric: Track the usage of the sharing link.
A work-in-progress roadmap is currently available to stakeholders and the team.
We recommend using Docker, an open-source container engine. If you haven't already, please install Docker and Docker Compose (which is installed automatically with Docker on Windows and OS X). We'll also refer to several bash scripts that wrap Docker Compose commands. If you're running Windows, you'll want to set up bash or run the wrapped commands directly.
Let's start by adding an admin user.
bin/manage.py migrate # set up database
bin/manage.py createsuperuser
# [fill out information]
docker-compose up
# [Wait while it sets up]
# Starting development server at http://0.0.0.0:8001/
# Quit the server with CONTROL-C.
Then navigate to http://localhost:8001/admin/ and log in.
If you haven't already done so, run:
docker-compose up
# Ctrl-c to kill
Then navigate to http://localhost:8002/.
This runs in development mode (including automatic JS recompilation).
To run in prod mode, first run:
cp .env.sample .env
Then edit .env and uncomment all lines related to production mode.
Then run:
# Build the UI styles
bin/ui-npm run build-css
# Build the UI app
bin/ui-npm run build
# Build the API styles
bin/api-npm run build
# Collect all static files for the admin
bin/manage.py collectstatic
docker-compose up
Then navigate to http://localhost:8002/.
Let's also load example requirements, agencies, and a whole document:
bin/manage.py fetch_csv
bin/manage.py import_reqs data.csv
bin/manage.py import_xml_doc example_docs/m_16_19_1.xml M-16-19
bin/manage.py import_xml_doc example_docs/m_15_16.xml M-15-16
This may emit some warnings for improper input. The next time you visit the admin, you'll see it's populated.
This project has functionality that processes PDFs and prepares them for import into the policy library.
To download some PDFs for development, you can run:
bin/manage.py download_pdfs
You can then access a variety of development and debugging-related views for the PDFs at http://localhost:8001/pdf/.
Note: You can add --rm to the following commands to clean up the associated containers after the commands finish; alternatively, you can run docker system prune at regular intervals to do the same thing.
The following commands pertain to the API and/or its database:
bin/bandit
bin/flake8
bin/manage.py
bin/mypy
bin/pip-compile
bin/ptw
bin/py.test
bin/psql
The following commands pertain to the API UI, which is the user interface that staff and administrators use:
bin/api-npm
The following commands pertain to the UI, which is the user interface that the general public uses:
bin/ui-npm
If the app is throwing an unexpected exception, it might be due to needing new libraries or needing to run a database migration. As a first debugging step, try bouncing the system:
docker-compose down
docker-compose up
Try setting USE_POLLING=true, either in your host shell environment or via an .env file. This will force all the watchers to use filesystem polling instead of OS notifications, which works better on some platforms, such as Windows.
If you see an error about a conflicting port, try spinning down the running services (including those associated with integration tests).
docker-compose down
docker-compose -p integration_tests down
If all is lost and you want to start from scratch, run:
docker-compose down -v # also removes database data
docker-compose -p integration_tests down -v
Generally, we don't need to set up local development with authentication credentials. To exercise that workflow, however, you can create a docker-compose.override.yml file. Populate it with the following:
version: '2.1'
services:
  api:
    environment:
      VCAP_SERVICES: >
        {"config": [{"name": "config", "credentials": {"UI_BASIC_AUTH": {
          "myusername": "mypassword",
          "myothername": "itspassword"
        }}}]}
In production, we always run with MAX authentication, but for local development, we've opted for Django's password-based authentication. If you would prefer to test MAX authentication, set the MAX_URL environment variable in your .env file; see .env.sample for some example values.
We aim to store a history of changes to requirements, agencies, etc. as a safety against accidental data loss. Django-reversion handles changes made in the admin and offers partial solutions for data changes outside of that context. We must be careful to always wrap creation, deletion, and updates to data within its create_revision block, lest we have no history of the new data. Relatedly, we must not use backwards references (e.g. a blog_set field on authors) when updating data, as those won't get serialized.
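To illustrate why mutations must happen inside the recording block, here is a rough plain-Python analogy of the pattern. This is not the django-reversion API; the context manager, history list, and record dict below are invented purely for illustration:

```python
import copy
from contextlib import contextmanager

# Illustrative stand-in for django-reversion's create_revision:
# a snapshot is only captured for changes made inside the block.
history = []

@contextmanager
def create_revision(obj):
    yield obj
    history.append(copy.deepcopy(obj))  # snapshot taken on block exit

record = {"name": "M-16-19", "status": "draft"}

# Correct: the update happens inside the block, so it is recorded.
with create_revision(record) as r:
    r["status"] = "published"

# Incorrect: this edit bypasses the block and leaves no history entry.
record["status"] = "withdrawn"

print(history)  # only the "published" snapshot was captured
```

The same intuition applies in Django: a save performed outside create_revision leaves no trail to roll back to.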
When we create database migrations, we may want to create a revision of all affected models. This is necessary when moving data from one field to another or transforming data in place. To do this, we can specify a REVISED_MODELS field on our migration and set it to contain a sequence of (app_label, model_name) pairs. After all migrations are run, Django will check which (if any) models need revisions generated. See reqs/migrations/0040_auto_20170616_1501.py for an example.
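As a hypothetical sketch of the declaration (the model names here are illustrative; in a real migration this sits alongside the `class Migration(migrations.Migration)` definition):

```python
# Hypothetical data-migration fragment: REVISED_MODELS lists the
# (app_label, model_name) pairs whose instances should receive a
# fresh revision after all migrations run. Names are illustrative.
REVISED_MODELS = [
    ('reqs', 'Requirement'),
    ('reqs', 'Agency'),
]

# Each entry unpacks into an app label and a model name.
for app_label, model_name in REVISED_MODELS:
    print(f"will revise: {app_label}.{model_name}")
```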
We provide access to JSON-serialized versions of each of our data types via a
RESTful API. This data can be filtered using a Django queryset-like syntax
(see Django Filters).
Notably, one can query on related fields using Django-style lookups like __in and __range. For example, to query for requirements which match a certain set of keywords, use:
https://.../requirements/?keywords__name__in=Keyword1,Keyword2,Keyword3
See our list of endpoints and available filters.
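As an illustration, a client could assemble such a filtered request URL with Python's standard library (the host and keyword values below are placeholders, not real endpoints):

```python
from urllib.parse import urlencode

# Build a Django-Filters-style query string: keywords__name__in takes
# a comma-separated list of values. Host and keywords are placeholders.
base = "https://example.gov/requirements/"
keywords = ["Keyword1", "Keyword2", "Keyword3"]
query = urlencode({"keywords__name__in": ",".join(keywords)})
url = f"{base}?{query}"
print(url)
```

Note that urlencode percent-encodes the commas (as %2C), which the API's filter backend decodes back into a value list.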
We have unit tests for the API/admin (Python) and for the React-based frontend (JS), which are executed in different ways.
For Python unit tests, run:
bin/flake8 # linting
bin/mypy . # type checking
bin/py.test # run tests once
bin/ptw # watch command to run tests whenever a file changes
For JS unit tests, run:
bin/ui-npm run lint # linting
bin/ui-npm test # run tests once
bin/ui-npm run test:watch # watch command to run tests whenever a file changes
In the above commands, you can replace ui with api-ui to run the tests for the API UI.
We also have a suite of integration tests, which are relatively complicated to set up, so we've wrapped them in a script:
./devops/integration-tests.sh
If your environment does not have a bash-like shell, inspect that file to implement something similar.
See our .circleci/config.yml for a list of the exact commands we run in CI.
We deploy to our dev/demo environment via CircleCI after every merge to master.
To deploy manually (or to prod), you will need to install the cf command line tool and an associated plugin:
- Install/set up cf for cloud.gov (our Org name is omb-eregs)
- Install the autopilot plugin
Then, make sure you've built the frontend:
bin/ui-npm run build-css
And deploy!
./devops/deploy.sh dev # replace "dev" with "prod" if desired
See the eRegulations overview for context about eRegulations, which is a multi-agency project.
If you're interested in contributing to OMB eRegulations, see the contributing guidelines.
This project is in the worldwide public domain. As stated in CONTRIBUTING:
This project is in the public domain within the United States, and copyright and related rights in the work worldwide are waived through the CC0 1.0 Universal public domain dedication.
All contributions to this project will be released under the CC0 dedication. By submitting a pull request, you are agreeing to comply with this waiver of copyright interest.