- emgapi - Django RESTful API
- ebi-metagenomics-client - React FrontEnd
- sourmash-queue - microservice API to perform Sourmash MAG vs MAG searches
These repositories are also part of the MGnify web stack, but aren't included as submodules (yet):
- mgnify-sourmash-component - web component to create Sourmash sketches in the browser. Published on NPM. See `npm link` for a convenient way to work on this locally too.
- genome-search - microservice API to perform COBS k-mer gene searches on genomes
- blog - GitHub-pages hosted blog and feed for the MGnify frontpage
- notebooks - MGnify Notebooks Server (Jupyter Lab) and Docs (Quarto markdown).
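The `npm link` route mentioned above for working on `mgnify-sourmash-component` locally can be sketched as follows – this is just the standard npm link workflow, and the directory layout is an assumption:

```shell
# Register the component globally from its checkout
# (directory names are illustrative):
cd mgnify-sourmash-component
npm link

# Then make the client resolve the package from that live checkout:
cd ../ebi-metagenomics-client
npm link mgnify-sourmash-component
```

Edits to the component are then picked up by the client's dev server without re-publishing to NPM.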
You will need Task, Docker, Docker Compose, Node.js, and webpack (`npm install -g webpack`).
```shell
git clone --recurse-submodules git@github.com:EBI-Metagenomics/mgnify-web.git
```
Change `ebi-metagenomics-client/config.private.json` to include:

```json
{
  "api": "http://127.0.0.1:8000/v1/"
}
```
then

```shell
task restore-mongo-test-db
task run-client
```

and browse to http://localhost:9000/metagenomics.
The Taskfile covers most common dev commands, like testing and version bumping. Run `task --list` to see what can be done.
Clone the repo and the submodules:

```shell
git clone --recurse-submodules git@github.com:EBI-Metagenomics/mgnify-web.git
```
Docker Compose is used for local development – it is NOT used in production. By default, the `docker-compose.yml` creates an environment with SQLite for the EMG database and a MongoDB service. Extra profiles are included in the docker-compose file to start additional services when needed.
- Use the profile `--profile sourmash` to start the services needed to work on the Sourmash Genome Search.
- Use the profile `--profile mysql` to start a MySQL service needed to work on MySQL-specific things (like full-text index, which isn't supported in SQLite).
Additional config.yml entries/files will be needed for these.
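For example (standard Docker Compose profile usage; the profile names are the ones listed above):

```shell
# Base stack plus the Sourmash search services:
docker compose --profile sourmash up -d

# Base stack plus a MySQL service:
docker compose --profile mysql up -d
```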
There is a `Taskfile.yml` with tasks for most common development needs, like running the services, tests, and managing test databases. There are currently no tasks to help with the MySQL setup (like running the `mysql` profile and targeting the `mysql` config).
Note that MySQL is used on GitHub Actions for CI, to match the production setup of this API at EBI.
You can either use a db dump, a minimal-ish test db, or an empty db.
A SQLite database is available in the `ebi-metagenomics-client/ci/testdbs` directory (because this is used in the webclient CI tests). If you ever need to recreate this, use `task create-test-dbs`. This makes a minimally populated, migrated SQLite `emg` db, a very minimal fake `ena` SQLite db, and dumps a Mongo archive.
There is a MongoDB dump to go alongside this option. Restore it with:

```shell
task restore-mongo-test-db
```
Run the Django migrations to get the DB in shape:

```shell
task manage -- migrate
```
Some manual intervention is required to populate the API database with a production DB dump. You need to get a dump of the MySQL database; for that, refer to the documentation in Confluence (EBI-only). Use the MySQL setup, and pipe the dump files into MySQL – the dump files contain `INSERT` commands.
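The piping step might look like the sketch below – the dump file names and credentials are hypothetical placeholders (the real details are in the Confluence docs), and the host/port assume the Compose MySQL service is published locally:

```shell
# Hypothetical file names and credentials – adjust to your dump
# and to the mysql service's settings in docker-compose-mysql.yml.
mysql -h 127.0.0.1 -P 3306 -u root -p emg < emg_schema.sql
gunzip -c emg_data.sql.gz | mysql -h 127.0.0.1 -P 3306 -u root -p emg
```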
You can add a Django superuser to the database, so you can use the Django Admin console. In the minimal SQLite dbs, one has been created (username: `emgtest`, password: `emgemgtesttest`).

```shell
task manage -- createsuperuser
```
Then you can log into the Django admin console.
To run MGnify you will need the API and the WebClient running at the same time.
The API will run using docker-compose (to run the databases, Django, etc.):

```shell
task run-api
```
If you need to use the MySQL backend instead, you would run:

```shell
docker compose -f docker-compose.yml -f docker-compose-mysql.yml up -d
task manage -- collectstatic
task manage -- runserver 0.0.0.0:8000
```
Either way, the API will be available at http://localhost:8000/metagenomics/api.
Install npm modules:

```shell
npm install
```

If you want the client to talk to the local API, change `ebi-metagenomics-client/config.private.json` to include:

```json
{
  "api": "http://127.0.0.1:8000/v1/"
}
```
Run the webpack dev server (and API, DBs, etc. via docker compose):

```shell
task run-client
```

The webclient will be available at http://localhost:9000/metagenomics.
To work on the Sourmash (MAG) search, build a minimal index in the `sourmash-queue` service:

```shell
task create-sourmash-test-index
```
Flower (a dashboard for the Celery queue system) is also running; browse to port 5555 to see it.
To debug the worker: `docker attach mgnify-web-sourmash-queue-1`.
To run any Django manage command for the API:

```shell
task manage -- whatever-command
```

(Note the use of `--` to separate arguments for `manage.py` – this is a standard Taskfile feature.) For example:

```shell
task manage -- migrate 001
```
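The `--` convention can be illustrated in plain shell – this is not how Task implements it, just the behaviour it follows: everything before `--` belongs to the outer command, and everything after it is forwarded untouched.

```shell
# Simulate forwarding for `task manage -- migrate 001`:
# drop everything up to and including `--`; the rest goes to manage.py.
set -- manage -- migrate 001
while [ "$1" != "--" ]; do shift; done
shift
echo "forwarded: $@"   # -> forwarded: migrate 001
```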
On GitHub, these are run by `.github/workflows/test.yml` in a similar-ish way.
```shell
task run-client
# (in a new terminal)
cd ebi-metagenomics-client
task test-client-run
# or to run a single test:
task test-client-run -- -s "cypress/integration/browse.js"
# or for an interactive testing experience:
task test-client-open
```
```shell
task test-api
# or for specific test/s with file/class/method name matching some string:
task test-api -- -k "PublicationAPI"
```
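`-k` takes a standard pytest match expression, so selections can be combined (the substrings below are hypothetical examples of test names):

```shell
# Match "Publication" but exclude anything with "detail" in the name:
task test-api -- -k "Publication and not detail"

# Add -x to stop at the first failure:
task test-api -- -x -k "Publication and not detail"
```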
Note that `task test-api` uses the `--profile tests` docker-compose profile, and a different config file too.