- Enter a local Python virtual environment (see the setup sketch after this list)
- Make sure the `poetry` package is installed
- In directory `./src/backend_api`, run:

```bash
poetry install
```
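For example, a minimal setup might look like the following sketch (the environment name `.venv` and installing Poetry via `pip` are assumptions; any equivalent workflow works):

```bash
# create and activate a local virtual environment (name is illustrative)
python3 -m venv .venv
source .venv/bin/activate

# install poetry into the environment if it is not already available
pip install poetry

# install the backend API dependencies
cd src/backend_api
poetry install
```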
For local development and testing, we use the override containers. These expose local ports so the services can be reached from the host. In directory `./src`, run:

```bash
docker-compose up -d
```

This will start the containers defined in the `docker-compose.override.yml` file.
- Go to http://localhost:8881/, or access the pgAdmin software on your own computer
- User credentials (see the example environment sketch after this list):
  - Email: `${PGADMIN_DEFAULT_EMAIL}`
  - Password: `${PGADMIN_DEFAULT_PASSWORD}`
- Register server credentials:
  - General → Name: enter a custom name for the server
  - Connection → Host name/address: `${POSTGRES_SERVER}` if connecting in the browser, or `${DOMAIN}` if connecting to pgAdmin on your own computer
  - Connection → Username: `${POSTGRES_USER}`
  - Connection → Password: `${POSTGRES_PASSWORD}`
  - Leave everything else unchanged
  - Save
- Tables can be found under:
  `<custom name> > Databases > ${POSTGRES_DB} > Schemas > public > Tables`
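The `${...}` variables above are read from the project's environment configuration (for example an `.env` file). The snippet below is only a sketch with placeholder values, not the project's real settings; `POSTGRES_SERVER=db` in particular is an assumption:

```bash
# example environment values (placeholders only)
PGADMIN_DEFAULT_EMAIL=admin@example.com
PGADMIN_DEFAULT_PASSWORD=changethis
POSTGRES_SERVER=db
POSTGRES_USER=pom_oltp_superuser
POSTGRES_PASSWORD=changethis
POSTGRES_DB=pom_oltp_db
DOMAIN=localhost
```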
This section describes how to run the various tests in this application.
The following describes how to run the tests in the backend API. For information about the API tests, please check out the Test module documentation.
The development docker containers must be running:

```bash
docker compose up -d
```

Start the docker test containers:

```bash
docker compose -f docker-compose.test.yml up -d
```

To run all of the tests in the backend container, run:

```bash
docker compose exec -T backend pytest
```

To run tests against the real environment backend API, run:

```bash
docker compose exec -T backend sh scripts/test_scripts/{test_script_name}.sh
```
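Since the tests run through `pytest`, the usual `pytest` selection flags can be passed as well. For example, to run only tests whose names match a pattern and stop at the first failure (the pattern here is hypothetical):

```bash
# -k selects tests by name, -x stops on first failure, -v prints test names
docker compose exec -T backend pytest -k "rate_limit" -x -v
```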
The following describes how to run the tests in backend data retrieval.
You need to build `Dockerfile.test` within the `backend_data_retrieval` module root.
To build the container, run:

```bash
docker build -t backend_data_retrieval_test -f backend_data_retrieval/Dockerfile.test backend_data_retrieval
```

To run all tests in backend data retrieval, run:

```bash
docker run --rm backend_data_retrieval_test
```

To run a specific test file, pass its path, for example:

```bash
docker run --rm backend_data_retrieval_test data_retrieval_app/tests/external_data_retrieval/test_continuous_data_retrieval.py
```
This section describes how to migrate changes made in the database models to the Postgres database server. Migrations are done inside the `backend` container, which runs our API.
To enter the `backend` container, run:

```bash
docker container exec -it src-backend-1 bash
```

- To create an alembic revision, run:

  ```bash
  alembic revision --autogenerate -m "Message"
  ```

- Review the generated migration file before applying it
- To migrate database changes, run:

  ```bash
  alembic upgrade head
  ```

To revert the latest changes made by an alembic revision, run:

```bash
alembic downgrade -1
```
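Putting the steps together, a typical migration round trip inside the `backend` container might look like this sketch (the revision message is illustrative):

```bash
# generate a new revision from the current models
alembic revision --autogenerate -m "add example column"

# review the generated file under the alembic versions directory,
# then apply it to the database
alembic upgrade head

# confirm which revision the database is now at
alembic current

# roll back the last revision if something looks wrong
alembic downgrade -1
```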
The Redis server is used to handle rate limiting on the Path of Modifiers API. A user's activity against the endpoints is stored in the Redis server to keep track of request rates.
You need to have the Redis CLI installed on your computer.
To access the Redis cache server, run:

```bash
redis-cli -h $DOMAIN -p 6379 -a $REDIS_PASSWORD
```
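Conceptually, the rate limiting amounts to counting requests per user per time window in Redis. Below is a minimal sketch of that pattern with `redis-cli`, assuming a hypothetical key scheme; the application's real key names may differ, and `EXPIRE ... NX` requires Redis 7+:

```bash
# count one request for a (hypothetical) user/endpoint key
redis-cli -h $DOMAIN -p 6379 -a $REDIS_PASSWORD INCR "ratelimit:user123:/api/modifier"
# start a 60-second window only when the key is first created
redis-cli -h $DOMAIN -p 6379 -a $REDIS_PASSWORD EXPIRE "ratelimit:user123:/api/modifier" 60 NX
# inspect the current count and the time left in the window
redis-cli -h $DOMAIN -p 6379 -a $REDIS_PASSWORD GET "ratelimit:user123:/api/modifier"
redis-cli -h $DOMAIN -p 6379 -a $REDIS_PASSWORD TTL "ratelimit:user123:/api/modifier"
```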
These are errors that may show up when creating a Redis container in our application:

```
WARNING Memory overcommit must be enabled! Without it, a background save or replication may fail under low memory condition...
```

This warning can be fixed by enabling memory overcommit persistently across reboots:

```bash
echo "vm.overcommit_memory = 1" | sudo tee /etc/sysctl.d/nextcloud-aio-memory-overcommit.conf
```

To enable it immediately for the current session:

```bash
sudo sysctl vm.overcommit_memory=1
```
We use a tool called pre-commit for code linting and formatting.
When you install it, it runs right before you make a commit in git. This way it ensures that the code is consistent and formatted even before it is committed.
You can find the file `.pre-commit-config.yaml` with the configuration at the root of the project.
`pre-commit` is already part of the dependencies of the project, but you could also install it globally if you prefer, following the official pre-commit docs.
After having the pre-commit tool installed and available, you need to "install" it in the local repository so that it runs automatically before each commit.
In the root directory `.`, using Poetry, you can do it with:

```bash
poetry -C src/backend_api run pre-commit install --config src/.pre-commit-config.yaml
```

Now whenever you try to commit, e.g. with:

```bash
git commit
```

...pre-commit will run, checking and formatting the code you are about to commit, and will ask you to add (stage) that code with git again before committing.
Then you can `git add` the modified/fixed files again, and now you can commit.
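A typical session might therefore look like this (the file name and commit message are illustrative):

```bash
git add app/example.py
git commit -m "tweak example"   # pre-commit runs and may reformat the file
git add app/example.py          # re-stage the auto-fixed file
git commit -m "tweak example"   # hooks pass, commit succeeds
```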
You can also run pre-commit manually on all the files.
In the root directory `.`, using Poetry, you can do it with:

```bash
poetry -C src/backend_api run pre-commit run --all-files --config src/.pre-commit-config.yaml
```
This is useful if you have gathered data and want to benchmark and optimize performance across different database architectures, or just want to create a safe restore point before performing general migrations.
To back up the database to a file, make sure you are in a suitable folder for the file (which is quite large). Create the file:

```bash
docker exec -t src-db-1 pg_dumpall -c -U pom_oltp_superuser > pom_db_data_dump_`date +%Y-%m-%d"_"%H_%M_%S`.sql
```

To restore the database from the file:

```bash
cat your_dump.sql | docker exec -i src-db-1 psql -U pom_oltp_superuser -d pom_oltp_db
```
Source: stackoverflow-backup-restore-postgres-db
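Since the dump file can be quite large, it may be worth compressing it on the fly. A sketch of the same backup/restore pair piped through `gzip` (the file name is illustrative):

```bash
# backup, compressed as it is written
docker exec -t src-db-1 pg_dumpall -c -U pom_oltp_superuser | gzip > pom_db_dump.sql.gz

# restore from the compressed dump
gunzip -c pom_db_dump.sql.gz | docker exec -i src-db-1 psql -U pom_oltp_superuser -d pom_oltp_db
```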
The production or staging URLs would use these same paths, but with your own domain.

- Frontend: http://localhost
- Backend API: http://localhost/api/
- Automatic Interactive Docs (Swagger UI): http://localhost/docs
- Automatic Alternative Docs (ReDoc): http://localhost/redoc
- pgAdmin: http://localhost:8081
- Traefik UI: http://localhost:8090