This repository is used for the backend of the Curious application stack.
- Curious Admin - GitHub Repo
- Curious Backend - This Repo
- Curious Mobile App - GitHub Repo
- Curious Web App - GitHub Repo
On MacOS:
```sh
brew install uv
```

On other platforms, follow the uv install instructions.
Python versions and dependencies are managed via uv. There is no need to use other tooling such as pip,
pyenv, pipenv, etc.
Install the Python version specified in pyproject.toml:

```sh
uv python install
```
⚠️ If the Python version changes (e.g. from 3.12 to 3.13), this command will need to be rerun.
Install all dependencies from pyproject.toml:
```sh
uv sync
```

The backend leverages the Python package dotenv to read environment variables from a `.env` file on the file system. This is useful during local development because of the large number of environment variables required to configure the application.
Use the supplied .env.default as a baseline to get started. It has valid defaults for nearly all local development:

```sh
cp .env.default .env
```

🛑 NOTE: Make sure to set `RABBITMQ__USE_SSL=False` for local development.
| Key | Default value | Description |
|---|---|---|
| DATABASE__HOST | postgres | Database Host |
| DATABASE__USER | postgres | Username for the PostgreSQL database user |
| DATABASE__PASSWORD | postgres | Password for the PostgreSQL database user |
| DATABASE__DB | mindlogger_backend | Database name |
| DATABASE__POOL_SIZE | 5 | Database connection pool size |
| DATABASE__POOL_OVERFLOW_SIZE | 10 | Allowed overflow size of the connection pool |
| DATABASE__POOL_TIMEOUT | 30 | The number of seconds to wait for a connection from the pool to become available |
| CORS__ALLOW_ORIGINS | * | List of allowed origins. Sets the Access-Control-Allow-Origin header. Example: https://dev.com,http://localhost:8000 |
| CORS__ALLOW_ORIGINS_REGEX | - | Regex pattern of allowed origins |
| CORS__ALLOW_CREDENTIALS | true | Sets the Access-Control-Allow-Credentials header |
| CORS__ALLOW_METHODS | * | Sets the Access-Control-Allow-Methods header |
| CORS__ALLOW_HEADERS | * | Sets the Access-Control-Allow-Headers header |
| AUTHENTICATION__ACCESS_TOKEN__SECRET_KEY | secret1 | Access token salt |
| AUTHENTICATION__REFRESH_TOKEN__SECRET_KEY | secret2 | Refresh token salt |
| AUTHENTICATION__REFRESH_TOKEN__TRANSITION_KEY | transition secret | Transition refresh token salt, used when rotating the refresh token key: generate a new key for AUTHENTICATION__REFRESH_TOKEN__SECRET_KEY and keep the previous value as the transition key so refresh tokens issued before the rotation are still accepted during the transition period (see AUTHENTICATION__REFRESH_TOKEN__TRANSITION_EXPIRE_DATE) |
| AUTHENTICATION__REFRESH_TOKEN__TRANSITION_EXPIRE_DATE | transition expiration date | Transition expiration date. After this date the transition token is ignored |
| AUTHENTICATION__ALGORITHM | HS256 | The JWT's algorithm |
| AUTHENTICATION__ACCESS_TOKEN__EXPIRATION | 30 | Time in minutes after which the access token will stop working |
| AUTHENTICATION__REFRESH_TOKEN__EXPIRATION | 30 | Time in minutes after which the refresh token will stop working |
| ADMIN_DOMAIN | - | Admin panel domain |
| RABBITMQ__URL | rabbitmq | RabbitMQ service URL |
| RABBITMQ__USE_SSL | True | RabbitMQ SSL setting; set to False for local development |
| MAILING__MAIL__USERNAME | mailhog | Mail service username |
| MAILING__MAIL__PASSWORD | mailhog | Mail service password |
| MAILING__MAIL__SERVER | mailhog | Mail service URL |
| MULTI_INFORMANT__TEMP_RELATION_EXPIRY_SECS | 86400 | Expiry (seconds) of the temporary multi-informant "take now" participant relation |
| SECRETS__SECRET_KEY | - | Secret key for data encryption. Use this key only for local development |
| ONEUP_HEALTH__CLIENT_ID | - | OneUpHealth API Client ID |
| ONEUP_HEALTH__CLIENT_SECRET | - | OneUpHealth API Client secret |
| ONEUP_HEALTH__MAX_ERROR_RETRIES | 5 | Maximum number of times to re-attempt fetching health data from the OneUpHealth API. The overall total number of attempts will be this value plus one |
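The token secret keys above are salts for HS256-signed JWTs. As a rough illustration only (the backend uses a proper JWT library, not this code), HS256 signing amounts to an HMAC-SHA256 over the base64url-encoded header and payload:

```python
# Illustration only: shows what AUTHENTICATION__ALGORITHM=HS256 means.
# Real deployments should use randomly generated secrets
# (e.g. secrets.token_hex(32)), never defaults like "secret1".
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_hs256(payload: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signature = hmac.new(
        secret.encode(), f"{header}.{body}".encode(), hashlib.sha256
    ).digest()
    return f"{header}.{body}.{b64url(signature)}"

token = sign_hs256({"sub": "user-id"}, "secret1")
print(token.count("."))  # 2: header.payload.signature
```

Because the signature is keyed by the secret, rotating AUTHENTICATION__REFRESH_TOKEN__SECRET_KEY invalidates old tokens, which is why the transition key mechanism above exists.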
✋ Note: most environment variables use a double underscore (`__`) rather than `_`. The application leverages pydantic-settings, which supports nested settings models to effectively namespace related settings.
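As a sketch of that namespacing idea (the helper below is hypothetical, not project code — pydantic-settings performs this mapping for the application):

```python
# Hypothetical helper, not project code: shows how double-underscore env
# keys map onto nested settings groups.
def nest_env(env: dict[str, str]) -> dict:
    tree: dict = {}
    for key, value in env.items():
        parts = key.lower().split("__")
        node = tree
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return tree

print(nest_env({"DATABASE__HOST": "postgres", "DATABASE__POOL_SIZE": "5"}))
# {'database': {'host': 'postgres', 'pool_size': '5'}}
```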
The application requires Postgres, Redis, and RabbitMQ to be running in order to start up and serve requests, as well as to run the test suite.
If mail services are needed, mailhog is required and is provided via docker compose.
If uploading media files to applets or answers, then an S3 compatible service is needed. MinIO is provided via docker compose.
To run all services required for the backend:
```sh
docker-compose up postgres redis rabbitmq
```

If you also need the mail and S3 storage services:

```sh
docker-compose up postgres redis rabbitmq mailhog minio
```

Services can also be run individually:

```sh
# Run Postgres
docker-compose up -d postgres

# Run Redis
docker-compose up -d redis

# Run RabbitMQ
docker-compose up -d rabbitmq
```
⚠️ When using MinIO, additional configuration is needed so that boto3 talks to the local endpoint:

```sh
AWS_ACCESS_KEY_ID=minioaccess AWS_SECRET_ACCESS_KEY=miniosecret AWS_ENDPOINT_URL=http://localhost:9000 AWS_DEFAULT_REGION=us-east-1
```
🛑 NOTE: If the application can't find the RabbitMQ service even though it's running normally, change your `RABBITMQ__URL` to your local IP address instead of `localhost`.
It is not recommended to run the services natively. Please use the provided docker setup.
The database needs to be initialized with tables and starting data via alembic (ensure your postgres container is running).
```sh
uv run alembic upgrade head
```

If you are on a unix-type system (Linux, macOS, WSL, etc.), add these entries to /etc/hosts. This will need to be done with elevated privileges (e.g. `sudo vi /etc/hosts`):
```
#mindlogger
127.0.0.1 postgres
127.0.0.1 rabbitmq
127.0.0.1 redis
127.0.0.1 mailhog
```
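To verify the entries took effect, a quick sanity check from Python (hypothetical, not part of the repo):

```python
# Hedged sanity check: after editing /etc/hosts, each service hostname
# should resolve to the loopback address.
import socket

def resolve(hosts: list[str]) -> dict[str, str]:
    """Return a mapping of hostname -> resolved IPv4 address."""
    return {h: socket.gethostbyname(h) for h in hosts}

# Expect every value to be "127.0.0.1":
# resolve(["postgres", "rabbitmq", "redis", "mailhog"])
print(resolve(["localhost"]))
```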
To run just the application, make sure all required services are properly set up and running.
Then start just the app:
```sh
docker-compose up app
```

To run all services (useful when developing the other applications in the stack):

```sh
docker-compose up
```

To run the application natively with uvicorn:

```sh
uvicorn src.main:app --proxy-headers --port 8000 --reload
```

Alternatively, you may run the application using make:

```sh
make run
```

See the PyCharm documentation for details.
You can use the Makefile to work with the project (run the application, code quality tools, tests, etc.).
For local usage:
```sh
# Run the application
make run

# Run the test suite
make test

# Check the code quality
make cq

# Check and fix code quality
make cqf

# Check everything in one hop
make check
```

```sh
# Connect to the database with Docker
docker-compose exec postgres psql -U postgres postgres

# Or connect to the database locally
psql -U postgres postgres
```

```
# Create user's database
psql# create database test;

# Create arbitrary database
psql# create database test_arbitrary;

# Create user test
psql# create user test;

# Set password for the user
psql# alter user test with password 'test';
```
- Alembic details
- Arbitrary Server
- Curious CLI
- Database
- Git Helpers
- PyCharm Setup
- Security tokens (for deployments)
Common Public Attribution License Version 1.0 (CPAL-1.0)
Refer to LICENSE.md