# Database

## Overview

Trackpull uses a single SQLite database at `/config/trackpull.db`. All access goes through `db.py`, which uses thread-local connections to stay safe under multi-threaded Gunicorn workers.

---

## Schema

### `users`

| Column | Type | Notes |
|--------|------|-------|
| `id` | TEXT PK | UUID |
| `username` | TEXT UNIQUE | Login name |
| `password_hash` | TEXT | werkzeug PBKDF2 hash |
| `role` | TEXT | `admin` or `user` |
| `created_at` | TEXT | ISO datetime |
| `last_login` | TEXT | ISO datetime, nullable |

### `jobs`

| Column | Type | Notes |
|--------|------|-------|
| `id` | TEXT PK | UUID |
| `user_id` | TEXT FK → users | Cascading delete |
| `urls` | TEXT | JSON array of download URLs |
| `options` | TEXT | JSON object of download parameters |
| `status` | TEXT | `queued`, `running`, `completed`, `failed`, `cancelled` |
| `output` | TEXT | JSON array of log lines (max 500) |
| `command` | TEXT | Full CLI command string (Votify jobs) |
| `return_code` | INTEGER | Process exit code |
| `created_at` | TEXT | ISO datetime |
| `updated_at` | TEXT | ISO datetime |

### `app_settings`

| Column | Type | Notes |
|--------|------|-------|
| `key` | TEXT PK | Setting name |
| `value` | TEXT | Setting value (always a string) |

Current settings keys: `fallback_quality`, `job_expiry_days`.
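
For reference, the schema above can be sketched as SQLite DDL. This is a reconstruction from the tables in this document, not the literal statements in `db.py`; constraints such as `NOT NULL` are assumptions.

```python
import sqlite3

# Hypothetical DDL reconstructed from the schema tables above; the exact
# statements live in db.py and may differ in detail.
SCHEMA = """
CREATE TABLE IF NOT EXISTS users (
    id            TEXT PRIMARY KEY,
    username      TEXT UNIQUE NOT NULL,
    password_hash TEXT NOT NULL,
    role          TEXT NOT NULL,
    created_at    TEXT NOT NULL,
    last_login    TEXT
);
CREATE TABLE IF NOT EXISTS jobs (
    id          TEXT PRIMARY KEY,
    user_id     TEXT NOT NULL REFERENCES users(id) ON DELETE CASCADE,
    urls        TEXT NOT NULL,
    options     TEXT NOT NULL,
    status      TEXT NOT NULL,
    output      TEXT,
    command     TEXT,
    return_code INTEGER,
    created_at  TEXT NOT NULL,
    updated_at  TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS app_settings (
    key   TEXT PRIMARY KEY,
    value TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['app_settings', 'jobs', 'users']
```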

---

## Threading Model

`db.py` creates a new connection per thread using `threading.local()`. Each call opens a connection, runs the query, and closes it, avoiding SQLite's restriction that objects created in a thread may only be used in that same thread.

Foreign keys are enabled on every connection via `PRAGMA foreign_keys = ON`.
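
A minimal sketch of the thread-local pattern, assuming a helper along these lines (the actual function names in `db.py` may differ):

```python
import sqlite3
import threading

DB_PATH = "/config/trackpull.db"  # path from the docs above

_local = threading.local()

def get_connection(db_path=DB_PATH):
    """Return this thread's connection, creating it on first use.

    Sketch of the thread-local pattern described above; the real helper
    in db.py may instead open and close a connection per call.
    """
    conn = getattr(_local, "conn", None)
    if conn is None:
        conn = sqlite3.connect(db_path)
        conn.row_factory = sqlite3.Row          # dict-like row access
        conn.execute("PRAGMA foreign_keys = ON")  # enabled per connection
        _local.conn = conn
    return conn
```

Because the connection lives in `threading.local()`, each Gunicorn worker thread gets its own handle and never shares SQLite objects across threads.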

---

## Key Functions

### Users

| Function | Purpose |
|----------|---------|
| `create_user(username, password, role)` | Hashes password and inserts row |
| `get_user_by_username(username)` | Lookup for login |
| `get_user_by_id(user_id)` | Lookup for session validation |
| `list_users()` | Admin user list |
| `delete_user(user_id)` | Cascades to jobs |
| `verify_password(username, password)` | Returns user row or None |
| `update_user_password(user_id, new_password)` | Re-hashes and saves |
| `update_last_login(user_id)` | Stamps `last_login` |
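
The login path can be sketched with Werkzeug's password helpers, which the `password_hash` column's format implies. This is an illustrative in-memory version, not the literal code in `db.py`:

```python
import sqlite3
import uuid
from datetime import datetime, timezone

from werkzeug.security import check_password_hash, generate_password_hash

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("""CREATE TABLE users (
    id TEXT PRIMARY KEY, username TEXT UNIQUE, password_hash TEXT,
    role TEXT, created_at TEXT, last_login TEXT)""")

def create_user(username, password, role="user"):
    # PBKDF2 requested explicitly, matching the password_hash column notes
    conn.execute(
        "INSERT INTO users (id, username, password_hash, role, created_at) "
        "VALUES (?, ?, ?, ?, ?)",
        (str(uuid.uuid4()), username,
         generate_password_hash(password, method="pbkdf2:sha256"),
         role, datetime.now(timezone.utc).isoformat()))

def verify_password(username, password):
    # Returns the user row on success, None otherwise
    row = conn.execute(
        "SELECT * FROM users WHERE username = ?", (username,)).fetchone()
    if row and check_password_hash(row["password_hash"], password):
        return row
    return None

create_user("alice", "s3cret", role="admin")
assert verify_password("alice", "s3cret")["role"] == "admin"
assert verify_password("alice", "wrong") is None
```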

### Jobs

| Function | Purpose |
|----------|---------|
| `upsert_job(job_dict)` | Insert or replace job record |
| `get_job(job_id)` | Fetch single job |
| `list_jobs_for_user(user_id)` | All jobs for a user |
| `delete_job(job_id)` | Remove single job |
| `delete_jobs_older_than(days)` | Expiry cleanup |
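
A sketch of the two less obvious helpers, `upsert_job` and `delete_jobs_older_than`, under the assumption that JSON fields are serialized at the database boundary and ISO datetime strings are compared as text:

```python
import json
import sqlite3
from datetime import datetime, timedelta, timezone

# Illustrative in-memory setup; the real helpers live in db.py.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE jobs (
    id TEXT PRIMARY KEY, user_id TEXT, urls TEXT, options TEXT,
    status TEXT, output TEXT, command TEXT, return_code INTEGER,
    created_at TEXT, updated_at TEXT)""")

def upsert_job(job):
    """Insert or replace a job record; JSON fields are serialized here."""
    conn.execute(
        "INSERT OR REPLACE INTO jobs VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
        (job["id"], job["user_id"], json.dumps(job["urls"]),
         json.dumps(job.get("options", {})), job["status"],
         json.dumps(job.get("output", [])[-500:]),  # cap log lines at 500
         job.get("command"), job.get("return_code"),
         job["created_at"], job["updated_at"]))

def delete_jobs_older_than(days):
    """Expiry cleanup: ISO 8601 strings order correctly as plain text."""
    cutoff = (datetime.now(timezone.utc) - timedelta(days=days)).isoformat()
    conn.execute("DELETE FROM jobs WHERE created_at < ?", (cutoff,))
```

`INSERT OR REPLACE` keyed on the primary key makes the upsert idempotent: re-saving the same job id overwrites the row instead of duplicating it.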

### Settings

| Function | Purpose |
|----------|---------|
| `get_setting(key, default)` | Fetch value with fallback |
| `set_setting(key, value)` | Upsert a setting |
| `get_all_settings()` | Return all as dict |
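
The settings helpers reduce to a small key-value wrapper; a sketch assuming SQLite's `ON CONFLICT` upsert syntax:

```python
import sqlite3

# Minimal sketch of the settings helpers; the real versions are in db.py.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE app_settings (key TEXT PRIMARY KEY, value TEXT)")

def get_setting(key, default=None):
    row = conn.execute(
        "SELECT value FROM app_settings WHERE key = ?", (key,)).fetchone()
    return row[0] if row else default

def set_setting(key, value):
    # SQLite upsert: insert, or update the value if the key already exists
    conn.execute(
        "INSERT INTO app_settings (key, value) VALUES (?, ?) "
        "ON CONFLICT(key) DO UPDATE SET value = excluded.value",
        (key, str(value)))  # values are always stored as strings

def get_all_settings():
    return dict(conn.execute("SELECT key, value FROM app_settings"))
```

Because `value` is always a string, callers are responsible for converting `job_expiry_days` and similar keys back to their native types.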

---

## Initialization

`db.py` calls `init_db()` on import, which:

1. Creates all tables if they don't exist.
2. Seeds the first admin user from `ADMIN_USERNAME` / `ADMIN_PASSWORD` env vars if the `users` table is empty.
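
A condensed sketch of that startup path. Only the `users` table is shown; the real `init_db()` also creates `jobs` and `app_settings`, and hashes the seeded password:

```python
import os
import sqlite3
import uuid
from datetime import datetime, timezone

def init_db(conn):
    """Idempotent startup: create tables, then seed the first admin."""
    conn.execute("""CREATE TABLE IF NOT EXISTS users (
        id TEXT PRIMARY KEY, username TEXT UNIQUE, password_hash TEXT,
        role TEXT, created_at TEXT, last_login TEXT)""")
    # Seed the first admin only when the users table is empty
    if conn.execute("SELECT COUNT(*) FROM users").fetchone()[0] == 0:
        conn.execute(
            "INSERT INTO users (id, username, password_hash, role, created_at) "
            "VALUES (?, ?, ?, 'admin', ?)",
            (str(uuid.uuid4()),
             os.environ.get("ADMIN_USERNAME", "admin"),
             os.environ.get("ADMIN_PASSWORD", "change-me"),  # hashed in the real code
             datetime.now(timezone.utc).isoformat()))
```

The emptiness check makes the seed a one-time operation: restarting the app never creates a second admin or overwrites an existing one.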

---

## Key Files

| File | Relevance |
|------|-----------|
| [db.py](../db.py) | Entire database layer |
| [app.py](../app.py) | Calls db functions for all CRUD operations |