ANIMAtiZE is an intent-first workflow for generating AI video sequences from a single source image. The current runtime is a FastAPI application serving a cinematic web console and backend endpoints for scene analysis, prompt compilation, auth, persistent settings, and provider execution.
This README is the single source of truth for local onboarding and runtime startup.
- FastAPI app entrypoint: `src.web.app:app`
- UI: `src/web/static/index.html`, `src/web/static/app.js`, `src/web/static/styles.css`
- API routes:
  - `GET /`
  - `GET /health`
  - `GET /api/session/bootstrap`
  - `GET /api/auth/config`
  - `GET /api/auth/me`
  - `POST /api/auth/google`
  - `POST /api/auth/logout`
  - `GET /api/settings`
  - `PUT /api/settings`
  - `GET /api/settings/history`
  - `POST /api/settings/history/{history_id}/restore`
  - `GET /api/settings/api-keys`
  - `PUT /api/settings/api-keys/{provider}`
  - `DELETE /api/settings/api-keys/{provider}`
  - `GET /api/settings/backup`
  - `POST /api/settings/restore`
  - `GET /api/runs`
  - `PUT /api/runs/{run_id}`
  - `DELETE /api/runs`
  - `GET /api/providers`
  - `GET /api/presets`
  - `POST /api/sequences`
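For quick orientation against the routes above, a minimal sketch of a URL helper for a local instance (the `api_url` helper and the hard-coded base URL are illustrative assumptions, not part of the project):

```python
BASE_URL = "http://localhost:8000"

def api_url(path: str, base: str = BASE_URL) -> str:
    """Join an API route path onto the local runtime's base URL."""
    return f"{base}/{path.lstrip('/')}"

print(api_url("/health"))       # http://localhost:8000/health
print(api_url("api/settings"))  # http://localhost:8000/api/settings
```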
The UI and backend must not fabricate generated outputs. If provider integration is not configured:

- the backend returns explicit statuses such as `not_configured` or `not_executed`,
- the response includes a clear configuration message,
- the UI renders this state directly without fake results.
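The no-fabrication contract above can be sketched as a small response builder. This is a hedged illustration, not the project's actual implementation: the `variant_result` function and its field names are assumptions; only the `not_configured`/`not_executed` status strings come from this README.

```python
def variant_result(provider: str, configured: bool, executed: bool = False) -> dict:
    """Build an explicit, non-fabricated variant payload.

    The status honestly reflects execution state; no media URL is
    invented when nothing was actually generated.
    """
    if not configured:
        return {
            "provider": provider,
            "status": "not_configured",
            "message": f"Set the API key for '{provider}' before startup.",
            "media_url": None,
        }
    if not executed:
        return {
            "provider": provider,
            "status": "not_executed",
            "message": "Provider is configured but execution did not run.",
            "media_url": None,
        }
    return {"provider": provider, "status": "success", "media_url": "..."}  # placeholder URL
```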
Prerequisites:

- Python 3.10+ (tested with Python 3.13)
- pip
- OpenCV and CV dependencies from `requirements-cv.txt`
- Node.js (only for advanced local work on TypeScript files in `src/models/`)
```bash
git clone https://github.com/makaronz/animatize.git
cd animatize
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
pip install -r requirements-cv.txt
```

Use the canonical entrypoint:

```bash
uvicorn src.web.app:app --host 0.0.0.0 --port 8000 --reload
```

Open:

- http://localhost:8000/ (UI)
- http://localhost:8000/health (health)
```bash
curl -sS http://localhost:8000/health
```

Expected:

- HTTP 200
- JSON containing `"status": "ok"`
```bash
curl -I http://localhost:8000/
```

Expected:

- HTTP 200
Use an actual image file path:

```bash
curl -sS -X POST "http://localhost:8000/api/sequences" \
  -F "image=@/absolute/path/to/your-image.jpg" \
  -F "intent=Slow cinematic push-in on subject with stable identity" \
  -F "preset=cinematic-balanced" \
  -F "duration=6" \
  -F "variants=3" \
  -F "aspect_ratio=16:9" \
  -F "motion_intensity=6" \
  -F "quality_mode=balanced" \
  -F "provider=auto" \
  -F "negative_intent=avoid jitter and identity drift"
```

Expected:

- HTTP 200
- JSON with: `run_id`, `status`, `analysis`, `variants[]`

With provider keys configured, variants can return `success` or `failed`. Without keys, variants return `not_configured`/`not_executed` with a clear error message.
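A sketch of tallying variant outcomes from such a response. The `summarize_variants` helper and the sample payload values are illustrative; only the top-level field names (`run_id`, `status`, `analysis`, `variants`) and the status strings come from this README:

```python
from collections import Counter

def summarize_variants(response: dict) -> dict:
    """Count /api/sequences variants by their reported status."""
    return dict(Counter(v["status"] for v in response.get("variants", [])))

sample = {
    "run_id": "run-123",        # illustrative value
    "status": "completed",      # illustrative value
    "analysis": {},
    "variants": [
        {"status": "success"},
        {"status": "not_configured"},
        {"status": "not_configured"},
    ],
}
print(summarize_variants(sample))  # {'success': 1, 'not_configured': 2}
```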
Set provider keys in the environment before startup:

- `RUNWAY_API_KEY`
- `PIKA_API_KEY`
- `VEO_API_KEY`
- `SORA_API_KEY` or `OPENAI_API_KEY`
- `FLUX_API_KEY`
- `GOOGLE_CLIENT_ID` (for Google sign-in)
- `ANIMATIZE_SECRET_KEY` (for API-key encryption at rest)

Example:

```bash
export RUNWAY_API_KEY="..."
export OPENAI_API_KEY="..."
uvicorn src.web.app:app --host 0.0.0.0 --port 8000 --reload
```

If no provider is configured:

- the request still performs real CV analysis and prompt compilation,
- execution status is explicitly non-success (`not_configured` or `not_executed`),
- no fabricated media URL is returned.
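A minimal sketch of detecting which providers are configured, using the key names listed above (the `configured_providers` helper and the provider-to-key mapping structure are assumptions for illustration, not the project's code):

```python
import os

# Environment variable names are taken from the list above.
PROVIDER_KEYS = {
    "runway": ["RUNWAY_API_KEY"],
    "pika": ["PIKA_API_KEY"],
    "veo": ["VEO_API_KEY"],
    "sora": ["SORA_API_KEY", "OPENAI_API_KEY"],  # either key enables Sora
    "flux": ["FLUX_API_KEY"],
}

def configured_providers(env=os.environ) -> list[str]:
    """Return providers with at least one non-empty key set in env."""
    return [
        name
        for name, keys in PROVIDER_KEYS.items()
        if any(env.get(k) for k in keys)
    ]
```

Passing a plain dict as `env` makes the check easy to exercise without touching the real environment.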
```
src/web/
├── app.py            # Canonical FastAPI runtime (UI + API)
├── api.py            # Legacy compatibility module (deprecated surface)
├── persistence.py    # SQLite sessions/settings/runs/API-key persistence
└── static/
    ├── index.html    # Director Console UI
    ├── app.js        # UI interactions, auth, autosave, API wiring
    └── styles.css    # Cinematic design system styles
```
- API details: `docs/API.md`
- Runtime architecture: `docs/ARCHITECTURE.md`
- MVDS design system: `docs/DESIGN_SYSTEM_MVDS.md`
- Operations and troubleshooting: `docs/OPERATIONS.md`
- Persistence migration: `docs/WEB_CONSOLE_PERSISTENCE_MIGRATION.md`

Older documents may exist for historical context. When onboarding or running the app, use this README and the five docs above.