Auto-aggregated event listings for Heidelberg. A scheduled Cloud Function scrapes four local event sources daily and serves them through a searchable React frontend with live Firestore updates.
```
04:00 daily (Cloud Scheduler)
             │
             ▼
┌─────────────────────────┐
│ 4 Scrapers (parallel)   │
│                         │
│ Heidelberg Marketing    │
│ Stadt Heidelberg        │
│ EKIHD                   │
│ Rausgegangen            │
└────────────┬────────────┘
             │ Cheerio HTML parsing
             │ SHA1 dedup by source URL
             ▼
┌─────────────────────────┐
│ Cloud Firestore         │
│ events collection       │
└────────────┬────────────┘
             │ onSnapshot (live)
             ▼
┌─────────────────────────┐
│ React Frontend          │
│ Search + Filter         │
│ by text & category      │
└─────────────────────────┘
```

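The fan-out step above can be sketched as a small orchestrator that runs all four scrapers in parallel and tolerates individual failures. This is a sketch only; `runAll` and the per-source function shape are illustrative names, not the project's actual API:

```typescript
// Hypothetical orchestrator sketch: each scraper resolves with the number
// of events it saved. Promise.allSettled keeps one failing source from
// aborting the other three.
type Scraper = () => Promise<number>;

export async function runAll(scrapers: Record<string, Scraper>): Promise<number> {
  const entries = Object.entries(scrapers);
  const results = await Promise.allSettled(entries.map(([, run]) => run()));
  let saved = 0;
  results.forEach((result, i) => {
    if (result.status === "fulfilled") {
      saved += result.value;
    } else {
      // Log and continue: a broken source should not block the daily run.
      console.error(`${entries[i][0]} failed:`, result.reason);
    }
  });
  return saved;
}
```

In `functions/src/index.ts` this would be wired to the daily trigger, e.g. `onSchedule("every day 04:00", ...)` from `firebase-functions/v2/scheduler`.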
| Source | What it covers |
|---|---|
| Heidelberg Marketing | Tourism & city events |
| Stadt Heidelberg | Official city calendar |
| EKIHD | Evangelische Kirche events |
| Rausgegangen | Nightlife, culture, entertainment |

| Layer | Technology |
|---|---|
| Frontend | React 19, TypeScript, Vite 7 |
| Styling | CSS (vanilla) |
| Dates | date-fns |
| Backend | Firebase Cloud Functions v2 (Node.js 20) |
| Scraping | Cheerio |
| Database | Cloud Firestore |
| Hosting | Firebase Hosting |
| CI/CD | GitHub Actions (auto-deploy on push to main) |
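Since the frontend only reads events and the scrapers presumably write through the Admin SDK (which bypasses security rules), the Firestore rules can stay read-only. A minimal sketch, not necessarily the project's actual `firestore.rules`:

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Public read access for the events feed; no client writes.
    // The scrapers write via the Admin SDK, which bypasses these rules.
    match /events/{eventId} {
      allow read: if true;
      allow write: if false;
    }
  }
}
```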
```
├── web/                      # React frontend (Vite)
│   └── src/
│       ├── App.tsx
│       ├── components/
│       │   ├── EventCard.tsx
│       │   └── EventFilters.tsx
│       ├── hooks/
│       │   └── useEvents.ts
│       └── firebase.ts
├── functions/                # Cloud Functions (scrapers)
│   └── src/
│       ├── index.ts          # HTTP + scheduler endpoints
│       ├── sources/
│       │   ├── heidelbergMarketing.ts
│       │   ├── stadtHeidelberg.ts
│       │   ├── ekihd.ts
│       │   └── rausgegangen.ts
│       └── lib/
│           ├── saveEvents.ts
│           └── hash.ts
├── firebase.json
└── firestore.rules
```
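The `lib/hash.ts` dedup step can be sketched like this, assuming the SHA1 of each event's source URL is used as the Firestore document ID, so re-scraping the same URL overwrites the existing document instead of duplicating it. The function name is illustrative:

```typescript
import { createHash } from "node:crypto";

// Hypothetical helper: derive a stable Firestore document ID from the
// event's source URL. Writing with set() to this ID makes the daily
// scrape idempotent: the same URL always maps to the same document.
export function eventId(sourceUrl: string): string {
  return createHash("sha1").update(sourceUrl).digest("hex");
}
```

In `saveEvents.ts` this would back a write along the lines of `db.collection("events").doc(eventId(url)).set(event)`.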
- Node.js 20+
- Firebase CLI (`npm i -g firebase-tools`)
- A Firebase project with Firestore enabled
```sh
# Frontend
cd web && npm install

# Cloud Functions
cd functions && npm install
```

```sh
# Frontend dev server
cd web && npm run dev

# Build & serve functions
cd functions && npm run build && npm run serve
```

```sh
firebase deploy --only hosting,functions
```

Deployment also runs automatically via GitHub Actions on every push to `main`.
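The auto-deploy could look roughly like the workflow below; the file name, job layout, and the `FIREBASE_TOKEN` secret are assumptions, not the repository's actual configuration:

```yaml
# Hypothetical .github/workflows/deploy.yml sketch
name: deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci && npm run build
        working-directory: web
      - run: npm ci && npm run build
        working-directory: functions
      - run: npx firebase-tools deploy --only hosting,functions --token "$FIREBASE_TOKEN"
        env:
          FIREBASE_TOKEN: ${{ secrets.FIREBASE_TOKEN }}
```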
```
GET /api/scrape
```

Returns `{ ok: true, saved: <count> }`.
MIT