BrenoMazieiro/timemachinegame

# Time Machine - Ancient Rome Simulator

A text-based civilization advancement game where you are placed into a simulation of an ancient Roman village. Your goal: use your real-world knowledge to advance a primitive village by convincing AI-powered NPCs to adopt new ideas.

The unique hook is that every villager is powered by an actual LLM. The "gameplay" is explaining modern concepts in ways that ancient villagers can understand and accept.

## How It Works

You arrive as a mysterious stranger in a small Roman village during the Pax Romana. The villagers have their own personalities, knowledge, concerns, and relationships. You talk to them, learn their problems, and try to teach them concepts from the future — crop rotation, sanitation, better metallurgy, basic medicine, and more.

The gameplay loop:

1. **Talk to NPCs** — Learn their problems and needs
2. **Think** — What do you actually know that could help?
3. **Research (optional)** — Look things up in real life if you're not sure
4. **Explain** — Teach the concept to an NPC in terms they'd understand
5. **Observe** — Did it work? Did they adopt it?
6. **Iterate** — Knowledge spreads between villagers over time

## Villagers

| Name | Role | Personality |
| --- | --- | --- |
| Marcus | Village Elder | Wise, skeptical, influential |
| Lucia | Farmer | Practical, open to improving yields |
| Gaius | Blacksmith | Curious about materials and crafting |
| Helena | Healer | Interested in medicine and herbs |
| Felix | Young Laborer | Eager, spreads gossip quickly |
| Titus | Magistrate | Ambitious, concerned with Roman law and order |
| Servius | Merchant | Shrewd, well-traveled, knows trade routes |
| Cassius | Legionary Veteran | Disciplined, experienced in engineering |

Each NPC has memory of your conversations, relationships with other villagers, and their own concerns that guide what knowledge would help them most.
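As a sketch of how this could fit together (the actual types and function names in the game's source may differ), each villager's persona, memory, and adopted knowledge can be assembled into an LLM system prompt:

```typescript
// Hypothetical shape of an NPC record; field names are assumptions,
// not the game's actual types.
interface Npc {
  name: string;
  role: string;
  personality: string;
  memories: string[]; // summaries of earlier conversations with the player
  adopted: string[];  // technologies this NPC has accepted
}

// Turn an NPC into a system prompt that keeps the model in character.
function buildSystemPrompt(npc: Npc): string {
  const memory = npc.memories.length > 0
    ? `You remember from earlier talks: ${npc.memories.join("; ")}.`
    : "You have never spoken with this stranger before.";
  const known = npc.adopted.length > 0
    ? `You have come to accept these new ideas: ${npc.adopted.join(", ")}.`
    : "You know nothing beyond your own era.";
  return [
    `You are ${npc.name}, ${npc.role} of a small Roman village.`,
    `Your personality: ${npc.personality}.`,
    known,
    memory,
    "Stay in character and resist ideas you have no reason to believe.",
  ].join("\n");
}
```

Rebuilding the prompt each turn is what lets memory and knowledge shape every conversation: the same question gets a different answer once the NPC remembers you or has adopted an idea.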

## AI Providers

The game supports multiple LLM providers:

| Provider | Cost | Setup |
| --- | --- | --- |
| Ollama (Local) | Free | Install Ollama, run `ollama pull deepseek-r1` |
| Claude (Anthropic) | Paid API | Get key at console.anthropic.com |
| GPT (OpenAI) | Paid API | Get key at platform.openai.com |
| Gemini (Google) | Free tier | Get key at aistudio.google.com |

Recommended: Use Ollama with DeepSeek-R1 for a completely free, offline experience.
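All four backends can sit behind one small interface, as `src/llm/provider.ts` in the project structure suggests. The sketch below is an assumption about its shape, using Ollama's real local `/api/chat` endpoint as the example backend:

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Minimal common interface a provider could implement.
interface LlmProvider {
  name: string;
  chat(messages: ChatMessage[]): Promise<string>;
}

// Build the JSON request Ollama's /api/chat endpoint expects.
function buildOllamaRequest(model: string, messages: ChatMessage[]) {
  return {
    url: "http://localhost:11434/api/chat",
    body: JSON.stringify({ model, messages, stream: false }),
  };
}

// Ollama-backed provider; with stream: false the endpoint returns a
// single JSON object whose message.content holds the reply.
const ollama: LlmProvider = {
  name: "Ollama (Local - FREE)",
  async chat(messages) {
    const { url, body } = buildOllamaRequest("deepseek-r1", messages);
    const res = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body,
    });
    const data = await res.json();
    return data.message.content as string;
  },
};
```

Because every provider resolves to a plain string reply, the NPC code never needs to know which backend is selected.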

## Getting Started

### Prerequisites

- Node.js (v18+)
- An LLM provider (Ollama recommended)

### Running locally

```sh
# Install dependencies
npm install

# Start the dev server
npm run dev
```

Open the URL shown in your terminal (usually http://localhost:5173).

### Using Ollama (free, local AI)

```sh
# Install Ollama from https://ollama.com
# Then pull a model:
ollama pull deepseek-r1

# Start Ollama (if not already running):
ollama serve
```

In the game, select "Ollama (Local - FREE)" as the provider and click "Enter the Simulation". No API key needed.

### Building for production

```sh
npm run build
```

The output goes to `dist/` — a static site deployable to Vercel, Netlify, GitHub Pages, or any static host.

## Game Systems

- **Village State** — Population, food, resources, and morale change over time
- **Day/Night Cycle** — Time of day affects which NPCs are available and where
- **Knowledge Tracking** — Technologies you teach are tracked per-NPC
- **Knowledge Spread** — NPCs share knowledge with each other over time, based on relationships and personality
- **Save System** — Game state persists in browser localStorage
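The knowledge-spread rule above can be sketched as a daily probability check. This is illustrative only: the names and the exact formula (bond strength times the listener's openness) are assumptions, not the game's actual model.

```typescript
interface Villager {
  name: string;
  openness: number;                   // 0..1, personality trait
  known: Set<string>;                 // technologies already adopted
  relationships: Map<string, number>; // villager name -> bond strength 0..1
}

// Chance per day that `listener` picks up a technology from `teacher`.
function spreadChance(teacher: Villager, listener: Villager): number {
  const bond = teacher.relationships.get(listener.name) ?? 0;
  return bond * listener.openness;
}

// One simulated day of gossip: every pair gets a roll per unshared
// technology. `roll` returns a random number in [0, 1).
function dailySpread(villagers: Villager[], roll: () => number): void {
  for (const teacher of villagers) {
    for (const listener of villagers) {
      if (teacher === listener) continue;
      for (const tech of teacher.known) {
        if (!listener.known.has(tech) && roll() < spreadChance(teacher, listener)) {
          listener.known.add(tech);
        }
      }
    }
  }
}
```

Passing `roll` in explicitly (rather than calling `Math.random` inside) keeps the simulation deterministic in tests and replays.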

## Design Philosophy

The game is about curiosity and learning, not stress.

- **No failure states** — The village cannot die. This is a knowledge sandbox.
- **Bad ideas fail naturally** — If you explain something wrong, it doesn't work. Try again.
- **NPCs express needs** — Villagers talk about their struggles, guiding you toward useful knowledge.
- **Motivates real research** — "I know irrigation exists, but how would I actually explain building one?"

## Tech Stack

- TypeScript (vanilla, no framework)
- Vite (dev server + build)
- localStorage (saves + API keys)
- Direct browser API calls to LLM providers
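Persisting to localStorage can be as small as a JSON round-trip. The sketch below is an assumption (the real save key and fields in `src/game/storage.ts` may differ); a tiny key-value interface keeps it testable outside the browser:

```typescript
// Minimal interface matching the slice of localStorage we use.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Hypothetical save shape and key; the game's actual fields may differ.
interface SaveData {
  day: number;
  food: number;
  morale: number;
}

const SAVE_KEY = "timemachine-save";

function saveGame(data: SaveData, store: KeyValueStore): void {
  store.setItem(SAVE_KEY, JSON.stringify(data));
}

function loadGame(store: KeyValueStore): SaveData | null {
  const raw = store.getItem(SAVE_KEY);
  return raw === null ? null : (JSON.parse(raw) as SaveData);
}
```

In the browser you would pass `window.localStorage` directly, since it already satisfies this interface.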

## Project Structure

```text
src/
├── main.ts              # Entry point
├── llm/                 # LLM provider abstraction
│   ├── provider.ts      # Common interface
│   ├── claude.ts        # Anthropic Claude
│   ├── openai.ts        # OpenAI GPT
│   ├── gemini.ts        # Google Gemini
│   └── ollama.ts        # Ollama (local)
├── npc/                 # NPC system
│   ├── npc.ts           # NPC class & conversation
│   ├── memory.ts        # Conversation memory
│   ├── knowledge.ts     # Knowledge tracking
│   └── spread.ts        # NPC-to-NPC knowledge spread
├── game/                # Game state
│   ├── state.ts         # Village state management
│   ├── time.ts          # Day/night cycle
│   ├── simulation.ts    # Daily resource simulation
│   └── storage.ts       # localStorage save/load
├── data/                # Game data
│   ├── npcs.ts          # NPC definitions
│   └── knowledge.ts     # Discoverable technologies
├── ui/                  # User interface
│   ├── renderer.ts      # Main game controller
│   ├── screens/         # Setup, village, conversation, stats
│   └── components/      # Chat, NPC list
└── styles/
    └── main.css         # Retro terminal aesthetic
```

## License

MIT

## About

An AI game where you go back to ancient Rome.
