👨‍💻 Apoco BE Code Challenge

Your mission is to build a well-structured, production-quality REST API for a Pokémon catalog — with a modern twist of AI-powered features. Show us your API design, data modeling, engineering discipline, and ability to integrate AI without over-engineering it.

🔍 What we are looking for in a candidate

  • 🧠 Analytical mind
  • 📐 Data modeling and API design skills
  • 👨‍💻 Software engineering excellence
  • 🔌 Basic DevOps skills
  • 🔍 Attention to detail

⌨️ The Code Challenge

1. Business requirements

You are provided with a catalog of Pokémon (check out pokemons.json). The catalog includes basic information about Pokémon, their statistics, and relationships. Your task is to implement an API that serves data needed by a website where users can browse the catalog, save their favorite Pokémon — and get AI-powered insights about them.

💡 Hint: We're testing your 🧠 Analytical mind skills. You have a data file but no strict specification. Understand the data, infer what makes sense, and design accordingly.

2. Functional requirements

Core Pokémon API

The RESTful API should expose endpoints to:

  • Get a list of Pokémon, including:
    • Pagination
    • Search by name
    • Filter by Pokémon type
    • Filter by favorite
  • Get a Pokémon by ID
  • Get a Pokémon by name
  • Get a list of Pokémon types
  • Set/unset a Pokémon as favorite

💡 Hint: We're testing your 📐 Data modeling and API design skills. Databases and interfaces have design techniques and best practices.
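As one illustration of the list endpoint above, the query handling can be sketched as pure TypeScript, independent of any framework. The `Pokemon` shape and query parameter names here are assumptions for illustration, not part of the challenge spec:

```typescript
// Illustrative sketch only: the Pokemon shape and query parameter
// names are assumptions, not prescribed by the challenge.
interface Pokemon {
  id: string;
  name: string;
  types: string[];
  isFavorite: boolean;
}

interface ListQuery {
  page?: number;      // 1-based page index
  limit?: number;     // page size
  search?: string;    // substring match on name
  type?: string;      // filter by Pokemon type
  favorite?: boolean; // filter by favorite flag
}

function listPokemons(catalog: Pokemon[], q: ListQuery) {
  const page = q.page ?? 1;
  const limit = q.limit ?? 20;

  let items = catalog;
  if (q.search) {
    const needle = q.search.toLowerCase();
    items = items.filter((p) => p.name.toLowerCase().includes(needle));
  }
  const type = q.type;
  if (type) {
    items = items.filter((p) => p.types.includes(type));
  }
  if (q.favorite !== undefined) {
    items = items.filter((p) => p.isFavorite === q.favorite);
  }

  const total = items.length;
  const start = (page - 1) * limit;
  return {
    items: items.slice(start, start + limit),
    meta: { page, limit, total, pages: Math.ceil(total / limit) },
  };
}
```

Returning a `meta` object alongside `items` keeps the response shape consistent across paginated endpoints; whatever shape you choose, document it in the OpenAPI spec.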

AI-powered features (required)

Integrate at least one LLM-powered feature into the API. Suggested options — pick one or come up with your own:

  • Pokédex AI entry: For a given Pokémon, generate a short, Pokédex-style description using an LLM based on its stats, types, and abilities. Store the result so it is not re-generated on every request.
  • Team advisor: Accept a natural language goal (e.g., "I want a balanced team that resists fire attacks") and return a suggested team from the catalog with an LLM-generated explanation.
  • Ask the Professor: A Q&A endpoint where users ask questions about Pokémon and get an LLM-generated answer grounded in the catalog data.

Keep it simple. The AI feature should work and handle errors gracefully — if the LLM is unavailable, the rest of the API must still function.

💡 Hint: We're testing your 📐 API design skills. The AI feature should feel like a natural part of the API, not a bolted-on afterthought. Think about response shape, latency, and how errors surface to the client.
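For the Pokédex AI entry option, the store-and-degrade behavior can be sketched as follows. The cache, the `generate` callback, and the response shapes are hypothetical; any LLM provider client could sit behind `generate`:

```typescript
// Sketch of graceful degradation for an LLM-backed endpoint.
// `generate` stands in for any provider client (assumption, not spec).
type Generate = (prompt: string) => Promise<string>;

const entryCache = new Map<string, string>(); // pokemonId -> stored entry

async function getPokedexEntry(
  pokemonId: string,
  prompt: string,
  generate: Generate,
): Promise<{ entry: string; source: "cache" | "llm" } | { error: string }> {
  // Serve the stored result so it is not re-generated on every request.
  const cached = entryCache.get(pokemonId);
  if (cached) return { entry: cached, source: "cache" };

  try {
    const entry = await generate(prompt);
    entryCache.set(pokemonId, entry);
    return { entry, source: "llm" };
  } catch {
    // LLM unavailable: surface a clear error to the client;
    // the rest of the API keeps working.
    return { error: "AI feature temporarily unavailable" };
  }
}
```

An in-memory map is only a stand-in here; persisting generated entries in the database fits the "store the result" requirement better.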

Authentication

Implement username/password authentication that issues a Bearer token. The goal is to identify a user and link them to their favorite Pokémon — no need for email verification, password reset flows, or a full user management system.

💡 Hint: We're testing your 👨‍💻 Software engineering excellence skills. Don't over-engineer it.
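In the spirit of not over-engineering it, the whole flow can be sketched with Node's built-in crypto module. The in-memory maps below stand in for database tables and are assumptions for illustration:

```typescript
import { randomBytes, scryptSync, timingSafeEqual } from "node:crypto";

// Minimal sketch: username/password auth issuing opaque Bearer tokens.
// In-memory maps stand in for database tables (illustration only).
const users = new Map<string, { salt: Buffer; hash: Buffer }>();
const tokens = new Map<string, string>(); // token -> username

function register(username: string, password: string): void {
  const salt = randomBytes(16);
  users.set(username, { salt, hash: scryptSync(password, salt, 64) });
}

function login(username: string, password: string): string | null {
  const user = users.get(username);
  if (!user) return null;
  const hash = scryptSync(password, user.salt, 64);
  // Constant-time comparison avoids timing side channels.
  if (!timingSafeEqual(hash, user.hash)) return null;
  const token = randomBytes(32).toString("hex");
  tokens.set(token, username);
  return token;
}

// Resolve an "Authorization: Bearer <token>" header back to a username.
function authenticate(header: string | undefined): string | null {
  if (!header?.startsWith("Bearer ")) return null;
  return tokens.get(header.slice("Bearer ".length)) ?? null;
}
```

Opaque random tokens plus a lookup table are the simplest scheme that meets the requirement; signed JWTs are an equally valid choice if you prefer stateless verification.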

3. Tech stack

Required:

  • git
    • Commit regularly to show your progress and thinking.
  • Node.js
    • Pin the version via nvm.
  • yarn / pnpm
    • We prefer either over npm.
  • TypeScript
    • Avoid any. Follow best practices.
  • Docker

    💡 Hint: We're testing your 🔌 Basic DevOps skills. We care about multi-stage builds, build cache, and non-root privileges.

  • Vitest / Jest
    • Either is fine. Test what is important.
  • Swagger/OpenAPI
    • We use OpenAPI specs for data validation and as contracts between systems and tools.
  • LLM provider of your choice
    • Use a local model (e.g., Ollama) or an external provider (e.g., OpenAI, Anthropic, HuggingFace Inference API). Document your choice and how to configure it.
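The Docker expectations above (multi-stage builds, build cache, non-root privileges) can be sketched roughly as below. File names, the pnpm choice, and the Node version are assumptions — adapt them to your setup:

```dockerfile
# Illustrative multi-stage Dockerfile sketch; file names, pnpm, and the
# Node version are assumptions, not prescribed by the challenge.

# --- build stage ---
FROM node:20-alpine AS build
WORKDIR /app
# Copy manifests first so the dependency layer is cached across code changes.
COPY package.json pnpm-lock.yaml ./
RUN corepack enable && pnpm install --frozen-lockfile
COPY . .
RUN pnpm build

# --- runtime stage ---
FROM node:20-alpine
WORKDIR /app
COPY --from=build /app/dist ./dist
COPY --from=build /app/node_modules ./node_modules
# Drop root privileges; the official Node images ship a "node" user.
USER node
CMD ["node", "dist/main.js"]
```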

Your choice:

  • Framework: Fastify / NestJS / Hono / Express
  • Database: PostgreSQL / MySQL
  • ORM: Drizzle / MikroORM
    • Utilize migrations and seeding.

⚙️ How should it work

We expect that running docker compose up fully sets up and starts the application — including the database and seeding. The LLM provider does not need to be part of Docker Compose; just document how to configure it. The application should print the URL of the API endpoint and the Swagger/OpenAPI specification to the console on startup.
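A compose file covering the above might look roughly like this sketch; service names, ports, and environment variables are assumptions, and seeding is left to your migration/seed tooling:

```yaml
# Illustrative docker-compose sketch; names, ports, and env vars are
# assumptions, not prescribed by the challenge.
services:
  db:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: pokedex
      POSTGRES_PASSWORD: postgres
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      retries: 10
  api:
    build: .
    environment:
      DATABASE_URL: postgres://postgres:postgres@db:5432/pokedex
    ports:
      - "3000:3000"
    depends_on:
      db:
        condition: service_healthy
```

Gating the API on the database healthcheck is one way to ensure migrations and seeding run against a ready database on a cold `docker compose up`.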


🧑‍⚖️ Assessment Criteria

  • Technical correctness: Does the API work as specified? Are all endpoints functional?
  • API design: Well-structured REST endpoints, consistent response shapes, proper use of OpenAPI.
  • Data modeling: Is the database schema thoughtful? Are relationships and constraints appropriate?
  • Code quality: Clean, modular, well-typed TypeScript. No any.
  • DevOps: Docker Compose works out of the box; multi-stage build, non-root, build cache.
  • AI feature integration: Does it work? Does the API degrade gracefully if the LLM is unavailable?
  • Creativity & initiative: Thoughtful design decisions, extra features, or ideas that go beyond the spec.

📝 Submission Guidelines

  • Create a private GitHub repository and invite @Tomas2D.
  • Let us know when the project is ready by sending an email to tomas.dvorak@apoco.com.

❓ Need Clarification?

Feel free to ask questions. We value both technical skill and critical thinking.
