Backend starter for a WhatsApp AI assistant using:
- ASP.NET Core 8 Web API
- PostgreSQL (Entity Framework Core)
- Ollama (local LLM)
- Meta WhatsApp Cloud API
- Controllers
  - `WhatsAppWebhookController`: handles WhatsApp webhook verification and incoming messages.
  - `LeadsController`: exposes basic read endpoints for captured leads.
  - `ChatbotKeywordsController`, `ChatbotSettingsController`, `ChatbotFaqController`: admin API for configurable chatbot behaviour.
- Services
  - `MessageProcessorService`: orchestrates message flow (greeting → FAQ → domain keywords → out-of-scope or Ollama).
  - `OllamaService`: calls the local Ollama API (system prompt configurable via settings).
  - `WhatsAppService`: sends outbound messages via the WhatsApp Cloud API.
  - `KeywordService`, `SettingsService`, `FaqService`: configurable keywords, settings (greeting, out-of-scope, system prompt), and FAQs (cached 5 min).
  - `LeadService`, `ConversationService`: user and conversation persistence.
- Data
  - `AppDbContext`: EF Core DbContext for PostgreSQL.
- Models
  - `User`, `Conversation`, `Lead`, `ChatbotKeyword`, `ChatbotSetting`, `ChatbotFAQ`.
- DTOs
  - `WhatsAppWebhookDto`, `WhatsAppMessageDto`.
- Config
  - `WhatsAppSettings`, `OllamaSettings` bound from `appsettings.json`.
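An `Ollama` section in `appsettings.json` might look like the sketch below; the key names (`BaseUrl`, `Model`) are assumptions made for illustration, so check the `OllamaSettings` class for the actual property names:

```json
// Hypothetical shape — verify property names against OllamaSettings.
// (ASP.NET Core's JSON config provider permits comments.)
"Ollama": {
  "BaseUrl": "http://localhost:11434",
  "Model": "mistral"
}
```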
The diagram below shows both the runtime path (incoming WhatsApp messages) and the knowledge ingestion path (admin uploads -> chunking -> embeddings -> pgvector store).
```mermaid
flowchart TD
    %% Incoming WhatsApp webhook
    Meta[Meta WhatsApp Cloud API] -->|HTTPS Webhook| WH[WhatsAppWebhookController]
    WH --> MP[MessageProcessorService]

    %% Config-driven logic
    MP --> KWD[KeywordService]
    MP --> FAQ[FaqService]
    MP --> SET[SettingsService]

    %% Conversation persistence
    MP --> CS[ConversationService]
    CS --> DB[PostgreSQL with pgvector]

    %% Optional RAG path
    MP -->|if in-domain and not FAQ| RAG[RagService]
    RAG --> VSR[VectorSearchService cosine similarity]
    VSR --> DB
    RAG --> OAI[OllamaService generate]
    OAI --> OLLAMA[Ollama Server]

    %% Outbound WhatsApp reply
    MP --> WAS[WhatsAppService]
    WAS --> Meta

    %% Knowledge ingestion path (Admin)
    UP[Admin: KnowledgeController file upload] --> JOB[KnowledgeIngestionWorker]
    JOB --> DOCS[DocumentService extract and normalize text]
    DOCS --> CHUNK[TextChunkingService 500 chars]
    CHUNK --> EMB[EmbeddingService embeddings endpoint]
    EMB --> OLLAMA
    EMB --> DB
```
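On the ingestion side, the 500-character chunking step can be approximated with standard tools. This is only a sketch of the idea (the real `TextChunkingService` runs in-process; `doc.txt` is a stand-in input):

```shell
# Build a 1200-character sample "document".
printf 'abcdefghij%.0s' $(seq 1 120) > doc.txt

# Split it into fixed 500-byte chunks, roughly what
# TextChunkingService does before each chunk is embedded.
split -b 500 doc.txt chunk_    # produces chunk_aa, chunk_ab, chunk_ac

wc -c chunk_*                  # per-chunk sizes: 500, 500, 200 bytes
```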
Message flow:
- WhatsApp sends a webhook to `POST /webhook/whatsapp`.
- The payload is parsed; the phone number and message text are extracted.
- `MessageProcessorService` (configurable from the admin panel):
  - Gets/creates the `User`.
  - If the message is a greeting ("hi"/"hello") → return `GREETING_MESSAGE` from settings.
  - Else if the message matches an FAQ trigger → return that FAQ response.
  - Else if domain keywords are configured and the message contains none of them → return `OUT_OF_SCOPE_REPLY` from settings.
  - Else call Ollama (using `SYSTEM_PROMPT` from settings if set).
  - Persists the `Conversation` and (if freight/quote) a `Lead`; sends the reply via WhatsApp.
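The routing order above can be sketched as a tiny function to make the precedence concrete (illustrative only; the real logic lives in `MessageProcessorService` and is driven by configuration, and the sample FAQ trigger and keywords below are assumptions):

```shell
# Order matters: greeting → FAQ → keyword scope check → LLM.
route() {
  local msg
  msg=$(echo "$1" | tr '[:upper:]' '[:lower:]')
  case "$msg" in
    *hi*|*hello*)                 echo "GREETING_MESSAGE" ;;    # greeting
    *"track container"*)          echo "FAQ response" ;;        # sample FAQ trigger
    *freight*|*quote*|*shipment*) echo "call Ollama" ;;         # in-domain keyword
    *)                            echo "OUT_OF_SCOPE_REPLY" ;;  # no domain keyword
  esac
}

route "Hi"                 # → GREETING_MESSAGE
route "track container 42" # → FAQ response
route "freight rates?"     # → call Ollama
route "weather today"      # → OUT_OF_SCOPE_REPLY
```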
Prerequisites:
- .NET 8 SDK
- PostgreSQL
- Ollama installed locally
- Meta WhatsApp Cloud API app + phone number ID + access token
Ollama setup:
- Install Ollama from the official site.
- Pull the `mistral` model: `ollama pull mistral`
- Run the Ollama server: `ollama serve`
  By default it listens on `http://localhost:11434`. To check, run `Invoke-WebRequest -Uri "http://localhost:11434" -UseBasicParsing` in PowerShell.
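Once the server responds, you can exercise the same generation endpoint `OllamaService` uses by hand. A sketch (the `mistral` model comes from the step above; `/api/generate` is Ollama's standard generation endpoint):

```shell
# Request body for Ollama's /api/generate endpoint; "stream": false
# asks for a single JSON object instead of streamed fragments.
BODY='{"model": "mistral", "prompt": "What is a bill of lading?", "stream": false}'
echo "$BODY"

# With `ollama serve` running, send it (the reply text is in the
# "response" field of the returned JSON):
# curl -s http://localhost:11434/api/generate -d "$BODY"
```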
Database setup:
- Create a PostgreSQL database, e.g. `octology_whatsapp`.
- Update the connection string in `appsettings.json`:

  ```json
  "ConnectionStrings": {
    "DefaultConnection": "Host=localhost;Port=5432;Database=octology_whatsapp;Username=postgres;Password=<POSTGRES_PASSWORD>"
  }
  ```
- Add EF Core tools (globally or via a local dotnet tool) if needed: `dotnet tool install --global dotnet-ef`
- From the project root, create and apply migrations: `dotnet ef migrations add InitialCreate`, then `dotnet ef database update`

The app also runs `Database.Migrate()` on startup, so schema updates are applied automatically when the app boots (suitable for dev/staging).
- In the Meta Developer console, create a WhatsApp Cloud API app and get:
  - Phone Number ID
  - Permanent Access Token (or a long-lived token)
- In `appsettings.json`, configure:

  ```json
  "WhatsApp": {
    "AccessToken": "YOUR_META_WHATSAPP_ACCESS_TOKEN",
    "PhoneNumberId": "YOUR_PHONE_NUMBER_ID",
    "VerifyToken": "YOUR_VERIFY_TOKEN"
  }
  ```

  `VerifyToken` is an arbitrary secret string you choose; you must use the same value in the Meta console webhook configuration.
From the project root:
dotnet restore
dotnet run

By default, Kestrel listens on http://localhost:5000 and https://localhost:5001 (depending on your ASP.NET Core launch profile).

Swagger UI is available in development at https://localhost:5001/swagger.
Meta requires a publicly reachable HTTPS URL for the webhook.
- Install ngrok and authenticate with your ngrok account.
- Expose your ASP.NET Core app (assuming HTTP on port 5000): `ngrok http http://localhost:5000`
- Note the public HTTPS URL from ngrok, e.g. `https://abcd1234.ngrok.io`.
- In the Meta Developer console:
  - Configure the webhook URL as `https://abcd1234.ngrok.io/webhook`.
  - Set the Verify Token to the same value as `WhatsApp:VerifyToken` in `appsettings.json`.
- Meta will call `GET /webhook` for verification. The API compares `hub.verify_token` with your configured verify token and returns `hub.challenge` when valid.
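The handshake can be sketched as a small shell function to make the contract concrete (illustrative only; the real check lives in `WhatsAppWebhookController`, and `YOUR_VERIFY_TOKEN` is a placeholder):

```shell
# Simulates the GET /webhook verification handshake.
# VERIFY_TOKEN stands in for WhatsApp:VerifyToken in appsettings.json.
VERIFY_TOKEN="YOUR_VERIFY_TOKEN"

verify() {
  local mode="$1" token="$2" challenge="$3"
  if [ "$mode" = "subscribe" ] && [ "$token" = "$VERIFY_TOKEN" ]; then
    echo "$challenge"          # 200 OK: echo the challenge back
  else
    echo "401 Unauthorized"
  fi
}

verify subscribe "YOUR_VERIFY_TOKEN" "1158201444"   # prints 1158201444
verify subscribe "wrong-token" "1158201444"         # prints 401 Unauthorized
```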
- Verify Webhook
  - `GET /webhook`
  - Query params: `hub.mode`, `hub.verify_token`, `hub.challenge`
  - Behavior:
    - If `hub.verify_token` matches the configured value, returns `hub.challenge`.
    - Otherwise, returns `401 Unauthorized`.
- Receive Messages
  - `POST /webhook/whatsapp`
  - Expected payload (simplified example):

    ```json
    {
      "entry": [
        {
          "changes": [
            {
              "value": {
                "messages": [
                  { "from": "919999999999", "text": { "body": "Hi" } }
                ]
              }
            }
          ]
        }
      ]
    }
    ```

  - Extracts:
    - Phone number from `messages[0].from`
    - Text from `messages[0].text.body`
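For a quick local smoke test of this endpoint, the sample payload can be posted with curl (the URL assumes the default Kestrel HTTP port; adjust to your launch profile):

```shell
# Compact form of the sample webhook payload.
PAYLOAD='{"entry":[{"changes":[{"value":{"messages":[{"from":"919999999999","text":{"body":"Hi"}}]}}]}]}'
echo "$PAYLOAD"

# With the app running locally:
# curl -s -X POST http://localhost:5000/webhook/whatsapp \
#   -H "Content-Type: application/json" -d "$PAYLOAD"
```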
- Greeting detection
  - If the message contains `"hi"` or `"hello"` (case-insensitive), the bot replies with:

    Welcome to Octology Logistics Assistant 🚢
    1 Track container
    2 Freight quote
    3 Import documentation
    4 Talk to agent

  - No Ollama call is made for simple greetings.
- Freight quote detection
  - If the message contains `"freight"` or `"quote"` (case-insensitive), a `Lead` is saved:
    - `PhoneNumber` from the sender
    - `Requirement` from the message text
- General questions
  - For other messages, `OllamaService` sends a prompt:
    - System prompt from the `SYSTEM_PROMPT` setting (or the default Octology logistics assistant prompt).
    - The user message is appended.
  - The response is saved in `Conversation` and sent back via WhatsApp.
Greeting text, out-of-scope reply, and system prompt are configurable via ChatbotSettings (see below).
Behaviour is driven by PostgreSQL and can be managed from the admin panel or Swagger.
- GET /api/chatbot/settings — list all key/value settings.
- PUT /api/chatbot/settings — upsert settings. Body:
  `{ "settings": [ { "settingKey": "GREETING_MESSAGE", "settingValue": "..." }, ... ] }`

Recommended keys:
- `OUT_OF_SCOPE_REPLY` — reply when the user message does not contain any domain keyword.
- `GREETING_MESSAGE` — reply for "hi" / "hello".
- `SYSTEM_PROMPT` — system prompt sent to Ollama (optional).
- GET /api/chatbot/keywords — list all keywords.
- POST /api/chatbot/keywords — add a keyword. Body: `{ "keyword": "shipment", "isActive": true }`
- DELETE /api/chatbot/keywords/{id} — remove a keyword.
If at least one keyword exists, messages that contain none of them receive OUT_OF_SCOPE_REPLY. If no keywords are configured, all messages are treated as in-scope and go to the LLM.
- GET /api/chatbot/faqs — list all FAQs.
- POST /api/chatbot/faqs — add an FAQ. Body: `{ "triggerText": "track container", "responseText": "You can track at ...", "isActive": true }`
- DELETE /api/chatbot/faqs/{id} — remove an FAQ.
If the user message contains triggerText (case-insensitive), the bot replies with responseText and does not call the LLM.
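The matching rule is a plain case-insensitive substring test; sketched in shell (the message and trigger below are sample values):

```shell
msg="Where can I TRACK CONTAINER MSKU1234?"
trigger="track container"

# grep -i makes the substring check case-insensitive,
# mirroring the FAQ trigger matching described above.
if printf '%s' "$msg" | grep -qi "$trigger"; then
  echo "FAQ hit: reply with responseText, skip the LLM"
else
  echo "no FAQ match: continue to keyword check"
fi
```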
Config (keywords, settings, FAQs) is cached in memory for 5 minutes; changes take effect after cache expiry or restart.
- User — `Id` (int, identity), `PhoneNumber` (unique), `CreatedDate`
- Conversation — `Id`, `UserId` (FK to `User`), `UserMessage`, `BotResponse`, `Timestamp`
- Lead — `Id`, `PhoneNumber`, `Requirement`, `CreatedDate`
- ChatbotKeyword — `Id` (Guid), `Keyword`, `IsActive`, `CreatedAt`
- ChatbotSetting — `Id` (Guid), `SettingKey`, `SettingValue`
- ChatbotFAQ — `Id` (Guid), `TriggerText`, `ResponseText`, `IsActive`, `CreatedAt`
- GET /api/leads — list all leads.
- GET /api/leads/{id} — get a single lead.
Use this to integrate with CRM or BI pipelines.
The project uses structured logging via ILogger:
- Incoming message and final response.
- Greeting detection, FAQ match, keyword (in/out of scope) detection.
- LLM calls and responses.
- Outbound WhatsApp API calls.
- Database operations for conversations and leads.
Configure log levels in `appsettings.json` under `Logging`.
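A typical `Logging` section looks like the following (these values are the standard ASP.NET Core template defaults, shown for illustration):

```json
"Logging": {
  "LogLevel": {
    "Default": "Information",
    "Microsoft.AspNetCore": "Warning"
  }
}
```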