8 changes: 7 additions & 1 deletion docs/structsense_getting_started.md
@@ -47,6 +47,13 @@ export EXTERNAL_PDF_EXTRACTION_SERVICE=True
```
> The external API is assumed public (no auth) for now.

### LLM
#### LLM for Agents
We use OpenRouter to give agents access to proprietary models such as GPT. Ollama can substitute for OpenRouter when working with open-source models such as Llama.

#### Embedding configuration
By default, Ollama generates the embeddings; models available through OpenRouter can be used for this purpose instead.
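The agent-LLM and embedding backends are typically selected through environment variables. As a purely hypothetical sketch (these variable names are illustrative, not StructSense's actual configuration keys):

```bash
# Hypothetical configuration — variable names are illustrative only.
export OPENROUTER_API_KEY="<your-openrouter-key>"   # key for proprietary models via OpenRouter
export AGENT_LLM_PROVIDER="openrouter"              # or "ollama" for open-source models such as Llama
export EMBEDDING_PROVIDER="ollama"                  # default: local Ollama embeddings
export OLLAMA_BASE_URL="http://localhost:11434"     # Ollama's default local endpoint
```

Check the project's own configuration reference for the real key names before relying on these.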

<!--Running -->

## Running
@@ -82,7 +89,6 @@ Disabled by default. Enable with:
The `docker/` directory contains **Docker Compose** files for running the following components:

- **Grobid** – for PDF extraction
- **Ollama** – used for embedding generation in our setup; it can also replace OpenRouter when running open-source models such as Llama for the agents, whereas OpenRouter provides access to proprietary models like GPT
- **Weaviate** – the vector database in the StructSense architecture; it stores the ontology and thus serves as the Ontology database

These Compose files allow you to quickly stand up a complete local **StructSense** stack.
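Assuming conventional Compose files in `docker/` (the exact file names are an assumption), the stack could be started like this:

```bash
# Hypothetical invocation — adjust to the Compose files actually present in docker/.
cd docker
docker compose up -d              # start Grobid, Ollama, and Weaviate in the background
docker compose ps                 # confirm the services are up
docker compose logs -f weaviate   # follow one service's logs
```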
5 changes: 5 additions & 0 deletions docs/structsense_overview.md
@@ -14,4 +14,9 @@ It orchestrates specialized agents to collaborate, align to schemas/ontologies,
## Architecture
![StructSense Architecture](images/structsense_arch.png)

## Quickstart

```bash
pip install structsense
```
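A quick sanity check that the package installed correctly (assumes only that the distribution exposes an importable `structsense` module):

```bash
# Verify the installation (run after `pip install structsense`).
python -c "import structsense; print('structsense imported OK')"
pip show structsense   # prints package metadata such as version and install location
```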