Add Prompts API #14

Open
Stephen Belanger (Qard) wants to merge 1 commit into main from prompts-api

Conversation

@Qard
Contributor

Prompts allow you to version and manage LLM prompts centrally, with support for variable substitution and pinning to specific versions.

let prompt = client
    .prompt_builder_with_credentials(&api_key, &org_id)
    .project_name("my-project")
    .slug("greeting-prompt")
    .build()
    .await?;

// Render with variables
let result = prompt.build(Some(hashmap! {
    "name".to_string() => json!("Alice"),
}));

// Use the rendered prompt
match result.prompt {
    RenderedPrompt::Chat(messages) => {
        // Use chat messages
    }
    RenderedPrompt::Completion(text) => {
        // Use completion text
    }
}
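For reference, the enum matched on above can be pictured with a minimal self-contained sketch (an illustration only, not the crate's actual definitions; the `PromptChatMessage` fields here are assumptions):

```rust
/// Sketch of a rendered prompt: either chat messages or completion text.
/// (Illustrative only; message field names are assumptions, not the SDK's.)
#[derive(Debug)]
enum RenderedPrompt {
    Chat(Vec<PromptChatMessage>),
    Completion(String),
}

#[derive(Debug)]
struct PromptChatMessage {
    role: String,
    content: String,
}

fn main() {
    let rendered = RenderedPrompt::Chat(vec![PromptChatMessage {
        role: "user".to_string(),
        content: "Hello, Alice!".to_string(),
    }]);

    // Consumers branch on the variant, as in the example above.
    match rendered {
        RenderedPrompt::Chat(messages) => assert_eq!(messages.len(), 1),
        RenderedPrompt::Completion(text) => assert!(!text.is_empty()),
    }
}
```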

Comment thread src/prompt.rs Outdated
/// A single chat message.
#[derive(Debug, Clone, Serialize, Deserialize)]
#[non_exhaustive]
pub struct PromptChatMessage {
Contributor

Personally, I think we should just avoid doing this until we can fetch prompts in lingua format (or use lingua to convert whatever we get from the API into this). I don't think we should have prompt typespecs redundantly defined both here and in lingua.

Contributor Author

I can delay until lingua is ready, if you like? I agree that if the types are likely to change then we maybe should hold off. Though would lingua be usable in a way that we could make it an internal implementation detail of this? Or would we want it exported directly as the publicly-visible types?

Contributor Author

I've updated to use Lingua. It's using a git dependency at the moment. How should we proceed from there?

Comment thread docs/auto-instrumentation.md Outdated
Comment thread src/prompt.rs Outdated
Comment thread src/prompt.rs Outdated
Comment thread src/prompt.rs Outdated
@Qard Stephen Belanger (Qard) force-pushed the prompts-api branch 2 times, most recently from 77dcdf8 to f909706 on March 6, 2026 at 22:12
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Comment thread Cargo.toml
[dependencies]
anyhow = "1"
base64 = "0.22"
lingua = { git = "https://github.com/braintrustdata/lingua.git" }
Member

we can't publish a crate with git deps, so we'll need to fix this before that happens.
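If lingua gets published to crates.io, the git dependency could be swapped for a registry dependency, roughly as below (the version number is a placeholder, not a real release):

```toml
[dependencies]
# Placeholder version; replace with the actual published lingua release.
lingua = "0.1"
```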

Comment thread tests/prompt_test.rs
"max_tokens": 1000
}
},
"template_format": "mustache"
Member

template_format is in the tests, but we don't actually implement this in the PR. Should we do that?
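If `template_format` support is added, the mustache case could look roughly like this minimal stand-alone sketch (`render_mustache` is a hypothetical helper; a real implementation would likely use an existing mustache crate and handle escaping, sections, and so on):

```rust
use std::collections::HashMap;

/// Hypothetical helper: substitute `{{name}}` placeholders from a variable
/// map. (Sketch only; no escaping or mustache sections are handled.)
fn render_mustache(template: &str, vars: &HashMap<String, String>) -> String {
    let mut out = template.to_string();
    for (key, value) in vars {
        // Build the literal "{{key}}" placeholder and replace it.
        out = out.replace(&format!("{{{{{}}}}}", key), value);
    }
    out
}

fn main() {
    let mut vars = HashMap::new();
    vars.insert("name".to_string(), "Alice".to_string());
    let rendered = render_mustache("Hello, {{name}}!", &vars);
    assert_eq!(rendered, "Hello, Alice!");
}
```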

Comment thread src/prompt.rs
/// Use `client.prompt_builder_with_credentials()` to create a builder,
/// then configure it and call `build()` to fetch the prompt.
#[allow(private_bounds)]
pub struct PromptBuilder<S: PromptFetcher> {
Member

Ideally we can avoid the `#[allow(private_bounds)]`.
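One common way to drop the lint allowance is the sealed-trait pattern: make the bound trait public but give it a private supertrait so it cannot be implemented outside the crate. A minimal sketch (all names here are hypothetical, not the PR's actual types):

```rust
mod sealed {
    // Private module, public trait: external crates cannot name
    // `sealed::Sealed`, so they cannot implement `PromptFetcher`.
    pub trait Sealed {}
}

/// Public trait with a private supertrait: visible, but sealed.
pub trait PromptFetcher: sealed::Sealed {
    fn fetch(&self) -> String;
}

// Hypothetical in-crate fetcher implementing the sealed trait.
pub struct HttpFetcher;
impl sealed::Sealed for HttpFetcher {}
impl PromptFetcher for HttpFetcher {
    fn fetch(&self) -> String {
        "greeting-prompt".to_string()
    }
}

/// The builder now uses a public bound; no `#[allow(private_bounds)]` needed.
pub struct PromptBuilder<S: PromptFetcher> {
    fetcher: S,
}

fn main() {
    let builder = PromptBuilder { fetcher: HttpFetcher };
    assert_eq!(builder.fetcher.fetch(), "greeting-prompt");
}
```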

Comment thread src/logger.rs
.json::<PromptFetchResponse>()
.await
.map_err(|e| BraintrustError::Api {
status: 200,
Member

we should pick a better status than 200 for the error.
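One option, sketched here with hypothetical names (not the crate's actual `BraintrustError`), is to stop fabricating a status for decode failures and give them their own variant:

```rust
/// Sketch of an error type that separates API errors (which carry a real
/// HTTP status) from response-decoding failures (which carry none).
#[derive(Debug)]
enum FetchError {
    Api { status: u16, message: String },
    Deserialization { message: String },
}

fn classify(status: u16, body_parses: bool) -> FetchError {
    if !body_parses {
        // A 200 response with an unreadable body is a decoding failure,
        // not an API error, so no status is invented for it.
        FetchError::Deserialization {
            message: "failed to decode PromptFetchResponse".to_string(),
        }
    } else {
        FetchError::Api {
            status,
            message: "unexpected response".to_string(),
        }
    }
}

fn main() {
    match classify(200, false) {
        FetchError::Deserialization { .. } => {}
        other => panic!("expected Deserialization, got {:?}", other),
    }
}
```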

Labels

enhancement New feature or request

3 participants