A fun "anti-framework" for doing useful things with agents and LLMs.
Documentation: https://zyx.hammad.app
ZYX is a simplicity-first library built on top of Pydantic AI and heavily inspired by Marvin. It aims to provide a simple, stdlib-like interface for working with language models, without sacrificing the control and flexibility that more complex frameworks provide.
- Simplicity-First: Designed to be as simple as possible to use. The library uses semantically literal function names—a 12-year-old could pick it up and run with it immediately.
- Fast & Flexible: Optimized for both performance and development speed, providing a flexible interface that enables rapid prototyping and iteration.
- Type-Focused: Leverages Pydantic's powerful validation capabilities to ensure your data is always in the expected format, with full type safety and editor support.
- Model Agnostic: Through Pydantic AI, ZYX supports virtually any LLM provider—OpenAI, Anthropic, Google, and more.
- Semantic Operations: A clean set of operations (`make`, `parse`, `query`, `edit`, `select`, `validate`) that cover common LLM use cases.
- Resource Management: Built-in support for files, code, and vector memory as first-class resources.
- Streaming Support: Built-in streaming for real-time responses and partial results.
ZYX stands on the shoulders of giants:
- Pydantic AI for LLM integration
- Pydantic for data validation
- MarkItDown, used by `Attachments` for rendering to text
- Python 3.11+
Install ZYX using pip:

```bash
pip install zyx
```

By default, ZYX installs the minimum required dependencies from Pydantic AI via its `pydantic-ai-slim` package, along with the `openai` library for out-of-the-box OpenAI support.
To add support for additional LLM providers, you can either install the entire pydantic-ai package or add providers individually:
```bash
# Install all providers
pip install zyx pydantic-ai

# Or use the `ai` extra
pip install 'zyx[ai]'

# Or add providers manually
pip install zyx anthropic
```

The easiest way to use LLMs within ZYX is through semantic operations. Start with `zyx.make` to generate content:
```python
import zyx

result = zyx.make(
    target=int,
    context="What is 45+45?",
    model="openai:gpt-4o-mini",
)

print(result.output)
"""
90
"""
```

Use `zyx.parse` to extract structured data from any source:
```python
import zyx
from pydantic import BaseModel

class Information(BaseModel):
    library_name: str
    library_description: str

result = zyx.parse(
    source=zyx.paste("https://zyx.hammad.app"),
    target=Information,
    model="openai:gpt-4o-mini",
)

print(result.output.library_name)
print(result.output.library_description)
"""
ZYX
A fun "anti-framework" for doing useful things with agents and LLMs.
"""
```

Add tools to any semantic operation for enhanced capabilities. Notice how we mix a snippet, a string, and an OpenAI-style dict:
```python
import zyx
from pydantic import BaseModel

class Information(BaseModel):
    library_name: str
    library_description: str

def log_website_url(url: str) -> None:
    print(f"Website URL: {url}")

result = zyx.parse(
    source=zyx.paste("https://zyx.hammad.app"),
    target=Information,
    context=[
        {"role": "system", "content": "You are a web scraper."},
        zyx.paste("scraping_instructions.txt"),
        "log the website URL before you parse.",
    ],
    model="anthropic:claude-sonnet-4-5",
    tools=[log_website_url],
)

print(result.output.library_name)
print(result.output.library_description)
"""
Website URL: https://zyx.hammad.app
ZYX
A fun "anti-framework" for doing useful things with agents and LLMs.
"""
```

Use `zyx.edit` to modify existing values:
```python
import zyx

data = {"name": "John", "age": 30}

result = zyx.edit(
    target=data,
    context="Update the age to 31",
    model="anthropic:claude-sonnet-4-5",
    merge=False,
)

print(result.output)
"""
{"name": None, "age": 31}
"""
```

Use `zyx.query` to ask questions whose answers are grounded in a specific source's content:
```python
import zyx

result = zyx.query(
    source="Python is a high-level programming language...",
    target=str,
    context="What is Python?",
    model="openai:gpt-4o-mini",
)

print(result.output)
"""
Python is a high-level programming language.
"""
```

Use `zyx.select` to choose from a list of options:
```python
import zyx
from typing import Literal

Color = Literal["red", "green", "blue"]

result = zyx.select(
    target=Color,
    context="What color is the sky?",
    model="openai:gpt-4o-mini",
)

print(result.output)
"""
blue
"""
```

All operations have async variants:
```python
import zyx

# Inside an async function:
result = await zyx.amake(
    target=str,
    context="Write a haiku about Python",
    model="openai:gpt-4o-mini",
)

print(result.output)
"""
Code flows like a stream,
Indents guide the logic's dance,
Python dreams in light.
"""
```

Get real-time results with streaming:
```python
import zyx

stream = zyx.make(
    target=str,
    context="Write a short story",
    model="openai:gpt-4o-mini",
    stream=True,
)

for text in stream.text():
    print(text, end="", flush=True)
"""
**The Last Library**
In a small town nestled between rolling...
"""
```

ZYX provides a set of semantic operations that cover common LLM use cases:
- `make`/`amake`: Generate new content for a target type or value
- `parse`/`aparse`: Extract structured content from a primary source
- `query`/`aquery`: Answer questions grounded in the primary source
- `edit`/`aedit`: Modify a target value or object, optionally selective or planned
- `select`/`aselect`: Choose from a list/enum/literal set
- `validate`/`avalidate`: Parse then verify constraints; can return structured violations
- `expr`: Semantic expression helpers (`==`, `in`, `bool`) using `parse`
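The "parse then verify constraints" behavior that `validate`/`avalidate` describe can be pictured with plain Pydantic. This is only a conceptual sketch (the `Product` model and the violation handling below are illustrative, not zyx's actual `validate` API):

```python
from pydantic import BaseModel, Field, ValidationError

class Product(BaseModel):
    name: str
    price: float = Field(gt=0)  # constraint: price must be positive

# A well-formed payload parses cleanly.
ok = Product.model_validate({"name": "Widget", "price": 9.99})

# A constraint violation surfaces as structured error data,
# roughly the shape a "structured violations" result could take.
try:
    Product.model_validate({"name": "Widget", "price": -1})
except ValidationError as e:
    violations = [(err["loc"], err["msg"]) for err in e.errors()]
    print(violations)
```

Pydantic's `errors()` gives machine-readable locations and messages, which is what makes returning violations (rather than just raising) practical.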
ZYX provides first-class support for resources that models can read, query, or mutate:
- `File`: Filesystem files (text/JSON/YAML/TOML/etc.) with anchor-based edits
- `Code`: Code files with language-aware parsing and editing
- `Memory`: Vector-store-backed resource for add/search/delete operations
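To build intuition for what a vector-store-backed memory does, here is a toy sketch in plain Python using a bag-of-words embedding and cosine similarity (purely illustrative; this is not zyx's `Memory` API, which uses real embeddings):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

class ToyMemory:
    def __init__(self) -> None:
        self.items: list[tuple[str, Counter]] = []

    def add(self, text: str) -> None:
        self.items.append((text, embed(text)))

    def search(self, query: str, k: int = 1) -> list[str]:
        q = embed(query)
        ranked = sorted(self.items, key=lambda item: cosine(q, item[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

    def delete(self, text: str) -> None:
        self.items = [item for item in self.items if item[0] != text]

mem = ToyMemory()
mem.add("Python is a programming language")
mem.add("The sky is blue")
print(mem.search("what language is Python?"))  # most similar item first
```

The add/search/delete surface here mirrors the operations listed above; a real vector store swaps the toy embedding for model-generated vectors and an indexed similarity search.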
The context parameter in semantic operations accepts flexible input types that are automatically converted to messages. You can mix and match any combination of context types however you want—pass a single item or a list of items, and they'll be combined into a conversation history.
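The flexible-context conversion described above can be sketched as follows (a hypothetical normalizer for illustration only; zyx's real conversion supports many more input types, such as `Context` objects and snippets):

```python
def to_messages(context) -> list[dict]:
    # Hypothetical normalizer: plain strings become user messages,
    # message dicts pass through, and lists are flattened in order.
    if isinstance(context, str):
        return [{"role": "user", "content": context}]
    if isinstance(context, dict):
        return [context]
    if isinstance(context, list):
        return [msg for item in context for msg in to_messages(item)]
    raise TypeError(f"unsupported context type: {type(context).__name__}")

messages = to_messages([
    {"role": "system", "content": "You are helpful."},
    "What is Python?",
])
print(messages)
```

Because every input collapses to the same message-list shape, mixing a single string with dicts and nested lists costs nothing at the call site.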
Simple strings are treated as user messages:
```python
import zyx

result = zyx.make(target=str, context="What is Python?")
```

Use role tags to create multiple messages in a single string:
```python
import zyx

# Supported tags: [s]/[system], [u]/[user], [a]/[assistant]
context = """
[s]You are a helpful assistant.[/s]
[u]What is Python?[/u]
[a]Python is a programming language.[/a]
[u]Tell me more.[/u]
"""

result = zyx.make(target=str, context=context)
```

Use `zyx.Context` to manage conversation history, instructions, and tools across operations:
```python
import zyx

ctx = zyx.create_context(
    instructions="You are a helpful assistant",
    auto_update=True,
)

# Pass context as a list item
result1 = zyx.make(target=str, context=[ctx, "What is Python?"])
result2 = zyx.make(target=str, context=[ctx, "Tell me more about that"])
```

Use `zyx.paste()` to include files, URLs, or other content as attachments:
```python
import zyx

# Include a file
result = zyx.parse(
    source=zyx.paste("data.json"),
    target=dict,
    context=[zyx.paste("schema.txt"), "Parse this according to the schema"],
)

# Include a URL
result = zyx.query(
    source=zyx.paste("https://example.com/article"),
    target=str,
    context="Summarize this article",
)

# Include raw bytes or objects
result = zyx.parse(
    source=zyx.paste(b"raw binary data"),
    target=str,
    context="What is this?",
)
```

Use `zyx.attach()` to include a Python object as an attachment:
```python
import zyx

result = zyx.parse(
    target=str,
    attachments=[zyx.attach({"name": "John", "age": 30})],
    context="What is the name of the person?",
)
```

The real power is mixing everything together. Combine strings, Context objects, snippets, OpenAI dicts, role-tagged strings, and more—however you want:
```python
import zyx
from pydantic_ai import ModelRequest

ctx = zyx.create_context(instructions="You are a data analyst")

# Wild combination example - mix everything!
result = zyx.parse(
    source=zyx.paste("data.csv"),
    target=dict,
    context=[
        # Start with a Context object
        ctx,
        # Add a role-tagged string
        "[s]Also consider the following guidelines.[/s]",
        # Include a file snippet
        zyx.paste("instructions.md"),
        # Add an OpenAI-style system message
        {"role": "system", "content": "Be thorough in your analysis."},
        # Mix in a direct Pydantic AI message
        ModelRequest.user_text_prompt("Parse the CSV file according to all the instructions above"),
        # Add more snippets
        zyx.paste("reference_data.json"),
        # Another role-tagged message
        "[u]Focus on accuracy over speed.[/u]",
        # More OpenAI dicts
        {
            "role": "assistant",
            "content": "I understand. I'll parse the data carefully.",
        },
        # Final instruction as plain string
        "Now proceed with the parsing.",
    ],
)

# Another example: building context dynamically
messages = [
    {"role": "system", "content": "You are a code reviewer."},
    zyx.paste("code.py"),
    "[s]Review this code for security issues.[/s]",
    zyx.paste("security_guidelines.md"),
    "Provide detailed feedback.",
]

result = zyx.query(
    source=zyx.paste("code.py"),
    target=str,
    context=messages,
)
```

Pydantic models are automatically converted to dictionaries:
```python
import zyx
from pydantic import BaseModel

class Message(BaseModel):
    role: str
    content: str

msg = Message(role="user", content="Hello!")
result = zyx.make(target=str, context=[msg])
```

ZYX leverages Python's type system and Pydantic for full type safety:
```python
import zyx
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

result = zyx.parse(
    source='{"name": "Alice", "age": 30}',
    target=User,
    model="openai:gpt-4o-mini",
)

# result.output is fully typed as User
print(result.output.name)  # Type-checked!
print(result.output.age)   # Type-checked!
```

For more detailed documentation, examples, and guides, visit the documentation at https://zyx.hammad.app.
This project is licensed under the terms of the MIT license.
