Integrations

Add compliance to your existing AI stack.

AI Frameworks

LangChain

The most popular framework for building LLM applications.

View Guide →

Supported:

  • Agents (ReAct, OpenAI Functions, Structured Chat)
  • Chains (LCEL, legacy)
  • Tools and retrievers
  • All LLM providers

from langchain.agents import AgentExecutor
from protectron.langchain import ProtectronCallback

# agent and tools come from your existing LangChain setup
callback = ProtectronCallback(system_id="my-agent")
executor = AgentExecutor(agent=agent, tools=tools, callbacks=[callback])
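
The callback also covers LCEL chains (see Supported above). A minimal sketch, assuming an OpenAI chat model via langchain-openai; the prompt and model are illustrative, and the callback is passed through LangChain's standard config mechanism:

# Minimal sketch: attaching the same callback to an LCEL chain
# (prompt and model here are illustrative)
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from protectron.langchain import ProtectronCallback

callback = ProtectronCallback(system_id="my-agent")

chain = ChatPromptTemplate.from_template("Summarize: {text}") | ChatOpenAI()
result = chain.invoke({"text": "..."}, config={"callbacks": [callback]})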

CrewAI

Multi-agent orchestration framework.

View Guide →

Supported:

  • Multi-agent crews
  • Task delegation
  • Per-agent logging
  • Inter-agent communication

from crewai import Crew
from protectron.crewai import ProtectronCallback

# Attach the callback to log every agent and task in the crew
callback = ProtectronCallback(system_id="my-crew")
crew = Crew(agents=[...], tasks=[...], callbacks=[callback])
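
Running the crew is unchanged: with the callback attached as above, each agent's steps and task results are logged. A minimal usage sketch using CrewAI's standard kickoff:

# Run the crew as usual; the attached callback logs agent and task activity
result = crew.kickoff()
print(result)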

AutoGen

Microsoft's multi-agent conversation framework.

Supported:

  • Agent conversations
  • Code execution logging
  • Human-in-the-loop
  • Group chat

from protectron.autogen import ProtectronLogger

# Create a logger that sends AutoGen conversation events to Protectron
logger = ProtectronLogger(system_id="autogen-app")
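
Continuing from the snippet above, a minimal wiring sketch under the assumption that ProtectronLogger implements AutoGen's BaseLogger interface and can be passed to AutoGen's runtime logging hook (the exact registration step may differ by AutoGen version, so check the guide):

# Assumption: AutoGen's runtime logging hook accepts the Protectron logger
import autogen

autogen.runtime_logging.start(logger=logger)

# ... run your agent conversations as usual ...

autogen.runtime_logging.stop()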

Vercel AI SDK

TypeScript SDK for AI applications.

Supported:

  • streamText / generateText
  • All model providers
  • Edge runtime compatible

import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { withProtectron } from '@protectron/vercel-ai';

// Wrap the call so the generation is traced under your system ID
const result = await withProtectron(
    streamText({ model: openai('gpt-5.2'), prompt }),
    { systemId: 'my-app' }
);

Custom Integration

Building something custom? Use our base SDK.

Python

from protectron import Protectron

protectron = Protectron(system_id="custom-app")

# Group related events under a single session trace
with protectron.trace("session-123") as trace:
    trace.log_llm_call(model="gpt-5.2", input=[...], output="...")
    trace.log_tool_call(tool="search", input={...}, output={...})

TypeScript

import { Protectron } from '@protectron/sdk';

const protectron = new Protectron({ systemId: 'custom-app' });
await protectron.logEvent('llm_call', { model: 'gpt-5.2', ... });

→ SDK Overview

LLM Providers

Works with any LLM provider through our framework integrations:

Provider           Status
OpenAI             ✅ Supported
Anthropic          ✅ Supported
Google (Gemini)    ✅ Supported
Azure OpenAI       ✅ Supported
AWS Bedrock        ✅ Supported
Mistral            ✅ Supported
Cohere             ✅ Supported
Ollama             ✅ Supported
Local models       ✅ Supported
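
Because events are captured at the framework layer, swapping providers requires no Protectron-specific changes. An illustrative sketch via the LangChain integration, assuming langchain-openai and langchain-anthropic are installed (model names are illustrative):

# Illustrative: the same callback traces calls regardless of provider
from langchain_anthropic import ChatAnthropic
from langchain_openai import ChatOpenAI
from protectron.langchain import ProtectronCallback

callback = ProtectronCallback(system_id="my-app")

for llm in (ChatOpenAI(model="gpt-5.2"), ChatAnthropic(model="claude-3-5-sonnet-latest")):
    llm.invoke("Hello!", config={"callbacks": [callback]})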

Coming Soon

Integration                 Status             ETA
Haystack                    🔄 In Development  Q1 2026
LlamaIndex                  📋 Planned         Q1 2026
Slack (notifications)       📋 Planned         Q1 2026
Jira (requirements sync)    📋 Planned         Q2 2026

Want a specific integration? Let us know.

Installation

Python

# Core SDK
pip install protectron

# With framework support
pip install protectron[langchain]
pip install protectron[crewai]
pip install protectron[autogen]
pip install protectron[all]

TypeScript

npm install @protectron/sdk
npm install @protectron/vercel-ai

Get Started

1. Create an account: free 14-day trial
2. Install the SDK: choose your framework
3. Add the callback: one line of code
4. View your traces: see events in your dashboard

→ Quick Start Guide

Ready to Integrate?

Add EU AI Act compliance to your AI stack in minutes.

Need help with integration? Contact support@protectron.ai