LangWatch Quick Preview

What is LangWatch?

LangWatch is the open-source LLMOps platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications. All platform features are natively integrated to accelerate the development workflow.

Building AI applications is hard. Developers spend weeks debugging issues, optimizing prompts, and ensuring quality. Without proper observability, you're flying blind: you don't know why your AI behaves the way it does, where it fails, or how to improve it.

LangWatch provides the missing operations platform for AI applications. Every LLM call, tool usage, and user interaction is automatically tracked with detailed traces, spans, and metadata. See the full conversation flow, identify bottlenecks, and understand exactly how your AI applications behave in production.
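The traces and spans mentioned above follow the standard observability model: a trace represents one end-to-end request, composed of spans for each LLM call or tool invocation. A toy sketch of that data model in plain Python (no LangWatch SDK involved; all names here are illustrative, not LangWatch APIs):

```python
from dataclasses import dataclass, field

@dataclass
class Span:
    """One unit of work inside a trace, e.g. an LLM call or a tool invocation."""
    name: str
    start: float
    end: float
    metadata: dict = field(default_factory=dict)

    @property
    def duration_ms(self) -> float:
        return (self.end - self.start) * 1000

@dataclass
class Trace:
    """One end-to-end request, composed of spans."""
    trace_id: str
    spans: list = field(default_factory=list)

    def record(self, name: str, start: float, end: float, **metadata) -> Span:
        span = Span(name, start, end, metadata)
        self.spans.append(span)
        return span

    def slowest_span(self) -> Span:
        """Find the bottleneck: the span with the longest duration."""
        return max(self.spans, key=lambda s: s.duration_ms)

# Simulate a request that makes one retrieval call and one LLM call.
trace = Trace("req-123")
trace.record("vector_search", 0.00, 0.05, top_k=5)
trace.record("llm_call", 0.05, 1.25, model="gpt-4o-mini", tokens=812)

print(trace.slowest_span().name)  # → llm_call
```

Inspecting durations and metadata per span is exactly how you spot that, say, retrieval is fast but the generation step dominates latency.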

Core Features

LangWatch provides everything you need to build, monitor, and optimize LLM applications through four core capabilities.

For Every Role

LangWatch serves different needs across your organization, providing value to every team member working with AI applications.

For Developers

Debug faster with detailed traces that show exactly what happened in each LLM call. Build datasets from production data, run batch evaluations, and continuously improve your AI applications with comprehensive debugging tools and performance insights.
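Batch evaluation, as described above, boils down to running a dataset of input/expected pairs through your application and scoring each output. A self-contained sketch of the idea (plain Python; the dataset, app, and scorer below are made-up stand-ins, not LangWatch APIs):

```python
from typing import Callable

def run_batch_eval(dataset: list,
                   app: Callable[[str], str],
                   scorer: Callable[[str, str], float]) -> dict:
    """Run every example through the app, score the output, aggregate results."""
    scores = []
    failures = []
    for example in dataset:
        output = app(example["input"])
        score = scorer(output, example["expected"])
        scores.append(score)
        if score < 0.5:  # arbitrary pass threshold for this sketch
            failures.append(example["input"])
    return {"mean_score": sum(scores) / len(scores), "failures": failures}

# Stand-in "application" and an exact-match scorer.
dataset = [
    {"input": "2+2", "expected": "4"},
    {"input": "capital of France", "expected": "Paris"},
]
app = lambda q: {"2+2": "4", "capital of France": "Lyon"}.get(q, "")
scorer = lambda out, exp: 1.0 if out == exp else 0.0

report = run_batch_eval(dataset, app, scorer)
print(report)  # → {'mean_score': 0.5, 'failures': ['capital of France']}
```

In practice the dataset would be built from captured production traces and the scorer would be an evaluator (exact match, similarity, or LLM-as-judge), but the loop is the same.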

For Domain Experts

Easily sift through conversations, see topics being discussed, and annotate messages for improvement in a collaborative manner with the development team. Provide feedback on AI outputs and help guide quality improvements through intuitive interfaces.

For Business Teams

Track conversation metrics, user analytics, and cost tracking with custom dashboards and reporting. Monitor AI application performance, understand user behavior, and make data-driven decisions about your AI investments.
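Cost tracking for LLM usage is, at its core, token counts multiplied by per-model prices. A minimal sketch (the model names and per-1K-token prices below are illustrative placeholders, not real rates):

```python
# Illustrative per-1K-token prices; real prices vary by model and provider.
PRICES = {
    "small-model": {"input": 0.0005, "output": 0.0015},
    "large-model": {"input": 0.0100, "output": 0.0300},
}

def call_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost of one LLM call in dollars, given its token counts."""
    p = PRICES[model]
    return (input_tokens / 1000) * p["input"] + (output_tokens / 1000) * p["output"]

# Aggregate cost across a batch of tracked calls.
calls = [
    ("small-model", 1200, 300),
    ("large-model", 800, 500),
]
total = sum(call_cost(m, i, o) for m, i, o in calls)
print(f"${total:.4f}")
```

A dashboard then slices these per-call costs by model, user, or conversation to show where the spend actually goes.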

Where to Start?

Setting up the full process of online tracing, prompt management, production evaluations, and offline evaluations requires some time. This guide helps you figure out what’s most important for your use case.

Quick Start

Ready to add observability to your LLM application? LangWatch integrates with your existing codebase in just a few lines of code, regardless of your tech stack.
1. Sign up for free

Create your account at app.langwatch.ai to get started with our free tier.
2. Install the SDK

Install the SDK in your project:
pip install langwatch
3. Set up the SDK

Configure the SDK in your project. SDKs are available for Python, TypeScript, and Go; the quick setup below uses Python:
import langwatch
langwatch.setup()
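With no arguments, `setup()` is typically configured through the environment. Assuming the API key is read from a `LANGWATCH_API_KEY` environment variable (confirm the exact variable name in the Python guide for your SDK version), export it before running your application:

```shell
# Assumed environment variable; see the Python guide for your SDK version.
export LANGWATCH_API_KEY="<your-api-key>"
```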


4. Start tracking

Your LLM calls are automatically tracked and visible in the LangWatch dashboard.
Ready to get started? Sign up for free and begin building better AI applications today.