Documentation

Welcome to LangSpend

Track LLM costs with zero-code attribution. Wrap your OpenAI, Anthropic, or AWS Bedrock client once, and every request is tracked automatically with customer-level attribution.

Quick Start

1. Install the SDK

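A minimal install sketch, assuming the SDK is published on npm under the package name `langspend` (the exact package name may differ; check your dashboard for the install command):

```bash
npm install langspend
```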

2. Get Your API Key

Sign up at app.langspend.com to get your free API key.
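
A common pattern is to keep the key in an environment variable rather than in source code; the variable name `LANGSPEND_API_KEY` used here and in the example below is an assumption, not necessarily what the SDK reads by default:

```bash
export LANGSPEND_API_KEY="your-api-key"
```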

3. Wrap Your LLM Client

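A minimal sketch of what the wrapping step might look like in `index.ts`, assuming a hypothetical `LangSpend` class with a `wrap()` method and an attribution options object; the real SDK's names and signatures may differ:

```typescript
import OpenAI from 'openai';
// Hypothetical import: package and export names are assumptions.
import { LangSpend } from 'langspend';

// Initialize LangSpend with the API key from step 2 (env var name is an assumption).
const langspend = new LangSpend({ apiKey: process.env.LANGSPEND_API_KEY });

// Wrap the existing OpenAI client once; every call made through the
// wrapped client is tracked automatically and attributed to this customer.
const openai = langspend.wrap(
  new OpenAI({ apiKey: process.env.OPENAI_API_KEY }),
  { customerId: 'cust_123' } // hypothetical attribution option
);

async function main() {
  // Use the wrapped client exactly as you would the plain OpenAI client.
  const completion = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: 'Hello!' }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```

From here on, the wrapped client behaves like the original, so existing call sites don't need to change.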

Features

🎯 Zero-Code Tracking

Wrap your existing LLM clients once, and every call through them is tracked automatically. No per-call instrumentation or further code changes required.

📊 Customer Attribution

Track costs per customer, feature, and environment. Know exactly who's using what.
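
As a sketch of how those three dimensions might be expressed, again assuming the hypothetical options object from the quick start (the field names are assumptions, not the SDK's documented API):

```typescript
import OpenAI from 'openai';
import { LangSpend } from 'langspend'; // hypothetical import, as in the quick start

const langspend = new LangSpend({ apiKey: process.env.LANGSPEND_API_KEY });

// Hypothetical attribution fields; the real option names may differ.
const openai = langspend.wrap(new OpenAI(), {
  customerId: 'acme-corp',     // who the usage is attributed to
  feature: 'document-summary', // which product feature made the call
  environment: 'production',   // keep production and staging spend separate
});
```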

💰 Real-Time Cost Calculation

Get accurate cost estimates based on the latest pricing from OpenAI, Anthropic, and AWS.
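
For token-priced models, the arithmetic behind such an estimate is generally input_tokens × input_rate + output_tokens × output_rate. A small illustration (the rates below are placeholders, not LangSpend's live pricing data):

```typescript
// Illustrative only: placeholder rates, not live provider pricing.
const usage = { inputTokens: 1_200, outputTokens: 350 };
const rate = { inputPerMTok: 2.5, outputPerMTok: 10.0 }; // USD per million tokens

const costUsd =
  (usage.inputTokens / 1_000_000) * rate.inputPerMTok +
  (usage.outputTokens / 1_000_000) * rate.outputPerMTok;

console.log(costUsd.toFixed(6)); // ≈ 0.0065
```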

🚀 Multi-Provider Support

Works with OpenAI, Anthropic, AWS Bedrock, and Google Vertex AI.
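
Assuming the same hypothetical `wrap()` call applies across providers, multi-provider usage might look like the sketch below (the client constructors come from the providers' real SDKs; the wrapper usage itself is an assumption):

```typescript
import OpenAI from 'openai';
import Anthropic from '@anthropic-ai/sdk';
import { BedrockRuntimeClient } from '@aws-sdk/client-bedrock-runtime';
import { LangSpend } from 'langspend'; // hypothetical import

const langspend = new LangSpend({ apiKey: process.env.LANGSPEND_API_KEY });

// One wrapper, several providers: costs from all of them land in one place.
const openai = langspend.wrap(new OpenAI());
const anthropic = langspend.wrap(new Anthropic());
const bedrock = langspend.wrap(new BedrockRuntimeClient({ region: 'us-east-1' }));
```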