Show HN: LangSpend – Track LLM costs by feature and customer (OpenAI/Anthropic) https://ift.tt/QcRMCE8

We're two developers who got hit twice by LLM cost problems and built LangSpend to fix them.

First: we couldn't figure out which features in our SaaS were expensive to run or which customers were costing us the most, which made it impossible to price properly or spot runaway costs.

Second: we burned 80% of our $1,000 AWS credits on Claude 4 (AWS Bedrock) in just two months while building prototypes, with zero visibility into which experiments were eating the budget.

So we built LangSpend, a simple SDK that wraps your LLM calls and tracks costs per customer and per feature.

How it works:
- Wrap your LLM calls and tag them with customer/feature metadata (a rough sketch of the idea is below).
- The dashboard shows you who's costing what in real time.
- Node.js and Python SDKs are currently supported.

Still early days, but it's solving our problem. Try it out and let me know if it helps you too.

- https://langspend.com
- Docs: https://ift.tt/KHErVTq
- Discord: https://ift.tt/1n6kKWr
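To make the per-customer/per-feature idea concrete, here is a minimal Python sketch of what tagging calls and aggregating cost per tag can look like. It is not the LangSpend SDK (see the docs link for the real interface); the CostTracker class, the model name, and the per-token prices are all illustrative assumptions.

```python
# Illustrative sketch only, NOT the LangSpend SDK API.
# Shows the general idea: tag each LLM call with customer/feature metadata
# and accumulate cost per (customer, feature) pair.
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical per-token prices in USD; real prices vary by model and provider.
PRICES = {
    "example-model": {"input": 0.15 / 1_000_000, "output": 0.60 / 1_000_000},
}

@dataclass
class Usage:
    input_tokens: int
    output_tokens: int

class CostTracker:
    def __init__(self):
        # (customer_id, feature) -> accumulated cost in USD
        self.totals = defaultdict(float)

    def record(self, customer_id: str, feature: str, model: str, usage: Usage) -> float:
        price = PRICES[model]
        cost = (usage.input_tokens * price["input"]
                + usage.output_tokens * price["output"])
        self.totals[(customer_id, feature)] += cost
        return cost

tracker = CostTracker()
# After each LLM call, pass along the token usage the provider reports:
tracker.record("cust_42", "summarizer", "example-model",
               Usage(input_tokens=1200, output_tokens=300))
print(dict(tracker.totals))
```

The SDK's job is essentially this bookkeeping done for you at the call site, with the totals pushed to a dashboard instead of an in-memory dict.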
