o11y.tips

Practical observability guides for practitioners


Stop Overpaying for LLM Observability: Reducing Tail-Based Sampling Memory Overhead

Learn how to optimize OpenTelemetry Collector memory usage for LLM traces by implementing attribute stripping and a two-layer tail-based sampling architecture.
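The two techniques the article covers can be sketched as a minimal OpenTelemetry Collector configuration: an `attributes` processor deletes bulky LLM payload attributes before spans are buffered, and the `tail_sampling` processor then makes keep/drop decisions over the slimmed-down traces. Attribute keys, thresholds, and pipeline wiring here are illustrative assumptions, not the article's exact setup.

```yaml
processors:
  # Layer 1: strip large prompt/completion attributes before spans
  # sit in the tail-sampling buffer (assumed gen_ai.* attribute keys)
  attributes/strip-llm:
    actions:
      - key: gen_ai.prompt
        action: delete
      - key: gen_ai.completion
        action: delete
  # Layer 2: tail-based sampling over the reduced spans
  tail_sampling:
    decision_wait: 10s        # how long to buffer before deciding
    num_traces: 50000         # max traces held in memory
    policies:
      - name: keep-errors
        type: status_code
        status_code: {status_codes: [ERROR]}
      - name: keep-slow
        type: latency
        latency: {threshold_ms: 2000}

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [attributes/strip-llm, tail_sampling, batch]
      exporters: [otlp]
```

Stripping attributes before the `tail_sampling` processor matters because its memory footprint scales with the size of every buffered span, and LLM prompts routinely dominate that size.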

