Dev Proxy tracks LLM use, estimates costs

infoq.com

Dev Proxy v0.28 introduces telemetry features that let developers track LLM usage and estimate costs for OpenAI and Azure OpenAI requests, enabling closer monitoring of AI-related expenses. The new OpenAITelemetryPlugin records model usage, token counts, and cost estimates per intercepted request. It also integrates with Microsoft's Foundry Local, letting developers run local AI models for cost savings. The release further brings improvements for .NET Aspire users, expanded OpenAI payload support, and enhanced TypeSpec generation, and the Dev Proxy Toolkit for Visual Studio Code has been updated with new features.
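As a rough illustration, enabling the plugin typically amounts to registering it in Dev Proxy's JSON configuration file (devproxyrc.json) alongside the URLs to intercept. This is a minimal sketch, not the official sample: the plugin assembly path and the OpenAI URL pattern shown here are assumptions and may differ between Dev Proxy releases.

```json
{
  "plugins": [
    {
      "name": "OpenAITelemetryPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll"
    }
  ],
  "urlsToWatch": [
    "https://api.openai.com/*"
  ]
}
```

With a configuration along these lines, requests sent through the proxy to the watched endpoints are inspected, and the plugin reports per-model token counts and estimated costs when the session ends.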


