By Robert Terhaar

Announcing the New apiAuditor Mode for Proxati LLM_Proxy

We are excited to announce the latest development in the Proxati Open Source LLM_Proxy: the apiAuditor mode. This new feature provides developers with real-time tracking and visualization of their LLM API spend.


Understanding the cost of LLM APIs can be difficult, especially with non-Latin scripts such as Hebrew, Greek, Korean, and Japanese, which typically require more tokens per character. Even for applications written entirely in English, tracking LLM API usage costs is a challenge.


Real-Time Cost Tracking

The apiAuditor mode monitors your LLM API spend in real time by analyzing each request. By default, the proxy listens on localhost:8080 and prints a cost summary for each LLM request it can identify. Here is an example of what this looks like:

$ ./llm_proxy apiAuditor
URL: https://api.openai.com/v1/chat/completions Model: gpt-3.5-turbo inputCost: $0.000102 outputCost $0.000252 = Request Cost: $0.000354 Grand Total: $0.000354
URL: https://api.openai.com/v1/chat/completions Model: gpt-3.5-turbo inputCost: $0.000102 outputCost $0.000432 = Request Cost: $0.000534 Grand Total: $0.000888
URL: https://api.openai.com/v1/chat/completions Model: gpt-3.5-turbo inputCost: $0.000102 outputCost $0.000138 = Request Cost: $0.00024 Grand Total: $0.001128
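Each line covers a single request: inputCost plus outputCost gives the Request Cost, and Grand Total is the running sum across all requests. In the example above, $0.000354 + $0.000534 = $0.000888 after the second request.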

Implementation Guide


To use the apiAuditor mode, configure the OpenAI Python library to route all traffic through the local proxy server instance:
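As a minimal sketch, assuming the openai Python SDK v1.x with httpx and the proxy running on its default address, the configuration might look like this. The verify=False setting is an assumption for a proxy that re-signs HTTPS traffic; trusting the proxy's CA certificate is the safer alternative.

import httpx
from openai import OpenAI

# Send all SDK traffic through the local llm_proxy instance.
# verify=False is an assumption: a proxy that inspects HTTPS
# traffic re-signs certificates, so either skip verification
# here or trust the proxy's CA certificate instead.
client = OpenAI(
    http_client=httpx.Client(
        proxy="http://localhost:8080",  # proxy= needs httpx >= 0.26; older versions use proxies=
        verify=False,
    )
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)

With this in place, every chat completion call made through the client should show up in the apiAuditor console output shown above.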


Next up...

Only the OpenAI chat completion API is currently supported, but we will be adding more providers and endpoints soon. Let us know which providers are most important to you! We will also be adding support for structured output formats, starting with JSON.


The apiAuditor mode was built in direct response to user feedback, and we want to hear from you. Our goal is to make AI APIs easier to operate and manage. Try it out and share your feedback to help us keep improving the Proxati Open Source LLM_Proxy.


Check it out on GitHub: https://github.com/Proxati/llm_proxy

