
[Bug][Question]: does cost track for batch if I don't retrieve batch? #10011


Open
yeahyung opened this issue Apr 15, 2025 · 2 comments
Labels
bug Something isn't working

Comments


yeahyung commented Apr 15, 2025

What happened?

I'm trying to use the enterprise feature (with a trivial key) to track the cost of a batch.

The docs say LiteLLM polls the batch status until completion (https://docs.litellm.ai/docs/batches#how-cost-tracking-for-batches-api-works).

Does this mean that when I (the client) create a batch using LiteLLM, LiteLLM continuously polls the batch status until completion and aggregates the usage?

I created a batch using LiteLLM, but that didn't happen. Instead, I had to retrieve the batch via LiteLLM myself, and only when the batch status changed to completed were the costs calculated and stored in the SpendLogs table.

Have I misunderstood the feature (docs)? Please correct me if I'm wrong. Thank you! @krrishdholakia @ishaan-jaff
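For now, the behavior I observed means the client has to drive the polling itself. A minimal client-side sketch of that loop (this is not LiteLLM's internal polling job; `retrieve_status` is a hypothetical callable standing in for a real retrieve call such as `litellm.retrieve_batch`, and the status names assume the OpenAI-style batch lifecycle):

```python
import time

def poll_until_complete(retrieve_status, batch_id, interval_s=1.0, max_polls=60):
    """Poll a batch until it reaches a terminal status.

    retrieve_status: hypothetical callable taking a batch id and returning
    its current status string (stand-in for an actual retrieve-batch call).
    Cost tracking, as observed in this issue, fires once the retrieved
    status is "completed".
    """
    terminal = {"completed", "failed", "expired", "cancelled"}
    for _ in range(max_polls):
        status = retrieve_status(batch_id)
        if status in terminal:
            return status
        time.sleep(interval_s)
    raise TimeoutError(f"batch {batch_id} not terminal after {max_polls} polls")

# Simulated status sequence instead of real API calls:
statuses = iter(["validating", "in_progress", "completed"])
final = poll_until_complete(lambda _id: next(statuses), "batch_abc", interval_s=0.0)
print(final)  # completed
```

In a real deployment the lambda would be replaced by a call that retrieves the batch from the proxy, and the loop would run until the proxy records the spend.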

Relevant log output

Are you an ML Ops Team?

No

What LiteLLM version are you on?

v1.65.4

Twitter / LinkedIn details

No response

@yeahyung yeahyung added the bug Something isn't working label Apr 15, 2025
@krrishdholakia
Contributor

Hey @yeahyung, there is a background polling job, but there's room for improvement there (i.e. currently it will die if the proxy is restarted).

Do you have Redis or a DB set up?

@yeahyung
Author

> Hey @yeahyung there is a background polling job but there's room for improvement there (i.e. currently it will die if proxy restarted)
>
> Do you have redis or db setup?

@krrishdholakia I used a DB only.
Doesn't it support the background polling job if I start LiteLLM using the main method in proxy_cli.py?
