
Add wandb (Weights & Biases) pip package to proxy Docker image #3654

Conversation

@msabramo (Contributor) commented May 15, 2024

I'd like to try the [Weights & Biases
integration](https://docs.litellm.ai/docs/observability/wandb_integration). I'm
using the standard Docker image (e.g.:
`ghcr.io/berriai/litellm:main-v1.37.9-stable`). If I add this to the proxy
config:

```yaml
success_callback: ["wandb"]
```

then I get this error:

```pytb
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 3342, in startup_event
    proxy_logging_obj._init_litellm_callbacks()  # INITIALIZE LITELLM CALLBACKS ON SERVER STARTUP <- do this to catch any logging errors on startup, not when calls are being made
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/proxy/utils.py", line 166, in _init_litellm_callbacks
    litellm.utils.set_callbacks(callback_list=callback_list)
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 7365, in set_callbacks
    raise e
  File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 7339, in set_callbacks
    weightsBiasesLogger = WeightsBiasesLogger()
                          ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/litellm/integrations/weights_biases.py", line 190, in __init__
    raise Exception(
Exception:  wandb not installed, try running 'pip install wandb' to fix this error

ERROR:    Application startup failed. Exiting.
```

So I propose to add `pip install wandb` to the `Dockerfile`.
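Concretely, that could be a single extra layer, sketched below; the exact placement and any version pin depend on the existing `Dockerfile` and its constraints:

```dockerfile
# Install the wandb SDK so the "wandb" success_callback can import it at startup.
# --no-cache-dir keeps the layer from carrying pip's download cache.
RUN pip install --no-cache-dir wandb
```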

It's not excessively large:

```shell
$ du -sh $VIRTUAL_ENV/lib/python3.12/site-packages/wandb
 31M	/Users/abramowi/Library/Caches/pypoetry/virtualenvs/litellm-Fe7WjZrx-py3.12/lib/python3.12/site-packages/wandb
```
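For reference, the `success_callback` snippet above sits under `litellm_settings` in a full proxy `config.yaml`; this is a sketch following the LiteLLM proxy docs, and the model entry is purely illustrative:

```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo

litellm_settings:
  success_callback: ["wandb"]
```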

@msabramo (Contributor Author):

@krrishdholakia: Did you see this one?

@krrishdholakia (Contributor):

Not yet - finishing up a PR - will review after (in an hour)

cc: @ishaan-jaff

@ishaan-jaff (Contributor) commented May 16, 2024

@msabramo instead of adding a new dependency, we've decided to move to using httpx for wandb logging. We'd prefer to keep the LiteLLM Dockerfile quite light.

Is this an urgent need? Happy to prioritize it accordingly.
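For illustration, an httpx-based logger could look roughly like this. This is only a sketch, not LiteLLM's actual implementation; the endpoint URL is a placeholder, since W&B's real HTTP ingestion API is not documented here, and the function names are hypothetical:

```python
from datetime import datetime

# Placeholder endpoint -- the real W&B ingestion API is not shown in this thread.
WANDB_ENDPOINT = "https://example.invalid/wandb-ingest"


def build_wandb_payload(kwargs: dict, response_obj,
                        start_time: datetime, end_time: datetime) -> dict:
    """Flatten a LiteLLM call into a JSON-serializable log record."""
    return {
        "model": kwargs.get("model"),
        "messages": kwargs.get("messages"),
        "response": response_obj,
        "latency_s": (end_time - start_time).total_seconds(),
    }


def log_success_event(kwargs, response_obj, start_time, end_time) -> None:
    # Deferred import: httpx already ships with the proxy image, and this way
    # the payload helper above stays importable without it.
    import httpx

    payload = build_wandb_payload(kwargs, response_obj, start_time, end_time)
    with httpx.Client(timeout=5.0) as client:
        client.post(WANDB_ENDPOINT, json=payload).raise_for_status()
```

The point of the split is that only one plain HTTP POST replaces the 31 MB `wandb` package in the image.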

@msabramo (Contributor Author):

> @msabramo instead of adding a new dependency, we've decided to move to using httpx for wandb logging. We'd prefer to keep the LiteLLM Dockerfile quite light.
>
> Is this an urgent need? Happy to prioritize it accordingly.

Oh okay. Not an urgent need at all. You can take your time on that.

@msabramo (Contributor Author):

It looks like the LangSmith integration doesn't require a separate package and just uses `requests`?

@ishaan-jaff (Contributor):

@msabramo we highly recommend Langfuse if you're trying to log LLM I/O: https://docs.litellm.ai/docs/proxy/logging#logging-proxy-inputoutput---langfuse

It's our most used and best-tested integration.

@ishaan-jaff (Contributor):

Closing since we discussed other options.
