
[Feature]: Preloading OIDC/JWT upstream tokens on proxy start #3672

Open
Manouchehri opened this issue May 16, 2024 · 0 comments
Labels: enhancement (New feature or request)

@Manouchehri (Collaborator)
The Feature

When the proxy config is parsed, we should proactively fetch (and cache) the OIDC tokens (and the exchanged JWTs/tokens) for upstream LLMs.

There are only two open questions I'd like help with (before I implement this myself) 🙂:

  1. Does this already happen automatically for Azure OpenAI, since I think LiteLLM creates the client at startup?
  2. For Bedrock (and maybe Azure OpenAI), where should I insert a caching/prefetching function that runs whenever an oidc/ value is defined in azure_ad_token and/or aws_web_identity_token?
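To make the shape of this concrete, here is a minimal sketch of what a startup prefetch hook could look like. Note that `prefetch_oidc_tokens`, `fetch_token`, and the config shape here are assumptions for illustration, not LiteLLM's actual internal API:

```python
# Hypothetical sketch: walk the parsed proxy model list, find credentials
# that use the oidc/ scheme, and warm a token cache before the first request.
# prefetch_oidc_tokens and the config field names are assumptions, not
# LiteLLM's real API.
from typing import Callable, Dict, List

# Credential fields that may carry an oidc/ value (per the question above).
TOKEN_FIELDS = ("azure_ad_token", "aws_web_identity_token")


def prefetch_oidc_tokens(
    model_list: List[Dict],
    fetch_token: Callable[[str], str],
    cache: Dict[str, str],
) -> Dict[str, str]:
    """Eagerly exchange every oidc/-prefixed credential in the parsed
    config for an upstream token, storing results in `cache` so the
    first real request skips the token-exchange round trip."""
    for model in model_list:
        params = model.get("litellm_params", {})
        for field in TOKEN_FIELDS:
            value = params.get(field)
            if (
                isinstance(value, str)
                and value.startswith("oidc/")
                and value not in cache
            ):
                # fetch_token stands in for the real OIDC fetch + token
                # exchange (e.g. AssumeRoleWithWebIdentity for Bedrock).
                cache[value] = fetch_token(value)
    return cache
```

The hook would be called once right after config parsing; a TTL or expiry-aware cache would be needed in practice, since both the OIDC token and the exchanged upstream token expire.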

Motivation, pitch

This should improve performance on the first request to Amazon Bedrock when using OIDC, even more so for GitHub Actions and Google Cloud Run (which need to make an HTTP request first to fetch an OIDC token).

Twitter / LinkedIn details

https://www.linkedin.com/in/davidmanouchehri/

@Manouchehri Manouchehri added the enhancement New feature or request label May 16, 2024
@Manouchehri Manouchehri self-assigned this May 16, 2024