
LLM service providers #83

Open
haesleinhuepf opened this issue Dec 13, 2024 · 2 comments

Comments

@haesleinhuepf
Member

Just in case using local LLMs doesn't work well with huggingface, there are a few options:

  • ScaDS.AI has a dedicated LLM server here, which is only accessible using TU Dresden VPN.
  • Alternatively, we could use the LLM server from KISSKI, a service provided for free to German academics.
  • I presume ollama can also be installed in Docker.

You can test API connections to the services as demonstrated here:
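Since the linked demonstration isn't included here, a minimal sketch of such a connectivity test follows. It assumes the server exposes an OpenAI-compatible REST API (as ollama and many hosted LLM services do) and queries the `/v1/models` endpoint to verify that the server is reachable and the API key is accepted; the base URL and environment variable name are placeholders to be replaced with the actual service details.

```python
# Minimal connectivity check for an OpenAI-compatible LLM server.
# Assumption: the server exposes GET /v1/models with Bearer-token auth;
# the base URL and API_KEY variable are placeholders, not the real service.
import os
import urllib.request


def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET request for the OpenAI-compatible /v1/models endpoint."""
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )


def list_models(base_url: str, api_key: str) -> str:
    """Return the server's raw JSON model listing; raises on connection failure."""
    req = build_models_request(base_url, api_key)
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode()


if __name__ == "__main__":
    # Example against a local ollama instance (default port 11434);
    # ollama does not require a real key, so any placeholder works.
    print(list_models("http://localhost:11434", os.environ.get("API_KEY", "none")))
```

If the call returns a JSON list of models, the endpoint and credentials are working; a `URLError` or HTTP 401 points to a network/VPN or API-key problem respectively.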

@SeverusYixin
Collaborator

Thank you :)

@SeverusYixin
Collaborator

Hi @haesleinhuepf, do you know how to get a TUD VPN login username?
