Add Privacy & Data security to Alwrity #51

Open
AJaySi opened this issue Apr 25, 2024 · 1 comment

@AJaySi (Owner) commented Apr 25, 2024

API calls to hosted LLM providers are not private. Privacy can be supported by running the LLMs locally on your laptop, which ensures that nothing leaves the machine.

Ollama integration with Alwrity has been pending for some time now. Ollama runs LLMs locally on your laptop, so there are no API calls and no transfer of data to the outside world.

This would also make Alwrity truly privacy-focused and free: because Alwrity would use locally hosted LLMs, it frees its users from LLM subscription models and monthly payouts.

https://ollama.com/
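
For context, here is a minimal sketch of what calling a locally hosted model could look like. It assumes `ollama serve` is running on the default port (11434) and that a model such as `llama3` has already been pulled; the request shape follows Ollama's public REST API (`POST /api/generate`), and the helper name is just an example, not existing Alwrity code.

```python
import requests

# Minimal sketch: generate text from a locally running Ollama server.
# Assumes `ollama serve` is running and a model (e.g. "llama3") has been
# pulled with `ollama pull llama3`.
OLLAMA_URL = "http://localhost:11434/api/generate"

def generate_locally(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single JSON object instead of a stream
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=300)
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    # The prompt never leaves the machine: the only network hop is to localhost.
    print(generate_locally("Write a two-sentence blog intro about data privacy."))
```

Alwrity's existing text-generation calls could be routed through a helper like this when a local provider is selected, so prompts never leave the user's machine.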

@AJaySi AJaySi added the enhancement New feature or request label Apr 25, 2024
@AJaySi AJaySi self-assigned this Apr 25, 2024
@AJaySi (Owner, Author) commented Apr 26, 2024

Advantages of Using Local Models:

  • Privacy: Local models process your data within your own infrastructure, keeping it private.
  • Customization: You can customize the model to better suit the specific needs of your tasks (see the sketch after this list).
  • Performance: Depending on your setup, local models can offer performance benefits, especially lower latency.
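
To make the customization point concrete, here is a rough sketch (example prompt and parameter values only, not Alwrity code) that sets a per-request system prompt and sampling options against the same local endpoint; `system` and `options` are standard fields of Ollama's `/api/generate` API.

```python
import requests

# Illustrative sketch: customizing a locally hosted model per request.
# The brand-voice system prompt and parameter values are made-up examples.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3",
    "prompt": "Draft a product description for a privacy-first writing assistant.",
    "system": "You are a copywriter. Keep the tone factual and concise.",
    "options": {
        "temperature": 0.3,  # lower randomness for a consistent voice
        "num_ctx": 4096,     # context window size, adjustable per task
    },
    "stream": False,
}

response = requests.post(OLLAMA_URL, json=payload, timeout=300)
response.raise_for_status()
print(response.json()["response"])
```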
