Vector dimension 3072 is too large. LanternDB currently supports up to 2000dim vectors #374

Open
ganesh-rao opened this issue Feb 18, 2025 · 0 comments

When I attempt to index a table with a column containing vector embeddings from OpenAI's text-embedding-3-large model, I get the following error:

error: vector dimension 3072 is too large. LanternDB currently supports up to 2000dim vectors

I'd like to utilise the full 3072 dimensions if at all possible, and would rather not reduce them to 1536. Is there a way I can make this work with Lantern?

I see examples using the aforementioned model, as well as Cohere's model, which produces 4096-dimensional vectors. Is it possible to make it work without quantization?
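For reference, a minimal reproduction looks roughly like this (table and column names are mine; the `lantern_hnsw` access method and `dist_cos_ops` operator class are taken from the Lantern README, so adjust if your version differs):

```sql
-- Hypothetical table storing 3072-dim OpenAI embeddings as a real[] column
CREATE TABLE documents (
    id        bigserial PRIMARY KEY,
    embedding real[3072]
);

-- This is the statement that fails with the dimension error above
CREATE INDEX ON documents USING lantern_hnsw (embedding dist_cos_ops);
```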
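For context, the client-side fallback I'm trying to avoid is shortening the embeddings before insert. OpenAI documents that the text-embedding-3 models can be shortened by truncating the vector and re-normalizing it (the `shorten_embedding` helper below is my own sketch, not an API call):

```python
import math

def shorten_embedding(vec, dims):
    # Keep the first `dims` components, then re-normalize to unit length,
    # as OpenAI describes for shortening text-embedding-3 vectors.
    truncated = vec[:dims]
    norm = math.sqrt(sum(x * x for x in truncated))
    return [x / norm for x in truncated]

# Toy stand-in for a 3072-dim embedding (a real one would come from the API)
full = [1.0] * 3072
short = shorten_embedding(full, 2000)  # fits under Lantern's 2000-dim limit
print(len(short))                           # 2000
print(round(sum(x * x for x in short), 4))  # 1.0 (unit length again)
```

That keeps the vector within the 2000-dim limit, but it still discards dimensions, which is what I'm hoping to avoid.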
