When I attempt to index a table with a column containing vector embeddings from OpenAI's text-embedding-3-large model, I get the following error:
error: vector dimension 3072 is too large. LanternDB currently supports up to 2000dim vectors
I'd like to utilise the full 3072 dimensions if at all possible and would rather avoid reducing them to 1536. Is there a way to make this work with Lantern?
I've seen examples using this model, as well as Cohere's model, which produces 4096-dimensional vectors. Is it possible to make this work without quantization?
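For context, OpenAI documents that text-embedding-3-large embeddings are trained so they can be shortened by truncation (the API even exposes a `dimensions` parameter). So one workaround, if the 2000-dimension limit stands, would be truncating to 2000 dimensions and re-normalizing, rather than going all the way down to 1536. A minimal sketch (the function name and the stand-in vector are illustrative, not part of Lantern):

```python
import math

def truncate_and_renormalize(vec, dims=2000):
    """Truncate a Matryoshka-style embedding and rescale it to unit length,
    so cosine/inner-product comparisons remain meaningful."""
    head = vec[:dims]
    norm = math.sqrt(sum(x * x for x in head))
    return [x / norm for x in head]

# Stand-in for a 3072-dim text-embedding-3-large vector.
full = [0.1] * 3072
short = truncate_and_renormalize(full)
print(len(short))                            # 2000
print(round(sum(x * x for x in short), 6))   # 1.0 (unit length again)
```

This is a dimensionality reduction, not quantization, and 2000 dimensions retains more of the embedding than the 1536-dim variant would.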