Can I use Kernel Memory to generate the embeddings and then query using Semantic Kernel? #577
roldengarm asked this question in Q&A · Unanswered

Context / Scenario

We want to store millions of documents, which works well with Kernel Memory (KM) since it does the heavy lifting for partitioning, queueing, etc. We also use Semantic Kernel (SK) for our chat bot to stream results using Azure OpenAI.

Question

Kernel Memory lacks streaming of results, so I was wondering if we can use KM to generate the embeddings in Azure AI Search and then plug that index into Semantic Kernel. I guess it should work, but I'm checking whether this is the best approach.
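For reference, the ingestion half of this setup (KM building the Azure AI Search index) looks roughly like the sketch below. This is a minimal, hedged example assuming KM's serverless mode; all endpoints, keys, and deployment names are placeholders, and the builder method names should be verified against the installed KM version.

```csharp
// Minimal sketch of the ingestion side, using KM serverless mode.
// Endpoints, keys, and deployment names are placeholders.
using Microsoft.KernelMemory;

var embeddingConfig = new AzureOpenAIConfig
{
    APIType = AzureOpenAIConfig.APITypes.EmbeddingGeneration,
    Auth = AzureOpenAIConfig.AuthTypes.APIKey,
    Endpoint = "https://<your-openai>.openai.azure.com/",
    APIKey = "<key>",
    Deployment = "<embedding-deployment>",
};

var chatConfig = new AzureOpenAIConfig
{
    APIType = AzureOpenAIConfig.APITypes.ChatCompletion,
    Auth = AzureOpenAIConfig.AuthTypes.APIKey,
    Endpoint = "https://<your-openai>.openai.azure.com/",
    APIKey = "<key>",
    Deployment = "<chat-deployment>",
};

var memory = new KernelMemoryBuilder()
    .WithAzureOpenAITextGeneration(chatConfig)
    .WithAzureOpenAITextEmbeddingGeneration(embeddingConfig)
    .WithAzureAISearchMemoryDb(new AzureAISearchConfig
    {
        Auth = AzureAISearchConfig.AuthTypes.APIKey,
        Endpoint = "https://<your-search>.search.windows.net",
        APIKey = "<key>",
    })
    .Build<MemoryServerless>();

// KM handles extraction, partitioning, embedding generation, and indexing
await memory.ImportDocumentAsync("specs.pdf", documentId: "doc001");
```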
Replies: 2 comments

- dluc: Currently KM and SK use different schemas and different field names, and KM relies on tags, so it's not possible out of the box. You could achieve it by creating a custom copy of the KM Azure AI Search connector and changing it to write using the SK format.
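To illustrate the kind of change described above: a copied connector would mainly need to translate each KM record into the flat document shape SK's Azure AI Search memory store reads. The sketch below is illustrative only; the SK field names ("Id", "Text", "Description", "AdditionalMetadata", "Embedding", "ExternalSourceName", "Reference") and the KM payload key "text" are assumptions, so verify them against AzureCognitiveSearchMemoryRecord in SK and MemoryRecord in KM for the versions you run.

```csharp
// Illustrative only: the field names on both sides are assumptions,
// not verified against any specific KM/SK release.
using Azure.Search.Documents.Models;
using Microsoft.KernelMemory.MemoryStorage;

static SearchDocument ToSemanticKernelFormat(MemoryRecord record)
{
    // KM keeps the partition text inside the payload; "text" is the assumed key
    record.Payload.TryGetValue("text", out object? text);

    return new SearchDocument
    {
        ["Id"] = record.Id,                     // SK's connector may also encode keys; check before writing
        ["Text"] = text?.ToString() ?? string.Empty,
        ["Description"] = string.Empty,
        ["AdditionalMetadata"] = string.Empty,  // KM tags could be serialized into this field
        ["Embedding"] = record.Vector.Data.ToArray(),
        ["ExternalSourceName"] = string.Empty,
        ["Reference"] = false,
    };
}
```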
- roldengarm: Thanks @dluc, that answers my question :)
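For later readers: a related workaround that avoids a custom connector is to keep KM for retrieval only and let SK handle the streamed generation, which addresses the original motivation (KM lacking streamed answers). A minimal sketch, assuming a KM service reachable at a placeholder endpoint and an Azure OpenAI chat deployment:

```csharp
// Minimal sketch: KM retrieves, SK streams. Endpoint, key, and
// deployment names are placeholders.
using System;
using System.Linq;
using Microsoft.KernelMemory;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

var memory = new MemoryWebClient("http://127.0.0.1:9001/");

// SearchAsync returns the matching partitions without generating an answer
SearchResult search = await memory.SearchAsync("How do I rotate my API key?", limit: 5);
string facts = string.Join("\n", search.Results
    .SelectMany(citation => citation.Partitions)
    .Select(partition => partition.Text));

// Stream the generation with Semantic Kernel
Kernel kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion("<chat-deployment>", "https://<your-openai>.openai.azure.com/", "<key>")
    .Build();

var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
history.AddSystemMessage($"Answer using only these facts:\n{facts}");
history.AddUserMessage("How do I rotate my API key?");

await foreach (StreamingChatMessageContent chunk in chat.GetStreamingChatMessageContentsAsync(history))
{
    Console.Write(chunk.Content);
}
```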