Replies: 2 comments 2 replies
-
@timathom While I get the idea, and in general this would certainly be doable with Fore, it's more of a practical problem: services like the OpenAI API usually require an authorization token. The Fore demos run on GitHub Pages, and I see no immediate way to make such a demo public without exposing the auth token and opening the door for abuse.

Further, we would need to make the use case more specific - do you have a more concrete vision of how such a chat would look? A chat session usually just returns text. We could of course try to parse a response, check whether it is valid JSON, and then push it into an instance, but what would you like to do with that instance afterwards?

In our own applications of Fore we use JSON instances and talk to OpenAPI endpoints all the time, but those applications involve a server-side part to work against - in our case usually an eXist-db app exposing an OpenAPI. Such more complex demos can't be realized directly in a GitHub environment; we would need to set up and run our own servers for that.
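For the "is it parseable as JSON?" branch, here is a minimal TypeScript sketch. The `pushIntoInstance` and `appendToChatLog` helpers are hypothetical placeholders for whatever mechanism the Fore app would actually use to update a JSON instance or render chat text:

```ts
// Sketch: decide whether a chat reply is structured JSON or plain text.
type ChatHandlerResult =
  | { kind: "instance"; data: unknown }
  | { kind: "text"; text: string };

function handleChatReply(reply: string): ChatHandlerResult {
  try {
    const data = JSON.parse(reply);
    // Only treat objects/arrays as instance data; a bare string or number
    // that happens to parse is still better shown as chat text.
    if (data !== null && typeof data === "object") {
      return { kind: "instance", data };
    }
  } catch {
    // Not JSON - fall through and treat it as conversational text.
  }
  return { kind: "text", text: reply };
}

// Hypothetical wiring:
// const result = handleChatReply(responseText);
// if (result.kind === "instance") pushIntoInstance("chat-data", result.data);
// else appendToChatLog(result.text);
```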
-
@JoernT Right, I was thinking that this could be a demo with a placeholder page on the website, but a user would need to run it locally in order to supply the authentication credentials.
-
It would be cool to have a Fore demo featuring integration with a GenAI streaming API (OpenAI, Claude, etc.). For example, OpenAI provides a Node.js SDK. The goal would be to let the user maintain a chat session within a Fore app that can process structured API responses as Fore instance data (e.g., ask the AI model for a JSON response that populates a Fore instance).
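A minimal sketch of the server-side call, assuming the official `openai` npm package (v4+) and an `OPENAI_API_KEY` in the environment; the model name and prompts are placeholders, and in practice this would sit behind a small proxy so the key never reaches the browser (the concern raised above):

```ts
// Sketch: ask the model for a JSON reply that could populate a Fore instance.
import OpenAI from "openai";

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function askForInstanceData(userPrompt: string): Promise<unknown> {
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini", // placeholder model
    // JSON mode: constrains the reply to a single JSON object.
    response_format: { type: "json_object" },
    messages: [
      {
        role: "system",
        content:
          "Answer with a single JSON object suitable for loading into a Fore instance.",
      },
      { role: "user", content: userPrompt },
    ],
  });

  const text = completion.choices[0].message.content ?? "{}";
  return JSON.parse(text); // hand this to the Fore app as instance data
}
```

Free-form chat turns could use `stream: true` instead, but streamed chunks don't combine cleanly with JSON mode, so a demo might stream the conversational turns and fire a separate non-streaming request whenever it needs structured instance data.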