gptel: Add gptel-add-context function #292

Open
wants to merge 3 commits into master from feat/manually-adding-context-snippets
Conversation

@munen commented Apr 21, 2024

Hi @karthink

Coming back to our short discussion from 5 hours ago.

> I haven't looked around to see what other LLM clients offer since ~May 2023. Surely someone must have written something more featureful and seamless than gptel? A quick Google search indicates that there are multiple (paid) commercial offerings for VSCode; I can't imagine them doing worse than gptel. This package is just a fancy wrapper around Curl.

Maybe I'm just an old, grumpy dude, but the commercial offerings I checked out didn't hold water. Cody was quite OK, but I didn't get it to work with my own LLM API keys (for example, OpenAI). It does work well with Claude 2.0 and ollama, though.

Having said that, I prefer the functionality of gptel. I appreciate that it has a small and simple API surface which nonetheless covers all the genuinely interesting interaction patterns. With Emacs and the GPL added to the mix, it's a hard-to-beat combination. There was only one important feature missing for me, so I built it. Couldn't have done that with any of those proprietary systems(;

> The main reason is that gptel is a one-weekend-per-month project for me and that time is eaten up fixing bugs!

Thank you for your effort and wonderful project 🙏

> There are three features on the roadmap that I haven't had the time to work on: attaching context (this PR), supporting function calling, and multimodal support (mainly vision) in chats.

Happy to hear that 👍 Maybe you'll like this implementation of attaching context. I tried to keep the implementation small but flexible. I don't have much experience using it so far, but I did dogfood it on this PR from the very beginning(; Also, I'm very much looking forward to using it to tackle some bigger projects next week.
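
To give a rough idea of the overall pattern (this is only an illustrative sketch, not the actual code in this PR, and every name below is made up): collect region snippets into a list and prepend them to the prompt before it goes out.

```elisp
;; Illustrative sketch only -- not the code in this PR.
(defvar my/llm-context-snippets nil
  "List of text snippets to send along with the next prompt.")

(defun my/llm-add-context (beg end)
  "Save the region between BEG and END as context for the next prompt."
  (interactive "r")
  (push (buffer-substring-no-properties beg end) my/llm-context-snippets)
  (message "Added %d characters of context" (- end beg)))

(defun my/llm-wrap-prompt (prompt)
  "Return PROMPT with all collected context snippets prepended."
  (if my/llm-context-snippets
      (concat (mapconcat #'identity (reverse my/llm-context-snippets) "\n\n")
              "\n\n" prompt)
    prompt))
```

In the real thing, the wrapping step would of course plug into gptel's own request flow rather than being a standalone helper like this.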

> Copilot-style completion-at-point, a fourth feature, seems very difficult to do via the chat APIs, so I've shelved that plan for now.

Agreed. I think it's hard to do with the chat APIs. In my tests of the proprietary alternatives, they also didn't really yield better results than actually using the chat modality (when given enough context). So I don't mind intentionally omitting this feature.

> Yes, keeping gptel simple and focused is the only way for it to remain maintainable given my time constraints.

That's a smart choice! Apart from the time constraints, these AI-based projects tend to have a lot of feature creep and a short half-life. I appreciate your effort to keep this maintainable while at the same time making it all the more usable. Thank you 🙏

> @munen Love your Emacs work and presentations! Thank you for organice as well, it's very handy!

Thank you for your kind words 🙏 I haven't had much time for FOSS lately, but that time was needed so that I can come back to it again in the long term. I'm looking forward to that!

@munen force-pushed the feat/manually-adding-context-snippets branch 2 times, most recently from cea129d to 1c4bd28 on April 21, 2024 at 01:14
@daedsidog (Contributor) commented May 4, 2024

Could this perhaps benefit from the features I have in my PR, i.e. removing context, highlighting context, a dedicated context buffer, etc.? I think those don't really conflict with the vision of this library, as long as they are opt-in and customizable, which isn't the current state in my PR.
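
Purely as an illustration of the kind of features meant above (made-up names again, not the code in either PR), highlighting an attached region with an overlay and removing it later could look roughly like this:

```elisp
;; Illustrative sketch only -- not the code in either PR.
(defvar my/llm-context-overlays nil
  "Overlays marking regions that have been added as context.")

(defun my/llm-highlight-context (beg end)
  "Highlight the region from BEG to END as attached context."
  (interactive "r")
  (let ((ov (make-overlay beg end)))
    (overlay-put ov 'face 'highlight)
    (push ov my/llm-context-overlays)))

(defun my/llm-remove-all-context ()
  "Remove every context highlight and forget the attached regions."
  (interactive)
  (mapc #'delete-overlay my/llm-context-overlays)
  (setq my/llm-context-overlays nil))
```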
