GPTAI.el

An OpenAI API client for Emacs

MELPA badge: https://melpa.org/packages/gptai-badge.svg

This package allows for easy communication between Emacs and the OpenAI API platform; it supports all of the available models and integrates cleanly with Emacs tooling. Here are the basic things you will need for setup:

  • Install the package from MELPA using:
    (package-install 'gptai)
        

    then put this in your init.el:

    (require 'gptai)
    ;; set configurations
    (setq gptai-model "<MODEL-HERE>") 
    (setq gptai-username "<USERNAME-HERE>")
    (setq gptai-api-key "<API-KEY-HERE>")
    ;; set keybindings optionally
    (global-set-key (kbd "C-c o") 'gptai-send-query)
        

To fill out the details of the configuration section:

  • Define the desired model to use; available models can be listed by running gptai-list-models, which also displays the list in the gptai buffer. Use this to choose a model and set gptai-model (text-davinci-003 is a good default model).
  • Define your OpenAI username.
  • Define your API key (your own key can be obtained from the OpenAI API Keys page).

After that, optionally fill out the init section with some keybindings.
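
If you would rather not hard-code your API key in init.el, one option is to read it from an environment variable instead; the variable name OPENAI_API_KEY below is only an example.

;; Read the API key from the environment instead of storing it in init.el.
(setq gptai-api-key (getenv "OPENAI_API_KEY"))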

Extending the Package

Check out the wiki for examples of how to extend this package for your own uses.
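
As a minimal sketch of the kind of extension the wiki describes, the command below let-binds gptai-model so that a single query runs against a different model than your configured default. The function name is hypothetical, and this assumes gptai-send-query reads gptai-model at call time.

;; Hypothetical extension: run one query against a different model by
;; let-binding `gptai-model' around the standard query command.
(defun my/gptai-query-with-model (model)
  "Prompt for MODEL and run `gptai-send-query' against it."
  (interactive "sModel: ")
  (let ((gptai-model model))
    (call-interactively 'gptai-send-query)))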

Text Querying

You can send textual queries to different OpenAI models using the following functions:

  • gptai-send-query

    Prompts for a query in the minibuffer and inserts the output at point.

  • gptai-send-query-from-selection

    Sends the text in the current selection as the prompt to the OpenAI model specified in your configuration and returns the output in place of the original selection.

  • gptai-send-query-from-buffer

    Sends the current buffer contents as the prompt to the OpenAI model specified in your configuration and returns the output in place of the original buffer contents.

  • gptai-spellcheck-text-from-selection

    Sends the selected text to OpenAI for spellchecking and replaces the selection with the corrected text.

  • gptai-elaborate-on-text-from-selection

    Sends the selected text to OpenAI for further elaboration and replaces the original selection with the improved text.
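
Any of these commands can be bound to keys in the same way as gptai-send-query in the configuration section above; the specific keys below are only suggestions.

;; Suggested keybindings for the selection-based text commands.
(global-set-key (kbd "C-c q") 'gptai-send-query-from-selection)
(global-set-key (kbd "C-c s") 'gptai-spellcheck-text-from-selection)
(global-set-key (kbd "C-c e") 'gptai-elaborate-on-text-from-selection)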

Image Querying

You can generate images with the DALL-E engine using:

  • gptai-send-image-query

    Prompts the user in the minibuffer for:

    • A prompt for the command
    • How many images you want to generate
    • What size they should be
    • Where you want to store them on the disk

    It then generates those images at the API endpoint and uses curl to download them to your specified directory path. If a single image was downloaded, it opens that image in a new buffer for viewing; if more than one was downloaded, it simply reports success when done.
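
If you generate images often, the command can be bound to a key like any other; the key below is only a suggestion.

;; Optional keybinding for image generation.
(global-set-key (kbd "C-c d") 'gptai-send-image-query)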

Code Querying and Generation

The gptai-code-query and gptai-code-query-from-selection functions generate code from instructive prompts in a specified language. By default, both functions prompt the user for a language and return the generated code in place, at your selection or at point in the buffer.

  • gptai-code-query

    Prompts the user for instructions and the language to use, then inserts the generated code at point.

  • gptai-code-query-from-selection

    Uses the selected text as the instructions and prompts for a language to use, then returns the generated code in place of the original selection.

  • gptai-explain-code-from-selection

    Explains the code in the selection and returns the output above the selection.

  • gptai-fix-code-from-selection

    Fixes and debugs the code in the selection and returns the output in place of the original selection.

  • gptai-document-code-from-selection

    Documents and describes the code in the selection and returns the output above the selection.

  • gptai-optimize-code-from-selection

    Optimizes and refactors the code in the selection and returns the output in place of the original selection.

  • gptai-improve-code-from-selection

    Improves and extends the code in the selection and returns the output in place of the original selection.
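
As with the text commands, the code helpers can be bound to keys. The sketch below uses an arbitrary prefix in the built-in prog-mode-map so the bindings apply only in programming buffers.

;; Example prefix bindings for the code commands in programming modes.
(define-key prog-mode-map (kbd "C-c g q") 'gptai-code-query)
(define-key prog-mode-map (kbd "C-c g f") 'gptai-fix-code-from-selection)
(define-key prog-mode-map (kbd "C-c g d") 'gptai-document-code-from-selection)
(define-key prog-mode-map (kbd "C-c g o") 'gptai-optimize-code-from-selection)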

Using the 3.5-turbo models

The newer gpt-3.5 and gpt-4 models require a different query structure, which means these models need a separate query handler; this is achieved with gpt-turbo.el. Using a turbo model is as easy as calling gptai-turbo-query with a text query, either interactively or programmatically. Here is an example:

(gptai-turbo-query "This is a prompt")

This functions much the same as gptai-send-query; it simply limits you to the newer GPT-3.5 turbo and GPT-4 language models rather than letting you choose among all available models as the standard query handler does.
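
Because gptai-turbo-query accepts its prompt as an argument, it is straightforward to build small commands on top of it. The sketch below is a hypothetical example that sends the active region for summarization; it assumes gptai-turbo-query inserts its response at point, as gptai-send-query does.

;; Hypothetical command: summarize the active region via the turbo handler.
(defun my/gptai-summarize-region (start end)
  "Ask the turbo model to summarize the text between START and END."
  (interactive "r")
  (let ((text (buffer-substring-no-properties start end)))
    (goto-char end)
    (newline)
    (gptai-turbo-query (concat "Summarize the following text:\n\n" text))))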

Contributing

Feel free to make a PR with improvements. All PRs should include your changes as well as an addition to the CHANGELOG.md file noting any important changes for users to be aware of.
