
0.31.2

@intitni released this 14 Mar 14:33


  • Fix that getting the previous/next suggestion would dismiss the suggestion #457
  • Fix that getting a suggestion manually might not work if a suggestion was just dismissed #454
  • Fix a crash in project scope.
  • Some improvements to the UI, especially in light mode. (The corner radius of the suggestion panel now matches that of the completion panel!)

New in 0.31.1

  • Make the parsing of responses from OpenAI-compatible APIs less strict, so it won't break on services whose formats differ slightly from OpenAI's (see the decoding sketch below). #456
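
For illustration only, here is a rough sketch of what such lenient decoding can look like in Swift. The type names are made up for this example and this is not the extension's actual implementation; the idea is simply that fields some providers omit are declared optional, so decoding does not fail on near-compatible APIs.

```swift
import Foundation

// Sketch of lenient decoding for an OpenAI-compatible chat completion
// response. Optional properties tolerate providers that omit fields.
struct ChatCompletionResponse: Decodable {
    struct Message: Decodable {
        var role: String?        // some providers omit the role
        var content: String?     // may be null, e.g. for tool-call messages
    }
    struct Choice: Decodable {
        var index: Int?          // not every provider numbers its choices
        var message: Message
        var finish_reason: String?
    }
    struct Usage: Decodable {
        var prompt_tokens: Int?
        var completion_tokens: Int?
        var total_tokens: Int?
    }
    var id: String?              // some proxies drop the id
    var choices: [Choice]
    var usage: Usage?            // token usage is frequently omitted
}

// Example: a response missing `id`, `usage`, and `finish_reason` still decodes.
let data = Data(#"{"choices":[{"message":{"content":"Hello"}}]}"#.utf8)
do {
    let response = try JSONDecoder().decode(ChatCompletionResponse.self, from: data)
    print(response.choices.first?.message.content ?? "")
} catch {
    print("Decoding failed: \(error)")
}
```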

New in 0.31.0

  • Add Ollama support. You can now add a chat model or embedding model backed by Ollama. (Custom Suggestion Service 0.2.0 has also been released to support Ollama.) A request sketch follows this list.
  • Update the OpenAI chat completions service implementation to use the new tool calls instead of function calls; a payload sketch also follows this list. If the service you are using still relies on the function call fields, please turn the "Supports function calling" toggle off. The new fields have been available since last year, so most services should already support them.
  • Bump Copilot.vim to 1.25.0
  • Bump Codeium language server to 1.8.5
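
For reference, a rough sketch of a non-streaming chat request against a local Ollama server, assuming the default endpoint `http://localhost:11434/api/chat` and a locally pulled model named `llama2`. The types and the `chat` function are illustrative only, not the extension's actual networking code.

```swift
import Foundation

// Minimal request/response shapes for Ollama's /api/chat endpoint.
struct OllamaChatRequest: Encodable {
    struct Message: Encodable { let role: String; let content: String }
    let model: String
    let messages: [Message]
    let stream: Bool
}

struct OllamaChatResponse: Decodable {
    struct Message: Decodable { let role: String; let content: String }
    let message: Message
}

// Sends one user message and returns the assistant's reply.
func chat(prompt: String) async throws -> String {
    var request = URLRequest(url: URL(string: "http://localhost:11434/api/chat")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(OllamaChatRequest(
        model: "llama2",
        messages: [.init(role: "user", content: prompt)],
        stream: false
    ))
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(OllamaChatResponse.self, from: data).message.content
}

// Usage (from an async context):
//   let reply = try await chat(prompt: "Hello")
```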
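
And a rough sketch of the response-side difference between the two formats: a legacy service fills `function_call` on the assistant message, while a tool-calling service fills the `tool_calls` array (on the request side, a `tools` array replaces the old `functions` array). Only the JSON keys come from the API; the Swift types below are illustrative only.

```swift
import Foundation

// Assistant message shape covering both the legacy and the new format.
struct AssistantMessage: Decodable {
    struct FunctionCall: Decodable {
        let name: String
        let arguments: String    // JSON-encoded arguments string
    }
    struct ToolCall: Decodable {
        let id: String
        let type: String         // currently always "function"
        let function: FunctionCall
    }
    let content: String?
    // Legacy field: set by services that still use function calling.
    let function_call: FunctionCall?
    // New field: set by services that have moved to tool calling.
    let tool_calls: [ToolCall]?
}
```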