Releases: jekalmin/extended_openai_conversation
3.0.0-beta9
What's Changed
- Change naming from "FunctionExecutor" to "Function" by @jekalmin in #408
- Enhance documentation by @jekalmin in #417, #424
- Bump openai to 2.21.0 and minimum Home Assistant to 2026.3.0b0 by @jekalmin in #414
Requirements
- HA Core version: 2026.3.x or newer
Full Changelog: 3.0.0-beta8...3.0.0-beta9
2.0.2
3.0.0-beta8
What's Changed
Full Changelog: 3.0.0-beta7...3.0.0-beta8
3.0.0-beta7
What's Changed
- Add CI Pipeline by @jekalmin in #404
- Mistral compatibility by @lachmanfrantisek in #383
New Contributors
- @lachmanfrantisek made their first contribution in #383
Full Changelog: 3.0.0-beta6...3.0.0-beta7
3.0.0-beta6
What's Changed
Full Changelog: 3.0.0-beta5...3.0.0-beta6
2.0.1
3.0.0-beta5
3.0.0-beta4
What's Changed
- Fix composite function consuming unnecessary contexts by @jekalmin in #388
- Expose error message in conversation by @jekalmin in #389
- Change default settings to suit reasoning model by @jekalmin in #390
  - model: gpt-4o-mini → gpt-5-mini

    | Model | Input | Cached input | Output |
    | --- | --- | --- | --- |
    | gpt-4o-mini | $0.15 | $0.075 | $0.60 |
    | gpt-5-mini (flex) | $0.125 | $0.0125 | $1.00 |

  - max tokens: 150 → 500
  - max function calls per conversation: 1 → 3
  - service tier: flex
  - reasoning effort: low
  - prompt
  - functions
    - Add `delay` property in `execute_services` function
    - Add `get_attributes` function
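The two function additions above can be pictured with a minimal sketch. Note this is illustrative only: the payload keys (`domain`, `service`, `service_data`, `delay`) and the state-lookup shape are assumptions for the example, not the integration's actual schema.

```python
import asyncio

# Hypothetical sketch of the two beta4 additions; the payload keys
# ("domain", "service", "delay") and the states mapping are illustrative
# assumptions, not the integration's real schema.

async def execute_services(calls, call_service):
    """Run service calls in order, honoring an optional per-call delay."""
    for call in calls:
        delay = call.get("delay", 0)
        if delay:
            await asyncio.sleep(delay)  # wait before firing this call
        await call_service(
            call["domain"], call["service"], call.get("service_data", {})
        )

def get_attributes(states, entity_id):
    """Return the attribute dict of one entity, or {} if unknown."""
    state = states.get(entity_id)
    return dict(state["attributes"]) if state else {}
```

With a per-call `delay`, the model can script sequences like "turn the light off in ten seconds" in a single `execute_services` call, and `get_attributes` lets it inspect entity details (e.g. brightness) beyond the bare state string.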
Full Changelog: 3.0.0-beta3...3.0.0-beta4
3.0.0-beta3
What's Changed
Full Changelog: 3.0.0-beta2...3.0.0-beta3