All notable changes to Chainlit will be documented in this file.
The format is based on Keep a Changelog.
- Fixed critical vulnerabilities allowing arbitrary file read access (#1326)
- Improved path traversal protection in various endpoints (#1326)
- Hebrew translation JSON (#1322)
- Translation files for Indian languages (#1321)
- Support for displaying function calls as tools in Chain of Thought for LlamaIndexCallbackHandler (#1285)
- Improved feedback UI with refined type handling (#1325)
- Upgraded cryptography from 43.0.0 to 43.0.1 in backend dependencies (#1298)
- Improved GitHub Actions workflow (#1301)
- Enhanced data layer cleanup for better performance (#1288)
- Factored out callbacks with extensive test coverage (#1292)
- Adopted strict adherence to Semantic Versioning (SemVer)
- Websocket connection issues when submounting Chainlit (#1337)
- `show_input` functionality on chat resume for SQLAlchemy (#1221)
- Negative feedback class incorrectness (#1332)
- Interaction issues with Chat Profile Description Popover (#1276)
- Centered steps within assistant messages (#1324)
- Minor spelling errors (#1341)
- Added documentation for release engineering process (#1293)
- Implemented testing for FastAPI version matrix (#1306)
- Removed wait statements from E2E tests for improved performance (#1270)
- Bumped dataclasses to latest version (#1291)
- Ensured environment loading before other imports (#1328)
- [breaking]: Listen to 127.0.0.1 (localhost) instead of 0.0.0.0 (public) (#861).
- [breaking]: Dropped support for Python 3.8, solving dependency resolution, addressing vulnerable dependencies (#1192, #1236, #1250).
- Frontend connection resuming after connection loss (#828).
- Gracefully handle HTTP errors in data layers (#1232).
- AttributeError: 'ChatCompletionChunk' object has no attribute 'get' in llama_index (#1229).
- `edit_message` in correct place in default config, allowing users to edit messages (#1218).
- `CHAINLIT_APP_ROOT` environment variable to modify `APP_ROOT`, enabling the ability to set the location of `config.toml` and other settings files (#1259).
- Poetry lockfile in Git repository for reproducible builds (#1191).
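As a quick sketch (path hypothetical), the new environment variable can point Chainlit at a custom settings directory before launch:

```shell
# Hypothetical path — config.toml and other settings files
# will be resolved relative to this root.
export CHAINLIT_APP_ROOT="/srv/myapp/chainlit-settings"
chainlit run app.py
```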
- pytest-based testing infrastructure, first unit tests of backend and testing on all supported Python versions (#1245 and #1271).
- Black and isort added to dev dependencies group (#1217).
- Langchain Callback handler IndexError
- Attempt to fix websocket issues
- The `User` class now has a `display_name` field. It will not be persisted by the data layer.
- The logout button will now reload the page (needed for custom auth providers)
- Directly log step input args by name instead of wrapping them in "args" for readability.
- Langchain Callback handler ValueError('not enough values to unpack (expected 2, got 0)')
- `hide_cot` becomes `cot` and has three possible values: `hidden`, `tool_call`, `full`
- User feedback now scores an entire run instead of a specific message
- Slack/Teams/Discord DM threads are now split by day
- Slack DMs now also use threads
- Avatars are always displayed at the root level of the conversation
- disable_feedback has been removed
- root_message has been removed
- Messages are now editable. You can disable this feature with `config.features.edit_message = false`
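For example, a minimal `config.toml` fragment to turn the feature off (assuming the standard `[features]` section):

```toml
[features]
# Disable in-place editing of user messages
edit_message = false
```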
- `cl.chat_context` to help keep track of the messages of the current thread
- You can now enable debug mode when mounting Chainlit as a sub app by setting `CHAINLIT_DEBUG` to `true`.
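A sketch of the submount setup (module and port names hypothetical):

```shell
# Enable Chainlit debug mode for a submounted app
export CHAINLIT_DEBUG=true
uvicorn main:app --host 127.0.0.1 --port 8000
```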
- Messages are now collapsible if too long
- Only first level tool calls are displayed
- OAuth redirection when mounting Chainlit on a FastAPI app should now work
- The Langchain callback handler should better capture chain runs
- The Llama Index callback handler should now work with other decorators
- Mistral AI instrumentation
- OAuth final redirection should account for root path if provided
- OAuth URL redirection should be correctly formed when using CHAINLIT_URL + submounted chainlit app
- Width and height option for the copilot bubble
- Chat profile icon in copilot should load
- Theme should work with Copilot
- Running toast when an action is running
- Azure AD oauth get_user_info not implemented error
- `@cl.set_starters` and `cl.Starter` to suggest conversation starters to the user
- Teams integration
- Expand copilot button
- Debug mode when starting with `-d`. Only available if the data layer supports it. This replaces the Prompt Playground.
- `default` theme config in `config.toml`
- If only one OAuth provider is set, automatically redirect the user to it
- Input streaming for tool calls
- [BREAKING] Custom endpoints have been reworked. You should now mount your Chainlit app as a FastAPI subapp.
- [BREAKING] Avatars have been reworked. `cl.Avatar` has been removed; instead, place your avatars by name in `/public/avatars/*`
- [BREAKING] The `running`, `took_one` and `took_other` translations have been replaced by `used`.
- [BREAKING] The `root` attribute of `cl.Step` has been removed. Use `cl.Message` to send root-level messages.
- Chain of Thought has been reworked. Only steps of type `tool` will be displayed if `hide_cot` is false.
- The `show_readme_as_default` config has been removed.
- No longer collapse root level messages
- The blue alert "Continuing chat" has been removed.
- The Chat Profile description should now disappear when not hovered.
- Error handling of steps has been improved
- No longer stream the first token twice
- Copilot should now work as expected even if the user is closing/reopening it
- Copilot CSS should no longer leak/be impacted by the host website CSS
- Fix various `cl.Context` errors
- Reworked message padding and spacing
- Chat profile should now support non-ASCII characters (like Chinese)
- Support for video players like YouTube or Vimeo
- Fix audio capture on Windows browsers
- Intermediary steps button placement
- User message UI has been updated
- Loading indicator has been improved and visually updated
- Icons have been updated
- Dark theme is now the default
- Scroll issues on mobile browsers
- GitHub button now showing
- The discord bot now shows "typing" while responding
- Discord and Slack bots should no longer fail to respond if the data layer fails
- You can now serve your Chainlit app as a Slack bot
- You can now serve your Chainlit app as a Discord bot
- `cl.on_audio_chunk` decorator to process the user's incoming audio stream
- `cl.on_audio_end` decorator to react to the end of the user's audio stream
- The `cl.Audio` element now has an `auto_play` property
- `layout` theme config, wide or default
- `http_referer` is now available in `cl.user_session`
- The UI has been revamped, especially the navigation
- The arrow up button has been removed from the input bar, however pressing the arrow up key still opens the last inputs menu
- The user session will no longer be persisted as metadata if > 1mb
- [breaking] The `send()` method on `cl.Message` now returns the message instead of the message id
- [breaking] The `multi_modal` feature has been renamed `spontaneous_file_upload` in the config
- Element display property now defaults to `inline` instead of `side`
- The SQL Alchemy data layer logging has been improved
- Fixed a bug disconnecting the user when loading the chat history
- Elements based on a URL should now have a mime type
- Stopping a task should now work better (using asyncio task.cancel)
- add support for multiline option in TextInput chat settings field - @kevinwmerritt
- disable gzip middleware to prevent a compression issue on Safari
- pasting from Microsoft products generates text instead of an image
- do not prevent thread history revalidation - @kevinwmerritt
- display the label instead of the value for menu item - @kevinwmerritt
- The user's browser language configuration is available in `cl.user_session.get("languages")`
- Allow HTML in text elements - @jdb78
- Allow for setting a ChatProfile default - @kevinwmerritt
- The thread history refreshes right after a new thread is created.
- The thread auto-tagging feature is now opt-in using `auto_tag_thread` in the `config.toml` file
- Fixed incorrect step ancestor in the OpenAI instrumentation
- Enabled having a `storage_provider` set to `None` in SQLAlchemyDataLayer - @mohamedalani
- Correctly serialize `generation` in SQLAlchemyDataLayer - @mohamedalani
- Chainlit apps should function correctly even if the data layer is down
- Enable persisting threads using a Custom Data Layer (through SQLAlchemy) - @hayescode
- React-client: Expose `sessionId` in `useChatSession`
- Add chat profile as thread tag metadata
- Add quotes around the chainlit create-secret CLI output to avoid any issues with special characters
- Actions now trigger conversation persistence
- Messages and steps now accept tags and metadata (useful for the data layer)
- The Llama Index callback handler should now show retrieved chunks in the intermediary steps
- Renamed the Literal environment variable to `LITERAL_API_URL` (it used to be `LITERAL_SERVER`)
- Starting a new conversation should close the element side bar
- Resolved security issues by upgrading starlette dependency
- Added a new command `chainlit lint-translations` to check that translation files are OK
- Added new sections to the translations, like the sign-in page
- `chainlit.md` now supports translations based on the browser's language. Like `chainlit_pt-BR.md`
- A health check endpoint is now available through an HTTP HEAD call at the root
- You can now specify a custom frontend build path
- Translations will no longer flash at app load
- Llama Index callback handler has been updated
- File watcher should now properly refresh the app when the code changes
- Markdown titles should now have the correct line height
- `multi_modal` is now under `features` in the config.toml and has more granularity
- Feedback no longer has a -1 value. Instead, a `delete_feedback` method has been added to the data layer
- ThreadDict no longer has the full User object. Instead it has user_id and user_identifier fields
- OpenAI integration
- Langchain final answer streaming should work again
- Elements with public URLs should be correctly persisted by the data layer
- Enforce UTC DateTimes
- Custom js script injection
- First token and token throughput per second metrics
- The `ChatGeneration` and `CompletionGeneration` have been reworked to better match the OpenAI semantics
- Chainlit Copilot
- Translations
- Custom font
- Tasklist flickering
- Llama index callback handler should now correctly nest the intermediary steps
- Toggling the `hide_cot` parameter in the UI should correctly hide the `took n steps` buttons
- The `running` loading button should only be displayed once when `hide_cot` is true and a message is being streamed
- `on_logout` hook allowing cookies to be cleared when a user logs out
- Chainlit apps won't crash anymore if the data layer is not reachable
- File upload now works when switching chat profiles
- Avatars with an image no longer have a background color
- If `hide_cot` is set to `true`, the UI will never get the intermediary steps (but they will still be persisted)
- Fixed a bug preventing past chats from being opened
- Scroll down button
- If `hide_cot` is set to `true`, a `running` loader is displayed by default under the last message when a task is running.
- Avatars are now always displayed
- Chat history sidebar has been revamped
- Stop task button has been moved to the input bar
- If `hide_cot` is set to `true`, the UI will never get the intermediary steps (but they will still be persisted)
- Elements are now working when authenticated
- First interaction is correctly set when resuming a chat
- The copy button is hidden if `disable_feedback` is `true`
- Copy button under messages
- OAuth samesite cookie policy is now configurable through the `CHAINLIT_COOKIE_SAMESITE` env var
- Relax Python version requirements
- If `hide_cot` is configured to `true`, steps will never be sent to the UI, but still persisted.
- Message buttons are now positioned below
- `cl.Step`
- File upload uses HTTP instead of WS and no longer has size limitation
- `cl.AppUser` becomes `cl.User`
- `Prompt` has been split into `ChatGeneration` and `CompletionGeneration`
- `Action` now displays a toaster in the UI while running
- Support for custom HTML in message content is now an opt in feature in the config
- Uvicorn `ws_per_message_deflate` config param is now configurable like `UVICORN_WS_PER_MESSAGE_DEFLATE=false`
- LaTeX support is no longer enabled by default and is now a feature in the config
- Fixed LCEL memory message order in the prompt playground
- Fixed a key error when using the file watcher (-w)
- Fixed several user experience issues with `on_chat_resume`
- `on_chat_end` is now always called when a chat ends
- Switching chat profiles correctly clears previous AskMessages
- `on_chat_resume` now works properly with non-JSON-serializable objects
- `LangchainCallbackHandler` no longer sends tokens to the wrong user under high concurrency
- Langchain cache should work when `cache` is set to `true` in `config.toml`
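A minimal `config.toml` fragment enabling the cache (assuming it lives under the `[project]` section):

```toml
[project]
# Enable Langchain caching
cache = true
```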
- Markdown links special characters are no longer encoded
- Collapsed messages no longer make the chat scroll
- Stringified Python objects are now displayed in a Python code block
- LaTeX support (only supporting `$$` notation)
- Go back button on element page
- Code blocks should no longer flicker or display `[object Object]`.
- Now properly displaying empty messages with inlined elements
- Fixed `Too many values to unpack` error in the Langchain callback
- Langchain final streamed answer is now annotatable with human feedback
- AzureOpenAI should now work properly in the Prompt Playground
- Code blocks display has been enhanced
- Replaced aiohttp with httpx
- Prompt Playground has been updated to work with the new openai release (v1). Including tools
- Auth0 OAuth provider has a new configurable env variable `OAUTH_AUTH0_ORIGINAL_DOMAIN`
- `cl.on_chat_resume` decorator to enable users to continue a conversation.
- Support for OpenAI functions in the Prompt Playground
- Ability to add/remove messages in the Prompt Playground
- Plotly element to display interactive charts
- Langchain intermediate steps display are now much more readable
- Chat history loading latency has been enhanced
- UTF-8 characters are now correctly displayed in json code blocks
- Select widget `items` attribute is now working properly
- Chat profiles widget is no longer scrolling horizontally
- Support for Langchain Expression Language. https://docs.chainlit.io/integrations/langchain
- UI rendering optimization to guarantee high framerate
- Chainlit Cloud latency optimization
- Speech recognition to type messages. https://docs.chainlit.io/backend/config/features
- Descope OAuth provider
- `LangchainCallbackHandler` is now displaying inputs and outputs of intermediate steps.
- AskUserMessage now works properly with data persistence
- You can now use a custom Okta authorization server for authentication
- `ChatProfile` allows configuring different agents that the user can freely choose
- Multi-modal support at the input bar level. Enabled by `features.multi_modal` in the config
- `cl.AskUserAction` allows blocking code execution until the user clicks an action.
- Displaying the readme when the chat is empty is now configurable through `ui.show_readme_as_default` in the config
- `cl.on_message` no longer takes a string as parameter but rather a `cl.Message`
- Chat history is now correctly displayed on mobile
- Azure AD OAuth authentication should now correctly display the user profile picture
- `@cl.on_file_upload` is replaced by true multi-modal support at the input bar level
- Logo is displayed in the UI header (works with custom logo)
- Azure AD single tenant is now supported
- `collapsed` attribute on the `Action` class
- Latency improvements when data persistence is enabled
- Chat history has been entirely reworked
- Chat messages redesign
- `config.ui.base_url` becomes the `CHAINLIT_URL` env variable
- File watcher (-w) is now working with nested module imports
- Unsupported character during OAuth authentication
- Pydantic v2 support
- Okta auth provider
- Auth0 auth provider
- Prompt playground support for mix of template/formatted prompts
- `@cl.on_chat_end` decorator
- Textual comments to user feedback
- Langchain errors are now correctly indented
- Langchain nested chains prompts are now correctly displayed
- Langchain error `TypeError: 'NoneType' object is not a mapping`.
- Actions are now displayed on mobile
- Custom logo is now working as intended
- Authentication is now unopinionated:
  - `@cl.password_auth_callback` for login/password auth
  - `@cl.oauth_callback` for OAuth auth
  - `@cl.header_auth_callback` for header auth
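A minimal password-auth sketch (credentials and user fields hypothetical; check against a real user store in practice):

```python
import chainlit as cl

@cl.password_auth_callback
def auth_callback(username: str, password: str):
    # Hypothetical check — replace with a lookup against your user store
    if (username, password) == ("admin", "s3cret"):
        return cl.User(identifier="admin")
    return None  # returning None rejects the login
```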
- Data persistence is now enabled through the `CHAINLIT_API_KEY` env variable
- `@cl.auth_client_factory` (see new authentication)
- `@cl.db_client_factory` (see new data persistence)
- `disable_human_feedback` parameter on `cl.Message`
- Configurable logo
- Configurable favicon
- Custom CSS injection
- GCP Vertex AI LLM provider
- Long message collapsing feature flag
- Enable Prompt Playground feature flag
- History page filters now work properly
- History page does not show empty conversations anymore
- Langchain callback handler Message errors
- `@cl.on_file_upload` to enable spontaneous file uploads
- `LangchainGenericProvider` to add any Langchain LLM in the Prompt Playground
- `cl.Message` content now supports dict (previously only supported string)
- Long messages are now collapsed by default
- Deadlock in the Llama Index callback handler
- Langchain MessagesPlaceholder and FunctionMessage are now correctly supported
- Complete rework of the Prompt playground. Now supports custom LLMs, templates, variables and more
- Enhanced Langchain final answer streaming
- `remove_actions` method on the `Message` class
- Button to clear message history
- Chainlit CLI performance issue
- Llama Index v0.8+ callback handler. Now supports messages prompts
- Tasklist display, persistence and `.remove()`
- Custom headers growing infinitely large
- Action callback can now handle multiple actions
- Langflow integration `load_flow_from_json`
- Video and audio elements on Safari
- Make the chat experience configurable with Chat Settings
- Authenticate users based on custom headers with the Custom Auth client
- Author rename now works with all kinds of messages
- Create message error with chainlit cloud (chenjuneking)
- Security improvements
- Haystack callback handler
- Theme customizability
- Allow multiple browser tabs to connect to one Chainlit app
- Sidebar blocking the send button on mobile
- Factories, run and post process decorators are removed.
- langchain_rename becomes author_rename and works globally
- Message.update signature changed
Migration guide available here.
- Langchain final answer streaming
- Redesign of chainlit input elements
- Possibility to add custom endpoints to the fast api server
- New File Element
- Copy button in code blocks
- Persist session between websocket reconnection
- The UI is now more mobile friendly
- Avatar element Path parameter
- Increased web socket message max size to 100 mb
- Duplicated conversations in the history tab
- Add the video element
- Fix the inline element flashing when scrolling the page, due to un-needed re-rendering
- Fix the orange flash effect on messages
- Task list element
- Audio element
- All elements can use the `.remove()` method to remove themselves from the UI
- Can now use cloud auth with any data persistence mode (like local)
- Microsoft auth
- Files in app dir are now properly served (typical use case is displaying an image in the readme)
- Add missing attribute `size` to Pyplot element
- AskUserMessage.remove() now works properly
- Avatar element cannot be referenced in messages anymore
- New data persistence modes `local` and `custom` are available on top of the pre-existing `cloud` one. Learn more here.
- Performance improvements and bug fixes on run_sync and asyncify
- File watcher now reloads the app when the config is updated
- cl.cache to avoid wasting time reloading expensive resources every time the app reloads
- Bug introduced by 0.4.0 preventing private apps from running
- Long line content breaking the sidebar with Text elements
- File watcher preventing keyboard interrupts of the chainlit process
- Updated socket io to fix a security issue
- Bug preventing config settings to be the default values for the settings in the UI
- Pyplot chart element
- Config option `default_expand_messages` to enable the expand messages setting by default in the UI (breaking change)
- Scoped elements sharing names are now correctly displayed
- Clickable Element refs are now correctly displayed, even if another ref being a substring of it exists
- Moving from sync to async runtime (breaking change):
- Support async implementation (eg openai, langchain)
- Performance improvements
- Removed patching of different libraries
- Elements:
- Merged LocalImage and RemoteImage to Image (breaking change)
- New Avatar element to display avatars in messages
- AskFileMessage now supports multi file uploads (small breaking change)
- New settings interface including a new "Expand all" messages setting
- The element sidebar is resizable
- Secure origin issues when running on HTTP
- Updated the callback handler to langchain 0.0.198 latest changes
- Filewatcher issues
- Blank screen issues
- Port option in the CLI does not fail anymore because of os import
- Pdf element reloading issue
- CI is more stable
- `AskFileMessage`'s accept parameter can now take a Dict to allow more fine-grained rules. More info here: https://react-dropzone.org/#!/Accepting%20specific%20file%20types.
- The PDF viewer element helps you display local or remote PDF files (documentation).
- When running the tests, the Chainlit CLI is installed in editable mode to run faster.
- URL preview for social media share
- `max_http_buffer_size` is now set to 100mb, fixing the `max_size_mb` parameter of `AskFileMessage`
- Enhanced security
- Global element display
- Display elements with display `page` based on their ids instead of their names
- Rework of the Message, AskUserMessage and AskFileMessage APIs:
  - `cl.send_message(...)` becomes `cl.Message(...).send()`
  - `cl.send_ask_user(...)` becomes `cl.AskUserMessage(...).send()`
  - `cl.send_ask_file(...)` becomes `cl.AskFileMessage(...).send()`
- `update` and `remove` methods to the `cl.Message` class
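A sketch of the reworked, object-based API (handler name hypothetical; exact `update` semantics may differ by version):

```python
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    # Old: cl.send_message(...) — new: build a Message object, then send it
    msg = cl.Message(content="Working on it...")
    await msg.send()

    # The new update/remove methods act on the sent message
    msg.content = "Done!"
    await msg.update()
```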
- Blank screen for windows users (Chainlit#3)
- Header navigation for mobile (Chainlit#12)
- Starting to log changes in CHANGELOG.md
- Port and hostname are now configurable through the `CHAINLIT_HOST` and `CHAINLIT_PORT` env variables. You can also use `--host` and `--port` when running `chainlit run ...`.
- A label attribute to Actions to facilitate localization.
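Both host/port configuration styles, sketched with hypothetical values:

```shell
# Via environment variables
CHAINLIT_HOST=0.0.0.0 CHAINLIT_PORT=8080 chainlit run app.py

# Or via CLI flags
chainlit run app.py --host 0.0.0.0 --port 8080
```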
- Clicks on inlined `RemoteImage` now open the image in a NEW tab.