
Feature Request: Local server and system instruction presets etc. #26

Open
czkoko opened this issue Nov 6, 2024 · 3 comments
Labels
bug (Something isn't working) · enhancement (New feature or request)

Comments


czkoko commented Nov 6, 2024

Thanks for your work, the elegant interface design and lightweight client are great.

I have a few new feature suggestions:

  1. Start an OpenAI-API-compatible server locally, so that VS Code AI coding plugins can use ChatMLX as a backend.
  2. Support custom system-instruction presets, making it easy to switch the role the model plays.
  3. Allow customizing the model save directory. The models are large, after all, and well suited to storage on an external SSD.
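For suggestion 1, a server that speaks the OpenAI chat-completions wire format only needs to accept the standard request body. A minimal sketch of the payload a VS Code plugin would send (the localhost port, endpoint path, and helper name here are hypothetical, not part of ChatMLX):

```python
import json

# Hypothetical local endpoint ChatMLX could expose; the port is an assumption.
ENDPOINT = "http://localhost:8080/v1/chat/completions"

def build_chat_request(prompt,
                       system="You are a helpful assistant.",
                       model="mlx-community/gemma-2-9b-it-4bit"):
    """Build an OpenAI-compatible chat-completions request body."""
    return json.dumps({
        "model": model,
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": prompt},
        ],
        "stream": True,  # editor plugins usually stream tokens
    })

body = build_chat_request("Explain this function.")
```

Any client that already targets the OpenAI API could then be pointed at `ENDPOINT` without code changes, which is what makes this format attractive for a local server.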

johnmai-dev (Owner) commented Nov 7, 2024

Thank you for your suggestions. Suggestion 3 is currently in development; suggestions 1 and 2 are also planned.

@johnmai-dev johnmai-dev added the enhancement New feature or request label Nov 7, 2024

czkoko commented Nov 7, 2024

In addition, I'd like to report a few issues:

  1. If you delete a message while the model is still replying to it, the app crashes.
  2. When the conversation list in the left column has been cleared, sending a message to the model crashes the app.
  3. mlx-community/gemma-2-9b-it-4bit appends an extra <end_of_turn> string at the end of its replies.
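For issue 3, the usual fix is to treat `<end_of_turn>` as a stop sequence and trim it from the decoded text before display. A minimal sketch (the exact stop-token list is an assumption based on Gemma's chat template, and `trim_stop_sequences` is a hypothetical helper, not ChatMLX code):

```python
# Gemma's chat template ends each turn with <end_of_turn>; if the
# tokenizer's EOS handling misses it, strip it from the decoded text.
STOP_SEQUENCES = ["<end_of_turn>", "<eos>"]

def trim_stop_sequences(text: str) -> str:
    """Cut the reply at the first stop sequence and drop trailing whitespace."""
    for stop in STOP_SEQUENCES:
        idx = text.find(stop)
        if idx != -1:
            text = text[:idx]
    return text.rstrip()
```

For streamed output the same idea applies, but the check has to run on the accumulated text, since a stop token can arrive split across chunks.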

@johnmai-dev johnmai-dev added the bug Something isn't working label Nov 9, 2024
Daksani commented Dec 18, 2024

I second all of these. I really like the look and feel of this, but I can't use it at the moment due to its limitations. For some reason, manually copying a model over to the app's download folder doesn't work either, so I can't use pre-downloaded models, only what ChatMLX downloads itself. I hope we can get some of these features soon; I'm really liking everything else so far.
