MLX
![mlx logo](https://raw.githubusercontent.com/github/explore/090734749506ea47f24291be887530a32579cd9d/topics/mlx/mlx.png)
MLX is a NumPy-like array framework designed for efficient and flexible machine learning on Apple silicon, brought to you by Apple machine learning research.
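As a quick illustration of the NumPy-like API, here is a minimal sketch, assuming the `mlx` Python package is installed (`pip install mlx`) on an Apple silicon Mac:

```python
import mlx.core as mx

# Arrays are created much like NumPy arrays and live in unified memory,
# visible to both the CPU and the GPU.
a = mx.array([1.0, 2.0, 3.0])
b = mx.ones((3,))

# Operations build a lazy computation graph; nothing is computed yet.
c = a * 2.0 + b

# mx.eval forces evaluation of the graph.
mx.eval(c)
print(c)  # array([3, 5, 7], dtype=float32)
```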
Here are 288 public repositories matching this topic...
An MLX port of FLUX based on the Hugging Face Diffusers implementation.
Updated Feb 2, 2025 - Python
This repository provides the code and model checkpoints for the AIMv1 and AIMv2 research projects.
Updated Nov 22, 2024 - Python
MLX-VLM is a package for inference and fine-tuning of Vision Language Models (VLMs) on your Mac using MLX.
Updated Feb 4, 2025 - Python
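As a rough sketch of how MLX-VLM is typically driven from Python: the model identifier and the exact `load`/`generate` signatures below are assumptions based on the project's documented usage and may differ between releases.

```python
# Hedged sketch: names follow MLX-VLM's documented usage; the model path and
# keyword arguments are assumptions and may need adjusting for your version.
from mlx_vlm import load, generate

model_path = "mlx-community/Qwen2-VL-2B-Instruct-4bit"  # assumed model id
model, processor = load(model_path)

output = generate(
    model,
    processor,
    prompt="Describe this image.",
    image="cat.jpg",   # local path or URL to an image (assumed argument name)
    max_tokens=100,
)
print(output)
```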
🤖✨ChatMLX is a modern, open-source, high-performance chat application for macOS based on large language models.
Updated Nov 3, 2024 - Swift
Solve Puzzles. Learn Metal 🤘
Updated Sep 24, 2024 - Jupyter Notebook
Implementation of F5-TTS in MLX
Updated Feb 2, 2025 - Python
Codam's own fixed, functioning, and open-source alternative to the miniLibX. MLX42 is a simple cross-platform graphics library running on GLFW and OpenGL.
Updated Jan 17, 2025 - C
Generate accurate transcripts using Apple's MLX framework
Updated Dec 10, 2024 - Python
Phi-3.5 for Mac: Locally-run Vision and Language Models for Apple Silicon
Updated Sep 7, 2024 - Jupyter Notebook
MLX Omni Server is a local inference server powered by Apple's MLX framework, specifically designed for Apple Silicon (M-series) chips. It implements OpenAI-compatible API endpoints, enabling seamless integration with existing OpenAI SDK clients while leveraging the power of local ML inference.
Updated Feb 5, 2025 - Python
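Because the server speaks the OpenAI wire format, an existing OpenAI SDK client can be pointed at it by overriding the base URL. A hedged sketch using the official `openai` Python package; the port and model name below are assumptions for illustration, so check the server's own documentation for the values it actually uses.

```python
from openai import OpenAI

# Point the standard OpenAI client at the locally running server instead of
# api.openai.com. The port and model id are assumptions; no real API key is
# needed for a local server, but the field must still be set.
client = OpenAI(
    base_url="http://localhost:10240/v1",  # assumed local address
    api_key="not-needed",
)

response = client.chat.completions.create(
    model="mlx-community/Llama-3.2-3B-Instruct-4bit",  # assumed local model id
    messages=[{"role": "user", "content": "Say hello from Apple silicon."}],
)
print(response.choices[0].message.content)
```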