Commit 0a8b764 (parent d455817)

README.md: Add MUSA as supported backend

Signed-off-by: Xiaodong Ye <[email protected]>
yeahdongcn committed Nov 26, 2024
1 changed file: README.md (11 additions, 2 deletions)
@@ -200,7 +200,7 @@
CMAKE_ARGS="-DGGML_VULKAN=on" pip install llama-cpp-python
To install with SYCL support, set `GGML_SYCL=on` via the `CMAKE_ARGS` environment variable before installing:

```bash
source /opt/intel/oneapi/setvars.sh
CMAKE_ARGS="-DGGML_SYCL=on -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx" pip install llama-cpp-python
```
</details>
@@ -211,11 +211,20 @@
To install with RPC support, set `GGML_RPC=on` via the `CMAKE_ARGS` environment variable before installing:

```bash
CMAKE_ARGS="-DGGML_RPC=on" pip install llama-cpp-python
```
</details>

<details>
<summary>MUSA</summary>

To install with MUSA support, set `GGML_MUSA=on` via the `CMAKE_ARGS` environment variable before installing:

```bash
CMAKE_ARGS="-DGGML_MUSA=on" pip install llama-cpp-python
```
</details>
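If llama-cpp-python was previously installed with different flags, pip may reuse a cached wheel and silently keep the old backend. A sketch of forcing a from-source rebuild (MUSA flags shown as an example; substitute any backend above):

```shell
# Rebuild from source so the new backend flags take effect;
# --no-cache-dir prevents pip from reusing a previously built wheel.
CMAKE_ARGS="-DGGML_MUSA=on" \
  pip install llama-cpp-python --upgrade --force-reinstall --no-cache-dir
```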

### Windows Notes
