Commit 4a30950 ("Update documentation"), 1 parent: 9cf0712

2 files changed: +43 -32 lines

README.md

Lines changed: 43 additions & 27 deletions
````diff
@@ -15,16 +15,14 @@ Go, Java, LaTeX, PHP, Python, Ruby, Rust, Swift, and TypeScript.
 
 ## Installation
 
-> **Note**
+> [!NOTE]
 >
-> CWhy needs to be connected to an [OpenAI account](https://openai.com/api/) or an Amazon Web Services account.
+> CWhy needs to be connected to an [OpenAI account](https://openai.com/api/).
 > _Your account will need to have a positive balance for this to work_
 > ([check your OpenAI balance](https://platform.openai.com/usage)).
 > [Get an OpenAI key here](https://platform.openai.com/api-keys).
 >
-> CWhy currently defaults to GPT-4, and falls back to GPT-3.5-turbo if a request error occurs. For the newest and best
-> model (GPT-4) to work, you need to have purchased at least $1 in credits (if your API account was created before
-> August 13, 2023) or $0.50 (if you have a newer API account).
+> You may need to purchase $0.50 - $1 in OpenAI credits depending on when your API account was created.
 >
 > Once you have an API key, set it as an environment variable called `OPENAI_API_KEY`.
 >
````
````diff
@@ -35,24 +33,42 @@ Go, Java, LaTeX, PHP, Python, Ruby, Rust, Swift, and TypeScript.
 > # On Windows:
 > $env:OPENAI_API_KEY=<your-api-key>
 > ```
->
-> **New**: CWhy now has alpha support for Amazon Bedrock, using the Claude model.
-> To use Bedrock, you need to set three environment variables.
->
-> ```bash
-> # On Linux/MacOS:
-> export AWS_ACCESS_KEY_ID=<your-access-key>
-> export AWS_SECRET_ACCESS_KEY=<your-secret-key>
-> export AWS_REGION_NAME=<your-region>
-> ```
->
-> CWhy will automatically select which AI service to use (OpenAI or AWS Bedrock) when it detects that the appropriate
-> environment variables have been set.
 
-```
+```bash
 python3 -m pip install cwhy
 ```
 
+### Other LLMs
+
+We mostly test with OpenAI, but other LLMs can be made to work with CWhy. Please report any bugs you encounter.
+
+#### OpenAI API Compatible
+
+If your provider supports OpenAI-style API calls, set the `OPENAI_BASE_URL` environment variable to point requests
+at a different URL. For example, this works well with [Ollama](https://ollama.com/):
+
+```bash
+docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --rm --name ollama ollama/ollama
+docker exec -it ollama ollama pull llama3.1:70b
+export OPENAI_BASE_URL=http://localhost:11434/v1
+cwhy --llm llama3.1:70b --- clang++ tests/c++/missing-hash.cpp
+```
+
+#### LiteLLM Proxy
+
+If your provider does not support OpenAI-style API calls (for example AWS Bedrock, which we previously supported
+directly), we recommend the [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy).
+
+```bash
+pip install 'litellm[proxy]'
+# Set AWS_ACCESS_KEY_ID, AWS_REGION_NAME, and AWS_SECRET_ACCESS_KEY.
+litellm --model bedrock/anthropic.claude-v2
+export OPENAI_BASE_URL=http://0.0.0.0:4000
+cwhy --- clang++ tests/c++/missing-hash.cpp
+```
+
+Note that when using the LiteLLM Proxy, CWhy's `--llm` argument is ignored entirely.
+
 
 ## Usage
 
````
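The `OPENAI_BASE_URL` variable used in both subsections above works because OpenAI-compatible clients resolve their endpoint from the environment. Here is a minimal sketch of that lookup order; the `resolve_base_url` helper and its default are illustrative, not CWhy's actual code:

```python
import os

# Illustrative only: OpenAI-style clients typically let an explicit
# argument win, then fall back to OPENAI_BASE_URL, then to the public API.
DEFAULT_BASE_URL = "https://api.openai.com/v1"

def resolve_base_url(explicit=None):
    return explicit or os.environ.get("OPENAI_BASE_URL", DEFAULT_BASE_URL)

# With the variable set as in the Ollama example, requests go to the proxy.
os.environ["OPENAI_BASE_URL"] = "http://localhost:11434/v1"
print(resolve_base_url())
```

Unsetting the variable restores the default endpoint, which is why no code change is needed to switch providers.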
### Linux/MacOS
````diff
@@ -63,24 +79,24 @@ by creating a short executable script wrapping the compiler command.
 
 ```bash
 # Invoking the compiler directly.
-% cwhy --- g++ mycode.cpp
+cwhy --- g++ mycode.cpp
 
 # Using CWhy with Java and an increased timeout.
-% cwhy --timeout 180 --- javac MyCode.java
+cwhy --timeout 180 --- javac MyCode.java
 
 # Invoking with GNU Make, using GPT-3.5.
-% CXX=`cwhy --llm=gpt-3.5-turbo --wrapper --- c++` make
+CXX=`cwhy --llm=gpt-3.5-turbo --wrapper --- c++` make
 
 # Invoking with CMake, using GPT-4 and clang++.
-% CWHY_DISABLE=1 cmake -DCMAKE_CXX_COMPILER=`cwhy --llm=gpt-4 --wrapper --- clang++` ...
+CWHY_DISABLE=1 cmake -DCMAKE_CXX_COMPILER=`cwhy --llm=gpt-4 --wrapper --- clang++` ...
 ```
 
 Configuration tools such as CMake or Autoconf will occasionally invoke the compiler to check for features, which will
 fail and invoke CWhy unnecessarily if not available on the machine. To circumvent this, `CWHY_DISABLE` can be set in
 the environment to disable CWhy at configuration time.
 
 ```bash
-% CWHY_DISABLE='ON' cmake -DCMAKE_CXX_COMPILER=`cwhy --wrapper --- c++` ...
+CWHY_DISABLE='ON' cmake -DCMAKE_CXX_COMPILER=`cwhy --wrapper --- c++` ...
 ```
 
 ### Windows
````
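The `CWHY_DISABLE` behavior described above boils down to a small decision: analyze only failed compiles, and never when the variable is set. A hypothetical sketch of that gate (not CWhy's actual wrapper code):

```python
import os

def should_invoke_cwhy(compiler_exit_code):
    """Hypothetical sketch: analyze only failed compiles, and respect
    CWHY_DISABLE so configure-time probe failures are left alone."""
    if os.environ.get("CWHY_DISABLE"):
        return False
    return compiler_exit_code != 0

os.environ["CWHY_DISABLE"] = "ON"
print(should_invoke_cwhy(1))  # False: disabled during configuration
```

This is why the CMake examples set the variable only for the configuration step and clear it afterwards.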
````diff
@@ -89,9 +105,9 @@ Windows support has been tested using Powershell. On the command line, using Ninja
 will override any option set.
 
 ```bash
-% $env:CWHY_DISABLE='ON'
-% cmake -G Ninja -DCMAKE_CXX_COMPILER="$(python -m cwhy --wrapper --- cl)" ...
-% $env:CWHY_DISABLE=''
+$env:CWHY_DISABLE='ON'
+cmake -G Ninja -DCMAKE_CXX_COMPILER="$(python -m cwhy --wrapper --- cl)" ...
+$env:CWHY_DISABLE=''
 ```
 
 ### Continuous Integration
````

src/cwhy/cwhy.py

Lines changed: 0 additions & 5 deletions

```diff
@@ -94,14 +94,9 @@ def evaluate_text_prompt(
 ) -> str:
     completion = complete(client, args, prompt)
 
-    msg = f"Analysis from {args.llm}:"
-    print(msg)
-    print("-" * len(msg))
     text: str = completion.choices[0].message.content
-
     if wrap:
         text = llm_utils.word_wrap_except_code_blocks(text)
-
     text += "\n\n"
     text += f"(TODO seconds, $TODO USD.)"
```
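The unchanged `wrap` branch above calls `llm_utils.word_wrap_except_code_blocks`. As a toy sketch of what a helper with that name does (assumed behavior, not the real `llm_utils` implementation), prose lines are re-wrapped to a fixed width while fenced code blocks pass through verbatim:

```python
import textwrap

FENCE = "`" * 3  # avoid writing a literal fence inside this example

def wrap_except_code_blocks(text, width=70):
    # Toy sketch: wrap prose lines, copy fenced code blocks verbatim.
    out, in_code = [], False
    for line in text.split("\n"):
        if line.startswith(FENCE):
            in_code = not in_code
            out.append(line)
        elif in_code or not line.strip():
            out.append(line)
        else:
            out.extend(textwrap.wrap(line, width=width))
    return "\n".join(out)

sample = "word " * 30 + f"\n{FENCE}\nsome_code(stays_exactly_as_written)\n{FENCE}"
print(wrap_except_code_blocks(sample))
```

Wrapping only the prose keeps compiler snippets and commands in the LLM's answer copy-pasteable.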

Comments (0)