@@ -15,16 +15,14 @@ Go, Java, LaTeX, PHP, Python, Ruby, Rust, Swift, and TypeScript.
 
 ## Installation
 
-> **Note**
+> [!NOTE]
 >
-> CWhy needs to be connected to an [OpenAI account](https://openai.com/api/) or an Amazon Web Services account.
+> CWhy needs to be connected to an [OpenAI account](https://openai.com/api/).
 > _Your account will need to have a positive balance for this to work_
 > ([check your OpenAI balance](https://platform.openai.com/usage)).
 > [Get an OpenAI key here](https://platform.openai.com/api-keys).
 >
-> CWhy currently defaults to GPT-4, and falls back to GPT-3.5-turbo if a request error occurs. For the newest and best
-> model (GPT-4) to work, you need to have purchased at least $1 in credits (if your API account was created before
-> August 13, 2023) or $0.50 (if you have a newer API account).
+> You may need to purchase $0.50 - $1 in OpenAI credits, depending on when your API account was created.
 >
 > Once you have an API key, set it as an environment variable called `OPENAI_API_KEY`.
 >
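Between this hunk and the next, the diff elides the Linux/MacOS half of the key-setup snippet (only the Windows half appears below as context). A hypothetical reconstruction of that step, assuming a POSIX shell:

```shell
# On Linux/MacOS (hypothetical reconstruction of the elided context):
export OPENAI_API_KEY=<your-api-key>
```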
@@ -35,24 +33,42 @@ Go, Java, LaTeX, PHP, Python, Ruby, Rust, Swift, and TypeScript.
 > # On Windows:
 > $env:OPENAI_API_KEY=<your-api-key>
 > ```
->
-> **New**: CWhy now has alpha support for Amazon Bedrock, using the Claude model.
-> To use Bedrock, you need to set three environment variables.
->
-> ```bash
-> # On Linux/MacOS:
-> export AWS_ACCESS_KEY_ID=<your-access-key>
-> export AWS_SECRET_ACCESS_KEY=<your-secret-key>
-> export AWS_REGION_NAME=<your-region>
-> ```
->
-> CWhy will automatically select which AI service to use (OpenAI or AWS Bedrock) when it detects that the appropriate
-> environment variables have been set.
 
-```
+```bash
 python3 -m pip install cwhy
 ```
 
+### Other LLMs
+
+We mostly test with OpenAI, but other LLMs can be made to work with CWhy. Please report any bugs you encounter.
+
+#### OpenAI API Compatible
+
+If your provider supports OpenAI-style API calls, you can set the `OPENAI_BASE_URL` environment variable to
+select a different URL to send requests to. For example, this works great with [Ollama](https://ollama.com/):
+
+```bash
+docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --rm --name ollama ollama/ollama
+docker exec -it ollama ollama pull llama3.1:70b
+export OPENAI_BASE_URL=http://localhost:11434/v1
+cwhy --llm llama3.1:70b --- clang++ tests/c++/missing-hash.cpp
+```
+
+#### LiteLLM Proxy
+
+If your provider does not support OpenAI-style API calls (such as AWS Bedrock, which we used to support), we recommend
+using the [LiteLLM Proxy Server](https://docs.litellm.ai/docs/simple_proxy).
+
+```bash
+pip install 'litellm[proxy]'
+# Set AWS_ACCESS_KEY_ID, AWS_REGION_NAME, and AWS_SECRET_ACCESS_KEY.
+litellm --model bedrock/anthropic.claude-v2
+export OPENAI_BASE_URL=http://0.0.0.0:4000
+cwhy --- clang++ tests/c++/missing-hash.cpp
+```
+
+Note that when using the LiteLLM Proxy, CWhy's `--llm` argument is ignored completely.
+
 ## Usage
 
 ### Linux/MacOS
@@ -63,24 +79,24 @@ by creating a short executable script wrapping the compiler command.
 
 ```bash
 # Invoking the compiler directly.
-% cwhy --- g++ mycode.cpp
+cwhy --- g++ mycode.cpp
 
 # Using CWhy with Java and an increased timeout.
-% cwhy --timeout 180 --- javac MyCode.java
+cwhy --timeout 180 --- javac MyCode.java
 
 # Invoking with GNU Make, using GPT-3.5.
-% CXX=`cwhy --llm=gpt-3.5-turbo --wrapper --- c++` make
+CXX=`cwhy --llm=gpt-3.5-turbo --wrapper --- c++` make
 
 # Invoking with CMake, using GPT-4 and clang++.
-% CWHY_DISABLE=1 cmake -DCMAKE_CXX_COMPILER=`cwhy --llm=gpt-4 --wrapper --- clang++` ...
+CWHY_DISABLE=1 cmake -DCMAKE_CXX_COMPILER=`cwhy --llm=gpt-4 --wrapper --- clang++` ...
 ```
 
 Configuration tools such as CMake or Autoconf will occasionally invoke the compiler to check for features, which will
 fail and invoke CWhy unnecessarily if not available on the machine. To circumvent this, `CWHY_DISABLE` can be set in
 the environment to disable CWhy at configuration time.
 
 ```bash
-% CWHY_DISABLE='ON' cmake -DCMAKE_CXX_COMPILER=`cwhy --wrapper --- c++` ...
+CWHY_DISABLE='ON' cmake -DCMAKE_CXX_COMPILER=`cwhy --wrapper --- c++` ...
 ```
 
 ### Windows
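The `--wrapper` examples in the hunk above have CWhy print the path of a generated wrapper executable, which the build system then invokes as its compiler. Conceptually (a hypothetical sketch only, not the script CWhy actually generates), such a wrapper behaves like:

```shell
#!/bin/sh
# Hypothetical wrapper sketch: run the real compiler, and only on failure
# hand the same command line to cwhy for a diagnosis.
c++ "$@" || cwhy --- c++ "$@"
```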
@@ -89,9 +105,9 @@ Windows support has been tested using Powershell. On the command line, using Nin
 will override any option set.
 
 ```bash
-% $env:CWHY_DISABLE='ON'
-% cmake -G Ninja -DCMAKE_CXX_COMPILER="$(python -m cwhy --wrapper --- cl)" ...
-% $env:CWHY_DISABLE=''
+$env:CWHY_DISABLE='ON'
+cmake -G Ninja -DCMAKE_CXX_COMPILER="$(python -m cwhy --wrapper --- cl)" ...
+$env:CWHY_DISABLE=''
 ```
 
 ### Continuous Integration