"Actor Model claude-3-5-sonnet-20241022 not supported" #77

Open
janmechtel opened this issue Feb 14, 2025 · 0 comments
Comments

@janmechtel

I can't seem to select Claude as the actor model. It's odd that the log shows provider=openai for the API key update even though Anthropic is selected:

(base) C:\Projects\computer_use_ootb>python app.py
[INFO] computer_use_demo.tools.logger - Starting the gradio app
[INFO] computer_use_demo.tools.logger - Found 1 screens
[INFO] computer_use_demo.tools.logger - loaded initial api_key for openai: sk-proj-XXX-XXX
C:\Users\jmech\miniconda3\Lib\site-packages\gradio\components\chatbot.py:288: UserWarning: The 'tuples' format for chatbot messages is deprecated and will be removed in a future version of Gradio. Please set type='messages' instead, which uses openai-style 'role' and 'content' keys.
  warnings.warn(
* Running on local URL:  http://127.0.0.1:7888

To create a public link, set `share=True` in `launch()`.
[INFO] computer_use_demo.tools.logger - Model updated to: gpt-4o-mini
[INFO] computer_use_demo.tools.logger - Updated state: model=gpt-4o-mini, provider=openai, api_key=sk-proj-XXXX
[INFO] computer_use_demo.tools.logger - Model updated to: claude-3-5-sonnet-20241022
[INFO] computer_use_demo.tools.logger - Updated state: model=claude-3-5-sonnet-20241022, provider=anthropic, api_key=sk-ant-api03-XXXX
[INFO] computer_use_demo.tools.logger - Actor model updated to: claude-3-5-sonnet-20241022
[INFO] computer_use_demo.tools.logger - API key updated: provider=openai, api_key=sk-ant-api03-XXXX
[INFO] computer_use_demo.tools.logger - loaded initial api_key for openai: sk-ant-api03-n6_jg1j-XXXX
Traceback (most recent call last):
  File "C:\Users\jmech\miniconda3\Lib\site-packages\gradio\queueing.py", line 715, in process_events
    response = await route_utils.call_process_api(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\jmech\miniconda3\Lib\site-packages\gradio\route_utils.py", line 322, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\jmech\miniconda3\Lib\site-packages\gradio\blocks.py", line 2044, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\jmech\miniconda3\Lib\site-packages\gradio\blocks.py", line 1603, in call_function
    prediction = await utils.async_iteration(iterator)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\jmech\miniconda3\Lib\site-packages\gradio\utils.py", line 728, in async_iteration
    return await anext(iterator)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\jmech\miniconda3\Lib\site-packages\gradio\utils.py", line 722, in __anext__
    return await anyio.to_thread.run_sync(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\jmech\AppData\Roaming\Python\Python312\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\jmech\AppData\Roaming\Python\Python312\site-packages\anyio\_backends\_asyncio.py", line 2441, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "C:\Users\jmech\AppData\Roaming\Python\Python312\site-packages\anyio\_backends\_asyncio.py", line 943, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\jmech\miniconda3\Lib\site-packages\gradio\utils.py", line 705, in run_sync_iterator_async
    return next(iterator)
           ^^^^^^^^^^^^^^
  File "C:\Users\jmech\miniconda3\Lib\site-packages\gradio\utils.py", line 866, in gen_wrapper
    response = next(iterator)
               ^^^^^^^^^^^^^^
  File "C:\Projects\computer_use_ootb\app.py", line 222, in process_input
    for loop_msg in sampling_loop_sync(
                    ^^^^^^^^^^^^^^^^^^^
  File "C:\Projects\computer_use_ootb\computer_use_demo\loop.py", line 181, in sampling_loop_sync
    raise ValueError(f"Actor Model {actor_model} not supported")
ValueError: Actor Model claude-3-5-sonnet-20241022 not supported
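
For context, the error comes from the model check in computer_use_demo/loop.py (line 181 in the traceback). The snippet below is only a minimal sketch of that kind of guard, not the project's actual code; the supported-model table, the function name dispatch_actor_model, and the provider strings are assumptions. It shows how an exact-string lookup on actor_model would reject "claude-3-5-sonnet-20241022" if the Anthropic model is missing from (or registered under a different name in) the supported list, which would also fit the earlier log line where the API key update still reports provider=openai.

    # Hypothetical sketch of the guard raising the error; not the project's actual code.
    SUPPORTED_ACTOR_MODELS = {
        "gpt-4o": "openai",
        "gpt-4o-mini": "openai",
        # If claude-3-5-sonnet-20241022 is absent here, or registered under a
        # different string, selecting it falls through to the ValueError below.
    }

    def dispatch_actor_model(actor_model: str) -> str:
        provider = SUPPORTED_ACTOR_MODELS.get(actor_model)
        if provider is None:
            raise ValueError(f"Actor Model {actor_model} not supported")
        return provider

    try:
        dispatch_actor_model("claude-3-5-sonnet-20241022")
    except ValueError as e:
        print(e)  # Actor Model claude-3-5-sonnet-20241022 not supported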