## Summary

When using `factory.ModelConfig` with an explicit `provider` parameter, the builtin providers are not loaded first, causing provider resolution to fail.
## Steps to Reproduce

```python
from langextract import factory
import langextract as lx

config = factory.ModelConfig(
    model_id="llama-3.3-70b-versatile",
    provider="openai",  # or "OpenAI" or "OpenAILanguageModel"
    provider_kwargs={
        "api_key": "...",
        "base_url": "https://api.groq.com/openai/v1",
    },
)

result = lx.extract(
    text_or_documents="Some text",
    prompt_description="Extract info",
    examples=[...],
    config=config,
)
```

## Error

```
langextract.core.exceptions.InferenceConfigError: No provider found matching: 'openai'. Available providers can be listed with list_providers()
```
## Root Cause

In `factory.py` (lines 223-228), `providers.load_builtins_once()` is only called when `config.provider` is NOT set:

```python
if config.provider:
    provider_class = router.resolve_provider(config.provider)
else:
    providers.load_builtins_once()  # only called here!
    providers.load_plugins_once()
    provider_class = router.resolve(config.model_id)
```
## Workaround

Import and instantiate the provider directly:

```python
from langextract.providers.openai import OpenAILanguageModel

model = OpenAILanguageModel(
    model_id="llama-3.3-70b-versatile",
    api_key="...",
    base_url="https://api.groq.com/openai/v1",
)

result = lx.extract(
    text_or_documents="...",
    model=model,  # pass the model directly instead of a config
    ...
)
```

## Expected Behavior
`providers.load_builtins_once()` should be called regardless of whether `config.provider` is set.
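One possible shape of the fix, sketched against self-contained stubs rather than the real `factory.py` (the `_Providers`/`_Router` classes and `create_model` below are hypothetical): load builtins and plugins unconditionally, then branch on whether an explicit provider was given.

```python
# Stub stand-ins for langextract's providers/router modules.
class _Providers:
    def __init__(self):
        self.registry = {}
    def load_builtins_once(self):
        self.registry.setdefault("openai", "OpenAILanguageModel")
    def load_plugins_once(self):
        pass  # no third-party plugins in this sketch

class _Router:
    def __init__(self, providers):
        self._providers = providers
    def resolve_provider(self, name):
        return self._providers.registry[name]
    def resolve(self, model_id):
        return self._providers.registry["openai"]

providers = _Providers()
router = _Router(providers)

def create_model(model_id, provider=None):
    # Proposed fix: always populate the registry BEFORE resolving,
    # so an explicit provider name can also be found.
    providers.load_builtins_once()
    providers.load_plugins_once()
    if provider:
        return router.resolve_provider(provider)
    return router.resolve(model_id)

print(create_model("llama-3.3-70b-versatile", provider="openai"))  # prints: OpenAILanguageModel
```

With this ordering, the explicit `provider="openai"` path resolves against the same populated registry as the implicit model-ID path.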
## Use Case

This bug blocks using OpenAI-compatible APIs such as Groq, Together, Anyscale, etc., where:

- The model ID doesn't match the default OpenAI patterns (`^gpt-4`, `^gpt-5`)
- Users need to explicitly specify `provider="openai"` with a custom `base_url`
## Environment

- langextract version: 1.1.1
- Python: 3.12