type stubs for some third-party libraries #3443
Open
lars20070 wants to merge 24 commits into pydantic:main from lars20070:experiment-wide-evals
+274 −50
Changes from 13 commits
- eea1136: mlx only on Apple silicon (lars20070)
- eb32ac4: stubs for mlx, vllm, outlines and transformers (lars20070)
- 6498440: add stubs to ruff config (lars20070)
- 265a0d1: deferred evaluation (lars20070)
- 9756d30: ignore more in stubs (lars20070)
- bb4d9c3: remove further type ignore (lars20070)
- 8335128: remove further type check (lars20070)
- 9241013: remove further type ignore (lars20070)
- cda13d9: remove further type ignore (lars20070)
- ad6b275: remove further type ignore (lars20070)
- d627aa2: add linter rules again (lars20070)
- 6f117d7: ignore rules again (lars20070)
- 3b64931: Trigger CI (lars20070)
- 76d684b: Merge branch 'main' into experiment-wide-evals (DouweM)
- a8554f0: fix stub inconsistencies and duplications (lars20070)
- 5cc2e78: docu for stubs (lars20070)
- 7c11f09: further stubs for outlines (lars20070)
- e343876: fix error in outlines stub (lars20070)
- ca13700: fix linter (lars20070)
- bfeee98: check the stubs (lars20070)
- 0c09656: Merge branch 'main' into experiment-wide-evals (DouweM)
- b0b7aab: rename /stubs folder to /typings (lars20070)
- 5b9fc1c: fix markdown formatting (lars20070)
- 489d9c1: fix docu (lars20070)
Stub for `llama_cpp` (module inferred from the `Llama` class; file path not shown in the scrape; new file, +19 lines):

```python
from collections.abc import Sequence
from os import PathLike
from typing import Any, Literal

from typing_extensions import Self

class Llama:
    def __init__(self, *args: Any, **kwargs: Any) -> None: ...
    @classmethod
    def from_pretrained(
        cls,
        repo_id: str,
        filename: str | None = None,
        additional_files: Sequence[str] | None = None,
        local_dir: str | PathLike[str] | None = None,
        local_dir_use_symlinks: bool | Literal['auto'] = 'auto',
        cache_dir: str | PathLike[str] | None = None,
        **kwargs: Any,
    ) -> Self: ...
```
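An aside, not part of the diff: the `Self` return type on `from_pretrained` is what lets subclasses of a stubbed class keep their own type. A minimal, self-contained sketch of the pattern with placeholder classes (no llama_cpp required):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Needed only at type-check time; never imported at runtime.
    from typing_extensions import Self

class Llama:
    """Placeholder standing in for the stubbed llama_cpp.Llama."""

    def __init__(self, model_path: str) -> None:
        self.model_path = model_path

    @classmethod
    def from_pretrained(cls, repo_id: str, filename: str | None = None) -> Self:
        # Returning Self (rather than Llama) means subclasses get their own
        # type back from this classmethod.
        return cls(model_path=f"{repo_id}/{filename or 'model.gguf'}")

class MyLlama(Llama):
    """Hypothetical subclass; checkers infer MyLlama, not Llama, below."""

m = MyLlama.from_pretrained('org/repo', 'model.q4.gguf')
print(type(m).__name__)  # MyLlama
print(m.model_path)      # org/repo/model.q4.gguf
```

Had the stub annotated `-> Llama` instead, every subclass call site would need a cast.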
Stub for `mlx/__init__.pyi` (path inferred; new file, +4 lines):

```python
from typing import Any

# mlx is imported as a package, primarily for mlx.nn
__all__: list[str] = []
```
Stub for `mlx/nn.pyi` (path inferred from `from mlx.nn import Module` elsewhere in the diff; new file, +3 lines):

```python
from typing import Any

class Module: ...
```
Stub for `mlx_lm` (module inferred; new file, +7 lines):

```python
from typing import Any

from mlx.nn import Module
from transformers.tokenization_utils import PreTrainedTokenizer

def load(model_path: str) -> tuple[Module, PreTrainedTokenizer]: ...
def generate_step(*args: Any, **kwargs: Any) -> Any: ...
```
Stub for `outlines/__init__.pyi` (path inferred; new file, +3 lines):

```python
from . import models

__all__: list[str] = []
```
Stub for `outlines/models/__init__.pyi` (path inferred; new file, +3 lines):

```python
from . import base, llamacpp, mlxlm, sglang, transformers, vllm_offline

__all__: list[str] = []
```
Stub for `outlines/models/base.pyi` (path inferred from later imports; new file, +10 lines):

```python
from collections.abc import AsyncIterable, Iterable
from typing import Any

class Model:
    def __call__(self, *args: Any, **kwargs: Any) -> Any: ...
    def stream(self, *args: Any, **kwargs: Any) -> Iterable[Any]: ...

class AsyncModel(Model):
    async def __call__(self, *args: Any, **kwargs: Any) -> Any: ...
    def stream(self, *args: Any, **kwargs: Any) -> AsyncIterable[Any]: ...
```
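For readers new to the `Iterable` vs `AsyncIterable` split above: a sync model's stream is consumed with a plain `for` loop, an async model's with `async for`. A runnable toy version of that shape (dummy chunks, not the real outlines API):

```python
import asyncio
from collections.abc import AsyncIterable, Iterable

class ToyModel:
    """Sync shape: stream() yields chunks consumable with a for loop."""

    def __call__(self, prompt: str) -> str:
        return ''.join(self.stream(prompt))

    def stream(self, prompt: str) -> Iterable[str]:
        yield from ('Hello', ', ', 'world')

class ToyAsyncModel:
    """Async shape: stream() returns an AsyncIterable for `async for`."""

    async def __call__(self, prompt: str) -> str:
        return ''.join([chunk async for chunk in self.stream(prompt)])

    def stream(self, prompt: str) -> AsyncIterable[str]:
        async def chunks():
            for chunk in ('Hello', ', ', 'world'):
                yield chunk
        return chunks()

print(ToyModel()('hi'))                    # Hello, world
print(asyncio.run(ToyAsyncModel()('hi')))  # Hello, world
```

One observation on the stub itself: `AsyncModel` subclasses `Model` while changing `stream`'s return type from `Iterable` to `AsyncIterable`, an override that strict checkers typically flag as incompatible; keeping the two shapes in separate classes (as in this sketch) is one way around that.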
Stub for `outlines/models/llamacpp.pyi` (path inferred; new file, +10 lines):

```python
from typing import TYPE_CHECKING

from outlines.models.base import Model

if TYPE_CHECKING:
    from llama_cpp import Llama

class LlamaCpp(Model): ...

def from_llamacpp(model: Llama) -> LlamaCpp: ...
```
Stub for `outlines/models/mlxlm.pyi` (path inferred; new file, +10 lines):

```python
from typing import Any

from mlx.nn import Module
from outlines.models.base import Model
from transformers.tokenization_utils import PreTrainedTokenizer

class MLXLM(Model):
    def __init__(self, *args: Any, **kwargs: Any) -> None: ...

def from_mlxlm(model: Module, tokenizer: PreTrainedTokenizer) -> MLXLM: ...
```
Stub for `outlines/models/sglang.pyi` (path inferred; new file, +11 lines):

```python
from typing import TYPE_CHECKING, Any, Union

from outlines.models.base import AsyncModel, Model

if TYPE_CHECKING:
    from openai import AsyncOpenAI, OpenAI

class SGLang(Model): ...
class AsyncSGLang(AsyncModel): ...

def from_sglang(client: OpenAI | AsyncOpenAI, *args: Any, **kwargs: Any) -> SGLang | AsyncSGLang: ...
```
Stub for `outlines/models/transformers.pyi` (path inferred; new file, +17 lines):

```python
from typing import Any

from outlines.models.base import Model
from transformers import AutoModelForCausalLM, AutoProcessor, AutoTokenizer, LlavaForConditionalGeneration
from transformers.modeling_utils import PreTrainedModel
from transformers.processing_utils import ProcessorMixin
from transformers.tokenization_utils import PreTrainedTokenizer

class Transformers(Model): ...
class TransformersMultiModal(Model): ...

def from_transformers(
    model: PreTrainedModel | AutoModelForCausalLM | LlavaForConditionalGeneration,
    tokenizer_or_processor: PreTrainedTokenizer | ProcessorMixin | AutoTokenizer | AutoProcessor,
    *,
    device_dtype: Any = None,
) -> Transformers | TransformersMultiModal: ...
```
Stub for `outlines/models/vllm_offline.pyi` (path inferred; new file, +10 lines):

```python
from typing import TYPE_CHECKING

from outlines.models.base import Model

if TYPE_CHECKING:
    from vllm import LLM

class VLLMOffline(Model): ...

def from_vllm_offline(model: LLM) -> VLLMOffline: ...
```
Stub for `transformers/__init__.pyi` (path inferred; new file, +26 lines):

```python
from typing import Any

from typing_extensions import Self

from . import modeling_utils, processing_utils, tokenization_utils
from .modeling_utils import PreTrainedModel
from .processing_utils import ProcessorMixin
from .tokenization_utils import PreTrainedTokenizer

class AutoModelForCausalLM(PreTrainedModel):
    @classmethod
    def from_pretrained(cls, *args: Any, **kwargs: Any) -> Self: ...

class AutoTokenizer(PreTrainedTokenizer):
    @classmethod
    def from_pretrained(cls, *args: Any, **kwargs: Any) -> Self: ...

class AutoProcessor(ProcessorMixin):
    @classmethod
    def from_pretrained(cls, *args: Any, **kwargs: Any) -> Self: ...

class LlavaForConditionalGeneration(PreTrainedModel):
    @classmethod
    def from_pretrained(cls, *args: Any, **kwargs: Any) -> Self: ...

def from_pretrained(*args: Any, **kwargs: Any) -> Any: ...
```
Stub for `transformers/modeling_utils.pyi` (path inferred; new file, +3 lines):

```python
from typing import Any

class PreTrainedModel: ...
```
Stub for `transformers/processing_utils.pyi` (path inferred; new file, +3 lines):

```python
from typing import Any

class ProcessorMixin: ...
```
Stub for `transformers/tokenization_utils.pyi` (path inferred; new file, +8 lines):

```python
from typing import Any

class PreTrainedTokenizer:
    chat_template: str | None

    def __init__(self, *args: Any, **kwargs: Any) -> None: ...

class ProcessorMixin: ...
```
Stub for `vllm/__init__.pyi` (path inferred; new file, +4 lines):

```python
from typing import Any

class LLM:
    def __init__(self, model: str, *args: Any, **kwargs: Any) -> None: ...
```
Stub for `vllm` `SamplingParams` (module path inferred; new file, +25 lines):

```python
from typing import Any

class SamplingParams:
    max_tokens: int | None
    temperature: float | None
    top_p: float | None
    seed: int | None
    presence_penalty: float | None
    frequency_penalty: float | None
    logit_bias: dict[int, float] | None
    extra_body: dict[str, Any] | None

    def __init__(
        self,
        max_tokens: int | None = None,
        temperature: float | None = None,
        top_p: float | None = None,
        seed: int | None = None,
        presence_penalty: float | None = None,
        frequency_penalty: float | None = None,
        logit_bias: dict[int, float] | None = None,
        extra_body: dict[str, Any] | None = None,
        *args: Any,
        **kwargs: Any,
    ) -> None: ...
```
Review comment:

> Can we call it `type_stubs` or something like that, to make it clearer what it's about?
Reply:

> There is no PEP rule demanding a `./stubs` folder. But PEP 484 links to typeshed, which uses `./stubs`. So do mypy and Microsoft's stubs collection. On the other hand, Pyright goes with `./typings`. The name `./stubs` is not required, but it is a convention. I have explained the folder in the corresponding README. It's like renaming the `src` folder to `source`: possible, yes, but why do it?
Reply:

> It's just a bit unclear to people new to the repo or to Python typing, who will have no idea what "stubs" means and what the directory could contain. The examples you mentioned are projects dedicated to stubs, so the name makes sense there, but I'd find it more convincing if some non-typing-specific Python package also had a top-level stubs dir. I like `typings` or `type_stubs` better.
Reply:

> I agree. I went for `typings`, as used by Pyright: more informative, and an existing convention.
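For context on the outcome (an illustration, not part of the PR): Pyright's `stubPath` setting defaults to `typings`, so a top-level `typings/` directory is picked up automatically, while mypy has to be pointed at a stub directory explicitly, e.g. via `mypy_path`. A sketch, assuming `pyproject.toml`-based configuration:

```toml
# pyproject.toml
[tool.pyright]
stubPath = "typings"    # Pyright's default; stated here for clarity

[tool.mypy]
mypy_path = "typings"   # mypy searches this path for stub packages
```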