[BUG] #411 introduced an even worse side effect than the one it fixed. #433
Comments
The first error message is caused by these lines:
if _cls is None:
    return wrap
So dataclass_json() can actually return a function instead of a class, which confuses static type checkers. The reason for the second error is that the type checker does not know that the subclass returned by _process_class() is also derived from Person. |
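(For context, a minimal sketch, not the library's actual implementation, of how a dual-use decorator like this can be typed with typing.overload so checkers resolve the bare-vs-parameterized ambiguity. Note it still does not expose the injected to_json/from_json to the checker.)

```python
# Sketch only -- a generic dual-use decorator typed with typing.overload.
# The letter_case parameter mirrors dataclasses-json's option; everything
# else here is illustrative, not the library's real code.
from typing import Any, Callable, Optional, Type, TypeVar, overload

T = TypeVar("T")

@overload
def dataclass_json(_cls: Type[T]) -> Type[T]: ...  # bare usage: @dataclass_json
@overload
def dataclass_json(*, letter_case: Any = None) -> Callable[[type], type]: ...  # @dataclass_json(...)

def dataclass_json(_cls: Optional[type] = None, *, letter_case: Any = None):
    def wrap(cls: type) -> type:
        # the real decorator would attach to_json/from_json/schema here
        return cls
    if _cls is None:
        return wrap
    return wrap(_cls)
```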
Is this a problem specific to pylance?
Untrue, I just tested this on PyCharm with 0.5.9 and auto-complete works as expected when using either the annotation or the base class. Also, could you please elaborate on the example: what should be autocompleted? It is a valid piece of code. |
Nope, it is not just pylance. PyCharm has exactly the same problem.

from dataclasses import dataclass
from dataclasses_json import dataclass_json, DataClassJsonMixin
from typing import List

@dataclass_json
@dataclass
class Person:
    x: List[int]

@dataclass
class Person2(DataClassJsonMixin):
    x: List[int]

p2 = Person2([2])

def test(a: Person):
    print(a.x)

p = Person([1])
print(p.x)
print(p.to_json())

mypy actually reports an error, like the previous dataclasses_json version did; it seems to fail to realize that Person is a subclass of DataClassJsonMixin. Here is the mypy result:
In PyCharm the code inspection did not report any issue. However, the only reason the PyCharm inspection does not report an error/warning is:
So I tried to create a minimal example to show the problem. Once the base class and the decorator are defined in the same module, it starts to report the problem:

from dataclasses import dataclass
from typing import List, Type
import abc

class Base1(abc.ABC):
    def foo(self, *, x: str) -> str:
        msg = "Base1:" + x
        print(msg)
        return msg

def decorator_factory(_cls=None, *, y: str = "param1"):
    print(f"{_cls=}, {y=}")
    def wrap(cls):
        print(f"wrap {cls=}, {y=}")
        return class_factory(cls, y)
    if _cls is None:
        return wrap
    return wrap(_cls)

def class_factory(cls, y: str) -> Type[Base1]:
    print(f"wrap {cls=}, {y=}")
    cls._y = y
    cls.foo = Base1.foo
    Base1.register(cls)
    return cls

@decorator_factory
@dataclass
class Person:
    x: List[int]

@decorator_factory(y="B")
@dataclass
class PersonB:
    x: List[int]

def test(a: Person):
    print(a.x)

p = Person([1])
pB = PersonB([1])
print(p.x)
print(p.foo(x="test"))
print(pB.x)
print(pB.foo(x="testB"))

Here is the error on . The interesting part is that if we remove the , pylance gives more errors, which suggests it does not know what class is coming out of the decorator at all:
If we remove the annotation, it reports on the member foo. I think code inspection won't get very deep into dynamically generated types. We may have an interface like this (a runnable sketch follows this comment):

# in the api
def DataClassJsonMixin_factory(letter_case, undefined) -> Type[DataClassJsonMixin]:
    # add dataclass_json_config and the missing-value handler
    return the_subclass

# in user code:
@dataclass
class Person(DataClassJsonMixin_factory(<whatever the parameters>)):
    pass |
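(Purely for illustration, a runnable version of the factory idea above. The factory name and the underscore attributes are made up and not part of the dataclasses-json API.)

```python
# Hypothetical sketch of the proposed interface -- not the library's real API.
from dataclasses import dataclass
from typing import Any, List, Type

from dataclasses_json import DataClassJsonMixin


def DataClassJsonMixin_factory(letter_case: Any = None, undefined: Any = None) -> Type[DataClassJsonMixin]:
    # Build a configured base class; a real implementation would translate
    # letter_case/undefined into the library's per-class config instead of
    # stashing them in plain class attributes.
    return type(
        "ConfiguredDataClassJsonMixin",
        (DataClassJsonMixin,),
        {"_letter_case": letter_case, "_undefined": undefined},
    )


@dataclass
class Person(DataClassJsonMixin_factory()):
    x: List[int]


print(Person([1]).to_json())  # to_json comes from the real DataClassJsonMixin base
```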
Alright, I see the problem now. The decorator function shadows the actual return type; I'll investigate what we can do about this. We don't use decorators internally, which is why our CI never saw a problem there. Thanks for the clarifications! |
So, I researched this a bit and I'm starting to think in another direction. It is quite clear that if you want proper IDE support you are forced to use a mixin. This resonates a bit with #243 for me. Thus, theoretically, we can make a breaking change to the library (and maybe graduate it to 1.0 afterwards? who knows :)) and rethink the way serialization methods are activated. If we go for the approach of:
This would resolve all IDE/static type checker errors because the injected serializer will always return (and hint) the correct type to the caller. Moreover, it would remove the ambiguity of choosing between the annotation and the subclass, and essentially remove the need to support two APIs. I'd argue people would get less confused as well, AND we get a big plus by allowing the injected serializer to be overwritten with a homebrewed one, in case somebody wants that - or if we want to support some exotic stuff. As a downside, this would break all code that uses the pre-1.0 library, as people would be required to either remove subclassing (like our corp team does) or replace the decorator.
@healthmatrice @matt035343 @s-vitaliy @lidatong @andersk and others I forgot to tag - could you please let me know your thoughts :) As for the commit in question, I'd put this to a vote - @matt035343, @s-vitaliy - should we revert? |
@george-zubrienko I really appreciate that you are willing to discuss a potential change to the interface. I like the function-module idea. Actually, I think a lot of users have already done this themselves, for example:
That is why I said the . If you officially provide these utilities I can simply import them directly instead of defining them myself. We also used to hide the dataclass-json interface completely inside the JSON encoder and object_hook, so we can basically forget about the dataclass_json interface entirely: our users simply call json.dump and json.load without knowing the details internally. However, I think we should still be very careful about breaking changes in general. We also need to think about our merits over other libraries (pydantic, etc.) and emphasize them.
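(To make that concrete, a minimal sketch of the encoder/object_hook pattern, assuming the documented DataClassJsonMixin to_dict()/from_dict() methods; the encoder class, hook, and type-registry names are made up.)

```python
# Minimal sketch of hiding dataclasses-json behind json.dump/json.load.
# The encoder/hook names and the type registry are illustrative only.
import json
from dataclasses import dataclass
from typing import Any, Dict, List, Type

from dataclasses_json import DataClassJsonMixin


@dataclass
class Person(DataClassJsonMixin):
    x: List[int]


_TYPES: Dict[str, Type[DataClassJsonMixin]] = {"Person": Person}


class DataClassEncoder(json.JSONEncoder):
    def default(self, o: Any) -> Any:
        if isinstance(o, DataClassJsonMixin):
            # tag the payload so the hook knows which class to rebuild
            return {"__type__": type(o).__name__, **o.to_dict()}
        return super().default(o)


def dataclass_hook(obj: Dict[str, Any]) -> Any:
    cls = _TYPES.get(obj.get("__type__", ""))
    if cls is None:
        return obj
    data = {k: v for k, v in obj.items() if k != "__type__"}
    return cls.from_dict(data)


payload = json.dumps({"person": Person([1, 2])}, cls=DataClassEncoder)
restored = json.loads(payload, object_hook=dataclass_hook)
print(restored["person"].x)  # callers never touch the dataclass_json interface
```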
|
Can reverting the change be considered? This has now broken things all over our codebase in a way that wasn't expected. |
Yes, we'll revert this as it seems to negatively interact with certain code analysis tools. However, it would be nice if you could provide some context for 'broken all over our codebase' - it would help to plan the next steps here - or maybe something else is broken and we need to take a look at that. |
In 0.5.13 the change in question will be reverted. I'll afterwards close this issue and open a new one for a potential 1.0 breaking API change that should in theory help with issues like this one in the future. Please take a look at it once it appears and let us know your thoughts. |
Re
I'm personally fine with some users switching from v1, given that we keep the core majority happy. This is not in any way a commercial project; our main goal is to keep the lib healthy, useful, and up to date with the latest Python features/changes. As you also pointed out, this lib has much less fluff than others, among other things. |
We use this pattern quite often:
Happy to be involved in the breaking API change if I can; please share the issue here and I will subscribe to it. I think there's nothing wrong with a refactor into v1, especially if it can meet all the modern typing expectations that have shown up in Python. And I know that technically it is 0ver, so breaking changes are "allowed" without a major bump, but I think it should at least bump to . |
I think if we are going to do this, it will be 1.0 with certain API stability guarantees from that point.
Quite curious - this is different from the OP's issue, but it falls under the same category. We use pylint and it doesn't complain about this. I think this is another example of why we should consider a refactor. |
Also, a small note @artificial-aidan and @healthmatrice: if either of you wants to contribute checks using pylance/pyright so we can catch stuff like this in CI, I'd be happy to review the PR :) |
Do you have any current way of testing type checkers? I can maybe take a look at it. From a brief glance it would involve installing the type checker standalone and running it in CI? |
I think for pyright it is pretty simple - if I'm googling correctly, it works a bit like mypy. So, to add it you need a few simple steps:
As an alternative, we can add it as a separate workflow - but we can figure that out once the PR is there. If you are unfamiliar with Poetry, check this part of our README. |
Ok thanks, I missed the mypy step in code-quality. |
Well, to be honest, I am not very satisfied with how it works, and if you look at the annotations' code, lots of stuff is excluded from mypy checks, which is another flag indicating this is not the optimal way to do things. |
I can try when I get some free time. I already have a pipeline for pyright, but I guess I need to add some unit tests first. |
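(One possible shape for such a test, assuming the pyright CLI is available in the environment and that a tests/type_checks/ folder holds small snippets exercising the decorator - both assumptions, not the existing project layout.)

```python
# Sketch: run pyright from the existing pytest suite and fail on type errors.
import json
import shutil
import subprocess

import pytest


@pytest.mark.skipif(shutil.which("pyright") is None, reason="pyright CLI not installed")
def test_pyright_reports_no_errors():
    # --outputjson makes pyright emit machine-readable diagnostics
    result = subprocess.run(
        ["pyright", "--outputjson", "tests/type_checks"],
        capture_output=True,
        text=True,
    )
    report = json.loads(result.stdout)
    errors = [d for d in report.get("generalDiagnostics", []) if d.get("severity") == "error"]
    assert not errors, f"pyright found type errors: {errors}"
```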
Would it be possible to yank 0.5.12? (https://peps.python.org/pep-0592/) I still have developers somehow accidentally getting 0.5.12 installed. |
There will be a 0.5.14 by the end of this week :) |
Your guess is as good as mine. I didn't bother debugging, just upgraded them and moved on. I figured that if it wasn't a lot of work, it could save some confusion. |
Description
#411 tried to fix pylance being unable to resolve to_json(), etc. However, it breaks all auto-completion on dataclass_json-decorated classes.
I think in 99% of use cases we need the IDE to treat the decorated class as the dataclass rather than as DataClassJsonMixin, because we all know what kind of methods the decorator adds; thus a # type: ignore is totally acceptable. Moreover, methods like to_json are used far less frequently than the dataclass __init__ and its other methods. I think we should prioritize IDE support for the dataclass itself over the DataClassJsonMixin.
Can we please revert that change until we find a better solution for this?
The thing is, the returned class should be a subclass of both the original cls and DataClassJsonMixin, not just the ABC class.
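(For illustration, this is what the workaround described above looks like in practice; the specific ignore code is just an example.)

```python
# Keep the decorator so the IDE sees the plain dataclass, and silence the
# checker only at the to_json call site.
from dataclasses import dataclass
from typing import List

from dataclasses_json import dataclass_json


@dataclass_json
@dataclass
class Person:
    x: List[int]


p = Person([1])        # IDE/auto-complete sees the dataclass __init__ and fields
print(p.to_json())     # type: ignore[attr-defined]  # injected at runtime by dataclass_json
```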
Code snippet that reproduces the issue
Describe the results you expected
pylance should not show an error and should auto-complete the Person class.
Python version you are using
Python 3.10.6
Environment description
dataclasses-json==0.5.9
marshmallow==3.19.0
marshmallow-enum==1.5.1
mypy-extensions==1.0.0
packaging==23.1
typing-inspect==0.9.0
typing_extensions==4.7.1