Fix type annotations (from Samsung, AI Center, Cambridge) #2911

pplantinga merged 3 commits into speechbrain:develop from
Conversation
```diff
 @functools.lru_cache(None)
-def warning_once(self, *args: tuple, **kwargs: dict):
+def warning_once(self, *args, **kwargs):
```
From https://typing.python.org/en/latest/spec/callables.html#annotating-args-and-kwargs:

> Therefore, the definition:
>
> ```python
> def func(*args: str, **kwargs: int): ...
> ```
>
> means that the function accepts an arbitrary number of positional arguments of type `str` and an arbitrary number of keyword arguments of type `int`.
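For concreteness, a runnable sketch of the quoted rule (the function below is illustrative, not from the PR):

```python
def concat(*args: str, **kwargs: int) -> str:
    # Per the typing spec, `*args: str` means every positional
    # argument is a `str` (args itself is `tuple[str, ...]`), and
    # `**kwargs: int` means every keyword value is an `int`
    # (kwargs itself is `dict[str, int]`).
    return "".join(args) + str(sum(kwargs.values()))
```

This is why `*args: tuple` in the old `warning_once` was wrong: it claimed each individual positional argument was itself a tuple.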
pplantinga left a comment:

Overall, looks good. Just a few questions here.
```diff
 from collections import Counter
 from operator import itemgetter
-from typing import List
+from typing import List, Optional, Union
```
Should `List` be removed here?
There are existing uses of `List` which I have not changed, but we could/should do this in another PR, now that Python >= 3.9 is required.
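As a sketch of that follow-up (assuming Python >= 3.9, where PEP 585 built-in generics are available):

```python
from typing import List, Optional  # legacy spelling, kept here only for comparison

# Since Python 3.9, the built-in `list[str]` replaces `typing.List[str]`.
def first_token(tokens: list[str]) -> Optional[str]:  # was: tokens: List[str]
    return tokens[0] if tokens else None
```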
```diff
 d_ffn=2048,
 dropout=0.1,
-activation=nn.ReLU,
+activation: type = nn.ReLU,
```
Should we use `Type[nn.Module]` here?
Yes, but I can't get it to work. I may be missing something.

```
speechbrain/lobes/models/transformer/Transformer.py:228:28 - error: Argument of type "type[Module]" cannot be assigned to parameter "activation" of type "type[ReLU]" in function "__init__"
  "type[Module]" is not assignable to "type[ReLU]"
  Type "type[Module]" is not assignable to type "type[ReLU]" (reportArgumentType)
```
I messed around with this a bit and still couldn't get it to work. I'm fine with `type` for now until we find something better.
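A torch-free sketch of the annotation under discussion (the `Module`/`ReLU` classes below are stand-ins for `nn.Module`/`nn.ReLU`, and the diagnosis of the Pyright error is an assumption, not confirmed by the PR):

```python
class Module:
    """Stand-in for nn.Module."""

class ReLU(Module):
    """Stand-in for nn.ReLU."""
    def __call__(self, x: float) -> float:
        return max(x, 0.0)

# `type[Module]` accepts any Module subclass. The Pyright error quoted
# above suggests an inner __init__ had its `activation` parameter
# declared (or inferred from its default) as `type[ReLU]`, so passing
# a broader `type[Module]` value into it failed to type-check.
def make_layer(activation: type[Module] = ReLU) -> Module:
    return activation()
```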
```diff
-def forgetMult(self, f, x, hidden):
-    # type: (Tensor, torch.Tensor, Optional[Tensor]) -> torch.Tensor  # noqa F821
+def forgetMult(
+    self, f: torch.Tensor, x: torch.Tensor, hidden: Optional[torch.Tensor]
```
Should this be `hidden: Optional[torch.Tensor] = None`?
No default value was given in the original, and I haven't changed the behaviour; a default may or may not have been the original intention. Note that `Optional[int]` is just `int | None`: `Optional` does not mean that the argument is optional.
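A minimal sketch of that distinction (a simplified stand-in, not the real `forgetMult` arithmetic):

```python
from typing import Optional

def forget_gate(x: float, hidden: Optional[float]) -> float:
    # `Optional[float]` means `float | None`: callers must still pass
    # `hidden` explicitly (possibly as None), mirroring the original
    # forgetMult, whose `hidden` parameter has no default value.
    return x if hidden is None else x + hidden
```

Calling `forget_gate(1.0)` with the second argument omitted is a `TypeError` at runtime, despite the `Optional` annotation.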
…n#2911) Co-authored-by: Rogier van Dalen <r.vandalen@samsung.com>
What does this PR do?
Fix existing type annotations. These are meant to be uncontroversial.
There are a few categories:

- `argument: int = None`, where the type should be `Optional[int]`
- `argument: Optional[int] = 3072`, where the type should be `int`, because `Optional` means "may be `None`"

These were flagged by Pyright (see #2901).
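The two categories above can be sketched side by side (function names and defaults below are illustrative):

```python
from typing import Optional

# Category 1: the default is None, so the annotation must be Optional.
def pad(length: Optional[int] = None) -> int:   # was: length: int = None
    return 0 if length is None else length

# Category 2: None is never a valid value, so Optional is wrong.
def ffn_size(d_ffn: int = 3072) -> int:         # was: d_ffn: Optional[int] = 3072
    return d_ffn
```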
Before submitting
PR review
Reviewer checklist