typing — Support for type hints¶
Added in version 3.5.
Source code: Lib/typing.py
Note
The Python runtime does not enforce function and variable type annotations. They can be used by third party tools such as type checkers, IDEs, linters, etc.
This module provides runtime support for type hints.
Consider the function below:
def surface_area_of_cube(edge_length: float) -> str:
    return f"The surface area of the cube is {6 * edge_length ** 2}."
The function surface_area_of_cube takes an argument expected to be an instance of float, as indicated by the type hint edge_length: float. The function is expected to return an instance of str, as indicated by the -> str hint.
While type hints can be simple classes like float or str, they can also be more complex. The typing module provides a vocabulary of more advanced type hints.
New features are frequently added to the typing module. The typing_extensions package provides backports of these new features to older versions of Python.
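For instance, a project that wants a newer typing feature while still running on older interpreters commonly falls back to typing_extensions. A hedged sketch of that pattern, using TypeIs (added to typing in Python 3.13) purely as an illustration:
import sys

if sys.version_info >= (3, 13):
    from typing import TypeIs
else:
    # The typing_extensions backport provides the same object on older versions.
    from typing_extensions import TypeIs

def is_str(val: object) -> TypeIs[str]:
    return isinstance(val, str)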
See also
- "Typing cheat sheet"
A quick overview of type hints (hosted at the mypy docs)
- "Type System Reference" section of the mypy docs
The Python typing system is standardised via PEPs, so this reference should broadly apply to most Python type checkers. (Some parts may still be specific to mypy.)
- "Static Typing with Python"
Type-checker-agnostic documentation written by the community detailing type system features, useful typing related tools and typing best practices.
Specification for the Python Type System¶
The canonical, up-to-date specification of the Python type system can be found at "Specification for the Python type system".
Type aliases¶
A type alias is defined using the type statement, which creates an instance of TypeAliasType. In this example, Vector and list[float] will be treated equivalently by static type checkers:
type Vector = list[float]

def scale(scalar: float, vector: Vector) -> Vector:
    return [scalar * num for num in vector]

# passes type checking; a list of floats qualifies as a Vector.
new_vector = scale(2.0, [1.0, -4.2, 5.4])
Type aliases are useful for simplifying complex type signatures. For example:
from collections.abc import Sequence

type ConnectionOptions = dict[str, str]
type Address = tuple[str, int]
type Server = tuple[Address, ConnectionOptions]

def broadcast_message(message: str, servers: Sequence[Server]) -> None:
    ...

# The static type checker will treat the previous type signature as
# being exactly equivalent to this one.
def broadcast_message(
    message: str,
    servers: Sequence[tuple[tuple[str, int], dict[str, str]]]
) -> None:
    ...
The type statement is new in Python 3.12. For backwards compatibility, type aliases can also be created through simple assignment:
Vector = list[float]
Or marked with TypeAlias to make it explicit that this is a type alias, not a normal variable assignment:
from typing import TypeAlias

Vector: TypeAlias = list[float]
NewType¶
Use the NewType class to create distinct types:
from typing import NewType
UserId = NewType('UserId', int)
some_id = UserId(524313)
The static type checker will treat the new type as if it were a subclass of the original type. This is useful in helping catch logical errors:
def get_user_name(user_id: UserId) -> str:
    ...

# passes type checking
user_a = get_user_name(UserId(42351))

# fails type checking; an int is not a UserId
user_b = get_user_name(-1)
You may still perform all int operations on a variable of type UserId, but the result will always be of type int. This lets you pass in a UserId wherever an int might be expected, but will prevent you from accidentally creating a UserId in an invalid way:
# 'output' is of type 'int', not 'UserId'
output = UserId(23413) + UserId(54341)
Note that these checks are enforced only by the static type checker. At runtime, the statement Derived = NewType('Derived', Base) will make Derived a callable that immediately returns whatever parameter you pass it. That means the expression Derived(some_value) does not create a new class or introduce overhead beyond that of a regular function call.
More precisely, the expression some_value is Derived(some_value) is always true at runtime.
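A quick interactive check of that runtime behaviour (a sketch reusing the UserId type defined above):
>>> from typing import NewType
>>> UserId = NewType('UserId', int)
>>> raw = 524313
>>> raw is UserId(raw)   # the argument is returned unchanged
True
>>> type(UserId(raw))    # no new class exists at runtime
<class 'int'>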
It is invalid to create a subtype of Derived:
from typing import NewType
UserId = NewType('UserId', int)
# Fails at runtime and does not pass type checking
class AdminUserId(UserId): pass
However, it is possible to create a NewType based on a "derived" NewType:
from typing import NewType

UserId = NewType('UserId', int)

ProUserId = NewType('ProUserId', UserId)
and typechecking for ProUserId works as expected.
See PEP 484 for more details.
Note
Recall that the use of a type alias declares two types to be equivalent to one another. Doing type Alias = Original will make the static type checker treat Alias as being exactly equivalent to Original in all cases. This is useful when you want to simplify complex type signatures.
In contrast, NewType declares one type to be a subtype of another. Doing Derived = NewType('Derived', Original) will make the static type checker treat Derived as a subclass of Original, which means a value of type Original cannot be used in places where a value of type Derived is expected. This is useful when you want to prevent logic errors with minimal runtime cost.
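A short sketch contrasting the two behaviours (Meters and Miles are illustrative names, not part of the module):
from typing import NewType

type Meters = int               # alias: Meters and int are interchangeable
Miles = NewType('Miles', int)   # distinct type: a plain int is not a Miles

def drive(distance: Miles) -> None: ...

m: Meters = 5
drive(m)           # type checker error: int/Meters is not Miles
drive(Miles(5))    # OK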
Added in version 3.5.2.
Changed in version 3.10: NewType is now a class rather than a function. As a result, there is some additional runtime cost when calling NewType over a regular function.
Changed in version 3.11: The performance of calling NewType has been restored to its level in Python 3.9.
Annotating callable objects¶
Functions -- or other callable objects -- can be annotated using collections.abc.Callable or deprecated typing.Callable.
Callable[[int], str] signifies a function that takes a single parameter of type int and returns a str.
For example:
from collections.abc import Callable, Awaitable

def feeder(get_next_item: Callable[[], str]) -> None:
    ...  # Body

def async_query(on_success: Callable[[int], None],
                on_error: Callable[[int, Exception], None]) -> None:
    ...  # Body

async def on_update(value: str) -> None:
    ...  # Body

callback: Callable[[str], Awaitable[None]] = on_update
The subscription syntax must always be used with exactly two values: the argument list and the return type. The argument list must be a list of types, a ParamSpec, Concatenate, or an ellipsis. The return type must be a single type.
If a literal ellipsis ... is given as the argument list, it indicates that a callable with any arbitrary parameter list would be acceptable:
def concat(x: str, y: str) -> str:
    return x + y

x: Callable[..., str]
x = str     # OK
x = concat  # Also OK
Callable cannot express complex signatures such as functions that take a variadic number of arguments, overloaded functions, or functions that have keyword-only parameters. However, these signatures can be expressed by defining a Protocol class with a __call__() method:
from collections.abc import Iterable
from typing import Protocol

class Combiner(Protocol):
    def __call__(self, *vals: bytes, maxlen: int | None = None) -> list[bytes]: ...

def batch_proc(data: Iterable[bytes], cb_results: Combiner) -> bytes:
    for item in data:
        ...

def good_cb(*vals: bytes, maxlen: int | None = None) -> list[bytes]:
    ...
def bad_cb(*vals: bytes, maxitems: int | None) -> list[bytes]:
    ...

batch_proc([], good_cb)  # OK
batch_proc([], bad_cb)   # Error! Argument 2 has incompatible type because of
                         # different name and kind in the callback
Callables which take other callables as arguments may indicate that their parameter types are dependent on each other using ParamSpec. Additionally, if that callable adds or removes arguments from other callables, the Concatenate operator may be used. They take the form Callable[ParamSpecVariable, ReturnType] and Callable[Concatenate[Arg1Type, Arg2Type, ..., ParamSpecVariable], ReturnType] respectively.
Changed in version 3.10: Callable now supports ParamSpec and Concatenate. See PEP 612 for more details.
See also
The documentation for ParamSpec and Concatenate provides examples of usage in Callable.
Génériques¶
Since type information about objects kept in containers cannot be statically inferred in a generic way, many container classes in the standard library support subscription to denote the expected types of container elements.
from collections.abc import Mapping, Sequence

class Employee: ...

# Sequence[Employee] indicates that all elements in the sequence
# must be instances of "Employee".
# Mapping[str, str] indicates that all keys and all values in the mapping
# must be strings.
def notify_by_email(employees: Sequence[Employee],
                    overrides: Mapping[str, str]) -> None: ...
Generic functions and classes can be parameterized by using type parameter syntax:
from collections.abc import Sequence
def first[T](l: Sequence[T]) -> T:  # Function is generic over the TypeVar "T"
    return l[0]
Or by using the TypeVar factory directly:
from collections.abc import Sequence
from typing import TypeVar

U = TypeVar('U')  # Declare type variable "U"

def second(l: Sequence[U]) -> U:  # Function is generic over the TypeVar "U"
    return l[1]
Changed in version 3.12: Syntactic support for generics is new in Python 3.12.
Annotating tuples¶
For most containers in Python, the typing system assumes that all elements in the container will be of the same type. For example:
from collections.abc import Mapping
# Type checker will infer that all elements in ``x`` are meant to be ints
x: list[int] = []
# Type checker error: ``list`` only accepts a single type argument:
y: list[int, str] = [1, 'foo']
# Type checker will infer that all keys in ``z`` are meant to be strings,
# and that all values in ``z`` are meant to be either strings or ints
z: Mapping[str, str | int] = {}
list only accepts one type argument, so a type checker would emit an error on the y assignment above. Similarly, Mapping only accepts two type arguments: the first indicates the type of the keys, and the second indicates the type of the values.
Unlike most other Python containers, however, it is common in idiomatic Python code for tuples to have elements which are not all of the same type. For this reason, tuples are special-cased in Python's typing system. tuple accepts any number of type arguments:
# OK: ``x`` is assigned to a tuple of length 1 where the sole element is an int
x: tuple[int] = (5,)
# OK: ``y`` is assigned to a tuple of length 2;
# element 1 is an int, element 2 is a str
y: tuple[int, str] = (5, "foo")
# Error: the type annotation indicates a tuple of length 1,
# but ``z`` has been assigned to a tuple of length 3
z: tuple[int] = (1, 2, 3)
To denote a tuple which could be of any length, and in which all elements are of the same type T, use tuple[T, ...]. To denote an empty tuple, use tuple[()]. Using plain tuple as an annotation is equivalent to using tuple[Any, ...]:
x: tuple[int, ...] = (1, 2)
# These reassignments are OK: ``tuple[int, ...]`` indicates x can be of any length
x = (1, 2, 3)
x = ()
# This reassignment is an error: all elements in ``x`` must be ints
x = ("foo", "bar")
# ``y`` can only ever be assigned to an empty tuple
y: tuple[()] = ()
z: tuple = ("foo", "bar")
# These reassignments are OK: plain ``tuple`` is equivalent to ``tuple[Any, ...]``
z = (1, 2, 3)
z = ()
The type of class objects¶
A variable annotated with C may accept a value of type C. In contrast, a variable annotated with type[C] (or deprecated typing.Type[C]) may accept values that are classes themselves -- specifically, it will accept the class object of C. For example:
a = 3 # Has type ``int``
b = int # Has type ``type[int]``
c = type(a) # Also has type ``type[int]``
Note that type[C] is covariant:
class User: ...
class ProUser(User): ...
class TeamUser(User): ...

def make_new_user(user_class: type[User]) -> User:
    # ...
    return user_class()

make_new_user(User)      # OK
make_new_user(ProUser)   # Also OK: ``type[ProUser]`` is a subtype of ``type[User]``
make_new_user(TeamUser)  # Still fine
make_new_user(User())    # Error: expected ``type[User]`` but got ``User``
make_new_user(int)       # Error: ``type[int]`` is not a subtype of ``type[User]``
The only legal parameters for type are classes, Any, type variables, and unions of any of these types. For example:
def new_non_team_user(user_class: type[BasicUser | ProUser]): ...
new_non_team_user(BasicUser) # OK
new_non_team_user(ProUser) # OK
new_non_team_user(TeamUser) # Error: ``type[TeamUser]`` is not a subtype
# of ``type[BasicUser | ProUser]``
new_non_team_user(User) # Also an error
type[Any] is equivalent to type, which is the root of Python's metaclass hierarchy.
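A brief illustration of that equivalence (describe is a hypothetical helper, not part of the module): annotating a parameter with type[Any] accepts any class object, just as a bare type annotation would:
from typing import Any

def describe(cls: type[Any]) -> str:
    # Any class object is acceptable here.
    return f"{cls.__name__} has bases {cls.__bases__}"

describe(int)    # OK
describe(dict)   # OK
describe(42)     # Error reported by a type checker: 42 is an instance, not a class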
Annotating generators and coroutines¶
A generator can be annotated using the generic type Generator[YieldType, SendType, ReturnType]. For example:
def echo_round() -> Generator[int, float, str]:
    sent = yield 0
    while sent >= 0:
        sent = yield round(sent)
    return 'Done'
Note that unlike many other generic classes in the standard library, the SendType of Generator behaves contravariantly, not covariantly or invariantly.
The SendType and ReturnType parameters default to None:
def infinite_stream(start: int) -> Generator[int]:
    while True:
        yield start
        start += 1
It is also possible to set these types explicitly:
def infinite_stream(start: int) -> Generator[int, None, None]:
    while True:
        yield start
        start += 1
Simple generators that only ever yield values can also be annotated as having a return type of either Iterable[YieldType] or Iterator[YieldType]:
def infinite_stream(start: int) -> Iterator[int]:
    while True:
        yield start
        start += 1
Async generators are handled in a similar fashion, but don't expect a ReturnType type argument (AsyncGenerator[YieldType, SendType]). The SendType argument defaults to None, so the following definitions are equivalent:
async def infinite_stream(start: int) -> AsyncGenerator[int]:
    while True:
        yield start
        start = await increment(start)

async def infinite_stream(start: int) -> AsyncGenerator[int, None]:
    while True:
        yield start
        start = await increment(start)
As in the synchronous case, AsyncIterable[YieldType] and AsyncIterator[YieldType] are available as well:
async def infinite_stream(start: int) -> AsyncIterator[int]:
    while True:
        yield start
        start = await increment(start)
Coroutines can be annotated using Coroutine[YieldType, SendType, ReturnType]. Generic arguments correspond to those of Generator, for example:
from collections.abc import Coroutine

c: Coroutine[list[str], str, int]  # Some coroutine defined elsewhere
x = c.send('hi')                   # Inferred type of 'x' is list[str]

async def bar() -> None:
    y = await c                    # Inferred type of 'y' is int
User defined generic types¶
A user-defined class can be defined as a generic class.
from logging import Logger

class LoggedVar[T]:
    def __init__(self, value: T, name: str, logger: Logger) -> None:
        self.name = name
        self.logger = logger
        self.value = value

    def set(self, new: T) -> None:
        self.log('Set ' + repr(self.value))
        self.value = new

    def get(self) -> T:
        self.log('Get ' + repr(self.value))
        return self.value

    def log(self, message: str) -> None:
        self.logger.info('%s: %s', self.name, message)
This syntax indicates that the class LoggedVar is parameterised around a single type variable T. This also makes T valid as a type within the class body.
Generic classes implicitly inherit from Generic. For compatibility with Python 3.11 and lower, it is also possible to inherit explicitly from Generic to indicate a generic class:
from typing import TypeVar, Generic

T = TypeVar('T')

class LoggedVar(Generic[T]):
    ...
Generic classes have __class_getitem__() methods, meaning they can be parameterised at runtime (e.g. LoggedVar[int] below):
from collections.abc import Iterable

def zero_all_vars(vars: Iterable[LoggedVar[int]]) -> None:
    for var in vars:
        var.set(0)
A generic type can have any number of type variables. All varieties of TypeVar are permissible as parameters for a generic type:
from typing import TypeVar, Generic, Sequence

class WeirdTrio[T, B: Sequence[bytes], S: (int, str)]:
    ...

OldT = TypeVar('OldT', contravariant=True)
OldB = TypeVar('OldB', bound=Sequence[bytes], covariant=True)
OldS = TypeVar('OldS', int, str)

class OldWeirdTrio(Generic[OldT, OldB, OldS]):
    ...
Each type variable argument to Generic must be distinct. This is thus invalid:
from typing import TypeVar, Generic
...

class Pair[M, M]:  # SyntaxError
    ...

T = TypeVar('T')

class Pair(Generic[T, T]):  # INVALID
    ...
Generic classes can also inherit from other classes:
from collections.abc import Sized

class LinkedList[T](Sized):
    ...
When inheriting from generic classes, some type parameters could be fixed:
from collections.abc import Mapping

class MyDict[T](Mapping[str, T]):
    ...
In this case MyDict has a single parameter, T.
Using a generic class without specifying type parameters assumes Any for each position. In the following example, MyIterable is not generic but implicitly inherits from Iterable[Any]:
from collections.abc import Iterable

class MyIterable(Iterable):  # Same as Iterable[Any]
    ...
User-defined generic type aliases are also supported. Examples:
from collections.abc import Iterable

type Response[S] = Iterable[S] | int

# Return type here is same as Iterable[str] | int
def response(query: str) -> Response[str]:
    ...

type Vec[T] = Iterable[tuple[T, T]]

def inproduct[T: (int, float, complex)](v: Vec[T]) -> T:  # Same as Iterable[tuple[T, T]]
    return sum(x*y for x, y in v)
For backward compatibility, generic type aliases can also be created through a simple assignment:
from collections.abc import Iterable
from typing import TypeVar
S = TypeVar("S")
Response = Iterable[S] | int
Changed in version 3.7: Generic no longer has a custom metaclass.
Changed in version 3.12: Syntactic support for generics and type aliases is new in version 3.12. Previously, generic classes had to explicitly inherit from Generic or contain a type variable in one of their bases.
User-defined generics for parameter expressions are also supported via parameter specification variables in the form [**P]. The behavior is consistent with type variables' described above as parameter specification variables are treated by the typing module as a specialized type variable. The one exception to this is that a list of types can be used to substitute a ParamSpec:
>>> class Z[T, **P]: ... # T is a TypeVar; P is a ParamSpec
...
>>> Z[int, [dict, float]]
__main__.Z[int, [dict, float]]
Classes generic over a ParamSpec can also be created using explicit inheritance from Generic. In this case, ** is not used:
from typing import ParamSpec, Generic

P = ParamSpec('P')

class Z(Generic[P]):
    ...
Another difference between TypeVar and ParamSpec is that a generic with only one parameter specification variable will accept parameter lists in the forms X[[Type1, Type2, ...]] and also X[Type1, Type2, ...] for aesthetic reasons. Internally, the latter is converted to the former, so the following are equivalent:
>>> class X[**P]: ...
...
>>> X[int, str]
__main__.X[[int, str]]
>>> X[[int, str]]
__main__.X[[int, str]]
Note that generics with ParamSpec may not have correct __parameters__ after substitution in some cases because they are intended primarily for static type checking.
Changed in version 3.10: Generic can now be parameterized over parameter expressions. See ParamSpec and PEP 612 for more details.
A user-defined generic class can have ABCs as base classes without a metaclass conflict. Generic metaclasses are not supported. The outcome of parameterizing generics is cached, and most types in the typing module are hashable and comparable for equality.
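A small sketch of those points (Registry and Item are illustrative names; this relies on the Python 3.12+ generic syntax used throughout this section):
from abc import ABC, abstractmethod

class Item: ...

class Registry[T](ABC):    # generic class with an ABC base: no metaclass conflict
    @abstractmethod
    def add(self, value: T) -> None: ...

# Parameterized generics are cached, hashable and comparable for equality.
assert Registry[Item] == Registry[Item]
cache = {Registry[Item]: "registry of items"}   # usable as a dict key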
The Any type¶
A special kind of type is Any. A static type checker will treat every type as being compatible with Any, and Any as being compatible with every type.
This means that it is possible to perform any operation or method call on a value of type Any and assign it to any variable:
from typing import Any

a: Any = None
a = []          # OK
a = 2           # OK

s: str = ''
s = a           # OK

def foo(item: Any) -> int:
    # Passes type checking; 'item' could be any type,
    # and that type might have a 'bar' method
    item.bar()
    ...
Notice that no type checking is performed when assigning a value of type Any to a more precise type. For example, the static type checker did not report an error when assigning a to s even though s was declared to be of type str and receives an int value at runtime!
Furthermore, all functions without a return type or parameter types will implicitly default to using Any:
def legacy_parser(text):
    ...
    return data

# A static type checker will treat the above
# as having the same signature as:
def legacy_parser(text: Any) -> Any:
    ...
    return data
This behavior allows Any to be used as an escape hatch when you need to mix dynamically and statically typed code.
Contrast the behavior of Any with the behavior of object. Similar to Any, every type is a subtype of object. However, unlike Any, the reverse is not true: object is not a subtype of every other type.
That means when the type of a value is object, a type checker will reject almost all operations on it, and assigning it to a variable (or using it as a return value) of a more specialized type is a type error. For example:
def hash_a(item: object) -> int:
    # Fails type checking; an object does not have a 'magic' method.
    item.magic()
    ...

def hash_b(item: Any) -> int:
    # Passes type checking
    item.magic()
    ...

# Passes type checking, since ints and strs are subclasses of object
hash_a(42)
hash_a("foo")

# Passes type checking, since Any is compatible with all types
hash_b(42)
hash_b("foo")
Use object to indicate that a value could be any type in a typesafe manner. Use Any to indicate that a value is dynamically typed.
Nominal vs structural subtyping¶
Initially PEP 484 defined the Python static type system as using nominal subtyping. This means that a class A is allowed where a class B is expected if and only if A is a subclass of B.
This requirement previously also applied to abstract base classes, such as Iterable. The problem with this approach is that a class had to be explicitly marked to support them, which is unpythonic and unlike what one would normally do in idiomatic dynamically typed Python code.
For example, this conforms to PEP 484:
from collections.abc import Sized, Iterable, Iterator

class Bucket(Sized, Iterable[int]):
    ...
    def __len__(self) -> int: ...
    def __iter__(self) -> Iterator[int]: ...
PEP 544 solves this problem by allowing users to write the above code without explicit base classes in the class definition, allowing Bucket to be implicitly considered a subtype of both Sized and Iterable[int] by static type checkers. This is known as structural subtyping (or static duck typing):
from collections.abc import Iterator, Iterable

class Bucket:  # Note: no base classes
    ...
    def __len__(self) -> int: ...
    def __iter__(self) -> Iterator[int]: ...

def collect(items: Iterable[int]) -> int: ...
result = collect(Bucket())  # Passes type check
Moreover, by subclassing a special class Protocol, a user can define new custom protocols to fully enjoy structural subtyping (see examples below).
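As a taste of what such a custom protocol looks like, here is a hedged sketch (SupportsClose is an illustrative name, not a protocol shipped by typing):
from typing import Protocol

class SupportsClose(Protocol):
    def close(self) -> None: ...

class Resource:              # note: no SupportsClose base class
    def close(self) -> None:
        print("closing")

def shutdown(thing: SupportsClose) -> None:
    thing.close()

shutdown(Resource())         # passes type checking: Resource matches the protocol structurally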
Module contents¶
The typing module defines the following classes, functions and decorators.
Special typing primitives¶
Special types¶
These can be used as types in annotations. They do not support subscription using [].
- typing.Any¶
Special type indicating an unconstrained type.
Changed in version 3.11: Any can now be used as a base class. This can be useful for avoiding type checker errors with classes that can duck type anywhere or are highly dynamic.
- typing.AnyStr¶
Definition:
AnyStr = TypeVar('AnyStr', str, bytes)
AnyStr is meant to be used for functions that may accept str or bytes arguments but cannot allow the two to mix.
For example:
def concat(a: AnyStr, b: AnyStr) -> AnyStr:
    return a + b

concat("foo", "bar")    # OK, output has type 'str'
concat(b"foo", b"bar")  # OK, output has type 'bytes'
concat("foo", b"bar")   # Error, cannot mix str and bytes
Note that, despite its name, AnyStr has nothing to do with the Any type, nor does it mean "any string". In particular, AnyStr and str | bytes are different from each other and have different use cases:
# Invalid use of AnyStr:
# The type variable is used only once in the function signature,
# so cannot be "solved" by the type checker
def greet_bad(cond: bool) -> AnyStr:
    return "hi there!" if cond else b"greetings!"

# The better way of annotating this function:
def greet_proper(cond: bool) -> str | bytes:
    return "hi there!" if cond else b"greetings!"
Deprecated since version 3.13, will be removed in version 3.18: Deprecated in favor of the new type parameter syntax. Use class A[T: (str, bytes)]: ... instead of importing AnyStr. See PEP 695 for more details.
In Python 3.16, AnyStr will be removed from typing.__all__, and deprecation warnings will be emitted at runtime when it is accessed or imported from typing. AnyStr will be removed from typing in Python 3.18.
- typing.LiteralString¶
Special type that includes only literal strings.
Any string literal is compatible with LiteralString, as is another LiteralString. However, an object typed as just str is not. A string created by composing LiteralString-typed objects is also acceptable as a LiteralString.
Example:
def run_query(sql: LiteralString) -> None: ...

def caller(arbitrary_string: str, literal_string: LiteralString) -> None:
    run_query("SELECT * FROM students")           # OK
    run_query(literal_string)                     # OK
    run_query("SELECT * FROM " + literal_string)  # OK
    run_query(arbitrary_string)                   # type checker error
    run_query(                                    # type checker error
        f"SELECT * FROM students WHERE name = {arbitrary_string}"
    )
LiteralString is useful for sensitive APIs where arbitrary user-generated strings could generate problems. For example, the two cases above that generate type checker errors could be vulnerable to an SQL injection attack.
See PEP 675 for more details.
Added in version 3.11.
- typing.Never¶
- typing.NoReturn¶
Never and NoReturn represent the bottom type, a type that has no members.
They can be used to indicate that a function never returns, such as
sys.exit()
:
from typing import Never  # or NoReturn

def stop() -> Never:
    raise RuntimeError('no way')
Or to define a function that should never be called, as there are no valid arguments, such as
assert_never()
:
from typing import Never  # or NoReturn

def never_call_me(arg: Never) -> None:
    pass

def int_or_str(arg: int | str) -> None:
    never_call_me(arg)  # type checker error
    match arg:
        case int():
            print("It's an int")
        case str():
            print("It's a str")
        case _:
            never_call_me(arg)  # OK, arg is of type Never (or NoReturn)
Never and NoReturn have the same meaning in the type system and static type checkers treat both equivalently.
Added in version 3.6.2: Added NoReturn.
Added in version 3.11: Added Never.
- typing.Self¶
Special type to represent the current enclosed class.
For example:
from typing import Self, reveal_type

class Foo:
    def return_self(self) -> Self:
        ...
        return self

class SubclassOfFoo(Foo): pass

reveal_type(Foo().return_self())            # Revealed type is "Foo"
reveal_type(SubclassOfFoo().return_self())  # Revealed type is "SubclassOfFoo"
This annotation is semantically equivalent to the following, albeit in a more succinct fashion:
from typing import TypeVar Self = TypeVar("Self", bound="Foo") class Foo: def return_self(self: Self) -> Self: ... return self
In general, if something returns
self
, as in the above examples, you should useSelf
as the return annotation. IfFoo.return_self
was annotated as returning"Foo"
, then the type checker would infer the object returned fromSubclassOfFoo.return_self
as being of typeFoo
rather thanSubclassOfFoo
.Other common use cases include:
classmethod
s that are used as alternative constructors and return instances of thecls
parameter.Annotating an
__enter__()
method which returns self.
You should not use
Self
as the return annotation if the method is not guaranteed to return an instance of a subclass when the class is subclassed:class Eggs: # Self would be an incorrect return annotation here, # as the object returned is always an instance of Eggs, # even in subclasses def returns_eggs(self) -> "Eggs": return Eggs()
See PEP 673 for more details.
Added in version 3.11.
- typing.TypeAlias¶
Special annotation for explicitly declaring a type alias.
For example:
from typing import TypeAlias

Factors: TypeAlias = list[int]
TypeAlias
is particularly useful on older Python versions for annotating aliases that make use of forward references, as it can be hard for type checkers to distinguish these from normal variable assignments:from typing import Generic, TypeAlias, TypeVar T = TypeVar("T") # "Box" does not exist yet, # so we have to use quotes for the forward reference on Python <3.12. # Using ``TypeAlias`` tells the type checker that this is a type alias declaration, # not a variable assignment to a string. BoxOfStrings: TypeAlias = "Box[str]" class Box(Generic[T]): @classmethod def make_box_of_strings(cls) -> BoxOfStrings: ...
See PEP 613 for more details.
Added in version 3.10.
Deprecated since version 3.12:
TypeAlias
is deprecated in favor of thetype
statement, which creates instances ofTypeAliasType
and which natively supports forward references. Note that whileTypeAlias
andTypeAliasType
serve similar purposes and have similar names, they are distinct and the latter is not the type of the former. Removal ofTypeAlias
is not currently planned, but users are encouraged to migrate totype
statements.
Special forms¶
These can be used as types in annotations. They all support subscription using [], but each has a unique syntax.
- typing.Union¶
Union type; Union[X, Y] is equivalent to X | Y and means either X or Y.
To define a union, use e.g. Union[int, str] or the shorthand int | str. Using that shorthand is recommended. Details:
The arguments must be types and there must be at least one.
Unions of unions are flattened, e.g.:
Union[Union[int, str], float] == Union[int, str, float]
Unions of a single argument vanish, e.g.:
Union[int] == int # The constructor actually returns int
Redundant arguments are skipped, e.g.:
Union[int, str, int] == Union[int, str] == int | str
When comparing unions, the argument order is ignored, e.g.:
Union[int, str] == Union[str, int]
You cannot subclass or instantiate a Union.
You cannot write Union[X][Y].
Changed in version 3.7: Don't remove explicit subclasses from unions at runtime.
Changed in version 3.10: Unions can now be written as
X | Y
. See union type expressions.
- typing.Optional¶
Optional[X] is equivalent to X | None (or Union[X, None]).
Note that this is not the same concept as an optional argument, which is one that has a default value. An optional argument with a default value does not require the Optional qualifier on its type annotation just because it is optional. For example:
def foo(arg: int = 0) -> None:
    ...
On the other hand, if an explicit value of None is allowed, the use of Optional is appropriate, whether the argument is optional or not. For example:
def foo(arg: Optional[int] = None) -> None:
    ...
Changed in version 3.10: Optional can now be written as
X | None
. See union type expressions.
- typing.Concatenate¶
Special form for annotating higher-order functions.
Concatenate
can be used in conjunction with Callable andParamSpec
to annotate a higher-order callable which adds, removes, or transforms parameters of another callable. Usage is in the formConcatenate[Arg1Type, Arg2Type, ..., ParamSpecVariable]
.Concatenate
is currently only valid when used as the first argument to a Callable. The last parameter toConcatenate
must be aParamSpec
or ellipsis (...
).For example, to annotate a decorator
with_lock
which provides athreading.Lock
to the decorated function,Concatenate
can be used to indicate thatwith_lock
expects a callable which takes in aLock
as the first argument, and returns a callable with a different type signature. In this case, theParamSpec
indicates that the returned callable's parameter types are dependent on the parameter types of the callable being passed in:from collections.abc import Callable from threading import Lock from typing import Concatenate # Use this lock to ensure that only one thread is executing a function # at any time. my_lock = Lock() def with_lock[**P, R](f: Callable[Concatenate[Lock, P], R]) -> Callable[P, R]: '''A type-safe decorator which provides a lock.''' def inner(*args: P.args, **kwargs: P.kwargs) -> R: # Provide the lock as the first argument. return f(my_lock, *args, **kwargs) return inner @with_lock def sum_threadsafe(lock: Lock, numbers: list[float]) -> float: '''Add a list of numbers together in a thread-safe manner.''' with lock: return sum(numbers) # We don't need to pass in the lock ourselves thanks to the decorator. sum_threadsafe([1.1, 2.2, 3.3])
Added in version 3.10.
See also
PEP 612 -- Parameter Specification Variables (the PEP which introduced
ParamSpec
andConcatenate
)
- typing.Literal¶
Special typing form to define "literal types".
Literal
can be used to indicate to type checkers that the annotated object has a value equivalent to one of the provided literals.
For example:
def validate_simple(data: Any) -> Literal[True]:  # always returns True
    ...

type Mode = Literal['r', 'rb', 'w', 'wb']
def open_helper(file: str, mode: Mode) -> str:
    ...

open_helper('/some/path', 'r')      # Passes type check
open_helper('/other/path', 'typo')  # Error in type checker
Literal[...] cannot be subclassed. At runtime, an arbitrary value is allowed as a type argument to Literal[...], but type checkers may impose restrictions. See PEP 586 for more details about literal types.
Added in version 3.8.
- typing.ClassVar¶
Special typing construct to indicate class variables.
As introduced in PEP 526, a variable annotation wrapped in ClassVar indicates that a given attribute is intended to be used as a class variable and should not be set on instances of that class. Usage:
class Starship:
    stats: ClassVar[dict[str, int]] = {}  # class variable
    damage: int = 10                      # instance variable
ClassVar accepts only types and cannot be further subscribed.
ClassVar is not a class itself, and should not be used with isinstance() or issubclass(). ClassVar does not change Python runtime behavior, but it can be used by third-party type checkers. For example, a type checker might flag the following code as an error:
enterprise_d = Starship(3000)
enterprise_d.stats = {}  # Error, setting class variable on instance
Starship.stats = {}      # This is OK
Added in version 3.5.3.
- typing.Final¶
Special typing construct to indicate final names to type checkers.
Final names cannot be reassigned in any scope. Final names declared in class scopes cannot be overridden in subclasses.
For example:
MAX_SIZE: Final = 9000
MAX_SIZE += 1  # Error reported by type checker

class Connection:
    TIMEOUT: Final[int] = 10

class FastConnector(Connection):
    TIMEOUT = 1  # Error reported by type checker
There is no runtime checking of these properties. See PEP 591 for more details.
Added in version 3.8.
- typing.Required¶
Special typing construct to mark a
TypedDict
key as required.
This is mainly useful for
total=False
TypedDicts. See TypedDict and PEP 655 for more details.
Added in version 3.11.
- typing.NotRequired¶
Special typing construct to mark a
TypedDict
key as potentially missing.
See
TypedDict
and PEP 655 for more details.
Added in version 3.11.
- typing.ReadOnly¶
A special typing construct to mark an item of a
TypedDict
as read-only.
For example:
class Movie(TypedDict):
    title: ReadOnly[str]
    year: int

def mutate_movie(m: Movie) -> None:
    m["year"] = 1999           # allowed
    m["title"] = "The Matrix"  # typechecker error
There is no runtime checking for this property.
See
TypedDict
and PEP 705 for more details.
Added in version 3.13.
- typing.Annotated¶
Special typing form to add context-specific metadata to an annotation.
Add metadata
x
to a given typeT
by using the annotationAnnotated[T, x]
. Metadata added usingAnnotated
can be used by static analysis tools or at runtime. At runtime, the metadata is stored in a__metadata__
attribute.If a library or tool encounters an annotation
Annotated[T, x]
and has no special logic for the metadata, it should ignore the metadata and simply treat the annotation asT
. As such,Annotated
can be useful for code that wants to use annotations for purposes outside Python's static typing system.Using
Annotated[T, x]
as an annotation still allows for static typechecking ofT
, as type checkers will simply ignore the metadatax
. In this way,Annotated
differs from the@no_type_check
decorator, which can also be used for adding annotations outside the scope of the typing system, but completely disables typechecking for a function or class.The responsibility of how to interpret the metadata lies with the tool or library encountering an
Annotated
annotation. A tool or library encountering anAnnotated
type can scan through the metadata elements to determine if they are of interest (e.g., usingisinstance()
).- Annotated[<type>, <metadata>]
Here is an example of how you might use
Annotated
to add metadata to type annotations if you were doing range analysis:@dataclass class ValueRange: lo: int hi: int T1 = Annotated[int, ValueRange(-10, 5)] T2 = Annotated[T1, ValueRange(-20, 3)]
Details of the syntax:
The first argument to
Annotated
must be a valid typeMultiple metadata elements can be supplied (
Annotated
supports variadic arguments):@dataclass class ctype: kind: str Annotated[int, ValueRange(3, 10), ctype("char")]
It is up to the tool consuming the annotations to decide whether the client is allowed to add multiple metadata elements to one annotation and how to merge those annotations.
Annotated
must be subscripted with at least two arguments (Annotated[int]
is not valid)The order of the metadata elements is preserved and matters for equality checks:
assert Annotated[int, ValueRange(3, 10), ctype("char")] != Annotated[ int, ctype("char"), ValueRange(3, 10) ]
Nested
Annotated
types are flattened. The order of the metadata elements starts with the innermost annotation:assert Annotated[Annotated[int, ValueRange(3, 10)], ctype("char")] == Annotated[ int, ValueRange(3, 10), ctype("char") ]
Duplicated metadata elements are not removed:
assert Annotated[int, ValueRange(3, 10)] != Annotated[ int, ValueRange(3, 10), ValueRange(3, 10) ]
Annotated
can be used with nested and generic aliases:@dataclass class MaxLen: value: int type Vec[T] = Annotated[list[tuple[T, T]], MaxLen(10)] # When used in a type annotation, a type checker will treat "V" the same as # ``Annotated[list[tuple[int, int]], MaxLen(10)]``: type V = Vec[int]
Annotated
cannot be used with an unpackedTypeVarTuple
:type Variadic[*Ts] = Annotated[*Ts, Ann1] # NOT valid
This would be equivalent to:
Annotated[T1, T2, T3, ..., Ann1]
where
T1
,T2
, etc. areTypeVars
. This would be invalid: only one type should be passed to Annotated.By default,
get_type_hints()
strips the metadata from annotations. Passinclude_extras=True
to have the metadata preserved:>>> from typing import Annotated, get_type_hints >>> def func(x: Annotated[int, "metadata"]) -> None: pass ... >>> get_type_hints(func) {'x': <class 'int'>, 'return': <class 'NoneType'>} >>> get_type_hints(func, include_extras=True) {'x': typing.Annotated[int, 'metadata'], 'return': <class 'NoneType'>}
At runtime, the metadata associated with an
Annotated
type can be retrieved via the__metadata__
attribute:>>> from typing import Annotated >>> X = Annotated[int, "very", "important", "metadata"] >>> X typing.Annotated[int, 'very', 'important', 'metadata'] >>> X.__metadata__ ('very', 'important', 'metadata')
At runtime, if you want to retrieve the original type wrapped by
Annotated
, use the__origin__
attribute:>>> from typing import Annotated, get_origin >>> Password = Annotated[str, "secret"] >>> Password.__origin__ <class 'str'>
Note that using
get_origin()
will returnAnnotated
itself:>>> get_origin(Password) typing.Annotated
See also
- PEP 593 - Flexible function and variable annotations
The PEP introducing
Annotated
to the standard library.
Added in version 3.9.
- typing.TypeIs¶
Special typing construct for marking user-defined type predicate functions.
TypeIs
can be used to annotate the return type of a user-defined type predicate function.TypeIs
only accepts a single type argument. At runtime, functions marked this way should return a boolean and take at least one positional argument.TypeIs
aims to benefit type narrowing -- a technique used by static type checkers to determine a more precise type of an expression within a program's code flow. Usually type narrowing is done by analyzing conditional code flow and applying the narrowing to a block of code. The conditional expression here is sometimes referred to as a "type predicate":def is_str(val: str | float): # "isinstance" type predicate if isinstance(val, str): # Type of ``val`` is narrowed to ``str`` ... else: # Else, type of ``val`` is narrowed to ``float``. ...
Sometimes it would be convenient to use a user-defined boolean function as a type predicate. Such a function should use
TypeIs[...]
orTypeGuard
as its return type to alert static type checkers to this intention.TypeIs
usually has more intuitive behavior thanTypeGuard
, but it cannot be used when the input and output types are incompatible (e.g.,list[object]
tolist[int]
) or when the function does not returnTrue
for all instances of the narrowed type.Using
-> TypeIs[NarrowedType]
tells the static type checker that for a given function:The return value is a boolean.
If the return value is
True
, the type of its argument is the intersection of the argument's original type andNarrowedType
.If the return value is
False
, the type of its argument is narrowed to excludeNarrowedType
.
For example:
from typing import assert_type, final, TypeIs class Parent: pass class Child(Parent): pass @final class Unrelated: pass def is_parent(val: object) -> TypeIs[Parent]: return isinstance(val, Parent) def run(arg: Child | Unrelated): if is_parent(arg): # Type of ``arg`` is narrowed to the intersection # of ``Parent`` and ``Child``, which is equivalent to # ``Child``. assert_type(arg, Child) else: # Type of ``arg`` is narrowed to exclude ``Parent``, # so only ``Unrelated`` is left. assert_type(arg, Unrelated)
The type inside
TypeIs
must be consistent with the type of the function's argument; if it is not, static type checkers will raise an error. An incorrectly writtenTypeIs
function can lead to unsound behavior in the type system; it is the user's responsibility to write such functions in a type-safe manner.If a
TypeIs
function is a class or instance method, then the type inTypeIs
maps to the type of the second parameter (aftercls
orself
).In short, the form
def foo(arg: TypeA) -> TypeIs[TypeB]: ...
, means that iffoo(arg)
returnsTrue
, thenarg
is an instance ofTypeB
, and if it returnsFalse
, it is not an instance ofTypeB
.TypeIs
also works with type variables. For more information, see PEP 742 (Narrowing types withTypeIs
).
Added in version 3.13.
- typing.TypeGuard¶
Special typing construct for marking user-defined type predicate functions.
Type predicate functions are user-defined functions that return whether their argument is an instance of a particular type.
TypeGuard
works similarly toTypeIs
, but has subtly different effects on type checking behavior (see below).Using
-> TypeGuard
tells the static type checker that for a given function:The return value is a boolean.
If the return value is
True
, the type of its argument is the type insideTypeGuard
.
TypeGuard
also works with type variables. See PEP 647 for more details.
For example:
def is_str_list(val: list[object]) -> TypeGuard[list[str]]: '''Determines whether all objects in the list are strings''' return all(isinstance(x, str) for x in val) def func1(val: list[object]): if is_str_list(val): # Type of ``val`` is narrowed to ``list[str]``. print(" ".join(val)) else: # Type of ``val`` remains as ``list[object]``. print("Not a list of strings!")
TypeIs
andTypeGuard
differ in the following ways:TypeIs
requires the narrowed type to be a subtype of the input type, whileTypeGuard
does not. The main reason is to allow for things like narrowinglist[object]
tolist[str]
even though the latter is not a subtype of the former, sincelist
is invariant.When a
TypeGuard
function returnsTrue
, type checkers narrow the type of the variable to exactly theTypeGuard
type. When aTypeIs
function returnsTrue
, type checkers can infer a more precise type combining the previously known type of the variable with theTypeIs
type. (Technically, this is known as an intersection type.)When a
TypeGuard
function returnsFalse
, type checkers cannot narrow the type of the variable at all. When aTypeIs
function returnsFalse
, type checkers can narrow the type of the variable to exclude theTypeIs
type.
Added in version 3.10.
- typing.Unpack¶
Typing operator to conceptually mark an object as having been unpacked.
For example, using the unpack operator
*
on a type variable tuple is equivalent to usingUnpack
to mark the type variable tuple as having been unpacked:Ts = TypeVarTuple('Ts') tup: tuple[*Ts] # Effectively does: tup: tuple[Unpack[Ts]]
In fact,
Unpack
can be used interchangeably with*
in the context oftyping.TypeVarTuple
andbuiltins.tuple
types. You might seeUnpack
being used explicitly in older versions of Python, where*
couldn't be used in certain places:# In older versions of Python, TypeVarTuple and Unpack # are located in the `typing_extensions` backports package. from typing_extensions import TypeVarTuple, Unpack Ts = TypeVarTuple('Ts') tup: tuple[*Ts] # Syntax error on Python <= 3.10! tup: tuple[Unpack[Ts]] # Semantically equivalent, and backwards-compatible
Unpack
can also be used along withtyping.TypedDict
for typing**kwargs
in a function signature:from typing import TypedDict, Unpack class Movie(TypedDict): name: str year: int # This function expects two keyword arguments - `name` of type `str` # and `year` of type `int`. def foo(**kwargs: Unpack[Movie]): ...
See PEP 692 for more details on using
Unpack
for**kwargs
typing.
Added in version 3.11.
Building generic types and type aliases¶
The following classes should not be used directly as annotations. Their intended purpose is to be building blocks for creating generic types and type aliases.
These objects can be created through special syntax
(type parameter lists and the type
statement).
For compatibility with Python 3.11 and earlier, they can also be created
without the dedicated syntax, as documented below.
- class typing.Generic¶
Abstract base class for generic types.
A generic type is typically declared by adding a list of type parameters after the class name:
class Mapping[KT, VT]: def __getitem__(self, key: KT) -> VT: ... # Etc.
Such a class implicitly inherits from
Generic
. The runtime semantics of this syntax are discussed in the Language Reference.
This class can then be used as follows:
def lookup_name[X, Y](mapping: Mapping[X, Y], key: X, default: Y) -> Y: try: return mapping[key] except KeyError: return default
Here the brackets after the function name indicate a generic function.
For backwards compatibility, generic classes can also be declared by explicitly inheriting from
Generic
. In this case, the type parameters must be declared separately:KT = TypeVar('KT') VT = TypeVar('VT') class Mapping(Generic[KT, VT]): def __getitem__(self, key: KT) -> VT: ... # Etc.
- class typing.TypeVar(name, *constraints, bound=None, covariant=False, contravariant=False, infer_variance=False, default=typing.NoDefault)¶
Type variables.
The preferred way to construct a type variable is via the dedicated syntax for generic functions, generic classes, and generic type aliases:
class Sequence[T]: # T is a TypeVar ...
This syntax can also be used to create bounded and constrained type variables:
class StrSequence[S: str]: # S is a TypeVar with a `str` upper bound; ... # we can say that S is "bounded by `str`" class StrOrBytesSequence[A: (str, bytes)]: # A is a TypeVar constrained to str or bytes ...
However, if desired, reusable type variables can also be constructed manually, like so:
T = TypeVar('T') # Can be anything S = TypeVar('S', bound=str) # Can be any subtype of str A = TypeVar('A', str, bytes) # Must be exactly str or bytes
Type variables exist primarily for the benefit of static type checkers. They serve as the parameters for generic types as well as for generic function and type alias definitions. See
Generic
for more information on generic types. Generic functions work as follows:def repeat[T](x: T, n: int) -> Sequence[T]: """Return a list containing n references to x.""" return [x]*n def print_capitalized[S: str](x: S) -> S: """Print x capitalized, and return x.""" print(x.capitalize()) return x def concatenate[A: (str, bytes)](x: A, y: A) -> A: """Add two strings or bytes objects together.""" return x + y
Note that type variables can be bounded, constrained, or neither, but cannot be both bounded and constrained.
The variance of type variables is inferred by type checkers when they are created through the type parameter syntax or when
infer_variance=True
is passed. Manually created type variables may be explicitly marked covariant or contravariant by passingcovariant=True
orcontravariant=True
. By default, manually created type variables are invariant. See PEP 484 and PEP 695 for more details.Bounded type variables and constrained type variables have different semantics in several important ways. Using a bounded type variable means that the
TypeVar
will be solved using the most specific type possible:x = print_capitalized('a string') reveal_type(x) # revealed type is str class StringSubclass(str): pass y = print_capitalized(StringSubclass('another string')) reveal_type(y) # revealed type is StringSubclass z = print_capitalized(45) # error: int is not a subtype of str
The upper bound of a type variable can be a concrete type, abstract type (ABC or Protocol), or even a union of types:
# Can be anything with an __abs__ method def print_abs[T: SupportsAbs](arg: T) -> None: print("Absolute value:", abs(arg)) U = TypeVar('U', bound=str|bytes) # Can be any subtype of the union str|bytes V = TypeVar('V', bound=SupportsAbs) # Can be anything with an __abs__ method
Using a constrained type variable, however, means that the
TypeVar
can only ever be solved as being exactly one of the constraints given:a = concatenate('one', 'two') reveal_type(a) # revealed type is str b = concatenate(StringSubclass('one'), StringSubclass('two')) reveal_type(b) # revealed type is str, despite StringSubclass being passed in c = concatenate('one', b'two') # error: type variable 'A' can be either str or bytes in a function call, but not both
At runtime,
isinstance(x, T)
will raiseTypeError
.- __name__¶
The name of the type variable.
- __covariant__¶
Whether the type var has been explicitly marked as covariant.
- __contravariant__¶
Whether the type var has been explicitly marked as contravariant.
- __infer_variance__¶
Whether the type variable's variance should be inferred by type checkers.
Added in version 3.12.
- __bound__¶
The upper bound of the type variable, if any.
Changed in version 3.12: For type variables created through type parameter syntax, the bound is evaluated only when the attribute is accessed, not when the type variable is created (see Lazy evaluation).
- evaluate_bound()¶
An evaluate function corresponding to the
__bound__
attribute. When called directly, this method supports only theVALUE
format, which is equivalent to accessing the__bound__
attribute directly, but the method object can be passed toannotationlib.call_evaluate_function()
to evaluate the value in a different format.
Added in version 3.14.
- __constraints__¶
A tuple containing the constraints of the type variable, if any.
Changed in version 3.12: For type variables created through type parameter syntax, the constraints are evaluated only when the attribute is accessed, not when the type variable is created (see Lazy evaluation).
- evaluate_constraints()¶
An evaluate function corresponding to the
__constraints__
attribute. When called directly, this method supports only theVALUE
format, which is equivalent to accessing the__constraints__
attribute directly, but the method object can be passed toannotationlib.call_evaluate_function()
to evaluate the value in a different format.
Added in version 3.14.
- __default__¶
The default value of the type variable, or
typing.NoDefault
if it has no default.
Added in version 3.13.
- evaluate_default()¶
An evaluate function corresponding to the
__default__
attribute. When called directly, this method supports only theVALUE
format, which is equivalent to accessing the__default__
attribute directly, but the method object can be passed toannotationlib.call_evaluate_function()
to evaluate the value in a different format.
Added in version 3.14.
- has_default()¶
Return whether or not the type variable has a default value. This is equivalent to checking whether
__default__
is not thetyping.NoDefault
singleton, except that it does not force evaluation of the lazily evaluated default value.
Added in version 3.13.
Changed in version 3.12: Type variables can now be declared using the type parameter syntax introduced by PEP 695. The
infer_variance
parameter was added.
Changed in version 3.13: Support for default values was added.
- class typing.TypeVarTuple(name, *, default=typing.NoDefault)¶
Type variable tuple. A specialized form of type variable that enables variadic generics.
Type variable tuples can be declared in type parameter lists using a single asterisk (
*
) before the name:def move_first_element_to_last[T, *Ts](tup: tuple[T, *Ts]) -> tuple[*Ts, T]: return (*tup[1:], tup[0])
Or by explicitly invoking the
TypeVarTuple
constructor:T = TypeVar("T") Ts = TypeVarTuple("Ts") def move_first_element_to_last(tup: tuple[T, *Ts]) -> tuple[*Ts, T]: return (*tup[1:], tup[0])
A normal type variable enables parameterization with a single type. A type variable tuple, in contrast, allows parameterization with an arbitrary number of types by acting like an arbitrary number of type variables wrapped in a tuple. For example:
# T is bound to int, Ts is bound to () # Return value is (1,), which has type tuple[int] move_first_element_to_last(tup=(1,)) # T is bound to int, Ts is bound to (str,) # Return value is ('spam', 1), which has type tuple[str, int] move_first_element_to_last(tup=(1, 'spam')) # T is bound to int, Ts is bound to (str, float) # Return value is ('spam', 3.0, 1), which has type tuple[str, float, int] move_first_element_to_last(tup=(1, 'spam', 3.0)) # This fails to type check (and fails at runtime) # because tuple[()] is not compatible with tuple[T, *Ts] # (at least one element is required) move_first_element_to_last(tup=())
Note the use of the unpacking operator
*
intuple[T, *Ts]
. Conceptually, you can think ofTs
as a tuple of type variables(T1, T2, ...)
.tuple[T, *Ts]
would then becometuple[T, *(T1, T2, ...)]
, which is equivalent totuple[T, T1, T2, ...]
. (Note that in older versions of Python, you might see this written usingUnpack
instead, asUnpack[Ts]
.)Type variable tuples must always be unpacked. This helps distinguish type variable tuples from normal type variables:
x: Ts # Not valid x: tuple[Ts] # Not valid x: tuple[*Ts] # The correct way to do it
Type variable tuples can be used in the same contexts as normal type variables. For example, in class definitions, arguments, and return types:
class Array[*Shape]: def __getitem__(self, key: tuple[*Shape]) -> float: ... def __abs__(self) -> "Array[*Shape]": ... def get_shape(self) -> tuple[*Shape]: ...
Type variable tuples can be happily combined with normal type variables:
class Array[DType, *Shape]:  # This is fine
    pass

class Array2[*Shape, DType]:  # This would also be fine
    pass

class Height: ...
class Width: ...

float_array_1d: Array[float, Height] = Array()  # Totally fine
int_array_2d: Array[int, Height, Width] = Array()  # Yup, fine too
However, note that at most one type variable tuple may appear in a single list of type arguments or type parameters:
x: tuple[*Ts, *Ts]            # Not valid

class Array[*Shape, *Shape]:  # Not valid
    pass
Finally, an unpacked type variable tuple can be used as the type annotation of *args:
def call_soon[*Ts](
    callback: Callable[[*Ts], None],
    *args: *Ts
) -> None:
    ...
    callback(*args)
In contrast to non-unpacked annotations of *args - e.g. *args: int, which would specify that all arguments are int - *args: *Ts enables reference to the types of the individual arguments in *args. Here, this allows us to ensure the types of the *args passed to call_soon match the types of the (positional) arguments of callback.
See PEP 646 for more details on type variable tuples.
- __name__¶
The name of the type variable tuple.
- __default__¶
The default value of the type variable tuple, or typing.NoDefault if it has no default.
Ajouté dans la version 3.13.
- evaluate_default()¶
An evaluate function corresponding to the __default__ attribute. When called directly, this method supports only the VALUE format, which is equivalent to accessing the __default__ attribute directly, but the method object can be passed to annotationlib.call_evaluate_function() to evaluate the value in a different format.
Ajouté dans la version 3.14.
- has_default()¶
Return whether or not the type variable tuple has a default value. This is equivalent to checking whether __default__ is not the typing.NoDefault singleton, except that it does not force evaluation of the lazily evaluated default value.
Ajouté dans la version 3.13.
Ajouté dans la version 3.11.
Modifié dans la version 3.12: Type variable tuples can now be declared using the type parameter syntax introduced by PEP 695.
Modifié dans la version 3.13: Support for default values was added.
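A minimal sketch of a type variable tuple with a default (Python 3.13+). Note that the default of a TypeVarTuple is given in unpacked form, e.g. via Unpack (defaults for type parameters were introduced by PEP 696):
from typing import TypeVarTuple, Unpack

Ts = TypeVarTuple("Ts", default=Unpack[tuple[int, str]])

print(Ts.has_default())   # True
print(Ts.__default__)     # e.g. typing.Unpack[tuple[int, str]]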
- class typing.ParamSpec(name, *, bound=None, covariant=False, contravariant=False, default=typing.NoDefault)¶
Parameter specification variable. A specialized version of type variables.
In type parameter lists, parameter specifications can be declared with two asterisks (**):
type IntFunc[**P] = Callable[P, int]
For compatibility with Python 3.11 and earlier, ParamSpec objects can also be created as follows:
P = ParamSpec('P')
Parameter specification variables exist primarily for the benefit of static type checkers. They are used to forward the parameter types of one callable to another callable -- a pattern commonly found in higher order functions and decorators. They are only valid when used in Concatenate, or as the first argument to Callable, or as parameters for user-defined Generics. See Generic for more information on generic types.
For example, to add basic logging to a function, one can create a decorator add_logging to log function calls. The parameter specification variable tells the type checker that the callable passed into the decorator and the new callable returned by it have inter-dependent type parameters:
from collections.abc import Callable
import logging

def add_logging[T, **P](f: Callable[P, T]) -> Callable[P, T]:
    '''A type-safe decorator to add logging to a function.'''
    def inner(*args: P.args, **kwargs: P.kwargs) -> T:
        logging.info(f'{f.__name__} was called')
        return f(*args, **kwargs)
    return inner

@add_logging
def add_two(x: float, y: float) -> float:
    '''Add two numbers together.'''
    return x + y
Without ParamSpec, the simplest way to annotate this previously was to use a TypeVar with upper bound Callable[..., Any]. However this causes two problems:
- The type checker can't type check the inner function because *args and **kwargs have to be typed Any.
- cast() may be required in the body of the add_logging decorator when returning the inner function, or the static type checker must be told to ignore the return inner.
- args¶
- kwargs¶
Since ParamSpec captures both positional and keyword parameters, P.args and P.kwargs can be used to split a ParamSpec into its components. P.args represents the tuple of positional parameters in a given call and should only be used to annotate *args. P.kwargs represents the mapping of keyword parameters to their values in a given call, and should only be used to annotate **kwargs. Both attributes require the annotated parameter to be in scope. At runtime, P.args and P.kwargs are instances respectively of ParamSpecArgs and ParamSpecKwargs.
- __name__¶
The name of the parameter specification.
- __default__¶
The default value of the parameter specification, or typing.NoDefault if it has no default.
Ajouté dans la version 3.13.
- evaluate_default()¶
An evaluate function corresponding to the __default__ attribute. When called directly, this method supports only the VALUE format, which is equivalent to accessing the __default__ attribute directly, but the method object can be passed to annotationlib.call_evaluate_function() to evaluate the value in a different format.
Ajouté dans la version 3.14.
- has_default()¶
Return whether or not the parameter specification has a default value. This is equivalent to checking whether __default__ is not the typing.NoDefault singleton, except that it does not force evaluation of the lazily evaluated default value.
Ajouté dans la version 3.13.
Parameter specification variables created with covariant=True or contravariant=True can be used to declare covariant or contravariant generic types. The bound argument is also accepted, similar to TypeVar. However the actual semantics of these keywords are yet to be decided.
Ajouté dans la version 3.10.
Modifié dans la version 3.12: Parameter specifications can now be declared using the type parameter syntax introduced by PEP 695.
Modifié dans la version 3.13: Support for default values was added.
Note
Only parameter specification variables defined in global scope can be pickled.
Voir aussi
PEP 612 -- Parameter Specification Variables (the PEP which introduced
ParamSpec
andConcatenate
)
- typing.ParamSpecArgs¶
- typing.ParamSpecKwargs¶
Arguments and keyword arguments attributes of a ParamSpec. The P.args attribute of a ParamSpec is an instance of ParamSpecArgs, and P.kwargs is an instance of ParamSpecKwargs. They are intended for runtime introspection and have no special meaning to static type checkers.
Calling get_origin() on either of these objects will return the original ParamSpec:
>>> from typing import ParamSpec, get_origin
>>> P = ParamSpec("P")
>>> get_origin(P.args) is P
True
>>> get_origin(P.kwargs) is P
True
Ajouté dans la version 3.10.
- class typing.TypeAliasType(name, value, *, type_params=())¶
The type of type aliases created through the type statement.
Example:
>>> type Alias = int
>>> type(Alias)
<class 'typing.TypeAliasType'>
Ajouté dans la version 3.12.
- __name__¶
The name of the type alias:
>>> type Alias = int
>>> Alias.__name__
'Alias'
- __module__¶
The module in which the type alias was defined:
>>> type Alias = int
>>> Alias.__module__
'__main__'
- __type_params__¶
The type parameters of the type alias, or an empty tuple if the alias is not generic:
>>> type ListOrSet[T] = list[T] | set[T]
>>> ListOrSet.__type_params__
(T,)
>>> type NotGeneric = int
>>> NotGeneric.__type_params__
()
- __value__¶
The type alias's value. This is lazily evaluated, so names used in the definition of the alias are not resolved until the __value__ attribute is accessed:
>>> type Mutually = Recursive
>>> type Recursive = Mutually
>>> Mutually
Mutually
>>> Recursive
Recursive
>>> Mutually.__value__
Recursive
>>> Recursive.__value__
Mutually
- evaluate_value()¶
An evaluate function corresponding to the __value__ attribute. When called directly, this method supports only the VALUE format, which is equivalent to accessing the __value__ attribute directly, but the method object can be passed to annotationlib.call_evaluate_function() to evaluate the value in a different format:
>>> type Alias = undefined
>>> Alias.__value__
Traceback (most recent call last):
...
NameError: name 'undefined' is not defined
>>> from annotationlib import Format, call_evaluate_function
>>> Alias.evaluate_value(Format.VALUE)
Traceback (most recent call last):
...
NameError: name 'undefined' is not defined
>>> call_evaluate_function(Alias.evaluate_value, Format.FORWARDREF)
ForwardRef('undefined')
Ajouté dans la version 3.14.
Other special directives¶
These functions and classes should not be used directly as annotations. Their intended purpose is to be building blocks for creating and declaring types.
- class typing.NamedTuple¶
Typed version of collections.namedtuple().
Usage:
class Employee(NamedTuple):
    name: str
    id: int
Which is equivalent to:
Employee = collections.namedtuple('Employee', ['name', 'id'])
To give a field a default value, you can assign to it in the class body:
class Employee(NamedTuple):
    name: str
    id: int = 3

employee = Employee('Guido')
assert employee.id == 3
Fields with a default value must come after any fields without a default.
The resulting class has an extra attribute __annotations__ giving a dict that maps the field names to the field types. (The field names are in the _fields attribute and the default values are in the _field_defaults attribute, both of which are part of the namedtuple() API.)
NamedTuple subclasses can also have docstrings and methods:
class Employee(NamedTuple):
    """Represents an employee."""
    name: str
    id: int = 3

    def __repr__(self) -> str:
        return f'<Employee {self.name}, id={self.id}>'
NamedTuple subclasses can be generic:
class Group[T](NamedTuple):
    key: T
    group: list[T]
Backward-compatible usage:
# For creating a generic NamedTuple on Python 3.11
T = TypeVar("T")

class Group(NamedTuple, Generic[T]):
    key: T
    group: list[T]

# A functional syntax is also supported
Employee = NamedTuple('Employee', [('name', str), ('id', int)])
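The runtime introspection attributes mentioned above can be seen in a minimal sketch (the class here mirrors the earlier Employee example):
from typing import NamedTuple

class Employee(NamedTuple):
    name: str
    id: int = 3

print(Employee._fields)          # ('name', 'id')
print(Employee._field_defaults)  # {'id': 3}
print(Employee.__annotations__)  # {'name': <class 'str'>, 'id': <class 'int'>}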
Modifié dans la version 3.6: Ajout de la gestion de la syntaxe d'annotation variable de la PEP 526.
Modifié dans la version 3.6.1: Ajout de la prise en charge des valeurs par défaut, des méthodes et des chaînes de caractères docstrings.
Modifié dans la version 3.8: The _field_types and __annotations__ attributes are now regular dictionaries instead of instances of OrderedDict.
Modifié dans la version 3.9: Removed the _field_types attribute in favor of the more standard __annotations__ attribute, which has the same information.
Modifié dans la version 3.11: Added support for generic namedtuples.
Deprecated since version 3.13, will be removed in version 3.15: The undocumented keyword argument syntax for creating NamedTuple classes (
NT = NamedTuple("NT", x=int)
) is deprecated, and will be disallowed in 3.15. Use the class-based syntax or the functional syntax instead.Deprecated since version 3.13, will be removed in version 3.15: When using the functional syntax to create a NamedTuple class, failing to pass a value to the 'fields' parameter (
NT = NamedTuple("NT")
) is deprecated. PassingNone
to the 'fields' parameter (NT = NamedTuple("NT", None)
) is also deprecated. Both will be disallowed in Python 3.15. To create a NamedTuple class with 0 fields, useclass NT(NamedTuple): pass
orNT = NamedTuple("NT", [])
.
- class typing.NewType(name, tp)¶
Helper class to create low-overhead distinct types.
A
NewType
is considered a distinct type by a typechecker. At runtime, however, calling aNewType
returns its argument unchanged.Utilisation :
UserId = NewType('UserId', int) # Declare the NewType "UserId" first_user = UserId(1) # "UserId" returns the argument unchanged at runtime
- __module__¶
The module in which the new type is defined.
- __name__¶
The name of the new type.
- __supertype__¶
The type that the new type is based on.
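A minimal sketch of these introspection attributes:
from typing import NewType

UserId = NewType('UserId', int)

print(UserId.__name__)       # 'UserId'
print(UserId.__supertype__)  # <class 'int'>
print(UserId.__module__)     # the defining module, e.g. '__main__'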
Ajouté dans la version 3.5.2.
Modifié dans la version 3.10:
NewType
is now a class rather than a function.
- class typing.Protocol(Generic)¶
Base class for protocol classes.
Protocol classes are defined like this:
class Proto(Protocol):
    def meth(self) -> int:
        ...
Such classes are primarily used with static type checkers that recognize structural subtyping (static duck-typing), for example:
class C:
    def meth(self) -> int:
        return 0

def func(x: Proto) -> int:
    return x.meth()

func(C())  # Passes static type check
See PEP 544 for more details. Protocol classes decorated with runtime_checkable() (described later) act as simple-minded runtime protocols that check only the presence of given attributes, ignoring their type signatures.
Protocol classes can be generic, for example:
class GenProto[T](Protocol):
    def meth(self) -> T:
        ...
In code that needs to be compatible with Python 3.11 or older, generic Protocols can be written as follows:
T = TypeVar("T")

class GenProto(Protocol[T]):
    def meth(self) -> T:
        ...
Ajouté dans la version 3.8.
- @typing.runtime_checkable¶
Mark a protocol class as a runtime protocol.
Such a protocol can be used with isinstance() and issubclass(). This raises TypeError when applied to a non-protocol class. This allows a simple-minded structural check, very similar to "one trick ponies" in collections.abc such as Iterable. For example:
@runtime_checkable
class Closable(Protocol):
    def close(self): ...

assert isinstance(open('/some/file'), Closable)

@runtime_checkable
class Named(Protocol):
    name: str

import threading
assert isinstance(threading.Thread(name='Bob'), Named)
Note
runtime_checkable() will check only the presence of the required methods or attributes, not their type signatures or types. For example, ssl.SSLObject is a class, therefore it passes an issubclass() check against Callable. However, the ssl.SSLObject.__init__ method exists only to raise a TypeError with a more informative message, therefore making it impossible to call (instantiate) ssl.SSLObject.
Note
An isinstance() check against a runtime-checkable protocol can be surprisingly slow compared to an isinstance() check against a non-protocol class. Consider using alternative idioms such as hasattr() calls for structural checks in performance-sensitive code.
Ajouté dans la version 3.8.
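A sketch of the alternative idiom suggested in the note above: in hot code paths, a plain attribute check can replace an isinstance() check against a runtime-checkable protocol (the function name here is illustrative only):
def close_if_possible(obj: object) -> None:
    # Structural check without going through the Protocol machinery.
    close = getattr(obj, "close", None)
    if callable(close):
        close()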
Modifié dans la version 3.12: The internal implementation of
isinstance()
checks against runtime-checkable protocols now usesinspect.getattr_static()
to look up attributes (previously,hasattr()
was used). As a result, some objects which used to be considered instances of a runtime-checkable protocol may no longer be considered instances of that protocol on Python 3.12+, and vice versa. Most users are unlikely to be affected by this change.Modifié dans la version 3.12: The members of a runtime-checkable protocol are now considered "frozen" at runtime as soon as the class has been created. Monkey-patching attributes onto a runtime-checkable protocol will still work, but will have no impact on
isinstance()
checks comparing objects to the protocol. See "What's new in Python 3.12" for more details.
- class typing.TypedDict(dict)¶
Special construct to add type hints to a dictionary. At runtime it is a plain dict.
TypedDict declares a dictionary type that expects all of its instances to have a certain set of keys, where each key is associated with a value of a consistent type. This expectation is not checked at runtime but is only enforced by type checkers. Usage:
class Point2D(TypedDict):
    x: int
    y: int
    label: str

a: Point2D = {'x': 1, 'y': 2, 'label': 'good'}  # OK
b: Point2D = {'z': 3, 'label': 'bad'}           # Fails type check

assert Point2D(x=1, y=2, label='first') == dict(x=1, y=2, label='first')
An alternative way to create a TypedDict is by using function-call syntax. The second argument must be a literal dict:
Point2D = TypedDict('Point2D', {'x': int, 'y': int, 'label': str})
This functional syntax allows defining keys which are not valid identifiers, for example because they are keywords or contain hyphens:
# raises SyntaxError
class Point2D(TypedDict):
    in: int    # 'in' is a keyword
    x-y: int   # name with hyphens

# OK, functional syntax
Point2D = TypedDict('Point2D', {'in': int, 'x-y': int})
By default, all keys must be present in a TypedDict. It is possible to mark individual keys as non-required using NotRequired:
class Point2D(TypedDict):
    x: int
    y: int
    label: NotRequired[str]

# Alternative syntax
Point2D = TypedDict('Point2D', {'x': int, 'y': int, 'label': NotRequired[str]})
This means that a Point2D TypedDict can have the label key omitted.
It is also possible to mark all keys as non-required by default by specifying a totality of False:
class Point2D(TypedDict, total=False):
    x: int
    y: int

# Alternative syntax
Point2D = TypedDict('Point2D', {'x': int, 'y': int}, total=False)
This means that a Point2D TypedDict can have any of the keys omitted. A type checker is only expected to support a literal False or True as the value of the total argument. True is the default, and makes all items defined in the class body required.
Individual keys of a total=False TypedDict can be marked as required using Required:
class Point2D(TypedDict, total=False):
    x: Required[int]
    y: Required[int]
    label: str

# Alternative syntax
Point2D = TypedDict('Point2D', {
    'x': Required[int],
    'y': Required[int],
    'label': str
}, total=False)
It is possible for a TypedDict type to inherit from one or more other TypedDict types using the class-based syntax. Usage:
class Point3D(Point2D):
    z: int
Point3D has three items: x, y and z. It is equivalent to this definition:
class Point3D(TypedDict):
    x: int
    y: int
    z: int
A TypedDict cannot inherit from a non-TypedDict class, except for Generic. For example:
class X(TypedDict):
    x: int

class Y(TypedDict):
    y: int

class Z(object): pass  # A non-TypedDict class

class XY(X, Y): pass  # OK

class XZ(X, Z): pass  # raises TypeError
A TypedDict can be generic:
class Group[T](TypedDict):
    key: T
    group: list[T]
To create a generic TypedDict that is compatible with Python 3.11 or lower, inherit from Generic explicitly:
T = TypeVar("T")

class Group(TypedDict, Generic[T]):
    key: T
    group: list[T]
A TypedDict can be introspected via annotations dicts (see Bonnes pratiques concernant les annotations for more information on annotations best practices), __total__, __required_keys__, and __optional_keys__.
- __total__¶
Point2D.__total__ gives the value of the total argument. Example:
>>> from typing import TypedDict
>>> class Point2D(TypedDict): pass
>>> Point2D.__total__
True
>>> class Point2D(TypedDict, total=False): pass
>>> Point2D.__total__
False
>>> class Point3D(Point2D): pass
>>> Point3D.__total__
True
This attribute reflects only the value of the total argument to the current TypedDict class, not whether the class is semantically total. For example, a TypedDict with __total__ set to True may have keys marked with NotRequired, or it may inherit from another TypedDict with total=False. Therefore, it is generally better to use __required_keys__ and __optional_keys__ for introspection.
- __required_keys__¶
Ajouté dans la version 3.9.
- __optional_keys__¶
Point2D.__required_keys__ and Point2D.__optional_keys__ return frozenset objects containing required and non-required keys, respectively.
Keys marked with Required will always appear in __required_keys__ and keys marked with NotRequired will always appear in __optional_keys__.
For backwards compatibility with Python 3.10 and below, it is also possible to use inheritance to declare both required and non-required keys in the same TypedDict. This is done by declaring a TypedDict with one value for the total argument and then inheriting from it in another TypedDict with a different value for total:
>>> class Point2D(TypedDict, total=False):
...     x: int
...     y: int
...
>>> class Point3D(Point2D):
...     z: int
...
>>> Point3D.__required_keys__ == frozenset({'z'})
True
>>> Point3D.__optional_keys__ == frozenset({'x', 'y'})
True
Ajouté dans la version 3.9.
Note
If
from __future__ import annotations
is used or if annotations are given as strings, annotations are not evaluated when theTypedDict
is defined. Therefore, the runtime introspection that__required_keys__
and__optional_keys__
rely on may not work properly, and the values of the attributes may be incorrect.
Support for ReadOnly is reflected in the following attributes:
- __readonly_keys__¶
A frozenset containing the names of all read-only keys. Keys are read-only if they carry the ReadOnly qualifier.
Ajouté dans la version 3.13.
- __mutable_keys__¶
A frozenset containing the names of all mutable keys. Keys are mutable if they do not carry the ReadOnly qualifier.
Ajouté dans la version 3.13.
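A minimal sketch (Python 3.13+) of these two attributes, using the documented ReadOnly qualifier:
from typing import TypedDict, ReadOnly

class Movie(TypedDict):
    title: ReadOnly[str]
    year: int

print(Movie.__readonly_keys__)  # frozenset({'title'})
print(Movie.__mutable_keys__)   # frozenset({'year'})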
See PEP 589 for more examples and detailed rules of using
TypedDict
.Ajouté dans la version 3.8.
Modifié dans la version 3.11: Added support for marking individual keys as
Required
orNotRequired
. See PEP 655.Modifié dans la version 3.11: Added support for generic
TypedDict
s.Modifié dans la version 3.13: Removed support for the keyword-argument method of creating
TypedDict
s.Modifié dans la version 3.13: Support for the
ReadOnly
qualifier was added.Deprecated since version 3.13, will be removed in version 3.15: When using the functional syntax to create a TypedDict class, failing to pass a value to the 'fields' parameter (
TD = TypedDict("TD")
) is deprecated. PassingNone
to the 'fields' parameter (TD = TypedDict("TD", None)
) is also deprecated. Both will be disallowed in Python 3.15. To create a TypedDict class with 0 fields, useclass TD(TypedDict): pass
orTD = TypedDict("TD", {})
.
Protocoles¶
The following protocols are provided by the typing module. All are decorated
with @runtime_checkable
.
- class typing.SupportsAbs¶
Une ABC avec une méthode abstraite
__abs__
qui est covariante dans son type de retour.
- class typing.SupportsBytes¶
Une ABC avec une méthode abstraite
__bytes__
.
- class typing.SupportsComplex¶
Une ABC avec une méthode abstraite
__complex__
.
- class typing.SupportsFloat¶
Une ABC avec une méthode abstraite
__float__
.
- class typing.SupportsIndex¶
Une ABC avec une méthode abstraite
__index__
.Ajouté dans la version 3.8.
- class typing.SupportsInt¶
Une ABC avec une méthode abstraite
__int__
.
- class typing.SupportsRound¶
Une ABC avec une méthode abstraite
__round__
qui est covariante dans son type de retour.
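As an illustration of how these protocols are used in annotations, here is a minimal sketch built on SupportsAbs (the function name is illustrative only):
from typing import SupportsAbs

def magnitude(x: SupportsAbs[float]) -> float:
    # Anything defining __abs__ with a float-compatible return type qualifies.
    return abs(x)

print(magnitude(-3.5))     # 3.5
print(magnitude(3 + 4j))   # 5.0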
ABCs for working with IO¶
Functions and decorators¶
- typing.cast(typ, val)¶
Casts a value to a type.
This returns the value unchanged. To the type checker this signals that the return value has the designated type, but at runtime we intentionally don't check anything (we want this to be as fast as possible).
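A minimal sketch (the function name is illustrative only): cast() only informs the type checker and performs no runtime conversion or check:
from typing import cast
from collections.abc import Sequence

def first_str(items: Sequence[object]) -> str:
    head = items[0]
    # We know from elsewhere that head is a str; tell the checker so.
    return cast(str, head)

print(first_str(["hello", 1, 2]))  # hello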
- typing.assert_type(val, typ, /)¶
Vérifie que val est bien du type typ.
At runtime this does nothing: it returns the first argument unchanged with no checks or side effects, no matter the actual type of the argument.
When a static type checker encounters a call to assert_type(), it emits an error if the value is not of the specified type:
def greet(name: str) -> None:
    assert_type(name, str)  # OK, inferred type of `name` is `str`
    assert_type(name, int)  # type checker error
This can be used to ensure that the type checker's understanding of a script matches the developer's intentions:
def complex_function(arg: object):
    # Do some complex type-narrowing logic,
    # after which we hope the inferred type will be `int`
    ...
    # Test whether the type checker correctly understands our function
    assert_type(arg, int)
Ajouté dans la version 3.11.
- typing.assert_never(arg, /)¶
Demande une confirmation de la part du vérificateur statique de type qu'une ligne de code est inaccessible.
Example:
def int_or_str(arg: int | str) -> None:
    match arg:
        case int():
            print("It's an int")
        case str():
            print("It's a str")
        case _ as unreachable:
            assert_never(unreachable)
Here, the annotations allow the type checker to infer that the last case can never execute, because arg is either an int or a str, and both options are covered by earlier cases.
If a type checker finds that a call to assert_never() is reachable, it will emit an error. For example, if the type annotation for arg was instead int | str | float, the type checker would emit an error pointing out that unreachable is of type float. For a call to assert_never to pass type checking, the inferred type of the argument passed in must be the bottom type, Never, and nothing else.
At runtime, an error is raised if this function is called.
Voir aussi
Unreachable Code and Exhaustiveness Checking has more information about exhaustiveness checking with static typing.
Ajouté dans la version 3.11.
- typing.reveal_type(obj, /)¶
Ask a static type checker to reveal the inferred type of an expression.
When a static type checker encounters a call to this function, it emits a diagnostic with the inferred type of the argument. For example:
x: int = 1
reveal_type(x)  # Revealed type is "builtins.int"
This can be useful when you want to understand how the type checker handles a particular piece of code.
At runtime, this function prints the runtime type of its argument to sys.stderr and returns the argument unchanged (allowing the call to be used within an expression):
x = reveal_type(1)  # prints "Runtime type is int"
print(x)            # prints "1"
Note that the runtime type may be different from (more or less specific than) the type statically inferred by a type checker.
Most type checkers support reveal_type() anywhere, even if the name is not imported from typing. Importing the name from typing, however, allows your code to run without runtime errors and communicates intent more clearly.
Ajouté dans la version 3.11.
- @typing.dataclass_transform(*, eq_default=True, order_default=False, kw_only_default=False, frozen_default=False, field_specifiers=(), **kwargs)¶
Decorator to mark an object as providing dataclass-like behavior.
dataclass_transform may be used to decorate a class, metaclass, or a function that is itself a decorator. The presence of @dataclass_transform() tells a static type checker that the decorated object performs runtime "magic" that transforms a class in a similar way to @dataclasses.dataclass.
Example usage with a decorator function:
@dataclass_transform()
def create_model[T](cls: type[T]) -> type[T]:
    ...
    return cls

@create_model
class CustomerModel:
    id: int
    name: str
On a base class:
@dataclass_transform()
class ModelBase: ...

class CustomerModel(ModelBase):
    id: int
    name: str
On a metaclass:
@dataclass_transform()
class ModelMeta(type): ...

class ModelBase(metaclass=ModelMeta): ...

class CustomerModel(ModelBase):
    id: int
    name: str
The CustomerModel classes defined above will be treated by type checkers similarly to classes created with @dataclasses.dataclass. For example, type checkers will assume these classes have __init__ methods that accept id and name as arguments.
The following boolean arguments are accepted, and type checkers will assume they have the same effect as they would have on the @dataclasses.dataclass decorator: init, eq, order, unsafe_hash, frozen, match_args, kw_only, and slots. It must be possible for the value of these arguments (True or False) to be statically evaluated.
The arguments to the dataclass_transform decorator can be used to customize the default behaviors of the decorated class, metaclass, or function:
- Paramètres:
eq_default (bool) -- Indicates whether the
eq
parameter is assumed to beTrue
orFalse
if it is omitted by the caller. Defaults toTrue
.order_default (bool) -- Indicates whether the
order
parameter is assumed to beTrue
orFalse
if it is omitted by the caller. Defaults toFalse
.kw_only_default (bool) -- Indicates whether the
kw_only
parameter is assumed to beTrue
orFalse
if it is omitted by the caller. Defaults toFalse
.frozen_default (bool) --
Indicates whether the
frozen
parameter is assumed to beTrue
orFalse
if it is omitted by the caller. Defaults toFalse
.Ajouté dans la version 3.12.
field_specifiers (tuple[Callable[..., Any], ...]) -- Specifies a static list of supported classes or functions that describe fields, similar to
dataclasses.field()
. Defaults to()
**kwargs (Any) -- Arbitrary other keyword arguments are accepted in order to allow for possible future extensions.
Type checkers recognize the following optional parameters on field specifiers:
init: Indicates whether the field should be included in the synthesized __init__ method. If unspecified, init defaults to True.
default: Provides the default value for the field.
default_factory: Provides a runtime callback that returns the default value for the field. If neither default nor default_factory are specified, the field is assumed to have no default value and must be provided a value when the class is instantiated.
factory: An alias for the default_factory parameter on field specifiers.
kw_only: Indicates whether the field should be marked as keyword-only. If True, the field will be keyword-only. If False, it will not be keyword-only. If unspecified, the value of the kw_only parameter on the object decorated with dataclass_transform will be used, or if that is unspecified, the value of kw_only_default on dataclass_transform will be used.
alias: Provides an alternative name for the field. This alternative name is used in the synthesized __init__ method.
At runtime, the arguments passed to this decorator are recorded in the __dataclass_transform__ attribute of the decorated object. The decorator has no other runtime effect.
See PEP 681 for more details.
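An illustrative sketch of the field_specifiers mechanism described above. The names model_field and create_model are hypothetical, not part of typing; a real library would return proper field descriptors and synthesize methods at runtime:
from typing import Any, dataclass_transform

def model_field(*, default: Any = ..., kw_only: bool = False,
                alias: str | None = None) -> Any:
    ...  # a real library would build and return a field descriptor here

@dataclass_transform(field_specifiers=(model_field,))
def create_model[T](cls: type[T]) -> type[T]:
    ...  # runtime "magic" (e.g. synthesizing __init__) would happen here
    return cls

@create_model
class CustomerModel:
    id: int = model_field(kw_only=True)
    name: str = model_field(default="N/A", alias="customer_name")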
Ajouté dans la version 3.11.
- @typing.overload¶
Decorator for creating overloaded functions and methods.
The
@overload
decorator allows describing functions and methods that support multiple different combinations of argument types. A series of@overload
-decorated definitions must be followed by exactly one non-@overload
-decorated definition (for the same function/method).@overload
-decorated definitions are for the benefit of the type checker only, since they will be overwritten by the non-@overload
-decorated definition. The non-@overload
-decorated definition, meanwhile, will be used at runtime but should be ignored by a type checker. At runtime, calling an@overload
-decorated function directly will raiseNotImplementedError
.An example of overload that gives a more precise type than can be expressed using a union or a type variable:
@overload
def process(response: None) -> None:
    ...
@overload
def process(response: int) -> tuple[int, str]:
    ...
@overload
def process(response: bytes) -> str:
    ...
def process(response):
    ...  # actual implementation goes here
See PEP 484 for more details and comparison with other typing semantics.
Modifié dans la version 3.11: Les fonctions surchargées peuvent maintenant être inspectées durant l'exécution via
get_overloads()
.
- typing.get_overloads(func)¶
Return a sequence of
@overload
-decorated definitions for func.func is the function object for the implementation of the overloaded function. For example, given the definition of
process
in the documentation for@overload
,get_overloads(process)
will return a sequence of three function objects for the three defined overloads. If called on a function with no overloads,get_overloads()
returns an empty sequence.
get_overloads() can be used for introspecting an overloaded function at runtime.
Ajouté dans la version 3.11.
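A small sketch of that runtime introspection, reusing the process function from the @overload example above:
from typing import get_overloads

for variant in get_overloads(process):
    # Each entry is one of the three @overload-decorated definitions.
    print(variant.__annotations__)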
- typing.clear_overloads()¶
Clear all registered overloads in the internal registry.
This can be used to reclaim the memory used by the registry.
Ajouté dans la version 3.11.
- @typing.final¶
Decorator to indicate final methods and final classes.
Decorating a method with
@final
indicates to a type checker that the method cannot be overridden in a subclass. Decorating a class with@final
indicates that it cannot be subclassed.Par exemple :
class Base:
    @final
    def done(self) -> None:
        ...
class Sub(Base):
    def done(self) -> None:  # Error reported by type checker
        ...

@final
class Leaf:
    ...
class Other(Leaf):  # Error reported by type checker
    ...
Ces propriétés ne sont pas vérifiées à l'exécution. Voir la PEP 591 pour plus de détails.
Ajouté dans la version 3.8.
Modifié dans la version 3.11: The decorator will now attempt to set a
__final__
attribute toTrue
on the decorated object. Thus, a check likeif getattr(obj, "__final__", False)
can be used at runtime to determine whether an objectobj
has been marked as final. If the decorated object does not support setting attributes, the decorator returns the object unchanged without raising an exception.
- @typing.no_type_check¶
Décorateur pour indiquer que les annotations ne sont pas des indications de type.
This works as a class or function decorator. With a class, it applies recursively to all methods and classes defined in that class (but not to methods defined in its superclasses or subclasses). Type checkers will ignore all annotations in a function or class with this decorator.
@no_type_check
mutates the decorated object in place.
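A minimal sketch of the effect at runtime (the function name is illustrative): annotations on a @no_type_check object are ignored, and get_type_hints() returns an empty dict for it instead of trying to evaluate them:
from typing import get_type_hints, no_type_check

@no_type_check
def legacy(data: "not a real type") -> "whatever":
    return data

print(get_type_hints(legacy))  # {}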
- @typing.no_type_check_decorator¶
Décorateur pour donner à un autre décorateur l'effet
no_type_check()
.Ceci enveloppe le décorateur avec quelque chose qui enveloppe la fonction décorée dans
no_type_check()
.Deprecated since version 3.13, will be removed in version 3.15: No type checker ever added support for
@no_type_check_decorator
. It is therefore deprecated, and will be removed in Python 3.15.
- @typing.override¶
Decorator to indicate that a method in a subclass is intended to override a method or attribute in a superclass.
Type checkers should emit an error if a method decorated with
@override
does not, in fact, override anything. This helps prevent bugs that may occur when a base class is changed without an equivalent change to a child class.For example:
class Base:
    def log_status(self) -> None:
        ...

class Sub(Base):
    @override
    def log_status(self) -> None:  # Okay: overrides Base.log_status
        ...

    @override
    def done(self) -> None:  # Error reported by type checker
        ...
There is no runtime checking of this property.
The decorator will attempt to set an
__override__
attribute toTrue
on the decorated object. Thus, a check likeif getattr(obj, "__override__", False)
can be used at runtime to determine whether an objectobj
has been marked as an override. If the decorated object does not support setting attributes, the decorator returns the object unchanged without raising an exception.See PEP 698 for more details.
Ajouté dans la version 3.12.
- @typing.type_check_only¶
Decorator to mark a class or function as unavailable at runtime.
Ce décorateur n'est pas disponible à l'exécution. Il est principalement destiné à marquer les classes qui sont définies dans des fichiers séparés d'annotations de type (type stub file, en anglais) si une implémentation renvoie une instance d'une classe privée :
@type_check_only
class Response:  # private or not available at runtime
    code: int
    def get_header(self, name: str) -> str: ...

def fetch_response() -> Response: ...
Notez qu'il n'est pas recommandé de renvoyer les instances des classes privées. Il est généralement préférable de rendre ces classes publiques.
Utilitaires d'introspection¶
- typing.get_type_hints(obj, globalns=None, localns=None, include_extras=False)¶
Renvoie un dictionnaire contenant des annotations de type pour une fonction, une méthode, un module ou un objet de classe.
This is often the same as
obj.__annotations__
, but this function makes the following changes to the annotations dictionary:Forward references encoded as string literals or
ForwardRef
objects are handled by evaluating them in globalns, localns, and (where applicable) obj's type parameter namespace. If globalns or localns is not given, appropriate namespace dictionaries are inferred from obj.None
is replaced withtypes.NoneType
.If
@no_type_check
has been applied to obj, an empty dictionary is returned.If obj is a class
C
, the function returns a dictionary that merges annotations fromC
's base classes with those onC
directly. This is done by traversingC.__mro__
and iteratively combining__annotations__
dictionaries. Annotations on classes appearing earlier in the method resolution order always take precedence over annotations on classes appearing later in the method resolution order.The function recursively replaces all occurrences of
Annotated[T, ...]
withT
, unless include_extras is set toTrue
(seeAnnotated
for more information).
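A minimal sketch of the string-resolution and base-class merging behaviour described above:
from typing import get_type_hints

class Base:
    x: "int"          # a forward reference encoded as a string literal

class Derived(Base):
    y: str

print(get_type_hints(Derived))
# {'x': <class 'int'>, 'y': <class 'str'>}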
See also
inspect.get_annotations()
, a lower-level function that returns annotations more directly.Note
If any forward references in the annotations of obj are not resolvable or are not valid Python code, this function will raise an exception such as
NameError
. For example, this can happen with imported type aliases that include forward references, or with names imported underif TYPE_CHECKING
.Modifié dans la version 3.9: Added
include_extras
parameter as part of PEP 593. See the documentation onAnnotated
for more information.Modifié dans la version 3.11: Avant,
Optional[t]
était ajouté pour les annotations de fonctions et de méthodes dans le cas où une valeur par défaut était égal àNone
. Maintenant, les annotations sont renvoyées inchangées.
- typing.get_origin(tp)¶
Get the unsubscripted version of a type: for a typing object of the form
X[Y, Z, ...]
returnX
.If
X
is a typing-module alias for a builtin orcollections
class, it will be normalized to the original class. IfX
is an instance ofParamSpecArgs
orParamSpecKwargs
, return the underlyingParamSpec
. ReturnNone
for unsupported objects.Examples:
assert get_origin(str) is None
assert get_origin(Dict[str, int]) is dict
assert get_origin(Union[int, str]) is Union
assert get_origin(Annotated[str, "metadata"]) is Annotated

P = ParamSpec('P')
assert get_origin(P.args) is P
assert get_origin(P.kwargs) is P
Ajouté dans la version 3.8.
- typing.get_args(tp)¶
Get type arguments with all substitutions performed: for a typing object of the form
X[Y, Z, ...]
return(Y, Z, ...)
.If
X
is a union orLiteral
contained in another generic type, the order of(Y, Z, ...)
may be different from the order of the original arguments[Y, Z, ...]
due to type caching. Return()
for unsupported objects.Examples:
assert get_args(int) == ()
assert get_args(Dict[int, str]) == (int, str)
assert get_args(Union[int, str]) == (int, str)
Ajouté dans la version 3.8.
- typing.get_protocol_members(tp)¶
Return the set of members defined in a
Protocol
>>> from typing import Protocol, get_protocol_members
>>> class P(Protocol):
...     def a(self) -> str: ...
...     b: int
>>> get_protocol_members(P) == frozenset({'a', 'b'})
True
Raise
TypeError
for arguments that are not Protocols.Ajouté dans la version 3.13.
- typing.is_protocol(tp)¶
Determine if a type is a
Protocol
.Par exemple :
class P(Protocol):
    def a(self) -> str: ...
    b: int

is_protocol(P)    # => True
is_protocol(int)  # => False
Ajouté dans la version 3.13.
- typing.is_typeddict(tp)¶
Vérifier si un type est un
TypedDict
.For example:
class Film(TypedDict):
    title: str
    year: int

assert is_typeddict(Film)
assert not is_typeddict(list | str)

# TypedDict is a factory for creating typed dicts,
# not a typed dict itself
assert not is_typeddict(TypedDict)
Ajouté dans la version 3.10.
- class typing.ForwardRef¶
Class used for internal typing representation of string forward references.
For example,
List["SomeClass"]
is implicitly transformed intoList[ForwardRef("SomeClass")]
.ForwardRef
should not be instantiated by a user, but may be used by introspection tools.Note
Les types PEP 585 tels que
list["SomeClass"]
ne seront pas implicitement transformés enlist[ForwardRef("SomeClass")]
et ne seront donc pas automatiquement résolus enlist[SomeClass]
.Ajouté dans la version 3.7.4.
Modifié dans la version 3.14: This is now an alias for
annotationlib.ForwardRef
.
- typing.evaluate_forward_ref(forward_ref, *, owner=None, globals=None, locals=None, type_params=None, format=annotationlib.Format.VALUE)¶
Evaluate an
annotationlib.ForwardRef
as a type hint.This is similar to calling
annotationlib.ForwardRef.evaluate()
, but unlike that method,evaluate_forward_ref()
also:Recursively evaluates forward references nested within the type hint.
Raises
TypeError
when it encounters certain objects that are not valid type hints.Replaces type hints that evaluate to
None
withtypes.NoneType
.Supports the
FORWARDREF
andSTRING
formats.
forward_ref must be an instance of
ForwardRef
. owner, if given, should be the object that holds the annotations that the forward reference derived from, such as a module, class object, or function. It is used to infer the namespaces to use for looking up names. globals and locals can also be explicitly given to provide the global and local namespaces. type_params is a tuple of type parameters that are in scope when evaluating the forward reference. This parameter must be provided (though it may be an empty tuple) if owner is not given and the forward reference does not already have an owner set. format specifies the format of the annotation and is a member of theannotationlib.Format
enum.Ajouté dans la version 3.14.
- typing.NoDefault¶
A sentinel object used to indicate that a type parameter has no default value. For example:
>>> T = TypeVar("T")
>>> T.__default__ is typing.NoDefault
True
>>> S = TypeVar("S", default=None)
>>> S.__default__ is None
True
Ajouté dans la version 3.13.
Constante¶
- typing.TYPE_CHECKING¶
A special constant that is assumed to be
True
by 3rd party static type checkers. It isFalse
at runtime.Utilisation :
if TYPE_CHECKING:
    import expensive_mod

def fun(arg: 'expensive_mod.SomeType') -> None:
    local_var: expensive_mod.AnotherType = other_fun()
The first type annotation must be enclosed in quotes, making it a "forward reference", to hide the
expensive_mod
reference from the interpreter runtime. Type annotations for local variables are not evaluated, so the second annotation does not need to be enclosed in quotes.Note
Si
from __future__ import annotations
est utilisé, les annotations ne sont pas évaluées au moment de la définition de fonction. Elles sont alors stockées comme des chaînes de caractères dans__annotations__
, ce qui rend inutile l'utilisation de guillemets autour de l'annotation (Voir PEP 563).Ajouté dans la version 3.5.2.
Deprecated aliases¶
This module defines several deprecated aliases to pre-existing
standard library classes. These were originally included in the typing
module in order to support parameterizing these generic classes using []
.
However, the aliases became redundant in Python 3.9 when the
corresponding pre-existing classes were enhanced to support []
(see
PEP 585).
The redundant types are deprecated as of Python 3.9. However, while the aliases may be removed at some point, removal of these aliases is not currently planned. As such, no deprecation warnings are currently issued by the interpreter for these aliases.
If at some point it is decided to remove these deprecated aliases, a deprecation warning will be issued by the interpreter for at least two releases prior to removal. The aliases are guaranteed to remain in the typing module without deprecation warnings until at least Python 3.14.
Type checkers are encouraged to flag uses of the deprecated types if the program they are checking targets a minimum Python version of 3.9 or newer.
Aliases to built-in types¶
- class typing.Dict(dict, MutableMapping[KT, VT])¶
Deprecated alias to
dict
.Note that to annotate arguments, it is preferred to use an abstract collection type such as
Mapping
rather than to usedict
ortyping.Dict
.Obsolète depuis la version 3.9:
builtins.dict
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.List(list, MutableSequence[T])¶
Deprecated alias to
list
.Note that to annotate arguments, it is preferred to use an abstract collection type such as
Sequence
orIterable
rather than to uselist
ortyping.List
.Obsolète depuis la version 3.9:
builtins.list
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Set(set, MutableSet[T])¶
Deprecated alias to
builtins.set
.Note that to annotate arguments, it is preferred to use an abstract collection type such as
collections.abc.Set
rather than to useset
ortyping.Set
.Obsolète depuis la version 3.9:
builtins.set
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.FrozenSet(frozenset, AbstractSet[T_co])¶
Deprecated alias to
builtins.frozenset
.Obsolète depuis la version 3.9:
builtins.frozenset
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- typing.Tuple¶
Deprecated alias for
tuple
.tuple
andTuple
are special-cased in the type system; see Annotating tuples for more details.Obsolète depuis la version 3.9:
builtins.tuple
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Type(Generic[CT_co])¶
Deprecated alias to
type
.See The type of class objects for details on using
type
ortyping.Type
in type annotations.Ajouté dans la version 3.5.2.
Obsolète depuis la version 3.9:
builtins.type
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
Aliases to types in collections
¶
- class typing.DefaultDict(collections.defaultdict, MutableMapping[KT, VT])¶
Deprecated alias to
collections.defaultdict
.Ajouté dans la version 3.5.2.
Obsolète depuis la version 3.9:
collections.defaultdict
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.OrderedDict(collections.OrderedDict, MutableMapping[KT, VT])¶
Deprecated alias to
collections.OrderedDict
.Ajouté dans la version 3.7.2.
Obsolète depuis la version 3.9:
collections.OrderedDict
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.ChainMap(collections.ChainMap, MutableMapping[KT, VT])¶
Deprecated alias to
collections.ChainMap
.Ajouté dans la version 3.6.1.
Obsolète depuis la version 3.9:
collections.ChainMap
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Counter(collections.Counter, Dict[T, int])¶
Deprecated alias to
collections.Counter
.Ajouté dans la version 3.6.1.
Obsolète depuis la version 3.9:
collections.Counter
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Deque(deque, MutableSequence[T])¶
Deprecated alias to
collections.deque
.Ajouté dans la version 3.6.1.
Obsolète depuis la version 3.9:
collections.deque
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
Aliases to other concrete types¶
- class typing.Pattern¶
- class typing.Match¶
Deprecated aliases corresponding to the return types from
re.compile()
andre.match()
.These types (and the corresponding functions) are generic over
AnyStr
.Pattern
can be specialised asPattern[str]
orPattern[bytes]
;Match
can be specialised asMatch[str]
orMatch[bytes]
.Obsolète depuis la version 3.9: Classes
Pattern
andMatch
fromre
now support[]
. See PEP 585 and Type Alias générique.
- class typing.Text¶
Deprecated alias for
str
.Text
is provided to supply a forward compatible path for Python 2 code: in Python 2,Text
is an alias forunicode
.Utilisez
Text
pour indiquer qu'une valeur doit contenir une chaîne Unicode d'une manière compatible avec Python 2 et Python 3 :def add_unicode_checkmark(text: Text) -> Text: return text + u' \u2713'
Ajouté dans la version 3.5.2.
Obsolète depuis la version 3.11: Python 2 is no longer supported, and most type checkers also no longer support type checking Python 2 code. Removal of the alias is not currently planned, but users are encouraged to use
str
instead ofText
.
Aliases to container ABCs in collections.abc
¶
- class typing.AbstractSet(Collection[T_co])¶
Deprecated alias to
collections.abc.Set
.Obsolète depuis la version 3.9:
collections.abc.Set
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Collection(Sized, Iterable[T_co], Container[T_co])¶
Deprecated alias to
collections.abc.Collection
.Ajouté dans la version 3.6.
Obsolète depuis la version 3.9:
collections.abc.Collection
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Container(Generic[T_co])¶
Deprecated alias to
collections.abc.Container
.Obsolète depuis la version 3.9:
collections.abc.Container
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.ItemsView(MappingView, AbstractSet[tuple[KT_co, VT_co]])¶
Deprecated alias to
collections.abc.ItemsView
.Obsolète depuis la version 3.9:
collections.abc.ItemsView
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.KeysView(MappingView, AbstractSet[KT_co])¶
Deprecated alias to
collections.abc.KeysView
.Obsolète depuis la version 3.9:
collections.abc.KeysView
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Mapping(Collection[KT], Generic[KT, VT_co])¶
Deprecated alias to
collections.abc.Mapping
.Obsolète depuis la version 3.9:
collections.abc.Mapping
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.MappingView(Sized)¶
Deprecated alias to
collections.abc.MappingView
.Obsolète depuis la version 3.9:
collections.abc.MappingView
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.MutableMapping(Mapping[KT, VT])¶
Deprecated alias to
collections.abc.MutableMapping
.Obsolète depuis la version 3.9:
collections.abc.MutableMapping
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.MutableSequence(Sequence[T])¶
Deprecated alias to
collections.abc.MutableSequence
.Obsolète depuis la version 3.9:
collections.abc.MutableSequence
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.MutableSet(AbstractSet[T])¶
Deprecated alias to
collections.abc.MutableSet
.Obsolète depuis la version 3.9:
collections.abc.MutableSet
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Sequence(Reversible[T_co], Collection[T_co])¶
Deprecated alias to
collections.abc.Sequence
.Obsolète depuis la version 3.9:
collections.abc.Sequence
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.ValuesView(MappingView, Collection[_VT_co])¶
Deprecated alias to
collections.abc.ValuesView
.Obsolète depuis la version 3.9:
collections.abc.ValuesView
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
Aliases to asynchronous ABCs in collections.abc
¶
- class typing.Coroutine(Awaitable[ReturnType], Generic[YieldType, SendType, ReturnType])¶
Deprecated alias to
collections.abc.Coroutine
.See Annotating generators and coroutines for details on using
collections.abc.Coroutine
andtyping.Coroutine
in type annotations.Ajouté dans la version 3.5.3.
Obsolète depuis la version 3.9:
collections.abc.Coroutine
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.AsyncGenerator(AsyncIterator[YieldType], Generic[YieldType, SendType])¶
Deprecated alias to
collections.abc.AsyncGenerator
.See Annotating generators and coroutines for details on using
collections.abc.AsyncGenerator
andtyping.AsyncGenerator
in type annotations.Ajouté dans la version 3.6.1.
Obsolète depuis la version 3.9:
collections.abc.AsyncGenerator
now supports subscripting ([]
). See PEP 585 and Type Alias générique.Modifié dans la version 3.13: The
SendType
parameter now has a default.
- class typing.AsyncIterable(Generic[T_co])¶
Deprecated alias to
collections.abc.AsyncIterable
.Ajouté dans la version 3.5.2.
Obsolète depuis la version 3.9:
collections.abc.AsyncIterable
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.AsyncIterator(AsyncIterable[T_co])¶
Deprecated alias to
collections.abc.AsyncIterator
.Ajouté dans la version 3.5.2.
Obsolète depuis la version 3.9:
collections.abc.AsyncIterator
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Awaitable(Generic[T_co])¶
Deprecated alias to
collections.abc.Awaitable
.Ajouté dans la version 3.5.2.
Obsolète depuis la version 3.9:
collections.abc.Awaitable
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
Aliases to other ABCs in collections.abc
¶
- class typing.Iterable(Generic[T_co])¶
Deprecated alias to
collections.abc.Iterable
.Obsolète depuis la version 3.9:
collections.abc.Iterable
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Iterator(Iterable[T_co])¶
Deprecated alias to
collections.abc.Iterator
.Obsolète depuis la version 3.9:
collections.abc.Iterator
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- typing.Callable¶
Deprecated alias to
collections.abc.Callable
.See Annotating callable objects for details on how to use
collections.abc.Callable
andtyping.Callable
in type annotations.Obsolète depuis la version 3.9:
collections.abc.Callable
now supports subscripting ([]
). See PEP 585 and Type Alias générique.Modifié dans la version 3.10:
Callable
prend désormais en chargeParamSpec
etConcatenate
. Voir PEP 612 pour plus de détails.
- class typing.Generator(Iterator[YieldType], Generic[YieldType, SendType, ReturnType])¶
Deprecated alias to
collections.abc.Generator
.See Annotating generators and coroutines for details on using
collections.abc.Generator
andtyping.Generator
in type annotations.Obsolète depuis la version 3.9:
collections.abc.Generator
now supports subscripting ([]
). See PEP 585 and Type Alias générique.Modifié dans la version 3.13: Default values for the send and return types were added.
- class typing.Hashable¶
Deprecated alias to
collections.abc.Hashable
.Obsolète depuis la version 3.12: Use
collections.abc.Hashable
directly instead.
- class typing.Reversible(Iterable[T_co])¶
Deprecated alias to
collections.abc.Reversible
.Obsolète depuis la version 3.9:
collections.abc.Reversible
now supports subscripting ([]
). See PEP 585 and Type Alias générique.
- class typing.Sized¶
Deprecated alias to
collections.abc.Sized
.Obsolète depuis la version 3.12: Use
collections.abc.Sized
directly instead.
Aliases to contextlib
ABCs¶
- class typing.ContextManager(Generic[T_co, ExitT_co])¶
Deprecated alias to
contextlib.AbstractContextManager
.The first type parameter,
T_co
, represents the type returned by the__enter__()
method. The optional second type parameter,ExitT_co
, which defaults tobool | None
, represents the type returned by the__exit__()
method.Ajouté dans la version 3.5.4.
Obsolète depuis la version 3.9:
contextlib.AbstractContextManager
now supports subscripting ([]
). See PEP 585 and Type Alias générique.Modifié dans la version 3.13: Added the optional second type parameter,
ExitT_co
.
- class typing.AsyncContextManager(Generic[T_co, AExitT_co])¶
Deprecated alias to
contextlib.AbstractAsyncContextManager
.The first type parameter,
T_co
, represents the type returned by the__aenter__()
method. The optional second type parameter,AExitT_co
, which defaults tobool | None
, represents the type returned by the__aexit__()
method.Ajouté dans la version 3.6.2.
Obsolète depuis la version 3.9:
contextlib.AbstractAsyncContextManager
now supports subscripting ([]
). See PEP 585 and Type Alias générique.Modifié dans la version 3.13: Added the optional second type parameter,
AExitT_co
.
Étapes d'Obsolescence des Fonctionnalités Majeures¶
Certaines fonctionnalités dans typing
sont obsolètes et peuvent être supprimées dans une future version de Python. Le tableau suivant résume les principales dépréciations. Celui-ci peut changer et toutes les dépréciations ne sont pas listées.
Fonctionnalité | Obsolète en | Suppression prévue | PEP/issue
---|---|---|---
Versions de typage des collections standards | 3.9 | Undecided (see Deprecated aliases for more information) | PEP 585
typing.Text | 3.11 | Non défini |
typing.Hashable et typing.Sized | 3.12 | Non défini |
typing.TypeAlias | 3.12 | Non défini |
@typing.no_type_check_decorator | 3.13 | 3.15 |
typing.AnyStr | 3.13 | 3.18 |