functools
— Higher-order functions and operations on callable objects

Source code: Lib/functools.py

The functools module is for higher-order functions: functions that act on or return other functions. In general, any callable object can be treated as a function for the purposes of this module.

The functools module defines the following functions:
- @functools.cache(user_function)

Simple lightweight unbounded function cache. Sometimes called “memoize”.

Returns the same as lru_cache(maxsize=None), creating a thin wrapper around a dictionary lookup for the function arguments. Because it never needs to evict old values, this is smaller and faster than lru_cache() with a size limit.

For example:
@cache
def factorial(n):
    return n * factorial(n-1) if n else 1

>>> factorial(10)      # no previously cached result, makes 11 recursive calls
3628800
>>> factorial(5)       # just looks up cached value result
120
>>> factorial(12)      # makes two new recursive calls, the other 10 are cached
479001600
New in version 3.9.
- @functools.cached_property(func)

Transform a method of a class into a property whose value is computed once and then cached as a normal attribute for the life of the instance. Similar to property(), with the addition of caching. Useful for expensive computed properties of instances that are otherwise effectively immutable.

Example:
class DataSet:

    def __init__(self, sequence_of_numbers):
        self._data = tuple(sequence_of_numbers)

    @cached_property
    def stdev(self):
        return statistics.stdev(self._data)
The mechanics of cached_property() are somewhat different from property(). A regular property blocks attribute writes unless a setter is defined. In contrast, a cached_property allows writes.

The cached_property decorator only runs on lookups and only when an attribute of the same name doesn't exist. When it does run, the cached_property writes to the attribute with the same name. Subsequent attribute reads and writes take precedence over the cached_property method and it works like a normal attribute.
The cached value can be cleared by deleting the attribute. This allows the cached_property method to run again.
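For instance, a minimal sketch using the DataSet class above (the sample data and truncated output are illustrative only) of clearing the cached value by deleting the attribute:

>>> ds = DataSet([1.5, 2.5, 2.5, 2.75, 3.25, 4.75])
>>> ds.stdev              # computed on first access, then cached
1.081...
>>> ds.stdev              # subsequent reads return the cached attribute
1.081...
>>> del ds.stdev          # deleting the attribute clears the cache
>>> ds.stdev              # the property method runs again
1.081...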
Note, this decorator interferes with the operation of PEP 412 key-sharing dictionaries. This means that instance dictionaries can take more space than usual.
Also, this decorator requires that the __dict__ attribute on each instance be a mutable mapping. This means it will not work with some types, such as metaclasses (since the __dict__ attributes on type instances are read-only proxies for the class namespace), and those that specify __slots__ without including __dict__ as one of the defined slots (as such classes don't provide a __dict__ attribute at all).

If a mutable mapping is not available or if space-efficient key sharing is desired, an effect similar to cached_property() can be achieved by stacking property() on top of cache():

class DataSet:
    def __init__(self, sequence_of_numbers):
        self._data = sequence_of_numbers

    @property
    @cache
    def stdev(self):
        return statistics.stdev(self._data)
New in version 3.8.
- functools.cmp_to_key(func)

Transform an old-style comparison function to a key function. Used with tools that accept key functions (such as sorted(), min(), max(), heapq.nlargest(), heapq.nsmallest(), itertools.groupby()). This function is primarily used as a transition tool for programs being converted from Python 2, which supported the use of comparison functions.

A comparison function is any callable that accepts two arguments, compares them, and returns a negative number for less-than, zero for equality, or a positive number for greater-than. A key function is a callable that accepts one argument and returns another value to be used as the sort key.
Example:
sorted(iterable, key=cmp_to_key(locale.strcoll)) # locale-aware sort order
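As a further illustration (a hedged sketch; the helper name reverse_numeric is hypothetical), a hand-written old-style comparison function can be adapted the same way:

from functools import cmp_to_key

def reverse_numeric(x, y):
    # old-style comparison: negative, zero, or positive result
    return y - x

sorted([5, 2, 4, 1, 3], key=cmp_to_key(reverse_numeric))
# [5, 4, 3, 2, 1]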
For sorting examples and a brief sorting tutorial, see Sorting HOW TO.
New in version 3.2.
- @functools.lru_cache(user_function)
- @functools.lru_cache(maxsize=128, typed=False)

Decorator to wrap a function with a memoizing callable that saves up to the maxsize most recent calls. It can save time when an expensive or I/O bound function is periodically called with the same arguments.
Since a dictionary is used to cache results, the positional and keyword arguments to the function must be hashable.
Distinct argument patterns may be considered to be distinct calls with separate cache entries. For example, f(a=1, b=2) and f(b=2, a=1) differ in their keyword argument order and may have two separate cache entries.
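As a hedged illustration of that point (the function f below is hypothetical):

from functools import lru_cache

@lru_cache(maxsize=None)
def f(a, b):
    return a + b

f(a=1, b=2)
f(b=2, a=1)
f.cache_info().currsize   # may be 2: the two keyword orders can be cached separately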
If user_function is specified, it must be a callable. This allows the lru_cache decorator to be applied directly to a user function, leaving maxsize at its default value of 128:
@lru_cache
def count_vowels(sentence):
    return sum(sentence.count(vowel) for vowel in 'AEIOUaeiou')
If maxsize is set to None, the LRU feature is disabled and the cache can grow without bound.

If typed is set to true, function arguments of different types will be cached separately. For example, f(3) and f(3.0) will be treated as distinct calls with distinct results. If typed is false, the implementation will usually regard them as equivalent calls and only cache a single result. (Some types such as str and int may be cached separately even when typed is false.)
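A brief hedged sketch of the typed behavior (the function double is hypothetical):

from functools import lru_cache

@lru_cache(maxsize=None, typed=True)
def double(x):
    print("computing", repr(x))
    return x * 2

double(3)     # prints "computing 3"
double(3.0)   # prints "computing 3.0" -- cached separately because typed=True
double(3)     # cache hit, nothing printed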
Note, type specificity applies only to the function's immediate arguments rather than their contents. The scalar arguments, Decimal(42) and Fraction(42), are treated as distinct calls with distinct results. In contrast, the tuple arguments ('answer', Decimal(42)) and ('answer', Fraction(42)) are treated as equivalent.

The wrapped function is instrumented with a cache_parameters() function that returns a new dict showing the values for maxsize and typed. This is for information purposes only. Mutating the values has no effect.

To help measure the effectiveness of the cache and tune the maxsize parameter, the wrapped function is instrumented with a cache_info() function that returns a named tuple showing hits, misses, maxsize and currsize.

The decorator also provides a cache_clear() function for clearing or invalidating the cache.

The original underlying function is accessible through the __wrapped__ attribute. This is useful for introspection, for bypassing the cache, or for rewrapping the function with a different cache.

The cache keeps references to the arguments and return values until they age out of the cache or until the cache is cleared.
An LRU (least recently used) cache works best when the most recent calls are the best predictors of upcoming calls (for example, the most popular articles on a news server tend to change each day). The cache's size limit assures that the cache does not grow without bound on long-running processes such as web servers.
In general, the LRU cache should only be used when you want to reuse previously computed values. Accordingly, it doesn't make sense to cache functions with side-effects, functions that need to create distinct mutable objects on each call, or impure functions such as time() or random().
Example of an LRU cache for static web content:
@lru_cache(maxsize=32)
def get_pep(num):
    'Retrieve text of a Python Enhancement Proposal'
    resource = 'https://www.python.org/dev/peps/pep-%04d/' % num
    try:
        with urllib.request.urlopen(resource) as s:
            return s.read()
    except urllib.error.HTTPError:
        return 'Not Found'

>>> for n in 8, 290, 308, 320, 8, 218, 320, 279, 289, 320, 9991:
...     pep = get_pep(n)
...     print(n, len(pep))

>>> get_pep.cache_info()
CacheInfo(hits=3, misses=8, maxsize=32, currsize=8)
Example of efficiently computing Fibonacci numbers using a cache to implement a dynamic programming technique:
@lru_cache(maxsize=None)
def fib(n):
    if n < 2:
        return n
    return fib(n-1) + fib(n-2)

>>> [fib(n) for n in range(16)]
[0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610]

>>> fib.cache_info()
CacheInfo(hits=28, misses=16, maxsize=None, currsize=16)
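Building on the fib() example, a hedged sketch of the other instrumentation described above (cache_clear(), cache_parameters() and the __wrapped__ attribute):

>>> fib.cache_clear()            # clear the cache and reset the statistics
>>> fib.cache_info()
CacheInfo(hits=0, misses=0, maxsize=None, currsize=0)
>>> fib.__wrapped__(10)          # call the original undecorated function for this call
55
>>> fib.cache_parameters()
{'maxsize': None, 'typed': False}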
New in version 3.2.

Changed in version 3.3: Added the typed option.

Changed in version 3.8: Added the user_function option.

New in version 3.9: Added the cache_parameters() function.
- @functools.total_ordering

Given a class defining one or more rich comparison ordering methods, this class decorator supplies the rest. This simplifies the effort involved in specifying all of the possible rich comparison operations:
The class must define one of __lt__(), __le__(), __gt__(), or __ge__(). In addition, the class should supply an __eq__() method.

For example:
@total_ordering
class Student:
    def _is_valid_operand(self, other):
        return (hasattr(other, "lastname") and
                hasattr(other, "firstname"))
    def __eq__(self, other):
        if not self._is_valid_operand(other):
            return NotImplemented
        return ((self.lastname.lower(), self.firstname.lower()) ==
                (other.lastname.lower(), other.firstname.lower()))
    def __lt__(self, other):
        if not self._is_valid_operand(other):
            return NotImplemented
        return ((self.lastname.lower(), self.firstname.lower()) <
                (other.lastname.lower(), other.firstname.lower()))
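A brief hedged usage sketch (the NamedStudent subclass and the sample names are hypothetical) showing that the remaining comparison methods are supplied automatically:

class NamedStudent(Student):
    def __init__(self, firstname, lastname):
        self.firstname = firstname
        self.lastname = lastname

a = NamedStudent("Ada", "Lovelace")
b = NamedStudent("Alan", "Turing")
a <= b        # True: __le__ was supplied by @total_ordering
a > b         # False: __gt__ as well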
Note
While this decorator makes it easy to create well behaved totally ordered types, it does come at the cost of slower execution and more complex stack traces for the derived comparison methods. If performance benchmarking indicates this is a bottleneck for a given application, implementing all six rich comparison methods instead is likely to provide an easy speed boost.
Note
This decorator makes no attempt to override methods that have been declared in the class or its superclasses. Meaning that if a superclass defines a comparison operator, total_ordering will not implement it again, even if the original method is abstract.
New in version 3.2.

Changed in version 3.4: Returning NotImplemented from the underlying comparison function for unrecognised types is now supported.
- functools.partial(func, /, *args, **keywords)

Return a new partial object which when called will behave like func called with the positional arguments args and keyword arguments keywords. If more arguments are supplied to the call, they are appended to args. If additional keyword arguments are supplied, they extend and override keywords. Roughly equivalent to:
def partial(func, /, *args, **keywords):
    def newfunc(*fargs, **fkeywords):
        newkeywords = {**keywords, **fkeywords}
        return func(*args, *fargs, **newkeywords)
    newfunc.func = func
    newfunc.args = args
    newfunc.keywords = keywords
    return newfunc
partial() is used for partial function application which “freezes” some portion of a function's arguments and/or keywords resulting in a new object with a simplified signature. For example, partial() can be used to create a callable that behaves like the int() function where the base argument defaults to two:

>>> from functools import partial
>>> basetwo = partial(int, base=2)
>>> basetwo.__doc__ = 'Convert base 2 string to an int.'
>>> basetwo('10010')
18
- class functools.partialmethod(func, /, *args, **keywords)

Return a new partialmethod descriptor which behaves like partial except that it is designed to be used as a method definition rather than being directly callable.

func must be a descriptor or a callable (objects which are both, like normal functions, are handled as descriptors).
When func is a descriptor (such as a normal Python function, classmethod(), staticmethod(), abstractmethod() or another instance of partialmethod), calls to __get__ are delegated to the underlying descriptor, and an appropriate partial object returned as the result.

When func is a non-descriptor callable, an appropriate bound method is created dynamically. This behaves like a normal Python function when used as a method: the self argument will be inserted as the first positional argument, even before the args and keywords supplied to the partialmethod constructor.

Example:
>>> class Cell:
...     def __init__(self):
...         self._alive = False
...     @property
...     def alive(self):
...         return self._alive
...     def set_state(self, state):
...         self._alive = bool(state)
...     set_alive = partialmethod(set_state, True)
...     set_dead = partialmethod(set_state, False)
...
>>> c = Cell()
>>> c.alive
False
>>> c.set_alive()
>>> c.alive
True
New in version 3.4.
- functools.reduce(function, iterable[, initializer])

Apply function of two arguments cumulatively to the items of iterable, from left to right, so as to reduce the iterable to a single value. For example, reduce(lambda x, y: x+y, [1, 2, 3, 4, 5]) calculates ((((1+2)+3)+4)+5). The left argument, x, is the accumulated value and the right argument, y, is the update value from the iterable. If the optional initializer is present, it is placed before the items of the iterable in the calculation, and serves as a default when the iterable is empty. If initializer is not given and iterable contains only one item, the first item is returned.

Roughly equivalent to:
def reduce(function, iterable, initializer=None):
    it = iter(iterable)
    if initializer is None:
        value = next(it)
    else:
        value = initializer
    for element in it:
        value = function(value, element)
    return value
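A short hedged sketch of the initializer behavior described above:

>>> from functools import reduce
>>> reduce(lambda x, y: x + y, [1, 2, 3, 4, 5])
15
>>> reduce(lambda x, y: x + y, [], 0)           # initializer is the default for an empty iterable
0
>>> reduce(lambda x, y: x + y, [1, 2, 3], 100)  # initializer is placed before the items
106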
See itertools.accumulate() for an iterator that yields all intermediate values.
- @functools.singledispatch

Transform a function into a single-dispatch generic function.

To define a generic function, decorate it with the @singledispatch decorator. When defining a function using @singledispatch, note that the dispatch happens on the type of the first argument:

>>> from functools import singledispatch
>>> @singledispatch
... def fun(arg, verbose=False):
...     if verbose:
...         print("Let me just say,", end=" ")
...     print(arg)
To add overloaded implementations to the function, use the register() attribute of the generic function, which can be used as a decorator. For functions annotated with types, the decorator will infer the type of the first argument automatically:

>>> @fun.register
... def _(arg: int, verbose=False):
...     if verbose:
...         print("Strength in numbers, eh?", end=" ")
...     print(arg)
...
>>> @fun.register
... def _(arg: list, verbose=False):
...     if verbose:
...         print("Enumerate this:")
...     for i, elem in enumerate(arg):
...         print(i, elem)
For code which doesn’t use type annotations, the appropriate type argument can be passed explicitly to the decorator itself:
>>> @fun.register(complex)
... def _(arg, verbose=False):
...     if verbose:
...         print("Better than complicated.", end=" ")
...     print(arg.real, arg.imag)
...
To enable registering lambdas and pre-existing functions, the register() attribute can also be used in a functional form:

>>> def nothing(arg, verbose=False):
...     print("Nothing.")
...
>>> fun.register(type(None), nothing)
The register() attribute returns the undecorated function. This enables decorator stacking, pickling, and the creation of unit tests for each variant independently:

>>> @fun.register(float)
... @fun.register(Decimal)
... def fun_num(arg, verbose=False):
...     if verbose:
...         print("Half of your number:", end=" ")
...     print(arg / 2)
...
>>> fun_num is fun
False
When called, the generic function dispatches on the type of the first argument:
>>> fun("Hello, world.") Hello, world. >>> fun("test.", verbose=True) Let me just say, test. >>> fun(42, verbose=True) Strength in numbers, eh? 42 >>> fun(['spam', 'spam', 'eggs', 'spam'], verbose=True) Enumerate this: 0 spam 1 spam 2 eggs 3 spam >>> fun(None) Nothing. >>> fun(1.23) 0.615
Where there is no registered implementation for a specific type, its method resolution order is used to find a more generic implementation. The original function decorated with @singledispatch is registered for the base object type, which means it is used if no better implementation is found.

If an implementation is registered to an abstract base class, virtual subclasses of the base class will be dispatched to that implementation:

>>> from collections.abc import Mapping
>>> @fun.register
... def _(arg: Mapping, verbose=False):
...     if verbose:
...         print("Keys & Values")
...     for key, value in arg.items():
...         print(key, "=>", value)
...
>>> fun({"a": "b"})
a => b
To check which implementation the generic function will choose for a given type, use the dispatch() attribute:

>>> fun.dispatch(float)
<function fun_num at 0x1035a2840>
>>> fun.dispatch(dict)    # note: default implementation
<function fun at 0x103fe0000>
To access all registered implementations, use the read-only registry attribute:

>>> fun.registry.keys()
dict_keys([<class 'NoneType'>, <class 'int'>, <class 'object'>,
           <class 'decimal.Decimal'>, <class 'list'>,
           <class 'float'>])
>>> fun.registry[float]
<function fun_num at 0x1035a2840>
>>> fun.registry[object]
<function fun at 0x103fe0000>
New in version 3.4.

Changed in version 3.7: The register() attribute now supports using type annotations.
- class functools.singledispatchmethod(func)

Transform a method into a single-dispatch generic function.

To define a generic method, decorate it with the @singledispatchmethod decorator. When defining a function using @singledispatchmethod, note that the dispatch happens on the type of the first non-self or non-cls argument:

class Negator:
    @singledispatchmethod
    def neg(self, arg):
        raise NotImplementedError("Cannot negate a")

    @neg.register
    def _(self, arg: int):
        return -arg

    @neg.register
    def _(self, arg: bool):
        return not arg
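A brief hedged usage sketch of the Negator class above:

>>> n = Negator()
>>> n.neg(5)            # dispatches on int
-5
>>> n.neg(True)         # bool is more specific than int
False
>>> n.neg("spam")       # no registered implementation, falls back to the base method
Traceback (most recent call last):
  ...
NotImplementedError: Cannot negate a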
@singledispatchmethod supports nesting with other decorators such as @classmethod. Note that to allow for dispatcher.register, singledispatchmethod must be the outermost decorator. Here is the Negator class with the neg methods bound to the class, rather than an instance of the class:

class Negator:
    @singledispatchmethod
    @classmethod
    def neg(cls, arg):
        raise NotImplementedError("Cannot negate a")

    @neg.register
    @classmethod
    def _(cls, arg: int):
        return -arg

    @neg.register
    @classmethod
    def _(cls, arg: bool):
        return not arg
The same pattern can be used for other similar decorators: @staticmethod, @abstractmethod, and others.

New in version 3.8.
- functools.update_wrapper(wrapper, wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES)

Update a wrapper function to look like the wrapped function. The optional arguments are tuples to specify which attributes of the original function are assigned directly to the matching attributes on the wrapper function and which attributes of the wrapper function are updated with the corresponding attributes from the original function. The default values for these arguments are the module level constants WRAPPER_ASSIGNMENTS (which assigns to the wrapper function's __module__, __name__, __qualname__, __annotations__ and __doc__, the documentation string) and WRAPPER_UPDATES (which updates the wrapper function's __dict__, i.e. the instance dictionary).

To allow access to the original function for introspection and other purposes (e.g. bypassing a caching decorator such as lru_cache()), this function automatically adds a __wrapped__ attribute to the wrapper that refers to the function being wrapped.

The main intended use for this function is in decorator functions which wrap the decorated function and return the wrapper. If the wrapper function is not updated, the metadata of the returned function will reflect the wrapper definition rather than the original function definition, which is typically less than helpful.

update_wrapper() may be used with callables other than functions. Any attributes named in assigned or updated that are missing from the object being wrapped are ignored (i.e. this function will not attempt to set them on the wrapper function). AttributeError is still raised if the wrapper function itself is missing any attributes named in updated.

New in version 3.2: Automatic addition of the __wrapped__ attribute.

New in version 3.2: Copying of the __annotations__ attribute by default.

Changed in version 3.2: Missing attributes no longer trigger an AttributeError.

Changed in version 3.4: The __wrapped__ attribute now always refers to the wrapped function, even if that function defined a __wrapped__ attribute. (see bpo-17482)
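For illustration, a hedged sketch of calling update_wrapper() directly rather than through the wraps() decorator described next:

import functools

def my_decorator(f):
    def wrapper(*args, **kwds):
        print('Calling decorated function')
        return f(*args, **kwds)
    functools.update_wrapper(wrapper, f)   # copies __name__, __doc__, etc. and adds __wrapped__
    return wrapper

@my_decorator
def example():
    """Docstring"""
    print('Called example function')

example.__name__        # 'example'
example.__wrapped__     # the original, undecorated function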
- @functools.wraps(wrapped, assigned=WRAPPER_ASSIGNMENTS, updated=WRAPPER_UPDATES)

This is a convenience function for invoking update_wrapper() as a function decorator when defining a wrapper function. It is equivalent to partial(update_wrapper, wrapped=wrapped, assigned=assigned, updated=updated). For example:

>>> from functools import wraps
>>> def my_decorator(f):
...     @wraps(f)
...     def wrapper(*args, **kwds):
...         print('Calling decorated function')
...         return f(*args, **kwds)
...     return wrapper
...
>>> @my_decorator
... def example():
...     """Docstring"""
...     print('Called example function')
...
>>> example()
Calling decorated function
Called example function
>>> example.__name__
'example'
>>> example.__doc__
'Docstring'
Without the use of this decorator factory, the name of the example function would have been 'wrapper', and the docstring of the original example() would have been lost.
partial Objects

partial objects are callable objects created by partial(). They have three read-only attributes:
- partial.func

A callable object or function. Calls to the partial object will be forwarded to func with new arguments and keywords.
- partial.args

The leftmost positional arguments that will be prepended to the positional arguments provided to a partial object call.

- partial.keywords

The keyword arguments that will be supplied when the partial object is called.
partial objects are like function objects in that they are callable, weakly referenceable, and can have attributes. There are some important differences. For instance, the __name__ and __doc__ attributes are not created automatically. Also, partial objects defined in classes behave like static methods and do not transform into bound methods during instance attribute look-up.
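A hedged sketch of the last point (the Demo class is hypothetical):

from functools import partial

class Demo:
    pow2 = partial(pow, 2)    # not a descriptor, so no bound-method transformation occurs

Demo().pow2(5)     # behaves like a static method: pow(2, 5) == 32
Demo.pow2.func     # the underlying callable, the built-in pow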