What's New in Python 2.1
************************

Author:
   A.M. Kuchling


Introduction
============

This article explains the new features in Python 2.1.  While there
aren't as many changes in 2.1 as there were in Python 2.0, there are
still some pleasant surprises in store.  2.1 is the first release to
be steered through the use of Python Enhancement Proposals, or PEPs,
so most of the sizable changes have accompanying PEPs that provide
more complete documentation and a design rationale for the change.
This article doesn't attempt to document the new features completely,
but simply provides an overview of the new features for Python
programmers. Refer to the Python 2.1 documentation, or to the specific
PEP, for more details about any new feature that particularly
interests you.

A recent goal of the Python development team has been to accelerate
the pace of new releases, with a new release coming every 6 to 9
months. 2.1 is the first release to come out at this faster pace, with
the first alpha appearing in January, 3 months after the final version
of 2.0 was released.

The final release of Python 2.1 was made on April 17, 2001.


PEP 227: Nested Scopes
======================

The largest change in Python 2.1 is to Python's scoping rules.  In
Python 2.0, at any given time there are at most three namespaces used
to look up variable names: local, module-level, and the built-in
namespace.  This often surprised people because it didn't match their
intuitive expectations.  For example, a nested recursive function
definition doesn't work:

   def f():
       ...
       def g(value):
           ...
           return g(value-1) + 1
       ...

The function "g()" will always raise a "NameError" exception, because
the binding of the name "g" isn't in either its local namespace or in
the module-level namespace.  This isn't much of a problem in practice
(how often do you recursively define interior functions like this?),
but this also made using the "lambda" expression clumsier, and this
was a problem in practice. In code which uses "lambda" you can often
find local variables being copied by passing them as the default
values of arguments.

   def find(self, name):
       "Return list of any entries equal to 'name'"
       L = filter(lambda x, name=name: x == name,
                  self.list_attribute)
       return L

The readability of Python code written in a strongly functional style
suffers greatly as a result.

The most significant change to Python 2.1 is that static scoping has
been added to the language to fix this problem.  As a first effect,
the "name=name" default argument is now unnecessary in the above
example.  Put simply, when a given variable name is not assigned a
value within a function (by an assignment, or the "def", "class", or
"import" statements), references to the variable will be looked up in
the local namespace of the enclosing scope.  A more detailed
explanation of the rules, and a dissection of the implementation, can
be found in the PEP.
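
For example, under the new rules the recursive inner function from the
earlier snippet works as written, with no default-argument tricks.  A
minimal sketch in present-day syntax (the function bodies are
illustrative):

```python
def f(start):
    # With nested scopes, g can refer to itself: the name "g" is
    # found in the enclosing function's local namespace.
    def g(value):
        if value <= 0:
            return 0
        return g(value - 1) + 1
    return g(start)

print(f(5))  # 5
```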

This change may cause some compatibility problems for code where the
same variable name is used both at the module level and as a local
variable within a function that contains further function definitions.
This seems rather unlikely though, since such code would have been
pretty confusing to read in the first place.

One side effect of the change is that the "from module import *" and
"exec" statements have been made illegal inside a function scope under
certain conditions.  The Python reference manual has said all along
that "from module import *" is only legal at the top level of a
module, but the CPython interpreter has never enforced this before.
As part of the implementation of nested scopes, the compiler which
turns Python source into bytecodes has to generate different code to
access variables in a containing scope.  "from module import *" and
"exec" make it impossible for the compiler to figure this out, because
they add names to the local namespace that are unknowable at compile
time. Therefore, if a function contains function definitions or
"lambda" expressions with free variables, the compiler will flag this
by raising a "SyntaxError" exception.

To make the preceding explanation a bit clearer, here's an example:

   x = 1
   def f():
       # The next line is a syntax error
       exec 'x=2'
       def g():
           return x

Line 4 containing the "exec" statement is a syntax error, since "exec"
would define a new local variable named "x" whose value should be
accessed by "g()".

This shouldn't be much of a limitation, since "exec" is rarely used in
most Python code (and when it is used, it's often a sign of a poor
design anyway).

Compatibility concerns have led to nested scopes being introduced
gradually; in Python 2.1, they aren't enabled by default, but can be
turned on within a module by using a future statement as described in
**PEP 236**.  (See the following section for further discussion of
**PEP 236**.)  In Python 2.2, nested scopes will become the default
and there will be no way to turn them off, but users will have had all
of 2.1's lifetime to fix any breakage resulting from their
introduction.

See also:

  **PEP 227** - Statically Nested Scopes
     Written and implemented by Jeremy Hylton.


PEP 236: __future__ Directives
==============================

The reaction to nested scopes was widespread concern about the dangers
of breaking code with the 2.1 release, and it was strong enough to
make the Pythoneers take a more conservative approach.  This approach
consists of introducing a convention for enabling optional
functionality in release N that will become compulsory in release N+1.

The syntax uses a "from...import" statement using the reserved module
name "__future__".  Nested scopes can be enabled by the following
statement:

   from __future__ import nested_scopes

While it looks like a normal "import" statement, it's not; there are
strict rules on where such a future statement can be put. They can
only be at the top of a module, and must precede any Python code or
regular "import" statements.  This is because such statements can
affect how the Python bytecode compiler parses code and generates
bytecode, so they must precede any statement that will result in
bytecodes being produced.
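
The "__future__" module itself records when each feature became
optional and when it became mandatory; a small sketch using the
introspection methods that the module provides (past features remain
importable even in later Python versions):

```python
import __future__

# Each feature is a _Feature record carrying two release tuples:
# the release where the future statement was first accepted, and
# the release where the feature became the default.
feat = __future__.nested_scopes
print(feat.getOptionalRelease())   # first release accepting the statement
print(feat.getMandatoryRelease())  # release where it is always on
```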

See also:

  **PEP 236** - Back to the "__future__"
     Written by Tim Peters, and primarily implemented by Jeremy
     Hylton.


PEP 207: Rich Comparisons
=========================

In earlier versions, Python's support for implementing comparisons on
user-defined classes and extension types was quite simple. Classes
could implement a "__cmp__()" method that was given two instances of a
class, and could only return 0 if they were equal or +1 or -1 if they
weren't; the method couldn't raise an exception or return anything
other than a Boolean value.  Users of Numeric Python often found this
model too weak and restrictive, because in the number-crunching
programs that numeric Python is used for, it would be more useful to
be able to perform elementwise comparisons of two matrices, returning
a matrix containing the results of a given comparison for each
element.  If the two matrices are of different sizes, then the compare
has to be able to raise an exception to signal the error.

In Python 2.1, rich comparisons were added in order to support this
need. Python classes can now individually overload each of the "<",
"<=", ">", ">=", "==", and "!=" operations.  The new magic method
names are:

+-------------+------------------+
| Operation   | Method name      |
|=============|==================|
| "<"         | "__lt__()"       |
+-------------+------------------+
| "<="        | "__le__()"       |
+-------------+------------------+
| ">"         | "__gt__()"       |
+-------------+------------------+
| ">="        | "__ge__()"       |
+-------------+------------------+
| "=="        | "__eq__()"       |
+-------------+------------------+
| "!="        | "__ne__()"       |
+-------------+------------------+

(The magic methods are named after the corresponding Fortran operators
".LT.", ".LE.", &c.  Numeric programmers are almost certainly quite
familiar with these names and will find them easy to remember.)

Each of these magic methods is of the form "method(self, other)",
where "self" will be the object on the left-hand side of the operator,
while "other" will be the object on the right-hand side.  For example,
the expression "A < B" will cause "A.__lt__(B)" to be called.

Each of these magic methods can return anything at all: a Boolean, a
matrix, a list, or any other Python object.  Alternatively they can
raise an exception if the comparison is impossible, inconsistent, or
otherwise meaningless.
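
A sketch of the elementwise style of comparison that motivated the
change (the "Vector" class here is hypothetical, not a real Numeric
type):

```python
class Vector:
    def __init__(self, data):
        self.data = list(data)

    def __lt__(self, other):
        # Elementwise comparison: return a list of results rather
        # than a single Boolean, and raise for mismatched sizes.
        if len(self.data) != len(other.data):
            raise ValueError("can't compare vectors of different sizes")
        return [a < b for a, b in zip(self.data, other.data)]

print(Vector([1, 5, 3]) < Vector([2, 4, 3]))  # [True, False, False]
```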

The built-in "cmp(A,B)" function can use the rich comparison
machinery, and now accepts an optional argument specifying which
comparison operation to use; this is given as one of the strings
""<"", ""<="", "">"", "">="", ""=="", or ""!="".  If called without
the optional third argument, "cmp()" will only return -1, 0, or +1 as
in previous versions of Python; otherwise it will call the appropriate
method and can return any Python object.

There are also corresponding changes of interest to C programmers;
there's a new slot "tp_richcmp" in type objects and an API for
performing a given rich comparison.  I won't cover the C API here, but
will refer you to **PEP 207**, or to 2.1's C API documentation, for
the full list of related functions.

See also:

  **PEP 207** - Rich Comparisons
     Written by Guido van Rossum, heavily based on earlier work by
     David Ascher, and implemented by Guido van Rossum.


PEP 230: Warning Framework
==========================

Over its 10 years of existence, Python has accumulated a certain
number of obsolete modules and features along the way.  It's difficult
to know when a feature is safe to remove, since there's no way of
knowing how much code uses it --- perhaps no programs depend on the
feature, or perhaps many do.  To enable removing old features in a
more structured way, a warning framework was added. When the Python
developers want to get rid of a feature, it will first trigger a
warning in the next version of Python.  The following Python version
can then drop the feature, and users will have had a full release
cycle to remove uses of the old feature.

Python 2.1 adds the warning framework to be used in this scheme.  It
adds a "warnings" module that provides functions to issue warnings,
to filter out warnings that you don't want to be displayed. Third-
party modules can also use this framework to deprecate old features
that they no longer wish to support.

For example, in Python 2.1 the "regex" module is deprecated, so
importing it causes a warning to be printed:

   >>> import regex
   __main__:1: DeprecationWarning: the regex module
            is deprecated; please use the re module
   >>>

Warnings can be issued by calling the "warnings.warn()" function:

   warnings.warn("feature X no longer supported")

The first parameter is the warning message; additional optional
parameters can be used to specify a particular warning category.

Filters can be added to disable certain warnings; a regular expression
pattern can be applied to the message or to the module name in order
to suppress a warning.  For example, you may have a program that uses
the "regex" module and not want to spare the time to convert it to use
the "re" module right now.  The warning can be suppressed by calling

   import warnings
   warnings.filterwarnings(action = 'ignore',
                           message='.*regex module is deprecated',
                           category=DeprecationWarning,
                           module = '__main__')

This adds a filter that will apply only to warnings of the class
"DeprecationWarning" triggered in the "__main__" module, and applies a
regular expression to only match the message about the "regex" module
being deprecated, and will cause such warnings to be ignored.
Warnings can also be printed only once, printed every time the
offending code is executed, or turned into exceptions that will cause
the program to stop (unless the exceptions are caught in the usual
way, of course).
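
For instance, the "error" filter action turns matching warnings into
raised exceptions.  A sketch, using the "catch_warnings" context
manager that later versions of the "warnings" module provide:

```python
import warnings

with warnings.catch_warnings():
    # "error" turns matching warnings into raised exceptions.
    warnings.simplefilter("error", DeprecationWarning)
    try:
        warnings.warn("feature X no longer supported", DeprecationWarning)
        outcome = "no warning"
    except DeprecationWarning as exc:
        outcome = "caught: %s" % exc

print(outcome)  # caught: feature X no longer supported
```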

Functions were also added to Python's C API for issuing warnings;
refer to **PEP 230** or to Python's API documentation for the details.

See also:

  **PEP 5** - Directrices para la evolución del lenguaje
     Written by Paul Prescod, to specify procedures to be followed
     when removing old features from Python.  The policy described in
     this PEP hasn't been officially adopted, but the eventual policy
     probably won't be too different from Prescod's proposal.

  **PEP 230** - Warning Framework
     Written and implemented by Guido van Rossum.


PEP 229: New Build System
=========================

When compiling Python, the user had to go in and edit the
"Modules/Setup" file in order to enable various additional modules;
the default set is relatively small and limited to modules that
compile on most Unix platforms. This means that on Unix platforms with
many more features, most notably Linux, Python installations often
don't contain all useful modules they could.

Python 2.0 added the Distutils, a set of modules for distributing and
installing extensions.  In Python 2.1, the Distutils are used to
compile much of the standard library of extension modules,
autodetecting which ones are supported on the current machine.  It's
hoped that this will make Python installations easier and more
featureful.

Instead of having to edit the "Modules/Setup" file in order to enable
modules, a "setup.py" script in the top directory of the Python source
distribution is run at build time, and attempts to discover which
modules can be enabled by examining the modules and header files on
the system.  If a module is configured in "Modules/Setup", the
"setup.py" script won't attempt to compile that module and will defer
to the "Modules/Setup" file's contents.  This provides a way to
specify any strange command-line flags or libraries that are required
for a specific platform.

In another far-reaching change to the build mechanism, Neil
Schemenauer restructured things so Python now uses a single makefile
that isn't recursive, instead of makefiles in the top directory and in
each of the "Python/", "Parser/", "Objects/", and "Modules/"
subdirectories.  This makes building Python faster and also makes
hacking the Makefiles clearer and simpler.

See also:

  **PEP 229** - Using Distutils to Build Python
     Written and implemented by A.M. Kuchling.


PEP 205: Weak References
========================

Weak references, available through the "weakref" module, are a minor
but useful new data type in the Python programmer's toolbox.

Storing a reference to an object (say, in a dictionary or a list) has
the side effect of keeping that object alive forever.  There are a few
specific cases where this behaviour is undesirable, object caches
being the most common one, and another being circular references in
data structures such as trees.

For example, consider a memoizing function that caches the results of
another function "f(x)" by storing the function's argument and its
result in a dictionary:

   _cache = {}
   def memoize(x):
       if _cache.has_key(x):
           return _cache[x]

       retval = f(x)

       # Cache the returned object
       _cache[x] = retval

       return retval

This version works for simple things such as integers, but it has a
side effect; the "_cache" dictionary holds a reference to the return
values, so they'll never be deallocated until the Python process exits
and cleans up. This isn't very noticeable for integers, but if "f()"
returns an object, or a data structure that takes up a lot of memory,
this can be a problem.

Weak references provide a way to implement a cache that won't keep
objects alive beyond their time.  If an object is only accessible
through weak references, the object will be deallocated and the weak
references will now indicate that the object it referred to no longer
exists.  A weak reference to an object *obj* is created by calling "wr
= weakref.ref(obj)".  The object being referred to is returned by
calling the weak reference as if it were a function: "wr()".  It will
return the referenced object, or "None" if the object no longer
exists.
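
A minimal sketch of that behaviour, relying on CPython's immediate
reclamation of objects once their last strong reference is dropped:

```python
import weakref

class Data:
    pass

obj = Data()
wr = weakref.ref(obj)
assert wr() is obj   # the referent is retrieved by calling the reference

del obj              # drop the only strong reference
print(wr())          # None: the object has been deallocated
```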

This makes it possible to write a "memoize()" function whose cache
doesn't keep objects alive, by storing weak references in the cache.

   _cache = {}
   def memoize(x):
       if _cache.has_key(x):
           obj = _cache[x]()
           # If weak reference object still exists,
           # return it
           if obj is not None: return obj

       retval = f(x)

       # Cache a weak reference
       _cache[x] = weakref.ref(retval)

       return retval

The "weakref" module also allows creating proxy objects which behave
like weak references --- an object referenced only by proxy objects is
deallocated --- but instead of requiring an explicit call to retrieve
the object, the proxy transparently forwards all operations to the
object as long as the object still exists.  If the object is
deallocated, attempting to use a proxy will cause a
"weakref.ReferenceError" exception to be raised.

   proxy = weakref.proxy(obj)
   proxy.attr   # Equivalent to obj.attr
   proxy.meth() # Equivalent to obj.meth()
   del obj
   proxy.attr   # raises weakref.ReferenceError
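
A runnable version of the proxy sketch above; in modern Python,
"weakref.ReferenceError" is an alias for the built-in
"ReferenceError":

```python
import weakref

class Data:
    def meth(self):
        return "ok"

obj = Data()
proxy = weakref.proxy(obj)
print(proxy.meth())   # forwarded to obj.meth() while obj is alive

del obj               # drop the last strong reference
try:
    proxy.meth()
    status = "still alive"
except ReferenceError:
    status = "referent is gone"
print(status)
```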

See also:

  **PEP 205** - Weak References
     Written and implemented by Fred L. Drake, Jr.


PEP 232: Function Attributes
============================

In Python 2.1, functions can now have arbitrary information attached
to them. People were often using docstrings to hold information about
functions and methods, because the "__doc__" attribute was the only
way of attaching any information to a function.  For example, in the
Zope Web application server, functions are marked as safe for public
access by having a docstring, and in John Aycock's SPARK parsing
framework, docstrings hold parts of the BNF grammar to be parsed.
This overloading is unfortunate, since docstrings are really intended
to hold a function's documentation; for example, it means you can't
properly document functions intended for private use in Zope.

Arbitrary attributes can now be set and retrieved on functions using
the regular Python syntax:

   def f(): pass

   f.publish = 1
   f.secure = 1
   f.grammar = "A ::= B (C D)*"

The dictionary containing attributes can be accessed as the function's
"__dict__". Unlike the "__dict__" attribute of class instances, in
functions you can actually assign a new dictionary to "__dict__",
though the new value is restricted to a regular Python dictionary; you
can't be tricky and set it to a "UserDict" instance, or any other
random object that behaves like a mapping.
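
A short sketch of both behaviours (the attribute names are
illustrative):

```python
def f():
    pass

f.publish = 1
f.grammar = "A ::= B (C D)*"
print(f.__dict__)           # attributes live in the function's __dict__

f.__dict__ = {"secure": 1}  # assigning a new plain dict is allowed
print(f.secure)             # 1

try:
    f.__dict__ = []         # anything but a real dict is rejected
except TypeError:
    print("only a regular dictionary is accepted")
```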

See also:

  **PEP 232** - Function Attributes
     Written and implemented by Barry Warsaw.


PEP 235: Importing Modules on Case-Insensitive Platforms
========================================================

Some operating systems have case-insensitive file systems, MacOS and
Windows being the primary examples; on these systems, it's impossible
to distinguish the filenames "FILE.PY" and "file.py", even though they
do store the file's name in its original case (they're case-
preserving, too).

In Python 2.1, the "import" statement will work to simulate case-
sensitivity on case-insensitive platforms.  Python will now search for
the first case-sensitive match by default, raising an "ImportError" if
no such file is found, so "import file" will not import a module named
"FILE.PY". Case-insensitive matching can be requested by setting the
"PYTHONCASEOK" environment variable before starting the Python
interpreter.


PEP 217: Interactive Display Hook
=================================

When using the Python interpreter interactively, the output of
commands is displayed using the built-in "repr()" function. In Python
2.1, the variable "sys.displayhook" can be set to a callable object
which will be called instead of "repr()". For example, you can set it
to a special pretty-printing function:

   >>> # Create a recursive data structure
   ... L = [1,2,3]
   >>> L.append(L)
   >>> L # Show Python's default output
   [1, 2, 3, [...]]
   >>> # Use pprint.pprint() as the display function
   ... import sys, pprint
   >>> sys.displayhook = pprint.pprint
   >>> L
   [1, 2, 3,  <Recursion on list with id=135143996>]
   >>>
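
Outside the interactive prompt the hook isn't triggered automatically,
but it's an ordinary callable and can be invoked by hand; a sketch,
capturing the output so the effect is visible in a script:

```python
import contextlib
import io
import pprint
import sys

# Swap in pprint.pprint and call the hook directly.
sys.displayhook = pprint.pprint
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    sys.displayhook({"b": 2, "a": 1})
sys.displayhook = sys.__displayhook__  # restore the default hook

print(buf.getvalue().strip())  # {'a': 1, 'b': 2}
```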

See also:

  **PEP 217** - Display Hook for Interactive Use
     Written and implemented by Moshe Zadka.


PEP 208: New Coercion Model
===========================

How numeric coercion is done at the C level was significantly
modified.  This will only affect the authors of C extensions to
Python, allowing them more flexibility in writing extension types that
support numeric operations.

Extension types can now set the type flag "Py_TPFLAGS_CHECKTYPES" in
their "PyTypeObject" structure to indicate that they support the new
coercion model. In such extension types, the numeric slot functions
can no longer assume that they'll be passed two arguments of the same
type; instead they may be passed two arguments of differing types, and
can then perform their own internal coercion. If the slot function is
passed a type it can't handle, it can indicate the failure by
returning a reference to the "Py_NotImplemented" singleton value. The
numeric functions of the other type will then be tried, and perhaps
they can handle the operation; if the other type also returns
"Py_NotImplemented", then a "TypeError" will be raised.  Numeric
methods written in Python can also return "Py_NotImplemented", causing
the interpreter to act as if the method did not exist (perhaps raising
a "TypeError", perhaps trying another object's numeric methods).
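
The Python-level half of this protocol can be sketched with the
built-in "NotImplemented" value (the classes here are hypothetical):

```python
class Meters:
    def __init__(self, value):
        self.value = value

    def __add__(self, other):
        if isinstance(other, Meters):
            return Meters(self.value + other.value)
        # Signal that we can't handle this operand type; the
        # interpreter will then try the other operand's __radd__().
        return NotImplemented

class Logger:
    def __radd__(self, other):
        return "Logger handled the addition"

print((Meters(2) + Meters(3)).value)  # 5
print(Meters(2) + Logger())           # Logger handled the addition
```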

See also:

  **PEP 208** - Reworking the Coercion Model
     Written and implemented by Neil Schemenauer, heavily based upon
     earlier work by Marc-André Lemburg.  Read this to understand the
     fine points of how numeric operations will now be processed at
     the C level.


PEP 241: Metadata in Python Packages
====================================

A common complaint from Python users is that there's no single catalog
of all the Python modules in existence.  T. Middleton's Vaults of
Parnassus at http://www.vex.net/parnassus/ are the largest catalog of
Python modules, but registering software at the Vaults is optional,
and many people don't bother.

As a first small step toward fixing the problem, Python software
packaged using the Distutils **sdist** command will include a file
named "PKG-INFO" containing information about the package such as its
name, version, and author (metadata, in cataloguing terminology).
**PEP 241** contains the full list of fields that can be present in
the "PKG-INFO" file.  As people began to package their software using
Python 2.1, more and more packages will include metadata, making it
possible to build automated cataloguing systems and experiment with
them.  With the resulting experience, perhaps it'll be possible to
design a really good catalog and then build support for it into Python
2.2.
For example, the Distutils **sdist** and **bdist_*** commands could
support an "upload" option that would automatically upload your
package to a catalog server.

You can start creating packages containing "PKG-INFO" even if you're
not using Python 2.1, since a new release of the Distutils will be
made for users of earlier Python versions.  Version 1.0.2 of the
Distutils includes the changes described in **PEP 241**, as well as
various bugfixes and enhancements.  It will be available from the
Distutils SIG at https://www.python.org/community/sigs/current
/distutils-sig/.

See also:

  **PEP 241** - Metadata for Python Software Packages
     Written and implemented by A.M. Kuchling.

  **PEP 243** - Module Repository Upload Mechanism
     Written by Sean Reifschneider, this draft PEP describes a
     proposed mechanism for uploading Python packages to a central
     server.


New and Improved Modules
========================

* Ka-Ping Yee contributed two new modules: "inspect.py", a module for
  getting information about live Python code, and "pydoc.py", a module
  for interactively converting docstrings to HTML or text.  As a
  bonus, "Tools/scripts/pydoc", which is now automatically installed,
  uses "pydoc.py" to display documentation given a Python module,
  package, or class name.  For example, "pydoc xml.dom" displays the
  following:

     Python Library Documentation: package xml.dom in xml

     NAME
         xml.dom - W3C Document Object Model implementation for Python.

     FILE
         /usr/local/lib/python2.1/xml/dom/__init__.pyc

     DESCRIPTION
         The Python mapping of the Document Object Model is documented in the
         Python Library Reference in the section on the xml.dom package.

         This package contains the following modules:
           ...

  "pydoc" also includes a Tk-based interactive help browser.   "pydoc"
  quickly becomes addictive; try it out!

* Two different modules for unit testing were added to the standard
  library. The "doctest" module, contributed by Tim Peters, provides a
  testing framework based on running embedded examples in docstrings
  and comparing the results against the expected output.  PyUnit,
  contributed by Steve Purcell, is a unit testing framework inspired
  by JUnit, which was in turn an adaptation of Kent Beck's Smalltalk
  testing framework.  See http://pyunit.sourceforge.net/ for more
  information about PyUnit.

* The "difflib" module contains a class, "SequenceMatcher", which
  compares two sequences and computes the changes required to
  transform one sequence into the other.  For example, this module can
  be used to write a tool similar to the Unix **diff** program, and in
  fact the sample program "Tools/scripts/ndiff.py" demonstrates how to
  write such a script.

* "curses.panel", a wrapper for the panel library, part of ncurses and
  of SYSV curses, was contributed by Thomas Gellekum.  The panel
  library provides windows with the additional feature of depth.
  Windows can be moved higher or lower in the depth ordering, and the
  panel library figures out where panels overlap and which sections
  are visible.

* The PyXML package has gone through a few releases since Python 2.0,
  and Python 2.1 includes an updated version of the "xml" package.
  Some of the noteworthy changes include support for Expat 1.2 and
  later versions, the ability for Expat parsers to handle files in any
  encoding supported by Python, and various bugfixes for SAX, DOM, and
  the "minidom" module.

* Ping also contributed another hook for handling uncaught exceptions.
  "sys.excepthook()" can be set to a callable object.  When an
  exception isn't caught by any "try"..."except" blocks, the exception
  will be passed to "sys.excepthook()", which can then do whatever it
  likes.  At the Ninth Python Conference, Ping demonstrated an
  application for this hook: printing an extended traceback that not
  only lists the stack frames, but also lists the function arguments
  and the local variables for each frame.

* Various functions in the "time" module, such as "asctime()" and
  "localtime()", require a floating point argument containing the time
  in seconds since the epoch.  The most common use of these functions
  is to work with the current time, so the floating point argument has
  been made optional; when a value isn't provided, the current time
  will be used.  For example, log file entries usually need a string
  containing the current time; in Python 2.1, "time.asctime()" can be
  used, instead of the lengthier
  "time.asctime(time.localtime(time.time()))" that was previously
  required.

  This change was proposed and implemented by Thomas Wouters.

* The "ftplib" module now defaults to retrieving files in passive
  mode, because passive mode is more likely to work from behind a
  firewall.  This request came from the Debian bug tracking system,
  since other Debian packages use "ftplib" to retrieve files and then
  don't work from behind a firewall. It's deemed unlikely that this
  will cause problems for anyone, because Netscape defaults to passive
  mode and few people complain, but if passive mode is unsuitable for
  your application or network setup, call "set_pasv(0)" on FTP objects
  to disable passive mode.

* Support for raw socket access has been added to the "socket" module,
  contributed by Grant Edwards.

* The "pstats" module now contains a simple interactive statistics
  browser for displaying timing profiles for Python programs, invoked
  when the module is run as a script.  Contributed by  Eric S.
  Raymond.

* A new implementation-dependent function, "sys._getframe([depth])",
  has been added to return a given frame object from the current call
  stack. "sys._getframe()" returns the frame at the top of the call
  stack;  if the optional integer argument *depth* is supplied, the
  function returns the frame that is *depth* calls below the top of
  the stack.  For example, "sys._getframe(1)" returns the caller's
  frame object.

  This function is only present in CPython, not in Jython or the .NET
  implementation.  Use it for debugging, and resist the temptation to
  put it into production code.
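
The "sys._getframe()" function from the last item can be sketched as
follows (for debugging use only, as noted; the helper names are
illustrative):

```python
import sys

def caller_name():
    # Depth 1 is the frame of whoever called this function.
    return sys._getframe(1).f_code.co_name

def outer():
    return caller_name()

print(outer())  # outer
```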


Other Changes and Fixes
=======================

There were relatively few smaller changes made in Python 2.1 due to
the shorter release cycle.  A search through the CVS change logs turns
up 117 patches applied, and 136 bugs fixed; both figures are likely to
be underestimates.  Some of the more notable changes are:

* A specialized object allocator is now optionally available, that
  should be faster than the system "malloc()" and have less memory
  overhead.  The allocator uses C's "malloc()" function to get large
  pools of memory, and then fulfills smaller memory requests from
  these pools.  It can be enabled by providing the "--with-pymalloc"
  option to the **configure** script; see "Objects/obmalloc.c" for the
  implementation details.

  Authors of C extension modules should test their code with the
  object allocator enabled, because some incorrect code may break,
  causing core dumps at runtime. There are a bunch of memory
  allocation functions in Python's C API that have previously been
  just aliases for the C library's "malloc()" and "free()", meaning
  that if you accidentally called mismatched functions, the error
  wouldn't be noticeable.  When the object allocator is enabled, these
  functions aren't aliases of "malloc()" and "free()" any more, and
  calling the wrong function to free memory will get you a core dump.
  For example, if memory was allocated using "PyMem_New()", it has to
  be freed using "PyMem_Del()", not "free()".  A few modules included
  with Python fell afoul of this and had to be fixed; doubtless there
  are more third-party modules that will have the same problem.

  The object allocator was contributed by Vladimir Marangozov.

* The speed of line-oriented file I/O has been improved because people
  often complain about its lack of speed, and because it's often been
  used as a naïve benchmark.  The "readline()" method of file objects
  has therefore been rewritten to be much faster.  The exact amount of
  the speedup will vary from platform to platform depending on how
  slow the C library's "getc()" was, but is around 66%, and
  potentially much faster on some particular operating systems. Tim
  Peters did much of the benchmarking and coding for this change,
  motivated by a discussion in comp.lang.python.

  A new module and method for file objects was also added, contributed
  by Jeff Epler. The new method, "xreadlines()", is similar to the
  existing "xrange()" built-in.  "xreadlines()" returns an opaque
  sequence object that only supports being iterated over, reading a
  line on every iteration but not reading the entire file into memory
  as the existing "readlines()" method does. You'd use it like this:

     for line in sys.stdin.xreadlines():
         # ... do something for each line ...
         ...

  For a more complete discussion of the line I/O changes, see the
  python-dev summary for January 1--15, 2001, at
  https://mail.python.org/pipermail/python-dev/2001-January/.

* A new method, "popitem()", was added to dictionaries to enable
  destructively iterating through the contents of a dictionary; this
  can be faster for large dictionaries because there's no need to
  construct a list containing all the keys or values. "D.popitem()"
  removes a random "(key, value)" pair from the dictionary "D" and
  returns it as a 2-tuple.  This was implemented mostly by Tim Peters
  and Guido van Rossum, after a suggestion and preliminary patch by
  Moshe Zadka.
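The destructive-iteration idiom the paragraph describes can be sketched as follows (the dictionary contents are illustrative):

```python
# Drain a dictionary with popitem(); unlike iterating over keys() or
# items(), this never builds an intermediate list of all the keys.
D = {'a': 1, 'b': 2, 'c': 3}
total = 0
while D:
    key, value = D.popitem()  # removes and returns an arbitrary pair
    total += value
print(total)  # -> 6
print(D)      # -> {}
```

Because each `popitem()` call removes the pair it returns, the dictionary is empty when the loop ends.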

* Modules can now control which names are imported when "from module
  import *" is used, by defining an "__all__" attribute containing a
  list of names that will be imported.  One common complaint is that
  if the module imports other modules such as "sys" or "string", "from
  module import *" will add them to the importing module's namespace.
  To fix this, simply list the public names in "__all__":

     # List public names
     __all__ = ['Database', 'open']

  A stricter version of this patch was first suggested and implemented
  by Ben Wolfson, but after some python-dev discussion, a weaker final
  version was checked in.
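A self-contained sketch of the effect, using a throwaway module written to a temporary directory; the module name `shapes` and its contents are hypothetical:

```python
import os
import sys
import tempfile
import textwrap

# Write a small module whose __all__ hides its helper import.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "shapes.py"), "w") as f:
    f.write(textwrap.dedent("""\
        import sys          # helper import; should stay private
        __all__ = ['area']  # only this name is exported by import *

        def area(w, h):
            return w * h
    """))

# Simulate "from shapes import *" into a fresh namespace.
sys.path.insert(0, tmpdir)
ns = {}
exec("from shapes import *", ns)
print('area' in ns)  # -> True
print('sys' in ns)   # -> False
```

Without the `__all__` attribute, the module's `sys` import would have leaked into the importing namespace as well.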

* Applying "repr()" to strings previously used octal escapes for non-
  printable characters; for example, a newline was "'\012'".  This was
  a vestigial trace of Python's C ancestry, but today octal is of very
  little practical use.  Ka-Ping Yee suggested using hex escapes
  instead of octal ones, and using the "\n", "\t", "\r" escapes for
  the appropriate characters, and implemented this new formatting.
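A quick illustration of the new formatting; the sample string is arbitrary:

```python
# Non-printable characters now repr() as \n, \t, \r or hex escapes
# such as \x01, rather than octal escapes like \012.
s = 'line1\nline2\tend\x01'
print(repr(s))  # -> 'line1\nline2\tend\x01'
```

Under the old octal formatting the same string would have shown `\012` for the newline and `\011` for the tab.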

* Syntax errors detected at compile time can now raise exceptions
  containing the file name and line number of the error, a pleasant
  side effect of the compiler reorganization done by Jeremy Hylton.

* C extensions which import other modules have been changed to use
  "PyImport_ImportModule()", which means that they will use any import
  hooks that have been installed.  This is also encouraged for third-
  party extensions that need to import some other module from C code.

* The size of the Unicode character database was shrunk by another
  340K thanks to Fredrik Lundh.

* Some new ports were contributed: MacOS X (by Steven Majewski),
  Cygwin (by Jason Tishler); RISCOS (by Dietmar Schwertberger);
  Unixware 7 (by Billy G. Allie).

And there's the usual list of minor bugfixes, minor memory leaks,
docstring edits, and other tweaks, too lengthy to be worth itemizing;
see the CVS logs for the full details if you want them.


Acknowledgements
================

The author would like to thank the following people for offering
suggestions on various drafts of this article: Graeme Cross, David
Goodger, Jay Graves, Michael Hudson, Marc-André Lemburg, Fredrik
Lundh, Neil Schemenauer, Thomas Wouters.
