As my colleague Wes McKinney likes to say (quoting Matthew Goodman): “Are you using IPython? If not, you’re doing it wrong!”

You shouldn’t have to wait for an exception to invoke the interactive debugger, and you should definitely be using the IPython debugger. One convenience function in the pandas code base (pandas.util.testing) is this:

def debug(f, *args, **kwargs):
    from pdb import Pdb as OldPdb
    try:
        from IPython.core.debugger import Pdb
        kw = dict(color_scheme='Linux')
    except ImportError:
        Pdb = OldPdb
        kw = {}
    pdb = Pdb(**kw)
    return pdb.runcall(f, *args, **kwargs)

You can invoke it on a function and arguments like so:

debug(test_function, arg1, arg2, named_arg1='hello')

You will get all the interactive IPython goodness as you step through your code. Funnily enough, the qtconsole version doesn’t seem to support tab completion. Maybe I’ll file a bug report…
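Under the hood, both debuggers share pdb’s runcall machinery. As a stdlib-only sketch (modern Python 3, no IPython required), you can even drive it non-interactively by feeding the debugger a canned command stream:

```python
import io
import pdb

def add(a, b):
    return a + b

# Feed the debugger a canned "c" (continue) so runcall executes the
# function under debugger control without stopping for user input.
dbg = pdb.Pdb(stdin=io.StringIO("c\n"), stdout=io.StringIO())
result = dbg.runcall(add, 2, 3)
print(result)  # 5
```

In real use you would of course omit the canned stdin and interact at the `(Pdb)` prompt, which is exactly what the `debug` helper above gives you.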

The Python path determines how the Python interpreter locates modules. How exactly does Python construct the path?

Using the official docs on sys.path, with its footnote reference to the site module, I’ll recap the process.

If a script is executed, the interpreter sets the first entry of sys.path to that script’s directory. If Python is launched interactively, the first entry is the empty string (“”), meaning Python will scan the present working directory first. The next entries of sys.path are the contents of the PYTHONPATH environment variable, if it exists. Then, installation-dependent entries are appended (example below).
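The script-directory rule is easy to see in action. Here is a small sketch (Python 3) that writes a throwaway script to a temp directory, runs it, and checks what lands in sys.path[0]:

```python
import os
import subprocess
import sys
import tempfile

# When Python executes a script, sys.path[0] is set to the
# directory containing that script.
with tempfile.TemporaryDirectory() as d:
    script = os.path.join(d, "show_path.py")
    with open(script, "w") as f:
        f.write("import sys; print(sys.path[0])\n")
    out = subprocess.check_output([sys.executable, script]).decode().strip()
    # realpath() smooths over symlinked temp dirs (e.g. /var -> /private/var)
    match = os.path.realpath(out) == os.path.realpath(d)

print(match)  # True
```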

When initializing, the interpreter normally imports the site module automatically. The module, on import, executes code to find .pth files in known site-packages directory locations, which themselves contain entries which are either paths to add to sys.path, or import calls. If we really want to trace what’s going on, we can launch a Python interpreter with -S to prevent loading the site module automatically, and instead trace the import.
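The same .pth processing is exposed programmatically as site.addsitedir(). A small sketch (Python 3) that builds a fake site directory and watches both kinds of .pth entries take effect:

```python
import os
import site
import sys
import tempfile

# Build a fake site dir containing one .pth file with both kinds of
# entries: a path to append, and an executable "import ..." line.
sitedir = tempfile.mkdtemp()
extra = os.path.join(sitedir, "extra")
os.mkdir(extra)
with open(os.path.join(sitedir, "demo.pth"), "w") as f:
    f.write(extra + "\n")                             # path entry
    f.write("import sys; sys.demo_pth_ran = True\n")  # executable entry

site.addsitedir(sitedir)  # processes demo.pth just like site does at startup

added = any(os.path.realpath(p) == os.path.realpath(extra)
            for p in sys.path)
print(added, getattr(sys, "demo_pth_ran", False))  # True True
```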

(Note, I am working within a virtualenv called py27.)

(py27) ~$ python -S
Python 2.7.2+ (default, Oct 4 2011, 20:06:09)
[GCC 4.6.1] on linux2
>>> import sys
>>> for p in sys.path:
...     print p
...


I have no PYTHONPATH, so these are just my installation-dependent paths. Now, we need to add the directory where the pdb module lives, so we can import it:

>>> sys.path += ["/usr/lib/python2.7"]
>>> import pdb
>>> pdb.run("import site")
> <string>(1)<module>()
(Pdb) s
> /home/adam/.virtualenvs/py27/lib/python2.7/
-> """

I’ll spare you the debugging session details, and summarize what I see:

– grabs orig-prefix.txt from <VIRTUAL_ENV>/lib/python2.7, which for me contains “/usr”, and extends the sys.path array to contain additional “/usr”-based paths.

– then scans the site-packages directory (in lib/python2.7). For each .pth file (in alphabetical order), it steps through the entries: if an entry begins with “import”, the line is exec’d; otherwise the (absolute) path is appended to sys.path. It then does the same in the user site-packages directory (in local/lib/python2.7).

Note that easy-install.pth contains executable code, e.g.:

import sys; sys.__plen = len(sys.path)
import sys; new=sys.path[sys.__plen:]; del sys.path[sys.__plen:]; p=getattr(sys,'__egginsert',0); sys.path[p:p]=new; sys.__egginsert = p+len(new)

The executable lines move all the entries (some of which are .egg zipped packages) up to the top of the path.
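That two-line dance can be simulated on a plain list (hypothetical entries here, not a real sys.path) to see what it actually does:

```python
# Simulate easy-install.pth's bookkeeping on a plain list.
path = ["stdlib", "site-packages"]

# First executable line: remember the length before the .pth entries land.
plen = len(path)
path += ["egg-A", "egg-B"]           # entries appended by the .pth file

# Last executable line: splice the new entries back to the front.
new = path[plen:]
del path[plen:]
insert_at = 0                        # i.e. getattr(sys, '__egginsert', 0)
path[insert_at:insert_at] = new

print(path)  # ['egg-A', 'egg-B', 'stdlib', 'site-packages']
```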

– After stepping through all .pth files, add the existing site-packages directories themselves.

– Finally, attempt to “import sitecustomize” (which doesn’t do anything on my install).
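You can also verify the overall effect without stepping through anything, by comparing the path length with and without site initialization:

```python
import subprocess
import sys

# len(sys.path) with site disabled (-S) vs. normal startup;
# site's .pth processing typically only grows the path.
bare = int(subprocess.check_output(
    [sys.executable, "-S", "-c", "import sys; print(len(sys.path))"]))
full = int(subprocess.check_output(
    [sys.executable, "-c", "import sys; print(len(sys.path))"]))
print(full >= bare)  # True
```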

Cython is my new favorite tool. It lets you write compiled C extension modules for the CPython interpreter using annotated Python for speed-critical parts and pure Python for non-critical parts. Further, you can import and call C functions directly. The user guide is (surprisingly?) well-written.

In particular, it lets you do blazing computations using Numpy. See this excellent whitepaper.

But what about that old Python extension module you have lying around? What if you want to utilize Cython to call into it, fast, bypassing its Python API? You don’t want to rip out all the C(++) code you care about from that module and recompile it into a new Cython extension module. Or maybe you do. But suppose you don’t.

You’ll just have to give that rickety old extension a C API and expose it properly!

Let’s imagine you’ve got a function “myfunc” in your old extension module called “myold”. So for example in the file myoldmodule.cpp you may have:

static float64_t myfunc(float64_t x) { ... }

We need to create a new header file, myold_capi.h, that declares and exports the relevant symbols that live in the compiled myold module, and that we would like to import into the new Cython module to call. We use the Python Capsule mechanism for this, and the following comes right out of the Python documentation.

#ifndef _MYOLD_CAPI_H_
#define _MYOLD_CAPI_H_

/* import required header files here */

#ifdef __cplusplus
extern "C" {
#endif

/* Total number of C API functions to export */
#define MYOLD_CAPI_pointers 1

/* C API functions to export */
#define MYOLD_myfunc_NUM 0
#define MYOLD_myfunc_RETURN float64_t
#define MYOLD_myfunc_PROTO (float64_t x)

#ifdef MYOLD_MODULE
/* This section is used when compiling myold */

static MYOLD_myfunc_RETURN myfunc MYOLD_myfunc_PROTO;

#else
/* This section is used in modules that compile against myold's C API */

static void **MYOLD_CAPI;

#define myfunc \
     (*(MYOLD_myfunc_RETURN (*)MYOLD_myfunc_PROTO) MYOLD_CAPI[MYOLD_myfunc_NUM])

/* Return -1 on error, 0 on success.
   PyCapsule_Import will set an exception if there's an error.  */

static int
import_myold(void)
{
    MYOLD_CAPI = (void **)PyCapsule_Import("myold._C_API", 0);
    return (MYOLD_CAPI != NULL) ? 0 : -1;
}

#endif /* !defined(MYOLD_MODULE) */

#ifdef __cplusplus
}
#endif

#endif /* !defined(_MYOLD_CAPI_H_) */

Now, we have to include this header in our old module, myoldmodule.cpp. So right before, say,

PyObject* pModule = 0;

Add these lines:

#define MYOLD_MODULE
#include "myold_capi.h"

Finally, in your PyInit_myold() or initmyold() function that initializes your module, you need to create the Capsule holding the array of function pointers you are exporting:

    // start capsule creation for C API
    static void *MYOLD_CAPI[MYOLD_CAPI_pointers];

    MYOLD_CAPI[MYOLD_myfunc_NUM] = (void *)myfunc;

    /* Create a Capsule containing the API pointer array's address */
    PyObject *c_api_object = PyCapsule_New((void *)MYOLD_CAPI, "myold._C_API", NULL);

    if (c_api_object != NULL)
        PyModule_AddObject(pModule, "_C_API", c_api_object);

    // end capsule creation

Awesome. Now, does your old C module still compile? I hope so!
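If you want to convince yourself how Capsules behave before wiring up the C side, the mechanism can be poked at from pure Python via ctypes. A sketch — the address and capsule name below are made up for illustration:

```python
import ctypes

# Bind the CPython Capsule API through ctypes.
cap_new = ctypes.pythonapi.PyCapsule_New
cap_new.restype = ctypes.py_object
cap_new.argtypes = [ctypes.c_void_p, ctypes.c_char_p, ctypes.c_void_p]

cap_get = ctypes.pythonapi.PyCapsule_GetPointer
cap_get.restype = ctypes.c_void_p
cap_get.argtypes = [ctypes.py_object, ctypes.c_char_p]

# Wrap an arbitrary (fake) address in a named capsule, then unwrap it by
# name -- the same round trip the header performs with the pointer array.
name = b"myold._C_API"
cap = cap_new(0xDEADBEEF, name, None)
ptr = cap_get(cap, name)
print(hex(ptr))  # 0xdeadbeef
```

In the real module the pointer is the address of the static function-pointer array, and PyCapsule_Import does the lookup for you by importing the module and fetching its `_C_API` attribute.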

Next, we need to create a new Cython header file, myold.pxd. It should look something like this:

cdef extern from "myold_capi.h":
    # C-API exports via the myold capsule
    float64_t myfunc(float64_t x)
    # must call this before using module
    int import_myold()

Now, go ahead and write your new Cython module, for example mynew.pyx:

from myold cimport *

# The following call is required to initialize the
# static capsule variable that holds the pointers
# to the myold C API functions
import_myold()

cdef class NewClass:
    cpdef float64_t mynewfunc(self, float64_t x):
        return myfunc(x)

Not too bad!

Sadly, driver trouble on Linux is often the case. But not today! I wanted to get my dual monitors set up with my good ole’ GTS 250 and got a cringe-inducing nvidia-settings error:

“Failed to set MetaMode (1) ‘DFP-0: nvidia-auto-select@1680×1050 +0+0, DFP-1: nvidia-auto-select @1680×1050 +0+0 (Mode 1680×1050, id: 52) on X screen 0.”

Could a recently-released, updated driver solve this issue? Yes.

From the command line:

sudo add-apt-repository ppa:ubuntu-x-swat/x-updates
sudo apt-get update
sudo apt-get install nvidia-current

Next: reboot, then update any additional packages as prompted.

No tears!

© 2014 Adam Klein's Blog