Enforcing the order of extension loading

I have two Python extensions (dynamic libraries), say a.so and b.so. Of the two, a.so depends on b.so; specifically, it uses a type defined in b.so.

In python, I could safely do

import b
import a
# work

But when I do

import a
import b

Both imports succeed, but when the code runs, it complains that the type b.the_type as seen by a is not the b.the_type in b. A closer examination with gdb shows that the PyTypeObject for that type has two different addresses (and different refcounts) in a.so and b.so.

My question is: how do I enforce the loading order, or otherwise make sure that both import orders work?


In order to make it possible for people who know shared libraries well, but not Python, to help me, here is some extra information. In a Python extension, a Python type is essentially a unique global variable that is initialized in its module (the .so file). A type MUST be initialized before it can be used (this is done through a call to the Python API). This required initialization is wrapped in a specific function with a particular, well-known name, and Python calls that function when it loads the extension.
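
To make that concrete, here is a minimal sketch of what b.so's initialization typically looks like (assuming Python 3; TheType/the_type are placeholders, not my actual code). The type object is an ordinary global variable, and Python calls the specially named PyInit_b when the module is imported:

    /* Minimal sketch of b.so (Python 3): the type is a global PyTypeObject
     * that only becomes usable after PyType_Ready() runs inside the module
     * init function, which Python calls on "import b". */
    #define PY_SSIZE_T_CLEAN
    #include <Python.h>

    static PyTypeObject TheType = {
        PyVarObject_HEAD_INIT(NULL, 0)
        .tp_name = "b.the_type",
        .tp_basicsize = sizeof(PyObject),
        .tp_itemsize = 0,
        .tp_flags = Py_TPFLAGS_DEFAULT,
        .tp_new = PyType_GenericNew,
    };

    static struct PyModuleDef bmodule = {
        PyModuleDef_HEAD_INIT,
        .m_name = "b",
        .m_size = -1,
    };

    /* Python looks for this specifically named function when it imports b. */
    PyMODINIT_FUNC
    PyInit_b(void)
    {
        if (PyType_Ready(&TheType) < 0)     /* the required initialization */
            return NULL;

        PyObject *m = PyModule_Create(&bmodule);
        if (m == NULL)
            return NULL;

        Py_INCREF(&TheType);
        if (PyModule_AddObject(m, "the_type", (PyObject *)&TheType) < 0) {
            Py_DECREF(&TheType);
            Py_DECREF(m);
            return NULL;
        }
        return m;
    }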

My guess is that, since the OS knows a.so depends on b.so, the system (rather than Python) loads b when Python requests only a.so. Yet it is Python's responsibility to call the module initialization function, and Python doesn't know that a depends on b, so the OS loads b without it being initialized. When I then do import b, Python actually calls the module initialization function, and that results in a different PyTypeObject.

If the solution is platform-dependent, my project is currently running on linux (archlinux).

1 answer

  • answered 2018-01-13 17:51 Martijn Pieters

    You appear to have linked a against b in order to import the types b defines. Don't do this.

    Instead, import b like you would any other Python module. In other words, the dependency on b should be handled entirely by the Python binary, not by your OS's dynamic library loading structures.

    Use the C-API import functions to import b. At that point it should not matter how b is imported; it's just a bunch of Python objects from that point onwards.
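
    As a sketch of what that could look like (assuming b exposes the type as the attribute the_type, as in the question; this is illustrative, not the asker's actual code), a's init function can ask Python to import b and fetch the type object from the module object, instead of linking a.so against b.so:

        /* Sketch of a.so importing b through the C-API instead of linking to it. */
        #define PY_SSIZE_T_CLEAN
        #include <Python.h>

        static PyTypeObject *b_the_type;    /* strong reference, kept for the module's lifetime */

        static struct PyModuleDef amodule = {
            PyModuleDef_HEAD_INIT,
            .m_name = "a",
            .m_size = -1,
        };

        PyMODINIT_FUNC
        PyInit_a(void)
        {
            /* Equivalent to "import b": runs PyInit_b at most once and caches
             * the module in sys.modules, so there is only ever one b.the_type. */
            PyObject *b = PyImport_ImportModule("b");
            if (b == NULL)
                return NULL;

            b_the_type = (PyTypeObject *)PyObject_GetAttrString(b, "the_type");
            Py_DECREF(b);
            if (b_the_type == NULL)
                return NULL;

            return PyModule_Create(&amodule);
        }

    Because PyImport_ImportModule goes through Python's regular import machinery (and sys.modules), both import a; import b and import b; import a end up seeing the single PyTypeObject that b's init function prepared.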

    That's not to say that b can't produce a C-level API for those objects (NumPy does this too); you just have to make sure that it is Python that loads the extension, not your library. Incidentally, NumPy defines helper functions that do the importing for you; see the import_umath() code generator for an example.
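
    For completeness, here is a rough sketch of that NumPy-style pattern: b publishes a table of C pointers in a PyCapsule, and a retrieves it with PyCapsule_Import, which imports b through Python first. The capsule name b._C_API and the one-entry table are invented for illustration:

        #define PY_SSIZE_T_CLEAN
        #include <Python.h>

        /* In b's init function, after PyType_Ready(&TheType), b could export
         * a C-level API object:
         *
         *     static void *b_api[1];
         *     b_api[0] = (void *)&TheType;
         *     PyObject *capsule = PyCapsule_New(b_api, "b._C_API", NULL);
         *     if (capsule == NULL || PyModule_AddObject(m, "_C_API", capsule) < 0)
         *         return NULL;
         */

        /* In a's init function: */
        static PyTypeObject *b_the_type;

        static int
        import_b_api(void)
        {
            void **api = (void **)PyCapsule_Import("b._C_API", 0);
            if (api == NULL)
                return -1;                  /* importing b failed, or no capsule found */
            b_the_type = (PyTypeObject *)api[0];
            return 0;
        }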