How to handle build-time dependencies on native Python modules?
I ran into the following situation and I am wondering what the "correct"
way of handling such a setup is.
We have an application that embeds Python 3 in order to allow users to
modify app behavior; so unlike the most common scenario, it's not
a standalone Python module or Python bindings to a library, but a Linux binary
that does stuff, and that also allows executing Python scripts from within by
linking against libpython (i.e. a python3-embed.pc setup). I am pointing this
out specifically because, from what I have seen, this is the least common scenario.
We depend on cffi to generate Python bindings for the application
API: the idea is that configure runs the cffi Python script, which
spits out the sources, which then get compiled together with the application.
The sources generated by cffi are portable, so it is fine to use the
native Python interpreter to run the cffi build script.
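For context, the build script is just the usual cffi out-of-line generator; a minimal sketch (the module name, function, and header below are made up for illustration, not our real API) looks like:

```python
# Hypothetical cffi build script; "_app_bindings", "app_get_version"
# and "app_api.h" stand in for the real application API.
from cffi import FFI

ffibuilder = FFI()

# Declarations for the application functions exposed to Python
ffibuilder.cdef("int app_get_version(void);")

# The C source the generated module is compiled against
ffibuilder.set_source("_app_bindings", '#include "app_api.h"')

# Emit the portable C source instead of compiling it in place;
# the emitted file then gets built together with the application.
ffibuilder.emit_c_code("_app_bindings.c")
```

Since emit_c_code() only writes C source and never invokes a compiler, the script is target-agnostic, which is why running it under the native interpreter is fine.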
I added python3-native and python3-cffi-native to DEPENDS and I can see that
the cffi module is present in:
The issue is that python3-native does not see the cffi module, because
PYTHONPATH does not include the recipe-sysroot-native directory. So when I
try to run my cffi build.py script, it fails at "import cffi".
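The mechanism itself is easy to reproduce outside of bitbake: a module sitting in a sysroot-style site-packages directory is only importable once PYTHONPATH points at it (the directory layout and module name below are made up for illustration):

```python
import os
import subprocess
import sys
import tempfile

# Hypothetical layout mimicking a recipe-sysroot-native site-packages dir
with tempfile.TemporaryDirectory() as sysroot:
    site = os.path.join(sysroot, "usr", "lib", "python3.8", "site-packages")
    os.makedirs(site)
    with open(os.path.join(site, "cffi_stub.py"), "w") as f:
        f.write("NAME = 'cffi_stub'\n")

    env = dict(os.environ)
    # Without PYTHONPATH, the interpreter cannot find the module
    env.pop("PYTHONPATH", None)
    r1 = subprocess.run([sys.executable, "-c", "import cffi_stub"],
                        env=env, capture_output=True)

    # With PYTHONPATH pointing at the sysroot's site-packages, it can
    env["PYTHONPATH"] = site
    r2 = subprocess.run([sys.executable, "-c", "import cffi_stub"],
                        env=env, capture_output=True)

    print(r1.returncode != 0, r2.returncode == 0)  # prints: True True
```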
I "hacked" a solution by inheriting python3-dir and setting PYTHONPATH
in do_configure_prepend() and in do_compile_prepend():
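Something along these lines (python3-dir provides PYTHON_SITEPACKAGES_DIR, and RECIPE_SYSROOT_NATIVE is the native sysroot path; the task bodies are a sketch of what I have):

```
inherit python3-dir

do_configure_prepend() {
    export PYTHONPATH="${RECIPE_SYSROOT_NATIVE}${PYTHON_SITEPACKAGES_DIR}"
}

do_compile_prepend() {
    export PYTHONPATH="${RECIPE_SYSROOT_NATIVE}${PYTHON_SITEPACKAGES_DIR}"
}
```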
The above works, but it feels like a hack.
I am wondering if there is a way to configure this properly?
I asked on IRC and Ross Burton suggested trying to inherit python3native, but
as far as I can see it does not do anything to PYTHONPATH; it actually
inherits python3-dir itself. Interestingly enough, if I inherit
python3native but keep my "export" hack in place, it does not work anymore.
It does work if I only inherit python3-dir.
Did anyone face a similar problem, and do you have any hints?
I'm on Dunfell btw (LTS).