Targeting multiple Python versions at once #748
Thank you, @thorink. This is exactly what I was looking for. I do not know of a better way to achieve the same result, but I do want to emphasize that, from my point of view, this use case would be a very nice addition to pybind11.
I can't think of a better way to do this with the current CMake tooling and agree that this is an important use case due to the ABI incompatibility of Python major versions.
This would actually be nice for the CI tests: we could essentially cut the number of separate builds in half by having each one build and test against both 2.7 and 3.5/3.6 in a single run. That ought to save a bunch of time, since for many of the travis-ci builds the majority of the time is spent in startup + dependency installation.
I did a first part (https://github.com/FIFTY2Technology/pybind11/tree/target_multiple_python_versions). This is now possible:

```cmake
set(PYBIND11_PYTHON_VERSION 2.7 3.5 3.6)
add_subdirectory(../pybind11 pybind11)
pybind11_add_module(example example.cpp)
```

It creates up to 3 targets (depending on how many Python versions are installed). Do you think it is going in the right direction?
I'd suggest keeping everything internal to `pybind11_add_module`:

```cmake
pybind11_add_module(example27 ${src_files} PYTHON_VERSION 2.7)
pybind11_add_module(example36 ${src_files} PYTHON_VERSION 3.6)
```

Otherwise, it's going to interact poorly with the CMake targets.

That said, I don't actually fully understand the use case for this. What's the advantage of building multiple versions within a single CMake run compared to:

```bash
mkdir build27 && cd build27 && cmake -DPYBIND11_PYTHON_VERSION=2.7 .. && make
mkdir build36 && cd build36 && cmake -DPYBIND11_PYTHON_VERSION=3.6 .. && make
```

This works perfectly well and covers the CI use case that @jagerman mentioned. In what situation would doing this be problematic?
The main reason I like it is that it gives better parallelism and lower build times when running multiple test builds. It would be very convenient to have one build directory that builds against, say, Python 2.7, 3.6, and PyPy. That has the potential to make the linking stage (which is by far the slowest single part of building the test suite, and is non-parallelizable) noticeably faster than doing it in three separate builds.
Another advantage would be builds for distributions like Debian (cc @ghisvail for input): I think that right now the Debian package just runs a plain configure-and-build cycle once per supported Python version.
I don't think this approach is going to be viable for the pybind11 test suite itself, which tests more than just the extension module build. But perhaps it's better to look outside of testing and distributing pybind11 itself. On the user side, building and distributing Python extension modules with the usual `setup.py`-based tooling is already handled one interpreter at a time. I believe that there are use cases for fully custom builds (non-Python packaging), but I'm wondering: what does such a build look like? That would help inform the design of the feature inside pybind11's build system.
Concerning the build process on the Debian packaging side, here is what we are doing currently:

CMakeLists.txt:

```cmake
cmake_minimum_required(VERSION 2.8.12)
project(pybind11-example)

find_package(pybind11 REQUIRED)
pybind11_add_module(example example.cpp)
```

example.cpp:

```cpp
// Basic example from the upstream documentation.
//
// See: http://pybind11.readthedocs.io/en/latest/basics.html
#include <pybind11/pybind11.h>

int add(int i, int j) {
    return i + j;
}

namespace py = pybind11;

PYBIND11_PLUGIN(example) {
    py::module m("example", "pybind11 example plugin");
    m.def("add", &add, "A function which adds two numbers");
    return m.ptr();
}
```

and then we repeat the CMake configuration, build, and execution of the example test case for each supported Python version.

In the future I would like to replace the example test case above with running the test suite for all supported Python versions, but have yet to find the time to try it.
In our build pipeline, we have the following steps: first we build a GUI app, a CLI app, and the Python extension for multiple Python versions, everything in one CMake/make run. After the test step we create installers for the GUI and CLI and package the Python extension into wheels. So we don't compile inside the `setup.py` wheel creation phase.
I'd like to work further on this topic. Building multiple Python versions in one CMake run saves us a lot of compilation time, and it would be great if we could use a stable pybind11 version again. For me it would also be fine if multiple calls were required, one per version:

```cmake
pybind11_add_module(example27 ${src_files} PYTHON_VERSION 2.7)
pybind11_add_module(example36 ${src_files} PYTHON_VERSION 3.6)
```

The main problem is that the Python version selection is already done when including pybind11. I chose my approach because it was closer to the current way of doing things. Maybe it would work if we memorized the available interpreters and used the right one from the list on every `pybind11_add_module` call.
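To make that "memorize the interpreters up front" idea concrete, here is a minimal sketch under stated assumptions: the version list and the `PYBIND11_PY*` cache names are purely illustrative (nothing pybind11 actually defines), and it relies only on the stock FindPythonInterp/FindPythonLibs modules of that CMake era:

```cmake
# Hypothetical sketch: probe each requested interpreter once at include
# time and stash the results under version-suffixed cache entries that a
# later pybind11_add_module(... PYTHON_VERSION x.y) call could read back.
foreach(_version 2.7 3.6)
  string(REPLACE "." "" _v "${_version}")  # "2.7" -> "27" for variable names
  # FindPythonInterp/FindPythonLibs cache their results, so clear them
  # before probing for a different version in the same CMake run.
  unset(PYTHON_EXECUTABLE CACHE)
  unset(PYTHON_INCLUDE_DIR CACHE)
  unset(PYTHON_LIBRARY CACHE)
  find_package(PythonInterp ${_version} EXACT)
  if(PYTHONINTERP_FOUND)
    find_package(PythonLibs ${_version} EXACT)
    set(PYBIND11_PY${_v}_EXECUTABLE "${PYTHON_EXECUTABLE}"
        CACHE INTERNAL "Python ${_version} interpreter")
    set(PYBIND11_PY${_v}_INCLUDE_DIRS "${PYTHON_INCLUDE_DIRS}"
        CACHE INTERNAL "Python ${_version} include dirs")
    set(PYBIND11_PY${_v}_LIBRARIES "${PYTHON_LIBRARIES}"
        CACHE INTERNAL "Python ${_version} libraries")
  endif()
endforeach()
```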
@thorink I think that keeping the multiple version selection as local as possible would be the best thing. Users already rely on the global variables, so I'd be hesitant to mess with those too much (not to mention breaking backward compatibility). Something like:

```cmake
function(pybind11_add_module target_name)
  # ... existing `cmake_parse_arguments` code is here

  # Only new lines in this function
  if(ARG_PYTHON_VERSION)
    # Replace PYTHON_INCLUDE_DIRS, PYTHON_LIBRARIES, etc. in the local scope
    python_version_override(${ARG_PYTHON_VERSION})
  endif()

  # ... continue existing code (with possibly overridden locals)
endfunction() # globals are in effect again after the function returns
```

where:

```cmake
function(python_version_override version)
  # ... look up data specific to ${version}
  # cache as needed, but avoid overriding existing globals
  # set variables in parent scope only
  set(PYTHON_INCLUDE_DIRS <value> PARENT_SCOPE)
  set(PYTHON_LIBRARIES <value> PARENT_SCOPE)
  # etc.
endfunction()
```

Looking at your branch, I believe you already have the code that does all this; my suggestion is just to encapsulate everything in a single function (like `python_version_override` above).
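For what it's worth, a minimal sketch of how such a `python_version_override()` could be filled in, assuming the FindPythonInterp/FindPythonLibs modules available at the time; the cache-clearing details are an assumption about what re-detection within one run requires, not existing pybind11 code:

```cmake
function(python_version_override version)
  # The stock find modules cache PYTHON_EXECUTABLE and friends, so clear
  # them first; otherwise a second call in the same run would silently
  # reuse the previously found interpreter.
  unset(PYTHON_EXECUTABLE CACHE)
  unset(PYTHON_INCLUDE_DIR CACHE)
  unset(PYTHON_LIBRARY CACHE)
  find_package(PythonInterp ${version} EXACT REQUIRED)
  find_package(PythonLibs ${version} EXACT REQUIRED)
  # Export only into the caller's scope (pybind11_add_module); the
  # project-wide values are back in effect once that function returns.
  set(PYTHON_INCLUDE_DIRS "${PYTHON_INCLUDE_DIRS}" PARENT_SCOPE)
  set(PYTHON_LIBRARIES "${PYTHON_LIBRARIES}" PARENT_SCOPE)
  set(PYTHON_VERSION_MAJOR "${PYTHON_VERSION_MAJOR}" PARENT_SCOPE)
  set(PYTHON_VERSION_MINOR "${PYTHON_VERSION_MINOR}" PARENT_SCOPE)
endfunction()
```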
Hey, has anyone made any progress on this?
I still would like to have a nice solution for this. Do you have similar needs? We have it working in our codebase, but I had to modify the pybind11 CMake scripts a bit (removing stuff we don't need). If you want, I can show you our current solution.
Any updates on what people are currently doing to solve this issue? (Specifically about supporting two Python versions simultaneously.)
@Colelyman: nothing at the moment; PRs are welcome.
+1 for this issue. I've been using a hack like @thorink's original approach.
Hi guys,

I want to build a Python extension for different Python versions (2.7 and 3.5 for the moment) in one CMake run. The goal is to have multiple targets, one for each Python version.

In the main CMakeLists.txt I include my `pythonbindings` source directory two times. Inside it, I unset some cache variables, include pybind11, and add the extension target (a hypothetical reconstruction is sketched at the end of this post).

For this to work I changed some code in the CMakeLists.txt of pybind11: instead of creating a target named `module`, I name it `module2.7` or `module3.5`, depending on the targeted Python version. This allows me to build my extension two times with a single `make`.

It now works for me, but actually I don't like it very much. I would like to have your opinions on this. Do you know any better ways to achieve the same? Would you like to have support for this use case in pybind11?
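For concreteness, a hypothetical reconstruction of the setup described above; the exact cache variables to unset and the directory layout are assumptions, and the `module2.7`/`module3.5` naming follows the description of the patched pybind11:

```cmake
# Main CMakeLists.txt (hypothetical reconstruction): include the bindings
# directory once per Python version, each pass with its own binary dir.
set(PYBIND11_PYTHON_VERSION 2.7)
add_subdirectory(pythonbindings pythonbindings_py27)
set(PYBIND11_PYTHON_VERSION 3.5)
add_subdirectory(pythonbindings pythonbindings_py35)

# pythonbindings/CMakeLists.txt: clear the cached detection results so
# each pass re-detects the requested Python, then include the patched
# pybind11 and add the extension target.
unset(PYTHON_EXECUTABLE CACHE)
unset(PYTHON_INCLUDE_DIR CACHE)
unset(PYTHON_LIBRARY CACHE)
add_subdirectory(../pybind11 pybind11)
# The patch described above makes this create module2.7 or module3.5
# instead of a single target named module.
pybind11_add_module(module module.cpp)
```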