Targeting multiple Python versions at once #748

Closed
thorink opened this issue Mar 20, 2017 · 17 comments · Fixed by #2370

Comments

@thorink
Contributor

thorink commented Mar 20, 2017

Hi guys, I want to build a Python extension for different Python versions (2.7 and 3.5 for the moment) in one CMake run. The goal is to have multiple targets, one for each Python version.

In the main CMakeLists.txt I have the following:

set(bindings_python_version 2.7)
add_subdirectory(pythonbinding pythonbindings27)
set(bindings_python_version 3.5)
add_subdirectory(pythonbinding pythonbindings35)

I include my pythonbinding source directory twice. Inside it, I unset some cache variables, include pybind11, and add the extension target:

unset(PYTHONINTERP_FOUND CACHE)
unset(PYTHON_EXECUTABLE CACHE)
unset(PYTHON_VERSION_STRING CACHE)
unset(PYTHON_VERSION_MAJOR CACHE)
unset(PYTHON_VERSION_MINOR CACHE)
unset(PYTHON_VERSION_PATCH CACHE)

find_package(PythonInterp ${bindings_python_version})

unset(PYBIND11_INCLUDE_DIR CACHE)
unset(PYTHON_INCLUDE_DIRS CACHE)
unset(PYTHON_LIBRARIES CACHE)
unset(PYTHON_MODULE_PREFIX CACHE)
unset(PYTHON_MODULE_EXTENSION CACHE)

unset(PYTHON_LIBRARY CACHE)

set(PYBIND11_PYTHON_VERSION ${bindings_python_version} CACHE STRING "" FORCE)

add_subdirectory(pybind11 ${CMAKE_CURRENT_BINARY_DIR}/pybind11)

set(_target_name "_pyexttarget${bindings_python_version}")

add_library(${_target_name} MODULE main.cpp)
target_include_directories(${_target_name} PRIVATE ${PYBIND11_INCLUDE_DIR})
target_include_directories(${_target_name} PRIVATE ${PYTHON_INCLUDE_DIRS})
set_target_properties(${_target_name} PROPERTIES PREFIX "${PYTHON_MODULE_PREFIX}"
                                                 SUFFIX "${PYTHON_MODULE_EXTENSION}"
                                                 OUTPUT_NAME "_pyexttarget")

For this to work I changed some code in the CMakeLists.txt of pybind11:

if (NOT (CMAKE_VERSION VERSION_LESS 3.0))  # CMake >= 3.0
    set(target_name "module${PYBIND11_PYTHON_VERSION}")
    # Build an interface library target:
    add_library(${target_name} INTERFACE)
    target_include_directories(${target_name} INTERFACE $<BUILD_INTERFACE:${PYBIND11_INCLUDE_DIR}>
            $<BUILD_INTERFACE:${PYTHON_INCLUDE_DIRS}>
            $<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}>)
    if (WIN32 OR CYGWIN)
        target_link_libraries(${target_name} INTERFACE $<BUILD_INTERFACE:${PYTHON_LIBRARIES}>)
    elseif (APPLE)
        target_link_libraries(${target_name} INTERFACE "-undefined dynamic_lookup")
    endif ()
    target_compile_options(${target_name} INTERFACE $<BUILD_INTERFACE:${PYBIND11_CPP_STANDARD}>)

    add_library(pybind11::module ALIAS ${target_name})  # to match exported target
endif ()

Instead of creating a target named module, I name it module2.7 or module3.5, depending on the targeted Python version.

This allows me to build my extension twice with a single make.
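
For illustration, the consuming side could then be reduced to linking against the matching versioned target, which pulls in the include directories (and, on Windows, the libraries) set on the interface library above. A sketch with hypothetical target names:

add_library(_pyexttarget27 MODULE main.cpp)
target_link_libraries(_pyexttarget27 PRIVATE module2.7)

add_library(_pyexttarget35 MODULE main.cpp)
target_link_libraries(_pyexttarget35 PRIVATE module3.5)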

It works for me now, but I don't like it very much. I would like to hear your opinions on this. Do you know a better way to achieve the same thing? Would you like to support this use case in pybind11?

@thorink changed the title from "Targeting multiple Python version at once" to "Targeting multiple Python versions at once" on Mar 20, 2017
@asnt

asnt commented Mar 20, 2017

Thank you, @thorink. This is exactly what I was looking for. I do not know of a better way to achieve the same result, but I do want to emphasize that, from my point of view, this use case would be a very nice addition to pybind11.

@wjakob
Member

wjakob commented Mar 20, 2017

I can't think of a better way to do this with the current CMake tooling and agree that this is an important use case due to the ABI incompatibility of Python major versions.

@jagerman
Member

This would actually be nice for the CI tests: we could essentially cut the number of separate builds in half by having each build test against both 2.7 and 3.5/3.6 in one run. That ought to save a bunch of time, since for many of the Travis CI builds the majority of the time is spent on startup and dependency installation.

@thorink
Contributor Author

thorink commented Mar 30, 2017

I've done a first part (https://github.com/FIFTY2Technology/pybind11/tree/target_multiple_python_versions).

This is now possible:

set(PYBIND11_PYTHON_VERSION 2.7 3.5 3.6)
add_subdirectory(../pybind11 pybind11)

pybind11_add_module(example example.cpp)

It creates up to three targets (depending on how many of those Python versions are installed).

Do you think this is going in the right direction?

@dean0x7d
Member

dean0x7d commented Mar 30, 2017

I'd suggest keeping everything internal to pybind11_add_module with a syntax like:

pybind11_add_module(example27 ${src_files} PYTHON_VERSION 2.7)
pybind11_add_module(example36 ${src_files} PYTHON_VERSION 3.6)

Otherwise, it's going to interact poorly with the CMake targets (pybind11::module), and I don't see a good way to adapt those, since it would mean working against the CMake workflow. Creating modules for multiple Python versions within the same CMake run is only practical with the pybind11_add_module function, so it makes sense to keep the version selection localized to it.

That said, I don't actually fully understand the use case for this. What's the advantage of building multiple versions within a single CMake run compared to:

mkdir build27 && cd build27 && cmake -DPYBIND11_PYTHON_VERSION=2.7 .. && make
mkdir build36 && cd build36 && cmake -DPYBIND11_PYTHON_VERSION=3.6 .. && make

This works perfectly well and covers the CI use case that @jagerman mentioned. In what situation would doing this be problematic?

@jagerman
Member

The main reason I like it is that it gives better parallelism and lower build times when running multiple test builds. It would be very convenient to have one build directory that builds against, say, Python 2.7, 3.6, and PyPy. That has the potential to make the linking stage (which is by far the slowest single part of building the test suite, and is not parallelizable) noticeably faster compared to doing it in three separate builds.

@jagerman
Member

jagerman commented Mar 30, 2017

Another advantage would be builds for distributions like Debian (cc @ghisvail for input): I think that right now the Debian package just runs a plain cmake followed by make check (which is going to prefer Python 3.5). If changing the test suite to test against both is just a matter of adding -DPYBIND11_PYTHON_VERSION="2.7 3.5", it's extremely simple; if it requires multiple builds, it's not going to be worth bothering with.
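
For illustration, the whole thing would then be a single configure-and-build; a hypothetical invocation, assuming the proposed multi-version syntax gets accepted:

mkdir build && cd build
cmake -DPYBIND11_PYTHON_VERSION="2.7 3.5" ..
make && make check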

@dean0x7d
Member

I don't think this approach is going to be viable for the pybind11 test suite itself. It tests more than just the pybind11_add_module function, and the targets are tricky (an extra target plus a test executable will also be needed for embedding the interpreter -- upcoming PR).

But perhaps it's better to look beyond testing and distributing pybind11 itself. On the user side, for building and distributing Python extension modules with pip and conda, different versions must always be built independently, so that's not the use case either.

I believe there are use cases for fully custom builds (non-Python packaging), but I'm wondering: what does such a build look like? That would help inform the design of the feature inside pybind11's build system.

@ghisvail
Contributor

Concerning the build process on the Debian packaging side, here is what we are doing currently:

  • At package build time: run the CMake configuration, make, make check, and make install for the default Python 3 version (3.5 at the time of writing). The Python module is built for all supported Python versions (currently 2.7 and 3.5; later 3.5 and 3.6) and is patched to point to the system installation location of the headers, i.e. /usr/include.

  • At CI testing time: build the CMake example from the docs for all supported Python versions using:

CMakeLists.txt:

cmake_minimum_required(VERSION 2.8.12)
project(pybind11-example)

find_package(pybind11 REQUIRED)
pybind11_add_module(example example.cpp)

example.cpp:

// Basic example from the upstream documentation.
//
// See: http://pybind11.readthedocs.io/en/latest/basics.html

#include <pybind11/pybind11.h>

int add(int i, int j) {
    return i + j;
}

namespace py = pybind11;

PYBIND11_PLUGIN(example) {
    py::module m("example", "pybind11 example plugin");

    m.def("add", &add, "A function which adds two numbers");

    return m.ptr();
}

and then repeat the CMake configuration, the build, and the execution of $py -c "import example; assert(example.add(1, 2) == 3)" for each supported Python version, passing -DPYTHON_EXECUTABLE=/usr/bin/$py.
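
Roughly, that per-version loop looks like this (a sketch; the interpreter list and paths are illustrative):

for py in python2.7 python3.5; do
    mkdir "build-$py" && cd "build-$py"
    cmake -DPYTHON_EXECUTABLE="/usr/bin/$py" ..
    make
    "$py" -c "import example; assert(example.add(1, 2) == 3)"
    cd ..
done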

In the future I would like to replace the example test case above with running the test suite for all supported Python versions, but I have yet to find the time to try it.

@thorink
Contributor Author

thorink commented Apr 2, 2017

In our build pipeline we have the following steps: first we build a GUI app, a CLI app, and the Python extension for multiple Python versions, everything in one cmake/make run. After the test step we create installers for the GUI and CLI apps and package the Python extension into wheels, so we don't compile during the setup.py wheel-creation phase.
In the next couple of months we also want to integrate the Python extension into the CLI and GUI apps via an embedded Python interpreter (I'm happy to read that you are working on this, @dean0x7d 👍).

@thorink
Contributor Author

thorink commented Apr 13, 2017

I'd like to keep working on this topic. Building for multiple Python versions in one CMake run saves us a lot of compilation time, and it would be great if we could use a stable pybind11 version again.

For me it would also be fine if multiple pybind11_add_module invocations were necessary, as @dean0x7d suggested:

pybind11_add_module(example27 ${src_files} PYTHON_VERSION 2.7)
pybind11_add_module(example36 ${src_files} PYTHON_VERSION 3.6)

The main problem is that the Python version selection is already done when pybind11 is included. I chose this approach because it was closest to the current way of doing things.

Maybe it would work if we remembered the available interpreters and picked the right one from the list on every pybind11_add_module call. Then the user could set the target name individually.
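
A rough sketch of that idea, with hypothetical variable names; it reuses the cache-unsetting trick from the first comment so that FindPythonInterp can run once per requested version:

foreach(version IN LISTS PYBIND11_PYTHON_VERSIONS)
    # Clear the previous result so find_package runs again.
    unset(PYTHONINTERP_FOUND CACHE)
    unset(PYTHON_EXECUTABLE CACHE)
    find_package(PythonInterp ${version} EXACT)
    if(PYTHONINTERP_FOUND)
        string(REPLACE "." "" _v "${version}")
        # Remember this interpreter for later pybind11_add_module calls.
        set(PYBIND11_PYTHON_EXECUTABLE_${_v} "${PYTHON_EXECUTABLE}"
            CACHE INTERNAL "Python ${version} interpreter")
    endif()
endforeach()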

@dean0x7d
Member

dean0x7d commented Apr 18, 2017

@thorink I think that keeping the multiple version selection as local as possible would be the best thing. Users already rely on the global variables, so I'd be hesitant to mess with those too much (not to mention breaking the pybind11::* targets). A concrete proposal would be something like this:

function(pybind11_add_module target_name)
    # ... existing `cmake_parse_arguments` code goes here

    # The only new lines in this function:
    if(ARG_PYTHON_VERSION)
        # Replace PYTHON_INCLUDE_DIRS, PYTHON_LIBRARIES, etc. in the local scope
        python_version_override(${ARG_PYTHON_VERSION})
    endif()

    # ... continue existing code (with possibly overridden locals)
endfunction() # globals are in effect again after the function returns

where:

function(python_version_override version)
    # ... look up data specific to ${version}
    #     cache as needed, but avoid overriding existing globals

    # set variables in parent scope only
    set(PYTHON_INCLUDE_DIRS <value> PARENT_SCOPE)
    set(PYTHON_LIBRARIES <value> PARENT_SCOPE)
    # etc.
endfunction()

Looking at your branch, I believe you already have the code that does all this; my suggestion is just to encapsulate everything in a single function (python_version_override) and set the variables only in the parent scope (local non-cache variables take priority over cache variables of the same name in CMake). What do you think about this approach?
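
As a minimal, self-contained demonstration of the scoping rule this relies on (paths made up), a normal variable set inside a function shadows a same-named cache variable only until the function returns:

set(PYTHON_INCLUDE_DIRS "/usr/include/python3.5" CACHE STRING "global default")

function(demo_override)
    set(PYTHON_INCLUDE_DIRS "/usr/include/python2.7")  # local, non-cache
    message(STATUS "inside:  ${PYTHON_INCLUDE_DIRS}")  # -> /usr/include/python2.7
endfunction()

demo_override()
message(STATUS "outside: ${PYTHON_INCLUDE_DIRS}")      # -> /usr/include/python3.5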

@dscole

dscole commented Jun 29, 2017

Hey, has anyone had any progress on this?

@thorink
Contributor Author

thorink commented Jun 30, 2017

I would still like to have a nice solution for this. The problem is that CMake does not really support it; any solution will always feel like working against CMake.

Do you have similar needs? We have it working in our codebase, but I had to modify the pybind11 CMake scripts a bit (removing stuff we don't need). If you want, I can show you our current solution.

@Colelyman

Any updates on what people are currently doing to solve this issue? (Specifically, supporting two Python versions simultaneously.)

@wjakob
Member

wjakob commented Oct 26, 2018

@Colelyman: nothing at the moment, PRs are welcome.

@tdp2110
Contributor

tdp2110 commented Dec 20, 2018

+1 for this issue. I've been using a hack like @thorink's original approach.
