feat: new FindPython support #2370
Conversation
Wow, that's an impressive amount of work!
I don't have the background to meaningfully review the details, but one high-level observation: when seeing PYBIND11_NEW_PYTHON
for the first time I had no clue what it might be doing. Looking through your changes I found:
PYBIND11_NEW_PYTHON "Force new findPython"
Aha! :-)
How about `PYBIND11_FORCE_NEW_FINDPYTHON`?

It's only a little bit longer but gives the right clues even at first sight.
Looks good to me. 👍
Merging if this passes! Made very minor (mostly docs) improvements after one more read over.
Hi, this seems to generate a .o file that cannot be imported on a different machine (Linux and macOS x86) than the host build machine. Reading the documentation and your exchange on GitHub left us a bit confused about how to do this. Could you refer us to an example of such a use case? Thank you.
I'm no expert in cross-compilation, but combined with Python, I don't know if it would be a good idea. The standard way to do this would be to set up a CI job (using GitHub Actions, Azure, GitLab CI, Travis, AppVeyor, or CircleCI, maybe Drone CI) that runs on each of the major OSs (such as https://scikit-hep.org/developer/gha_wheels), from which you can then download binaries for each platform you support. I would use setuptools (if you have a simple build) or CMake (more complex build) to get the compilation right on each platform, as the commands will vary. I don't think this has anything to do with the new FindPython support, though. ;)
Hello, we have a large source tree with a few pybind11 targets. We use multiple Python versions, which requires us to build the entire tree multiple times. I am very interested in building pybind11 targets against multiple versions of Python in the same run. I have not been able to find any examples of doing this, but I see comments to the effect that it is possible. I have CMake 3.21 and pybind11 2.6.1. Can someone point me in the right direction? Sample CMakeLists.txt:

```cmake
cmake_minimum_required(VERSION 3.21)
set(CMAKE_CXX_STANDARD 20)           # c++ standard version
set(CMAKE_CXX_STANDARD_REQUIRED ON)  # disable fallback behavior
set(CMAKE_CXX_EXTENSIONS OFF)        # no compiler extensions
project(tst)

set(pybind11_ROOT /opt/pb11)

set(PYTHON_ROOT /opt/conda/envs/py3.9.6)
#set(PYTHON_EXECUTABLE /opt/conda/envs/py3.9.6/bin/python3)
find_package(Python COMPONENTS Interpreter Development)
find_package(pybind11 CONFIG)
pybind11_add_module(tst-3.9.6 tst.cpp)

set(PYTHON_ROOT /opt/conda/envs/py3.9.7)
#set(PYTHON_EXECUTABLE /opt/conda/envs/py3.9.7/bin/python3)
find_package(Python COMPONENTS Interpreter Development)
find_package(pybind11 CONFIG)
pybind11_add_module(tst-3.9.7 tst.cpp)
```

Thank you.
Could you please use 2.8.1 instead of 2.6.1? CMake support is likely better. Also, I'd recommend trying to use the native tools more heavily (like `python_add_library`), and just use the lightweight pybind11 ones (like the `pybind11::headers` target). You might be able to do this with

Also, are you really sensitive to patch releases? A build should work on all patch releases of Python, unless you statically link or something like that.
I will download 2.8.1. I'm not really sensitive to patch releases; I just created two conda environments to try to get this working. Also, what are "native tools"? I will look at `pybind11_add_module`. Thank you, Henry.
I would build for two different minor releases of Python, as otherwise you'll be making exactly matching extensions, so there's no difference in the output; one will override the other. You'll make cp39 both times. By native tools, I mean the ones provided by FindPython, rather than our wrappers. We haven't put careful thought into building multiple Python extensions at once (normally, you are inside a Python environment and building for that environment; there aren't a lot of reasons to build against several external Pythons), but I think CMake's FindPython has. We do provide lots of granular targets to make this easy to do, though.
Also, be careful about building against Anaconda Python. They have custom compilers for Anaconda, so you have to use those, not your system compilers as you would with normal copies of Python.
This is true when people are just developing Python modules consisting of native extensions, for which the native source code is simply bundled in a PyPI package. Simply running

However, there is another situation, where people are working on an existing native codebase and just want to provide a Python binding library as a side product to distribute. Instead of filling in the build configuration in

People have proposed many workarounds for such cases, like #748 and ROCm/AMDMIGraphX#655. But an official solution would be more favourable.
One of the reasons for FindPython was to support multiple Pythons at once. We might already support it when using FindPython; we just haven't done much with setting things up to make sure we also support it. You have a typo; it's `Python_ROOT`:

```cmake
cmake_minimum_required(VERSION 3.21)
set(CMAKE_CXX_STANDARD 20)           # c++ standard version
set(CMAKE_CXX_STANDARD_REQUIRED ON)  # disable fallback behavior
set(CMAKE_CXX_EXTENSIONS OFF)        # no compiler extensions
project(tst)

set(pybind11_ROOT /opt/pb11)

set(Python_ROOT /opt/conda/envs/py3.9.6)
find_package(Python COMPONENTS Interpreter Development)
find_package(pybind11 CONFIG)
pybind11_add_module(tst-3.9.6 tst.cpp)

set(Python_ROOT /opt/conda/envs/py3.9.7)
find_package(Python COMPONENTS Interpreter Development)
find_package(pybind11 CONFIG) # maybe optional?
pybind11_add_module(tst-3.9.7 tst.cpp)
```

But if that doesn't work (and if it doesn't, I'd be willing to help fix whatever doesn't work if shown), then try the native tools:

```cmake
cmake_minimum_required(VERSION 3.21)
set(CMAKE_CXX_STANDARD 20)           # c++ standard version
set(CMAKE_CXX_STANDARD_REQUIRED ON)  # disable fallback behavior
set(CMAKE_CXX_EXTENSIONS OFF)        # no compiler extensions
project(tst)

set(pybind11_ROOT /opt/pb11)
set(pybind11_NOPYTHON TRUE)
find_package(pybind11 CONFIG)

set(Python_ROOT /opt/conda/envs/py3.9.6)
find_package(Python COMPONENTS Interpreter Development)
Python_add_library(tst-3.9.6 MODULE tst.cpp)
target_link_libraries(tst-3.9.6 PUBLIC pybind11::module)

set(Python_ROOT /opt/conda/envs/py3.9.7)
find_package(Python COMPONENTS Interpreter Development)
Python_add_library(tst-3.9.7 MODULE tst.cpp)
target_link_libraries(tst-3.9.7 PUBLIC pybind11::module)
```
I am following these steps and am not able to compile it for two different versions of Python. Here is my CMake file:

Two libraries are generated, but one of them is corrupted, and I am getting the following error from CMake:
```cmake
cmake_minimum_required(VERSION 3.21)
project(pybind11-example LANGUAGES CXX)
set(CMAKE_CXX_EXTENSIONS OFF)

include(FetchContent)

set(PYBIND11_NOPYTHON ON)
FetchContent_Declare(pybind11
  GIT_REPOSITORY https://github.com/pybind/pybind11.git
  GIT_TAG v2.8.1
)
FetchContent_MakeAvailable(pybind11)

set(Python_ROOT_DIR /usr)
find_package(Python 3.6 COMPONENTS Interpreter Development.Module)
Python_add_library(example example.cpp)
target_link_libraries(example PUBLIC pybind11::module)

set(Python_ROOT_DIR /opt/rh/rh-python38/root/usr)
find_package(Python 3.8 COMPONENTS Interpreter Development.Module)
Python_add_library(example38 example.cpp)
target_link_libraries(example38 PUBLIC pybind11::module)
```
Also, it's https://cmake.org/cmake/help/latest/module/FindPython.html#hints
I copied this CMake file and tested it; it still shows the same error.

Also, when I try to import example38, it throws the error below:
If you switch the order (3.8 first), does that find 3.8, or is it still stuck on 3.6?
If I switch the order, it finds 3.8 for both:

And if I write EXACT in `find_package`, it throws an error:
This refactors the targets and tooling to make them cleaner, simpler, and more modular, and (of course, given the title) adds support for FindPython in CMake 3.12+. Minimum versions of CMake are now explicitly fixed, listed, and tested (3.4+ on Linux, 3.7+ on macOS, and 3.8+ on Windows for building pybind11's own tests; probably 3.4+ for submodule/config mode on all platforms).
Modular targets

There is now a collection of modular targets, and you can mix and match. The targets beyond `pybind11::module` and `pybind11::embed` are considered advanced, but they provide better support than before for integrating pybind11 into an existing or complex build. Targets (other than `headers`) are generated at run/configure import time, which allows better customization for CMake version or generator (MSVC + CMake < 3.11, for example).

- `pybind11::headers` - just the pybind11 headers and minimum compile requirements
- `pybind11::pybind11` - Python headers too
- `pybind11::module` - for extension modules
- `pybind11::embed` - for embedding the Python interpreter
- `pybind11::python_link_helper` - just the "linking" part of `pybind11::module`
- `pybind11::python2_no_register` - quiets the warning/error when mixing C++14+ and Python 2; also included in `pybind11::module`
- `pybind11::lto` / `pybind11::thin_lto` - an alternative to `INTERPROCEDURAL_OPTIMIZATION`
- `pybind11::windows_extras` - `/bigobj` and `/MP` for MSVC

The stripping feature has been made available as `pybind11_strip(target)`, as well. The extension feature is available as `pybind11_extension(target)` (like everything else, this works with both modes).

These targets are now created in a common place, `pybind11Common.cmake`, which is shared across Config/CMakeLists, so both modes share the same initialization, with the single exception of `headers`, which is a normal, exported target, as it's the only one that contains non-"imported" information. All classic Python setup is in `pybind11Tools.cmake` and all new Python setup is in `pybind11NewTools.cmake`.
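As a sketch of how these granular targets can be mixed with FindPython's native tooling (the project name `example` and `example.cpp` are placeholders, not from this PR):

```cmake
cmake_minimum_required(VERSION 3.15)
project(example LANGUAGES CXX)

find_package(Python COMPONENTS Interpreter Development REQUIRED)
find_package(pybind11 CONFIG REQUIRED)

# Build the extension with CMake's native tooling rather than
# pybind11_add_module...
Python_add_library(example MODULE example.cpp)

# ...then attach only the pybind11 pieces needed: the headers plus the
# optional LTO and MSVC helper targets.
target_link_libraries(example PRIVATE pybind11::headers
                                      pybind11::lto
                                      pybind11::windows_extras)

# Helper functions: set the correct extension suffix and strip the binary.
pybind11_extension(example)
pybind11_strip(example)
```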
INTERPROCEDURAL_OPTIMIZATION

This has been refactored (mostly due to a bug in CMake 3.15-3.18's FindPython) to be simpler: if `CMAKE_INTERPROCEDURAL_OPTIMIZATION` is defined, use that (on or off). If not, add `pybind11::lto` or `pybind11::thin_lto` to the created target. This provides a way to turn it off (which was missing before), and a way to use the built-in CMake support if you know it supports your situation (it is affected by bugs due to CMake policies). If you target MSVC, `INTERPROCEDURAL_OPTIMIZATION` uses incremental IPO, which needs all targets to have it enabled, so it is better to use the CMake variable instead of the property; advanced users can always set the variable to OFF and then add it to each target. (The bug is fixed in CMake 3.18.2.)

FindPython Mode

To use the new FindPython support, simply use `find_package(Python COMPONENTS Interpreter Development)` before adding pybind11; this way, you can control the hints, run multiple times, etc. You can also use `-DPYBIND11_FINDPYTHON=ON` to force the search internally (very useful for building tests, etc.).

Example:
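A minimal project in this mode might look like the following sketch (module name and source file are placeholders):

```cmake
cmake_minimum_required(VERSION 3.12)
project(example LANGUAGES CXX)

# Find Python first, so you control the hints, version constraints, etc. ...
find_package(Python COMPONENTS Interpreter Development REQUIRED)

# ...then pybind11 picks up the FindPython results.
find_package(pybind11 CONFIG REQUIRED)

pybind11_add_module(example example.cpp)
```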
NOPYTHON mode

Manually selectable with `PYBIND11_NOPYTHON=ON` (also activated by calling FindPython2 and FindPython3 without FindPython); ideal for multi-target support, Scikit-Build PythonExtensions integration, OpenCV interaction, and more. Disables Python-specific features but still provides `pybind11::headers`, `pybind11::python2_no_register`, `pybind11::lto`, `pybind11::thin_lto`, `pybind11::windows_extras`, and `pybind11_strip`.
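A sketch of how NOPYTHON mode might be combined with FindPython's native `Python_add_library` (target and file names are placeholders; the `Development.Module` component requires CMake 3.18+):

```cmake
cmake_minimum_required(VERSION 3.18)
project(example LANGUAGES CXX)

# Skip pybind11's own Python detection entirely.
set(PYBIND11_NOPYTHON ON)
find_package(pybind11 CONFIG REQUIRED)

# Bring your own Python and extension-building machinery.
find_package(Python COMPONENTS Development.Module REQUIRED)
Python_add_library(example MODULE example.cpp)

# Only the Python-agnostic pieces of pybind11 are used here.
target_link_libraries(example PRIVATE pybind11::headers)
pybind11_strip(example)
```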
Classic Python mode

Support for `venv`, `virtualenv`, and `conda` added. GitHub Actions' `setup-python` activation, too.

TODO:
- `THIN_LTO` is not respected yet
- `INTERPROCEDURAL_OPTIMIZATION` from vcpkg.cmake #2355, closes GCC 4.8 LTO Crashes #1415: IPO can be deactivated or changed to the CMake built-in
- `pybind11::headers` (probably with NOPYTHON mode)

Bugs reported and fixed in CMake 3.18.2 (upcoming) as a result of this work:

- `INTERPROCEDURAL_OPTIMIZATION` can't be used with `Python_add_library`