
Nested event loop #14

Open
davidbrochart opened this issue Jul 25, 2023 · 6 comments
Labels
question Further information is requested

Comments

@davidbrochart
Contributor

I am wondering if greenletio could be used to implement a nested event loop. It seems that's currently not possible:

import asyncio
from greenletio import await_

async def async_function():
    pass

def sync_function():
    await_(async_function())
    # if asyncio allowed nested event loop, we could do:
    # asyncio.run(async_function())

async def main():
    sync_function()

asyncio.run(main())

# RuntimeError: await_ cannot be called from the asyncio task

But do you see a fundamental reason why it could not work?

@miguelgrinberg
Owner

I'm not sure I understand how this package can help.

Also, I don't see why the loop needs to be re-entrant, given that it is possible and legal to start additional loops on other threads. Below I took one of your examples posted elsewhere and modified it to do this:

import asyncio
from threading import Thread

def reentrant_asyncio_run(coro):
    ret = None
    def _run(coro):
        nonlocal ret  # write the result back to the enclosing scope
        ret = asyncio.run(coro)
    t = Thread(target=_run, args=(coro,))
    t.start()
    t.join()
    return ret

async def task(name):
    for i in range(10):
        await asyncio.sleep(0.1)
        print(f"from {name}: {i}")

async def bar():
    asyncio.create_task(task("bar"))
    await asyncio.sleep(1.1)
    print("bar done")

def foo():
    # asyncio.run inside an already running event loop
    # pre-empts the execution of any other task in the event loop
    reentrant_asyncio_run(bar())

async def main():
    t = asyncio.create_task(task("main"))  # not executed until foo() is done
    foo()
    await asyncio.sleep(1.1)  # t resumes execution

asyncio.run(main())
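For what it's worth, the thread-per-nested-run pattern above can be written a bit more compactly with the stdlib executor, which also propagates the coroutine's return value and any exception. This is just a sketch of the same idea; the `run_in_thread` and `answer` names are mine, not from this thread:

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

def run_in_thread(coro):
    # Run the coroutine on a fresh event loop in a worker thread and
    # block until it finishes. If called from inside a running loop,
    # that loop (and all its tasks) is paused for the duration, just
    # like in the Thread-based version above.
    with ThreadPoolExecutor(max_workers=1) as pool:
        return pool.submit(asyncio.run, coro).result()

async def answer():
    await asyncio.sleep(0)
    return 42

def sync_function():
    return run_in_thread(answer())

async def main():
    return sync_function()

print(asyncio.run(main()))  # prints 42
```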

@miguelgrinberg miguelgrinberg added the question Further information is requested label Jul 25, 2023
@davidbrochart
Contributor Author

I'm not sure I understand how this package can help.

Maybe it cannot, but since it allows running async code from a sync function, I thought it could detect that it's already running in an event loop and, in that case, do some greenlet magic to await the async function from the sync function. But I'm probably missing something.

Also I don't see why the loop needs to be re-entrant, given that it is possible and legal to start additional loops on other threads.

Sure, and that's our current solution in Jupyter. But threads are not as lightweight as coroutines, and some libraries don't like it when they don't run in the main thread. Also, for asyncio objects like Event to be used in both the async and the sync code, they must belong to the same event loop.
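To illustrate that last point, a minimal stdlib-only sketch (the `worker` and `done` names are mine, not from the discussion): an `asyncio.Event` is tied to the loop it is awaited in, so code running on another thread cannot call `set()` on it directly in a thread-safe way; it has to hand the call back to the owning loop:

```python
import asyncio
import threading

async def main():
    loop = asyncio.get_running_loop()
    done = asyncio.Event()  # meant to be awaited only in this loop

    def worker():
        # A plain thread must not call done.set() directly; scheduling
        # it on the owning loop is the thread-safe way to signal it.
        loop.call_soon_threadsafe(done.set)

    threading.Thread(target=worker).start()
    await done.wait()
    return "signalled"

print(asyncio.run(main()))
```

With two independent loops there is no such shared owner, which is why sharing primitives across nested loops is awkward.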

@miguelgrinberg
Owner

But I'm probably missing something.

Yes, I think what you are missing is that in spite of the greenlet stuff happening in the background, the asyncio loop runs without any hacks or modifications. A sync function that is relocated to a greenlet via the async_() function or decorator effectively becomes an async function that can interact with native async functions directly. Running async code from a sync function is definitely the goal of this package, but I don't see how tasks that were started before the sync function can be prevented from running until the sync function returns.

But threads are not as lightweight as coroutines

How many levels of loops inside loops do you need? For a handful of them, using threads should not have any noticeable performance or resource-consumption impact.

Also, for asyncio objects like Event to be used in the async and sync code, they must belong to the same event loop.

So sharing concurrency primitives is a requirement? What's the use case for it? And how is this going to work, given that only the innermost loop is active, while all the others are blocked and can neither trigger nor receive notifications? Feels like a recipe for deadlocks.

@davidbrochart
Contributor Author

Threads might be just fine, but in theory they can still be problematic with some libraries, or even on some platforms, like the browser running Python in WASM, where they don't exist. I'm sorry I can't provide a real use case; these are just some potential issues that make me look for a better solution.
Same for sharing concurrency primitives, which used to be possible when we were using nest-asyncio (though I admit it was more "magical", since asyncio.run was not even blocking).

@miguelgrinberg
Owner

I think you will need to convince the Python core team to implement this in Python, because doing this without threads is just not possible right now.

@gsakkis

gsakkis commented Jan 5, 2024

FWIW this is possible with the greenback package:

import asyncio
import greenback


async def async_function():
    await asyncio.sleep(1)


def sync_function():
    greenback.await_(async_function())


async def main():
    await greenback.ensure_portal()
    sync_function()

asyncio.run(main())
