[Feature] async support #619
Dear Robert, thank you for writing in. Are you looking for async support in the HTTP driver, or async support in the SQLAlchemy dialect? In general, you can always use the asyncpg or psycopg3 libraries, as outlined on this documentation page. If you are looking for async support in SQLAlchemy, on top of the PostgreSQL drivers enumerated above, that patch might bring in what you are looking for. In that case, please have a look at those examples, which can be used right away when following the corresponding dependency specifications.
With kind regards.
@amotl I just wanted to be able to use the HTTP bulk API with async; I don't need async SQLAlchemy at the moment. It seems like the HTTP bulk API processes bulk inserts more efficiently than going through the PostgreSQL protocol.
The main purpose of this library is to implement the DBAPI. Given that there is no async version of it yet, I'm not sure what adding async capabilities to this library would bring to the table. It's not much effort to use an async HTTP library directly, or one of the async PostgreSQL driver variants. E.g. with aiohttp (untested):

    import json
    import aiohttp

    async def bulk_insert(server_url):
        data = json.dumps({
            "stmt": "insert into ...",
            "bulk_args": [
                [...],
                [...],
            ]
        })
        async with aiohttp.ClientSession() as session:
            async with session.post(server_url,
                                    data=data,
                                    headers={'Content-Type': 'application/json'}) as resp:
                return await resp.json()
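For what it's worth, the bulk payload itself is plain JSON, so the part that builds it can be factored out and unit-tested without a running server. A minimal sketch, assuming an illustrative helper name and a hypothetical table (neither is part of crate-python):

```python
import json


def build_bulk_payload(stmt, rows):
    # CrateDB's HTTP endpoint accepts a single statement plus a list of
    # parameter rows under "bulk_args", executing the statement once per row.
    return json.dumps({"stmt": stmt, "bulk_args": rows})


payload = build_bulk_payload(
    "INSERT INTO tbl (x, y) VALUES (?, ?)",  # hypothetical table/columns
    [[1, "a"], [2, "b"]],
)
```

The resulting string is exactly what you would pass as `data=` to `session.post()` above.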
I'm doing a lot of bulk inserts that are generated from user activity. (Sometimes we have 10 rows, other times 100,000 rows broken up into blocks of 1,000.)
It would be really helpful to have async support so that we don't block while waiting for Crate to ingest 1,000 rows at a time.
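The batching described above (splitting a large set of rows into blocks of 1,000) is independent of the async question and can be sketched with a small generator; the helper name here is illustrative, not part of any library:

```python
def chunked(rows, size=1000):
    # Yield successive blocks of at most `size` rows, so each block can
    # be submitted as its own bulk request instead of one huge insert.
    for i in range(0, len(rows), size):
        yield rows[i:i + size]


# 2,500 example rows split into blocks of 1,000
blocks = list(chunked(list(range(2500)), size=1000))
```

Each block can then be awaited in turn, or dispatched concurrently with `asyncio.gather` if the server can keep up.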