Inserting many items into a SQL Server database #217
You appear to be hitting SQL Server's limit of 2100 parameters per stored procedure. When pyodbc sends a parameterized query to SQL Server, it ends up being processed as a call to a SQL Server stored procedure. SQLAlchemy is producing a parameterized query using a "table value constructor" that in your case adds nine (9) parameters per row, so a batch of 200 rows results in a call with 1,800 bound parameters, and larger batches push past the 2100 cap. So for now it looks like you'll need to keep inserting in batches of ~230 rows or less. Note that pyodbc may be changing the way it handles parameterized queries, so keep an eye on issue #214.
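As a back-of-the-envelope check, the safe batch size follows directly from the two numbers above (the 2100-parameter cap and the 9 parameters per row); a quick sketch:

```python
# SQL Server allows at most 2100 parameters per call; the table value
# constructor in this issue binds 9 parameters per row.
SQL_SERVER_PARAM_LIMIT = 2100
PARAMS_PER_ROW = 9

def max_rows_per_batch(params_per_row, param_limit=SQL_SERVER_PARAM_LIMIT):
    """Largest number of rows whose bound parameters stay under the limit."""
    return param_limit // params_per_row

print(max_rows_per_batch(PARAMS_PER_ROW))  # 2100 // 9 = 233
```

That is where the "~230 rows or less" figure comes from: 233 rows x 9 parameters = 2097, while 234 rows would need 2106 and fail.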
Again, great catch, @gordthompson. I'm going to close this for now.
@gordthompson Is there any known workaround? I tried generating a single multi-row INSERT statement, but got:

[Number of row value expressions in the insert statement is bigger than the maximum number of 1000 row values]

So for me it looks like there is no way to insert more than 1000 rows at a time?
"So for me it looks like there is no way to insert more than 1000 rows at a time?" With a single INSERT statement, yes. The relevant MSDN document says that the maximum number of rows that can be constructed directly in a VALUES list is 1000. It goes on to say that to insert more rows than that, you should use multiple INSERT statements, a derived table, or bulk-import the data with the bcp utility or the BULK INSERT statement.
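Following the "multiple INSERT statements" route, splitting the rows into chunks of at most 1000 is straightforward. A minimal sketch (`insert_statements` is a hypothetical helper, not part of SQLAlchemy or pyodbc; it interpolates literal values, as in a without-binds INSERT, so it is only safe for trusted numeric data):

```python
# Split rows into chunks of at most 1000 (SQL Server's VALUES-list row cap)
# and emit one INSERT ... VALUES statement per chunk.

def chunked(rows, size=1000):
    """Yield successive slices of `rows` with at most `size` rows each."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def insert_statements(table, rows, size=1000):
    """Build one literal-value INSERT statement per chunk of rows.

    NOTE: values are interpolated as literals (no bind parameters), so this
    is only appropriate for trusted data such as numbers.
    """
    stmts = []
    for chunk in chunked(rows, size):
        values = ", ".join(
            "({})".format(", ".join(str(v) for v in row)) for row in chunk
        )
        stmts.append("INSERT INTO {} VALUES {}".format(table, values))
    return stmts

stmts = insert_statements("t", [(i, i * 2) for i in range(2500)])
print(len(stmts))  # 2500 rows -> 3 statements (1000 + 1000 + 500)
```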
The BULK INSERT statement should work for you; it can handle large imports. It has some quirks around constraint checking, input-file parsing, and reading input files that are remote from the database server. Here's an example (for SQL Server 2005-2017):
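The original example appears to have been lost, so here is a minimal sketch of the statement shape, built as a string from Python for illustration. The table name, file path, and terminators are assumptions; real imports usually need more WITH options:

```python
def bulk_insert_sql(table, data_file, field_term=",", row_term="\\n"):
    """Build a T-SQL BULK INSERT statement (SQL Server 2005-2017 syntax).

    `table` and `data_file` are placeholders for illustration only; the
    file path is resolved on the database server, not the client.
    """
    return (
        "BULK INSERT {table}\n"
        "FROM '{data_file}'\n"
        "WITH (FIELDTERMINATOR = '{ft}', ROWTERMINATOR = '{rt}')"
    ).format(table=table, data_file=data_file, ft=field_term, rt=row_term)

print(bulk_insert_sql("dbo.MyTable", "/tmp/rows.csv"))
```

Because BULK INSERT streams a file rather than binding parameters, neither the 2100-parameter cap nor the 1000-row VALUES cap applies.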
I'm currently trying to add 100k entries into a table on an MS SQL Server 2014 from Linux via SQLAlchemy. Unfortunately, an error happens when I try to insert more than 200 entries per insert. The error happens with both the latest MS ODBC driver and with the FreeTDS driver:

MS ODBC with 300 rows (error):

COUNT field incorrect or syntax error (0) (SQLExecDirectW)

FreeTDS with 300 rows (error):

Invalid descriptor index (0) (SQLBindParameter)

I'm currently uploading in batches of 200 and that works, but it is slow. If I run the same insert into a Postgres DB, it works. If I run a simple

INSERT INTO ... VALUES (real values, without binds, ...), ...

it also works (with ~500 rows, both in a SQL tool and via session.execute("SQL String") in SQLAlchemy). I'm a bit lost as to how to debug this further. Since it happens with both FreeTDS and the MS ODBC driver, it looks a bit like an MS SQL Server error?
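For what it's worth, the batching workaround itself is driver-agnostic. A minimal sketch of chunked inserts, using the stdlib `sqlite3` module in place of SQL Server purely to keep the example self-contained (the table name, columns, and batch size of 200 are taken from this thread, not from any library default):

```python
import sqlite3

def insert_in_batches(conn, rows, batch_size=200):
    """Insert rows in batches so no single statement exceeds the server's
    parameter/row limits (200 worked in this issue; tune per driver)."""
    cur = conn.cursor()
    for i in range(0, len(rows), batch_size):
        cur.executemany("INSERT INTO t (a, b) VALUES (?, ?)",
                        rows[i:i + batch_size])
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b INTEGER)")
insert_in_batches(conn, [(i, i * i) for i in range(1000)])
print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 1000
```

With SQLAlchemy the same pattern applies: slice the list of row dicts and call `execute()` once per slice instead of once for the whole 100k rows.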
Installed packages (ubuntu 16.10):