DEPR: Positional arguments in to_sql except name (#54397)
* Updated method header and whatsnew file

* Updated unit tests to use keyword argument for con parameter.

* Updating unit tests and implementation.

* Updated documentation and unit tests.

* Updating documentation and fixing unit tests.

* Updating documentation.

* Updating documentation and fixing failing unit tests.

* Updating documentation and unit tests.

* Updating implementation based on reviewer feedback.

* Updating implementation to allow 'self' to be a positional arg.

* Deprecating con positional arg in new test case.

* Fixing typo

* Fixing typo
rmhowe425 authored Aug 10, 2023
1 parent 224457d commit f935543
Showing 6 changed files with 127 additions and 88 deletions.
10 changes: 5 additions & 5 deletions doc/source/user_guide/io.rst
@@ -5651,7 +5651,7 @@ the database using :func:`~pandas.DataFrame.to_sql`.
data = pd.DataFrame(d, columns=c)
data
-data.to_sql("data", engine)
+data.to_sql("data", con=engine)
With some databases, writing large DataFrames can result in errors due to
packet size limitations being exceeded. This can be avoided by setting the
@@ -5660,7 +5660,7 @@ writes ``data`` to the database in batches of 1000 rows at a time:

.. ipython:: python
-data.to_sql("data_chunked", engine, chunksize=1000)
+data.to_sql("data_chunked", con=engine, chunksize=1000)
SQL data types
++++++++++++++
@@ -5680,7 +5680,7 @@ default ``Text`` type for string columns:
from sqlalchemy.types import String
-data.to_sql("data_dtype", engine, dtype={"Col_1": String})
+data.to_sql("data_dtype", con=engine, dtype={"Col_1": String})
.. note::

@@ -5849,7 +5849,7 @@ have schema's). For example:

.. code-block:: python
-df.to_sql("table", engine, schema="other_schema")
+df.to_sql(name="table", con=engine, schema="other_schema")
pd.read_sql_table("table", engine, schema="other_schema")
Querying
@@ -5876,7 +5876,7 @@ Specifying this will return an iterator through chunks of the query result:
.. ipython:: python
df = pd.DataFrame(np.random.randn(20, 3), columns=list("abc"))
-df.to_sql("data_chunks", engine, index=False)
+df.to_sql(name="data_chunks", con=engine, index=False)
.. ipython:: python
2 changes: 1 addition & 1 deletion doc/source/whatsnew/v0.14.0.rst
@@ -437,7 +437,7 @@ This ``engine`` can then be used to write or read data to/from this database:
.. ipython:: python
df = pd.DataFrame({'A': [1, 2, 3], 'B': ['a', 'b', 'c']})
-df.to_sql('db_table', engine, index=False)
+df.to_sql(name='db_table', con=engine, index=False)
You can read data from a database by specifying the table name:

2 changes: 2 additions & 0 deletions doc/source/whatsnew/v2.1.0.rst
@@ -260,6 +260,7 @@ Other enhancements
- :meth:`DataFrame.to_parquet` and :func:`read_parquet` will now write and read ``attrs`` respectively (:issue:`54346`)
- Added support for the DataFrame Consortium Standard (:issue:`54383`)
- Performance improvement in :meth:`GroupBy.quantile` (:issue:`51722`)
-

.. ---------------------------------------------------------------------------
.. _whatsnew_210.notable_bug_fixes:
@@ -600,6 +601,7 @@ Other Deprecations
- Deprecated the use of non-supported datetime64 and timedelta64 resolutions with :func:`pandas.array`. Supported resolutions are: "s", "ms", "us", "ns" resolutions (:issue:`53058`)
- Deprecated values "pad", "ffill", "bfill", "backfill" for :meth:`Series.interpolate` and :meth:`DataFrame.interpolate`, use ``obj.ffill()`` or ``obj.bfill()`` instead (:issue:`53581`)
- Deprecated the behavior of :meth:`Index.argmax`, :meth:`Index.argmin`, :meth:`Series.argmax`, :meth:`Series.argmin` with either all-NAs and skipna=True or any-NAs and skipna=False returning -1; in a future version this will raise ``ValueError`` (:issue:`33941`, :issue:`33942`)
+- Deprecated allowing non-keyword arguments in :meth:`DataFrame.to_sql` except ``name``. (:issue:`54229`)
-

.. ---------------------------------------------------------------------------
22 changes: 14 additions & 8 deletions pandas/core/generic.py
@@ -97,7 +97,10 @@
SettingWithCopyWarning,
_chained_assignment_method_msg,
)
-from pandas.util._decorators import doc
+from pandas.util._decorators import (
+    deprecate_nonkeyword_arguments,
+    doc,
+)
from pandas.util._exceptions import find_stack_level
from pandas.util._validators import (
check_dtype_backend,
@@ -2792,6 +2795,9 @@ def to_hdf(
)

@final
+@deprecate_nonkeyword_arguments(
+    version="3.0", allowed_args=["self", "name"], name="to_sql"
+)

jorisvandenbossche (Member) commented on these lines, Aug 19, 2023:

> It might make sense to allow con as positional as well?
> (certainly given that it is a required argument, quite self-descriptive (eg not a boolean argument), and that this pattern is widely used in our own docs)
def to_sql(
self,
name: str,
@@ -2911,7 +2917,7 @@ def to_sql(
1 User 2
2 User 3
->>> df.to_sql('users', con=engine)
+>>> df.to_sql(name='users', con=engine)
3
>>> from sqlalchemy import text
>>> with engine.connect() as conn:
@@ -2922,14 +2928,14 @@
>>> with engine.begin() as connection:
... df1 = pd.DataFrame({'name' : ['User 4', 'User 5']})
-... df1.to_sql('users', con=connection, if_exists='append')
+... df1.to_sql(name='users', con=connection, if_exists='append')
2
This is allowed to support operations that require that the same
DBAPI connection is used for the entire operation.
>>> df2 = pd.DataFrame({'name' : ['User 6', 'User 7']})
->>> df2.to_sql('users', con=engine, if_exists='append')
+>>> df2.to_sql(name='users', con=engine, if_exists='append')
2
>>> with engine.connect() as conn:
... conn.execute(text("SELECT * FROM users")).fetchall()
Expand All @@ -2939,7 +2945,7 @@ def to_sql(
Overwrite the table with just ``df2``.
->>> df2.to_sql('users', con=engine, if_exists='replace',
+>>> df2.to_sql(name='users', con=engine, if_exists='replace',
... index_label='id')
2
>>> with engine.connect() as conn:
@@ -2956,7 +2962,7 @@
... stmt = insert(table.table).values(data).on_conflict_do_nothing(index_elements=["a"])
... result = conn.execute(stmt)
... return result.rowcount
->>> df_conflict.to_sql("conflict_table", conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP
+>>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_nothing) # doctest: +SKIP
0
For MySQL, a callable to update columns ``b`` and ``c`` if there's a conflict
@@ -2973,7 +2979,7 @@
... stmt = stmt.on_duplicate_key_update(b=stmt.inserted.b, c=stmt.inserted.c)
... result = conn.execute(stmt)
... return result.rowcount
->>> df_conflict.to_sql("conflict_table", conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP
+>>> df_conflict.to_sql(name="conflict_table", con=conn, if_exists="append", method=insert_on_conflict_update) # doctest: +SKIP
2
Specify the dtype (especially useful for integers with missing values).
@@ -2989,7 +2995,7 @@
2 2.0
>>> from sqlalchemy.types import Integer
->>> df.to_sql('integers', con=engine, index=False,
+>>> df.to_sql(name='integers', con=engine, index=False,
... dtype={"A": Integer()})
3
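The ``deprecate_nonkeyword_arguments`` decorator applied above lives in ``pandas.util._decorators`` and is internal to pandas. As a rough, stdlib-only sketch of the mechanism (not pandas' actual implementation — ``Frame``, the message wording, and the return value here are illustrative assumptions):

```python
import functools
import warnings

def deprecate_nonkeyword_arguments(version, allowed_args, name=None):
    """Sketch: warn when a call passes more positional args than allowed_args permits."""
    def decorate(func):
        num_allowed = len(allowed_args)
        qualname = name or func.__name__

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if len(args) > num_allowed:
                # Approximate wording; the real pandas warning is formatted differently.
                warnings.warn(
                    f"Starting with pandas version {version} all arguments of "
                    f"{qualname} except for the arguments {allowed_args!r} "
                    "will be keyword-only.",
                    FutureWarning,
                    stacklevel=2,
                )
            return func(*args, **kwargs)

        return wrapper

    return decorate

class Frame:
    """Illustrative stand-in for DataFrame, not the real class."""

    @deprecate_nonkeyword_arguments(
        version="3.0", allowed_args=["self", "name"], name="to_sql"
    )
    def to_sql(self, name, con=None, if_exists="fail"):
        return f"wrote {name!r} via {con!r}"

frame = Frame()
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    frame.to_sql("users", "engine")      # con passed positionally -> FutureWarning
    frame.to_sql("users", con="engine")  # con as keyword -> no warning
print(len(caught))  # prints 1
```

The real decorator does more than this sketch (for instance, adjusting the function's reported signature), but the core behavior is the same: positional arguments beyond ``self`` and ``name`` trigger a ``FutureWarning`` until version 3.0 makes them keyword-only.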
2 changes: 1 addition & 1 deletion pandas/io/sql.py
@@ -621,7 +621,7 @@ def read_sql(
>>> conn = connect(':memory:')
>>> df = pd.DataFrame(data=[[0, '10/11/12'], [1, '12/11/10']],
... columns=['int_column', 'date_column'])
->>> df.to_sql('test_data', conn)
+>>> df.to_sql(name='test_data', con=conn)
2
>>> pd.read_sql('SELECT int_column, date_column FROM test_data', conn)