
[AIRFLOW-2859] Implement own UtcDateTime #3708

Merged
merged 1 commit into apache:master on Aug 8, 2018

Conversation

bolkedebruin
Contributor

Make sure you have checked all steps below.

Jira

  • My PR addresses the following Airflow Jira issues and references them in the PR title. For example, "[AIRFLOW-XXX] My Airflow PR"

Description

  • Here are some details about my PR, including screenshots of any UI changes:

The different UtcDateTime implementations all have issues.
Either they replace tzinfo directly without converting
or they do not convert to UTC at all.
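
For illustration, a minimal sketch of the idea (illustrative, not the exact code merged): a SQLAlchemy TypeDecorator that rejects naive datetimes and converts, rather than relabels, aware values to UTC before binding them:

```python
from datetime import datetime, timezone

from sqlalchemy.types import DateTime, TypeDecorator


class UtcDateTime(TypeDecorator):
    """Sketch: only accept tz-aware datetimes and convert them to UTC."""

    impl = DateTime(timezone=True)

    def process_bind_param(self, value, dialect):
        if value is None:
            return None
        if not isinstance(value, datetime):
            raise TypeError('expected datetime.datetime, got %r' % value)
        if value.tzinfo is None:
            raise ValueError('naive datetime is disallowed')
        # Convert to UTC instead of just replacing tzinfo.
        return value.astimezone(timezone.utc)
```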

Tests

  • My PR adds the following unit tests OR does not need testing for this extremely good reason:

Added

Commits

  • My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "How to write a good git commit message":
    1. Subject is separated from body by a blank line
    2. Subject is limited to 50 characters (not including Jira issue reference)
    3. Subject does not end with a period
    4. Subject uses the imperative mood ("add", not "adding")
    5. Body wraps at 72 characters
    6. Body explains "what" and "why", not "how"

Documentation

  • In case of new functionality, my PR adds documentation that describes how to use it.
    • When adding new operators/hooks/sensors, the autoclass documentation generation needs to be added.

Code Quality

  • Passes git diff upstream/master -u -- "*.py" | flake8 --diff

cc @Fokko @ashb new tests pass locally with the same MySQL version as Travis. If the tests fail for MySQL I might need some help ;-)

@ashb
Member

ashb commented Aug 6, 2018

Code looks okay at first glance. Not in front of a laptop today to test further.

@bolkedebruin bolkedebruin force-pushed the AIRFLOW-2859 branch 2 times, most recently from e587835 to ca975b4 on August 6, 2018 15:46
@codecov-io

codecov-io commented Aug 7, 2018

Codecov Report

Merging #3708 into master will decrease coverage by 59.85%.
The diff coverage is 44.44%.

Impacted file tree graph

@@             Coverage Diff             @@
##           master    #3708       +/-   ##
===========================================
- Coverage   77.57%   17.72%   -59.86%     
===========================================
  Files         204      204               
  Lines       15776    15789       +13     
===========================================
- Hits        12239     2798     -9441     
- Misses       3537    12991     +9454
Impacted Files                                          Coverage Δ
airflow/bin/cli.py                                      14.54% <ø> (-49.81%) ⬇️
airflow/jobs.py                                         12.22% <100%> (-70.54%) ⬇️
airflow/models.py                                       27.58% <20%> (-61.03%) ⬇️
airflow/utils/sqlalchemy.py                             56.92% <42.1%> (-16.99%) ⬇️
...w/example_dags/example_latest_only_with_trigger.py   0% <0%> (-100%) ⬇️
airflow/hooks/pig_hook.py                               0% <0%> (-100%) ⬇️
airflow/example_dags/example_branch_operator.py         0% <0%> (-100%) ⬇️
airflow/example_dags/example_docker_operator.py         0% <0%> (-100%) ⬇️
airflow/example_dags/subdags/subdag.py                  0% <0%> (-100%) ⬇️
airflow/www_rbac/blueprints.py                          0% <0%> (-100%) ⬇️
... and 167 more

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update d47580f...97e4be4. Read the comment docs.

@ashb
Member

ashb commented Aug 7, 2018

Can I help?

@bolkedebruin
Contributor Author

bolkedebruin commented Aug 7, 2018

@ashb I think I finally nailed it. The issue was in my understanding of how SQLAlchemy deals with timezone information inside fields for MySQL. If you insert 2018-08-08 20:40:21.443732+02:00 into MySQL, it will ignore the +02:00 and apply the connection's timezone setting (e.g. set time_zone='+01:00'). I thought SQLAlchemy would handle this somehow, i.e. by separating the timezone and doing something like [1].

So this obviously creates chaos, although not too much as long as you do not change the connection's timezone. In my tests I was actually changing it, hence they didn't pass. Strangely enough, when executed in isolation / locally they worked for some reason, probably due to connection re-use.

Anyway, I borrowed your event watcher and we now make sure we always connect in UTC with MySQL. For Postgres it doesn't matter, and we can test that it doesn't.

[1] https://stackoverflow.com/questions/7651409/mysql-datetime-insert-a-date-with-timezone-offset
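
Roughly what "always connect in UTC" looks like with a SQLAlchemy connect event (a sketch; the engine URL is a placeholder):

```python
from sqlalchemy import create_engine, event

engine = create_engine('mysql://user:pass@localhost/airflow')  # placeholder URL


@event.listens_for(engine, 'connect')
def set_utc_timezone(dbapi_connection, connection_record):
    # MySQL drops the offset from '...+02:00' and interprets the value in
    # the session time zone, so pin every new connection to UTC.
    cursor = dbapi_connection.cursor()
    cursor.execute("SET time_zone = '+00:00'")
    cursor.close()
```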

@ashb
Member

ashb commented Aug 7, 2018

AIP-3: Drop support for Mysql :D

@bolkedebruin
Contributor Author

I think our Google friends use MySQL 5.7 for their Cloud Composer. They probably won't be happy with us then ;-)

@feng-tao
Member

feng-tao commented Aug 7, 2018

flake8 fails.

The different UtcDateTime implementations all have issues.
Either they replace tzinfo directly without converting
or they do not convert to UTC at all.

We also ensure all mysql connections are in UTC
in order to keep sanity, as mysql will ignore the
timezone of a field when inserting/updating.
@bolkedebruin bolkedebruin merged commit 6fd4e60 into apache:master Aug 8, 2018
bolkedebruin added a commit that referenced this pull request Aug 8, 2018
The different UtcDateTime implementations all have issues.
Either they replace tzinfo directly without converting
or they do not convert to UTC at all.

We also ensure all mysql connections are in UTC
in order to keep sanity, as mysql will ignore the
timezone of a field when inserting/updating.

(cherry picked from commit 6fd4e60)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>
bolkedebruin added a commit that referenced this pull request Aug 8, 2018
The different UtcDateTime implementations all have issues.
Either they replace tzinfo directly without converting
or they do not convert to UTC at all.

We also ensure all mysql connections are in UTC
in order to keep sanity, as mysql will ignore the
timezone of a field when inserting/updating.

(cherry picked from commit 6fd4e60)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>
(cherry picked from commit 8fc8c7a)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>
lxneng pushed a commit to lxneng/incubator-airflow that referenced this pull request Aug 10, 2018
The different UtcDateTime implementations all have issues.
Either they replace tzinfo directly without converting
or they do not convert to UTC at all.

We also ensure all mysql connections are in UTC
in order to keep sanity, as mysql will ignore the
timezone of a field when inserting/updating.
aliceabe pushed a commit to aliceabe/incubator-airflow that referenced this pull request Jan 3, 2019
The different UtcDateTime implementations all have issues.
Either they replace tzinfo directly without converting
or they do not convert to UTC at all.

We also ensure all mysql connections are in UTC
in order to keep sanity, as mysql will ignore the
timezone of a field when inserting/updating.
cfei18 pushed a commit to cfei18/incubator-airflow that referenced this pull request Jan 23, 2019
The different UtcDateTime implementations all have issues.
Either they replace tzinfo directly without converting
or they do not convert to UTC at all.

We also ensure all mysql connections are in UTC
in order to keep sanity, as mysql will ignore the
timezone of a field when inserting/updating.

(cherry picked from commit 6fd4e60)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>
(cherry picked from commit 8fc8c7a)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>
wmorris75 pushed a commit to modmed/incubator-airflow that referenced this pull request Jul 31, 2019
author Ash Berlin-Taylor <ash_github@firemirror.com> 1564493832 +0100
committer wayne.morris <wayne.morris@modmed.com> 1564516048 -0400

parent 6ef0e37

[AIRFLOW-5052] Added the include_deleted param to salesforce_hook

[AIRFLOW-1840] Support back-compat on old celery config

The new names are in-line with Celery 4, but if anyone upgrades Airflow
without following the UPDATING.md instructions (which we probably assume
most people won't, not until something stops working) their workers
would suddenly just start failing. That's bad.

This will issue a warning but carry on working as expected. We can
remove the deprecation settings (but leave the code in config) after
this release has been made.

Closes apache#3549 from ashb/AIRFLOW-1840-back-compat

(cherry picked from commit a4592f9)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>

[AIRFLOW-2812] Fix error in Updating.md for upgrading to 1.10

Closes apache#3654 from nrhvyc/AIRFLOW-2812

[AIRFLOW-2816] Fix license text in docs/license.rst

(cherry picked from commit af15f11)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>

[AIRFLOW-2817] Force explicit choice on GPL dependency (apache#3660)

By default one of Apache Airflow's dependencies pulls in a GPL
library. Airflow should not install (and upgrade) without an explicit choice.

This is part of the Apache requirements as we cannot depend on Category X
software.

(cherry picked from commit c37fc0b)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>
(cherry picked from commit b39e453)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>

[AIRFLOW-2869] Remove smart quote from default config

Closes apache#3716 from wdhorton/remove-smart-quote-from-cfg

(cherry picked from commit 67e2bb9)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>
(cherry picked from commit 700f5f0)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>

[AIRFLOW-2140] Don't require kubernetes for the SparkSubmit hook (apache#3700)

This extra dep is a quasi-breaking change when upgrading - previously
there were no deps outside of Airflow itself for this hook. Importing
the k8s libs breaks installs that aren't also using Kubernetes.

This makes the dep optional for anyone who doesn't explicitly use the
functionality

(cherry picked from commit 0be002e)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>
(cherry picked from commit f58246d)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>

[AIRFLOW-2859] Implement own UtcDateTime (apache#3708)

The different UtcDateTime implementations all have issues.
Either they replace tzinfo directly without converting
or they do not convert to UTC at all.

We also ensure all mysql connections are in UTC
in order to keep sanity, as mysql will ignore the
timezone of a field when inserting/updating.

(cherry picked from commit 6fd4e60)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>
(cherry picked from commit 8fc8c7a)
Signed-off-by: Bolke de Bruin <bolke@xs4all.nl>

[AIRFLOW-2895] Prevent scheduler from spamming heartbeats/logs

Reverts most of AIRFLOW-2027 until the issues with it can be fixed.

Closes apache#3747 from aoen/revert_min_file_parsing_time_commit

[AIRFLOW-2979] Make celery_result_backend conf Backwards compatible (apache#3832)

(apache#2806) Renamed `celery_result_backend` to `result_backend` and broke backwards compatibility.

[AIRFLOW-2524] Add Amazon SageMaker Training (apache#3658)

Add SageMaker Hook, Training Operator & Sensor
Co-authored-by: srrajeev-aws <srrajeev@amazon.com>

[AIRFLOW-2524] Add Amazon SageMaker Tuning (apache#3751)

Add SageMaker tuning Operator and sensor
Co-authored-by: srrajeev-aws <srrajeev@amazon.com>

[AIRFLOW-2524] Add SageMaker Batch Inference (apache#3767)

* Fix for comments
* Fix sensor test
* Update non_terminal_states and failed_states to static variables of SageMakerHook

Add SageMaker Transform Operator & Sensor
Co-authored-by: srrajeev-aws <srrajeev@amazon.com>

[AIRFLOW-2763] Add check to validate worker connectivity to metadata Database

[AIRFLOW-2786] Gracefully handle Variable import errors (apache#3648)

Variables that are added through a file are not
checked as explicitly as creating a Variable in the
web UI. This handles exceptions that could be caused
by improper keys or values.

[AIRFLOW-2860] DruidHook: time check is wrong (apache#3745)

[AIRFLOW-2773] Validates Dataflow Job Name

Closes apache#3623 from kaxil/AIRFLOW-2773

[AIRFLOW-2845] Asserts in contrib package code are changed on raise ValueError and TypeError (apache#3690)

[AIRFLOW-1917] Trim extra newline and trailing whitespace from log (apache#3862)

[AIRFLOW-XXX] Fix SlackWebhookOperator docs (apache#3915)

The docs refer to `conn_id` while the actual argument is `http_conn_id`.

[AIRFLOW-2912] Add Deploy and Delete operators for GCF (apache#3969)

Both Deploy and Delete operators interact with Google
Cloud Functions to manage functions. Both are idempotent
and make use of GcfHook - hook that encapsulates
communication with GCP over GCP API.

[AIRFLOW-3078] Basic operators for Google Compute Engine (apache#4022)

Add GceInstanceStartOperator, GceInstanceStopOperator and GceSetMachineTypeOperator.

Each operator includes:
- core logic
- input params validation
- unit tests
- presence in the example DAG
- docstrings
- How-to and Integration documentation

Additionally, error checking was added to GceHook for the case where the response is 200 OK:

Some types of errors are only visible in the response's "error" field
and the overall HTTP response is 200 OK.

That is why, apart from checking whether the status is "done", we also
check whether "error" is empty; if it is not, an exception is raised with
the error message extracted from the "error" field of the response.

In this commit we also separated out Body Field Validator to
separate module in tools - this way it can be reused between
various GCP operators, it has proven to be usable in at least
two of them now.

Co-authored-by: sprzedwojski <szymon.przedwojski@polidea.com>
Co-authored-by: potiuk <jarek.potiuk@polidea.com>

[AIRFLOW-3183] Fix bug in DagFileProcessorManager.max_runs_reached() (apache#4031)

The condition is intended to ensure the function
will return False if any file's run_count is still smaller
than max_run. But the operator used here is "!=".
Instead, it should be "<".

This is because in DagFileProcessorManager,
there is no statement helping limit the upper
limit of run_count. It's possible that
files' run_count will be bigger than max_run.
In such case, max_runs_reached() method
may fail its purpose.
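
A sketch of the corrected check (function shape and names are illustrative, assumed from the description above, not the verbatim method):

```python
def max_runs_reached(run_counts, max_runs):
    """Return True only once every file has been parsed at least max_runs times."""
    for count in run_counts.values():
        # '!=' would wrongly report True if a count overshoots max_runs;
        # '<' keeps returning False while any file still has runs left.
        if count < max_runs:
            return False
    return True


print(max_runs_reached({'dag_a.py': 3, 'dag_b.py': 1}, max_runs=3))  # False
```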

[AIRFLOW-3099] Don't ever warn about missing sections of config (apache#4028)

Rather than looping through and setting each config variable
individually, and having to know which sections are optional and which
aren't, instead we can just call a single function on ConfigParser and
it will read the config from the dict, and more importantly here, never
error about missing sections - it will just create them as needed.
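
For example, ConfigParser.read_dict creates any missing section on the fly instead of raising (section and option names here are illustrative):

```python
from configparser import ConfigParser

defaults = {
    'core': {'dags_folder': '/opt/airflow/dags'},
    'kubernetes': {'worker_container_repository': ''},
}

parser = ConfigParser()
parser.read_dict(defaults)  # never complains about a missing section
print(parser.get('kubernetes', 'worker_container_repository'))
```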

[AIRFLOW-3089] Drop hard-coded url scheme in google auth redirect. (apache#3919)

The google auth provider hard-codes the `_scheme` in the callback url to
`https` so that airflow generates correct urls when run behind a proxy
that terminates tls. But this means that google auth can't be used when
running without https--for example, during local development. Also,
hard-coding `_scheme` isn't the correct solution to the problem of
running behind a proxy. Instead, the proxy should be configured to set
the `X-Forwarded-Proto` header to `https`; Flask interprets this header
and generates the appropriate callback url without hard-coding the
scheme.
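
One common way to do that on the Flask side is werkzeug's ProxyFix middleware (a general sketch using the werkzeug 0.15+ import path, not necessarily how Airflow wires it up):

```python
from flask import Flask
from werkzeug.middleware.proxy_fix import ProxyFix

app = Flask(__name__)
# Trust X-Forwarded-Proto from the TLS-terminating proxy so that
# url_for(..., _external=True) emits https URLs without hard-coding _scheme.
app.wsgi_app = ProxyFix(app.wsgi_app, x_proto=1)
```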

[AIRFLOW-3178] Handle percents signs in configs for airflow run (apache#4029)

* [AIRFLOW-3178] Don't mask defaults() function from ConfigParser

ConfigParser (the base class for AirflowConfigParser) expects defaults()
to be a function - so when we re-assign it to be a property some of the
methods from ConfigParser no longer work.

* [AIRFLOW-3178] Correctly escape percent signs when creating temp config

Otherwise we have a problem when we come to use those values.

* [AIRFLOW-3178] Use os.chmod instead of shelling out

There's no need to run another process for a built in Python function.

This also removes a possible race condition that would make temporary
config file be readable by more than the airflow or run-as user
The exact behaviour would depend on the umask we run under, and the
primary group of our user, likely this would mean the file was readably
by members of the airflow group (which in most cases would be just the
airflow user). To remove any such possibility we chmod the file
before we write to it
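
A sketch of the pattern: restrict permissions with built-in Python calls before writing anything sensitive, rather than shelling out to chmod afterwards.

```python
import os
import stat
import tempfile

fd, path = tempfile.mkstemp()  # mkstemp already creates the file as 0o600
os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)  # built-in equivalent of `chmod 600`
with os.fdopen(fd, 'w') as cfg:
    cfg.write('[core]\n')  # only write the config after permissions are set
```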

[AIRFLOW-2216] Use profile for AWS hook if S3 config file provided in aws_default connection extra parameters (apache#4011)

Use profile for AWS hook if S3 config file provided in
aws_default connection extra parameters
Add test to validate profile set

[AIRFLOW-3138] Use current data type for migrations (apache#3985)

* Use timestamp instead of timestamp with timezone for migration.

[AIRFLOW-3119] Enable debugging with Celery(apache#3950)

This will enable --loglevel when launching a
celery worker and inherit that LOGGING_LEVEL
setting from airflow.cfg

[AIRFLOW-3197] EMRHook is missing new parameters of the AWS API (apache#4044)

Allow passing any params to the CreateJobFlow API, so that we don't have
to stay up to date with AWS api changes.

[AIRFLOW-3203] Fix DockerOperator & some operator test (apache#4049)

- For argument `image`, no need to explicitly
  add "latest" if tag is omitted.
  "latest" will be used by default if no
  tag provided. This is handled by `docker` package itself.

- Intermediate variable `cpu_shares` is not needed.

- Fix wrong usage of `cpu_shares` and `mem_limit`.
  Based on
  https://docker-py.readthedocs.io/en/stable/api.html#docker.api.container.ContainerApiMixin.create_host_config,
  They should be an arguments of
  self.cli.create_host_config()
  rather than
  APIClient.create_container().

- Change name of the corresponding test script,
  to ensure it can be discovered.

- Fix the test itself.

- Some other test scripts are not named properly,
  which results in failure of test discovery.

[AIRFLOW-3232] More readable GCF operator documentation (apache#4067)

[AIRFLOW-3231] Basic operators for Google Cloud SQL (apache#4097)

Add CloudSqlInstanceInsertOperator, CloudSqlInstancePatchOperator and CloudSqlInstanceDeleteOperator.

Each operator includes:
- core logic
- input params validation
- unit tests
- presence in the example DAG
- docstrings
- How-to and Integration documentation

Additionally, small improvements to GcpBodyFieldValidator were made:
- add simple list validation capability (type="list")
- introduced parameter allow_empty, which can be set to False
	to test for non-emptiness of a string instead of specifying
	a regexp.

Co-authored-by: sprzedwojski <szymon.przedwojski@polidea.com>
Co-authored-by: potiuk <jarek.potiuk@polidea.com>

[AIRFLOW-2524] Update SageMaker hook and operators (apache#4091)

This re-works the SageMaker functionality in Airflow to be more complete, and more useful for the kinds of operations that SageMaker supports.

We removed some files and operators here, but these were only added after the last release so we don't need to worry about any sort of back-compat.

[AIRFLOW-3276] Cloud SQL: database create / patch / delete operators (apache#4124)

[AIRFLOW-2192] Allow non-latin1 usernames with MySQL backend by adding a SQL_ENGINE_ENCODING param and default to UTF-8 (apache#4087)

Comprised of:

Since we have unicode_literals imported and the engine arguments must be strings in Python 2, explicitly make 'utf-8' a string.

Replace bare exception with conf.AirflowConfigException for missing value.

It's just good for strings apparently.

Add utf-8 to default_airflow.cfg - question: do I still need the try/except block or can we depend on defaults (I note some have both).

Get rid of try/except block and depend on default_airflow.cfg.

Use __str__ since calling str just gives us back a newstr as well.

Test that a panda user can be saved.

[AIRFLOW-3295] Fix potential security issue in DaskExecutor (apache#4128)

When user decides to use TLS/SSL encryption
for DaskExecutor communications,
`Distributed.Security` object will be created.

However, argument `require_encryption` is missed
to be set to `True` (its default value is `False`).

This may fail the TLS/SSL encryption setting-up.
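
The fix boils down to passing the flag explicitly when building the Security object (certificate paths below are illustrative):

```python
from distributed.security import Security

security = Security(
    tls_ca_file='ca.pem',
    tls_client_cert='cert.pem',
    tls_client_key='key.pem',
    require_encryption=True,  # defaults to False, so TLS was not enforced
)
```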

[AIRFLOW-XXX] Fix flake8 errors from apache#4144

[AIRFLOW-2574] Cope with '%' in SQLA DSN when running migrations (apache#3787)

Alembic uses a ConfigParser like Airflow does, and "%% is a special
value in there, so we need to escape it. As per the Alembic docs:

> Note that this value is passed to ConfigParser.set, which supports
> variable interpolation using pyformat (e.g. `%(some_value)s`). A raw
> percent sign not part of an interpolation symbol must therefore be
> escaped, e.g. `%%`
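
In practice that means doubling any raw percent sign before handing the DSN to Alembic (the URL here is made up):

```python
from alembic.config import Config

sql_alchemy_conn = 'mysql://user:p%40ssword@localhost/airflow'  # '%40' is an escaped '@'
config = Config()
# ConfigParser interpolation would choke on a bare '%', so escape it as '%%'.
config.set_main_option('sqlalchemy.url', sql_alchemy_conn.replace('%', '%%'))
```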

[AIRFLOW-3090] Demote dag start/stop log messages to debug (apache#3920)

[AIRFLOW-3090] Specify path of key file in log message (apache#3921)

[AIRFLOW-3111] Fix instructions in UPDATING.md and remove comment (apache#3944)

artifacts in default_airflow.cfg

- fixed incorrect instructions in UPDATING.md regarding core.log_filename_template and elasticsearch.elasticsearch_log_id_template
- removed comments referencing "additional curly braces" from
default_airflow.cfg since they're irrelevant to the rendered airflow.cfg

[AIRFLOW-3127] Fix out-dated doc for Celery SSL (apache#3967)

Now in `airflow.cfg`, for Celery-SSL, the item names are
"ssl_active", "ssl_key", "ssl_cert", and "ssl_cacert".
(since PR https://github.com/apache/incubator-airflow/pull/2806/files)

But in the documentation
https://airflow.incubator.apache.org/security.html?highlight=celery
or
https://github.com/apache/incubator-airflow/blob/master/docs/security.rst,
it's "CELERY_SSL_ACTIVE", "CELERY_SSL_KEY", "CELERY_SSL_CERT", and
"CELERY_SSL_CACERT", which is out-dated and may confuse readers.

[AIRFLOW-3187] Update airflow.gif file with a slower version (apache#4033)

[AIRFLOW-3164] Verify server certificate when connecting to LDAP (apache#4006)

Misconfiguration and improper checking of exceptions disabled
server certificate checking. We now only support TLS connections
and do not support insecure connections anymore.

[AIRFLOW-2779] Add license headers to doc files (apache#4178)

This adds ASF license headers to all the .rst and .md files with the
exception of the Pull Request template (as that is included verbatim
when opening a Pull Request on Github which would be messy)

Added the include_deleted parameter to salesforce hook