
Postgres to Redshift Connector fails but sync shows succeeded #4538

Closed

danieldiamond opened this issue Jul 4, 2021 · 6 comments

@danieldiamond (Contributor)

Environment

  • Airbyte version: 0.26.4-alpha
  • OS Version / Instance: AWS EC2
  • Deployment: Docker
  • Source Connector and version: source-postgres:0.3.4
  • Destination Connector and version: destination-redshift:0.3.9
  • Severity: High
  • Step where error happened: Job Sync

Current Behavior

The source connector fails to connect and exits with an error, yet the sync status shows succeeded:

2021-07-04 20:35:44 ERROR () LineGobbler(voidCall):85 - Exception in thread "main" java.lang.RuntimeException: Unable establish a connection: Could not connect with provided configuration. Error: Cannot create PoolableConnectionFactory (Connection to localhost:9999 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.)
2021-07-04 20:35:44 ERROR () LineGobbler(voidCall):85 - 	at io.airbyte.integrations.source.postgres.PostgresSource.read(PostgresSource.java:187)
2021-07-04 20:35:44 ERROR () LineGobbler(voidCall):85 - 	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:103)
2021-07-04 20:35:44 ERROR () LineGobbler(voidCall):85 - 	at io.airbyte.integrations.source.postgres.PostgresSource.main(PostgresSource.java:335)
2021-07-04 20:35:45 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:45 INFO i.a.i.d.j.c.s.S3StreamCopier(<init>):94 - {} - S3 upload part size: 10 MB
2021-07-04 20:35:46 INFO () DefaultReplicationWorker(run):141 - Source thread complete.

Expected Behavior

Sync should have a failed status
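
The telling detail is near the end of the log: the source container exits with code 1 ("source process exit value: 1"), yet the worker only emits a WARN and still records status=completed. As a rough illustration only (hypothetical names, not Airbyte's actual worker code), the fix amounts to treating a non-zero exit code from the source process as a hard failure:

import java.io.IOException;

// Hypothetical sketch, not Airbyte's actual worker implementation: after the
// source stream ends, check the container's exit code and fail the attempt
// instead of merely logging a WARN as in the log below.
public class SourceExitCheck {
  public static void main(String[] args) throws IOException, InterruptedException {
    // Stand-in for the `docker run ... airbyte/source-postgres:0.3.4 read ...` process.
    Process source = new ProcessBuilder("false").start();
    int exitValue = source.waitFor();
    if (exitValue != 0) {
      // Mark the attempt as failed and skip normalization.
      throw new RuntimeException("Source process exited with code " + exitValue);
    }
  }
}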

Logs


2021-07-04 20:35:37 INFO () WorkerRun(call):62 - Executing worker wrapper. Airbyte version: 0.26.4-alpha
2021-07-04 20:35:37 INFO () TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.26.4-alpha
2021-07-04 20:35:37 INFO () DefaultReplicationWorker(run):102 - start sync worker. job id: ## attempt id: 0
2021-07-04 20:35:37 INFO () DefaultReplicationWorker(run):111 - configured sync modes: {public.jobs=full_refresh - append, public.attempts=full_refresh - append, public.airbyte_metadata=full_refresh - append}
2021-07-04 20:35:37 INFO () DefaultAirbyteDestination(start):78 - Running destination...
2021-07-04 20:35:38 INFO () LineGobbler(voidCall):85 - Checking if airbyte/destination-redshift:0.3.9 exists...
2021-07-04 20:35:38 INFO () LineGobbler(voidCall):85 - airbyte/destination-redshift:0.3.9 was found locally.
2021-07-04 20:35:38 INFO () DockerProcessFactory(create):127 - Preparing command: docker run --rm --init -i -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -w /data/##/0 --network host airbyte/destination-redshift:0.3.9 write --config destination_config.json --catalog destination_catalog.json
2021-07-04 20:35:38 INFO () LineGobbler(voidCall):85 - Checking if airbyte/source-postgres:0.3.4 exists...
2021-07-04 20:35:38 INFO () LineGobbler(voidCall):85 - airbyte/source-postgres:0.3.4 was found locally.
2021-07-04 20:35:38 INFO () DockerProcessFactory(create):127 - Preparing command: docker run --rm --init -i -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -w /data/##/0 --network host airbyte/source-postgres:0.3.4 read --config source_config.json --catalog source_catalog.json
2021-07-04 20:35:38 INFO () DefaultReplicationWorker(lambda$getDestinationOutputRunnable$3):243 - Destination output thread started.
2021-07-04 20:35:38 INFO () DefaultReplicationWorker(run):139 - Waiting for source thread to join.
2021-07-04 20:35:38 INFO () DefaultReplicationWorker(lambda$getReplicationRunnable$2):207 - Replication thread started.
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.s.p.PostgresSource(main):334 - {} - starting source: class io.airbyte.integrations.source.postgres.PostgresSource
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.d.r.RedshiftDestination(main):97 - {} - starting destination: class io.airbyte.integrations.destination.redshift.RedshiftDestination
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.b.IntegrationRunner(run):78 - {} - Running integration: io.airbyte.integrations.source.postgres.PostgresSource
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.b.IntegrationRunner(run):78 - {} - Running integration: io.airbyte.integrations.destination.redshift.RedshiftDestination
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.b.IntegrationCliParser(parseOptions):135 - {} - integration args: {catalog=destination_catalog.json, write=null, config=destination_config.json}
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.b.IntegrationRunner(run):82 - {} - Command: WRITE
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.b.IntegrationRunner(run):83 - {} - Integration config: IntegrationConfig{command=WRITE, configPath='destination_config.json', catalogPath='destination_catalog.json', statePath='null'}
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.b.IntegrationCliParser(parseOptions):135 - {} - integration args: {read=null, catalog=source_catalog.json, config=source_config.json}
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.b.IntegrationRunner(run):82 - {} - Command: READ
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.b.IntegrationRunner(run):83 - {} - Integration config: IntegrationConfig{command=READ, configPath='source_config.json', catalogPath='source_catalog.json', statePath='null'}
2021-07-04 20:35:43 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:43 INFO i.a.i.d.j.c.SwitchingDestination(getConsumer):83 - {} - Using destination type: COPY_S3
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:44 INFO i.a.i.s.p.PostgresSource(isCdc):285 - {} - using CDC: false
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:44 INFO i.a.i.s.j.AbstractJdbcSource(lambda$getCheckOperations$1):152 - {} - Attempting to get metadata from the database to see if we can connect.
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:44 INFO i.a.i.s.j.AbstractJdbcSource(check):138 - {} - Exception while checking connection:
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - java.sql.SQLException: Cannot create PoolableConnectionFactory (Connection to localhost:9999 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.)
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:669) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.BasicDataSource.createDataSource(BasicDataSource.java:544) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.BasicDataSource.getConnection(BasicDataSource.java:753) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at io.airbyte.db.jdbc.DefaultJdbcDatabase$DataSourceConnectionSupplier.getConnection(DefaultJdbcDatabase.java:140) ~[io.airbyte-airbyte-db-0.24.7-alpha.jar:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at io.airbyte.db.jdbc.DefaultJdbcDatabase.bufferedResultSetQuery(DefaultJdbcDatabase.java:69) ~[io.airbyte-airbyte-db-0.24.7-alpha.jar:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at io.airbyte.db.jdbc.StreamingJdbcDatabase.bufferedResultSetQuery(StreamingJdbcDatabase.java:63) ~[io.airbyte-airbyte-db-0.24.7-alpha.jar:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at io.airbyte.integrations.source.jdbc.AbstractJdbcSource.lambda$getCheckOperations$1(AbstractJdbcSource.java:153) ~[io.airbyte.airbyte-integrations.connectors-source-jdbc-0.24.7-alpha.jar:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at io.airbyte.integrations.source.jdbc.AbstractJdbcSource.check(AbstractJdbcSource.java:133) [io.airbyte.airbyte-integrations.connectors-source-jdbc-0.24.7-alpha.jar:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at io.airbyte.integrations.source.postgres.PostgresSource.read(PostgresSource.java:184) [io.airbyte.airbyte-integrations.connectors-source-postgres-0.24.7-alpha.jar:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:103) [io.airbyte.airbyte-integrations.bases-base-java-0.24.7-alpha.jar:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at io.airbyte.integrations.source.postgres.PostgresSource.main(PostgresSource.java:335) [io.airbyte.airbyte-integrations.connectors-source-postgres-0.24.7-alpha.jar:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - Caused by: org.postgresql.util.PSQLException: Connection to localhost:9999 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:303) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:225) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.Driver.makeConnection(Driver.java:465) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.Driver.connect(Driver.java:264) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:55) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:355) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:115) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:665) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	... 10 more
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - Caused by: java.net.ConnectException: Connection refused
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at sun.nio.ch.Net.pollConnect(Native Method) ~[?:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at sun.nio.ch.Net.pollConnectNow(Net.java:589) ~[?:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at sun.nio.ch.NioSocketImpl.timedFinishConnect(NioSocketImpl.java:542) ~[?:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:597) ~[?:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:333) ~[?:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at java.net.Socket.connect(Socket.java:648) ~[?:?]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.core.PGStream.createSocket(PGStream.java:231) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.core.PGStream.<init>(PGStream.java:95) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:98) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:213) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:51) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:225) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.Driver.makeConnection(Driver.java:465) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.postgresql.Driver.connect(Driver.java:264) ~[postgresql-42.2.18.jar:42.2.18]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:55) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:355) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.BasicDataSource.validateConnectionFactory(BasicDataSource.java:115) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	at org.apache.commons.dbcp2.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:665) ~[commons-dbcp2-2.7.0.jar:2.7.0]
2021-07-04 20:35:44 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 	... 10 more
2021-07-04 20:35:44 ERROR () LineGobbler(voidCall):85 - Exception in thread "main" java.lang.RuntimeException: Unable establish a connection: Could not connect with provided configuration. Error: Cannot create PoolableConnectionFactory (Connection to localhost:9999 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.)
2021-07-04 20:35:44 ERROR () LineGobbler(voidCall):85 - 	at io.airbyte.integrations.source.postgres.PostgresSource.read(PostgresSource.java:187)
2021-07-04 20:35:44 ERROR () LineGobbler(voidCall):85 - 	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:103)
2021-07-04 20:35:44 ERROR () LineGobbler(voidCall):85 - 	at io.airbyte.integrations.source.postgres.PostgresSource.main(PostgresSource.java:335)
2021-07-04 20:35:45 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:45 INFO i.a.i.d.j.c.s.S3StreamCopier(<init>):94 - {} - S3 upload part size: 10 MB
2021-07-04 20:35:46 INFO () DefaultReplicationWorker(run):141 - Source thread complete.
2021-07-04 20:35:46 INFO () DefaultReplicationWorker(run):142 - Waiting for destination thread to join.
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - {} - Initiated multipart upload to <redacted>
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO i.a.i.d.j.c.s.S3StreamCopier(<init>):94 - {} - S3 upload part size: 10 MB
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - {} - Initiated multipart upload to <redacted>
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO i.a.i.d.j.c.s.S3StreamCopier(<init>):94 - {} - S3 upload part size: 10 MB
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO a.m.s.StreamTransferManager(getMultiPartOutputStreams):329 - {} - Initiated multipart upload to <redacted>
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO i.a.i.d.b.BufferedStreamConsumer(startTracked):120 - {} - class io.airbyte.integrations.destination.buffered_stream_consumer.BufferedStreamConsumer started.
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO i.a.i.b.FailureTrackingAirbyteMessageConsumer(close):80 - {} - Airbyte message consumer: succeeded.
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO i.a.i.d.b.BufferedStreamConsumer(close):190 - {} - executing on success close procedure.
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO i.a.i.d.j.c.s.S3StreamCopier(closeAndWaitForUpload):194 - {} - Uploading remaining data for airbyte_metadata stream.
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO a.m.s.MultiPartOutputStream(close):158 - {} - Called close() on [MultipartOutputStream for parts 1 - 10000]
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO a.m.s.MultiPartOutputStream(close):158 - {} - Called close() on [MultipartOutputStream for parts 1 - 10000]
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 WARN a.m.s.MultiPartOutputStream(close):160 - {} - [MultipartOutputStream for parts 1 - 10000] is already closed
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO a.m.s.StreamTransferManager(complete):395 - {} - [Manager uploading to <redacted>
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO i.a.i.d.j.c.s.S3StreamCopier(closeAndWaitForUpload):198 - {} - All data for airbyte_metadata stream uploaded.
2021-07-04 20:35:46 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:46 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationSchema):134 - {} - Creating schema in destination if it doesn't exist: airbyte
2021-07-04 20:35:48 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:48 INFO i.a.i.d.j.c.s.S3StreamCopier(createTemporaryTable):140 - {} - Preparing tmp table in destination for stream: airbyte_metadata, schema: airbyte, tmp table name: _airbyte_1625430945358__airbyte_raw_airbyte_metadata.
2021-07-04 20:35:49 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:49 INFO i.a.i.d.j.c.s.S3StreamCopier(copyStagingFileToTemporaryTable):146 - {} - Starting copy to tmp table: _airbyte_1625430945358__airbyte_raw_airbyte_metadata in destination for stream: airbyte_metadata, schema: airbyte, .
2021-07-04 20:35:54 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:54 INFO i.a.i.d.j.c.s.S3StreamCopier(copyStagingFileToTemporaryTable):148 - {} - Copy to tmp table _airbyte_1625430945358__airbyte_raw_airbyte_metadata in destination for stream airbyte_metadata complete.
2021-07-04 20:35:54 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:54 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationTable):154 - {} - Preparing table _airbyte_raw_airbyte_metadata in destination.
2021-07-04 20:35:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:57 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationTable):156 - {} - Table _airbyte_1625430945358__airbyte_raw_airbyte_metadata in destination prepared.
2021-07-04 20:35:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:57 INFO i.a.i.d.j.c.s.S3StreamCopier(generateMergeStatement):163 - {} - Preparing to merge tmp table _airbyte_1625430945358__airbyte_raw_airbyte_metadata to dest table: _airbyte_raw_airbyte_metadata, schema: airbyte, in destination.
2021-07-04 20:35:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:57 INFO i.a.i.d.j.c.s.S3StreamCopier(closeAndWaitForUpload):194 - {} - Uploading remaining data for attempts stream.
2021-07-04 20:35:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:57 INFO a.m.s.MultiPartOutputStream(close):158 - {} - Called close() on [MultipartOutputStream for parts 1 - 10000]
2021-07-04 20:35:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:57 INFO a.m.s.MultiPartOutputStream(close):158 - {} - Called close() on [MultipartOutputStream for parts 1 - 10000]
2021-07-04 20:35:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:57 WARN a.m.s.MultiPartOutputStream(close):160 - {} - [MultipartOutputStream for parts 1 - 10000] is already closed
2021-07-04 20:35:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:57 INFO a.m.s.StreamTransferManager(complete):395 - {} - [Manager uploading to <redacted>
2021-07-04 20:35:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:57 INFO i.a.i.d.j.c.s.S3StreamCopier(closeAndWaitForUpload):198 - {} - All data for attempts stream uploaded.
2021-07-04 20:35:57 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:35:57 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationSchema):134 - {} - Creating schema in destination if it doesn't exist: airbyte
2021-07-04 20:36:00 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:00 INFO i.a.i.d.j.c.s.S3StreamCopier(createTemporaryTable):140 - {} - Preparing tmp table in destination for stream: attempts, schema: airbyte, tmp table name: _airbyte_1625430946235__airbyte_raw_attempts.
2021-07-04 20:36:05 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:05 INFO i.a.i.d.j.c.s.S3StreamCopier(copyStagingFileToTemporaryTable):146 - {} - Starting copy to tmp table: _airbyte_1625430946235__airbyte_raw_attempts in destination for stream: attempts, schema: airbyte, .
2021-07-04 20:36:08 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:08 INFO i.a.i.d.j.c.s.S3StreamCopier(copyStagingFileToTemporaryTable):148 - {} - Copy to tmp table _airbyte_1625430946235__airbyte_raw_attempts in destination for stream attempts complete.
2021-07-04 20:36:08 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:08 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationTable):154 - {} - Preparing table _airbyte_raw_attempts in destination.
2021-07-04 20:36:13 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:13 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationTable):156 - {} - Table _airbyte_1625430946235__airbyte_raw_attempts in destination prepared.
2021-07-04 20:36:13 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:13 INFO i.a.i.d.j.c.s.S3StreamCopier(generateMergeStatement):163 - {} - Preparing to merge tmp table _airbyte_1625430946235__airbyte_raw_attempts to dest table: _airbyte_raw_attempts, schema: airbyte, in destination.
2021-07-04 20:36:13 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:13 INFO i.a.i.d.j.c.s.S3StreamCopier(closeAndWaitForUpload):194 - {} - Uploading remaining data for jobs stream.
2021-07-04 20:36:13 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:13 INFO a.m.s.MultiPartOutputStream(close):158 - {} - Called close() on [MultipartOutputStream for parts 1 - 10000]
2021-07-04 20:36:13 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:13 INFO a.m.s.MultiPartOutputStream(close):158 - {} - Called close() on [MultipartOutputStream for parts 1 - 10000]
2021-07-04 20:36:13 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:13 WARN a.m.s.MultiPartOutputStream(close):160 - {} - [MultipartOutputStream for parts 1 - 10000] is already closed
2021-07-04 20:36:13 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:13 INFO a.m.s.StreamTransferManager(complete):395 - {} - [Manager uploading to <redacted>
2021-07-04 20:36:13 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:13 INFO i.a.i.d.j.c.s.S3StreamCopier(closeAndWaitForUpload):198 - {} - All data for jobs stream uploaded.
2021-07-04 20:36:13 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:13 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationSchema):134 - {} - Creating schema in destination if it doesn't exist: airbyte
2021-07-04 20:36:15 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:15 INFO i.a.i.d.j.c.s.S3StreamCopier(createTemporaryTable):140 - {} - Preparing tmp table in destination for stream: jobs, schema: airbyte, tmp table name: _airbyte_1625430946398__airbyte_raw_jobs.
2021-07-04 20:36:16 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:16 INFO i.a.i.d.j.c.s.S3StreamCopier(copyStagingFileToTemporaryTable):146 - {} - Starting copy to tmp table: _airbyte_1625430946398__airbyte_raw_jobs in destination for stream: jobs, schema: airbyte, .
2021-07-04 20:36:21 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:21 INFO i.a.i.d.j.c.s.S3StreamCopier(copyStagingFileToTemporaryTable):148 - {} - Copy to tmp table _airbyte_1625430946398__airbyte_raw_jobs in destination for stream jobs complete.
2021-07-04 20:36:21 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:21 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationTable):154 - {} - Preparing table _airbyte_raw_jobs in destination.
2021-07-04 20:36:22 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:22 INFO i.a.i.d.j.c.s.S3StreamCopier(createDestinationTable):156 - {} - Table _airbyte_1625430946398__airbyte_raw_jobs in destination prepared.
2021-07-04 20:36:22 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:22 INFO i.a.i.d.j.c.s.S3StreamCopier(generateMergeStatement):163 - {} - Preparing to merge tmp table _airbyte_1625430946398__airbyte_raw_jobs to dest table: _airbyte_raw_jobs, schema: airbyte, in destination.
2021-07-04 20:36:26 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:26 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):175 - {} - Begin cleaning s3 staging file 7b7c1159-3bd6-40c3-9b72-d84f5ca8a44f/airbyte/airbyte_metadata.
2021-07-04 20:36:26 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:26 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):179 - {} - S3 staging file 7b7c1159-3bd6-40c3-9b72-d84f5ca8a44f/airbyte/airbyte_metadata cleaned.
2021-07-04 20:36:26 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:26 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):181 - {} - Begin cleaning _airbyte_1625430945358__airbyte_raw_airbyte_metadata tmp table in destination.
2021-07-04 20:36:29 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:29 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):183 - {} - _airbyte_1625430945358__airbyte_raw_airbyte_metadata tmp table in destination cleaned.
2021-07-04 20:36:29 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:29 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):175 - {} - Begin cleaning s3 staging file 7b7c1159-3bd6-40c3-9b72-d84f5ca8a44f/airbyte/attempts.
2021-07-04 20:36:29 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:29 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):179 - {} - S3 staging file 7b7c1159-3bd6-40c3-9b72-d84f5ca8a44f/airbyte/attempts cleaned.
2021-07-04 20:36:29 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:29 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):181 - {} - Begin cleaning _airbyte_1625430946235__airbyte_raw_attempts tmp table in destination.
2021-07-04 20:36:33 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:33 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):183 - {} - _airbyte_1625430946235__airbyte_raw_attempts tmp table in destination cleaned.
2021-07-04 20:36:33 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:33 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):175 - {} - Begin cleaning s3 staging file 7b7c1159-3bd6-40c3-9b72-d84f5ca8a44f/airbyte/jobs.
2021-07-04 20:36:33 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:33 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):179 - {} - S3 staging file 7b7c1159-3bd6-40c3-9b72-d84f5ca8a44f/airbyte/jobs cleaned.
2021-07-04 20:36:33 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:33 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):181 - {} - Begin cleaning _airbyte_1625430946398__airbyte_raw_jobs tmp table in destination.
2021-07-04 20:36:35 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:35 INFO i.a.i.d.j.c.s.S3StreamCopier(removeFileAndDropTmpTable):183 - {} - _airbyte_1625430946398__airbyte_raw_jobs tmp table in destination cleaned.
2021-07-04 20:36:35 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:35 INFO i.a.i.b.IntegrationRunner(run):118 - {} - Completed integration: io.airbyte.integrations.destination.redshift.RedshiftDestination
2021-07-04 20:36:35 INFO () DefaultAirbyteStreamFactory(lambda$create$0):73 - 2021-07-04 20:36:35 INFO i.a.i.d.r.RedshiftDestination(main):99 - {} - completed destination: class io.airbyte.integrations.destination.redshift.RedshiftDestination
2021-07-04 20:36:37 INFO () DefaultReplicationWorker(run):144 - Destination thread complete.
2021-07-04 20:36:37 WARN () DefaultAirbyteSource(close):132 - Source process might not have shut down correctly. source process alive: false, source process exit value: 1. This warning is normal if the job was cancelled.
2021-07-04 20:36:37 INFO () DefaultReplicationWorker(run):169 - sync summary: io.airbyte.config.ReplicationAttemptSummary@32d5b252[status=completed,recordsSynced=0,bytesSynced=0,startTime=1625430937981,endTime=1625430997391]
2021-07-04 20:36:37 INFO () DefaultReplicationWorker(run):178 - Source did not output any state messages
2021-07-04 20:36:37 WARN () DefaultReplicationWorker(run):189 - State capture: No state retained.
2021-07-04 20:36:37 INFO () TemporalAttemptExecution(get):133 - Stopping cancellation check scheduling...
2021-07-04 20:36:37 INFO () RetryingTemporalAttemptExecution(get):118 - Last output present: true. Should attempt again: false
2021-07-04 20:36:37 INFO () SyncWorkflow$ReplicationActivityImpl(replicate):200 - attempt summaries: [io.airbyte.config.ReplicationOutput@625d8120[replicationAttemptSummary=io.airbyte.config.ReplicationAttemptSummary@32d5b252[status=completed,recordsSynced=0,bytesSynced=0,startTime=1625430937981,endTime=1625430997391],state=<null>,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@44b2eafa[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@2123b37b[stream=io.airbyte.protocol.models.AirbyteStream@5485dc14[name=airbyte_metadata,jsonSchema={"type":"object","properties":{"key":{"type":"string"},"value":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[key]],namespace=airbyte,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=append,primaryKey=[[key]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@5a5f0325[stream=io.airbyte.protocol.models.AirbyteStream@9e9c7ae[name=attempts,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"job_id":{"type":"number"},"output":{"type":"string"},"status":{"type":"string"},"ended_at":{"type":"string"},"log_path":{"type":"string"},"created_at":{"type":"string"},"updated_at":{"type":"string"},"attempt_number":{"type":"number"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=airbyte,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=append,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@356d0583[stream=io.airbyte.protocol.models.AirbyteStream@4f720a46[name=jobs,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"scope":{"type":"string"},"config":{"type":"string"},"status":{"type":"string"},"created_at":{"type":"string"},"started_at":{"type":"string"},"updated_at":{"type":"string"},"config_type":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=airbyte,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=append,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}]]]
2021-07-04 20:36:37 INFO () SyncWorkflow$ReplicationActivityImpl(replicate):201 - sync summary: io.airbyte.config.StandardSyncOutput@779959c3[standardSyncSummary=io.airbyte.config.StandardSyncSummary@28e65d4c[status=completed,recordsSynced=0,bytesSynced=0,startTime=1625430937981,endTime=1625430997391],state=<null>,outputCatalog=io.airbyte.protocol.models.ConfiguredAirbyteCatalog@44b2eafa[streams=[io.airbyte.protocol.models.ConfiguredAirbyteStream@2123b37b[stream=io.airbyte.protocol.models.AirbyteStream@5485dc14[name=airbyte_metadata,jsonSchema={"type":"object","properties":{"key":{"type":"string"},"value":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[key]],namespace=airbyte,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=append,primaryKey=[[key]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@5a5f0325[stream=io.airbyte.protocol.models.AirbyteStream@9e9c7ae[name=attempts,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"job_id":{"type":"number"},"output":{"type":"string"},"status":{"type":"string"},"ended_at":{"type":"string"},"log_path":{"type":"string"},"created_at":{"type":"string"},"updated_at":{"type":"string"},"attempt_number":{"type":"number"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=airbyte,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=append,primaryKey=[[id]],additionalProperties={}], io.airbyte.protocol.models.ConfiguredAirbyteStream@356d0583[stream=io.airbyte.protocol.models.AirbyteStream@4f720a46[name=jobs,jsonSchema={"type":"object","properties":{"id":{"type":"number"},"scope":{"type":"string"},"config":{"type":"string"},"status":{"type":"string"},"created_at":{"type":"string"},"started_at":{"type":"string"},"updated_at":{"type":"string"},"config_type":{"type":"string"}}},supportedSyncModes=[full_refresh, incremental],sourceDefinedCursor=<null>,defaultCursorField=[],sourceDefinedPrimaryKey=[[id]],namespace=airbyte,additionalProperties={}],syncMode=full_refresh,cursorField=[],destinationSyncMode=append,primaryKey=[[id]],additionalProperties={}]],additionalProperties={}]]
2021-07-04 20:36:37 INFO () TemporalAttemptExecution(get):110 - Executing worker wrapper. Airbyte version: 0.26.4-alpha
2021-07-04 20:36:37 INFO () DefaultNormalizationWorker(run):61 - Running normalization.
2021-07-04 20:36:37 INFO () LineGobbler(voidCall):85 - Checking if airbyte/normalization:0.1.33 exists...
2021-07-04 20:36:37 INFO () LineGobbler(voidCall):85 - airbyte/normalization:0.1.33 was found locally.
2021-07-04 20:36:37 INFO () DockerProcessFactory(create):127 - Preparing command: docker run --rm --init -i -v airbyte_workspace:/data -v /tmp/airbyte_local:/local -w /data/##/0/normalize --network host airbyte/normalization:0.1.33 run --integration-type redshift --config destination_config.json --catalog destination_catalog.json
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 - Running: transform-config --config destination_config.json --integration-type redshift --out /data/##/0/normalize
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 - Namespace(config='destination_config.json', integration_type=<DestinationType.redshift: 'redshift'>, out='/data/##/0/normalize')
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 - transform_redshift
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 - Running: transform-catalog --integration-type redshift --profile-config-dir /data/##/0/normalize --catalog destination_catalog.json --out /data/##/0/normalize/models/generated/ --json-column _airbyte_data
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 - Processing destination_catalog.json...
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_ctes/airbyte/airbyte_metadata_ab1.sql from airbyte_metadata
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_ctes/airbyte/airbyte_metadata_ab2.sql from airbyte_metadata
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_ctes/airbyte/airbyte_metadata_ab3.sql from airbyte_metadata
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_tables/airbyte/airbyte_metadata.sql from airbyte_metadata
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_ctes/airbyte/attempts_ab1.sql from attempts
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_ctes/airbyte/attempts_ab2.sql from attempts
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_ctes/airbyte/attempts_ab3.sql from attempts
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_tables/airbyte/attempts.sql from attempts
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_ctes/airbyte/jobs_ab1.sql from jobs
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_ctes/airbyte/jobs_ab2.sql from jobs
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_ctes/airbyte/jobs_ab3.sql from jobs
2021-07-04 20:36:40 INFO () LineGobbler(voidCall):85 -   Generating airbyte_tables/airbyte/jobs.sql from jobs
2021-07-04 20:36:42 INFO () LineGobbler(voidCall):85 - Running with dbt=0.19.1
2021-07-04 20:36:44 INFO () LineGobbler(voidCall):85 - [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
2021-07-04 20:36:44 INFO () LineGobbler(voidCall):85 - There are 1 unused configuration paths:
2021-07-04 20:36:44 INFO () LineGobbler(voidCall):85 - - models.airbyte_utils.generated.airbyte_views
2021-07-04 20:36:44 INFO () LineGobbler(voidCall):85 - 
2021-07-04 20:36:44 INFO () LineGobbler(voidCall):85 - Found 12 models, 0 tests, 0 snapshots, 0 analyses, 389 macros, 0 operations, 0 seed files, 3 sources, 0 exposures
2021-07-04 20:36:44 INFO () LineGobbler(voidCall):85 - 
2021-07-04 20:36:46 INFO () LineGobbler(voidCall):85 - 20:36:46 | Concurrency: 32 threads (target='prod')
2021-07-04 20:36:46 INFO () LineGobbler(voidCall):85 - 20:36:46 | 
2021-07-04 20:36:46 INFO () LineGobbler(voidCall):85 - 20:36:46 | 1 of 3 START table model airbyte.attempts.................................................................... [RUN]
2021-07-04 20:36:46 INFO () LineGobbler(voidCall):85 - 20:36:46 | 2 of 3 START table model airbyte.jobs........................................................................ [RUN]
2021-07-04 20:36:46 INFO () LineGobbler(voidCall):85 - 20:36:46 | 3 of 3 START table model airbyte.airbyte_metadata............................................................ [RUN]
2021-07-04 20:37:04 INFO () LineGobbler(voidCall):85 - 20:37:04 | 3 of 3 OK created table model airbyte.airbyte_metadata....................................................... [SELECT in 17.35s]
2021-07-04 20:37:06 INFO () LineGobbler(voidCall):85 - 20:37:06 | 2 of 3 OK created table model airbyte.jobs................................................................... [SELECT in 19.89s]
2021-07-04 20:37:09 INFO () LineGobbler(voidCall):85 - 20:37:09 | 1 of 3 OK created table model airbyte.attempts............................................................... [SELECT in 23.04s]
2021-07-04 20:37:09 INFO () LineGobbler(voidCall):85 - 20:37:09 | 
2021-07-04 20:37:09 INFO () LineGobbler(voidCall):85 - 20:37:09 | Finished running 3 table models in 25.37s.
2021-07-04 20:37:09 INFO () LineGobbler(voidCall):85 - 
2021-07-04 20:37:09 INFO () LineGobbler(voidCall):85 - Completed successfully
2021-07-04 20:37:09 INFO () LineGobbler(voidCall):85 - 
2021-07-04 20:37:09 INFO () LineGobbler(voidCall):85 - Done. PASS=3 WARN=0 ERROR=0 SKIP=0 TOTAL=3
2021-07-04 20:37:12 INFO () DefaultNormalizationWorker(run):76 - Normalization executed in 0.
2021-07-04 20:37:12 INFO () TemporalAttemptExecution(get):133 - Stopping cancellation check scheduling...

Steps to Reproduce

  1. Connect a Postgres source to a Redshift destination
  2. Trigger a sync while the source is unreachable
  3. Observe that the sync is marked succeeded even though the source failed to connect
danieldiamond added the type/bug label on Jul 4, 2021
jrhizor added the area/connectors and priority/high labels on Jul 6, 2021
@rclmenezes (Contributor) commented on Jul 7, 2021

Environment

  • Airbyte version: 0.27.0-alpha
  • OS Version / Instance: Ubuntu 20.04
  • Deployment: Docker on EC2, t3.2xlarge
  • Source Connector and version: Postgres 0.3.5
  • Destination Connector and version: Postgres 0.3.6
  • Severity: High
  • Step where error happened: Sync job

Current Behavior

The sync ran out of memory on an Incremental - Dedupe + History run, and a lot of data never made it into the raw tables. For one table, zero rows loaded:

rmenezes=> select count(*) from _airbyte_raw_users;
 count 
-------
     0
(1 row)

For another table, 42976 rows loaded out of 97226:

rmenezes=> select count(*) from loan_applications;
 count 
-------
 97226
(1 row)

rmenezes=> select count(*) from airbyte.loan_applications;
 count 
-------
 42976
(1 row)
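
A quick way to spot this kind of silent partial load is to compare the raw staging table against the normalized table for each stream. This assumes Airbyte's default _airbyte_raw_<stream> naming and the schema layout shown above; adjust the names to your setup:

-- Sketch: a mismatch between raw and final counts (or a zero raw count, as
-- with _airbyte_raw_users above) indicates a partial or failed load.
select
  (select count(*) from airbyte._airbyte_raw_loan_applications) as raw_rows,
  (select count(*) from airbyte.loan_applications)              as final_rows;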

Despite this, the transformation step still happened and the run was marked as a success.

Expected Behavior

OOM errors should be loud and fail the run. If only some of the data loaded, and it was marked as a success, users might have a bad time downstream.

Logs

Logs attached below:

@ChristopheDuong (Contributor) commented on Jul 7, 2021

> OOM errors should be loud and fail the run. If only some of the data loaded, and it was marked as a success, users might have a bad time downstream.

In this case, normalization was triggered and started processing partial data. The sync should have failed and should not have triggered the transformation step.
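
As a rough sketch of that gating (hypothetical names, not the actual SyncWorkflow code), normalization would only be dispatched when replication reports success:

// Hypothetical sketch, not Airbyte's actual workflow code: the transformation
// step should be gated on the replication attempt's status.
public class SyncWorkflowSketch {
  enum Status { COMPLETED, FAILED }

  static void runSync(Status replicationStatus) {
    if (replicationStatus != Status.COMPLETED) {
      throw new IllegalStateException("Replication failed; not running normalization");
    }
    System.out.println("Running normalization...");
  }

  public static void main(String[] args) {
    runSync(Status.FAILED); // a failed replication must never reach the dbt step
  }
}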

@ChristopheDuong (Contributor)

This seems similar to the same topic discussed in:
#4298
#4236

@subodh1810 (Contributor)

Regarding the OOM, see #3969 (comment). We follow the exact same queue design in Postgres CDC as we did in MySQL, so chances are very high that the queue is at fault here as well.
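
For context on the failure mode: if the CDC reader buffers records in an effectively unbounded in-memory queue and the producer outpaces the consumer, the heap fills until the JVM dies. A bounded queue applies backpressure instead. This is a generic illustration, not the actual connector code:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Generic illustration (not the actual CDC connector code): a bounded
// BlockingQueue makes a fast producer block instead of exhausting the heap.
public class BoundedQueueDemo {
  public static void main(String[] args) throws InterruptedException {
    BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(1_000); // fixed capacity

    Thread producer = new Thread(() -> {
      for (int i = 0; i < 100_000; i++) {
        try {
          queue.put(new byte[1024]); // blocks while the queue is full
        } catch (InterruptedException e) {
          Thread.currentThread().interrupt();
          return;
        }
      }
    });
    producer.start();

    for (int i = 0; i < 100_000; i++) {
      queue.take(); // memory stays bounded at roughly capacity x record size
    }
    producer.join();
  }
}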

@subodh1810 (Contributor)

PR #4607 might help with the OOM issues.

@sherifnada (Contributor)

Closed in #4617.
