db migrations to support new webhook operations #17671

Merged · 3 commits · Oct 7, 2022
@@ -138,7 +138,7 @@ void testBootloaderAppBlankDb() throws Exception {
val configsMigrator = new ConfigsDatabaseMigrator(configDatabase, configsFlyway);
// this line should change with every new migration
// to show that you meant to make a new migration to the prod database
-    assertEquals("0.40.11.002", configsMigrator.getLatestMigration().getVersion().getVersion());
+    assertEquals("0.40.12.001", configsMigrator.getLatestMigration().getVersion().getVersion());

val jobsPersistence = new DefaultJobPersistence(jobDatabase);
assertEquals(VERSION_0330_ALPHA, jobsPersistence.getVersion().get());
@@ -0,0 +1,50 @@
/*
* Copyright (c) 2022 Airbyte, Inc., all rights reserved.
*/

package io.airbyte.db.instance.configs.migrations;

import org.flywaydb.core.api.migration.BaseJavaMigration;
import org.flywaydb.core.api.migration.Context;
import org.jooq.DSLContext;
import org.jooq.impl.DSL;
import org.jooq.impl.SQLDataType;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// TODO: update migration description in the class name
public class V0_40_12_001__AddWebhookOperationColumns extends BaseJavaMigration {

  private static final Logger LOGGER = LoggerFactory.getLogger(V0_40_12_001__AddWebhookOperationColumns.class);

  @Override
  public void migrate(final Context context) throws Exception {
    LOGGER.info("Running migration: {}", this.getClass().getSimpleName());

    // Warning: please do not use any jOOQ generated code to write a migration.
    // As the database schema changes, the generated jOOQ code can become deprecated,
    // so old migrations may not compile if they contain any generated code.
    final DSLContext ctx = DSL.using(context.getConnection());
    addWebhookOperationConfigColumn(ctx);
    addWebhookOperationType(ctx);
    addWebhookConfigColumnsToWorkspaceTable(ctx);
  }

  private void addWebhookConfigColumnsToWorkspaceTable(final DSLContext ctx) {
    ctx.alterTable("workspace")
        .addColumnIfNotExists(DSL.field(
            "webhook_operation_configs",
            SQLDataType.JSONB.nullable(true)))
        .execute();
  }

  private void addWebhookOperationType(final DSLContext ctx) {
    ctx.alterType("operator_type").addValue("webhook").execute();
  }

  private void addWebhookOperationConfigColumn(final DSLContext ctx) {
    ctx.alterTable("operation").addColumnIfNotExists(DSL.field("operator_webhook",
Contributor:
@mfsiega-airbyte one bit I'm not sure I understand - can you remind me again why we need to add this to the operation table? Is it so we have a way to indicate that the operation is of a specific webhook type?

Contributor (author):

Basically: different connections might want different configs (and that happens by creating different entries in the operation table).

There are a few reasons:

  • For one, we need to say "use the webhook for this connection (and not that connection)". So we need to create an operation, and then the connection has a list of operations to execute, so we can indicate that we want to use the operation with a particular connection.
  • We might also want to e.g., change the parameters between connections.
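
As a sketch of the point above, wiring a webhook operation to one specific connection might look like the following SQL. Only `operator_webhook`, the `webhook` enum value, and the `connection_operation` columns come from this PR and its discussion; every id and the JSON payload shape are invented for illustration.

```sql
-- Hypothetical sketch: one webhook operation, attached to one connection.
-- The operator_webhook payload shape shown here is made up.
INSERT INTO operation (id, operator_type, operator_webhook)
VALUES (
  'aaaaaaaa-0000-0000-0000-000000000001',
  'webhook',
  '{"webhookUrl": "https://example.com/hook", "method": "POST"}'::jsonb
);

-- The connection picks the operation up via the connection_operation join table.
INSERT INTO connection_operation (id, connection_id, operation_id)
VALUES (
  'bbbbbbbb-0000-0000-0000-000000000001',
  'cccccccc-0000-0000-0000-000000000001',
  'aaaaaaaa-0000-0000-0000-000000000001'
);
```

A second connection that wants different webhook parameters would get its own `operation` row with a different `operator_webhook` payload.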

Contributor:

That sounds like something better suited for the connection_operation table. Maybe worth a quick sync tmrw?

Contributor (author):

The connection_operation table today just holds:

- id
- connection_id
- operation_id
- created_at/updated_at

essentially just hooking up connections and operations. (I'm not 100% sure why the connection_operation table exists, instead of just having an operation_ids column on the connection table, but I assume there's a good reason).

Whereas the operation table is where e.g., the existing dbt transformation config lives (which is mostly how I decided to put it there).

Regardless, +1 to syncing tomorrow - maybe good to run through the whole thing e2e for some early feedback as well.
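
Given that split of responsibilities, resolving the webhook config for a given connection is a join through `connection_operation`. A sketch (column names are taken from the comment above; the connection id is a placeholder):

```sql
-- Sketch: fetch the webhook operations configured for one connection.
SELECT o.id, o.operator_webhook
FROM connection_operation co
JOIN operation o ON o.id = co.operation_id
WHERE co.connection_id = 'cccccccc-0000-0000-0000-000000000001'
  AND o.operator_type = 'webhook';
```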

Contributor:

Spoke offline: this column is needed so webhook configs are stored on the operation table.

        SQLDataType.JSONB.nullable(true))).execute();
  }

}
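
For reference, the three jOOQ calls in the migration correspond roughly to this PostgreSQL DDL (a sketch; the exact statements are generated by jOOQ):

```sql
-- Column on "operation" holding the per-operation webhook config.
ALTER TABLE operation ADD COLUMN IF NOT EXISTS operator_webhook jsonb;

-- New variant of the operator_type enum.
-- Note: in PostgreSQL versions before 12, ALTER TYPE ... ADD VALUE
-- cannot run inside a transaction block.
ALTER TYPE operator_type ADD VALUE 'webhook';

-- Column on "workspace" holding workspace-level webhook configs.
ALTER TABLE workspace ADD COLUMN IF NOT EXISTS webhook_operation_configs jsonb;
```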
@@ -134,6 +134,7 @@ create table "public"."operation"(
   "tombstone" bool not null default false,
   "created_at" timestamptz(35) not null default null,
   "updated_at" timestamptz(35) not null default null,
+  "operator_webhook" jsonb null,
   constraint "operation_pkey"
     primary key ("id")
 );
@@ -178,6 +179,7 @@ create table "public"."workspace"(
   "created_at" timestamptz(35) not null default null,
   "updated_at" timestamptz(35) not null default null,
   "geography" geography_type not null default null,
+  "webhook_operation_configs" jsonb null,
   constraint "workspace_pkey"
     primary key ("id")
 );