chore(druid): Remove legacy Druid NoSQL logic (#23997)
john-bodley authored Jun 9, 2023
1 parent bdb8bbe commit 9adb023
Showing 42 changed files with 58 additions and 323 deletions.
12 changes: 5 additions & 7 deletions CONTRIBUTING.md
@@ -1403,13 +1403,11 @@ Note not all fields are correctly categorized. The fields vary based on visualiz
### Time
| Field | Type | Notes |
| ------------------- | -------- | ------------------------------------- |
| `druid_time_origin` | _string_ | The Druid **Origin** widget |
| `granularity` | _string_ | The Druid **Time Granularity** widget |
| `granularity_sqla` | _string_ | The SQLA **Time Column** widget |
| `time_grain_sqla` | _string_ | The SQLA **Time Grain** widget |
| `time_range` | _string_ | The **Time range** widget |
| Field | Type | Notes |
| ------------------ | -------- | ------------------------------------- |
| `granularity_sqla` | _string_ | The SQLA **Time Column** widget |
| `time_grain_sqla` | _string_ | The SQLA **Time Grain** widget |
| `time_range` | _string_ | The **Time range** widget |
### GROUP BY
27 changes: 0 additions & 27 deletions RESOURCES/STANDARD_ROLES.md
@@ -197,30 +197,6 @@
|can add on AccessRequestsModelView|:heavy_check_mark:|O|O|O|
|can delete on AccessRequestsModelView|:heavy_check_mark:|O|O|O|
|muldelete on AccessRequestsModelView|:heavy_check_mark:|O|O|O|
|can edit on DruidDatasourceModelView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|can list on DruidDatasourceModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can show on DruidDatasourceModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can add on DruidDatasourceModelView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|can delete on DruidDatasourceModelView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|muldelete on DruidDatasourceModelView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|yaml export on DruidDatasourceModelView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|can edit on DruidClusterModelView|:heavy_check_mark:|O|O|O|
|can list on DruidClusterModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can show on DruidClusterModelView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can add on DruidClusterModelView|:heavy_check_mark:|O|O|O|
|can delete on DruidClusterModelView|:heavy_check_mark:|O|O|O|
|muldelete on DruidClusterModelView|:heavy_check_mark:|O|O|O|
|yaml export on DruidClusterModelView|:heavy_check_mark:|O|O|O|
|can list on DruidMetricInlineView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can add on DruidMetricInlineView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|can delete on DruidMetricInlineView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|can edit on DruidMetricInlineView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|can list on DruidColumnInlineView|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can add on DruidColumnInlineView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|can delete on DruidColumnInlineView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|can edit on DruidColumnInlineView|:heavy_check_mark:|:heavy_check_mark:|O|O|
|can refresh datasources on Druid|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can scan new datasources on Druid|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|menu access on Row Level Security|:heavy_check_mark:|O|O|O|
|menu access on Access requests|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|menu access on Home|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
@@ -229,10 +205,7 @@
|menu access on Chart Emails|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|menu access on Alerts|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|menu access on Alerts & Report|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|menu access on Druid Datasources|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|menu access on Druid Clusters|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|menu access on Scan New Datasources|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|menu access on Refresh Druid Metadata|:heavy_check_mark:|O|O|O|
|can share dashboard on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can share chart on Superset|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
|can list on FilterSets|:heavy_check_mark:|:heavy_check_mark:|:heavy_check_mark:|O|
12 changes: 5 additions & 7 deletions docs/docs/miscellaneous/chart-params.mdx
@@ -26,13 +26,11 @@ Note not all fields are correctly categorized. The fields vary based on visualiz

### Time

| Field | Type | Notes |
| ------------------- | -------- | ------------------------------------- |
| `druid_time_origin` | _string_ | The Druid **Origin** widget |
| `granularity` | _string_ | The Druid **Time Granularity** widget |
| `granularity_sqla` | _string_ | The SQLA **Time Column** widget |
| `time_grain_sqla` | _string_ | The SQLA **Time Grain** widget |
| `time_range` | _string_ | The **Time range** widget |
| Field | Type | Notes |
| ------------------ | -------- | ------------------------------------- |
| `granularity_sqla` | _string_ | The SQLA **Time Column** widget |
| `time_grain_sqla` | _string_ | The SQLA **Time Grain** widget |
| `time_range` | _string_ | The **Time range** widget |
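For illustration, a chart `params` fragment that uses only the remaining time fields might look like the following sketch (the column name and values are hypothetical):

```ts
// Hypothetical chart params after the legacy Druid fields are removed:
// only the SQLA-oriented time fields remain.
const timeParams = {
  granularity_sqla: 'created_at', // temporal column used for time filtering (hypothetical column)
  time_grain_sqla: 'P1D',         // time grain as an ISO 8601 duration (daily)
  time_range: 'Last week',        // free-form time range expression
};
```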

### GROUP BY

18 changes: 2 additions & 16 deletions docs/docs/miscellaneous/importing-exporting-datasources.mdx
@@ -8,7 +8,7 @@ version: 1
## Importing and Exporting Datasources

The superset cli allows you to import and export datasources from and to YAML. Datasources include
both databases and druid clusters. The data is expected to be organized in the following hierarchy:
databases. The data is expected to be organized in the following hierarchy:

```
├──databases
@@ -24,19 +24,6 @@ both databases and druid clusters. The data is expected to be organized in the f
| | | └──... (more metrics)
| | └── ... (more tables)
| └── ... (more databases)
└──druid_clusters
├──cluster_1
| ├──datasource_1
| | ├──columns
| | | ├──column_1
| | | ├──column_2
| | | └──... (more columns)
| | └──metrics
| | ├──metric_1
| | ├──metric_2
| | └──... (more metrics)
| └── ... (more datasources)
└── ... (more clusters)
```

### Exporting Datasources to YAML
@@ -59,8 +46,7 @@ references to be included (e.g. a column to include the table id it belongs to)
Alternatively, you can export datasources using the UI:

1. Open **Sources -> Databases** to export all tables associated to a single or multiple databases.
(**Tables** for one or more tables, **Druid Clusters** for clusters, **Druid Datasources** for
datasources)
(**Tables** for one or more tables)
2. Select the items you would like to export.
3. Click **Actions -> Export** to YAML
4. If you want to import an item that you exported through the UI, you will need to nest it inside
22 changes: 1 addition & 21 deletions docs/static/resources/openapi.json
@@ -850,13 +850,6 @@
"description": "HAVING clause to be added to aggregate queries using AND operator.",
"type": "string"
},
"having_druid": {
"description": "HAVING filters to be added to legacy Druid datasource queries. This field is deprecated",
"items": {
"$ref": "#/components/schemas/ChartDataFilter"
},
"type": "array"
},
"relative_end": {
"description": "End time for relative time deltas. Default: `config[\"DEFAULT_RELATIVE_START_TIME\"]`",
"enum": ["today", "now"],
@@ -1228,11 +1221,6 @@
],
"nullable": true
},
"druid_time_origin": {
"description": "Starting point for time grain counting on legacy Druid datasources. Used to change e.g. Monday/Sunday first-day-of-week. This field is deprecated and should be passed to `extras` as `druid_time_origin`.",
"nullable": true,
"type": "string"
},
"extras": {
"allOf": [
{
@@ -1250,7 +1238,7 @@
"type": "array"
},
"granularity": {
"description": "Name of temporal column used for time filtering. For legacy Druid datasources this defines the time grain.",
"description": "Name of temporal column used for time filtering.
"nullable": true,
"type": "string"
},
@@ -1270,14 +1258,6 @@
"nullable": true,
"type": "string"
},
"having_filters": {
"description": "HAVING filters to be added to legacy Druid datasource queries. This field is deprecated and should be passed to `extras` as `having_druid`.",
"items": {
"$ref": "#/components/schemas/ChartDataFilter"
},
"nullable": true,
"type": "array"
},
"is_rowcount": {
"description": "Should the rowcount of the actual query be returned",
"nullable": true,
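With `having_druid`, `having_filters`, and `druid_time_origin` gone, the chart-data query schema keeps time filtering on `granularity` and free-form HAVING in `extras`. A minimal request body consistent with the remaining fields might look like this sketch (the endpoint path, dataset id, and all names are assumptions):

```ts
// Hypothetical POST /api/v1/chart/data payload using only the surviving fields.
const chartDataRequest = {
  datasource: { id: 5, type: 'table' }, // hypothetical dataset
  queries: [
    {
      granularity: 'created_at',   // temporal column used for time filtering
      time_range: 'Last week',
      metrics: ['count'],
      columns: ['country'],
      extras: {
        time_grain_sqla: 'P1D',
        having: 'COUNT(*) > 10',   // free-form HAVING, ANDed into aggregate queries
      },
    },
  ],
  result_format: 'json',
  result_type: 'full',
};
```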
@@ -334,7 +334,6 @@ export type SharedSectionAlias =
| 'annotations'
| 'colorScheme'
| 'datasourceAndVizType'
| 'druidTimeSeries'
| 'sqlaTimeSeries'
| 'NVD3TimeSeries';

@@ -37,7 +37,7 @@ import { isDefined } from '../utils';

/**
* Build the common segments of all query objects (e.g. the granularity field derived from
* either sql alchemy or druid). The segments specific to each viz type is constructed in the
* SQLAlchemy). The segments specific to each viz type is constructed in the
* buildQuery method for each viz type (see `wordcloud/buildQuery.ts` for an example).
* Note the type of the formData argument passed in here is the type of the formData for a
* specific viz, which is a subtype of the generic formData shared among all viz types.
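As a rough sketch of the pattern this comment describes, a per-viz `buildQuery` can delegate the shared segments (including the SQLAlchemy-derived granularity) to the shared helper and only add viz-specific pieces. The helper signature below follows the wordcloud example referenced above and should be treated as an assumption:

```ts
// Sketch of a per-viz buildQuery, modelled on the wordcloud example mentioned above.
// Assumes buildQueryContext(formData, cb) hands the callback a base query object
// with the shared segments (granularity, time range, filters) already populated.
import { buildQueryContext, QueryFormData } from '@superset-ui/core';

export default function buildQuery(formData: QueryFormData) {
  return buildQueryContext(formData, baseQueryObject => [
    {
      ...baseQueryObject,
      row_limit: 100, // viz-specific additions go here
    },
  ]);
}
```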
@@ -61,7 +61,6 @@ export type QueryObjectFilterClause =
| UnaryQueryObjectFilterClause;

export type QueryObjectExtras = Partial<{
/** HAVING condition for Druid */
/** HAVING condition for SQLAlchemy */
having?: string;
relative_start?: string;
@@ -107,7 +106,7 @@ export interface QueryObject
/** SIMPLE where filters */
filters?: QueryObjectFilterClause[];

/** Time column for SQL, time-grain for Druid (deprecated) */
/** Time column for SQL */
granularity?: string;

/** If set, will group by timestamp */
@@ -119,9 +118,6 @@
/** Free-form HAVING SQL, multiple clauses are concatenated by AND */
having?: string;

/** SIMPLE having filters */
having_filters?: QueryObjectFilterClause[];

post_processing?: (PostProcessingRule | undefined)[];

/** Maximum numbers of rows to return */
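Read together with the comment changes above, a `QueryObject` now carries the SQL time column in `granularity` and free-form HAVING only; the `having_filters` list is gone. A small hedged example of the resulting shape (the import and all values are illustrative assumptions):

```ts
import { QueryObject } from '@superset-ui/core';

// Hypothetical query object under the updated typings.
const query: QueryObject = {
  granularity: 'order_date',                           // time column for SQL
  is_timeseries: true,                                 // group by timestamp
  metrics: ['count'],
  filters: [{ col: 'status', op: '==', val: 'shipped' }],
  having: 'SUM(amount) > 1000',                        // clauses are concatenated with AND
  row_limit: 1000,
};
```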
@@ -39,15 +39,6 @@ describe('buildQueryObject', () => {
expect(query.granularity).toEqual('ds');
});

it('should build granularity for druid datasources', () => {
query = buildQueryObject({
datasource: '5__druid',
granularity: 'ds',
viz_type: 'table',
});
expect(query.granularity).toEqual('ds');
});

it('should build metrics based on default queryFields', () => {
query = buildQueryObject({
datasource: '5__table',
@@ -117,7 +117,6 @@ export default {
js_columns: [],
where: '',
having: '',
having_filters: [],
filters: [
{
col: 'LATITUDE',
@@ -69,7 +69,6 @@ export default {
],
where: '',
having: '',
having_filters: [],
filters: [
{ col: 'LAT', op: 'IS NOT NULL', val: '' },
{ col: 'LON', op: 'IS NOT NULL', val: '' },
@@ -69,7 +69,6 @@ export default {
],
where: '',
having: '',
having_filters: [],
filters: [
{ col: 'LAT', op: 'IS NOT NULL', val: '' },
{ col: 'LON', op: 'IS NOT NULL', val: '' },
@@ -70,7 +70,6 @@ export const payload = theme => ({
js_columns: ['color'],
where: '',
having: '',
having_filters: [],
filters: [{ col: 'path_json', op: 'IS NOT NULL', val: '' }],
},
is_cached: false,
@@ -84,7 +84,6 @@ export default {
js_columns: [],
where: '',
having: '',
having_filters: [],
filters: [
{
col: 'geometry',
@@ -84,7 +84,6 @@ export default {
js_columns: ['population', 'area'],
where: '',
having: '',
having_filters: [],
filters: [{ col: 'contour', op: 'IS NOT NULL', val: '' }],
},
is_cached: false,
@@ -67,7 +67,6 @@ export default {
],
where: '',
having: '',
having_filters: [],
filters: [
{ col: 'LAT', op: 'IS NOT NULL', val: '' },
{ col: 'LON', op: 'IS NOT NULL', val: '' },
@@ -68,7 +68,6 @@ export default {
],
where: '',
having: '',
having_filters: [],
filters: [
{ col: 'LAT', op: 'IS NOT NULL', val: '' },
{ col: 'LON', op: 'IS NOT NULL', val: '' },
2 changes: 1 addition & 1 deletion superset-frontend/src/components/AlteredSliceTag/index.jsx
@@ -104,7 +104,7 @@ export default class AlteredSliceTag extends React.Component {
if (!ofd[fdKey] && !cfd[fdKey]) {
return;
}
if (['filters', 'having', 'having_filters', 'where'].includes(fdKey)) {
if (['filters', 'having', 'where'].includes(fdKey)) {
return;
}
if (!this.isEqualish(ofd[fdKey], cfd[fdKey])) {
@@ -765,7 +765,6 @@ describe('Ensure buildTree does not throw runtime errors when encountering an in
applied_time_extras: {},
where: '',
having: '',
having_filters: [],
filters: [],
},
is_cached: false,
@@ -3131,7 +3130,6 @@ describe('Ensure buildTree does not throw runtime errors when encountering an in
applied_time_extras: {},
where: '',
having: '',
having_filters: [],
filters: [],
},
is_cached: false,
@@ -16668,7 +16666,6 @@ describe('Ensure buildTree does not throw runtime errors when encountering an in
applied_time_extras: {},
where: '',
having: '',
having_filters: [],
filters: [
{
col: 'rank',
@@ -17723,7 +17720,6 @@ describe('Ensure buildTree does not throw runtime errors when encountering an in
applied_time_extras: {},
where: '',
having: '',
having_filters: [],
filters: [],
},
is_cached: false,
@@ -27,7 +27,6 @@ export default function getFilterConfigsFromFormdata(form_data = {}) {
const {
date_filter,
filter_configs = [],
show_druid_time_granularity,
show_sqla_time_column,
show_sqla_time_granularity,
} = form_data;
@@ -93,13 +92,6 @@
};
}

if (show_druid_time_granularity) {
updatedColumns = {
...updatedColumns,
[TIME_FILTER_MAP.granularity]: form_data.granularity,
};
}

configs = {
...configs,
columns: updatedColumns,
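After this change only the SQLA toggles map form-data values onto filter-box columns. A minimal self-contained sketch of that mapping, where the `TIME_FILTER_MAP` keys and form-data fields are assumptions for illustration:

```ts
// Sketch of the surviving SQLA-only mapping; keys and values are assumed.
const TIME_FILTER_MAP = {
  granularity_sqla: '__time_col',
  time_grain_sqla: '__time_grain',
};

const form_data = {
  show_sqla_time_column: true,
  show_sqla_time_granularity: true,
  granularity_sqla: 'created_at',
  time_grain_sqla: 'P1D',
};

let updatedColumns: Record<string, string> = {};
if (form_data.show_sqla_time_column) {
  updatedColumns = {
    ...updatedColumns,
    [TIME_FILTER_MAP.granularity_sqla]: form_data.granularity_sqla,
  };
}
if (form_data.show_sqla_time_granularity) {
  updatedColumns = {
    ...updatedColumns,
    [TIME_FILTER_MAP.time_grain_sqla]: form_data.time_grain_sqla,
  };
}
// updatedColumns => { __time_col: 'created_at', __time_grain: 'P1D' }
```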
2 changes: 0 additions & 2 deletions superset-frontend/src/explore/constants.ts
@@ -125,8 +125,6 @@ export const sqlaAutoGeneratedMetricNameRegex =
/^(sum|min|max|avg|count|count_distinct)__.*$/i;
export const sqlaAutoGeneratedMetricRegex =
/^(LONG|DOUBLE|FLOAT)?(SUM|AVG|MAX|MIN|COUNT)\([A-Z0-9_."]*\)$/i;
export const druidAutoGeneratedMetricRegex =
/^(LONG|DOUBLE|FLOAT)?(SUM|MAX|MIN|COUNT)\([A-Z0-9_."]*\)$/i;

export const TIME_FILTER_LABELS = {
time_range: t('Time range'),
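For context, the remaining SQLA regexes above cover auto-generated metric names and expressions; a quick usage check (regexes copied from the hunk above, sample strings invented):

```ts
const sqlaAutoGeneratedMetricNameRegex =
  /^(sum|min|max|avg|count|count_distinct)__.*$/i;
const sqlaAutoGeneratedMetricRegex =
  /^(LONG|DOUBLE|FLOAT)?(SUM|AVG|MAX|MIN|COUNT)\([A-Z0-9_."]*\)$/i;

console.log(sqlaAutoGeneratedMetricNameRegex.test('sum__num'));  // true
console.log(sqlaAutoGeneratedMetricRegex.test('SUM("num")'));    // true
console.log(sqlaAutoGeneratedMetricRegex.test('MEDIAN("num")')); // false: not an auto-generated aggregate
```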
3 changes: 0 additions & 3 deletions superset-frontend/src/explore/controlPanels/Separator.js
@@ -69,9 +69,6 @@
},
},
sectionOverrides: {
druidTimeSeries: {
controlSetRows: [],
},
sqlaTimeSeries: {
controlSetRows: [],
},