
[Logs UI] Anomalies page dataset filtering #71110

Merged
51 commits
fbed94b
Add category anomalies to anomalies page
Kerry350 Jun 30, 2020
f585973
node scripts/i18n_check --fix
Kerry350 Jul 8, 2020
216cc9e
Add dataset filtering to the anomalies page
Kerry350 Jul 8, 2020
55f6d3f
Fix typecheck issues
Kerry350 Jul 9, 2020
9ab603f
Update x-pack/plugins/infra/common/http_api/log_analysis/results/log_…
Kerry350 Jul 9, 2020
27b63a2
Update x-pack/plugins/infra/common/http_api/log_analysis/results/log_…
Kerry350 Jul 9, 2020
ce78758
Update x-pack/plugins/infra/common/http_api/log_analysis/results/log_…
Kerry350 Jul 9, 2020
96ef88b
Update x-pack/plugins/infra/common/http_api/log_analysis/results/log_…
Kerry350 Jul 9, 2020
926feed
Update x-pack/plugins/infra/common/http_api/log_analysis/results/log_…
Kerry350 Jul 9, 2020
9cc61f2
Update x-pack/plugins/infra/common/http_api/log_analysis/results/log_…
Kerry350 Jul 9, 2020
c700591
Update x-pack/plugins/infra/common/http_api/log_analysis/results/log_…
Kerry350 Jul 9, 2020
0f919ee
Add aria labels
Kerry350 Jul 9, 2020
23346a3
Amend pluralisation on variables and fix useEffect dependency
Kerry350 Jul 9, 2020
c540e33
Ensure category anomalies fetch examples relevant to the category
Kerry350 Jul 9, 2020
d84a7f9
Refine loading states etc
Kerry350 Jul 9, 2020
0686810
Update x-pack/plugins/infra/public/components/loading_overlay_wrapper…
weltenwort Jul 10, 2020
4ff5799
Merge branch 'master' into 64755-expand-anomalies-page-to-add-categories
elasticmachine Jul 10, 2020
9d919fd
Merge remote-tracking branch 'upstream/master' into 64755-expand-anom…
Kerry350 Jul 13, 2020
9a95520
Move anomalies hook state over to a reducer for better clarity and bu…
Kerry350 Jul 13, 2020
0453295
Merge branch '64755-expand-anomalies-page-to-add-categories' of githu…
Kerry350 Jul 13, 2020
27ef241
Change conditional handling
Kerry350 Jul 13, 2020
578f80c
Account for the fact dataset may not exist
Kerry350 Jul 13, 2020
53eff11
Update x-pack/plugins/infra/server/routes/log_analysis/results/log_en…
Kerry350 Jul 13, 2020
34e92a9
Merge branch '64755-expand-anomalies-page-to-add-categories' of githu…
Kerry350 Jul 13, 2020
f2e44e6
Extract variable from common
Kerry350 Jul 13, 2020
1dd7c03
Update x-pack/plugins/infra/server/lib/log_analysis/log_entry_anomali…
Kerry350 Jul 13, 2020
c4f8725
Merge branch '64755-expand-anomalies-page-to-add-categories' of githu…
Kerry350 Jul 13, 2020
fbf5c09
Update x-pack/plugins/infra/server/lib/log_analysis/log_entry_anomali…
Kerry350 Jul 13, 2020
0e5441b
Merge branch '64755-expand-anomalies-page-to-add-categories' of githu…
Kerry350 Jul 13, 2020
a2c1d2f
Update x-pack/plugins/infra/public/pages/logs/log_entry_rate/sections…
Kerry350 Jul 13, 2020
9198baf
Merge branch '64755-expand-anomalies-page-to-add-categories' of githu…
Kerry350 Jul 13, 2020
51a1936
Amend message handling
Kerry350 Jul 13, 2020
ec0152c
Wrap no data state with a loading overlay wrapper
Kerry350 Jul 13, 2020
751ae96
Merge remote-tracking branch 'upstream/master' into 64755-expand-anom…
Kerry350 Jul 13, 2020
706239b
Ensure expanded rows extract their job id from the anomaly record
Kerry350 Jul 13, 2020
c5b2243
Update variable naming
Kerry350 Jul 13, 2020
f95947b
Amend i18n syntax
Kerry350 Jul 13, 2020
99a3e50
Ensure category is passed to category links
Kerry350 Jul 13, 2020
298b66d
Commit
Kerry350 Jul 13, 2020
6cc04bb
Appease eslint
Kerry350 Jul 13, 2020
2524277
Merge branch '64755-expand-anomalies-page-to-add-categories' into 710…
Kerry350 Jul 13, 2020
d3a9380
Change conditional syntax
Kerry350 Jul 13, 2020
47b17de
Add LoadingOverlayContent back for improved accessibility
Kerry350 Jul 13, 2020
667f9c9
node scripts/i18n_check --fix
Kerry350 Jul 13, 2020
f159057
Merge branch '64755-expand-anomalies-page-to-add-categories' into 710…
Kerry350 Jul 13, 2020
cc546d3
Use decodeOrThrow
Kerry350 Jul 13, 2020
0ec814e
Rework dataset filtering in with new reducer based hook state
Kerry350 Jul 13, 2020
2aef2f5
Merge remote-tracking branch 'upstream/master' into 71063-add-dataset…
Kerry350 Jul 13, 2020
2a7e384
Merge branch 'master' into 71063-add-dataset-filtering-to-anomalies-page
elasticmachine Jul 13, 2020
56cd384
i18n --fix
Kerry350 Jul 13, 2020
916ae3d
Merge branch '71063-add-dataset-filtering-to-anomalies-page' of githu…
Kerry350 Jul 13, 2020
@@ -10,3 +10,4 @@ export * from './log_entry_category_examples';
export * from './log_entry_rate';
export * from './log_entry_examples';
export * from './log_entry_anomalies';
export * from './log_entry_anomalies_datasets';
@@ -112,6 +112,8 @@ export const getLogEntryAnomaliesRequestPayloadRT = rt.type({
pagination: paginationRT,
// Sort properties
sort: sortRT,
// Dataset filters
datasets: rt.array(rt.string),
}),
]),
});
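For illustration (a sketch, not code from this PR), the extended anomalies request now carries a `datasets` array alongside sort and pagination. The real validation is done with io-ts (`rt.array(rt.string)`); the plain-TypeScript mirror below assumes hypothetical shapes for `sortRT` and `paginationRT`, which the diff does not show:

```typescript
// Illustrative mirror of the extended anomalies request payload. The sort and
// pagination shapes are assumptions; `datasets` is the field added in this PR.
interface AnomaliesRequestData {
  sourceId: string;
  timeRange: { startTime: number; endTime: number };
  sort: { field: string; direction: 'asc' | 'desc' }; // assumed shape of sortRT
  pagination: { pageSize: number }; // assumed shape of paginationRT
  datasets?: string[]; // new: restrict results to these datasets
}

// Runtime guard mimicking rt.array(rt.string) for the datasets field.
function isValidDatasetsField(value: unknown): boolean {
  return Array.isArray(value) && value.every((entry) => typeof entry === 'string');
}

const requestData: AnomaliesRequestData = {
  sourceId: 'default',
  timeRange: { startTime: 1594252800000, endTime: 1594339200000 },
  sort: { field: 'anomalyScore', direction: 'desc' },
  pagination: { pageSize: 25 },
  datasets: ['system.syslog', 'nginx.access'],
};
```

Because the filter rides along in the same request body, no extra round-trip is needed when the user narrows the dataset selection.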
@@ -0,0 +1,63 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/

import * as rt from 'io-ts';

import {
badRequestErrorRT,
forbiddenErrorRT,
timeRangeRT,
routeTimingMetadataRT,
} from '../../shared';

export const LOG_ANALYSIS_GET_LOG_ENTRY_ANOMALIES_DATASETS_PATH =
'/api/infra/log_analysis/results/log_entry_anomalies_datasets';

/**
* request
*/

export const getLogEntryAnomaliesDatasetsRequestPayloadRT = rt.type({
data: rt.type({
// the id of the source configuration
sourceId: rt.string,
// the time range to fetch the anomalies datasets from
timeRange: timeRangeRT,
}),
});

export type GetLogEntryAnomaliesDatasetsRequestPayload = rt.TypeOf<
typeof getLogEntryAnomaliesDatasetsRequestPayloadRT
>;

/**
* response
*/

export const getLogEntryAnomaliesDatasetsSuccessReponsePayloadRT = rt.intersection([
rt.type({
data: rt.type({
datasets: rt.array(rt.string),
}),
}),
rt.partial({
timing: routeTimingMetadataRT,
}),
]);

export type GetLogEntryAnomaliesDatasetsSuccessResponsePayload = rt.TypeOf<
typeof getLogEntryAnomaliesDatasetsSuccessReponsePayloadRT
>;

export const getLogEntryAnomaliesDatasetsResponsePayloadRT = rt.union([
getLogEntryAnomaliesDatasetsSuccessReponsePayloadRT,
badRequestErrorRT,
forbiddenErrorRT,
]);

export type GetLogEntryAnomaliesDatasetsReponsePayload = rt.TypeOf<
typeof getLogEntryAnomaliesDatasetsResponsePayloadRT
>;
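As a sketch of the wire format this new route implies (assumed, not taken verbatim from the PR), a success payload carries a required `data.datasets` array and optional timing metadata. A structural guard mirroring `getLogEntryAnomaliesDatasetsSuccessReponsePayloadRT` could look like:

```typescript
// Hypothetical success payload for the anomalies-datasets route. The timing
// field is optional (routeTimingMetadataRT); its exact shape is not in the diff.
interface DatasetsSuccessPayload {
  data: { datasets: string[] };
  timing?: unknown; // assumed: optional route timing metadata
}

// Structural guard mirroring the required part of the io-ts intersection.
function isDatasetsSuccessPayload(value: unknown): value is DatasetsSuccessPayload {
  if (typeof value !== 'object' || value === null) return false;
  const data = (value as { data?: unknown }).data;
  if (typeof data !== 'object' || data === null) return false;
  const datasets = (data as { datasets?: unknown }).datasets;
  return Array.isArray(datasets) && datasets.every((d) => typeof d === 'string');
}

const body: unknown = { data: { datasets: ['nginx.access', 'system.syslog'] } };
```

The union with `badRequestErrorRT` and `forbiddenErrorRT` means a caller must narrow the decoded value before reading `data.datasets`, which is exactly what a guard like this does.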
@@ -16,11 +16,16 @@ export const LOG_ANALYSIS_GET_LOG_ENTRY_RATE_PATH =
*/

export const getLogEntryRateRequestPayloadRT = rt.type({
data: rt.type({
bucketDuration: rt.number,
sourceId: rt.string,
timeRange: timeRangeRT,
}),
data: rt.intersection([
rt.type({
bucketDuration: rt.number,
sourceId: rt.string,
timeRange: timeRangeRT,
}),
rt.partial({
datasets: rt.array(rt.string),
}),
]),
});

export type GetLogEntryRateRequestPayload = rt.TypeOf<typeof getLogEntryRateRequestPayloadRT>;
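The hunk above swaps a single `rt.type` for `rt.intersection([rt.type({...}), rt.partial({...})])`, so `datasets` becomes optional while the original fields stay required. In plain TypeScript terms (a sketch, not the PR's code) the merged type behaves like an intersection of a required and a partial interface:

```typescript
// Plain-TypeScript mirror of rt.intersection([rt.type({...}), rt.partial({...})]):
// the required block and the optional block merge into one object type.
interface RequiredRatePayload {
  bucketDuration: number;
  sourceId: string;
  timeRange: { startTime: number; endTime: number };
}
interface OptionalRatePayload {
  datasets?: string[]; // optional dataset filter added in this PR
}
type LogEntryRateRequestData = RequiredRatePayload & OptionalRatePayload;

// Both forms are valid: with the new filter...
const filtered: LogEntryRateRequestData = {
  bucketDuration: 900000,
  sourceId: 'default',
  timeRange: { startTime: 1594252800000, endTime: 1594339200000 },
  datasets: ['system.syslog'],
};
// ...and without it, so existing callers keep working unchanged.
const unfiltered: LogEntryRateRequestData = {
  bucketDuration: 900000,
  sourceId: 'default',
  timeRange: { startTime: 1594252800000, endTime: 1594339200000 },
};
```

Using `rt.partial` rather than making `datasets` required is what preserves backwards compatibility for callers that never filter.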
@@ -8,7 +8,7 @@ import { EuiComboBox, EuiComboBoxOptionOption } from '@elastic/eui';
import { i18n } from '@kbn/i18n';
import React, { useCallback, useMemo } from 'react';

import { getFriendlyNameForPartitionId } from '../../../../../../common/log_analysis';
import { getFriendlyNameForPartitionId } from '../../../../common/log_analysis';

type DatasetOptionProps = EuiComboBoxOptionOption<string>;

@@ -51,7 +51,7 @@ export const DatasetsSelector: React.FunctionComponent<{
};

const datasetFilterPlaceholder = i18n.translate(
'xpack.infra.logs.logEntryCategories.datasetFilterPlaceholder',
'xpack.infra.logs.analysis.datasetFilterPlaceholder',
{
defaultMessage: 'Filter by datasets',
}
@@ -14,7 +14,7 @@ import { BetaBadge } from '../../../../../components/beta_badge';
import { LoadingOverlayWrapper } from '../../../../../components/loading_overlay_wrapper';
import { RecreateJobButton } from '../../../../../components/logging/log_analysis_job_status';
import { AnalyzeInMlButton } from '../../../../../components/logging/log_analysis_results';
import { DatasetsSelector } from './datasets_selector';
import { DatasetsSelector } from '../../../../../components/logging/log_analysis_results/datasets_selector';
import { TopCategoriesTable } from './top_categories_table';

export const TopCategoriesSection: React.FunctionComponent<{
@@ -21,6 +21,7 @@ import {
StringTimeRange,
useLogAnalysisResultsUrlState,
} from './use_log_entry_rate_results_url_state';
import { DatasetsSelector } from '../../../components/logging/log_analysis_results/datasets_selector';

const JOB_STATUS_POLLING_INTERVAL = 30000;

@@ -76,15 +77,17 @@ export const LogEntryRateResultsContent: React.FunctionComponent<LogEntryRateRes
[queryTimeRange.value.endTime, queryTimeRange.value.startTime]
);

const [selectedDatasets, setSelectedDatasets] = useState<string[]>([]);

const { getLogEntryRate, isLoading, logEntryRate } = useLogEntryRateResults({
sourceId,
startTime: queryTimeRange.value.startTime,
endTime: queryTimeRange.value.endTime,
bucketDuration,
filteredDatasets: selectedDatasets,
});

const {
getLogEntryAnomalies,
isLoadingLogEntryAnomalies,
logEntryAnomalies,
page,
@@ -94,13 +97,16 @@
changePaginationOptions,
sortOptions,
paginationOptions,
datasets,
isLoadingDatasets,
} = useLogEntryAnomaliesResults({
sourceId,
startTime: queryTimeRange.value.startTime,
endTime: queryTimeRange.value.endTime,
lastChangedTime: queryTimeRange.lastChangedTime,
defaultSortOptions: SORT_DEFAULTS,
defaultPaginationOptions: PAGINATION_DEFAULTS,
filteredDatasets: selectedDatasets,
});

const handleQueryTimeRangeChange = useCallback(
@@ -173,7 +179,7 @@ export const LogEntryRateResultsContent: React.FunctionComponent<LogEntryRateRes

useEffect(() => {
getLogEntryRate();
}, [getLogEntryRate, getLogEntryAnomalies, queryTimeRange.lastChangedTime]);
}, [getLogEntryRate, selectedDatasets, queryTimeRange.lastChangedTime]);

useEffect(() => {
fetchModuleDefinition();
@@ -197,7 +203,15 @@ export const LogEntryRateResultsContent: React.FunctionComponent<LogEntryRateRes
<ResultsContentPage>
<EuiFlexGroup direction="column">
<EuiFlexItem grow={false}>
<EuiFlexGroup justifyContent="flexEnd">
<EuiFlexGroup justifyContent="spaceBetween">
<EuiFlexItem>
<DatasetsSelector
availableDatasets={datasets}
isLoading={isLoadingDatasets}
selectedDatasets={selectedDatasets}
onChangeDatasetSelection={setSelectedDatasets}
/>
</EuiFlexItem>
<EuiFlexItem grow={false}>
<EuiSuperDatePicker
start={selectedTimeRange.startTime}
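The `useEffect` change in this file replaces `getLogEntryAnomalies` with `selectedDatasets` in the dependency array, so the rate data refetches when the filter changes. As a sketch (not the PR's code), React decides whether to re-run an effect by comparing each dependency with `Object.is`; selecting a dataset produces a new array identity, which is what triggers the refetch:

```typescript
// Sketch of the dependency comparison React performs before re-running an
// effect: any entry failing Object.is against its previous value triggers it.
function depsChanged(prev: readonly unknown[], next: readonly unknown[]): boolean {
  return prev.length !== next.length || prev.some((dep, i) => !Object.is(dep, next[i]));
}

const getLogEntryRate = () => {}; // stable callback identity across renders
const lastChangedTime = 1594600000000;

// Render 1 -> render 2: the user adds a dataset, producing a new array identity.
const deps1 = [getLogEntryRate, ['system.syslog'], lastChangedTime];
const deps2 = [getLogEntryRate, ['system.syslog', 'nginx.access'], lastChangedTime];
```

This is also why `setSelectedDatasets` from `useState` can be passed straight to `DatasetsSelector` as `onChangeDatasetSelection`: each selection change yields a fresh `selectedDatasets` array, and both hooks that receive it as `filteredDatasets` pick up the change.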
@@ -18,7 +18,8 @@ export const callGetLogEntryAnomaliesAPI = async (
startTime: number,
endTime: number,
sort: Sort,
pagination: Pagination
pagination: Pagination,
datasets?: string[]
) => {
const response = await npStart.http.fetch(LOG_ANALYSIS_GET_LOG_ENTRY_ANOMALIES_PATH, {
method: 'POST',
@@ -32,6 +33,7 @@ },
},
sort,
pagination,
datasets,
},
})
),
@@ -0,0 +1,43 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License;
* you may not use this file except in compliance with the Elastic License.
*/

import { fold } from 'fp-ts/lib/Either';
import { pipe } from 'fp-ts/lib/pipeable';
import { identity } from 'fp-ts/lib/function';
import { npStart } from '../../../../legacy_singletons';

import {
getLogEntryAnomaliesDatasetsRequestPayloadRT,
getLogEntryAnomaliesDatasetsSuccessReponsePayloadRT,
LOG_ANALYSIS_GET_LOG_ENTRY_ANOMALIES_DATASETS_PATH,
} from '../../../../../common/http_api/log_analysis';
import { createPlainError, throwErrors } from '../../../../../common/runtime_types';

export const callGetLogEntryAnomaliesDatasetsAPI = async (
sourceId: string,
startTime: number,
endTime: number
) => {
const response = await npStart.http.fetch(LOG_ANALYSIS_GET_LOG_ENTRY_ANOMALIES_DATASETS_PATH, {
method: 'POST',
body: JSON.stringify(
getLogEntryAnomaliesDatasetsRequestPayloadRT.encode({
data: {
sourceId,
timeRange: {
startTime,
endTime,
},
},
})
),
});

return pipe(
getLogEntryAnomaliesDatasetsSuccessReponsePayloadRT.decode(response),
fold(throwErrors(createPlainError), identity)
);
};
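The `pipe`/`fold` tail of this caller decodes the response and throws on a failed validation. A minimal sketch of that Either-folding pattern without fp-ts (hypothetical decoder standing in for the io-ts runtime type) looks like:

```typescript
// Minimal sketch of the fp-ts pattern used above: decode returns an Either,
// and fold either throws (Left branch) or passes the value through (Right,
// the identity function).
type Either<L, R> = { _tag: 'Left'; left: L } | { _tag: 'Right'; right: R };

// Hypothetical decoder standing in for the io-ts response runtime type.
function decodeDatasets(value: unknown): Either<string, { datasets: string[] }> {
  const data = (value as { data?: { datasets?: unknown } } | null)?.data;
  if (
    data &&
    Array.isArray(data.datasets) &&
    data.datasets.every((d) => typeof d === 'string')
  ) {
    return { _tag: 'Right', right: { datasets: data.datasets as string[] } };
  }
  return { _tag: 'Left', left: 'invalid anomalies datasets response' };
}

function fold<L, R, T>(onLeft: (l: L) => T, onRight: (r: R) => T) {
  return (e: Either<L, R>): T => (e._tag === 'Left' ? onLeft(e.left) : onRight(e.right));
}

// Equivalent in spirit to:
//   pipe(successRT.decode(response), fold(throwErrors(createPlainError), identity))
const decoded = fold<string, { datasets: string[] }, { datasets: string[] }>(
  (message) => {
    throw new Error(message);
  },
  (value) => value
)(decodeDatasets({ data: { datasets: ['nginx.access'] } }));
```

Folding with a throwing left branch keeps the happy path typed: callers of the API helper receive the decoded payload directly, while malformed responses surface as exceptions at the fetch boundary.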
@@ -19,7 +19,8 @@ export const callGetLogEntryRateAPI = async (
sourceId: string,
startTime: number,
endTime: number,
bucketDuration: number
bucketDuration: number,
datasets?: string[]
) => {
const response = await npStart.http.fetch(LOG_ANALYSIS_GET_LOG_ENTRY_RATE_PATH, {
method: 'POST',
@@ -32,6 +33,7 @@ endTime,
endTime,
},
bucketDuration,
datasets,
},
})
),