
[ML] prefer secondary authorization header for data[feed|frame] authz #54121

Merged
@@ -36,6 +36,12 @@ to create or update it. If the two sets of roles differ then the preview may
not accurately reflect what the {dfeed} will return when started. To avoid
such problems, the same user that creates/updates the {dfeed} should preview
it to ensure it is returning the expected data.
+
--
NOTE: It is possible that secondary authorization headers are supplied in the
request. If this is the case, the secondary authorization headers are used
instead of the primary headers.
--
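
As an aside for readers of this diff (not part of the documentation change itself), here is a minimal sketch of what using the secondary header looks like against the preview endpoint with the Java low-level REST client. The endpoint path and header handling mirror the integration test added later in this PR; the datafeed id and the two credential strings are placeholders.

import java.io.IOException;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;

public final class PreviewWithSecondaryAuth {
    // Preview a datafeed, supplying both a primary and a secondary credential.
    // When es-secondary-authorization is present, the preview is authorized with
    // the secondary user's roles rather than the primary user's.
    public static String preview(RestClient client, String datafeedId,
                                 String primaryAuth, String secondaryAuth) throws IOException {
        Request request = new Request("GET", "/_ml/datafeeds/" + datafeedId + "/_preview");
        RequestOptions.Builder options = request.getOptions().toBuilder();
        options.addHeader("Authorization", primaryAuth);                 // primary credential
        options.addHeader("es-secondary-authorization", secondaryAuth);  // preferred for authorization
        request.setOptions(options);
        Response response = client.performRequest(request);
        return EntityUtils.toString(response.getEntity());
    }
}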

[[ml-preview-datafeed-path-parms]]
==== {api-path-parms-title}
@@ -42,6 +42,9 @@ each interval. See {ml-docs}/ml-delayed-data-detection.html[Handling delayed dat
* When {es} {security-features} are enabled, your {dfeed} remembers which roles
the user who created it had at the time of creation and runs the query using
those same roles.
* It is possible that secondary authorization headers are supplied in the
request. If this is the case, the secondary authorization headers are used
instead of the primary headers.
====

[[ml-put-datafeed-path-parms]]
@@ -35,6 +35,12 @@ IMPORTANT: When {es} {security-features} are enabled, your {dfeed} remembers
which roles the user who updated it had at the time of update and runs the query
using those same roles.

+
--
NOTE: It is possible that secondary authorization headers are supplied in the
request. If this is the case, the secondary authorization headers are used
instead of the primary headers.
--

Contributor:
Since "secondary authorization headers" are an ES thing, not a global standard, I think we should have a paragraph somewhere in the security docs that says the header name is es-secondary-authorization and the format is the same as for basic auth, and then "secondary authorization headers" in this sentence and the equivalent one in the other 3 files should be a link to that.

Contributor:
Maybe the new paragraph should go at the bottom of https://www.elastic.co/guide/en/elasticsearch/reference/current/http-clients.html?
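
To make the header format described in the comment concrete, here is a minimal sketch of building the credential value, assuming (as the comment says) that es-secondary-authorization uses the same Basic scheme as the standard Authorization header; the user name and password are placeholders.

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public final class SecondaryAuthHeader {
    // "Basic " + base64("user:password"), the same encoding as HTTP basic auth.
    public static String basicValue(String user, String password) {
        String token = Base64.getEncoder()
            .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));
        return "Basic " + token;
    }
}

For example, options.addHeader("es-secondary-authorization", SecondaryAuthHeader.basicValue("some_user", "some_password")) attaches the secondary credential to a request built the same way as in the tests below.
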
[[ml-update-datafeed-path-parms]]
==== {api-path-parms-title}

6 changes: 6 additions & 0 deletions docs/reference/ml/df-analytics/apis/put-dfanalytics.asciidoc
@@ -33,6 +33,12 @@ built-in roles and privileges:

For more information, see <<security-privileges>> and <<built-in-roles>>.

+
--
NOTE: It is possible that secondary authorization headers are supplied in the
request. If this is the case, the secondary authorization headers are used
instead of the primary headers.
--

[[ml-put-dfanalytics-desc]]
==== {api-description-title}
@@ -544,6 +544,46 @@ public void testInsufficientSearchPrivilegesOnPreview() throws Exception {
containsString("[indices:data/read/field_caps] is unauthorized for user [ml_admin]"));
}

public void testSecondaryAuthSearchPrivilegesLookBack() throws Exception {
setupDataAccessRole("airline-data");
String jobId = "secondary-privs-put-job";
createJob(jobId, "airline.keyword");
String datafeedId = "datafeed-" + jobId;
// Primary auth header does not have access, but secondary auth does
new DatafeedBuilder(datafeedId, jobId, "airline-data")
.setAuthHeader(BASIC_AUTH_VALUE_ML_ADMIN)
.setSecondaryAuthHeader(BASIC_AUTH_VALUE_ML_ADMIN_WITH_SOME_DATA_ACCESS)
.build();
openJob(client(), jobId);

startDatafeedAndWaitUntilStopped(datafeedId);
waitUntilJobIsClosed(jobId);

Response jobStatsResponse = client().performRequest(new Request("GET",
MachineLearning.BASE_PATH + "anomaly_detectors/" + jobId + "/_stats"));
String jobStatsResponseAsString = EntityUtils.toString(jobStatsResponse.getEntity());
assertThat(jobStatsResponseAsString, containsString("\"input_record_count\":2"));
assertThat(jobStatsResponseAsString, containsString("\"processed_record_count\":2"));
assertThat(jobStatsResponseAsString, containsString("\"missing_field_count\":0"));
}

public void testSecondaryAuthSearchPrivilegesOnPreview() throws Exception {
setupDataAccessRole("airline-data");
String jobId = "secondary-privs-preview-job";
createJob(jobId, "airline.keyword");

String datafeedId = "datafeed-" + jobId;
new DatafeedBuilder(datafeedId, jobId, "airline-data").build();

Request getFeed = new Request("GET", MachineLearning.BASE_PATH + "datafeeds/" + datafeedId + "/_preview");
RequestOptions.Builder options = getFeed.getOptions().toBuilder();
options.addHeader("Authorization", BASIC_AUTH_VALUE_ML_ADMIN);
options.addHeader("es-secondary-authorization", BASIC_AUTH_VALUE_ML_ADMIN_WITH_SOME_DATA_ACCESS);
getFeed.setOptions(options);
// Should not fail as secondary auth has permissions.
client().performRequest(getFeed);
}

public void testLookbackOnlyGivenAggregationsWithHistogram() throws Exception {
String jobId = "aggs-histogram-job";
Request createJobRequest = new Request("PUT", MachineLearning.BASE_PATH + "anomaly_detectors/" + jobId);
@@ -1181,6 +1221,7 @@ private static class DatafeedBuilder {
String scriptedFields;
String aggregations;
String authHeader = BASIC_AUTH_VALUE_SUPER_USER;
String secondaryAuthHeader = null;
String chunkingTimespan;
String indicesOptions;

@@ -1210,6 +1251,11 @@ DatafeedBuilder setAuthHeader(String authHeader) {
return this;
}

DatafeedBuilder setSecondaryAuthHeader(String authHeader) {
this.secondaryAuthHeader = authHeader;
return this;
}

DatafeedBuilder setChunkingTimespan(String timespan) {
chunkingTimespan = timespan;
return this;
@@ -1233,6 +1279,9 @@ Response build() throws IOException {
+ "}");
RequestOptions.Builder options = request.getOptions().toBuilder();
options.addHeader("Authorization", authHeader);
if (this.secondaryAuthHeader != null) {
options.addHeader("es-secondary-authorization", secondaryAuthHeader);
}
request.setOptions(options);
return client().performRequest(request);
}
@@ -11,15 +11,18 @@
import org.elasticsearch.client.Client;
import org.elasticsearch.common.bytes.BytesArray;
import org.elasticsearch.common.inject.Inject;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.xcontent.NamedXContentRegistry;
import org.elasticsearch.tasks.Task;
import org.elasticsearch.threadpool.ThreadPool;
import org.elasticsearch.transport.TransportService;
import org.elasticsearch.xpack.core.ClientHelper;
import org.elasticsearch.xpack.core.XPackSettings;
import org.elasticsearch.xpack.core.ml.action.PreviewDatafeedAction;
import org.elasticsearch.xpack.core.ml.datafeed.ChunkingConfig;
import org.elasticsearch.xpack.core.ml.datafeed.DatafeedConfig;
import org.elasticsearch.xpack.core.ml.datafeed.extractor.DataExtractor;
import org.elasticsearch.xpack.core.security.SecurityContext;
import org.elasticsearch.xpack.ml.datafeed.DatafeedTimingStatsReporter;
import org.elasticsearch.xpack.ml.datafeed.extractor.DataExtractorFactory;
import org.elasticsearch.xpack.ml.datafeed.persistence.DatafeedConfigProvider;
@@ -34,6 +37,8 @@
import java.util.Optional;
import java.util.stream.Collectors;

import static org.elasticsearch.xpack.ml.utils.SecondaryAuthorizationUtils.useSecondaryAuthIfAvailable;

public class TransportPreviewDatafeedAction extends HandledTransportAction<PreviewDatafeedAction.Request, PreviewDatafeedAction.Response> {

private final ThreadPool threadPool;
Expand All @@ -42,9 +47,10 @@ public class TransportPreviewDatafeedAction extends HandledTransportAction<Previ
private final DatafeedConfigProvider datafeedConfigProvider;
private final JobResultsProvider jobResultsProvider;
private final NamedXContentRegistry xContentRegistry;
private final SecurityContext securityContext;

@Inject
public TransportPreviewDatafeedAction(ThreadPool threadPool, TransportService transportService,
public TransportPreviewDatafeedAction(Settings settings, ThreadPool threadPool, TransportService transportService,
ActionFilters actionFilters, Client client, JobConfigProvider jobConfigProvider,
DatafeedConfigProvider datafeedConfigProvider, JobResultsProvider jobResultsProvider,
NamedXContentRegistry xContentRegistry) {
@@ -55,6 +61,8 @@ public TransportPreviewDatafeedAction(ThreadPool threadPool, TransportService tr
this.datafeedConfigProvider = datafeedConfigProvider;
this.jobResultsProvider = jobResultsProvider;
this.xContentRegistry = xContentRegistry;
this.securityContext = XPackSettings.SECURITY_ENABLED.get(settings) ?
new SecurityContext(settings, threadPool.getThreadContext()) : null;
}

@Override
@@ -65,37 +73,39 @@ protected void doExecute(Task task, PreviewDatafeedAction.Request request, Actio
jobConfigProvider.getJob(datafeedConfig.getJobId(), ActionListener.wrap(
jobBuilder -> {
DatafeedConfig.Builder previewDatafeed = buildPreviewDatafeed(datafeedConfig);
Map<String, String> headers = threadPool.getThreadContext().getHeaders().entrySet().stream()
.filter(e -> ClientHelper.SECURITY_HEADER_FILTERS.contains(e.getKey()))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
previewDatafeed.setHeaders(headers);
jobResultsProvider.datafeedTimingStats(
jobBuilder.getId(),
timingStats -> {
// NB: this is using the client from the transport layer, NOT the internal client.
// This is important because it means the datafeed search will fail if the user
// requesting the preview doesn't have permission to search the relevant indices.
DataExtractorFactory.create(
client,
previewDatafeed.build(),
jobBuilder.build(),
xContentRegistry,
// Fake DatafeedTimingStatsReporter that does not have access to results index
new DatafeedTimingStatsReporter(timingStats, (ts, refreshPolicy) -> {}),
new ActionListener<>() {
@Override
public void onResponse(DataExtractorFactory dataExtractorFactory) {
DataExtractor dataExtractor = dataExtractorFactory.newExtractor(0, Long.MAX_VALUE);
threadPool.generic().execute(() -> previewDatafeed(dataExtractor, listener));
}
useSecondaryAuthIfAvailable(securityContext, () -> {
Map<String, String> headers = threadPool.getThreadContext().getHeaders().entrySet().stream()
.filter(e -> ClientHelper.SECURITY_HEADER_FILTERS.contains(e.getKey()))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
previewDatafeed.setHeaders(headers);
jobResultsProvider.datafeedTimingStats(
jobBuilder.getId(),
timingStats -> {
// NB: this is using the client from the transport layer, NOT the internal client.
// This is important because it means the datafeed search will fail if the user
// requesting the preview doesn't have permission to search the relevant indices.
DataExtractorFactory.create(
client,
previewDatafeed.build(),
jobBuilder.build(),
xContentRegistry,
// Fake DatafeedTimingStatsReporter that does not have access to results index
new DatafeedTimingStatsReporter(timingStats, (ts, refreshPolicy) -> {}),
new ActionListener<>() {
@Override
public void onResponse(DataExtractorFactory dataExtractorFactory) {
DataExtractor dataExtractor = dataExtractorFactory.newExtractor(0, Long.MAX_VALUE);
threadPool.generic().execute(() -> previewDatafeed(dataExtractor, listener));
}

@Override
public void onFailure(Exception e) {
listener.onFailure(e);
}
});
},
listener::onFailure);
@Override
public void onFailure(Exception e) {
listener.onFailure(e);
}
});
},
listener::onFailure);
});
},
listener::onFailure));
},
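
The wrapping call above, useSecondaryAuthIfAvailable, is imported from SecondaryAuthorizationUtils but its body is not part of this diff. Below is a hypothetical sketch of the shape such a helper could take, assuming SecurityContext exposes the secondary authentication (if any) and that it can re-wrap a Runnable to run as that user; the accessor and wrapping method names are guesses, not a confirmed x-pack API.

import org.elasticsearch.xpack.core.security.SecurityContext;
// Assumed class: the secondary-authentication support that backs es-secondary-authorization.
import org.elasticsearch.xpack.core.security.authc.support.SecondaryAuthentication;

public final class SecondaryAuthorizationUtilsSketch {
    // Hypothetical sketch only: run the block as the secondary user when one is present,
    // otherwise run it unchanged.
    public static void useSecondaryAuthIfAvailable(SecurityContext securityContext, Runnable body) {
        if (securityContext == null) {
            // Security is disabled (the action passes null in that case); nothing to switch.
            body.run();
            return;
        }
        SecondaryAuthentication secondaryAuth = securityContext.getSecondaryAuthentication(); // assumed accessor
        if (secondaryAuth != null) {
            // Assumed wrapper: execute with the secondary user's credentials in the thread
            // context, so the headers copied by the caller belong to that user.
            body = secondaryAuth.wrap(body);
        }
        body.run();
    }
}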
@@ -57,6 +57,8 @@
import java.util.Map;
import java.util.Objects;

import static org.elasticsearch.xpack.ml.utils.SecondaryAuthorizationUtils.useSecondaryAuthIfAvailable;

public class TransportPutDataFrameAnalyticsAction
extends TransportMasterNodeAction<PutDataFrameAnalyticsAction.Request, PutDataFrameAnalyticsAction.Response> {

@@ -140,27 +142,29 @@ private void putValidatedConfig(DataFrameAnalyticsConfig config, ActionListener<
.build();

if (licenseState.isAuthAllowed()) {
final String username = securityContext.getUser().principal();
RoleDescriptor.IndicesPrivileges sourceIndexPrivileges = RoleDescriptor.IndicesPrivileges.builder()
.indices(preparedForPutConfig.getSource().getIndex())
.privileges("read")
.build();
RoleDescriptor.IndicesPrivileges destIndexPrivileges = RoleDescriptor.IndicesPrivileges.builder()
.indices(preparedForPutConfig.getDest().getIndex())
.privileges("read", "index", "create_index")
.build();

HasPrivilegesRequest privRequest = new HasPrivilegesRequest();
privRequest.applicationPrivileges(new RoleDescriptor.ApplicationResourcePrivileges[0]);
privRequest.username(username);
privRequest.clusterPrivileges(Strings.EMPTY_ARRAY);
privRequest.indexPrivileges(sourceIndexPrivileges, destIndexPrivileges);

ActionListener<HasPrivilegesResponse> privResponseListener = ActionListener.wrap(
r -> handlePrivsResponse(username, preparedForPutConfig, r, listener),
listener::onFailure);

client.execute(HasPrivilegesAction.INSTANCE, privRequest, privResponseListener);
useSecondaryAuthIfAvailable(securityContext, () -> {
final String username = securityContext.getUser().principal();
RoleDescriptor.IndicesPrivileges sourceIndexPrivileges = RoleDescriptor.IndicesPrivileges.builder()
.indices(preparedForPutConfig.getSource().getIndex())
.privileges("read")
.build();
RoleDescriptor.IndicesPrivileges destIndexPrivileges = RoleDescriptor.IndicesPrivileges.builder()
.indices(preparedForPutConfig.getDest().getIndex())
.privileges("read", "index", "create_index")
.build();

HasPrivilegesRequest privRequest = new HasPrivilegesRequest();
privRequest.applicationPrivileges(new RoleDescriptor.ApplicationResourcePrivileges[0]);
privRequest.username(username);
privRequest.clusterPrivileges(Strings.EMPTY_ARRAY);
privRequest.indexPrivileges(sourceIndexPrivileges, destIndexPrivileges);

ActionListener<HasPrivilegesResponse> privResponseListener = ActionListener.wrap(
r -> handlePrivsResponse(username, preparedForPutConfig, r, listener),
listener::onFailure);

client.execute(HasPrivilegesAction.INSTANCE, privRequest, privResponseListener);
});
} else {
updateDocMappingAndPutConfig(
preparedForPutConfig,