Merge branch 'main' into refactor-remote-routing
Signed-off-by: Arpit-Bandejiya <abandeji@amazon.com>
Arpit-Bandejiya committed Jul 18, 2024
2 parents 24131d8 + b3b743d commit 8f3db79
Showing 141 changed files with 6,102 additions and 5,288 deletions.
2 changes: 1 addition & 1 deletion .github/CODEOWNERS
@@ -24,4 +24,4 @@

/.github/ @peternied

-/MAINTAINERS.md @anasalkouz @andrross @ashking94 @Bukhtawar @CEHENKLE @dblock @dbwiddis @gbbafna @jed326 @kotwanikunal @mch2 @msfroh @nknize @owaiskazi19 @peternied @reta @Rishikesh1159 @sachinpkale @saratvemulapalli @shwetathareja @sohami @VachaShah
+/MAINTAINERS.md @anasalkouz @andrross @ashking94 @Bukhtawar @CEHENKLE @dblock @dbwiddis @gaobinlong @gbbafna @jed326 @kotwanikunal @mch2 @msfroh @nknize @owaiskazi19 @peternied @reta @Rishikesh1159 @sachinpkale @saratvemulapalli @shwetathareja @sohami @VachaShah
9 changes: 9 additions & 0 deletions CHANGELOG.md
@@ -12,9 +12,13 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- [Workload Management] Add QueryGroup schema ([#13669](https://github.com/opensearch-project/OpenSearch/pull/13669))
- Add batching supported processor base type AbstractBatchingProcessor ([#14554](https://github.com/opensearch-project/OpenSearch/pull/14554))
- Fix race condition while parsing derived fields from search definition ([#14445](https://github.com/opensearch-project/OpenSearch/pull/14445))
- Add `strict_allow_templates` dynamic mapping option ([#14555](https://github.com/opensearch-project/OpenSearch/pull/14555))
- Add allowlist setting for ingest-common and search-pipeline-common processors ([#14439](https://github.com/opensearch-project/OpenSearch/issues/14439))
- [Workload Management] Add queryGroupId header propagator across requests and nodes ([#14614](https://github.com/opensearch-project/OpenSearch/pull/14614))
- Create SystemIndexRegistry with helper method matchesSystemIndex ([#14415](https://github.com/opensearch-project/OpenSearch/pull/14415))
- Print reason why parent task was cancelled ([#14604](https://github.com/opensearch-project/OpenSearch/issues/14604))
- Add matchesPluginSystemIndexPattern to SystemIndexRegistry ([#14750](https://github.com/opensearch-project/OpenSearch/pull/14750))
- Add Plugin interface for loading application-based configuration templates ([#14659](https://github.com/opensearch-project/OpenSearch/issues/14659))
- Refactor remote-routing-table service in line with remote state interfaces ([#14668](https://github.com/opensearch-project/OpenSearch/pull/14668))

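As a quick illustration of the new `strict_allow_templates` option from the list above, here is a hedged sketch that creates an index whose mapping rejects dynamic fields unless they match a declared dynamic template. The index name, template name, and field rule are hypothetical; only the `dynamic` value comes from the entry above, and the call uses just the JDK HTTP client against a local cluster:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class StrictAllowTemplatesExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical index and template names; "strict_allow_templates" is the new option.
        String body = """
            {
              "mappings": {
                "dynamic": "strict_allow_templates",
                "dynamic_templates": [
                  { "strings_as_keyword": {
                      "match_mapping_type": "string",
                      "mapping": { "type": "keyword" }
                  } }
                ]
              }
            }""";
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:9200/strict-allow-templates-demo"))
            .header("Content-Type", "application/json")
            .PUT(HttpRequest.BodyPublishers.ofString(body))
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        // Dynamic string fields should now be accepted (they match the template),
        // while unmatched dynamic fields (for example, an object) should be rejected.
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```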
### Dependencies
@@ -36,6 +40,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Bump `com.github.spullara.mustache.java:compiler` from 0.9.13 to 0.9.14 ([#14672](https://github.com/opensearch-project/OpenSearch/pull/14672))
- Bump `net.minidev:accessors-smart` from 2.5.0 to 2.5.1 ([#14673](https://github.com/opensearch-project/OpenSearch/pull/14673))
- Bump `jackson` from 2.17.1 to 2.17.2 ([#14687](https://github.com/opensearch-project/OpenSearch/pull/14687))
- Bump `net.minidev:json-smart` from 2.5.0 to 2.5.1 ([#14748](https://github.com/opensearch-project/OpenSearch/pull/14748))

### Changed
- [Tiered Caching] Move query recomputation logic outside write lock ([#14187](https://github.com/opensearch-project/OpenSearch/pull/14187))
@@ -49,6 +54,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
### Deprecated

### Removed
- Remove query categorization changes ([#14759](https://github.com/opensearch-project/OpenSearch/pull/14759))

### Fixed
- Fix bug in SBP cancellation logic ([#13259](https://github.com/opensearch-project/OpenSearch/pull/13474))
@@ -64,8 +70,11 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Fix FuzzyQuery in keyword field will use IndexOrDocValuesQuery when both of index and doc_value are true ([#14378](https://github.com/opensearch-project/OpenSearch/pull/14378))
- Fix file cache initialization ([#14004](https://github.com/opensearch-project/OpenSearch/pull/14004))
- Handle NPE in GetResult if "found" field is missing ([#14552](https://github.com/opensearch-project/OpenSearch/pull/14552))
- Fix create or update alias API not throwing an exception for unsupported parameters ([#14719](https://github.com/opensearch-project/OpenSearch/pull/14719))
- Refactor FilterPath.parse to use an iterative approach ([#14200](https://github.com/opensearch-project/OpenSearch/pull/14200))
- Refactor Grok.validatePatternBank to use an iterative approach ([#14206](https://github.com/opensearch-project/OpenSearch/pull/14206))
- Update help output for _cat ([#14722](https://github.com/opensearch-project/OpenSearch/pull/14722))
- Fix bulk upsert ignoring the default_pipeline and final_pipeline when the auto-created index matches an index template ([#12891](https://github.com/opensearch-project/OpenSearch/pull/12891))

### Security

1 change: 1 addition & 0 deletions MAINTAINERS.md
@@ -14,6 +14,7 @@ This document contains a list of maintainers in this repo. See [opensearch-proje
| Charlotte Henkle | [CEHENKLE](https://github.com/CEHENKLE) | Amazon |
| Dan Widdis | [dbwiddis](https://github.com/dbwiddis) | Amazon |
| Daniel "dB." Doubrovkine | [dblock](https://github.com/dblock) | Amazon |
+| Gao Binlong | [gaobinlong](https://github.com/gaobinlong) | Amazon |
| Gaurav Bafna | [gbbafna](https://github.com/gbbafna) | Amazon |
| Jay Deng | [jed326](https://github.com/jed326) | Amazon |
| Kunal Kotwani | [kotwanikunal](https://github.com/kotwanikunal) | Amazon |
@@ -81,50 +81,52 @@ public void apply(Project project) {
        // tests
        Configuration testConfig = project.getConfigurations().create("restTestConfig");
        project.getConfigurations().create("restTests");
-
-        if (BuildParams.isInternal()) {
-            // core
-            Dependency restTestdependency = project.getDependencies().project(new HashMap<String, String>() {
-                {
-                    put("path", ":rest-api-spec");
-                    put("configuration", "restTests");
-                }
-            });
-            testConfig.withDependencies(s -> s.add(restTestdependency));
-        } else {
-            Dependency dependency = project.getDependencies().create("org.opensearch:rest-api-spec:" + VersionProperties.getOpenSearch());
-            testConfig.withDependencies(s -> s.add(dependency));
-        }
-
        Provider<CopyRestTestsTask> copyRestYamlTestTask = project.getTasks()
            .register("copyYamlTestsTask", CopyRestTestsTask.class, task -> {
                task.includeCore.set(extension.restTests.getIncludeCore());
                task.coreConfig = testConfig;
                task.sourceSetName = SourceSet.TEST_SOURCE_SET_NAME;
+                if (BuildParams.isInternal()) {
+                    // core
+                    Dependency restTestdependency = project.getDependencies().project(new HashMap<String, String>() {
+                        {
+                            put("path", ":rest-api-spec");
+                            put("configuration", "restTests");
+                        }
+                    });
+                    project.getDependencies().add(task.coreConfig.getName(), restTestdependency);
+                } else {
+                    Dependency dependency = project.getDependencies()
+                        .create("org.opensearch:rest-api-spec:" + VersionProperties.getOpenSearch());
+                    project.getDependencies().add(task.coreConfig.getName(), dependency);
+                }
                task.dependsOn(task.coreConfig);
            });

        // api
        Configuration specConfig = project.getConfigurations().create("restSpec"); // name chosen for passivity
        project.getConfigurations().create("restSpecs");
-
-        if (BuildParams.isInternal()) {
-            Dependency restSpecDependency = project.getDependencies().project(new HashMap<String, String>() {
-                {
-                    put("path", ":rest-api-spec");
-                    put("configuration", "restSpecs");
-                }
-            });
-            specConfig.withDependencies(s -> s.add(restSpecDependency));
-        } else {
-            Dependency dependency = project.getDependencies().create("org.opensearch:rest-api-spec:" + VersionProperties.getOpenSearch());
-            specConfig.withDependencies(s -> s.add(dependency));
-        }
-
        Provider<CopyRestApiTask> copyRestYamlSpecTask = project.getTasks()
            .register("copyRestApiSpecsTask", CopyRestApiTask.class, task -> {
                task.includeCore.set(extension.restApi.getIncludeCore());
                task.dependsOn(copyRestYamlTestTask);
                task.coreConfig = specConfig;
                task.sourceSetName = SourceSet.TEST_SOURCE_SET_NAME;
+                if (BuildParams.isInternal()) {
+                    Dependency restSpecDependency = project.getDependencies().project(new HashMap<String, String>() {
+                        {
+                            put("path", ":rest-api-spec");
+                            put("configuration", "restSpecs");
+                        }
+                    });
+                    project.getDependencies().add(task.coreConfig.getName(), restSpecDependency);
+                } else {
+                    Dependency dependency = project.getDependencies()
+                        .create("org.opensearch:rest-api-spec:" + VersionProperties.getOpenSearch());
+                    project.getDependencies().add(task.coreConfig.getName(), dependency);
+                }
                task.dependsOn(task.coreConfig);
            });

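The change above moves the dependency declaration out of plugin-apply time and into each copy task's configuration action, so the dependency is only added when the task is actually configured. A minimal standalone sketch of that pattern, with hypothetical plugin, task, configuration, and coordinate names (not the OpenSearch build code itself):

```java
import org.gradle.api.Plugin;
import org.gradle.api.Project;
import org.gradle.api.artifacts.Configuration;
import org.gradle.api.tasks.Copy;

public class LazySpecPlugin implements Plugin<Project> {
    @Override
    public void apply(Project project) {
        Configuration spec = project.getConfigurations().create("exampleSpec");
        project.getTasks().register("copyExampleSpecs", Copy.class, task -> {
            // Declared inside the task's configuration action: the dependency is only
            // added if this task is realized, which cooperates with Gradle's
            // configuration-avoidance APIs instead of doing work on every build.
            project.getDependencies().add(spec.getName(), "org.opensearch:rest-api-spec:2.15.0");
            task.from(spec);
            task.into(project.getLayout().getBuildDirectory().dir("example-specs"));
        });
    }
}
```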
Binary file modified gradle/wrapper/gradle-wrapper.jar
4 changes: 2 additions & 2 deletions gradle/wrapper/gradle-wrapper.properties
@@ -11,7 +11,7 @@

distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
-distributionUrl=https\://services.gradle.org/distributions/gradle-8.8-all.zip
+distributionUrl=https\://services.gradle.org/distributions/gradle-8.9-all.zip
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
-distributionSha256Sum=f8b4f4772d302c8ff580bc40d0f56e715de69b163546944f787c87abf209c961
+distributionSha256Sum=258e722ec21e955201e31447b0aed14201765a3bfbae296a46cf60b70e66db70
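`distributionSha256Sum` pins the expected checksum of the Gradle distribution, and the wrapper verifies the download against it. If you want to check a downloaded archive yourself, a small JDK-only sketch (the local file path is illustrative):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.HexFormat;

public class VerifyWrapperChecksum {
    public static void main(String[] args) throws Exception {
        // Path to a locally downloaded gradle-8.9-all.zip (illustrative).
        byte[] zip = Files.readAllBytes(Path.of("gradle-8.9-all.zip"));
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(zip);
        String actual = HexFormat.of().formatHex(digest);
        String pinned = "258e722ec21e955201e31447b0aed14201765a3bfbae296a46cf60b70e66db70";
        System.out.println(actual.equals(pinned) ? "checksum OK" : "checksum MISMATCH: " + actual);
    }
}
```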
7 changes: 5 additions & 2 deletions gradlew
@@ -15,6 +15,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
#
+# SPDX-License-Identifier: Apache-2.0
+#

##############################################################################
#
@@ -55,7 +57,7 @@
# Darwin, MinGW, and NonStop.
#
# (3) This script is generated from the Groovy template
-#       https://github.com/gradle/gradle/blob/HEAD/subprojects/plugins/src/main/resources/org/gradle/api/internal/plugins/unixStartScript.txt
+#       https://github.com/gradle/gradle/blob/HEAD/platforms/jvm/plugins-application/src/main/resources/org/gradle/api/internal/plugins/unixStartScript.txt
# within the Gradle project.
#
# You can find Gradle at https://github.com/gradle/gradle/.
@@ -84,7 +86,8 @@ done
# shellcheck disable=SC2034
APP_BASE_NAME=${0##*/}
# Discard cd standard output in case $CDPATH is set (https://github.com/gradle/gradle/issues/25036)
-APP_HOME=$( cd "${APP_HOME:-./}" > /dev/null && pwd -P ) || exit
+APP_HOME=$( cd -P "${APP_HOME:-./}" > /dev/null && printf '%s
+' "$PWD" ) || exit

# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD=maximum
2 changes: 2 additions & 0 deletions gradlew.bat
@@ -13,6 +13,8 @@
@rem See the License for the specific language governing permissions and
@rem limitations under the License.
@rem
+@rem SPDX-License-Identifier: Apache-2.0
+@rem

@if "%DEBUG%"=="" @echo off
@rem ##########################################################################
@@ -41,6 +41,10 @@ teardown:
      ingest.delete_pipeline:
        id: "pipeline2"
        ignore: 404
+  - do:
+      indices.delete_index_template:
+        name: test_index_template_for_bulk
+        ignore: 404

---
"Test bulk request without default pipeline":
@@ -168,6 +172,40 @@ teardown:
        id: test_id3
  - match: { _source: {"f1": "v2", "f2": 47, "field1": "value1"}}

+# related issue: https://github.com/opensearch-project/OpenSearch/issues/12888
+---
+"Test bulk upsert honors default_pipeline and final_pipeline when the auto-created index matches with the index template":
+  - skip:
+      version: " - 2.99.99"
+      reason: "fixed in 3.0.0"
+  - do:
+      indices.put_index_template:
+        name: test_for_bulk_upsert_index_template
+        body:
+          index_patterns: test_bulk_upsert_*
+          template:
+            settings:
+              number_of_shards: 1
+              number_of_replicas: 0
+              default_pipeline: pipeline1
+              final_pipeline: pipeline2
+
+  - do:
+      bulk:
+        refresh: true
+        body:
+          - '{"update": {"_index": "test_bulk_upsert_index", "_id": "test_id3"}}'
+          - '{"upsert": {"f1": "v2", "f2": 47}, "doc": {"x": 1}}'
+
+  - match: { errors: false }
+  - match: { items.0.update.result: created }
+
+  - do:
+      get:
+        index: test_bulk_upsert_index
+        id: test_id3
+  - match: { _source: {"f1": "v2", "f2": 47, "field1": "value1", "field2": "value2"}}

---
"Test bulk API with batch enabled happy case":
- skip:
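Viewed from a client, the fix exercised by this test means the upserted document now passes through the index's default and final pipelines. A hedged sketch of the same bulk upsert using only the JDK HTTP client, assuming a local cluster with the index template and pipelines set up as in the test above:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class BulkUpsertExample {
    public static void main(String[] args) throws Exception {
        // NDJSON bulk body: an action line plus an upsert line, each newline-terminated.
        String body = "{\"update\": {\"_index\": \"test_bulk_upsert_index\", \"_id\": \"test_id3\"}}\n"
            + "{\"upsert\": {\"f1\": \"v2\", \"f2\": 47}, \"doc\": {\"x\": 1}}\n";
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:9200/_bulk?refresh=true"))
            .header("Content-Type", "application/x-ndjson")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        // With the fix, fetching test_id3 afterwards should show field1/field2 added
        // by pipeline1 (default_pipeline) and pipeline2 (final_pipeline).
        System.out.println(response.body());
    }
}
```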
@@ -60,7 +60,6 @@
import java.util.function.ToLongBiFunction;

import org.ehcache.Cache;
-import org.ehcache.CachePersistenceException;
import org.ehcache.PersistentCacheManager;
import org.ehcache.config.builders.CacheConfigurationBuilder;
import org.ehcache.config.builders.CacheEventListenerConfigurationBuilder;
@@ -104,8 +103,6 @@ public class EhcacheDiskCache<K, V> implements ICache<K, V> {
    // Unique id associated with this cache.
    private final static String UNIQUE_ID = UUID.randomUUID().toString();
    private final static String THREAD_POOL_ALIAS_PREFIX = "ehcachePool";
-    private final static int MINIMUM_MAX_SIZE_IN_BYTES = 1024 * 100; // 100KB
-
    // A Cache manager can create many caches.
    private final PersistentCacheManager cacheManager;
Expand All @@ -127,13 +124,18 @@ public class EhcacheDiskCache<K, V> implements ICache<K, V> {
private final Serializer<K, byte[]> keySerializer;
private final Serializer<V, byte[]> valueSerializer;

final static int MINIMUM_MAX_SIZE_IN_BYTES = 1024 * 100; // 100KB
final static String CACHE_DATA_CLEANUP_DURING_INITIALIZATION_EXCEPTION = "Failed to delete ehcache disk cache under "
+ "path: %s during initialization. Please clean this up manually and restart the process";

/**
* Used in computeIfAbsent to synchronize loading of a given key. This is needed as ehcache doesn't provide a
* computeIfAbsent method.
*/
Map<ICacheKey<K>, CompletableFuture<Tuple<ICacheKey<K>, V>>> completableFutureMap = new ConcurrentHashMap<>();

private EhcacheDiskCache(Builder<K, V> builder) {
@SuppressForbidden(reason = "Ehcache uses File.io")
EhcacheDiskCache(Builder<K, V> builder) {
this.keyType = Objects.requireNonNull(builder.keyType, "Key type shouldn't be null");
this.valueType = Objects.requireNonNull(builder.valueType, "Value type shouldn't be null");
this.expireAfterAccess = Objects.requireNonNull(builder.getExpireAfterAcess(), "ExpireAfterAccess value shouldn't " + "be null");
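The `completableFutureMap` comment above describes a future-per-key scheme that stands in for the `computeIfAbsent` Ehcache lacks: the first caller for a key installs a future and runs the loader, while concurrent callers for the same key wait on that future instead of loading twice. A simplified, hypothetical sketch of the pattern (names and error handling are illustrative, not the OpenSearch implementation):

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

class LoadSynchronizer<K, V> {
    private final Map<K, CompletableFuture<V>> inFlight = new ConcurrentHashMap<>();

    V load(K key, Function<K, V> loader) {
        CompletableFuture<V> mine = new CompletableFuture<>();
        // Only one caller per key wins this race and runs the loader.
        CompletableFuture<V> existing = inFlight.putIfAbsent(key, mine);
        if (existing != null) {
            return existing.join(); // someone else is loading; wait for their result
        }
        try {
            V value = loader.apply(key);
            mine.complete(value);
            return value;
        } catch (RuntimeException e) {
            mine.completeExceptionally(e); // waiters see the failure too
            throw e;
        } finally {
            inFlight.remove(key); // allow later loads of this key to run again
        }
    }
}
```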
@@ -151,6 +153,18 @@ private EhcacheDiskCache(Builder<K, V> builder) {
        if (this.storagePath == null || this.storagePath.isBlank()) {
            throw new IllegalArgumentException("Storage path shouldn't be null or empty");
        }
+        // Delete all the previous disk cache related files/data. We don't persist data between process restart for
+        // now which is why need to do this. Clean up in case there was a non graceful restart and we had older disk
+        // cache data still lying around.
+        Path ehcacheDirectory = Paths.get(this.storagePath);
+        if (Files.exists(ehcacheDirectory)) {
+            try {
+                logger.info("Found older disk cache data lying around during initialization under path: {}", this.storagePath);
+                IOUtils.rm(ehcacheDirectory);
+            } catch (IOException e) {
+                throw new OpenSearchException(String.format(CACHE_DATA_CLEANUP_DURING_INITIALIZATION_EXCEPTION, this.storagePath), e);
+            }
+        }
        if (builder.threadPoolAlias == null || builder.threadPoolAlias.isBlank()) {
            this.threadPoolAlias = THREAD_POOL_ALIAS_PREFIX + "DiskWrite#" + UNIQUE_ID;
        } else {
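`IOUtils.rm` in the hunk above is OpenSearch's recursive-delete helper. For readers outside the codebase, a plain-JDK sketch of the equivalent cleanup, deleting a directory tree bottom-up (the path is illustrative):

```java
import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;

public final class RecursiveDelete {
    // Deletes files first, then their parent directories on the way back up.
    static void rm(Path root) throws IOException {
        if (!Files.exists(root)) {
            return;
        }
        Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) throws IOException {
                Files.delete(file);
                return FileVisitResult.CONTINUE;
            }

            @Override
            public FileVisitResult postVisitDirectory(Path dir, IOException exc) throws IOException {
                if (exc != null) {
                    throw exc;
                }
                Files.delete(dir);
                return FileVisitResult.CONTINUE;
            }
        });
    }

    public static void main(String[] args) throws IOException {
        rm(Path.of("/tmp/ehcache-example")); // illustrative path
    }
}
```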
@@ -175,6 +189,11 @@ private EhcacheDiskCache(Builder<K, V> builder) {
        }
    }

+    // Package private for testing
+    PersistentCacheManager getCacheManager() {
+        return this.cacheManager;
+    }
+
    @SuppressWarnings({ "rawtypes" })
    private Cache<ICacheKey, ByteArrayWrapper> buildCache(Duration expireAfterAccess, Builder<K, V> builder) {
        // Creating the cache requires permissions specified in plugin-security.policy
@@ -255,7 +274,7 @@ Map<ICacheKey<K>, CompletableFuture<Tuple<ICacheKey<K>, V>>> getCompletableFutur
    }

    @SuppressForbidden(reason = "Ehcache uses File.io")
-    private PersistentCacheManager buildCacheManager() {
+    PersistentCacheManager buildCacheManager() {
        // In case we use multiple ehCaches, we can define this cache manager at a global level.
        // Creating the cache manager also requires permissions specified in plugin-security.policy
        return AccessController.doPrivileged((PrivilegedAction<PersistentCacheManager>) () -> {
@@ -444,20 +463,21 @@ public void refresh() {
    @Override
    @SuppressForbidden(reason = "Ehcache uses File.io")
    public void close() {
-        cacheManager.removeCache(this.diskCacheAlias);
-        cacheManager.close();
        try {
-            cacheManager.destroyCache(this.diskCacheAlias);
-            // Delete all the disk cache related files/data
-            Path ehcacheDirectory = Paths.get(this.storagePath);
-            if (Files.exists(ehcacheDirectory)) {
+            cacheManager.close();
+        } catch (Exception e) {
+            logger.error(() -> new ParameterizedMessage("Exception occurred while trying to close ehcache manager"), e);
+        }
+        // Delete all the disk cache related files/data in case it is present
+        Path ehcacheDirectory = Paths.get(this.storagePath);
+        if (Files.exists(ehcacheDirectory)) {
+            try {
                IOUtils.rm(ehcacheDirectory);
+            } catch (IOException e) {
+                logger.error(() -> new ParameterizedMessage("Failed to delete ehcache disk cache data under path: {}", this.storagePath));
            }
-        } catch (CachePersistenceException e) {
-            throw new OpenSearchException("Exception occurred while destroying ehcache and associated data", e);
-        } catch (IOException e) {
-            logger.error(() -> new ParameterizedMessage("Failed to delete ehcache disk cache data under path: {}", this.storagePath));
        }
-
    }

    /**