Commit

Merge branch 'main' into limitedScopes
Signed-off-by: Stephen Crawford <65832608+scrawfor99@users.noreply.github.com>
stephen-crawford authored Jul 24, 2023
2 parents 1c8dbe5 + a29521e commit 027b9e7
Showing 186 changed files with 2,447 additions and 2,645 deletions.
2 changes: 1 addition & 1 deletion .github/CODEOWNERS
@@ -1 +1 @@
* @reta @anasalkouz @andrross @reta @Bukhtawar @CEHENKLE @dblock @gbbafna @setiah @kartg @kotwanikunal @mch2 @nknize @owaiskazi19 @Rishikesh1159 @ryanbogan @saratvemulapalli @shwetathareja @dreamer-89 @tlfeng @VachaShah @dbwiddis @sachinpkale
* @reta @anasalkouz @andrross @reta @Bukhtawar @CEHENKLE @dblock @gbbafna @setiah @kartg @kotwanikunal @mch2 @nknize @owaiskazi19 @Rishikesh1159 @ryanbogan @saratvemulapalli @shwetathareja @dreamer-89 @tlfeng @VachaShah @dbwiddis @sachinpkale @sohami
6 changes: 6 additions & 0 deletions CHANGELOG.md
@@ -45,6 +45,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Migrate client transports to Apache HttpClient / Core 5.x ([#4459](https://github.com/opensearch-project/OpenSearch/pull/4459))
- Change http code on create index API with bad input raising NotXContentException from 500 to 400 ([#4773](https://github.com/opensearch-project/OpenSearch/pull/4773))
- Improve summary error message for invalid setting updates ([#4792](https://github.com/opensearch-project/OpenSearch/pull/4792))
- Move the Remote Segment Store repository setting from `index.remote_store.repository` to `index.remote_store.segment.repository` (index-level) and from `cluster.remote_store.repository` to `cluster.remote_store.segment.repository` (cluster-level) ([#8719](https://github.com/opensearch-project/OpenSearch/pull/8719))

### Deprecated

@@ -63,6 +64,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Remove LegacyESVersion.V_7_10_ Constants ([#5018](https://github.com/opensearch-project/OpenSearch/pull/5018))
- Remove Version.V_1_ Constants ([#5021](https://github.com/opensearch-project/OpenSearch/pull/5021))
- Remove custom Map, List and Set collection classes ([#6871](https://github.com/opensearch-project/OpenSearch/pull/6871))
- Remove provision to create Remote Indices without Remote Translog Store ([#8719](https://github.com/opensearch-project/OpenSearch/pull/8719))

### Fixed
- Fix 'org.apache.hc.core5.http.ParseException: Invalid protocol version' under JDK 16+ ([#4827](https://github.com/opensearch-project/OpenSearch/pull/4827))
@@ -79,10 +81,14 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
### Dependencies
- Bump `org.apache.logging.log4j:log4j-core` from 2.17.1 to 2.20.0 ([#8307](https://github.com/opensearch-project/OpenSearch/pull/8307))
- Bump `io.grpc:grpc-context` from 1.46.0 to 1.56.1 ([#8726](https://github.com/opensearch-project/OpenSearch/pull/8726))
- Bump `com.netflix.nebula:gradle-info-plugin` from 12.1.5 to 12.1.6 ([#8724](https://github.com/opensearch-project/OpenSearch/pull/8724))
- Bump `commons-codec:commons-codec` from 1.15 to 1.16.0 ([#8725](https://github.com/opensearch-project/OpenSearch/pull/8725))

### Changed
- Perform aggregation postCollection in ContextIndexSearcher after searching leaves ([#8303](https://github.com/opensearch-project/OpenSearch/pull/8303))
- Make Span exporter configurable ([#8620](https://github.com/opensearch-project/OpenSearch/issues/8620))
- Change InternalSignificantTerms to sum shard-level superset counts only in final reduce ([#8735](https://github.com/opensearch-project/OpenSearch/pull/8735))
- Exclude 'benchmarks' from codecov report ([#8805](https://github.com/opensearch-project/OpenSearch/pull/8805))

### Deprecated

1 change: 1 addition & 0 deletions MAINTAINERS.md
@@ -25,6 +25,7 @@ This document contains a list of maintainers in this repo. See [opensearch-proje
| Sachin Kale | [sachinpkale](https://github.com/sachinpkale) | Amazon |
| Sarat Vemulapalli | [saratvemulapalli](https://github.com/saratvemulapalli) | Amazon |
| Shweta Thareja | [shwetathareja](https://github.com/shwetathareja) | Amazon |
| Sorabh Hamirwasia | [sohami](https://github.com/sohami) | Amazon |
| Suraj Singh | [dreamer-89](https://github.com/dreamer-89) | Amazon |
| Tianli Feng | [tlfeng](https://github.com/tlfeng) | Amazon |
| Vacha Shah | [VachaShah](https://github.com/VachaShah) | Amazon |
4 changes: 2 additions & 2 deletions buildSrc/build.gradle
@@ -102,12 +102,12 @@ dependencies {

api localGroovy()

api 'commons-codec:commons-codec:1.15'
api 'commons-codec:commons-codec:1.16.0'
api 'org.apache.commons:commons-compress:1.23.0'
api 'org.apache.ant:ant:1.10.13'
api 'com.netflix.nebula:gradle-extra-configurations-plugin:10.0.0'
api 'com.netflix.nebula:nebula-publishing-plugin:20.3.0'
api 'com.netflix.nebula:gradle-info-plugin:12.1.5'
api 'com.netflix.nebula:gradle-info-plugin:12.1.6'
api 'org.apache.rat:apache-rat:0.15'
api 'commons-io:commons-io:2.13.0'
api "net.java.dev.jna:jna:5.13.0"
2 changes: 0 additions & 2 deletions client/rest-high-level/build.gradle
@@ -106,8 +106,6 @@ testClusters.all {
extraConfigFile nodeTrustStore.name, nodeTrustStore
extraConfigFile pkiTrustCert.name, pkiTrustCert

// Enable APIs behind feature flags
setting 'opensearch.experimental.feature.search_pipeline.enabled', 'true'
}

thirdPartyAudit.ignoreMissingClasses(
1 change: 1 addition & 0 deletions codecov.yml
@@ -3,6 +3,7 @@ codecov:

ignore:
- "test"
- "benchmarks"

coverage:
precision: 2
11 changes: 1 addition & 10 deletions distribution/src/config/opensearch.yml
@@ -92,10 +92,7 @@ ${path.logs}
# cluster.remote_store.enabled: true
#
# Repository to use for segment upload while enforcing remote store for an index
# cluster.remote_store.repository: my-repo-1
#
# Controls whether cluster imposes index creation only with translog remote store enabled
# cluster.remote_store.translog.enabled: true
# cluster.remote_store.segment.repository: my-repo-1
#
# Repository to use for translog upload while enforcing remote store for an index
# cluster.remote_store.translog.repository: my-repo-1
@@ -128,12 +125,6 @@ ${path.logs}
#opensearch.experimental.feature.extensions.enabled: false
#
#
# Gates the search pipeline feature. This feature enables configurable processors
# for search requests and search responses, similar to ingest pipelines.
#
#opensearch.experimental.feature.search_pipeline.enabled: false
#
#
# Gates the concurrent segment search feature. This feature enables concurrent segment search in a separate
# index searcher threadpool.
#
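The hunk above reflects the repository-setting rename recorded in the changelog for #8719. As a hedged sketch of the index-level counterpart when building settings programmatically (the helper class below is illustrative only; whether additional remote-store settings are required at index creation is not covered by this commit):

import org.opensearch.common.settings.Settings;

// Illustrative sketch: the key name comes from the #8719 changelog entry above,
// and "my-repo-1" mirrors the placeholder used in opensearch.yml.
final class RemoteStoreSettingExample {
    static Settings indexLevelRemoteSegmentRepository() {
        return Settings.builder()
            .put("index.remote_store.segment.repository", "my-repo-1") // was index.remote_store.repository
            .build();
    }
}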
172 changes: 172 additions & 0 deletions libs/common/src/main/java/org/opensearch/common/util/BitMixer.java
@@ -0,0 +1,172 @@
/*
* SPDX-License-Identifier: Apache-2.0
*
* The OpenSearch Contributors require contributions made to
* this file be licensed under the Apache-2.0 license or a
* compatible open source license.
*/

/*
* HPPC
*
* Copyright (C) 2010-2022 Carrot Search s.c.
* All rights reserved.
*
* Refer to the full license file "LICENSE.txt":
* https://github.com/carrotsearch/hppc/blob/master/LICENSE.txt
*/

/*
* Modifications Copyright OpenSearch Contributors. See
* GitHub history for details.
*/

package org.opensearch.common.util;

/**
* Bit mixing utilities from carrotsearch.hppc.
*
* Licensed under ALv2. This is pulled in directly to avoid a full hppc dependency.
*
* The purpose of these methods is to evenly distribute key space over int32
* range.
*/
public final class BitMixer {

// Don't bother mixing very small key domains much.
public static int mix(byte key) {
return key * PHI_C32;
}

public static int mix(byte key, int seed) {
return (key ^ seed) * PHI_C32;
}

public static int mix(short key) {
return mixPhi(key);
}

public static int mix(short key, int seed) {
return mixPhi(key ^ seed);
}

public static int mix(char key) {
return mixPhi(key);
}

public static int mix(char key, int seed) {
return mixPhi(key ^ seed);
}

// Better mix for larger key domains.
public static int mix(int key) {
return mix32(key);
}

public static int mix(int key, int seed) {
return mix32(key ^ seed);
}

public static int mix(float key) {
return mix32(Float.floatToIntBits(key));
}

public static int mix(float key, int seed) {
return mix32(Float.floatToIntBits(key) ^ seed);
}

public static int mix(double key) {
return (int) mix64(Double.doubleToLongBits(key));
}

public static int mix(double key, int seed) {
return (int) mix64(Double.doubleToLongBits(key) ^ seed);
}

public static int mix(long key) {
return (int) mix64(key);
}

public static int mix(long key, int seed) {
return (int) mix64(key ^ seed);
}

public static int mix(Object key) {
return key == null ? 0 : mix32(key.hashCode());
}

public static int mix(Object key, int seed) {
return key == null ? 0 : mix32(key.hashCode() ^ seed);
}

/**
* MH3's plain finalization step.
*/
public static int mix32(int k) {
k = (k ^ (k >>> 16)) * 0x85ebca6b;
k = (k ^ (k >>> 13)) * 0xc2b2ae35;
return k ^ (k >>> 16);
}

/**
* Computes David Stafford variant 9 of 64bit mix function (MH3 finalization step,
* with different shifts and constants).
*
* Variant 9 is picked because it contains two 32-bit shifts which could be possibly
* optimized into better machine code.
*
* @see "http://zimbry.blogspot.com/2011/09/better-bit-mixing-improving-on.html"
*/
public static long mix64(long z) {
z = (z ^ (z >>> 32)) * 0x4cd6944c5cc20b6dL;
z = (z ^ (z >>> 29)) * 0xfc12c5b19d3259e9L;
return z ^ (z >>> 32);
}

/*
* Golden ratio bit mixers.
*/

private static final int PHI_C32 = 0x9e3779b9;
private static final long PHI_C64 = 0x9e3779b97f4a7c15L;

public static int mixPhi(byte k) {
final int h = k * PHI_C32;
return h ^ (h >>> 16);
}

public static int mixPhi(char k) {
final int h = k * PHI_C32;
return h ^ (h >>> 16);
}

public static int mixPhi(short k) {
final int h = k * PHI_C32;
return h ^ (h >>> 16);
}

public static int mixPhi(int k) {
final int h = k * PHI_C32;
return h ^ (h >>> 16);
}

public static int mixPhi(float k) {
final int h = Float.floatToIntBits(k) * PHI_C32;
return h ^ (h >>> 16);
}

public static int mixPhi(double k) {
final long h = Double.doubleToLongBits(k) * PHI_C64;
return (int) (h ^ (h >>> 32));
}

public static int mixPhi(long k) {
final long h = k * PHI_C64;
return (int) (h ^ (h >>> 32));
}

public static int mixPhi(Object k) {
final int h = (k == null ? 0 : k.hashCode() * PHI_C32);
return h ^ (h >>> 16);
}
}
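A minimal usage sketch may help show where these mixers fit (the BitMixerExample class, its bucketFor helper, and the power-of-two table assumption are illustrative and not part of this commit):

import org.opensearch.common.util.BitMixer;

// Hypothetical illustration: BitMixer spreads the raw hashCode() bits so that masking
// the result down to a power-of-two table size does not collapse keys that differ
// only in their high bits.
final class BitMixerExample {
    static int bucketFor(Object key, int tableSize) {
        int h = BitMixer.mix(key);      // MurmurHash3-style finalization of key.hashCode()
        return h & (tableSize - 1);     // tableSize is assumed to be a power of two
    }

    public static void main(String[] args) {
        System.out.println(bucketFor("geohash-cell", 1024));
    }
}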
6 changes: 0 additions & 6 deletions libs/core/.classpath1
@@ -253,12 +253,6 @@
<attribute name="test" value="true"/>
</attributes>
</classpathentry>
<classpathentry sourcepath="/home/alpar/.gradle/caches/modules-2/files-2.1/com.carrotsearch/hppc/0.8.1/b338e50c3f98c7ec2bf67a5efb7fa8726a4a9b2d/hppc-0.8.1-sources.jar" kind="lib" path="/home/alpar/.gradle/caches/modules-2/files-2.1/com.carrotsearch/hppc/0.8.1/ffc7ba8f289428b9508ab484b8001dea944ae603/hppc-0.8.1.jar">
<attributes>
<attribute name="gradle_used_by_scope" value="test"/>
<attribute name="test" value="true"/>
</attributes>
</classpathentry>
<classpathentry sourcepath="/home/alpar/.gradle/caches/modules-2/files-2.1/joda-time/joda-time/2.10.2/fbf6cbd712c30629c77cefa42fe15ca888e609d5/joda-time-2.10.2-sources.jar" kind="lib" path="/home/alpar/.gradle/caches/modules-2/files-2.1/joda-time/joda-time/2.10.2/a079fc39ccc3de02acdeb7117443e5d9bd431687/joda-time-2.10.2.jar">
<attributes>
<attribute name="gradle_used_by_scope" value="test"/>
@@ -8,8 +8,6 @@

package org.opensearch.geo.search.aggregations.bucket;

import com.carrotsearch.hppc.ObjectIntHashMap;
import com.carrotsearch.hppc.ObjectIntMap;
import org.apache.lucene.geo.GeoEncodingUtils;
import org.opensearch.Version;
import org.opensearch.action.index.IndexRequestBuilder;
@@ -26,8 +24,10 @@
import org.opensearch.test.VersionUtils;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Random;
import java.util.Set;

@@ -51,11 +51,11 @@ public abstract class AbstractGeoBucketAggregationIntegTest extends GeoModulePlu

protected static Rectangle boundingRectangleForGeoShapesAgg;

protected static ObjectIntMap<String> expectedDocsCountForGeoShapes;
protected static Map<String, Integer> expectedDocsCountForGeoShapes;

protected static ObjectIntMap<String> expectedDocCountsForSingleGeoPoint;
protected static Map<String, Integer> expectedDocCountsForSingleGeoPoint;

protected static ObjectIntMap<String> multiValuedExpectedDocCountsGeoPoint;
protected static Map<String, Integer> multiValuedExpectedDocCountsGeoPoint;

protected static final String GEO_SHAPE_FIELD_NAME = "location_geo_shape";

@@ -82,7 +82,7 @@ protected boolean forbidPrivateIndexSettings() {
* @throws Exception thrown during index creation.
*/
protected void prepareGeoShapeIndexForAggregations(final Random random) throws Exception {
expectedDocsCountForGeoShapes = new ObjectIntHashMap<>();
expectedDocsCountForGeoShapes = new HashMap<>();
final Settings settings = Settings.builder().put(IndexMetadata.SETTING_VERSION_CREATED, version).build();
final List<IndexRequestBuilder> geoshapes = new ArrayList<>();
assertAcked(prepareCreate(GEO_SHAPE_INDEX_NAME).setSettings(settings).setMapping(GEO_SHAPE_FIELD_NAME, "type" + "=geo_shape"));
@@ -129,7 +129,7 @@ protected void prepareGeoShapeIndexForAggregations(final Random random) throws E
* @throws Exception thrown during index creation.
*/
protected void prepareSingleValueGeoPointIndex(final Random random) throws Exception {
expectedDocCountsForSingleGeoPoint = new ObjectIntHashMap<>();
expectedDocCountsForSingleGeoPoint = new HashMap<>();
createIndex("idx_unmapped");
final Settings settings = Settings.builder()
.put(IndexMetadata.SETTING_VERSION_CREATED, version)
@@ -155,7 +155,7 @@ protected void prepareSingleValueGeoPointIndex(final Random random) throws Excep
}

protected void prepareMultiValuedGeoPointIndex(final Random random) throws Exception {
multiValuedExpectedDocCountsGeoPoint = new ObjectIntHashMap<>();
multiValuedExpectedDocCountsGeoPoint = new HashMap<>();
final Settings settings = Settings.builder().put(IndexMetadata.SETTING_VERSION_CREATED, version).build();
final List<IndexRequestBuilder> cities = new ArrayList<>();
assertAcked(
@@ -31,8 +31,6 @@

package org.opensearch.geo.search.aggregations.bucket;

import com.carrotsearch.hppc.ObjectIntHashMap;
import com.carrotsearch.hppc.cursors.ObjectIntCursor;
import org.opensearch.action.search.SearchResponse;
import org.opensearch.common.geo.GeoBoundingBox;
import org.opensearch.common.geo.GeoPoint;
@@ -49,6 +47,7 @@
import org.opensearch.search.aggregations.bucket.filter.Filter;
import org.opensearch.test.OpenSearchIntegTestCase;

import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Random;
@@ -70,7 +69,7 @@ public void setupSuiteScopeCluster() throws Exception {
Random random = random();
// Creating a BB for limiting the number buckets generated during aggregation
boundingRectangleForGeoShapesAgg = getGridAggregationBoundingBox(random);
expectedDocCountsForSingleGeoPoint = new ObjectIntHashMap<>();
expectedDocCountsForSingleGeoPoint = new HashMap<>();
prepareSingleValueGeoPointIndex(random);
prepareMultiValuedGeoPointIndex(random);
prepareGeoShapeIndexForAggregations(random);
@@ -232,9 +231,9 @@ public void testTopMatch() {
String geohash = cell.getKeyAsString();
long bucketCount = cell.getDocCount();
int expectedBucketCount = 0;
for (ObjectIntCursor<String> cursor : expectedDocCountsForSingleGeoPoint) {
if (cursor.key.length() == precision) {
expectedBucketCount = Math.max(expectedBucketCount, cursor.value);
for (var cursor : expectedDocCountsForSingleGeoPoint.entrySet()) {
if (cursor.getKey().length() == precision) {
expectedBucketCount = Math.max(expectedBucketCount, cursor.getValue());
}
}
assertNotSame(bucketCount, 0);
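Taken together, the geo test changes in this commit follow one pattern: hppc's ObjectIntHashMap/ObjectIntMap and ObjectIntCursor give way to plain java.util collections. A generic sketch of that pattern (the DocCountTracker class and its method names are hypothetical, not taken from the test code):

import java.util.HashMap;
import java.util.Map;

// Hypothetical distillation of the migration shown above: a JDK HashMap<String, Integer>
// replaces hppc's ObjectIntHashMap<String>, and entrySet() iteration replaces
// ObjectIntCursor iteration.
final class DocCountTracker {
    private final Map<String, Integer> expectedDocCounts = new HashMap<>();

    void record(String geohash) {
        expectedDocCounts.merge(geohash, 1, Integer::sum);
    }

    int maxCountAtPrecision(int precision) {
        int max = 0;
        for (Map.Entry<String, Integer> e : expectedDocCounts.entrySet()) {
            if (e.getKey().length() == precision) {
                max = Math.max(max, e.getValue());
            }
        }
        return max;
    }
}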