Commit
Merge branch 'develop' into 10531-contrib IQSS#10531
pdurbin committed Jun 13, 2024
2 parents 6b3d06c + 5bf6b6d commit 5b26d88
Showing 18 changed files with 107 additions and 71 deletions.
4 changes: 2 additions & 2 deletions conf/solr/9.3.0/solrconfig.xml
Original file line number Diff line number Diff line change
@@ -290,7 +290,7 @@
have some sort of hard autoCommit to limit the log size.
-->
<autoCommit>
<maxTime>${solr.autoCommit.maxTime:15000}</maxTime>
<maxTime>${solr.autoCommit.maxTime:30000}</maxTime>
<openSearcher>false</openSearcher>
</autoCommit>

@@ -301,7 +301,7 @@
-->

<autoSoftCommit>
<maxTime>${solr.autoSoftCommit.maxTime:-1}</maxTime>
<maxTime>${solr.autoSoftCommit.maxTime:1000}</maxTime>
</autoSoftCommit>

<!-- Update Related Event Listeners
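For context, the two changes above adjust Solr's commit cadence: hard commits (durability) now happen within 30 seconds, and soft commits (search visibility) within 1 second instead of never. A sketch of the resulting section is below; the enclosing `<updateHandler>` element is assumed from a standard solrconfig.xml, and the `${property:default}` syntax means each value can still be overridden with a JVM system property such as `-Dsolr.autoCommit.maxTime=15000`:

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- hard commit: flush documents to stable storage every 30s,
       without opening a new searcher -->
  <autoCommit>
    <maxTime>${solr.autoCommit.maxTime:30000}</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>

  <!-- soft commit: make newly added documents visible to searches
       within 1s (previously -1, i.e. disabled) -->
  <autoSoftCommit>
    <maxTime>${solr.autoSoftCommit.maxTime:1000}</maxTime>
  </autoSoftCommit>
</updateHandler>
```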
1 change: 1 addition & 0 deletions doc/release-notes/10466-math-challenge-403-error-page.md
@@ -0,0 +1 @@
On the forbidden access error page (also known as the 403 error page), the math challenge is now correctly displayed so that the contact form can be submitted.
1 change: 1 addition & 0 deletions doc/release-notes/10547-solr-updates.md
@@ -0,0 +1 @@
Multiple improvements have been made to the way Solr indexing and searching are done. Response times should be significantly improved. See the individual PRs in this release for details.
1 change: 1 addition & 0 deletions doc/release-notes/10561-3dviewer.md
@@ -0,0 +1 @@
3DViewer by openforestdata.pl has been added to the list of external tools: https://preview.guides.gdcc.io/en/develop/admin/external-tools.html#inventory-of-external-tools
1 change: 1 addition & 0 deletions doc/release-notes/9729-release-notes.md
@@ -0,0 +1 @@
An error is now correctly reported when an attempt is made to assign an identical role to the same collection, dataset, or file. #9729 #10465
Original file line number Diff line number Diff line change
@@ -7,3 +7,4 @@ Data Curation Tool configure file "A GUI for curating data by adding labels, gro
Ask the Data query file Ask the Data is an experimental tool that allows you ask natural language questions about the data contained in Dataverse tables (tabular data). See the README.md file at https://github.com/IQSS/askdataverse/tree/main/askthedata for the instructions on adding Ask the Data to your Dataverse installation.
TurboCurator by ICPSR configure dataset TurboCurator generates metadata improvements for title, description, and keywords. It relies on OpenAI's ChatGPT & ICPSR best practices. See the `TurboCurator Dataverse Administrator <https://turbocurator.icpsr.umich.edu/tc/adminabout/>`_ page for more details on how it works and adding TurboCurator to your Dataverse installation.
JupyterHub explore file The `Dataverse-to-JupyterHub Data Transfer Connector <https://forgemia.inra.fr/dipso/eosc-pillar/dataverse-jupyterhub-connector>`_ is a tool that simplifies the transfer of data between Dataverse repositories and the cloud-based platform JupyterHub. It is designed for researchers, scientists, and data analysts, facilitating collaboration on projects by seamlessly moving datasets and files. The tool is a lightweight client-side web application built using React and relies on the Dataverse External Tool feature, allowing for easy deployment on modern integration systems. Currently optimized for small to medium-sized files, future plans include extending support for larger files and signed Dataverse endpoints. For more details, you can refer to the external tool manifest: https://forgemia.inra.fr/dipso/eosc-pillar/dataverse-jupyterhub-connector/-/blob/master/externalTools.json
3DViewer by openforestdata.pl explore file The 3DViewer by openforestdata.pl can be used to explore 3D files (e.g. STL format). It was presented by Kamil Guryn during the 2020 community meeting (`slide deck <https://osf.io/aqgr5>`_, `video <https://youtu.be/YH4I_kldmGI?si=Y-9dZ3xPXeswchAl&t=2379>`_) and can be found at https://github.com/OpenForestData/open-forest-data-previewers
Original file line number Diff line number Diff line change
@@ -129,6 +129,10 @@ public void setUserSum(Long userSum) {
}

public String getMessageTo() {
if (op1 == null || op2 == null) {
// Fix for the 403 error page: initUserInput has not been called at this point
initUserInput(null);
}
if (feedbackTarget == null) {
return BrandingUtil.getSupportTeamName(systemAddress);
} else if (feedbackTarget.isInstanceofDataverse()) {
Original file line number Diff line number Diff line change
@@ -3,7 +3,6 @@
*/
package edu.harvard.iq.dataverse.engine.command.impl;

import edu.harvard.iq.dataverse.DataFile;
import edu.harvard.iq.dataverse.Dataset;
import edu.harvard.iq.dataverse.Dataverse;
import edu.harvard.iq.dataverse.authorization.DataverseRole;
@@ -18,6 +17,8 @@
import edu.harvard.iq.dataverse.engine.command.DataverseRequest;
import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
import edu.harvard.iq.dataverse.engine.command.exception.IllegalCommandException;
import edu.harvard.iq.dataverse.util.BundleUtil;

import java.util.Collections;
import java.util.HashSet;
import java.util.Map;
@@ -68,11 +69,22 @@ public RoleAssignment execute(CommandContext ctxt) throws CommandException {
throw new IllegalCommandException("User " + user.getUserIdentifier() + " is deactivated and cannot be given a role.", this);
}
}
if(isExistingRole(ctxt)){
throw new IllegalCommandException(BundleUtil.getStringFromBundle("datasets.api.grant.role.assignee.has.role.error"), this);
}
// TODO make sure the role is defined on the dataverse.
RoleAssignment roleAssignment = new RoleAssignment(role, grantee, defPoint, privateUrlToken, anonymizedAccess);
return ctxt.roles().save(roleAssignment);
}

private boolean isExistingRole(CommandContext ctxt) {
return ctxt.roles()
.directRoleAssignments(grantee, defPoint)
.stream()
.map(RoleAssignment::getRole)
.anyMatch(it -> it.equals(role));
}

@Override
public Map<String, Set<Permission>> getRequiredPermissions() {
// for data file check permission on owning dataset
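The new `isExistingRole` guard above is a stream membership test over the grantee's current direct role assignments at the definition point. A minimal, self-contained sketch of the same pattern follows; the `Role` and `Assignment` records here are hypothetical stand-ins for illustration, not the Dataverse `DataverseRole`/`RoleAssignment` classes:

```java
import java.util.List;

public class DuplicateRoleCheck {
    // Hypothetical stand-in for a role; records give value-based equals()
    record Role(String alias) {}

    // Hypothetical stand-in for a role assignment (role granted to an assignee)
    record Assignment(Role role, String assignee) {}

    // Mirrors isExistingRole: true if this assignee already holds the candidate role
    static boolean isExistingRole(List<Assignment> existing, Role candidate, String assignee) {
        return existing.stream()
                .filter(a -> a.assignee().equals(assignee))
                .map(Assignment::role)
                .anyMatch(r -> r.equals(candidate));
    }

    public static void main(String[] args) {
        List<Assignment> existing = List.of(
                new Assignment(new Role("fileDownloader"), "@rando"),
                new Assignment(new Role("curator"), "@alice"));
        // @rando already has fileDownloader, so a second grant is a duplicate
        System.out.println(isExistingRole(existing, new Role("fileDownloader"), "@rando")); // true
        // @alice holds curator, not fileDownloader
        System.out.println(isExistingRole(existing, new Role("fileDownloader"), "@alice")); // false
    }
}
```

In the command itself, a `true` result short-circuits into an `IllegalCommandException` before any `RoleAssignment` is saved, which is what the API test later asserts as a 403 response.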
Original file line number Diff line number Diff line change
@@ -312,7 +312,7 @@ public Future<String> indexDataverse(Dataverse dataverse, boolean processPaths)
String status;
try {
if (dataverse.getId() != null) {
solrClientService.getSolrClient().add(docs);
solrClientService.getSolrClient().add(docs, COMMIT_WITHIN);
} else {
logger.info("WARNING: indexing of a dataverse with no id attempted");
}
@@ -321,14 +321,6 @@
logger.info(status);
return new AsyncResult<>(status);
}
try {
solrClientService.getSolrClient().commit();
} catch (SolrServerException | IOException ex) {
status = ex.toString();
logger.info(status);
return new AsyncResult<>(status);
}

dvObjectService.updateContentIndexTime(dataverse);
IndexResponse indexResponse = solrIndexService.indexPermissionsForOneDvObject(dataverse);
String msg = "indexed dataverse " + dataverse.getId() + ":" + dataverse.getAlias() + ". Response from permission indexing: " + indexResponse.getMessage();
Expand All @@ -353,6 +345,7 @@ public void indexDatasetInNewTransaction(Long datasetId) { //Dataset dataset) {
private static final Map<Long, Boolean> INDEXING_NOW = new ConcurrentHashMap<>();
// semaphore for async indexing
private static final Semaphore ASYNC_INDEX_SEMAPHORE = new Semaphore(JvmSettings.MAX_ASYNC_INDEXES.lookupOptional(Integer.class).orElse(4), true);
static final int COMMIT_WITHIN = 30000; // same as the current autoCommit maxTime

@Inject
@Metric(name = "index_permit_wait_time", absolute = true, unit = MetricUnits.NANOSECONDS,
@@ -1535,8 +1528,7 @@ private String addOrUpdateDataset(IndexableDataset indexableDataset, Set<Long> d
final SolrInputDocuments docs = toSolrDocs(indexableDataset, datafilesInDraftVersion);

try {
solrClientService.getSolrClient().add(docs.getDocuments());
solrClientService.getSolrClient().commit();
solrClientService.getSolrClient().add(docs.getDocuments(), COMMIT_WITHIN);
} catch (SolrServerException | IOException ex) {
if (ex.getCause() instanceof SolrServerException) {
throw new SolrServerException(ex);
@@ -1788,8 +1780,7 @@ private void updatePathForExistingSolrDocs(DvObject object) throws SolrServerExc

sid.removeField(SearchFields.SUBTREE);
sid.addField(SearchFields.SUBTREE, paths);
UpdateResponse addResponse = solrClientService.getSolrClient().add(sid);
UpdateResponse commitResponse = solrClientService.getSolrClient().commit();
UpdateResponse addResponse = solrClientService.getSolrClient().add(sid, COMMIT_WITHIN);
if (object.isInstanceofDataset()) {
for (DataFile df : dataset.getFiles()) {
solrQuery.setQuery(SearchUtil.constructQuery(SearchFields.ENTITY_ID, df.getId().toString()));
@@ -1802,8 +1793,7 @@
}
sid.removeField(SearchFields.SUBTREE);
sid.addField(SearchFields.SUBTREE, paths);
addResponse = solrClientService.getSolrClient().add(sid);
commitResponse = solrClientService.getSolrClient().commit();
addResponse = solrClientService.getSolrClient().add(sid, COMMIT_WITHIN);
}
}
}
@@ -1845,12 +1835,7 @@ public String delete(Dataverse doomed) {
logger.fine("deleting Solr document for dataverse " + doomed.getId());
UpdateResponse updateResponse;
try {
updateResponse = solrClientService.getSolrClient().deleteById(solrDocIdentifierDataverse + doomed.getId());
} catch (SolrServerException | IOException ex) {
return ex.toString();
}
try {
solrClientService.getSolrClient().commit();
updateResponse = solrClientService.getSolrClient().deleteById(solrDocIdentifierDataverse + doomed.getId(), COMMIT_WITHIN);
} catch (SolrServerException | IOException ex) {
return ex.toString();
}
@@ -1870,12 +1855,7 @@ public String removeSolrDocFromIndex(String doomed) {
logger.fine("deleting Solr document: " + doomed);
UpdateResponse updateResponse;
try {
updateResponse = solrClientService.getSolrClient().deleteById(doomed);
} catch (SolrServerException | IOException ex) {
return ex.toString();
}
try {
solrClientService.getSolrClient().commit();
updateResponse = solrClientService.getSolrClient().deleteById(doomed, COMMIT_WITHIN);
} catch (SolrServerException | IOException ex) {
return ex.toString();
}
Original file line number Diff line number Diff line change
@@ -356,8 +356,7 @@ private void persistToSolr(Collection<SolrInputDocument> docs) throws SolrServer
/**
* @todo Do something with these responses from Solr.
*/
UpdateResponse addResponse = solrClientService.getSolrClient().add(docs);
UpdateResponse commitResponse = solrClientService.getSolrClient().commit();
UpdateResponse addResponse = solrClientService.getSolrClient().add(docs, IndexServiceBean.COMMIT_WITHIN);
}

public IndexResponse indexPermissionsOnSelfAndChildren(long definitionPointId) {
@@ -497,26 +496,20 @@ public IndexResponse deleteMultipleSolrIds(List<String> solrIdsToDelete) {
return new IndexResponse("nothing to delete");
}
try {
solrClientService.getSolrClient().deleteById(solrIdsToDelete);
solrClientService.getSolrClient().deleteById(solrIdsToDelete, IndexServiceBean.COMMIT_WITHIN);
} catch (SolrServerException | IOException ex) {
/**
* @todo mark these for re-deletion
*/
return new IndexResponse("problem deleting the following documents from Solr: " + solrIdsToDelete);
}
try {
solrClientService.getSolrClient().commit();
} catch (SolrServerException | IOException ex) {
return new IndexResponse("problem committing deletion of the following documents from Solr: " + solrIdsToDelete);
}
return new IndexResponse("no known problem deleting the following documents from Solr:" + solrIdsToDelete);
}

public JsonObjectBuilder deleteAllFromSolrAndResetIndexTimes() throws SolrServerException, IOException {
JsonObjectBuilder response = Json.createObjectBuilder();
logger.info("attempting to delete all Solr documents before a complete re-index");
solrClientService.getSolrClient().deleteByQuery("*:*");
solrClientService.getSolrClient().commit();
solrClientService.getSolrClient().deleteByQuery("*:*", IndexServiceBean.COMMIT_WITHIN);
int numRowsAffected = dvObjectService.clearAllIndexTimes();
response.add(numRowsClearedByClearAllIndexTimes, numRowsAffected);
response.add(messageString, "Solr index and database index timestamps cleared.");
1 change: 1 addition & 0 deletions src/main/java/propertyFiles/Bundle.properties
@@ -2694,6 +2694,7 @@ datasets.api.datasize.ioerror=Fatal IO error while trying to determine the total
datasets.api.grant.role.not.found.error=Cannot find role named ''{0}'' in dataverse {1}
datasets.api.grant.role.cant.create.assignment.error=Cannot create assignment: {0}
datasets.api.grant.role.assignee.not.found.error=Assignee not found
datasets.api.grant.role.assignee.has.role.error=User already has this role for this dataset
datasets.api.revoke.role.not.found.error="Role assignment {0} not found"
datasets.api.revoke.role.success=Role {0} revoked for assignee {1} in {2}
datasets.api.privateurl.error.datasetnotfound=Could not find dataset.
11 changes: 11 additions & 0 deletions src/test/java/edu/harvard/iq/dataverse/api/DatasetsIT.java
@@ -1717,6 +1717,9 @@ public void testAddRoles(){
giveRandoPermission.prettyPrint();
assertEquals(200, giveRandoPermission.getStatusCode());

// Assert that assigning the same role a second time is rejected
validateAssignExistingRole(datasetPersistentId,randomUsername,apiToken, "fileDownloader");

// Create another random user to become curator:

Response createCuratorUser = UtilIT.createRandomUser();
@@ -1853,6 +1856,14 @@ public void testListRoleAssignments() {
assertEquals(UNAUTHORIZED.getStatusCode(), notPermittedToListRoleAssignmentOnDataverse.getStatusCode());
}

private static void validateAssignExistingRole(String datasetPersistentId, String randomUsername, String apiToken, String role) {
final Response failedGrantPermission = UtilIT.grantRoleOnDataset(datasetPersistentId, role, "@" + randomUsername, apiToken);
failedGrantPermission.prettyPrint();
failedGrantPermission.then().assertThat()
.body("message", containsString("User already has this role for this dataset"))
.statusCode(FORBIDDEN.getStatusCode());
}

@Test
public void testFileChecksum() {

Original file line number Diff line number Diff line change
@@ -438,7 +438,7 @@ public void testMoveDataverse() {
while (checkIndex) {
try {
try {
Thread.sleep(4000);
Thread.sleep(6000);
} catch (InterruptedException ex) {
}
Response search = UtilIT.search("id:dataverse_" + dataverseId + "&subtree=" + dataverseAlias2, apiToken);
6 changes: 6 additions & 0 deletions src/test/java/edu/harvard/iq/dataverse/api/LinkIT.java
@@ -9,6 +9,8 @@
import static jakarta.ws.rs.core.Response.Status.FORBIDDEN;
import static jakarta.ws.rs.core.Response.Status.OK;
import static org.hamcrest.CoreMatchers.equalTo;
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;

@@ -163,6 +165,8 @@ public void testDeepLinks() {
.statusCode(OK.getStatusCode())
.body("data.message", equalTo("Dataverse " + level1a + " linked successfully to " + level1b));

assertTrue(UtilIT.sleepForSearch("*", apiToken, "&subtree="+level1b, 1, UtilIT.GENERAL_LONG_DURATION), "Zero counts in level1b");

Response searchLevel1toLevel1 = UtilIT.search("*", apiToken, "&subtree=" + level1b);
searchLevel1toLevel1.prettyPrint();
searchLevel1toLevel1.then().assertThat()
@@ -184,6 +188,8 @@
.statusCode(OK.getStatusCode())
.body("data.message", equalTo("Dataverse " + level2a + " linked successfully to " + level2b));

assertTrue(UtilIT.sleepForSearch("*", apiToken, "&subtree=" + level2b, 1, UtilIT.GENERAL_LONG_DURATION), "Never found linked dataverse: " + level2b);

Response searchLevel2toLevel2 = UtilIT.search("*", apiToken, "&subtree=" + level2b);
searchLevel2toLevel2.prettyPrint();
searchLevel2toLevel2.then().assertThat()