Merge pull request #9131 from IQSS/develop
merge develop to master for 5.12.1 release
pdurbin authored Nov 4, 2022
2 parents 71341c0 + 11abccf commit cf90431
Showing 90 changed files with 1,141 additions and 2,262 deletions.
115 changes: 115 additions & 0 deletions doc/release-notes/5.12.1-release-notes.md
@@ -0,0 +1,115 @@
# Dataverse Software 5.12.1

This release brings new features, enhancements, and bug fixes to the Dataverse Software. Thank you to all of the community members who contributed code, suggestions, bug reports, and other assistance across the project.

## Release Highlights

### Bug Fix for "Internal Server Error" When Creating a New Remote Account

Unfortunately, as of 5.11, new remote users have seen "Internal Server Error" when creating an account (or when checking notifications just after creating an account). Remote users are those who log in with institutional (Shibboleth), OAuth (ORCID, GitHub, or Google), or OIDC providers.

This is a transient error that can be worked around by reloading the browser (or logging out and back in again), but it's obviously a very poor user experience and a bad first impression. This bug is the primary reason we are putting out this patch release. Other features and bug fixes are coming along for the ride.

### Ability to Disable OAuth Sign Up While Allowing Existing Accounts to Log In

A new option called `:AllowRemoteAuthSignUp` has been added, providing a mechanism for disabling new account sign-ups for specific OAuth2 authentication providers (ORCID, GitHub, Google, etc.) while still allowing logins for already-existing accounts that use that authentication method.

See the [Installation Guide](https://guides.dataverse.org/en/5.12.1/installation/config.html#allowremoteauthsignup) for more information on the setting.
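
As a rough sketch of how a database setting like this is typically managed through the admin API (the exact value format for `:AllowRemoteAuthSignUp` is defined in the linked guide, so treat the value below as illustrative only):

```shell
# Illustrative only: consult the linked guide for the exact value format
# (for example, whether the value is global or specified per OAuth2 provider).
curl -X PUT -d false http://localhost:8080/api/admin/settings/:AllowRemoteAuthSignUp

# Deleting the setting restores the default behavior (sign up allowed).
curl -X DELETE http://localhost:8080/api/admin/settings/:AllowRemoteAuthSignUp
```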

### Production Date Now Used for Harvested Datasets in Addition to Distribution Date (`oai_dc` format)

This fixes the year displayed in the citation for harvested datasets, especially for the `oai_dc` format.

For normal datasets, the date used is the "citation date", which is by default the publication date (the first release date) unless you [change it](https://guides.dataverse.org/en/5.12.1/api/native-api.html#set-citation-date-field-type-for-a-dataset).
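
For reference, the linked call looks roughly like the following sketch, where `$SERVER_URL`, `$API_TOKEN`, and `$ID` are placeholders and the field name must be a date field defined in your metadata blocks:

```shell
# Use the dateOfDeposit field as the citation date for dataset $ID
curl -H "X-Dataverse-key: $API_TOKEN" -X PUT -d dateOfDeposit "$SERVER_URL/api/datasets/$ID/citationdate"

# Revert to the default behavior (the publication date)
curl -H "X-Dataverse-key: $API_TOKEN" -X DELETE "$SERVER_URL/api/datasets/$ID/citationdate"
```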

However, for a harvested dataset, the distribution date was used instead, and this date is not always present in the harvested metadata.

Now, the production date is used for harvested datasets in addition to the distribution date when harvesting with the `oai_dc` format.

### Publication Date Now Used for Harvested Dataset if Production Date is Not Set (`oai_dc` format)

For exports and harvesting in `oai_dc` format, if "Production Date" is not set, "Publication Date" is now used instead. This change is reflected in the [Dataverse 4+ Metadata Crosswalk][] linked from the [Appendix][] of the User Guide.

[Dataverse 4+ Metadata Crosswalk]: https://docs.google.com/spreadsheets/d/10Luzti7svVTVKTA-px27oq3RxCUM-QbiTkm8iMd5C54/edit#gid=1901625433&range=K7
[Appendix]: https://guides.dataverse.org/en/5.12.1/user/appendix.html
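
To see how a given dataset is rendered in `oai_dc`, you can use the metadata export API (a sketch; `$SERVER_URL` and `$PERSISTENT_ID` are placeholders for your installation and a published dataset's persistent identifier):

```shell
# Retrieve the oai_dc export for a published dataset
curl "$SERVER_URL/api/datasets/export?exporter=oai_dc&persistentId=$PERSISTENT_ID"
```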

## Major Use Cases and Infrastructure Enhancements

Changes and fixes in this release include:

- Users creating an account by logging in with Shibboleth, OAuth, or OIDC should not see errors. (Issue 9029, PR #9030)
- When harvesting datasets, I want the Production Date if I can't get the Distribution Date (PR #8732)
- When harvesting datasets, I want the Publication Date if I can't get the Production Date (PR #8733)
- As a sysadmin I'd like to disable (temporarily or permanently) sign ups from OAuth providers while allowing existing users to continue to log in from that provider (PR #9112)
- As a C/C++ developer I want to use Dataverse APIs (PR #9070)

## New DB Settings

The following DB settings have been added:

- `:AllowRemoteAuthSignUp`

See the [Database Settings](https://guides.dataverse.org/en/5.12.1/installation/config.html#database-settings) section of the Guides for more information.
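
If you want to confirm what is currently configured, the admin API can list database settings (a minimal sketch):

```shell
# List all database settings currently set on this installation
curl http://localhost:8080/api/admin/settings
```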

## Complete List of Changes

For the complete list of code changes in this release, see the [5.12.1 Milestone](https://github.com/IQSS/dataverse/milestone/106?closed=1) in GitHub.

For help with upgrading, installing, or general questions please post to the [Dataverse Community Google Group](https://groups.google.com/forum/#!forum/dataverse-community) or email support@dataverse.org.

## Installation

If this is a new installation, please see our [Installation Guide](https://guides.dataverse.org/en/5.12.1/installation/). Please also contact us to get added to the [Dataverse Project Map](https://guides.dataverse.org/en/5.10/installation/config.html#putting-your-dataverse-installation-on-the-map-at-dataverse-org) if you have not done so already.

## Upgrade Instructions

Upgrading requires a maintenance window and downtime. Please plan ahead, create backups of your database, etc.

0\. These instructions assume that you've already successfully upgraded from Dataverse Software 4.x to Dataverse Software 5 following the instructions in the [Dataverse Software 5 Release Notes](https://github.com/IQSS/dataverse/releases/tag/v5.0). After upgrading from the 4.x series to 5.0, you should progress through the other 5.x releases before attempting the upgrade to 5.12.1.

If you are running Payara as a non-root user (and you should be!), **remember not to execute the commands below as root**. Use `sudo` to change to that user first. For example, `sudo -i -u dataverse` if `dataverse` is your dedicated application user.

```shell
export PAYARA=/usr/local/payara5
```

(or `setenv PAYARA /usr/local/payara5` if you are using a `csh`-like shell)

1\. Undeploy the previous version

```shell
$PAYARA/bin/asadmin list-applications
$PAYARA/bin/asadmin undeploy dataverse-<version>
```

2\. Stop Payara and remove the generated directory

```shell
service payara stop
rm -rf $PAYARA/glassfish/domains/domain1/generated
```

3\. Start Payara

```shell
service payara start
```

4\. Deploy this version.

```shell
$PAYARA/bin/asadmin deploy dataverse-5.12.1.war
```

5\. Restart Payara

```shell
service payara stop
service payara start
```
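
As an optional sanity check (not part of the official steps above), you can confirm which version is running once Payara is back up:

```shell
# Should report 5.12.1 after the deployment has started successfully
curl http://localhost:8080/api/info/version
```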

## Upcoming Versions of Payara

With the recent release of Payara 6 ([Payara 6.2022.1](https://github.com/payara/Payara/releases/tag/payara-server-6.2022.1) being the first version), the days of free-to-use Payara 5.x Platform Community versions [are numbered](https://blog.payara.fish/whats-new-in-the-november-2022-payara-platform-release). Specifically, Payara's blog post says, "Payara Platform Community 5.2022.4 has been released today as the penultimate Payara 5 Community release."

Given the end of free-to-use Payara 5 versions, we plan to get the Dataverse software working on Payara 6 (#8305), which will require substantial efforts from the IQSS team and community members, as this also means shifting our app to be a [Jakarta EE 10](https://jakarta.ee/release/10/) application (upgrading from EE 8). We are currently working out the details and will share news as soon as we can. Rest assured we will do our best to provide you with a smooth transition. You can follow along in Issue #8305 and related pull requests and you are, of course, very welcome to participate by testing and otherwise contributing, as always.
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/_static/docsdataverse_org.css
@@ -68,7 +68,7 @@ a.headerlink {
#sidebar.bs-sidenav {
background-color: #f8d5b8;
}
-#sidebar.bs-sidenav .nav > li > a:hover, #sidebar.bs-sidenav .nav > li > a:focus {
+#sidebar.bs-sidenav .nav > li > a:hover, #sidebar.bs-sidenav .nav > li > a:focus, #sidebar.bs-sidenav .nav > li > a.current {
background-color: #fbf4c5;
border-right: 1px solid #dbd8e0;
text-decoration: none;
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/_static/util/clear_timer.sh
@@ -17,7 +17,7 @@ DV_DIR=${PAYARA_DIR}/glassfish/domains/domain1
${PAYARA_DIR}/bin/asadmin stop-domain

rm -rf ${PAYARA_DIR}/${DV_DIR}/generated/
-rm -rf ${PAYARA_DIR}/${DV_DIR}/osgi-cache/felix
+rm -rf ${PAYARA_DIR}/${DV_DIR}/osgi-cache/

# restart the domain (also generates a warning if app server is stopped)
${PAYARA_DIR}/bin/asadmin start-domain
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/admin/dataverses-datasets.rst
@@ -15,7 +15,7 @@ Dataverse collections have to be empty to delete them. Navigate to the Dataverse
Move a Dataverse Collection
^^^^^^^^^^^^^^^^^^^^^^^^^^^

-Moves a Dataverse collection whose id is passed to a new Dataverse collection whose id is passed. The Dataverse collection alias also may be used instead of the id. If the moved Dataverse collection has a guestbook, template, metadata block, link, or featured Dataverse collection that is not compatible with the destination Dataverse collection, you will be informed and given the option to force the move and remove the association. Only accessible to superusers. ::
+Moves a Dataverse collection whose id is passed to an existing Dataverse collection whose id is passed. The Dataverse collection alias also may be used instead of the id. If the moved Dataverse collection has a guestbook, template, metadata block, link, or featured Dataverse collection that is not compatible with the destination Dataverse collection, you will be informed and given the option to force the move and remove the association. Only accessible to superusers. ::

curl -H "X-Dataverse-key: $API_TOKEN" -X POST http://$SERVER/api/dataverses/$id/move/$destination-id

6 changes: 3 additions & 3 deletions doc/sphinx-guides/source/admin/harvestserver.rst
@@ -26,7 +26,7 @@ The email portion of :ref:`systemEmail` will be visible via OAI-PMH (from the "I
How does it work?
-----------------

-Only the published, unrestricted datasets in your Dataverse installation can
+Only the published datasets in your Dataverse installation can
be made harvestable. Remote clients normally keep their records in sync
through scheduled incremental updates, daily or weekly, thus
minimizing the load on your server. Note that it is only the metadata
@@ -115,10 +115,10 @@ Some useful examples of search queries to define OAI sets:

``keywordValue:censorship``

-Important: New SOLR schema required!
+Important: New Solr schema required!
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

-In order to be able to define OAI sets, your SOLR server must be upgraded with the search schema that came with release 4.5 (or later), and all your local datasets must be re-indexed, once the new schema is installed.
+In order to be able to define OAI sets, your Solr server must be upgraded with the search schema that came with release 4.5 (or later), and all your local datasets must be re-indexed, once the new schema is installed.

OAI Set updates
---------------
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/admin/integrations.rst
@@ -57,7 +57,7 @@ their research results and retain links to imported and exported data. Users
can organize their data in "Datasets", which can be exported to a Dataverse installation via
the command-line interface (CLI).

-Renku dataset documentation: https://renku-python.readthedocs.io/en/latest/reference/commands.html#module-renku.cli.dataset
+Renku documentation: https://renku-python.readthedocs.io

Flagship deployment of the Renku platform: https://renkulab.io

2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/admin/metadatacustomization.rst
@@ -565,7 +565,7 @@ In general, the external vocabulary support mechanism may be a better choice for
The specifics of the user interface for entering/selecting a vocabulary term and how that term is then displayed are managed by third-party Javascripts. The initial Javascripts that have been created provide auto-completion, displaying a list of choices that match what the user has typed so far, but other interfaces, such as displaying a tree of options for a hierarchical vocabulary, are possible.
Similarly, existing scripts do relatively simple things for displaying a term - showing the term's name in the appropriate language and providing a link to an external URL with more information, but more sophisticated displays are possible.

-Scripts supporting use of vocabularies from services supporting the SKOMOS protocol (see https://skosmos.org) and retrieving ORCIDs (from https:/orcid.org) are available https://github.com/gdcc/dataverse-external-vocab-support. (Custom scripts can also be used and community members are encouraged to share new scripts through the dataverse-external-vocab-support repository.)
+Scripts supporting use of vocabularies from services supporting the SKOMOS protocol (see https://skosmos.org) and retrieving ORCIDs (from https://orcid.org) are available https://github.com/gdcc/dataverse-external-vocab-support. (Custom scripts can also be used and community members are encouraged to share new scripts through the dataverse-external-vocab-support repository.)

Configuration involves specifying which fields are to be mapped, whether free-text entries are allowed, which vocabulary(ies) should be used, what languages those vocabulary(ies) are available in, and several service protocol and service instance specific parameters.
These are all defined in the :ref:`:CVocConf <:CVocConf>` setting as a JSON array. Details about the required elements as well as example JSON arrays are available at https://github.com/gdcc/dataverse-external-vocab-support, along with an example metadata block that can be used for testing.
14 changes: 7 additions & 7 deletions doc/sphinx-guides/source/admin/solr-search-index.rst
@@ -1,15 +1,15 @@
Solr Search Index
=================

-A Dataverse installation requires Solr to be operational at all times. If you stop Solr, you should see a error about this on the root Dataverse installation page, which is powered by the search index Solr provides. You can set up Solr by following the steps in our Installation Guide's :doc:`/installation/prerequisites` and :doc:`/installation/config` sections explaining how to configure it. This section you're reading now is about the care and feeding of the search index. PostgreSQL is the "source of truth" and the Dataverse installation will copy data from PostgreSQL into Solr. For this reason, the search index can be rebuilt at any time. Depending on the amount of data you have, this can be a slow process. You are encouraged to experiment with production data to get a sense of how long a full reindexing will take.
+A Dataverse installation requires Solr to be operational at all times. If you stop Solr, you should see an error about this on the root Dataverse installation page, which is powered by the search index Solr provides. You can set up Solr by following the steps in our Installation Guide's :doc:`/installation/prerequisites` and :doc:`/installation/config` sections explaining how to configure it. This section you're reading now is about the care and feeding of the search index. PostgreSQL is the "source of truth" and the Dataverse installation will copy data from PostgreSQL into Solr. For this reason, the search index can be rebuilt at any time. Depending on the amount of data you have, this can be a slow process. You are encouraged to experiment with production data to get a sense of how long a full reindexing will take.

.. contents:: Contents:
:local:

Full Reindex
-------------

-There are two ways to perform a full reindex of the Dataverse installation search index. Starting with a "clear" ensures a completely clean index but involves downtime. Reindexing in place doesn't involve downtime but does not ensure a completely clean index.
+There are two ways to perform a full reindex of the Dataverse installation search index. Starting with a "clear" ensures a completely clean index but involves downtime. Reindexing in place doesn't involve downtime but does not ensure a completely clean index (e.g. stale entries from destroyed datasets can remain in the index).

Clear and Reindex
+++++++++++++++++
@@ -22,7 +22,7 @@ Get a list of all database objects that are missing in Solr, and Solr documents

``curl http://localhost:8080/api/admin/index/status``

-Remove all Solr documents that are orphaned (ie not associated with objects in the database):
+Remove all Solr documents that are orphaned (i.e. not associated with objects in the database):

``curl http://localhost:8080/api/admin/index/clear-orphans``

@@ -36,7 +36,7 @@ Please note that the moment you issue this command, it will appear to end users
Start Async Reindex
~~~~~~~~~~~~~~~~~~~

-Please note that this operation may take hours depending on the amount of data in your system. This known issue is being tracked at https://github.com/IQSS/dataverse/issues/50
+Please note that this operation may take hours depending on the amount of data in your system and whether or not your installation is using full-text indexing. More information on this, as well as some reference times, can be found at https://github.com/IQSS/dataverse/issues/50.

``curl http://localhost:8080/api/admin/index``

@@ -60,7 +60,7 @@ If indexing stops, this command should pick up where it left off based on which
Manual Reindexing
-----------------

-If you have made manual changes to a dataset in the database or wish to reindex a dataset that solr didn't want to index properly, it is possible to manually reindex Dataverse collections and datasets.
+If you have made manual changes to a dataset in the database or wish to reindex a dataset that Solr didn't want to index properly, it is possible to manually reindex Dataverse collections and datasets.

Reindexing Dataverse Collections
++++++++++++++++++++++++++++++++
@@ -69,7 +69,7 @@ Dataverse collections must be referenced by database object ID. If you have dire

``select id from dataverse where alias='dataversealias';``

-should work, or you may click the Dataverse Software's "Edit" menu and look for dataverseId= in the URLs produced by the drop-down. Then, to re-index:
+should work, or you may click the Dataverse Software's "Edit" menu and look for *dataverseId=* in the URLs produced by the drop-down. Then, to re-index:

``curl http://localhost:8080/api/admin/index/dataverses/135``

@@ -89,7 +89,7 @@ To re-index a dataset by its database ID:
Manually Querying Solr
----------------------

-If you suspect something isn't indexed properly in solr, you may bypass the Dataverse installation's web interface and query the command line directly to verify what solr returns:
+If you suspect something isn't indexed properly in Solr, you may bypass the Dataverse installation's web interface and query the command line directly to verify what Solr returns:

``curl "http://localhost:8983/solr/collection1/select?q=dsPersistentId:doi:10.15139/S3/HFV0AO"``

2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/admin/troubleshooting.rst
@@ -57,7 +57,7 @@ Ingest is both CPU- and memory-intensive, and depending on your system resources

``/usr/local/payara5/mq/bin/imqcmd -u admin purge dst -t q -n DataverseIngest`` will purge the DataverseIngest queue, and prompt for your confirmation.

-Finally, list destinations to verify that the purge was successful::
+Finally, list destinations to verify that the purge was successful:

``/usr/local/payara5/mq/bin/imqcmd -u admin list dst``

4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/admin/user-administration.rst
@@ -57,9 +57,9 @@ See :ref:`deactivate-a-user`
Confirm Email
-------------

-A Dataverse installation encourages builtin/local users to verify their email address upon signup or email change so that sysadmins can be assured that users can be contacted.
+A Dataverse installation encourages builtin/local users to verify their email address upon sign up or email change so that sysadmins can be assured that users can be contacted.

-The app will send a standard welcome email with a URL the user can click, which, when activated, will store a ``lastconfirmed`` timestamp in the ``authenticateduser`` table of the database. Any time this is "null" for a user (immediately after signup and/or changing of their Dataverse installation email address), their current email on file is considered to not be verified. The link that is sent expires after a time (the default is 24 hours), but this is configurable by a superuser via the ``:MinutesUntilConfirmEmailTokenExpires`` config option.
+The app will send a standard welcome email with a URL the user can click, which, when activated, will store a ``lastconfirmed`` timestamp in the ``authenticateduser`` table of the database. Any time this is "null" for a user (immediately after sign up and/or changing of their Dataverse installation email address), their current email on file is considered to not be verified. The link that is sent expires after a time (the default is 24 hours), but this is configurable by a superuser via the ``:MinutesUntilConfirmEmailTokenExpires`` config option.

Should users' URL token expire, they will see a "Verify Email" button on the account information page to send another URL.
