Dataverse upload
Here are the steps needed to prepare to upload to the Dataverse with AWS:
- Uploading SOURCE-ZIP-FILE to the minerva-dataverse-uploads bucket on AWS S3 (see the command sketch after this list)
- Creating an AWS EC2 instance (a cloud server) with an associated security group
- Creating/Locating your user-specific private key file
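The exact commands for these preparation steps depend on your AWS account setup; the following is only a rough sketch using the AWS CLI, in which the key-pair name, AMI ID, instance type, and security-group ID are placeholders rather than values from this guide:

```
# Copy the source zip into the upload bucket
aws s3 cp SOURCE-ZIP-FILE s3://minerva-dataverse-uploads/SOURCE-ZIP-FILE

# Create a key pair and save the private key locally
aws ec2 create-key-pair --key-name YOUR-KEY-NAME \
    --query 'KeyMaterial' --output text > YOUR-PRIVATE-KEY.pem

# Launch an EC2 instance tied to an existing security group
aws ec2 run-instances --image-id ami-EXAMPLE --instance-type t2.micro \
    --key-name YOUR-KEY-NAME --security-group-ids sg-EXAMPLE
```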
Open a terminal. Using the path to YOUR-PRIVATE-KEY.pem, which you have just downloaded, connect to YOUR-EC2-PREFIX.compute-1.amazonaws.com with ssh:
ssh -i YOUR-PRIVATE-KEY.pem ec2-user@YOUR-EC2-PREFIX.compute-1.amazonaws.com
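If ssh rejects the connection with a warning that the private key file's permissions are too open (a common situation not covered above), restrict the file and retry:

```
chmod 400 YOUR-PRIVATE-KEY.pem
```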
While connected to the EC2 instance, install Java and download the DVUploader jar:
sudo yum install java
wget https://github.com/GlobalDataverseCommunityConsortium/dataverse-uploader/releases/download/v1.1.0/DVUploader-v1.1.0.jar
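As an optional sanity check (not part of the original steps), you can confirm that Java installed and the jar downloaded before continuing:

```
java -version
ls -lh DVUploader-v1.1.0.jar
```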
Run aws configure, entering your AWS ACCESS KEY and AWS SECRET KEY when prompted.
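aws configure prompts for a few values interactively; a typical session looks roughly like the one below. The us-east-1 region is an assumption here, based on the compute-1.amazonaws.com hostname used above, which corresponds to that region.

```
$ aws configure
AWS Access Key ID [None]: YOUR-AWS-ACCESS-KEY
AWS Secret Access Key [None]: YOUR-AWS-SECRET-KEY
Default region name [None]: us-east-1
Default output format [None]: json
```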
Then copy SOURCE-ZIP-FILE from AWS S3. This assumes the source zip file is already available in the minerva-dataverse-uploads S3 bucket:
aws s3 cp s3://minerva-dataverse-uploads/SOURCE-ZIP-FILE SOURCE-ZIP-FILE
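If the copy fails, an optional check (assuming your AWS credentials grant read access to the bucket) is to list the bucket contents and confirm the local copy arrived:

```
aws s3 ls s3://minerva-dataverse-uploads/
ls -lh SOURCE-ZIP-FILE
```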
Upload, substituting YOUR-DATAVERSE-API-KEY, TARGET-DOI, and SOURCE-ZIP-FILE:
java -jar DVUploader-v1.1.0.jar -directupload -key=YOUR-DATAVERSE-API-KEY -did=doi:TARGET-DOI -server=https://dataverse.harvard.edu SOURCE-ZIP-FILE
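To confirm the upload reached the intended dataset, one option is to query the Dataverse native API for the dataset record; this curl call is a sketch based on the standard Dataverse API rather than a step from the original guide:

```
curl -H "X-Dataverse-key: YOUR-DATAVERSE-API-KEY" \
  "https://dataverse.harvard.edu/api/datasets/:persistentId/?persistentId=doi:TARGET-DOI"
```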