Release Verification

Tests to complete

  • User Creation

  • Create two user accounts (instructions) and record the credentials for use in the other tests; a sketch of the suggested per-user configuration follows below. Note: it is suggested to set up a separate .s3cfg file for each user account and use the -c option to s3cmd to specify which user's credentials to use, but -c has been omitted from the examples that follow for clarity.
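
    A minimal sketch of that per-user setup, assuming the two accounts have already been created and their key ID and secret recorded (the file names and comments below are placeholders, not part of any required layout):

      # Hypothetical per-user configuration files; generate a template with
      # `s3cmd --configure` and fill in each account's credentials:
      #   ~/.s3cfg-user1  ->  access_key/secret_key of the first user
      #   ~/.s3cfg-user2  ->  access_key/secret_key of the second user
      # Then select an account per command with -c, e.g.:
      s3cmd -c ~/.s3cfg-user1 ls
      s3cmd -c ~/.s3cfg-user2 ls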

  • S3 Compliance

    • Verify s3cmd commands. Information on configuring s3cmd can be found here. A condensed shell sketch of the basic steps appears after this list.

      1. Verify the user has no buckets - s3cmd ls
      2. Create a bucket - s3cmd mb s3://testbucket
      3. Verify the bucket is now listed for the user - s3cmd ls
      4. Delete the bucket - s3cmd rb s3://testbucket
      5. Verify the bucket is no longer listed for the user - s3cmd ls
      6. Re-create the bucket - s3cmd mb s3://testbucket
      7. Verify the bucket is empty - s3cmd ls s3://testbucket
      8. Store a file - s3cmd put <filename> s3://testbucket/
      9. List the bucket contents and verify the file is listed - s3cmd ls s3://testbucket
      10. Retrieve the file. Store it at a different name than the original - s3cmd get s3://testbucket/<filename> <filename>.out
      11. Verify the size and cryptographic hash (e.g. SHA-256) match those of the original file - ls -l <filename> <filename>.out; sha256sum <filename> <filename>.out
      12. Delete the file - s3cmd del s3://testbucket/<filename>
      13. List the contents of the bucket and verify that the file is no longer listed - s3cmd ls s3://testbucket
      14. Sync a local directory containing several files (preferably with nested subdirectories) to the bucket - s3cmd sync ./ s3://testbucket/
      15. Verify the reported size of the files that were synced - s3cmd du s3://testbucket
      16. Make an empty directory, change into it, sync the files from the bucket back down to the local directory, and verify the files - s3cmd sync s3://testbucket ./
      17. Create another bucket - s3cmd mb s3://testbucket2
      18. Store a file in the new bucket - s3cmd put <filename> s3://testbucket2/
      19. List the buckets for the user and verify the new bucket is listed - s3cmd ls
      20. List the files in all buckets and verify the output is correct - s3cmd la
      21. Attempt to delete one of the buckets and verify that an error is generated due to the bucket not being empty - s3cmd rb s3://testbucket2
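
      A condensed, non-authoritative sketch of steps 1-13 above (the bucket name and testfile are placeholders; add -c <config> to select a user):

        s3cmd ls                                 # 1: expect no buckets
        s3cmd mb s3://testbucket                 # 2: create a bucket
        s3cmd ls                                 # 3: the bucket should be listed
        s3cmd rb s3://testbucket                 # 4: delete the bucket
        s3cmd ls                                 # 5: the bucket should be gone
        s3cmd mb s3://testbucket                 # 6: re-create the bucket
        s3cmd ls s3://testbucket                 # 7: the bucket should be empty
        s3cmd put testfile s3://testbucket/      # 8: store a file
        s3cmd ls s3://testbucket                 # 9: the file should be listed
        s3cmd get s3://testbucket/testfile testfile.out        # 10: fetch under a new name
        ls -l testfile testfile.out; sha256sum testfile testfile.out   # 11: sizes and hashes match
        s3cmd del s3://testbucket/testfile       # 12: delete the file
        s3cmd ls s3://testbucket                 # 13: the file should be gone
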
    • Verify bucket ACL operations (a condensed sketch of the grant/revoke sequence follows this list)

      1. Create an ACL test bucket using the first user's credentials - s3cmd mb s3://acltest
      2. Attempt to delete the bucket using the second user's credentials and verify that a message denying access is returned - s3cmd rb s3://acltest. Also use the first user's credentials to list their buckets and verify the bucket is still listed - s3cmd ls.
      3. Store a file in the bucket - s3cmd put <filename> s3://acltest/
      4. Attempt to fetch a file from the bucket using the second user's credentials and verify that access is denied - s3cmd get s3://acltest/<filename_from_step_3>
      5. Attempt to list the bucket contents using the second user's credentials and verify that access is denied - s3cmd ls s3://acltest
      6. Attempt to store a file in the bucket using the second user's credentials and verify that access is denied - s3cmd put <filename> s3://acltest/
      7. Attempt to read the bucket's acl using the second user's credentials and verify access is denied - s3cmd info s3://acltest
      8. Attempt to modify the bucket's acl using the second user's credentials and verify access is denied - s3cmd setacl --acl-grant=full_control:<second_users_email_here> s3://acltest
      9. Grant the second user read permissions on the bucket - s3cmd setacl --acl-grant=read:<second_users_email_here> s3://acltest.
      10. List the bucket's info and confirm the second user is listed with READ permissions - s3cmd info s3://acltest
      11. Attempt to list the bucket contents using the second user's credentials and verify that access is now granted - s3cmd ls s3://acltest
      12. Repeat steps 6-8
      13. Grant the second user write permissions on the bucket - s3cmd setacl --acl-grant=write:<second_users_email_here> s3://acltest
      14. Attempt to store a file in the bucket using the second user's credentials and verify that access is now granted - s3cmd put <filename> s3://acltest/
      15. Repeat steps 7-8
      16. Grant the second user read_acp permissions on the bucket - s3cmd setacl --acl-grant=read_acp:<second_users_email_here> s3://acltest
      17. Attempt to read the bucket's acl using the second user's credentials and verify access is now granted - s3cmd info s3://acltest
      18. Repeat step 8
      19. Grant the second user write_acp permissions on the bucket - s3cmd setacl --acl-grant=write_acp:<second_users_email_here> s3://acltest
      20. Attempt to modify the bucket's acl using the second user's credentials and verify access is now granted - s3cmd setacl --acl-grant=full_control:<second_users_email_here> s3://acltest
      21. The second user should now have FULL_CONTROL of the bucket, but full control at the bucket level does not allow for fetching objects from the bucket. Attempt to fetch a file from the bucket using the second user's credentials and verify that access is denied - s3cmd get s3://acltest/<filename_from_step_3>
      22. Using the first user's credentials, revoke all permissions on the bucket for the second user - s3cmd setacl --acl-revoke=all:<second_users_email_here> s3://acltest
      23. Repeat steps 6-8.
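
      A condensed sketch of the grant/revoke sequence above, assuming per-user configuration files named ~/.s3cfg-user1 and ~/.s3cfg-user2 and a placeholder email address for the second user:

        # The first user owns s3://acltest and grants permissions one at a time
        s3cmd -c ~/.s3cfg-user1 setacl --acl-grant=read:user2@example.com s3://acltest
        s3cmd -c ~/.s3cfg-user2 ls s3://acltest                    # listing now succeeds
        s3cmd -c ~/.s3cfg-user1 setacl --acl-grant=write:user2@example.com s3://acltest
        s3cmd -c ~/.s3cfg-user2 put testfile s3://acltest/         # storing now succeeds
        s3cmd -c ~/.s3cfg-user1 setacl --acl-grant=read_acp:user2@example.com s3://acltest
        s3cmd -c ~/.s3cfg-user2 info s3://acltest                  # reading the ACL now succeeds
        s3cmd -c ~/.s3cfg-user1 setacl --acl-grant=write_acp:user2@example.com s3://acltest
        s3cmd -c ~/.s3cfg-user2 setacl --acl-grant=full_control:user2@example.com s3://acltest
        # Finally, revoke everything again with the owner's credentials
        s3cmd -c ~/.s3cfg-user1 setacl --acl-revoke=all:user2@example.com s3://acltest
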
  • Large File Testing

    • Use the s3cmd subcommands put, get, and del to store, retrieve, and delete a file of at least 2 GB (max 4 GB) in size. Ensure that the size and cryptographic hash of the retrieved file match those of the original (see the sketch after this list).
    • Upload at least two different files, each larger than 2 GB, simultaneously and verify there are no errors or timeouts. Then retrieve the files simultaneously.
    • Begin uploading a file of at least 2 GB using s3cmd put and give it several seconds to start. In a different shell, upload a much smaller file (a few KB) to the same key. Wait for the upload of the first file to complete, then fetch the key and verify that the size and cryptographic hash of the retrieved file match those of the smaller file.
    • Attempt to upload a file of size greater than 4 GB and ensure that an error about the file being too large is generated.
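
    A sketch of the basic large-file round trip; generating the test file with dd is just one option, and the file and bucket names are placeholders:

      dd if=/dev/urandom of=bigfile bs=1M count=2048     # create a ~2 GB test file
      s3cmd put bigfile s3://testbucket/
      s3cmd get s3://testbucket/bigfile bigfile.out
      ls -l bigfile bigfile.out; sha256sum bigfile bigfile.out   # sizes and hashes should match
      s3cmd del s3://testbucket/bigfile
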
  • Billing and Usage Testing

    • Pending code completion
  • Globally Unique Email Addresses for User Accounts

    • Attempt to create another user account using an email address from one of the users you already created. The request should return an HTTP status code and reason of 409 Conflict, and the response body should be the following XML document:
<?xml version="1.0" encoding="UTF-8"?>
<Error>
  <Code>UserAlreadyExists</Code>
  <Message>The specified email address has already been registered. Email addresses must be unique among all users of the system. Please try again with a different email address.</Message>
  <Resource>/user</Resource>
  <RequestId></RequestId>
</Error>
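
    A minimal sketch of such a request, assuming anonymous user creation is enabled, Riak CS is listening on localhost port 8080, and the /user resource shown in the error document above accepts a JSON body (see the user-creation instructions referenced earlier for the exact interface; the email and name values are placeholders):

      curl -i -X POST http://localhost:8080/user \
           -H 'Content-Type: application/json' \
           --data '{"email": "user1@example.com", "name": "user one"}'
      # Expect: HTTP/1.1 409 Conflict and the UserAlreadyExists error document above
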
  • Globally Unique Bucket Names

    1. Create a bucket - s3cmd mb s3://uniquebucket
    2. Attempt to create the same bucket using another user's credentials. The response from s3cmd should be ERROR: Bucket 'uniquebucket' already exists.
    3. Delete the bucket - s3cmd rb s3://uniquebucket
    4. Using the credentials from step 2, try again to create the bucket; it should now succeed.
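
    A condensed sketch of the steps above, assuming per-user configuration files (the bucket name is a placeholder):

      s3cmd -c ~/.s3cfg-user1 mb s3://uniquebucket     # 1: created by the first user
      s3cmd -c ~/.s3cfg-user2 mb s3://uniquebucket     # 2: should fail: bucket already exists
      s3cmd -c ~/.s3cfg-user1 rb s3://uniquebucket     # 3: delete the bucket
      s3cmd -c ~/.s3cfg-user2 mb s3://uniquebucket     # 4: should now succeed
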
  • Stats Testing

    • Pending code completion
  • Replication Testing

    • Pending code completion

Test Setups

  • Test with Riak, Riak CS, and Stanchion running locally without the use of any load balancers.
  • Test using the SilverLining configuration (e.g. test1.moss.basho.com).