
Apply --additional-tags to EBS volumes #48

Merged
shivaram merged 1 commit into amplab:branch-2.0 from commondatageek:tag-volumes on Sep 7, 2016

Conversation

commondatageek

My company requires me to track ALL of my EC2 usage. That includes EBS volumes.

However, --additional-tags does NOT apply tags to EBS volumes that are attached to instances.

This patch does the following:

  • Adds a --tag-volumes option
  • When --tag-volumes is passed, applies --additional-tags to any EBS volumes attached to the master or slave instances (see the sketch below)

The contribution is my original work, and I license the work to the project under the project's open source license.
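For illustration, here is a minimal sketch of how the volume tagging could look with boto 2 (the EC2 library spark-ec2 uses); the helper name and signature are illustrative, not the actual patch:

```python
from boto.ec2 import connect_to_region

def tag_instance_volumes(region, instances, additional_tags):
    """Apply additional_tags to every EBS volume attached to instances.

    Hypothetical helper: `instances` is a list of boto.ec2 Instance
    objects and `additional_tags` is the dict parsed from
    --additional-tags.
    """
    conn = connect_to_region(region)
    for instance in instances:
        # block_device_mapping maps device names (e.g. '/dev/sda1') to
        # BlockDeviceType objects that carry the attached volume's id.
        volume_ids = [bdev.volume_id
                      for bdev in instance.block_device_mapping.values()]
        if volume_ids:
            conn.create_tags(volume_ids, additional_tags)
```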

- Add --tag-volumes option
- Apply --additional-tags to any EBS volumes that are attached to any
  master or slave instances.
@shivaram
Contributor

shivaram commented Sep 7, 2016

Thanks @aaronj1331 - I was wondering if there was any harm in always applying the tags without the new flag, but I guess some users might be relying on the old behavior.

Code LGTM. I'll try out this snippet and then merge.

@commondatageek
Author

Yeah, you're right, @shivaram. This way allows users to rely on the old behavior if necessary. But the other way you mentioned would at least be more consistent with how instances are already tagged, and would eliminate the need for another flag:

  • Always generate a Name tag for each EBS volume (just as we do with instances)
  • Always apply Name plus --additional-tags (which might be empty) to all EBS volumes (just as we do with instances)

I don't have a firm opinion as to which way is better (the flag-free variant is sketched below).
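For comparison, a hedged sketch of that flag-free alternative, mirroring how instances get their Name tag; the naming scheme and helper are illustrative, not code from this repo:

```python
def volume_tags(cluster_name, instance, additional_tags):
    """Build the tag dict for one instance's volumes: a Name tag plus
    any user-supplied --additional-tags (possibly empty)."""
    tags = dict(additional_tags)
    # Hypothetical naming scheme, analogous to the instance Name tags.
    tags['Name'] = '{c}-volume-{i}'.format(c=cluster_name, i=instance.id)
    return tags
```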

@shivaram
Contributor

shivaram commented Sep 7, 2016

Yeah, I think it's fine to have an additional option, just to avoid any regressions. I tried out the code locally and it seems to work fine. Merging this.

@shivaram shivaram merged commit 06f5d2b into amplab:branch-2.0 Sep 7, 2016
@AEDWIP

AEDWIP commented Sep 8, 2016

spark-ec2 used to be part of the Spark distribution. It now seems to have been
split into a separate repo: https://github.com/amplab/spark-ec2

It does not seem to be listed on https://spark-packages.org/

Does anyone know what the status is? There is a README.md, however I am
unable to find any release notes. Is there a spark-ec2 user mailing list?

Does it support Spark 2.x?

Is there anything in particular I need to be aware of if I want to upgrade a
cluster created using the spark-ec2 distributed with Spark 1.6.x? I would prefer
not to start another cluster, because I had to do a lot of other
configuration work that would be hard to reproduce.

What little I know about upgrading is based on
http://spark.apache.org/docs/latest/spark-standalone.html

[ec2-user ~]$ ls /root

ephemeral-hdfs hadoop-native mapreduce persistent-hdfs scala spark
spark-ec2 tachyon

My naïve guess at how to upgrade would be to:

1. Back up my copy of /root/spark/conf/spark-env.sh
   (I use Java 8 and Python 3, and changed the logging config)

2. Back up /root/spark/conf/log4j.properties

3. Extract spark-2.0 to /root/spark and leave everything else alone?

4. Copy back my versions of spark-env.sh and log4j.properties
Kind regards

Andy

{"api_version":"1.0","publisher":{"api_key":"05dde50f1d1a384dd78767c55493e4bb"
,"name":"GitHub"},"entity":{"external_key":"github/amplab/spark-ec2","title":"
amplab/spark-ec2","subtitle":"GitHub
repository","main_image_url":"https://cloud.githubusercontent.com/assets/14341
8/17495839/a5054eac-5d88-11e6-95fc-7290892c7bb5.png","avatar_image_u
rl":"https://cloud.githubusercontent.com/assets/143418/15842166/7c72
db34-2c0b-11e6-9aed-b52498112777.png","action":{"name&quot
;:"Open">https://cloud.githubusercontent.com/assets/143418/17495839/a5054
eac-5d88-11e6-95fc-7290892c7bb5.png","avatar_image_url":"https://cloud.githubu
sercontent.com/assets/143418/15842166/7c72db34-2c0b-11e6-9aed-b52498112777.png
","action":{"name":"Open in
GitHub","url":"https://github.com/amplab/spark-ec2"}},"updates"
:{"snippets":[{"icon":"PERSON","message&quo
t;:"@shivaram">https://github.com/amplab/spark-ec2"}},"updates":{"snippet
s":[{"icon":"PERSON","message":"@shivaram in #48: Yeah i think its fine to
have an additional option just to avoid any regressions. I tried out the code
locally and it seems to work fine. Merging this"}],"action":{"name":"View Pull
Request","url":"https://github.com//pull/48#issuecomment-24541
0560"}}">https://github.com//pull/48#issuecomment-2454105
60"}}}

@commondatageek commondatageek deleted the tag-volumes branch September 8, 2016 04:03
@shivaram
Contributor

shivaram commented Sep 8, 2016

@AEDWIP There is no separate user mailing list. You can just email the Spark user / dev mailing lists, or open a new issue on this repo if you have a feature request. Regarding upgrading a cluster, there is no officially supported way to do this -- and this was the case before 2.0 as well.

This is not tested, but in addition to what you have I would just suggest backing up all the config files from /root/spark/conf. Also, you'll need to do these steps on all the machines in the cluster (see the sketch below).
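To make that concrete, here is an untested sketch (the caveat above applies) of backing up /root/spark/conf on every machine before swapping in a new Spark build; the host list and backup path are assumptions:

```python
import subprocess

# Replace with the actual master and slave hostnames of your cluster.
HOSTS = ["ec2-master.example.com", "ec2-slave1.example.com"]

for host in HOSTS:
    # Copy the whole conf directory aside so spark-env.sh,
    # log4j.properties, and anything else survive the upgrade.
    subprocess.check_call([
        "ssh", "root@" + host,
        "cp -r /root/spark/conf /root/spark-conf-backup",
    ])
```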

@AEDWIP

AEDWIP commented Sep 8, 2016

Thanks

Andy

