How to make commands execute on Lambda #3
The `spark-shell` (the Spark driver) has to be brought up on an AWS EC2 instance or in an ECS container that is in the same VPC as the Lambda function. You also need to create the Lambda function through the AWS console. Once that's done, you should see executors connecting back to the driver; that's basically it, and you should be able to run your Spark commands over AWS Lambda. Maybe it would be better if you can share your email or something similar.
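For anyone who prefers the CLI over the console, a rough equivalent of that setup is sketched below. Everything in angle brackets, plus the runtime, handler name, memory size, and timeout, are assumptions for illustration; only the function name and the handler file come from this thread.

```bash
# Hypothetical CLI equivalent of creating the function in the console.
# The runtime, handler function name, role, memory, timeout, and the
# subnet/security-group IDs are placeholders/assumptions; adjust them to
# your account and to the actual handler defined in spark-lambda-os.py.
zip spark-lambda-os.zip spark-lambda-os.py

aws lambda create-function \
  --function-name spark-lambda \
  --runtime python2.7 \
  --handler <handler-module>.<handler-function> \
  --role arn:aws:iam::<account-id>:role/<lambda-execution-role> \
  --zip-file fileb://spark-lambda-os.zip \
  --memory-size 1536 \
  --timeout 300 \
  --vpc-config SubnetIds=<subnet-id>,SecurityGroupIds=<security-group-id>
```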
Ok, I created the function. My email can be found in my GitHub profile :). Thanks again for all your help in getting this up and running on Lambda!
I think the issue is that the LambdaSchedulerBackend is not created; you have to pass another config.
Awesome, yes! Adding that config did the trick.
Nice. I think the executors still haven't registered with the Spark driver. Please check the CloudWatch logs; those should have some info, I believe.
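In case it's useful, here is a sketch for pulling those logs from the CLI; it assumes the default `/aws/lambda/<function-name>` log group that Lambda creates and a Linux `date`:

```bash
# Dump log messages from the last hour for the spark-lambda function.
# Lambda writes to the log group /aws/lambda/<function-name> by default.
aws logs filter-log-events \
  --log-group-name /aws/lambda/spark-lambda \
  --start-time $(( ($(date +%s) - 3600) * 1000 )) \
  --query 'events[].message' \
  --output text
```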
Unfortunately, there still aren't any invocations of the `spark-lambda` function.
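A sketch for confirming that from the CLI, via the function's `Invocations` metric in CloudWatch (GNU `date` assumed; the one-hour window is arbitrary):

```bash
# Sum of spark-lambda invocations over the last hour; no datapoints (or a
# sum of 0) means the driver never actually called the function.
aws cloudwatch get-metric-statistics \
  --namespace AWS/Lambda \
  --metric-name Invocations \
  --dimensions Name=FunctionName,Value=spark-lambda \
  --statistics Sum \
  --period 3600 \
  --start-time "$(date -u -d '1 hour ago' +%Y-%m-%dT%H:%M:%S)" \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%S)"
```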
Hi venkata91, faromero,
Hey DimitarKum, thanks for trying this out. It's pending on my side to resolve this issue; I have to update the documentation. The last time I discussed this with faromero, these are the things I did to make it work. Something has changed on the AWS side between then (when we developed this) and now.
Can you please try the above steps to set up both your EC2 instance and the Lambda function? Please let us know if you get stuck somewhere; happy to help!
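The original list of steps did not survive in this copy of the thread, but one quick sanity check while setting up the two sides is that the Lambda function and the driver's EC2 instance actually share a VPC. A sketch, with `<instance-id>` as a placeholder for your driver instance:

```bash
# Subnets/security groups (and thus the VPC) the function is attached to.
aws lambda get-function-configuration \
  --function-name spark-lambda \
  --query VpcConfig

# VPC of the EC2 instance that runs the spark-shell driver.
aws ec2 describe-instances \
  --instance-ids <instance-id> \
  --query 'Reservations[].Instances[].VpcId'
```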
How do I run commands from the `spark-shell` so that they are executed on Lambda? Right now, the commands are being executed locally on my machine, but I would like Lambda to be the backend. I am running the following command to start the shell (which does start successfully):

```bash
bin/spark-shell --conf spark.hadoop.fs.s3n.awsAccessKeyId=<my-key> \
  --conf spark.hadoop.fs.s3n.awsSecretAccessKey=<my-secret-key> \
  --conf spark.shuffle.s3.bucket=s3://<my-bucket> \
  --conf spark.lambda.function.name=spark-lambda \
  --conf spark.lambda.s3.bucket=s3://<my-bucket>/lambda \
  --conf spark.lambda.spark.software.version=149
```

I have created the function `spark-lambda` to be the contents of `spark-lambda-os.py` and have given it S3 and EC2 permissions. In addition, the S3 bucket `<my-bucket>/lambda` has the package `spark-lambda-149.zip`, which was put together by the `spark-lambda` script. Is there anything else I need to do to have it execute on Lambda?
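For anyone debugging the same symptom: one way to tell whether executors ever register with the driver (as opposed to everything running locally) is Spark's monitoring REST API served by the driver UI. A sketch, assuming the default port 4040, a driver on localhost, and `jq` installed:

```bash
# Grab the application ID known to the driver, then list its executors.
# If only a "driver" entry comes back, no Lambda executors have registered.
APP_ID=$(curl -s http://localhost:4040/api/v1/applications | jq -r '.[0].id')
curl -s "http://localhost:4040/api/v1/applications/${APP_ID}/executors" | jq '.[].id'
```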