[Spark-1134] only call ipython if no arguments are given; remove IPYTHONOPTS from call #227
Conversation
…only call ipython if no command line arguments were supplied
Merged build triggered.
Merged build started.
Merged build finished.
All automated tests passed.
You can modify your old pull request by pushing new code to the branch you made that pull request from (dianacarroll:master), and GitHub will automatically add the new commits to the pull request.
Thanks. However, the problem is that the pull request in question was …
Gotcha -- yeah, as you've said, usually having a separate branch for each pull request is the way to go. You shouldn't need to delete your whole fork, though -- now that you've closed the pull request that depends on your master branch, you should be good to go!
Can one of the admins verify this patch?
Jenkins, test this please
Actually I guess Jenkins already tested it. I'll merge it. Thanks for the patch!
Merged build triggered.
Merged build started.
Actually sorry, I didn't look at this closely enough. I don't think removing IPYTHON_OPTS is right here -- what Josh wanted was to pass on the command-line options. I'll make a pull request that does that based on your branch. I've reverted the current one because I didn't want to disable IPython Notebook and other options at this moment.
Merged build finished. All automated tests passed.
This is based on @dianacarroll's previous pull request #227, and @JoshRosen's comments on #38. Since we do want to allow passing arguments to IPython, this does the following:

* It documents that IPython can't be used with standalone jobs for now. (Later versions of IPython will deal with PYTHONSTARTUP properly and enable this, see ipython/ipython#5226, but no released version has that fix.)
* If you run `pyspark` with `IPYTHON=1`, it passes your command-line arguments to it. This way you can do stuff like `IPYTHON=1 bin/pyspark notebook`.
* The old `IPYTHON_OPTS` remains, but I've removed it from the documentation, in case people read an old tutorial that uses it.

This is not a perfect solution, and I'd also be okay with keeping things as they are today (ignoring `$@` for IPython and using IPYTHON_OPTS) and only doing the doc change. With this change, though, when IPython fixes ipython/ipython#5226, people will immediately be able to do `IPYTHON=1 bin/pyspark myscript.py` to run a standalone script and get all the benefits of running scripts in IPython (presumably better debugging and such). Without it, there will be no way to run scripts in IPython. @JoshRosen you should probably take the final call on this.

Author: Diana Carroll <dcarroll@cloudera.com>

Closes #294 from mateiz/spark-1134 and squashes the following commits:

747bb13 [Diana Carroll] SPARK-1134 bug with ipython prevents non-interactive use with spark; only call ipython if no command line arguments were supplied

(cherry picked from commit a599e43)
Signed-off-by: Matei Zaharia <matei@databricks.com>
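The dispatch behavior described in that commit message can be sketched roughly as follows. This is a hypothetical simplification, not the actual `bin/pyspark` script: the function name `select_cmd` is invented, and for illustration it prints the command it would run instead of exec'ing it.

```shell
# Hypothetical sketch of the launcher logic discussed above (NOT the real
# bin/pyspark). When IPYTHON=1, the user's command-line arguments are
# forwarded on to ipython; otherwise plain python handles them.
select_cmd() {
  # The legacy IPYTHON_OPTS variable is still honored, as the description notes.
  if [ -n "${IPYTHON_OPTS:-}" ]; then
    IPYTHON=1
  fi
  if [ "${IPYTHON:-0}" = "1" ]; then
    # Pass any legacy options plus the user's arguments to IPython,
    # enabling e.g.: IPYTHON=1 bin/pyspark notebook
    echo "ipython ${IPYTHON_OPTS:+$IPYTHON_OPTS }$*"
  else
    echo "python $*"
  fi
}
```

Under this sketch, `IPYTHON=1 bin/pyspark notebook` would launch the IPython notebook, while a plain `bin/pyspark myscript.py` would keep running under regular `python` until ipython/ipython#5226 makes PYTHONSTARTUP work for scripts.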
Fix small bug in web UI and minor clean-up. There was a bug where sorting order didn't work correctly for write time metrics. I also cleaned up some earlier code that fixed the same issue for read and write bytes. (cherry picked from commit 182f9ba) Signed-off-by: Patrick Wendell <pwendell@gmail.com>
…HONOPTS from call

see comments on Pull Request apache#38 (I couldn't figure out how to modify an existing pull request, so I'm hoping I can withdraw that one and replace it with this one.)

Author: Diana Carroll <dcarroll@cloudera.com>

Closes apache#227 from dianacarroll/spark-1134 and squashes the following commits:

ffe47f2 [Diana Carroll] [spark-1134] remove ipythonopts from ipython command
b673bf7 [Diana Carroll] Merge branch 'master' of github.com:apache/spark
0309cf9 [Diana Carroll] SPARK-1134 bug with ipython prevents non-interactive use with spark; only call ipython if no command line arguments were supplied
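The fix in this earlier patch (before the follow-up in #294) can be sketched as below. This is a hypothetical simplification, not the script's actual text; the function name `launch` is invented, and it prints the interpreter it would use instead of exec'ing it.

```shell
# Hypothetical sketch of the earlier #227 behavior (NOT the real bin/pyspark):
# only call ipython for the interactive case, i.e. when no command-line
# arguments were supplied, so that "pyspark myscript.py" runs the script
# non-interactively under plain python instead of dropping into an IPython shell.
launch() {
  if [ "${IPYTHON:-0}" = "1" ] && [ $# -eq 0 ]; then
    echo "ipython"              # interactive shell: safe to use IPython
  else
    echo "python${*:+ $*}"      # a script was given (or IPYTHON is unset)
  fi
}
```

This is what fixes the SPARK-1134 symptom (IPython preventing non-interactive use), at the cost of ignoring arguments like `notebook`; the follow-up commit above takes the other trade-off and forwards arguments to IPython.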
see comments on Pull Request #38
(I couldn't figure out how to modify an existing pull request, so I'm hoping I can withdraw that one and replace it with this one.)