Support pandas>=0.18.1 #443
Conversation
LGTM!
We should probably add this to travis as well.
@twiecki how would I go about doing that?
Ah got it, thanks. Do we want all python/pandas combinations?
Yes.
Need to wait for #442 to fix the current breakage.
merging #442 once tests pass
@yankees714 Can you rebase from master?
If DataFrame.columns has a name set, it was previously maintained by DataFrame.join, but no longer is in recent versions of pandas. This commit resets DataFrame.columns.name before returning, so that the same dataframe is returned across pandas versions.
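The fix the commit describes can be sketched on a toy pair of frames (contents here are hypothetical, not from the PR):

```python
import pandas as pd

# Hypothetical frames; `left` has a name set on its columns Index.
left = pd.DataFrame({'a': [1, 2]})
right = pd.DataFrame({'b': [3, 4]})
left.columns.name = 'factor'

joined = left.join(right)
# Depending on the pandas version, `columns.name` may or may not survive
# the join; reset it so every version returns the same frame.
joined.columns.name = None
```

Resetting the name unconditionally means callers see identical output whether or not the installed pandas propagates it through `join`.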
Force-pushed from 11dca0b to 4910fbb
LGTM, just had two small questions
.travis.yml
Outdated
- "2.7"
- "3.5"
- "3.6"
matrix:
is adding time to the travis build something we're worried about?
Possibly, any thoughts on this @twiecki, @richafrank?
I'm not worried about it.
Each job in the matrix would still take the same amount of time, so we wouldn't suffer any new timeouts. I definitely think we should test an environment that matches what we test for zipline; the others I'm less concerned about. I think we have 4 workers - What's the wall time for a build now?
Is it possible for us to specify the factor, so travis expands the matrix itself?
For the last build on this branch it was 20-30 minutes per job, a bit over an hour total.
Also yeah looks like we should be able to have travis do the expansion, I'll take a look at that
An hour sounds kinda painful to me, but as consumers, what do you all think?
pyfolio/risk.py
Outdated
# drop all-nan rows before the call to `quantile`, and then
# restore them by reindexing the final result. This is fixed
# in pandas 0.19.
longed_threshold = 100*longed_frac.dropna(how='all').quantile(
will this fail flake8 if there aren't spaces around `*`?
Apparently not, as I was just leaving this as it was. For some reason I thought we were consistently not including the whitespace in this project, but now that I look again it seems like we include it more often than not. I'll update this and the other occurrences in this file.
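The dropna/reindex workaround under discussion can be exercised on a toy frame (the values and index here are made up for illustration):

```python
import numpy as np
import pandas as pd

# Toy frame of long-exposure fractions; the middle row is all-NaN, which
# DataFrame.quantile mishandled on pandas 0.18 (fixed in 0.19).
longed_frac = pd.DataFrame(
    [[0.1, 0.2], [np.nan, np.nan], [0.3, 0.4]],
    index=pd.date_range('2017-01-01', periods=3),
)
percentile = 0.5

# Drop all-NaN rows before calling `quantile`, then reindex so those
# rows come back as NaN in the final result.
longed_threshold = 100 * longed_frac.dropna(how='all').quantile(
    percentile, axis='columns',
).reindex(longed_frac.index)
```

On 0.19+ the `dropna`/`reindex` round trip is a no-op for the surviving rows, so the same code works across versions.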
@vikram-narayan I realized there was an issue with one of the workarounds, so I had to modify it (a391f8a), mind taking another look?
just had one question on that commit, lgtm otherwise
# pandas 0.18, use np.nanpercentile by applying to each row of
# the dataframe. This is fixed in pandas 0.19.
#
# longed_threshold = 100*longed_frac.quantile(percentile, axis='columns')
cruft?
I was just leaving these as a reference for when we no longer need to support 0.18, does that sound alright?
yup sounds good
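For reference, the pandas 0.18 fallback the comment describes amounts to applying `np.nanpercentile` row by row; a minimal sketch with made-up data:

```python
import numpy as np
import pandas as pd

# Toy data; np.nanpercentile skips NaNs, unlike DataFrame.quantile on
# pandas 0.18.
longed_frac = pd.DataFrame([[0.1, 0.2], [0.3, np.nan]])
percentile = 0.5

# `percentile` is on [0, 1]; np.nanpercentile expects [0, 100].
longed_threshold = 100 * longed_frac.apply(
    lambda row: np.nanpercentile(row, percentile * 100), axis=1,
)
```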
actually looks like the travis build failed, is that due to the most recent change?
@vikram-narayan I think conda is still broken.
I think @yankees714 included a workaround for that; it looks like python 3.6 and pandas 0.18.1 are incompatible, from one of the travis builds:
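A common way to gate version-specific workarounds like the ones in this PR is to branch on the installed pandas version; a minimal sketch (the flag name here is hypothetical, not from the PR):

```python
import pandas as pd

# Parse major/minor from the installed pandas version string.
major, minor = (int(part) for part in pd.__version__.split('.')[:2])

# Hypothetical flag: the all-NaN-row quantile bug only affects
# pandas < 0.19.
QUANTILE_DROPS_ALL_NAN_ROWS = (major, minor) < (0, 19)
```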
Also updates .travis.yml to build on pandas 0.18.1, 0.19.2, and 0.20.3.
Force-pushed from c4ce1e0 to 3cc1e94
Only remaining concern here was build time - it currently stands at 45 minutes, which seems tolerable to me. Going to merge; we can cut time later if need be.
Awesome! Should probably release a new version then.
@twiecki should we align to completion of the first round of perf attribution studies/tearsheets?
@joshpayne I think we're fine with what we have now.
For compatibility with zipline.