
Fix/pressure parser error #17 (Merged)

merged 10 commits into develop from fix/pressure-parser-error, Oct 14, 2020

Conversation

@CrepeGoat (Contributor):
resolves #16

…optimization"

This reverts commit a045a8e, reversing
changes made to aefe43d.
…_parser-optimization""

This reverts commit aeff0ad.

The original reversion confirms that the issue is not related to
archive/#11; with that confirmed, the original reversion may be removed
@StokesMIDE (Member) left a comment:

Looks good to me.

@StokesMIDE (Member) left a comment:

A more meaningful exception (e.g. a `TypeError` with a descriptive message) would be better than an `AssertionError`.

@CrepeGoat (Contributor, Author):

> A more meaningful exception (e.g. a `TypeError` with a descriptive message) would be better than an `AssertionError`.

I'd prefer an assert statement for the following reasons:

  1. A `TypeError` is fine when you want to inform the dev that they're misusing a function you wrote; that's an "expected" error in a sense. *This* is producing an error for a fundamental assumption on which the code is based, and if that assumption is wrong *for any reason*, it means the code is *bad*. There is **no** circumstance under which this should **ever** be raised, and a user (including other Python devs) should **never** see it, so making it a `TypeError` doesn't add any particular value.
  2. Since this should never actually be triggered, it'd be fine to remove the assert altogether in production code for performance; executing code with [the interpreter's optimization flag](https://docs.python.org/3/using/cmdline.html#cmdoption-o) does exactly that for assert statements. Alternatively I could explicitly raise an error `if __debug__ == True`, I guess; it's just more verbose.

However, for clarity, I think it's fine to add an error message to the assert explaining the failure.
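As a minimal sketch of the pattern being argued for here (the format/dtype pair below is hypothetical, not taken from the actual parser): an `assert` with a message documents an internal invariant, and running CPython with `-O` strips the whole statement.

```python
import struct

import numpy as np

# Hypothetical (format, dtype) pair standing in for the parser's real ones.
fmt = "<f"
dtype = np.dtype("<f4")

# The invariant: both libraries must agree on the byte width.
# Under `python -O`, this entire statement is compiled away.
assert struct.calcsize(fmt) == dtype.itemsize, (
    f"struct format {fmt!r} and numpy dtype {dtype!r} disagree on size"
)
```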

@CFlaniganMide (Contributor):

So, this is essentially checking if the Python and Numpy implementations match up, right? If we want to check something for the sake of the whole module, which this would be, would it make more sense to move this out of a class definition? If this is a legitimate concern, should we also be checking platform native data sizes? These are allowed to change from one platform to another, and so that actually could fail.
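The platform-dependence concern can be illustrated with `struct` alone (a sketch, not code from this PR): native-mode formats follow the platform's C ABI, while any byte-order prefix switches `struct` to fixed standard sizes.

```python
import struct

# Native mode ('@' or no prefix): sizes follow the platform's C ABI,
# so this value can differ between platforms (typically 4 or 8).
native_long = struct.calcsize("l")

# Standard mode (any of '<', '>', '=', '!'): sizes are fixed by the
# struct documentation, so this is 4 bytes on every platform.
standard_long = struct.calcsize("<l")
```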

@StokesMIDE (Member):

I misinterpreted what the assert was doing; I thought it was validating the channel's formatting string. That said, it seems like this sort of sanity check should be in the unit tests.

@CrepeGoat (Contributor, Author):

> should we also be checking platform native data sizes? These are allowed to change from one platform to another, and so that actually could fail.

@CFlaniganMide actually there should be no platform where this fails: the left side of the equality check comes from the `struct` package, which uses standard (platform-independent) sizes whenever an endianness is specified; the right side is numpy dtypes specified with type-and-size typestrings (e.g., 'i4', 'f2', etc.), which obviously have the same size regardless of platform. So together the whole check should be solid.
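That claim can be sketched directly (the format pairs below are illustrative, not the parser's actual table): with an explicit endianness prefix, `struct` uses standard sizes, and numpy's size-suffixed typestrings fix the width by construction, so the two agree on any platform.

```python
import struct

import numpy as np

# Illustrative (struct format, numpy typestring) pairs; with an explicit
# endianness prefix, struct reports standard sizes, and each numpy
# typestring encodes its byte width in the suffix ('i4' -> 4 bytes).
pairs = [("<h", "<i2"), ("<i", "<i4"), ("<e", "<f2"), ("<d", "<f8")]

for fmt, typestr in pairs:
    assert struct.calcsize(fmt) == np.dtype(typestr).itemsize
```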

@CrepeGoat (Contributor, Author):

> it seems like this sort of sanity check should be in the unit tests.

@StokesMIDE not my preference, but it's addressed in 4aaa724

@CFlaniganMide (Contributor) left a comment:
This generally looks good.

I don't want this discussion to prevent this from moving forward, since I think where it is now is fine. This one check essentially validates that the `struct` module and numpy agree. I'm comfortable assuming that any official releases of CPython and numpy are implemented correctly, and saying that that is all we officially support. Running the check in the module-level `__init__.py` might hypothetically make diagnosing an esoteric error easier if one ever comes up, but I don't think that's likely.
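The import-time alternative floated here might look roughly like this inside the package's `__init__.py` (a hedged sketch: the format pairs and the choice of `ImportError` are assumptions, not code from this PR).

```python
import struct

import numpy as np

# Hypothetical module-level sanity check: fail loudly at import time
# if struct and numpy ever disagree on a byte width.
_SIZE_PAIRS = [("<i", "<i4"), ("<f", "<f4")]

for _fmt, _typestr in _SIZE_PAIRS:
    if struct.calcsize(_fmt) != np.dtype(_typestr).itemsize:
        raise ImportError(
            f"struct format {_fmt!r} and numpy dtype {_typestr!r} "
            "disagree on size; unsupported struct/numpy build"
        )
```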

@CrepeGoat CrepeGoat merged commit e7d8b68 into develop Oct 14, 2020
@CrepeGoat CrepeGoat deleted the fix/pressure-parser-error branch October 14, 2020 15:39