Parse onHold in vmSettings #2418
Conversation
@@ -101,15 +97,17 @@ def http_request(method, url, data, **kwargs):
        return return_value

    # if the request was not handled try to use the mock wireserver data
    if _USE_MOCK_WIRE_DATA_RE.match(url) is not None:
When I designed the mock protocol I wanted requests that would return mock data to be explicit (they had to start with http://mock-goal-state (or https)). As I've been adding more data files this has become cumbersome. Furthermore, the underlying implementation of the mocks in mockwiredata.py always tries to use mock data for HostGAPlugin requests.
While working on this PR I forgot to mark my mock data with "mock-goal-state" and all the new unit tests were OK, but some existing tests were failing. It turns out that the test request was failing on the direct call and was falling back to the HostGAPlugin, changing the default channel in the process.
Some of the HostGAPlugin tests were assuming the initial state of the default channel, but that is incorrect, since it is actually a static variable that can be changed as a side effect of other tests.
I fixed the HostGAPlugin tests, but I also dropped the enforcement of "mock-goal-state" in the test data. Not only is the test data easier to write, but the behaviour of this mock is now consistent with the behaviour of the mock HostGAPlugin (both of them always try to use mock data).
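The relaxed matching described above can be sketched as follows. This is an illustrative sketch, not the agent's actual code: the URL pattern, `route_request` helper, and handler table are hypothetical (168.63.129.16 is the real wireserver address, kept alongside the "mock-goal-state" prefix for illustration).

```python
import re

# Hypothetical sketch: any wireserver-style request is tried against the
# mock data first, instead of requiring an explicit "mock-goal-state" prefix.
_USE_MOCK_WIRE_DATA_RE = re.compile(
    r"https?://(mock-goal-state|168\.63\.129\.16).*", re.IGNORECASE)

def route_request(method, url, handlers):
    """Dispatch a request to a mock handler when the URL matches."""
    if _USE_MOCK_WIRE_DATA_RE.match(url) is not None:
        handler = handlers.get(method)
        if handler is not None:
            return handler(url)
    return None  # not handled; the caller falls back or fails the request
```

With this shape, test data no longer needs the special prefix, and wireserver requests behave the same way HostGAPlugin requests already did.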
Codecov Report
@@            Coverage Diff            @@
##           develop    #2418    +/-  ##
=========================================
  Coverage    71.64%   71.64%
=========================================
  Files          100      100
  Lines        14628    14631     +3
  Branches      2091     2092     +1
=========================================
+ Hits         10480    10483     +3
  Misses        3684     3684
  Partials       464      464
Continue to review full report at Codecov.
            if method == 'GET':
                return protocol.mock_wire_data.mock_http_get(url, **kwargs)
            if method == 'POST':
                return protocol.mock_wire_data.mock_http_post(url, data, **kwargs)
            if method == 'PUT':
                return protocol.mock_wire_data.mock_http_put(url, data, **kwargs)
        except NotImplementedError:
Is NotImplementedError a temporary error we raise for development purposes? I just want to understand why we are not doing anything with it after catching it.
Yes, good comment. I am not totally happy with the choice of NotImplementedError, since that exception sort of gives the impression that this is temporary, as you point out. I used it to mean that the specific "url" given as a parameter is not supported by the mock_http_* functions (you can see where this exception is raised in those functions). Initially I considered a TypeError, but that is currently used to indicate a bad request (e.g. incorrect headers). In the cases where NotImplementedError is raised for a specific request, it is entirely possible to add support for that request in the mocks, even if it is not supported right now, so I went with that error.
As for why we are not doing anything in the except clause, that is explained in a comment a couple of lines below: if the mock_http_* functions don't handle the request we need to continue and either fail the request or pass it to the original function we are mocking, depending on the flag passed as an argument to this function.
Makes sense. Thanks for explaining.
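The fall-through behaviour described above can be sketched in isolation. This is a hypothetical, self-contained illustration of the pattern, not the actual mock code: `mock_handler`, `http_request`, and the `fail_on_unhandled` flag are invented names standing in for the mock_http_* functions and the flag the real function takes.

```python
def mock_handler(url):
    # Stand-in for the mock_http_* functions: raise NotImplementedError
    # when the mocks have no data for this particular URL.
    if "supported" not in url:
        raise NotImplementedError(url)
    return "mock-response"

def http_request(url, fail_on_unhandled, original=lambda u: "real-response"):
    try:
        return mock_handler(url)
    except NotImplementedError:
        pass  # not handled by the mocks; fall through to the logic below
    if fail_on_unhandled:
        # the test asked for unexpected requests to be treated as errors
        raise ValueError("Unexpected HTTP request: {0}".format(url))
    return original(url)  # delegate to the function being mocked
```

The empty except clause is deliberate: the exception only signals "not handled here", and the decision about what to do next belongs to the caller-supplied flag.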
azurelinuxagent/common/protocol/extensions_goal_state_from_vm_settings.py
@@ -244,6 +244,8 @@ def test_default_channel(self, patch_put, patch_upload, _):
        as part of status upload.
        """
        with self.create_mock_protocol() as wire_protocol:
            wire.HostPluginProtocol.is_default_channel = False
Would it be worthwhile to add this as a mock inside create_mock_protocol, since we want the state to be clean after pretty much every run?
Probably, but that requires additional changes in those tests, and many of the tests in this module likely have similar dependencies.
I fixed only the ones that broke.
While doing this I also initialized the FlexibleVersion instances explicitly. Although the default FlexibleVersion is equivalent to version 0.0.0.0, it displays kind of funny; explicit initialization displays slightly better.
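The clean-state suggestion above (resetting the static flag inside the fixture rather than in each test) could be sketched like this. All names here are hypothetical stand-ins for the real classes and fixture, just to show the save/restore pattern under the assumption that `is_default_channel` is a class-level attribute:

```python
import contextlib

class HostPluginProtocol:
    # Class-level (static) flag that tests can mutate as a side effect,
    # mirroring the attribute discussed in this thread.
    is_default_channel = False

@contextlib.contextmanager
def create_mock_protocol():
    """Hypothetical fixture: snapshot and restore the static flag so every
    test starts from a clean default channel, regardless of what previous
    tests did to it."""
    saved = HostPluginProtocol.is_default_channel
    HostPluginProtocol.is_default_channel = False
    try:
        yield "mock-protocol"
    finally:
        HostPluginProtocol.is_default_channel = saved
```

The try/finally guarantees the flag is restored even if the test body raises, which removes the per-test reset lines entirely.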