Cloudcix ds #1351

Merged Sep 19, 2024 (35 commits)

Changes from 30 commits

Commits:
49c00f8  Add CIX datasource with tests (BrinKe-dev, Feb 14, 2022)
3d1d133  Update sample metadata in tests (BrinKe-dev, Feb 15, 2022)
554419e  Don't try to read response json in helper function (BrinKe-dev, Mar 23, 2022)
1fd4423  Add test to check failing get_data call (BrinKe-dev, Mar 23, 2022)
161ce91  Update ca signers list (BrinKe-dev, Mar 23, 2022)
5f5b013  Remove un-used logger and comment (BrinKe-dev, Mar 24, 2022)
03c1a44  Fix signature (BrinKe-dev, Mar 25, 2022)
79877b9  Remove unused import (BrinKe-dev, Mar 28, 2022)
a02caf9  Add CloudCIX to the end of the DataSource list (BrinKe-dev, Mar 30, 2022)
b0ad6e1  Move reading metadata to separate function (BrinKe-dev, Mar 31, 2022)
7413266  Use util.log_time when fetching metadata (BrinKe-dev, Mar 31, 2022)
2d2d951  Add CloudCIX ds docs. Allow config options (BrinKe-dev, Apr 4, 2022)
7ae04ff  Use lazy evaluation for logging message (BrinKe-dev, Apr 4, 2022)
49a89cb  Fix logging message (BrinKe-dev, Apr 4, 2022)
dcd1c21  Generate v1 metadata url (BrinKe-dev, Apr 6, 2022)
0167c1b  Merge branch 'main' into cloudcix_ds (BrinKe-dev, Mar 6, 2024)
bd850a7  Make cloudcix ds use DHCP (BrinKe-dev, Mar 19, 2024)
4afc962  Add tests and ds_config options (BrinKe-dev, Mar 21, 2024)
b29a4a0  Fix lint errors (BrinKe-dev, Mar 21, 2024)
3e5704f  Merge branch 'canonical:main' into cloudcix_ds (BrinKe-dev, Mar 26, 2024)
657ffef  Mock dhcp network for CloudCIX DS unittests (BrinKe-dev, Mar 27, 2024)
f231473  Simplify Tests (BrinKe-dev, Mar 27, 2024)
cdf9335  Merge branch 'canonical:main' into cloudcix_ds (maria-walsh, Aug 21, 2024)
a765050  Address review comments on #1351 (jgrassler, Aug 21, 2024)
9469fe2  Fix test breakage from addressing review comments (jgrassler, Aug 21, 2024)
5e4d2ed  Add missing _unpickle for CloudCIX data source (jgrassler, Aug 21, 2024)
aed1ada  Add NETWORK dependency to CloudCIX data source. (jgrassler, Aug 22, 2024)
5ebac09  Make various checks happy (jgrassler, Aug 30, 2024)
61d9041  Converted test_cloudcix.py to pytest. (jgrassler, Sep 2, 2024)
7d42f49  Addressed second round of review comments (jgrassler, Sep 9, 2024)
6dce91e  Fix sec_between_retries item in CloudCIX data source documentation. (jgrassler, Sep 10, 2024)
8a890c2  Add routes and DNS config to metadata. (jgrassler, Sep 10, 2024)
daacae7  Adjusted unit tests (jgrassler, Sep 10, 2024)
88cdc11  Pass complete netplan configuration data structure. (jgrassler, Sep 17, 2024)
68bcb33  Removed _unpickle and test broken by its lack. (jgrassler, Sep 19, 2024)
1 change: 1 addition & 0 deletions cloudinit/apport.py
@@ -36,6 +36,7 @@
     "Azure",
     "Bigstep",
     "Brightbox",
+    "CloudCIX",
     "CloudSigma",
     "CloudStack",
     "DigitalOcean",
1 change: 1 addition & 0 deletions cloudinit/settings.py
@@ -50,6 +50,7 @@
     "NWCS",
     "Akamai",
     "WSL",
+    "CloudCIX",
     # At the end to act as a 'catch' when none of the above work...
     "None",
 ],
199 changes: 199 additions & 0 deletions cloudinit/sources/DataSourceCloudCIX.py
@@ -0,0 +1,199 @@
# This file is part of cloud-init. See LICENSE file for license information.

import json
import logging
from typing import Any, Dict, Optional

from cloudinit import dmi, net, sources, url_helper, util

LOG = logging.getLogger(__name__)

METADATA_URLS = ["http://169.254.169.254"]
METADATA_VERSION = 1

CLOUDCIX_DMI_NAME = "CloudCIX"


class DataSourceCloudCIX(sources.DataSource):

    dsname = "CloudCIX"
    # Setup read_url parameters through get_url_params()
    url_retries = 3
    url_timeout_seconds = 5
    url_sec_between_retries = 5

    def __init__(self, sys_cfg, distro, paths):
        super(DataSourceCloudCIX, self).__init__(sys_cfg, distro, paths)
        self._metadata_url = None
        self._net_cfg = None

    def _unpickle(self, ci_pkl_version: int) -> None:
        super()._unpickle(ci_pkl_version)
        if not hasattr(self, "_metadata_url"):
            setattr(self, "_metadata_url", None)
        if not hasattr(self, "_net_cfg"):
            setattr(self, "_net_cfg", None)
blackboxsw (Collaborator):

We can drop this _unpickle logic: as this is the first iteration of this datasource, _unpickle guarding is unnecessary because we won't be deserializing an older version which doesn't yet have these instance attrs.

Suggested change:

-    def _unpickle(self, ci_pkl_version: int) -> None:
-        super()._unpickle(ci_pkl_version)
-        if not hasattr(self, "_metadata_url"):
-            setattr(self, "_metadata_url", None)
-        if not hasattr(self, "_net_cfg"):
-            setattr(self, "_net_cfg", None)

jgrassler:

While it may technically be unnecessary, it will cause the unit test that checks for it to fail:

    tests/unittests/test_upgrade.py::TestUpgrade::test_all_ds_init_vs_unpickle_attributes[mode1] - AssertionError: New CloudCIX attributes need unpickle coverage: {'_metadata_url', '_net_cfg'}

So unless that failure is ok, I'll leave it in for now.

blackboxsw (Collaborator):

@jgrassler this unit test failure is just due to the fact that the test is short-sighted and doesn't take into account brand-new datasources which have yet to be released. The following diff fixes this test and drops your unpickle method, as it's only needed if CloudCIX pre-existed in published images and we had to worry about that obj.pkl loading across a system reboot after a package upgrade of cloud-init which introduces new instance attributes.

    drop unneeded unpickle as CloudCIX has not released so no pickle issues

diff --git a/cloudinit/sources/DataSourceCloudCIX.py b/cloudinit/sources/DataSourceCloudCIX.py
index 6371434ee..8f6ef4a1b 100644
--- a/cloudinit/sources/DataSourceCloudCIX.py
+++ b/cloudinit/sources/DataSourceCloudCIX.py
@@ -27,13 +27,6 @@ class DataSourceCloudCIX(sources.DataSource):
         self._metadata_url = None
         self._net_cfg = None
 
-    def _unpickle(self, ci_pkl_version: int) -> None:
-        super()._unpickle(ci_pkl_version)
-        if not hasattr(self, "_metadata_url"):
-            setattr(self, "_metadata_url", None)
-        if not hasattr(self, "_net_cfg"):
-            setattr(self, "_net_cfg", None)
-
     def _get_data(self):
         """
         Fetch the user data and the metadata
diff --git a/tests/unittests/test_upgrade.py b/tests/unittests/test_upgrade.py
index 5c8eef5a5..32a3d7c2b 100644
--- a/tests/unittests/test_upgrade.py
+++ b/tests/unittests/test_upgrade.py
@@ -47,6 +47,7 @@ class TestUpgrade:
             "seed",
             "seed_dir",
         },
+        "CloudCIX": {"_metadata_url", "_net_cfg"},
         "CloudSigma": {"cepko", "ssh_public_key"},
         "CloudStack": {
             "api_ver",

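For readers unfamiliar with the obj.pkl mechanics discussed above, here is a minimal standalone sketch (hypothetical class, not cloud-init code) of the problem an _unpickle guard addresses: pickle restores an object's __dict__ exactly as written and never re-runs __init__, so attributes introduced by a newer release are absent on objects deserialized from an older one.

import pickle

class DataSource:
    def __init__(self):
        self.metadata = None
        self._metadata_url = None  # pretend this attribute is new in this release

    def _unpickle(self):
        # Runs after loading obj.pkl: backfill anything an older
        # on-disk object never had.
        if not hasattr(self, "_metadata_url"):
            self._metadata_url = None

ds = DataSource()
del ds._metadata_url                     # simulate an obj.pkl from an older release
ds = pickle.loads(pickle.dumps(ds))      # round-trip, as across a reboot/upgrade

assert not hasattr(ds, "_metadata_url")  # the new attribute is missing
ds._unpickle()                           # the guard restores a sane default
assert ds._metadata_url is None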
    def _get_data(self):
        """
        Fetch the user data and the metadata
        """
        try:
            crawled_data = util.log_time(
                logfunc=LOG.debug,
                msg="Crawl of metadata service",
                func=self.crawl_metadata_service,
            )
        except sources.InvalidMetaDataException as error:
            LOG.debug(
blackboxsw (Collaborator):

This should be LOG.error, as we have failed to fetch data from a platform that claimed to pass is_platform_viable for CloudCIX based on DMI data. So something is broken in this environment, and that should be represented as an error instead of debug.

jgrassler:

Done.

                "Failed to read data from CloudCIX datasource: %s", error
            )
            return False

        self.metadata = crawled_data["meta-data"]
        self.userdata_raw = util.decode_binary(crawled_data["user-data"])
blackboxsw (Collaborator):

Hasn't this already been decoded in read_metadata via maybe_b64decode?

jgrassler (Aug 22, 2024):

Not quite, no. decode_binary turns the userdata into a Python string. It interprets the bytes it processes as UTF-8, which would yield gibberish if the userdata happened to be Base64-encoded on the outside. That's why both decoding steps are needed: they guard against the possibility of the user data payload being Base64-encoded.

blackboxsw (Collaborator):

Makes sense, thanks for the explanation.

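To make the two decoding layers concrete, a minimal sketch (simplified stand-ins for util.maybe_b64decode and util.decode_binary; assumed behavior, not the actual implementations):

import base64
import binascii

def maybe_b64decode(data: bytes) -> bytes:
    # Strip an outer base64 layer when present; pass other bytes through.
    try:
        return base64.b64decode(data, validate=True)
    except binascii.Error:
        return data

def decode_binary(blob) -> str:
    # Interpret bytes as UTF-8 text; strings pass through unchanged.
    return blob.decode("utf-8") if isinstance(blob, bytes) else blob

wrapped = base64.b64encode(b"#cloud-config\nhostname: demo\n")
print(decode_binary(maybe_b64decode(wrapped)))  # readable cloud-config text
print(decode_binary(wrapped))                   # without the b64 step: still encoded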

        return True

    def crawl_metadata_service(self) -> dict:
        md_url = self.determine_md_url()
        if md_url is None:
            raise sources.InvalidMetaDataException(
                "Could not determine metadata URL"
            )

        data = read_metadata(md_url, self.get_url_params())
        return data

    def determine_md_url(self) -> Optional[str]:
        if self._metadata_url:
            return self._metadata_url

        # Try to reach the metadata server
        url_params = self.get_url_params()
        base_url, _ = url_helper.wait_for_url(
            METADATA_URLS,
            max_wait=url_params.max_wait_seconds,
            timeout=url_params.timeout_seconds,
        )
        if not base_url:
            return None

        # Find the highest supported metadata version
        for version in range(METADATA_VERSION, 0, -1):
            url = url_helper.combine_url(
                base_url, "v{0}".format(version), "metadata"
            )
            try:
                response = url_helper.readurl(url, timeout=self.url_timeout)
            except url_helper.UrlError as e:
                LOG.debug("URL %s raised exception %s", url, e)
                continue

            if response.ok():
                self._metadata_url = url_helper.combine_url(
                    base_url, "v{0}".format(version)
                )
                break
            else:
                LOG.debug("No metadata found at URL %s", url)

        return self._metadata_url
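A side note on the loop above: it probes the highest advertised metadata version first and falls back toward v1. A hypothetical illustration (assuming METADATA_VERSION were bumped to 2):

# With METADATA_VERSION = 2, the probe order would be:
for version in range(2, 0, -1):
    print("http://169.254.169.254/v{0}/metadata".format(version))
# -> http://169.254.169.254/v2/metadata
# -> http://169.254.169.254/v1/metadata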

    @staticmethod
    def ds_detect():
        return is_platform_viable()

    @property
    def network_config(self):
        if self._net_cfg:
            return self._net_cfg

        if not self.metadata:
            return None
        self._net_cfg = self._generate_net_cfg(self.metadata)
        return self._net_cfg

    def _generate_net_cfg(self, metadata):
        netcfg: Dict[str, Any] = {"version": 2, "ethernets": {}}
        macs_to_nics = net.get_interfaces_by_mac()

        for iface in metadata["network"]["interfaces"]:
            name = macs_to_nics.get(iface["mac_address"])
            if name is None:
                LOG.warning(
                    "Metadata MAC address %s not found.", iface["mac_address"]
                )
                continue
            netcfg["ethernets"][name] = {
                "set-name": name,
                "match": {
                    "macaddress": iface["mac_address"].lower(),
                },
                "addresses": iface["addresses"],
            }

        return netcfg
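To illustrate the shape of the output, a hypothetical run (made-up metadata; assumes net.get_interfaces_by_mac() maps the MAC below to eth0) would yield a netplan v2 fragment like:

metadata = {
    "network": {
        "interfaces": [
            {"mac_address": "aa:bb:cc:dd:ee:ff", "addresses": ["192.0.2.10/24"]}
        ]
    }
}
# _generate_net_cfg(metadata) would then return:
# {
#     "version": 2,
#     "ethernets": {
#         "eth0": {
#             "set-name": "eth0",
#             "match": {"macaddress": "aa:bb:cc:dd:ee:ff"},
#             "addresses": ["192.0.2.10/24"],
#         }
#     },
# }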


def is_platform_viable() -> bool:
    return dmi.read_dmi_data("system-product-name") == CLOUDCIX_DMI_NAME


def read_metadata(base_url: str, url_params):
    """
    Read metadata from the metadata server at base_url.

    :returns: dictionary of retrieved metadata and user data containing the
              following keys: meta-data, user-data
    :param: base_url: metadata server's base URL
    :param: url_params: URL retrieval parameters; the attributes used are
            `num_retries`, `sec_between_retries` and `timeout_seconds`.
    :raises: InvalidMetaDataException upon network error connecting to the
             metadata URL, an error response from the metadata server, or
             failure to decode/parse the metadata and userdata payload.
    """
    md = {}
    leaf_key_format_callback = (
        ("metadata", "meta-data", util.load_json),
        ("userdata", "user-data", util.maybe_b64decode),
    )

    for url_leaf, new_key, format_callback in leaf_key_format_callback:
        try:
            response = url_helper.readurl(
                url=url_helper.combine_url(base_url, url_leaf),
                retries=url_params.num_retries,
                sec_between=url_params.sec_between_retries,
                timeout=url_params.timeout_seconds,
            )
        except url_helper.UrlError as error:
            raise sources.InvalidMetaDataException(
                f"Failed to fetch IMDS {url_leaf}: "
                f"{base_url}/{url_leaf}: {error}"
            )

        if not response.ok():
            raise sources.InvalidMetaDataException(
                f"No valid {url_leaf} found. "
                f"URL {base_url}/{url_leaf} returned code {response.code}"
            )

        try:
            md[new_key] = format_callback(response.contents)
        except json.decoder.JSONDecodeError as exc:
            raise sources.InvalidMetaDataException(
                f"Invalid JSON at {base_url}/{url_leaf}: {exc}"
            ) from exc
    return md
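A hypothetical call for reference (SimpleNamespace stands in for the object get_url_params() returns; the attribute names are taken from the usage above):

from types import SimpleNamespace

params = SimpleNamespace(
    num_retries=3, sec_between_retries=5, timeout_seconds=5
)
md = read_metadata("http://169.254.169.254/v1", params)
print(sorted(md))  # ['meta-data', 'user-data']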


# Used to match classes to dependencies
datasources = [
    (DataSourceCloudCIX, (sources.DEP_FILESYSTEM, sources.DEP_NETWORK)),
]


# Return a list of data sources that match this set of dependencies
def get_datasource_list(depends):
    return sources.list_from_depends(depends, datasources)
4 changes: 1 addition & 3 deletions cloudinit/url_helper.py
@@ -268,9 +268,7 @@ def __init__(self, contents, url, code=200):
         self.url = url
 
     def ok(self, *args, **kwargs):
-        if self.code != 200:
-            return False
-        return True
+        return self.code == 200
 
     def __str__(self):
         return self.contents.decode("utf-8")
1 change: 1 addition & 0 deletions doc/rtd/reference/datasources.rst
@@ -43,6 +43,7 @@ The following is a list of documentation for each supported datasource:
    datasources/altcloud.rst
    datasources/ec2.rst
    datasources/azure.rst
+   datasources/cloudcix.rst
    datasources/cloudsigma.rst
    datasources/cloudstack.rst
    datasources/configdrive.rst
33 changes: 33 additions & 0 deletions doc/rtd/reference/datasources/cloudcix.rst
@@ -0,0 +1,33 @@
.. _datasource_cloudcix:

CloudCIX
========

`CloudCIX`_ serves metadata through an internal server, accessible at
``http://169.254.169.254/v1``. The metadata and userdata can be fetched at
the ``/metadata`` and ``/userdata`` paths respectively.

CloudCIX instances are identified by the DMI product name `CloudCIX`.

Configuration
-------------

The CloudCIX datasource has the following config options:

::

   datasource:
     CloudCIX:
       retries: 3
       timeout: 2
       sec_between_retries: 2


- *retries*: The number of times the datasource should try to connect to the
  metadata service
- *timeout*: How long in seconds to wait for a response from the metadata
  service
- *wait*: How long in seconds to wait between consecutive requests to the
  metadata service

.. _CloudCIX: https://www.cloudcix.com/

blackboxsw (Collaborator, Sep 9, 2024):

Suggested change:

-- *wait*: How long in seconds to wait between consecutive requests to the
+- *sec_between_retries*: How long in seconds to wait between consecutive requests to the

jgrassler:

Fixed.
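A minimal sketch (assuming only the endpoints documented above) of fetching both payloads from inside a CloudCIX instance:

import json
import urllib.request

BASE = "http://169.254.169.254/v1"

with urllib.request.urlopen(BASE + "/metadata", timeout=2) as resp:
    metadata = json.load(resp)

with urllib.request.urlopen(BASE + "/userdata", timeout=2) as resp:
    userdata = resp.read()  # may still be base64-wrapped; see read_metadata above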