
ISO in machine-os-images references RHCOS in 4.12 instead of FCOS #1506

Closed
bernhardloos opened this issue Feb 14, 2023 · 2 comments · Fixed by openshift/release#36453
Comments

@bernhardloos
This leads to Ironic trying to provision a host with RHCOS first.
This was correct in 4.11, where the image referenced FCOS.

From 4.11.0-0.okd-2023-01-14-152430:

docker run --rm quay.io/openshift/okd-content@sha256:2ee50ca9e09fd8865c6b322e5e096a246d819bad0a264a557c4c128d5a95a387 cat coreos/coreos-stream.json | head -n 15
{
    "stream": "stable",
    "metadata": {
        "last-modified": "2022-07-26T22:48:18Z",
        "generator": "fedora-coreos-stream-generator v0.2.7"
    },
    "architectures": {
        "aarch64": {
            "artifacts": {
                "aws": {
                    "release": "36.20220716.3.1",
                    "formats": {
                        "vmdk.xz": {
                            "disk": {
                                "location": "https://builds.coreos.fedoraproject.org/prod/streams/stable/builds/36.20220716.3.1/aarch64/fedora-coreos-36.20220716.3.1-aws.aarch64.vmdk.xz",

From 4.12.0-0.okd-2023-02-04-212953:

docker run --rm quay.io/openshift/okd-content@sha256:4640208b8d3ac8ff0441972588163e29de61b6154734f0cb3ec5f906da57ceff cat coreos/coreos-stream.json | head -n 15
{
  "stream": "rhcos-4.11",
  "metadata": {
    "last-modified": "2022-08-10T14:54:27Z",
    "generator": "plume cosa2stream 0.13.0+9-ga19a99c94"
  },
  "architectures": {
    "aarch64": {
      "artifacts": {
        "aws": {
          "release": "412.86.202208101040-0",
          "formats": {
            "vmdk.gz": {
              "disk": {
                "location": "https://rhcos.mirror.openshift.com/art/storage/releases/rhcos-4.12-aarch64/412.86.202208101040-0/aarch64/rhcos-412.86.202208101040-0-aws.aarch64.vmdk.gz",
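The difference between the two dumps above comes down to the top-level "stream" field of coreos/coreos-stream.json: "stable" is an FCOS production stream, while "rhcos-4.11" is an RHCOS stream. A minimal sketch (the helper name is hypothetical, not part of any OKD tooling) of checking that field on a payload extracted from the image:

```python
import json

def stream_kind(stream_json: str) -> str:
    """Classify a coreos-stream.json payload by its top-level "stream" field.

    Returns "RHCOS" for rhcos-* streams and "FCOS" otherwise (e.g. the
    FCOS production streams "stable", "testing", "next").
    """
    stream = json.loads(stream_json)["stream"]
    return "RHCOS" if stream.startswith("rhcos") else "FCOS"

# Abbreviated payloads matching the two outputs above.
okd_411 = '{"stream": "stable"}'      # from 4.11.0-0.okd-2023-01-14-152430
okd_412 = '{"stream": "rhcos-4.11"}'  # from 4.12.0-0.okd-2023-02-04-212953

print(stream_kind(okd_411))  # FCOS
print(stream_kind(okd_412))  # RHCOS
```

In practice the JSON would come from the machine-os-images container, e.g. `docker run --rm <image> cat coreos/coreos-stream.json`, as in the commands above.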
@okd-project okd-project locked and limited conversation to collaborators Feb 14, 2023
@vrutkovs vrutkovs converted this issue into a discussion Feb 14, 2023
@vrutkovs vrutkovs reopened this Feb 14, 2023
@okd-project okd-project unlocked this conversation Feb 14, 2023
LorbusChris added a commit to LorbusChris/release that referenced this issue Feb 20, 2023
The OKD installer from the `origin` namespace has to be used in the
build in order to actually pull the FCOS bootimage.

Fixes: okd-project/okd#1506
openshift-merge-robot pushed a commit to openshift/release that referenced this issue Feb 20, 2023
* machine-os-images: Use OKD installer for OKD builds

The OKD installer from the `origin` namespace has to be used in the
build in order to actually pull the FCOS bootimage.

Fixes: okd-project/okd#1506

* machine-os-images: Add OKD/SCOS builds
@bernhardloos
Author

With 4.12.0-0.okd-2023-03-05-022504, the image now contains FCOS, but it's version 37.20221127.3.0 instead of 37.20230205.3.0, which is what the cluster actually uses. It's not that critical, because the upgrade works in this case, but it would still be nice if the correct image were installed immediately, to save some time and download bandwidth.

@LorbusChris
Contributor

@bernhardloos technically, that is expected right now, as that's the FCOS version that the installer references for bootimages: https://github.com/openshift/installer/blob/44b1b0bf0377270adcbabb34062a76fedb9028ad/data/data/coreos/fcos.json#L37

OKD/FCOS does not currently produce its own bootimages -- vanilla FCOS is always booted first and then rebased onto the OKD content.
We need coreos/fedora-coreos-tracker#1151 to enable booting into OKD/FCOS right away.

3 participants