
Meta: improving the shim review process #345

Open
steve-mcintyre opened this issue Sep 21, 2023 · 33 comments
Assignees
Labels
meta Not a review request, but an issue or notice wrt the signing process


@steve-mcintyre
Collaborator

Improving the shim review process

We've had quite some problems getting shims reviewed over the last couple of years, and it has not been a great experience for anybody involved:

  • Lack of trusted reviewers has left many of the submissions without any apparent progress for a long time, causing obvious frustration for the submitters.
  • Various vendors have been left hanging, potentially delaying releases and product launches.
  • For the very small number of trusted reviewers, things have been stressful. Lack of time and burnout have both been real problems here.
  • The larger / better-known projects have been seen as "jumping the queue", fairly or not.

So, let's work out a better process. I'm suggesting a few things here; please respond in the comments below.

We need more reviewers

The hope so far has been that shim submitters would also review some of the submissions from other people. However, that has been a difficult / daunting thing for many. Spending time doing this has often been a thankless task, with little feedback to suggest that it's actually useful or valued.

Some people have enthusiastically stepped up, nonetheless! I'm explicitly calling out a number of people who've been active in reviewing shim submissions despite the problems. I've spent a chunk of the last week reading and counting review comments in the last 150-ish submissions, and some names are particularly prominent here!

Others have helped too; I'm not going to list everybody here right now! (But see below...) However, I think that these 5 people have clearly already shown they really understand what's needed from a review. I'd like to offer each of these people trusted reviewer status right now. Guys: please talk to me about this, and let me know if you're interested?

I really want to thank everybody who has helped with reviews, not just these people! There are also a few developers who have contacted me out of band offering to help and asking how to do it. So, let's make that easier for people in the future. If you want to help, read the docs (see below!), look at some example reviews, and start doing it! If you're not sure about something, then be clear and say so. If you think the docs are unclear, say so (and ideally propose improvements!). Once you've reviewed a few submissions and we can see how you're working, we can compare your results with others', and if we think you're doing a good job then you get to be trusted. I'm not going to propose formal training here, as I don't think we have enough people to make that work. But we should be able to learn from each other and the docs.

How does that sound?

We need more trust

Some of the reviews only get comments from one reviewer, which means it's likely that we'll have missed things or made mistakes. Judgement calls might go different ways, depending on the reviewer. What I'd love to see going forwards is a more distributed set of reviews. Could we wait until we get (for example) three independent reviews of each submission, with at least one from an anointed/trusted reviewer? Is that too many, or too few? Feedback welcome!

Tagging things with labels only works for trusted reviewers, so instead of that let's just say what we think in the comments. Once a submission has enough positive reviews, a trusted person can apply the "accepted" label and we move on.

We need more and better docs

I started writing some docs for the shim review process a while back, adding them to the shim wiki. Others have helped: thanks to @pnardini in particular here. But I've not managed to document everything I wanted, and I also get the impression that people are not finding those docs.

There's currently a guide for reviewers and a guide for submitters. I hope they're useful? I'd also like to collect together some best practices / HOWTO docs for developers trying to do the right thing with their Secure Boot stack. Some of the well-established projects and vendors have spent a lot of time and effort working out how to do things. Some of these steps are obvious, a lot of them are not - let's share and make it easier for new people to do the right thing!

Could you help with these docs?

It's been suggested (@pnardini again!) that we should move those docs from the shim wiki into the shim-review repo directly, to make it easier for everybody to find them and also easier to collaborate on updating docs. I think that's a great idea, so I've added #344 to do the initial import of current wiki state.

Automation

Some of what we're looking for in reviews can be automated, of course. @jclab-joseph in #340 is working on a bot to do some of this for us. Let's hope this helps too. :-)
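As a sketch of the kind of check such a bot could run: validating that a submission's SBAT data is well-formed CSV with sane generation numbers. The rules below are purely illustrative, not the bot's actual logic:

```python
import csv
import io

# SBAT metadata is CSV with six fields per entry:
# component_name, component_generation, vendor_name,
# vendor_package_name, vendor_version, vendor_url
EXPECTED_FIELDS = 6

def check_sbat(sbat_text):
    """Return a list of problems found in an SBAT section (illustrative rules)."""
    problems = []
    rows = [r for r in csv.reader(io.StringIO(sbat_text)) if r]
    if not rows or rows[0][0] != "sbat":
        problems.append("first entry must be the 'sbat' format version row")
    for lineno, row in enumerate(rows, start=1):
        if len(row) != EXPECTED_FIELDS:
            problems.append(f"line {lineno}: expected {EXPECTED_FIELDS} fields, got {len(row)}")
            continue
        if not row[1].isdigit() or int(row[1]) < 1:
            problems.append(f"line {lineno}: generation '{row[1]}' is not a positive integer")
    return problems
```

A clean section returns an empty list; malformed rows come back as human-readable complaints a bot could post as a review comment.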

Feedback (and patches!) welcome

How do my suggestions sound? Will this help? Am I sounding crazy? Have I missed something important? You know what to do!

List of people

Here's a full-ish list of people who've contributed or shown interest in shim reviews in the last couple of years, so everybody gets notified about this discussion. Apologies for the spam if you didn't want to hear more...

@ecos-platypus @aronowski @dennis-tseng99 @tSU-RooT @THS-on @pjbrown
@SherifNagy @ClaudioGranatiero-10zig @evilteq @keithdlopresto
@realnickel @nicholasbishop @Doncuppjr @christopherco @Burmash
@miray-tf @mheese @rehakp @mikebeaton @tedbranston @christoph-at-unicon

@steve-mcintyre steve-mcintyre added the meta Not a review request, but an issue or notice wrt the signing process label Sep 21, 2023
@steve-mcintyre steve-mcintyre self-assigned this Sep 21, 2023
@steve-mcintyre steve-mcintyre pinned this issue Sep 21, 2023
@aronowski
Collaborator

Thanks for the thread, Steven. Here's my take on that.

Spending time doing this has often been a thankless task, with little feedback to suggest that it's actually useful or valued.

Some people have enthusiastically stepped up, nonetheless!

IMHO, while the process is official, it may well serve as a learning space for people who do not necessarily have prior experience. Like some sort of science club at a university where students from different backgrounds can come, ask questions, make mistakes and learn how to do things the correct way.

I'd like to offer each of these people trusted reviewer status right now. Guys: please talk to me about this, and let me know if you're interested?

Yes, I am interested.
Though if there are details to discuss, you still do have my email address and public key, don't you? ;-)

Could we wait until we get (for example) three independent reviews of each submission, with at least one from an anointed/trusted reviewer? Is that too many, or too few? Feedback welcome!

In regard to this, I'd wait for more automation, like Joseph's review bot, and discuss what the most important details to look at are in the places where automation is tough; say, using correct generation numbers in light of what happened historically.
I'd rather have manual reviews focus on these kinds of details; once we know how much there might be to review, we can decide how many reviews there should be.

It's been suggested (@pnardini again!) that we should move those docs from the shim wiki into the shim-review repo directly, to make it easier for everybody to find them and also easier to collaborate on updating docs. I think that's a great idea, so I've added #344 to do the initial import of current wiki state.

And it does not have the recent fixes I incorporated just before you created this PR. :-(

Will this help?

Yes.

Am I sounding crazy?

Yes, and that's a good thing - we, or at least I, need more craziness in this world!
If not for the fact I'm representing companies and partaking in official venues, I'd be showing that side of me much more often.

@SherifNagy
Collaborator

Thank you so much @steve-mcintyre for all your hard work and dedication, those suggestions sound great

I always bump into one or two of those names whenever I am reading any review here and their feedback is always on point!

We need more trust

Some of the reviews only get comments from one reviewer, which means it's likely that we'll have missed things or made mistakes. Judgement calls might go different ways, depending on the reviewer. What I'd love to see going forwards is a more distributed set of reviews. Could we wait until we get (for example) three independent reviews of each submission, with at least one from an anointed/trusted reviewer? Is that too many, or too few? Feedback welcome!

Having 3 reviewers would be ideal and great. I would suggest requiring that maybe for a vendor's 1st or 2nd submission, just in case someone missed something; after that it can be two reviewers, including 1 trusted reviewer? I guess one of the main points is that we don't have enough trusted reviewers, and overloading them with more submissions might backfire. But we can start with 3 reviewers per submission as you mentioned and see how it goes; we can always change / adapt later on

We need more and better docs

Totally agree

Could you help with these docs?

Yes, I am for it.

Automation

Some of what we're looking for in reviews can be automated, of course. @jclab-joseph in #340 is working on a bot to do some of this for us. Let's hope this helps too. :-)

This would be very helpful

@ClaudioGranatiero-10zig

Thanks @steve-mcintyre and @aronowski and all the others. I'm not the most active of the peer reviewers, but if I can help, let me know and I'll try.

@rehakp

rehakp commented Sep 22, 2023

Hello @steve-mcintyre

So, let's work out a better process. I'm suggesting a few things here; please respond in the comments below.

Great to see your initiative and enthusiasm to heal this small world, I've really appreciated and recognized all of your submissions - thanks for them!

We need more reviewers

The hope so far has been that shim submitters would also review some of the submissions from other people. However, that has been a difficult / daunting thing for many. Spending time doing this has often been a thankless task, with little feedback to suggest that it's actually useful or valued.

Some people have enthusiastically stepped up, nonetheless! I'm explicitly calling out a number of people who've been active in reviewing shim submissions despite the problems. I've spent a chunk of the last week reading and counting review comments in the last 150-ish submissions, and some names are particularly prominent here!

Fantastic, I have recognized those reviewers for quite some time as well; I am all for promoting them to trusted ones, that makes perfect sense.

We need more and better docs

I started writing some docs for the shim review process a while back, adding them to the shim wiki. Others have helped: thanks to @pnardini in particular here. But I've not managed to document everything I wanted, and I also get the impression that people are not finding those docs.

Could you help with these docs?

Having reviewed a little so far, I welcome any docs improvements. I like them, I will read them thoroughly and comment on them when I discover anything unclear.

Automation

Some of what we're looking for in reviews can be automated, of course. @jclab-joseph in #340 is working on a bot to do some of this for us. Let's hope this helps too. :-)

Anything that could be automated is a great improvement that saves time and prevents stupid mistakes, I will definitely try it.

Feedback (and patches!) welcome

How do my suggestions sound? Will this help? Am I sounding crazy? Have I missed something important? You know what to do!

Nothing is crazy from my point of view. I perceive these suggestions as a necessity for healing this community, but that's just my opinion; I have given almost nothing to this community so far. I hope it gets better.

@THS-on
Collaborator

THS-on commented Sep 22, 2023

I'd like to offer each of these people trusted reviewer status right now. Guys: please talk to me about this, and let me know if you're interested?

@steve-mcintyre thank you for all your work. I'm interested to help with the reviews.

Could you help with these docs?

I'll go through the wiki and see what we can add. In my experience the most confusion is around how SBAT entries should be constructed. Maybe we can link to the wiki directly in the README.md, so that it is easier to find.

Some of what we're looking for in reviews can be automated, of course. @jclab-joseph in #340 is working on a bot to do some of this for us. Let's hope this helps too. :-)

Automation looks really good, especially if we can standardize the checks for patches. Regarding the key protection requirements, it seems that some HSM providers offer key attestation; maybe that can help automate the checks for the certificates inside the shim.

Regarding the number of reviewers, I think the idea proposed by @SherifNagy is a good approach: have up to 3 reviewers for the first review and then 2 for the following reviews. We just need to make sure that we have enough reviewers from different organizations.

@dennis-tseng99
Collaborator

Thanks @steve-mcintyre, @SherifNagy and others. I was just assigned to be a reviewer by @vathpela. Thanks, everyone. I will try to do my best to help with each release.

@evilteq

evilteq commented Sep 25, 2023

Thanks @steve-mcintyre for moving this forward!

I agree with mostly everything that has been said, so I won't repeat it. Here are some extra thoughts:

We need more reviewers

I agree, but I was thinking more than five; I think we would run into the same issue of burnout not so long down the line.

I'm not going to propose formal training here, as I don't think we have enough people to make that work. But we should be able to learn from each other and the docs.

Don't we? I guess it comes down to the definition of formal, hence the bold markings, but we could get something. Call it workshops, meetings, conferences, Twitch streaming or whatever is in fashion these days ;)

Automation

The review is not only about the final binary, or even the binary creation, but definitely many steps in this field can be automated. We can add a checkbox to the list, something like "passes xxx github action".

@SherifNagy
Collaborator

We need more reviewers

I agree, but I was thinking more than five; I think we would run into the same issue of burnout not so long down the line.

I thought those 5 will be "trusted" reviewers along side the existing trusted reviewers crowd

@steve-mcintyre
Collaborator Author

We need more reviewers

I agree, but I was thinking more than five; I think we would run into the same issue of burnout not so long down the line.

I thought those 5 will be "trusted" reviewers along side the existing trusted reviewers crowd

Yes, that's exactly it. These 5 are also just the extra people I'm planning to add now if they're interested. I fully expect we'll add and remove more people in the future!

@steve-mcintyre
Collaborator Author

I've given "write" access here to @aronowski and @THS-on.

@steve-mcintyre
Collaborator Author

Yes, I am interested. Though if there are details to discuss, you still do have my email address and public key, don't you? ;-)

Yup!

Could we wait until we get (for example) three independent reviews of each submission, with at least one from an anointed/trusted reviewer? Is that too many, or too few? Feedback welcome!

In regard to this, I'd wait for more automation, like Joseph's review bot, and discuss what the most important details to look at are in the places where automation is tough; say, using correct generation numbers in light of what happened historically. I'd rather have manual reviews focus on these kinds of details; once we know how much there might be to review, we can decide how many reviews there should be.

Nod. I'm definitely leaning towards multiple reviewers, but automation can make reviews less time-consuming too.

It's been suggested (@pnardini again!) that we should move those docs from the shim wiki into the shim-review repo directly, to make it easier for everybody to find them and also easier to collaborate on updating docs. I think that's a great idea, so I've added #344 to do the initial import of current wiki state.

And it does not have the recent fixes I incorporated just before you created this PR. :-(

Argh. Could you point me at a PR/diff for those so I can merge them please? Then a review of my changes would be great!

Am I sounding crazy?

Yes, and that's a good thing - we, or at least I, need more craziness in this world! If not for the fact I'm representing companies and partaking in official venues, I'd be showing that side of me much more often.

grin

@steve-mcintyre
Collaborator Author

Having 3 reviewers would be ideal and great. I would suggest requiring that maybe for a vendor's 1st or 2nd submission, just in case someone missed something; after that it can be two reviewers, including 1 trusted reviewer? I guess one of the main points is that we don't have enough trusted reviewers, and overloading them with more submissions might backfire. But we can start with 3 reviewers per submission as you mentioned and see how it goes; we can always change / adapt later on

Sounds like a fair suggestion, yes. :-)

@steve-mcintyre
Collaborator Author

@steve-mcintyre thank you for all your work. I'm interested to help with the reviews.

Awesome! :-)

I'll go through the wiki and see what we can add. In my experience the most confusion is around how SBAT entries should be constructed. Maybe we can link to the wiki directly in the README.md, so that it is easier to find.

Let's see what we can do to make SBAT clearer for people, definitely.

Regarding the number of reviewers, I think the idea proposed by @SherifNagy is a good approach: have up to 3 reviewers for the first review and then 2 for the following reviews. We just need to make sure that we have enough reviewers from different organizations.

Nod.

@steve-mcintyre
Collaborator Author

I'm not going to propose formal training here, as I don't think we have enough people to make that work. But we should be able to learn from each other and the docs.

Don't we? I guess it comes down to the definition of formal, hence the bold markings, but we could get something. Call it workshops, meetings, conferences, Twitch streaming or whatever is in fashion these days ;)

grin. If people would find it interesting, I think we could happily organise a video call or something. Hell, I'm planning on going to FOSDEM again next year and a meetup there might work?

Automation

The review is not only about the final binary, or even the binary creation, but definitely many steps in this field can be automated. We can add a checkbox to the list, something like "passes xxx github action".

👍

@SherifNagy
Collaborator

I'm not going to propose formal training here, as I don't think we have enough people to make that work. But we should be able to learn from each other and the docs.

Don't we? I guess it comes down to the definition of formal, hence the bold markings, but we could get something. Call it workshops, meetings, conferences, Twitch streaming or whatever is in fashion these days ;)

grin. If people would find it interesting, I think we could happily organise a video call or something. Hell, I'm planning on going to FOSDEM again next year and a meetup there might work?

Automation

The review is not only about the final binary, or even the binary creation, but definitely many steps in this field can be automated. We can add a checkbox to the list, something like "passes xxx github action".

👍

That would be awesome! I am planning to go to FOSDEM as well "so far:)"

@aronowski
Collaborator

That should do it!
I'm waiting for GPG-encrypted onboarding details (if there are any I should be aware of) before I start applying labels and whatnot. Or they could be discussed via a video conference.

I'm planning on going to FOSDEM again next year and a meetup there might work?

It just might!

@mheese

mheese commented Sep 25, 2023

@steve-mcintyre yes, I'm definitely also interested in becoming a reviewer. I also won't repeat what has been said already, so I'm just trying to add some points.

  • I think I've mentioned it in this thread already, but it definitely would help to give this an official umbrella under a foundation like the LF. That can help with resources and/or organizing training, etc.
  • I still think we should split the reviews into stages (and areas of expertise). Everybody is aware that reviewing a shim is really just one step in verifying the whole boot chain for Secure Boot. While reviewing the shim can hopefully become mostly automatic down the road, there is still the question of the boot loader used (grub or systemd-boot or whatever), as well as reviewing Linux kernel sources/patches/configs (@aronowski and I have started discussing the difficulties around this already). For example, I can totally understand if you guys are reluctant to sign off on a boot loader you don't understand, which is why I based our grub builds on Fedora, to make review easier.
  • maybe an initial conference call / team call would be helpful as well?

@aronowski
Collaborator

Introductory clarifications

In regard to the recent promotions and communication with our community, I feel there are some ways that may make the environment more welcoming, inclusive and easier for newcomers to get into, both as bootchain developers and as reviewers.

I've been thinking and here are some proposals. Let me know what you think about them and once some are agreed to be included in the future, I'll make them separate issues, so their status can be maintained more easily.

They are listed in this order since they depend on each other chronologically.

Code of Conduct improvements

We already have a Code of Conduct in both the shim and shim-review repositories. It focuses on diversity and inclusion, and that's a good thing:

We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.

However, what I'd also suggest adding here in the context of inclusivity are mentions that, among other things:

  • people and organizations who come here and file applications for a review may have different backstories, experiences, etc., and may not have the knowledge to file a flawless application the first time. That's natural and nothing to be ashamed of. We're here to partake in a venue where we all can learn from the public applications, in what I would call a sort of safe space to make mistakes.
  • they may not follow the development of UEFI shim, bootloaders, the Linux kernel, SSL libraries and so on, and may not be aware of the venues that take place outside the public Red Hat Bootloader Team's GitHub repositories, as they may just be rebuilding or patching the bootchain components. For instance, they may not know whether there already is a solid NX support implementation in the 6.0 kernel family, or where to look for clues on this.
  • they may have high-end technology that clones a repository and performs a build in mere seconds, but this is not always the case. The reviewers may work with a limited (speed- or transfer-wise) Internet connection, disk space, processing power, etc. Therefore, I pledge to keep a reasonable repository size rather than forcing a reviewer to download hundreds of megabytes

Improving the document for a review application

In regard to the inclusivity I mentioned above, I'd improve the application document template to make it easier for people who see it for the first time to understand what they are asked for.

For example, let's consider someone who has read about SBAT and implemented it successfully in their binaries, but who, not being used to the vocabulary of this environment, mistakes a global generation for a product-specific generation. I recall seeing applications where the question

If these fixes have been applied, have you set the global SBAT generation on your GRUB binary to 3?

gave people a hard time. Maybe the "have you set" part was so suggestive that people thought it referred to their downstream work rather than to making sure the upstream entry is grub,3. What I'm suggesting is to reword these questions for clarity, add some hints that may make people's lives easier, and refer to the documentation we have and to where to look for help if something is not clear. For instance, I'd rewrite that question to:

If these fixes have been applied, is the global SBAT generation on your GRUB binary set to 3? In other words: does your GRUB binary have an entry that begins with:
grub,3,Free Software Foundation,grub,[...]?
Hint: for more information on generation numbers, see the document SBAT.example.md.
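Such a check could even be done mechanically. Here's a minimal sketch in Python; the grub.acme vendor entry below is made up for illustration, and real data would come from the binary's .sbat section:

```python
def global_generation(sbat_text, component):
    """Return the generation number of the upstream (global) entry
    for `component` in an SBAT section, or None if absent."""
    for line in sbat_text.splitlines():
        fields = line.split(",")
        # The global entry uses the bare component name, e.g. "grub",
        # as opposed to a product-specific one like "grub.acme".
        if len(fields) >= 2 and fields[0] == component:
            return int(fields[1])
    return None

section = (
    "sbat,1,SBAT Version,sbat,1,https://github.com/rhboot/shim/blob/main/SBAT.md\n"
    "grub,3,Free Software Foundation,grub,2.06,https://www.gnu.org/software/grub/\n"
    "grub.acme,1,Acme Corp,grub,2.06-acme1,https://example.com\n"  # hypothetical vendor entry
)
assert global_generation(section, "grub") == 3
```

That is exactly the distinction the reworded question is trying to get across: the bare "grub" entry is the global generation, while "grub.acme" would be a product-specific one.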

Reference/dummy application(s)

After introducing the issue template improvements I suggested, what if there was some sort of reference application (or several) that presents a dummy company applying to have their shim signed and provides detailed answers to the questions the committee asks?

For instance, that dummy company wants to have the 15.7 release of UEFI shim signed, but works with older binutils that cause the common bug to be present. Therefore, they explain in their review why they apply the patch that fixes it.
Furthermore, they may either change the build process of shim 15.7 so that the NX bit is applied automatically, or apply a patch that sets the flag to enabled without the -n option. There may be at least two of these dummy applications presenting different approaches to the same dilemma.
Or, just as I mentioned mistaking global generations for product-specific generations, an answer may shed some light on such a matter and on where the dummy company was looking for help.
Or an application that showcases the incrementing of generation numbers and how the dummy company handled it.
I can see there's a lot of room for improvement! Let your creativity flow and suggest more of these!

In this case, newcomers could also base their answers on those dummy applications. If there's a common base, like a distro family, most of the tougher questions would cover the distro-specific things.
For instance, in the context of the bootchain I only use Debian casually, as an awesome, solid platform that runs software, rather than rebuilding it as I do with Fedora, and I don't follow Debian development closely; so historically I've been writing that I'm unsure whether the NX support in the GRUB2 bootloader in Debian is like in Fedora.

Reorganization of documents

After the suggestions above have been introduced, I'd focus on reorganizing the documents so that, for instance, the README.md in the shim-review repository does not overwhelm anyone with how huge it is.

Instead, move the questions to another file, like review_application_template.md, and let README.md mention the most important references (especially for newcomers) to look at, like (chronologically):

  1. reading the application document template casually, just to get a grasp of what to expect soon
  2. reading the issue template to see what is expected (e.g. checking all the boxes in the sense of "I've dealt with this, so I'm checking the checkbox." rather than "This is not applicable to me, so I'm leaving the checkbox unchecked.")
  3. reading the applications of the dummy companies to see how they dealt with the questions
  4. studying (rather than reading casually) all the things they've learned from these examples and applying them to their specific case

Now, I understand that this delegation of the questions to another file than README.md would make it not render on GitHub by default, but I don't think it's that much of a problem for reviewers, since:

  • Markdown is well-readable as plain text
  • the applicants' repositories will still be cloned to review the binary reproducibility, so a reviewer can render the questions on their own

Conclusions

So, what do you think? Are the ideas worthwhile?
If so, make sure to leave a public display of affection down below. ;-)

@aronowski aronowski mentioned this issue Oct 4, 2023
@THS-on
Collaborator

THS-on commented Oct 5, 2023

When I was going through the reviews I noticed a couple of things, which I summarized below.
I want to thank all of you who reviewed other shims, because that made reviewing them a lot easier!

Signing and Embedded Certificate

There is no clear reference on what the certificates embedded in the shim should look like.

  • What is the advantage of somebody embedding an EV certificate?
    • Is it easier for us to review?
    • Is it a better guarantee that the key is actually in an HSM?
  • If a CA is embedded
    • Which key usage properties should be set? A small survey of the big distributions shows all the combinations
      • Ubuntu Key Usage: Digital Signature, Certificate Sign, CRL Sign
      • Debian Extended Key Usage: Code Signing and Key Usage: Digital Signature, Certificate Sign, CRL Sign
      • RHEL No specific key attributes
    • What should the leaf certificates look like?

Patches to Shim and GRUB2

We have now often had patches for shim that are required for submission but have not been released yet. This is fine, but it takes more time to review. It would speed up the review if the upstream commit were directly referenced in the patches or the submission.

Reviewing GRUB2 patches can be more time-consuming depending on where the patches were taken from. If the GRUB2 build is based on, for example, the patches from Fedora, Debian or Ubuntu, it would be nice to have some (semi-)automated way to check if and how the patch sets diverge.
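Even a crude first pass that just compares patch file names between the base distribution and the submission would highlight divergence quickly. A sketch with entirely hypothetical patch lists; content-level differences would of course still need manual review:

```python
def patch_divergence(base, submission):
    """Compare two ordered patch lists (file names) and report
    what was dropped from and what was added on top of the base set."""
    base_set, sub_set = set(base), set(submission)
    return {
        "dropped": [p for p in base if p not in sub_set],
        "added": [p for p in submission if p not in base_set],
    }

# Hypothetical patch lists for a base distro and a vendor submission
fedora = ["0001-sbat.patch", "0002-nx.patch", "0003-fix-menu.patch"]
vendor = ["0001-sbat.patch", "0002-nx.patch", "0004-vendor-logo.patch"]
d = patch_divergence(fedora, vendor)
assert d == {"dropped": ["0003-fix-menu.patch"], "added": ["0004-vendor-logo.patch"]}
```

A reviewer could then concentrate on the "added" list and on content diffs of patches that share a name but differ in hash.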

GRUB2 Modules

We ask for the list of included GRUB2 modules, but from a submitter's perspective it is not clear which modules are allowed and which might cause the submission to be rejected. I think we can come up with a list of pre-approved modules.
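If such a list existed, checking a submission against it would be trivial. A sketch, where the allowlist contents and the fancy_net_boot module are purely hypothetical and would need community agreement:

```python
# Hypothetical pre-approved GRUB2 module list; the real list would
# need to be agreed on by the community.
PRE_APPROVED = {"ext2", "part_gpt", "search", "linux", "normal", "configfile"}

def review_modules(submitted):
    """Return the submitted GRUB2 modules that are not pre-approved
    and therefore need explicit reviewer discussion."""
    return sorted(set(submitted) - PRE_APPROVED)

# Only the unknown module is flagged for discussion
assert review_modules(["linux", "normal", "fancy_net_boot"]) == ["fancy_net_boot"]
```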

SBAT

Regarding SBAT I noticed the following:

  • It is not very clear when downstream or derived distributions should carry the SBAT entry from the main distributions
    • shim: not really necessary, because we no longer carry a long list of unreleased patches (in most cases only the NX one and the buggy-binutils one)
    • GRUB2: definitely required, because for example Fedora and Debian carry a big patch set
    • fwupd: unsure, because currently it seems that distributions do not apply a lot of custom patches on top
  • There is no central documentation on what the current SBAT levels are. There is https://github.com/rhboot/shim/blob/main/SbatLevel_Variable.txt, but that, for example, does not include the revocation done by Debian for GRUB2
  • We don't have a way to look up which vendor suffixes are currently in use

I would propose that we have a tracking issue or wiki entry for this.

Ephemeral keys for kernel modules and the alternatives

The simplest way to prohibit a kernel from loading older kernel modules is to use an ephemeral key during build time. There are reasons why this might not be implemented:

  • Building and signing is done on different machines
  • Need for out-of-tree modules or to build modules after the fact and load them
  • Reproducible builds

The question is how we should then proceed in these cases. Possible solutions to mitigate the impact of a vulnerability in a kernel module are:

  • When CONFIG_MODVERSIONS is used:
    • Only update the ABI name or keys when there was an issue. Doable, but hard to keep track of.
    • Use a different signing key for every kernel version, stored in an HSM. Comes very close to the ephemeral-key solution, but requires a bit of infrastructure to implement.
  • Without CONFIG_MODVERSIONS:
    • Increase EXTRAVERSION or CONFIG_LOCALVERSION for every kernel version.
    • Use a different signing key for every kernel version, stored in an HSM.

Do we accept all those strategies?
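
As a sketch of the "different signing key for every kernel version" option (all names here are illustrative; in production the private key would be generated inside, or imported into, the HSM rather than written to disk):

```shell
#!/bin/sh
# Illustrative only: one module-signing key pair per kernel release.
KVER="6.1.0-mydistro-1"   # hypothetical kernel release string
openssl req -new -x509 -newkey rsa:2048 -nodes \
    -keyout "modsign-$KVER.key" -out "modsign-$KVER.crt" \
    -days 36500 -subj "/CN=Module signing key for $KVER/"
# The certificate gets built into that kernel's trusted keyring, and each
# module is signed with the kernel tree's helper, roughly:
#   linux/scripts/sign-file sha256 "modsign-$KVER.key" "modsign-$KVER.crt" module.ko
```

A later kernel built with the next release's certificate would then refuse modules signed with this one.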

Companies being merged or bought

We have now had it multiple times that companies merged or were bought while submitting a shim for review (Neverware/Google, Micro Focus/OpenText, ITRenew/IronMountain). How do we want to handle this?

@SherifNagy
Copy link
Collaborator

I will try to think out loud here with you.

Signing and Embedded Certificate

There is no clear reference on what the certificates embedded in the shim should look like.

  • What is the advantage of somebody embedding an EV certificate?

I don't think there is any advantage, but I recall seeing some reviews where the comments were that the CA is valid for too many years, and that wasn't great.

  • Is it easier for us to review?

Maybe in terms of validating the organisation? But in any case, the shim has to be signed with an EV cert before being submitted to MSFT.

  • Is it a better guarantee that the key is actually in an HSM?

Unless we start asking for signing logs and review those logs, I think it is a matter of trust

  • If a CA is embedded

    • Which key usage properties should be set? A small survey of the big distributions shows all the combinations

      • Ubuntu Key Usage: Digital Signature, Certificate Sign, CRL Sign
      • Debian Extended Key Usage: Code Signing and Key Usage: Digital Signature, Certificate Sign, CRL Sign
      • RHEL No specific key attributes
    • What should the leaf certificates look like?

That did trip me up actually, but we (the Rocky team) came up with attributes at least for the certs.

GRUB2 Modules

We ask for the list of included GRUB2 modules, but from a submitters perspective it is not clear, which modules are allowed and which might cause the submission to be rejected. I think we can come up with a list of pre-approved modules.

I think all modules are allowed as long as they are not vulnerable (but I might be wrong here). I know some distros ship NTFS support, for example, and some don't; both got accepted reviews.

SBAT

Regarding SBAT I noticed the following:

  • It is not very clear when downstream or derivative distributions should carry the SBAT entries of the distributions they are based on

    • shim: not really necessary, because we no longer carry a long list of unreleased patches (in most cases only the NX and buggy-binutils ones)
    • GRUB2: definitely required, because Fedora and Debian, for example, carry a big patch set
    • fwupd: unsure, because distributions currently do not seem to apply many custom patches on top
  • There is no central documentation of the current SBAT levels. There is https://github.com/rhboot/shim/blob/main/SbatLevel_Variable.txt, but it does not include, for example, the revocation done by Debian for GRUB2

  • We don't have a way to look up which vendor suffixes are currently in use
    I would propose that we have a tracking issue or wiki entry for this.

Agree with you here. It did happen that I increased my shim SBAT level when I didn't need to. However, I thought there was some talk about having a centralized database where we can track SBAT levels for signed shims that get reviewed?

Companies being merged or bought

We had it multiple times now that companies merged or bought while submitting a shim for review (Neverware/Google, Micro Focus/OpenText, ITRenew/IronMountain). How do we want to handle this?

I think in this case they need to be reviewed as a new entity, since the infrastructure might be different, and processes and even people might not be the same after the merger/acquisition.

@steve-mcintyre
Copy link
Collaborator Author

Sorry for going quiet, folks - struggling with some random illness here for the last few days... :-(

@steve-mcintyre
Copy link
Collaborator Author

steve-mcintyre commented Oct 5, 2023

@steve-mcintyre yes, I'm definitely also interested in becoming a reviewer. I also won't repeat what has been said already, so I'm just trying to add some points.

* I think I've mentioned it in this thread already, but I think it definitely would help to give this an official umbrella under a foundation like the LF. That can help with resources and/or organizing training, etc.

Maybe? I've not had great experiences with the LF so far, but maybe that's just me.

* I still think we should split the reviews into stages (and areas of expertise). Everybody is aware that reviewing a shim is really just one step in verifying the whole boot chain for secure boot. While the shim review itself can hopefully be made mostly automatic down the road, there is still the question of the boot loader used (GRUB, systemd-boot or whatever), as well as reviewing Linux kernel sources/patches/configs (@aronowski and I have started discussing the difficulties around this already). For example, I can totally understand if you guys are reluctant to sign off on a boot loader you don't understand - which is why I based our GRUB builds on Fedora, to make it easier

ACK. I think that more sharing of knowledge and experience would help, but even then it can be daunting.

* maybe an initial conference call / team call would be helpful as well?

OK, let's try and organise that then. Now we get the fun of trying to organise a meeting for people from ~all time zones, heh! Mail me at 93sam@debian.org - see below...

Initial agenda is at https://docs.google.com/document/d/1lL3Qh-YPxaEKOOVi9YcMrqJ22LPJpZ5UKVZDHMiP75c/edit - please ask for edit access if you want it.

@SherifNagy
Copy link
Collaborator

OK, let's try and organise that then. Now we get the fun of trying to organise a meeting for people from ~all time zones, heh!

I've created a poll at https://framadate.org/piW1VUFjDMZvjrDu, covering lots of options for hours from Sun 15 to Sat 28 October. Please (all) fill in the dates and times you can make, and in a few days I'll announce the best time. Yes, I appreciate that we may not get a good time for everybody.

Link is giving me 500 error btw:)

@steve-mcintyre
Copy link
Collaborator Author

steve-mcintyre commented Oct 5, 2023

Link is giving me 500 error btw:)

I think I broke framadate with a very large poll :-(

Trying something else instead...

@SherifNagy
Copy link
Collaborator

Link is giving me 500 error btw:)

I think I broke framadate with a very large poll :-(

Trying something else instead...

oooops :D , maybe https://www.when2meet.com/ ?

@steve-mcintyre
Copy link
Collaborator Author

Instead, all of you please mail me at 93sam@debian.org with your available times and dates in the timeframe of Sun 15 to Sat 28th October. To accommodate everybody, we may need to pick an uncomfortable time and date - be warned!

Deadline for that is next Wednesday (11 October 23:59:59 UTC). I'll post the result here ASAP afterwards.

@steve-mcintyre
Copy link
Collaborator Author

When I was going through the reviews I noticed a couple of things, which I summarized below. I want to thank all of you who reviewed other shims, because that made reviewing them a lot easier!

Signing and Embedded Certificate

There is no clear reference on what the certificates embedded in the shim should look like.

* What is the advantage of somebody embedding an EV certificate?
  
  * Is it easier for us to review?
  * Is it a better guarantee that the key is actually in an HSM?

Some companies have internal rules about how they can issue certs, and developers are not allowed to deviate from them. That's the main reason I've seen.

* If a CA is embedded
  
  * Which key usage properties should be set? A small survey of the big distributions shows all the combinations
    
    * _Ubuntu_ Key Usage: Digital Signature, Certificate Sign, CRL Sign
    * _Debian_ Extended Key Usage:  Code Signing and Key Usage: Digital Signature, Certificate Sign, CRL Sign
    * _RHEL_ No specific key attributes

Argh. We should really be more consistent here; all of these key combinations work for now, but may not if the firmware folks decide to arbitrarily tighten up in future.

  * What should the leaf certificates look like?

Good question! In Debian we've set up separate key/cert pairs for each of the binaries that we sign further, and I think that can be a valuable thing to do in terms of revocation. Each cert describes its usage and the year it was created; these keys should also be much more limited in lifetime than the CA, for hopefully obvious reasons. They should ideally be controlled via an HSM.
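
As a concrete sketch of such a per-binary leaf with explicit key-usage attributes (names and validity period are made up for illustration; a real leaf would be issued by the vendor CA, not self-signed, and its key kept in an HSM):

```shell
#!/bin/sh
# Illustrative only: a stand-in "GRUB2 signing, year 2023" leaf certificate
# with a Code Signing EKU and digitalSignature key usage.
openssl req -new -x509 -newkey rsa:2048 -nodes \
    -keyout grub-signing-2023.key -out grub-signing-2023.crt \
    -days 1825 -subj "/CN=Example GRUB2 signing key 2023/" \
    -addext "keyUsage=digitalSignature" \
    -addext "extendedKeyUsage=codeSigning"
# Check which usage bits actually ended up in the certificate:
openssl x509 -noout -ext keyUsage,extendedKeyUsage -in grub-signing-2023.crt
```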

Patches to Shim and GRUB2

We now often see patches for shim that are required for submission but have not yet been released upstream. This is fine, but it takes more time to review. Reviews would be faster if the upstream commit were directly referenced in the patches or the submission.

👍

Reviewing GRUB2 patches can be more time-consuming depending on where the patches were taken from. If the GRUB2 build is based on, for example, the patches from Fedora, Debian or Ubuntu, it would be nice to have some (semi-)automated way to check if and how the patch sets diverge.

👍
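
One possible starting point for such a check: put both patch queues on branches over the same upstream tag and let `git range-diff` pair the patches up. The repo contents and branch names below are made up purely to demonstrate the idea:

```shell
#!/bin/sh
set -e
# Tiny self-contained demo of comparing two downstream patch queues.
rm -rf /tmp/patchcmp && git init -q /tmp/patchcmp && cd /tmp/patchcmp
git config user.email demo@example.org && git config user.name demo
echo base > grub.c && git add grub.c && git commit -qm "upstream" && git tag upstream
git checkout -qb fedora && echo fedora-fix >> grub.c && git commit -qam "fix: bounds check"
git checkout -qb vendor upstream && echo vendor-fix >> grub.c && git commit -qam "fix: bounds check (reworked)"
# One line per patch pair, flagging matching (=) vs. differing (!) patches:
git range-diff upstream..fedora upstream..vendor
```

For patch files outside git, `interdiff` from patchutils could serve a similar purpose.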

GRUB2 Modules

We ask for the list of included GRUB2 modules, but from a submitter's perspective it is not clear which modules are allowed and which might cause the submission to be rejected. I think we can come up with a list of pre-approved modules.

There's been some discussion about this, but we don't have a definitive list. I think that we should be looking for a minimal set of modules, particularly filesystems. Let's talk about this.

SBAT

Regarding SBAT I noticed the following:

* It is not very clear when downstream or derivative distributions should carry the SBAT entries of the distributions they are based on
  
  * shim: not really necessary, because we no longer carry a long list of unreleased patches (in most cases only the NX and buggy-binutils ones)
  * GRUB2: definitely required, because Fedora and Debian, for example, carry a big patch set
  * fwupd: unsure, because distributions currently do not seem to apply many custom patches on top

👍

* There is no central documentation of the current SBAT levels. There is https://github.com/rhboot/shim/blob/main/SbatLevel_Variable.txt, but it does not include, for example, the revocation done by Debian for GRUB2

* We don't have a way to look up which vendor suffixes are currently in use

Good point, yes. I've been suggesting improving the docs around this and making them much easier to find. I'm glad it's not just me! :-)

I would propose that we have a tracking issue or wiki entry for this.

Nod.
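
On the lookup side, at least the revocation level shim has actually applied on a given machine can be read back from the UEFI variable it maintains. The variable name and vendor GUID below match current shim releases (and the first four bytes of an efivarfs file are attribute flags, hence the skip):

```shell
#!/bin/sh
# Dump the SBAT revocation level applied by shim on the running system.
VAR=/sys/firmware/efi/efivars/SbatLevelRT-605dab50-e046-4300-abb6-3dd810dd8b23
if [ -r "$VAR" ]; then
    tail -c +5 "$VAR"     # skip the 4-byte efivarfs attribute header
else
    echo "SbatLevelRT not available (not booted via shim?)" >&2
fi
```

That still doesn't document what the levels mean or which suffixes exist, which is why a tracking issue or wiki entry would help.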

Ephemeral keys for kernel modules and the alternatives

The simplest way to prevent a kernel from loading older kernel modules is to use an ephemeral key at build time. There are reasons why this might not be implemented:

* Building and signing is done on different machines

* Need for out-of-tree modules or to build modules after the fact and load them

* Reproducible builds

The question is how we should then proceed in these cases. Possible solutions to mitigate the impact of a vulnerability in a kernel module are:

* When `CONFIG_MODVERSIONS` is used:
  
  * Only update the ABI name or keys when there was an issue. Doable, but hard to keep track of.
  * Use a different signing key for every kernel version, stored in an HSM. Comes very close to the ephemeral-key solution, but requires a bit of infrastructure to implement.

* Without `CONFIG_MODVERSIONS`:
  
  * Increase `EXTRAVERSION` or `CONFIG_LOCALVERSION` for every kernel version.
  * Use a different signing key for every kernel version, stored in an HSM.

Do we accept all those strategies?

Let's discuss.

Companies being merged or bought

We have now had it multiple times that companies merged or were bought while submitting a shim for review (Neverware/Google, Micro Focus/OpenText, ITRenew/IronMountain). How do we want to handle this?

How much does it affect us?

@THS-on
Copy link
Collaborator

THS-on commented Oct 11, 2023

Companies being merged or bought

We have now had it multiple times that companies merged or were bought while submitting a shim for review (Neverware/Google, Micro Focus/OpenText, ITRenew/IronMountain). How do we want to handle this?

How much does it affect us?

Mainly affects contact verification, and if they use EV certificates, they might still want to submit one signed with the EV certificate of the old company.

@steve-mcintyre
Copy link
Collaborator Author

steve-mcintyre commented Oct 12, 2023

OK, I think we have a winning time and date for the meeting. For the 7 of us providing data, all of us can make

Monday 16th October, 1500 UTC == 1600 BST == 0800 PDT

There were a couple of other options for the middle weekend or later in the second week, but I've chosen the first available slot. Let's not delay any more than we have to! I'll mail everybody with this as well.

Repeating: the agenda is at https://docs.google.com/document/d/1lL3Qh-YPxaEKOOVi9YcMrqJ22LPJpZ5UKVZDHMiP75c/edit . Let me know if you'd like edit access.

@steve-mcintyre
Copy link
Collaborator Author

OK, so I think we had a good and productive conversation but we ran out of time. :-/

Checking the data from people, it looks like the same time next Monday is feasible for a followup. Let's do that!

Monday 23rd October, 1500 UTC == 1600 BST == 0800 PDT

@SherifNagy
Copy link
Collaborator

My apologies for missing the meeting; I added the wrong timezone to my calendar, that's my bad! I am okay with having a short meeting next week as @steve-mcintyre was suggesting.

@julian-klode
Copy link
Collaborator

julian-klode commented Dec 11, 2023

I see a worrying tendency from the new reviewers to try to be helpful and streamline the submission process. This is counterproductive. The questions must be vague enough that it is not clear what the right answer is; otherwise we cannot judge whether they really know what they're doing.

The checklists we have are very basic and only a part of establishing that this is a trust-able vendor. You also need to rely on your instincts and feelings, and not just accept something that looks sketchy because it ticks all the boxes in the checklist.

You need to assume that every new vendor is just here to p0wn systems and you are the last line of defense.

I wish we could take the submission requests private so that people can't copy each other's answers, which is a fairly significant issue with the current process: people then "look" smart but actually have no clue what they're doing. Remember, cheating is commonplace.

@aronowski
Copy link
Collaborator

Some docs/ directory proposals, hints and quality-of-life improvements, a.k.a. what I would happily write myself to make the process clearer and easier in the future.

  1. If a contact verification process has been initiated and the applicants have posted their answers with the random words, but the reviewer who initiated it has been inactive for a week without posting a response, another reviewer shall redo the process.

  2. What policy on comment editing do we set? Do we want to discourage editing comments, since someone from the committee might be using their email client (mail user agent) to track what's happening in an application? Or do we agree that the committee is obligated to track applications through a web browser, so that the latest version, possibly with edited comments, is always what counts?

  3. Good practices for reviewers, no matter if they are from the committee or not. Maybe things like:

    1. How to set up a secure laboratory for rebuilding the shim binaries, as well as for the actual reviewing, since we're dealing with untrusted third-party code, which someone could abuse to exploit a Git vulnerability or perform privilege escalation via Docker.
    2. Setting up docker-buildx may not be intuitive for everyone; I myself came across a situation that required me to install this plugin.
    3. How to set up the random-words script: cloning it from git.einval.com, installing the words package, running the script (preferably on an air-gapped machine) and emailing the random words after PGP verification.
    4. How to securely verify PGP keys, the easiest method being to fetch them over different connections/ISPs and confirm that they match.
  4. I'd like to avoid the situation where an applicant has to, sadly, attend daily standups, plannings and retrospectives, among others, only to explain yet again with utter sadness that "we don't have this application accepted, since the committee has not yet reviewed it". Furthermore, I'd like to present this initiative as something like a student club at a university, where people from different cultures, with different experiences and different levels of knowledge and expertise, can come and safely learn new things, even if that means making mistakes now - it's better to make them now and learn from them than to have an application accepted and the shim binary signed, only to discover a security vulnerability. While I would say that the reviewing process should be strict and require asking detailed questions on other things as well (HSM backup policies, perhaps), it definitely should not feel like a municipal office or notary, especially since the people who do the reviews are in most cases volunteers working on this in their free time, rather than employees paid to review applications during working hours. Sometimes this volunteer work means sacrificing sleep or neglecting friends and family just so the reviewing process gets unclogged.

  5. The lack of public activity does not mean that nothing is going on behind the scenes. Even if a reviewer looks like they are slacking off, they may be preparing a decent response, which can take time and energy. Commenting on an application may require thorough research and composing an easy-to-follow response, rather than pasting facts without a good structure/flow. It may also require some interest in the product for which the shim is being prepared and in its general surroundings (build frameworks, what the company does, etc.), and this too requires time and focus, which may be hard to come by if the reviewer gets little sleep due to a busy life.
    In other words: if you follow a reviewer on social media and they post goofy things there while the reviews are stalled, that's not an indicator that they are slacking off - during long calls, for example, one can write a loose comment publicly, but can't focus on reviews that require accuracy and correctness.

  6. Simplify the SBAT examples, so one can quickly grasp the idea, rather than being forced to read the whole history behind its implementation. Quoting Greg KH:

    Pointing to an external document that is thousands of lines long,
    talking about bootloaders, is NOT a good way to get people to want to
    accept a kernel patch :)

    And I'd say it's also NOT a good way to present this initiative as user-friendly. Let's keep the SBAT history and design rationale in a separate file, so people can later read why things are the way they are.

  7. The random words sent to applicants should be checked after generation so that they do not contain profanity, politically incorrect words, or combinations that, one next to the other, may carry negative associations. In general, this small initiative also needs to stick to the Code of Conduct.

  8. Hints, suggestions and cheat-sheet on commands one can use to quickly check certain things, e.g.:

    • certificate: $ openssl x509 -noout -text -in *.?er | less

    • .sbatlevel correctness: $ objdump -s -j .sbatlevel shimx64.efi

    • .sbat with pretty formatting: $ objcopy --only-section .sbat -O binary shimx64.efi /dev/stdout

    • in case of unsupported binaries one can use PE-bear, dump the sections manually and perform manual verification

    • when exactly in the history of Linux development something got introduced upstream, e.g. when we ask if certain commits are applied or have been backported:

      $ git tag --contains 1957a85b0032a81e6482ca4aab883643b8dae06e | sort --version-sort | head -n 1
      v5.4
      $ git tag --contains 75b0cea7bf307f362057cc778efe89af4c615354 | sort --version-sort | head -n 1
      v5.8
      $ git tag --contains eadb2f47a3ced5c64b23b90fd2a3463f63726066 | sort --version-sort | head -n 1
      v5.19
      
    • maybe other clarifications, like: Certain sections in the shim binary may be presented as "/4" or similar, because PE/COFF stores section names longer than 8 bytes in the string table, and the header then carries only a slash-prefixed offset into that table.
