Caddy 2 port #57

Open
BTBurke opened this issue Oct 30, 2019 · 36 comments

@BTBurke
Owner

BTBurke commented Oct 30, 2019

@mholt has announced that Caddy 2 may hit stable by early next year. Based on a quick glance at the new docs, there are lots of changes to how plugins (soon to be called modules) work, and it will likely require a complete rewrite to support JWT auth in Caddy 2. I plan to post a deprecation notice on the readme so people know that no further development will happen on this plugin unless it's to address a security vulnerability.

I haven't decided whether I'll port this to the new module system as I rarely use JWT anymore. I think Paseto has a better design and fewer foot guns if you are starting from scratch on a new application.

This issue is meant as a discussion area for a potential port to Caddy v2. For anyone considering an auth module for Caddy, I'd offer the following ideas based on several years of experience hearing about various use cases.

Eliminate all Caddyfile-style configuration directives that express auth logic

One of the major drawbacks of v1-style plugins was the use of the Caddyfile for expressing auth logic. While the system of allow user bob worked, it's awkward to use a configuration language to express logic. This is a recurring problem with projects that use config files to express branching logic that depends on external state (e.g., Ansible, Kubernetes Helm templates, etc.) where if-then-else logic is bolted on to a declarative config file format. This can be solved by using Starlark as suggested below, but there may be other ways to move this logic outside Caddy configuration and into an environment that is more suitable to scripting. One option is to delegate auth decisions to a separate microservice as in caddy-extauth, so that authorization can be controlled by the user in any language, but easily integrated as a middleware-type plugin.
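For illustration, that delegation pattern amounts to ordinary HTTP middleware that asks an external service for a verdict before letting the request through. The sketch below is only a rough shape of the idea: authServiceURL is a hypothetical endpoint, and the choice to forward the Authorization and Cookie headers is an assumption, not anything specified by caddy-extauth.

package main

import (
	"fmt"
	"net/http"
)

// Hypothetical auth sidecar that answers 2xx for authorized requests.
const authServiceURL = "http://localhost:9091/verify"

// requireAuth forwards the caller's credentials to the auth service and only
// invokes the next handler if the service approves the request.
func requireAuth(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		req, err := http.NewRequestWithContext(r.Context(), http.MethodGet, authServiceURL, nil)
		if err != nil {
			http.Error(w, "auth check failed", http.StatusInternalServerError)
			return
		}
		req.Header.Set("Authorization", r.Header.Get("Authorization"))
		req.Header.Set("Cookie", r.Header.Get("Cookie"))

		resp, err := http.DefaultClient.Do(req)
		if err != nil {
			http.Error(w, "auth service unreachable", http.StatusBadGateway)
			return
		}
		resp.Body.Close()

		if resp.StatusCode < 200 || resp.StatusCode >= 300 {
			http.Error(w, "unauthorized", http.StatusUnauthorized)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	protected := requireAuth(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "hello, authorized user")
	}))
	http.ListenAndServe(":8080", protected)
}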

The new module should take advantage of the integrated Starlark scripting capability to provide a platform for users to express advanced auth flows.

Starlark has a python-like syntax that would allow someone to express all kinds of auth flows and special cases that are awkwardly handled as edge cases in this plugin. Things like extracting tokens from oddly named cookies, stripping claim prefixes, and returning non-standard HTTP responses would be relatively easy in Starlark and keep these edge cases out of the module code.

token = token_from_cookie("my-auth-token")
secret = secret_from_env("JWT_SECRET")

if claim(token, "user") == "bob" and valid(token, HS256, secret):
     return ok()
else:
     return redirect(301, "https://mysite.com/login")

The Starlark runtime can be injected with custom functions (implemented in Go), such as valid(token, algorithm, secret) and claim(token, field), that extract claims, validate tokens, return various HTTP responses, set headers, and allow fine-grained auth rules for different paths. That way nearly any use case stays under the user's control while the module code remains free of special cases.
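To make that concrete, here is a minimal sketch (using go.starlark.net) of how a Go host can inject such a function into a script. The claim() builtin is a stub and the whole thing is illustrative; none of it comes from an actual Caddy plugin.

package main

import (
	"fmt"

	"go.starlark.net/starlark"
)

// claimFromToken is a stand-in: a real module would parse and verify the JWT
// before reading any claim from it.
func claimFromToken(token, field string) string {
	_ = token
	_ = field
	return "bob"
}

func main() {
	// Predeclare a Go-implemented claim(token, field) function for the script.
	predeclared := starlark.StringDict{
		"claim": starlark.NewBuiltin("claim", func(t *starlark.Thread, b *starlark.Builtin,
			args starlark.Tuple, kwargs []starlark.Tuple) (starlark.Value, error) {
			var token, field string
			if err := starlark.UnpackArgs("claim", args, kwargs, "token", &token, "field", &field); err != nil {
				return nil, err
			}
			return starlark.String(claimFromToken(token, field)), nil
		}),
	}

	script := `allowed = claim("header.payload.sig", "user") == "bob"`
	globals, err := starlark.ExecFile(&starlark.Thread{Name: "auth"}, "auth.star", script, predeclared)
	if err != nil {
		panic(err)
	}
	fmt.Println("allowed:", globals["allowed"])
}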

Address drawbacks of delegating auth decisions to user scripts

One downside of the ideas above is that it would be a lot easier for a user to shoot themselves in the foot unless they are familiar with the design errors in JWT. There are several known vulnerabilities that arise from the JWT specification, like attacks based on monkeying with the alg header field in order to control how JWT validation is applied.

Care should be taken in what functions are supplied to the user so that the design of the API prevents them from insecurely validating tokens.

@mholt

mholt commented Oct 30, 2019

Great, I'm glad you brought this up! I think you raise some really good points.

Ideally, the logic will be abstracted away from users enough that they can't make security logic errors. That's kind of the Caddy way.

One possibility is to sacrifice some flexibility for the most common, simple configs, which can be expressed in a declarative syntax, and then allow Starlark scripting to handle more complex, imperative logic. But maybe the Starlark APIs can be designed in a way that still provides the flexibility users need, without the possibility of making security mistakes.

Also, it would be preferable if it was implemented as native Go code, especially when the need is common enough. Go code will always be faster than an embedded scripting language, and easier to maintain. (Although Starlark is quite fast.)

Caddy 2 has an HTTP authentication module which, I hope, can help provide authentication for HTTP requests by any kind of provider. Right now, the only provider is "basicauth" which uses HTTP Basic Authentication. But I hope that it will one day support plugins that provide JWT, OAuth2, and other methods.

Auth providers implement this interface:

// Authenticator is a type which can authenticate a request.
// If a request was not authenticated, it returns false. An
// error is only returned if authenticating the request fails
// for a technical reason (not for bad/missing credentials).
type Authenticator interface {
	Authenticate(http.ResponseWriter, *http.Request) (User, bool, error)
}

We're still in beta, though, so we can and should change this interface if this is not sufficient.

My hope for this authentication module is that you can provide one set of user accounts or credentials or whatever, and just plug in and configure various ways to authenticate them. I don't know if that's possible for all auth methods, but it sure would be nice, right? (i.e. one list of accounts that you can authenticate via basic auth, OAuth2, JWT, etc.)

Let me know what you think, or if this is helpful at all. Also if there are any changes to be made, now is the time!

@BTBurke
Owner Author

BTBurke commented Oct 30, 2019

Hey Matt, thanks for that info. I think it's definitely possible to implement that interface across a wide variety of auth methods. One change I'd like to see is to expand the definition of the User struct to allow passing additional metadata about the user that could then be acted on by subsequent middleware. This would be great for JWT, Paseto, and other formats that pass arbitrary claim information as part of the token.

For example, a metadata field as a json.RawMessage would help a lot in being able to chain an auth middleware with a subsequent Starlark or other middleware that would act on claim information passed along in the context.

type User struct {
    ID       string
    Metadata json.RawMessage
}
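For illustration, a provider's Authenticate method could then fill that field with the token's verified claims. A minimal sketch, assuming the proposed Metadata field exists (the claims map is a hard-coded stand-in for real token parsing):

package jwtauth

import (
	"encoding/json"
	"net/http"
)

// User mirrors the proposed struct above; this is local to the sketch rather
// than the actual caddyauth.User type.
type User struct {
	ID       string
	Metadata json.RawMessage
}

type Auth struct{} // keys, token sources, etc. elided

// Authenticate is a sketch: real code would extract and verify the token
// before trusting any claims.
func (a Auth) Authenticate(w http.ResponseWriter, r *http.Request) (User, bool, error) {
	claims := map[string]interface{}{ // stand-in for verified JWT claims
		"sub":  "bob",
		"role": "admin",
	}
	meta, err := json.Marshal(claims)
	if err != nil {
		return User{}, false, err
	}
	return User{ID: claims["sub"].(string), Metadata: meta}, true, nil
}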

@jimjimovich

First of all, thanks so much for the work you did on this plugin. It has been incredibly helpful to me over the last year!

I really hope that someone will be able to implement JWT for Caddy v2. It's a very common technology and your plugin made it incredibly easy to use.

@mholt

mholt commented Oct 31, 2019

One change I'd like to see is to expand the definition of the User struct to allow passing additional metadata about the user that could then be acted on by subsequent middleware. This would be great for JWT, Paseto, and other formats that pass arbitrary claim information as part of the token.

Absolutely -- I think another caddy 1 plugin (http.login, or loginsrv, I think?) has a User struct with a lot more fields.

And it's probably fine to add more strongly-typed fields too if that is more efficient, especially if they are useful/common in other auth methods.

I am mostly hoping the Authenticator interface is powerful and expressive enough to be compatible with all the various methods of authentication. If someone wanted to look into that for JWT (and others, if they want!), that'd be suuuper helpful. (I'm a bit swamped with other things right now, so. Community effort!)

@roblabla

roblabla commented Feb 3, 2020

I did a very straightforward port in https://github.com/roblabla/caddy-jwt/tree/caddy2. It basically works exactly the same way (no fancy starlark integration or anything), but it:

  1. Uses the Authenticator interface instead of ServeHTTP
  2. Delegates all path matching to matchers (removes the path, allow_root, excludes options, as they can all be handled through a matcher combining path and not), simplifying the module a bit.

It's very untested so far. In fact, I had to disable all the unit tests because I couldn't figure out how to make them work yet (still figuring out caddy2). The next step for me is going to be porting loginsrv to caddy2. Once I've got both ported, I'll try getting unit tests back up.

Note to people who might be interested: To use it, you need to compile caddy with the following patch applied:

diff --git a/caddyconfig/httpcaddyfile/directives.go b/caddyconfig/httpcaddyfile/directives.go
index 035dcbec..9f86c35c 100644
--- a/caddyconfig/httpcaddyfile/directives.go
+++ b/caddyconfig/httpcaddyfile/directives.go
@@ -38,6 +38,7 @@ var directiveOrder = []string{
        "try_files",
 
        "basicauth",
+       "jwt",
        "header",
        "request_header",
        "encode",
diff --git a/modules/standard/import.go b/modules/standard/import.go
index 5ecfb4ac..48b1ce78 100644
--- a/modules/standard/import.go
+++ b/modules/standard/import.go
@@ -11,4 +11,5 @@ import (
        _ "github.com/caddyserver/caddy/v2/modules/caddytls/standardstek"
        _ "github.com/caddyserver/caddy/v2/modules/filestorage"
        _ "github.com/caddyserver/caddy/v2/modules/logging"
+       _ "github.com/roblabla/caddy-jwt/v4"
 )

and run go get github.com/roblabla/caddy-jwt/v4@caddy2

@mholt

mholt commented Feb 3, 2020

Nice start!

You can actually make it work without any changes to Caddy's source code.

Simply create your own main.go to add the import there instead, then use the order option in your Caddyfile:

{
    order jwt after basicauth
}
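For reference, that main.go follows Caddy 2's usual custom-build pattern; the plugin import path here is the one from the diff above:

package main

import (
	caddycmd "github.com/caddyserver/caddy/v2/cmd"

	// standard Caddy modules, plus the JWT plugin
	_ "github.com/caddyserver/caddy/v2/modules/standard"
	_ "github.com/roblabla/caddy-jwt/v4"
)

func main() {
	caddycmd.Main()
}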

@roblabla

roblabla commented Feb 8, 2020

I managed to get loginsrv ported over as well. However, I came across a fun little problem:

caddy-jwt and loginsrv have a cool "feature" where loginsrv will automatically generate a random JWT secret if none is set in the Caddyfile and store it in the JWT_SECRET environment variable. When caddy-jwt later parses its own directive, it will use the secret found in the JWT_SECRET environment variable if none is configured.

Now, this is obviously ordering-dependent. The login directive must be parsed before the jwt directive for this to work, which doesn't reliably happen. I'm not sure how caddy parses the directives and how everything is ordered, but when I tried it, caddy-jwt didn't find any JWT_SECRET because it was parsed before loginsrv.

So, anyone got ideas to make this work reliably? Is there a way to somehow make the two modules communicate a secret?

@magikstm
Contributor

magikstm commented Feb 8, 2020

Did you try these in your caddyfile:

{
   order jwt after basicauth
   order login before basicauth
}

It should allow them to be run in the correct order (login before jwt).

Ref: https://github.com/caddyserver/caddy/blob/master/caddyhttp/httpserver/plugin.go#L623

@roblabla

roblabla commented Feb 8, 2020

From my understanding, order is an orthogonal concern: it controls the order in which requests are run through the directives. My problem has nothing to do with requests though; it's a kind of racy behavior at config parse/module load time. Basically, I need to ensure that loginsrv loads (not executes) before caddy-jwt.

Maybe order also affects other things than requests (like Provision/Validate?), but at least the Caddyfile parsing is not affected by order.

As a kinda hacky workaround, I'm currently working on lazily loading the JWT_SECRET environment variable on the first request in caddy-jwt. E.g., instead of grabbing the env vars directly in NewDefaultKeyBackends, have it unconditionally add an EnvVarKeyBackend, which will grab JWT_SECRET and cache its value when it handles its first request. Loginsrv would set the env variable in its provisioning step.

That would work, but is kinda ugly. I wonder if the module lifecycle interfaces like Provision/Validate are affected by order? If they are, I could simply make caddy-jwt fetch the JWT_SECRET in the provisioning step after loginsrv does it, but I'm doubtful that will work.
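A sketch of that lazy backend (EnvVarKeyBackend and its Key method follow the naming in the comment above; the details are illustrative):

package jwtauth

import (
	"errors"
	"os"
	"sync"
)

// EnvVarKeyBackend resolves JWT_SECRET lazily, on first use, so that loginsrv
// has a chance to set the variable during its own provisioning step.
type EnvVarKeyBackend struct {
	once   sync.Once
	secret []byte
	err    error
}

func (b *EnvVarKeyBackend) Key() ([]byte, error) {
	b.once.Do(func() {
		v, ok := os.LookupEnv("JWT_SECRET")
		if !ok || v == "" {
			b.err = errors.New("JWT_SECRET is not set")
			return
		}
		b.secret = []byte(v)
	})
	return b.secret, b.err
}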

@mholt

mholt commented Feb 8, 2020

I wonder if the module lifecycle interfaces like Provision/Validate are affected by order?

Nope -- modules are loaded and provisioned in an order independent of the HTTP middleware/handler chain. But, don't expect one module to be provisioned before another, independent module, for example. If you absolutely need one to be provisioned before another, then perhaps one should wrap/encapsulate the other, and roll their Provision methods into one. Or something like that. It's actually not unheard of, even in Caddy's standard modules, for one module to literally embed another and simply wrap its behavior. If you have a tight coupling of two modules, they should probably be rolled into one or the logic/concerns should be split differently so that they can cleanly be two separate modules.
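A bare-bones sketch of that embed-and-wrap idea, with Login and Auth standing in for the loginsrv and caddy-jwt types (they are placeholders, not identifiers from either project):

package combined

import "github.com/caddyserver/caddy/v2"

type Login struct{}

func (l *Login) Provision(ctx caddy.Context) error { return nil } // would generate and store the secret

type Auth struct{}

func (a *Auth) Provision(ctx caddy.Context) error { return nil } // would read the secret

// Combined embeds both pieces and provisions them in a fixed order, so the
// secret is guaranteed to exist before the JWT side needs it.
type Combined struct {
	Login
	Auth
}

func (c *Combined) Provision(ctx caddy.Context) error {
	if err := c.Login.Provision(ctx); err != nil {
		return err
	}
	return c.Auth.Provision(ctx)
}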

@roblabla

Alright, I managed to get something that works while keeping the modules separate using the plan outlined above. It's hacky, but everything seems to be working properly so yay \o/.

Thinking about what mholt said, I think jwt should have the ability to load the loginsrv module (or the other way around) using a directive argument, automatically setting up the required configs. E.g., you could have:

jwt /* login
# Expands to
jwt /* {
    token-string RANDOM_STRING
}
login {
    jwt-secret SAME_RANDOM_STRING
}

In practice, I think this can be achieved (still keeping the modules separate) by using RegisterDirective (instead of RegisterHandlerDirective) and returning two routes from parseCaddyfile, one for jwt and one for login. E.g.:

var routes []httpcaddyfile.ConfigValue
// Parse loginsrv part of config
loginsrvHandler := loginsrv.CaddyHandler{}
routes = append(routes, c.NewRoute(nil, &loginsrvHandler)...)
// Parse jwt part of config
jwtHandler := jwt.Auth{}
routes = append(routes, c.NewRoute(nil, &jwtHandler)...)
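Roughly, the registration side would look like this, where h is the httpcaddyfile.Helper (called c in the snippet above); the actual token parsing and handler configuration are elided, and the handler types are the ones from the snippet:

func init() {
	// RegisterDirective (rather than RegisterHandlerDirective) lets one
	// directive contribute more than a single route.
	httpcaddyfile.RegisterDirective("jwt", parseCaddyfile)
}

func parseCaddyfile(h httpcaddyfile.Helper) ([]httpcaddyfile.ConfigValue, error) {
	// ... consume the directive's tokens from h and configure both handlers ...
	var routes []httpcaddyfile.ConfigValue
	routes = append(routes, h.NewRoute(nil, &loginsrv.CaddyHandler{})...)
	routes = append(routes, h.NewRoute(nil, &jwt.Auth{})...)
	return routes, nil
}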

@mholt does that sound reasonable? This would mean that this convenience is only usable from the Caddyfile; JSON users would still have to set up both plugins separately, and make sure they pass the same token to both.

And an open question: should JWT embed LoginSrv or the other way around?

@mholt

mholt commented Feb 11, 2020

does that sound reasonable?

Yep, I think so anyway! In fact, that is how the php_fastcgi directive works (and a few others, though I can't recall atm). RegisterHandlerDirective is just convenient if you're making a single route/handler (which most directives do, but most directives are simpler).

You can also remove /* matchers from your Caddyfile, as that's the same as * which is the default anyway:

jwt login

However... before you go down that road...

I would strongly consider integrating with our existing authentication module -- as I mentioned in an earlier comment -- which is designed to support any mode of authentication using this Authenticator interface. If changes need to be made to the interface, let's do that ASAP before we leave beta. I'm willing to help with that.

I would love it if you can find a way for it to work with the existing auth module, before we leave beta. You can still make your own directives for the Caddyfile (this is what basicauth does, it integrates with the auth middleware but has its own directive).

@BTBurke
Owner Author

BTBurke commented Feb 11, 2020

Sorry I've been MIA for this discussion recently. I'd strongly prefer that we don't tightly couple this with loginsrv if I have to maintain it. It just makes maintaining both of those plugins more difficult than it should be and potentially more difficult to find security corner cases.

I'd also like it to integrate with whatever existing authentication hooks Caddy 2 has in it via the Authenticator interface. Since this is a security plugin, it needs a hard look at the tests to make sure it's still working as advertised. Honestly I haven't been following Caddy 2 development closely and I'm not up to speed on how the new modules work.

@roblabla

roblabla commented Feb 11, 2020

My fork already hooks into caddy’s authentication interface, implementing Authenticator and being in the http.authenticate namespace.

Here's some outdated code; I'll push my current progress tonight: https://github.com/roblabla/caddy-jwt/blob/caddy2/jwt.go#L78

As far as changes to the Authenticator interface goes, I think User needs to be more generic, having a metadata map as mentioned before would help. I could then put the jwt information there.

@mholt

mholt commented Feb 11, 2020

@roblabla Awesome! Yeah, we can add more fields to the User struct. We can kick it off in a pull request?

@roblabla

@roblabla Awesome! Yeah, we can add more fields to the User struct. We can kick it off in a pull request?

I'll get a PR going.


I'd strongly prefer that we don't tightly couple this with loginsrv if I have to maintain it. It just makes maintaining both of those plugins more difficult than it should be and potentially more difficult to find security corner cases.

The plugins would still be different modules. The only change here is that on the configuration side, there's a shortcut that allows configuring both caddy-jwt and loginsrv at the same time. To explain this better, let me quickly explain how caddy2 modules work:

Caddy2 modules are objects that implement various interfaces, the only two required ones being json.UnmarshalJSON and caddy.Module (though you'd usually implement caddyhttp.Handler as well). A module is "normally" created by deserializing a JSON configuration (through UnmarshalJSON) and installing it at a certain route. More information on routing is available here.
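For concreteness, the module-side boilerplate for a JWT provider might look roughly like this (the ID, fields, and package name are illustrative; as mholt notes below, JSON struct tags plus caddy.Module are all that's strictly required):

package jwtauth

import (
	"net/http"

	"github.com/caddyserver/caddy/v2"
	"github.com/caddyserver/caddy/v2/modules/caddyhttp/caddyauth"
)

func init() { caddy.RegisterModule(Auth{}) }

// Auth is configured straight from JSON via struct tags.
type Auth struct {
	SecretFile string `json:"secret_file,omitempty"`
}

// CaddyModule registers the module in the http.authentication.providers namespace.
func (Auth) CaddyModule() caddy.ModuleInfo {
	return caddy.ModuleInfo{
		ID:  "http.authentication.providers.jwt",
		New: func() caddy.Module { return new(Auth) },
	}
}

// Authenticate satisfies caddyauth.Authenticator (token validation elided).
func (a Auth) Authenticate(w http.ResponseWriter, r *http.Request) (caddyauth.User, bool, error) {
	return caddyauth.User{}, false, nil
}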

But usually, people don't really touch the JSON stuff. Instead, they use Caddyfile directives. Those are totally decoupled from modules. They're basically a function that returns a list of "ConfigValue"s, asking for various things needed to set up the server the way we want. The list of stuff you can configure through ConfigValue is not well-documented, but looking at Caddy's source code (basically, every occurrence of sb.pile here), it looks like the values are:

  • tls.automation_manager: automatic acquisition and renewal of certs, equivalent to setting up something in apps.tls.automation.policies
  • tls.certificate_loader: Manual cert loading
  • tls.off: Disables tls for domain
  • tls.connection_policy: ??
  • route: Setup a caddyhttp.Handler for the specified route.

Now, with this out of the way, what I'm suggesting is simply to set up a Caddyfile directive that sets up two route handlers, one for caddy-jwt and one for loginsrv. It wouldn't be using any kind of private API or whatever, and would be semantically equivalent to creating two modules sharing the same token. In fact, when dumping the "raw" config from caddy, that's exactly what it would look like.

This should lead to fewer corner cases. It would allow us to remove the code using environment variables to communicate a token (which is frankly a horrid hack and it's a miracle it currently works at all...).


Since this is a security plugin, it needs a hard look at the tests to make sure it's still working as advertised.

I agree! But I started with getting something that, y'know, works :P.

On the testing side, I mourn the loss of caddy.NewTestController(). It doesn't look like there are any equivalent publicly available. @mholt would it be possible to get something like that, or maybe I'm overlooking something? It seems like all the modules currently in caddy basically just test the module itself, generating a fake request, setting up a Replacer in the request context, and that's it.

That works for unit tests, but it's not really an "end-to-end" test. Caddy-jwt has some nice tests to make sure it properly works on broken paths, and I'd like to keep them, but that means I'd need some way to do a complete integration test, creating a complete caddy server from a fake Caddyfile and feeding it fake requests. Do you have any suggestions?

@mholt

mholt commented Feb 12, 2020

Great writeup @roblabla!

Just a nit: explicitly implementing json.UnmarshalJSON isn't strictly necessary (unless you have some special case going on), you just need json struct tags to adhere to style conventions for the config structure. So the only interface you need to implement is caddy.Module, as well as whatever other interface the host module expects yours to be (for example, caddyhttp.Handler, as you mentioned).

caddyfile directives ... are totally decoupled from modules.

Correct, this is key!

The list of stuff you can configure through ConfigValue is not well-documented,

Yeah, my bad. I will get around to this eventually, I hope.

Basically, Caddyfile directives just throw ConfigValues into a pile (it's literally called pile as you pointed out). Then, when the Caddyfile adapter (the code that transforms Caddyfile into JSON) is building the JSON, it just looks in the pile and pulls out the class of values it needs for that part of the JSON.

tls.connection_policy: ??

Connection policies are used by HTTP servers when answering the question, "How should I establish this TLS connection?" In other words, when a server receives a TLS handshake, it needs to know how to configure that TLS connection, so it looks through its TLS connection policies and uses the first matching policy. It establishes parameters like protocol versions, cipher suites, and more.

On the testing side, I mourn the loss of caddy.NewTestController(). It doesn't look like there are any equivalent publicly available. @mholt would it be possible to get something like that?

Oof, yeah, I need to get around to that. Or somebody needs to make a PR 😅 I am a bit swamped trying to deliver on a few things after being sick for a while.

We're still figuring out the larger testing picture for Caddy 2. I do have a vision, sort of, but again, am constrained by time and funding for now.

I agree with everything you've written though, and it is accurate as far as I can tell.

@roblabla

Thanks for the fast answer mholt! I'll try and see if I can figure something out for the testing. Should I move conversations related to this to caddyserver/caddy#2930 since it seems to be the closest related issue? Or should I open a separate issue specific to integration tests?

I pushed my working loginsrv and caddy-jwt. My next priorities will be making a PR expanding the User struct, setting up some basic unit tests for jwt and loginsrv to at least make sure the basic functionality works, and then exploring the Caddy2 codebase some more to see what would be needed to make integration tests.

@mholt

mholt commented Feb 12, 2020

@roblabla That issue you linked to is the right one! We can move discussions about testing Caddyfile parsing to that issue.

@BTBurke
Owner Author

BTBurke commented Feb 12, 2020

This should lead to fewer corner cases. It would allow us to remove the code using environment variables to communicate a token (which is frankly a horrid hack and it's a miracle it currently works at all...).

I think you are misunderstanding what these are for. There has to be some way for users to provide secret key material to this module in order to correctly validate a token. That is what these options are for. They aren't for passing tokens, they are for storing secret key material to validate tokens. If that's not secure, people can easily forge a token with any claims they want. They shouldn't be part of the Caddyfile because it increases the chances that they will be inadvertently committed.

Also, people don't always use loginsrv and may be using tokens from any number of sources, which is why it provides so many ways to configure the token source so that it will work with all the various auth providers out there.

The problem with coupling them isn't a technical one, it's a process one. Someone now has to work across both code bases to ensure that a change in one isn't screwing up something in the other one. I really don't have time for that. If that's something you want to take on and you feel like you know JWT well enough to make sure this module is secure, I'd be happy to pass it off to you.

I only caution that JWT is a foot gun and it's easy to screw up and create unintended vulnerabilities due to the way the spec is written.

@mholt

mholt commented Feb 12, 2020

@BTBurke I think you bring up some good points.

They shouldn't be part of the Caddyfile because it increases the chances that they will be inadvertently committed.

Agreed, we discourage adding credentials to the config directly.

However, in the HTTP basic authentication module, we require that passwords are given in a securely-hashed format. I dunno if JWT secrets need to be in plaintext, but, would hashing them be a possibility?

people don't always use loginsrv and may be using tokens from any number of sources, which is why it provides so many ways to configure the token source so that it will work with all the various auth providers out there.

This is an important point. The Caddy-esque way to handle this is to make the token source a module in the config. So then, perhaps Caddy's JWT authentication module would itself be a host module for token source modules, one of which being loginsrv (or whatever it'll be called).

If the token source is a module, then users can configure any other ways to get the tokens. Sure, maybe they come from env vars. (Maybe the env var names could be configurable in the JSON -- and Caddyfile -- to allow multiple overlapping configs, since we try to avoid global state as much as possible in Caddy 2.) Maybe they come from loginsrv, or any other method.

It'd basically look like this in JSON: https://caddyserver.com/docs/json/apps/tls/automation/policies/management/acme/challenges/ -- notice the "dns" property is modular, and currently has 3 providers (soon to be 20x that many!). Or, I guess, a more relevant example, the HTTP authentication handler module accepts various methods for authentication (currently just one): https://caddyserver.com/docs/json/apps/http/servers/routes/handle/authentication/

Would that work then? Again, I don't know much about JWT, but it sounds like the Caddy 2 module design is a good way to solve this problem.

@roblabla

I think you are misunderstanding what these are for.

There’s no misunderstanding on my part, only sloppy language. I’m well aware of the intricacies of JWT, and its various pitfalls :).

They shouldn't be part of the Caddyfile because it increases the chances that they will be inadvertently committed.

LoginSrv has multiple options for storing the secret/key: in the config, as a separate file, or, if none exists, it generates a random one and stores it in an env variable. Meanwhile, Caddy JWT takes it either from a separate file or an env variable.

I agree storing the token in the Caddyfile is a bit of a footgun, but it’s currently allowed by loginsrv so 🤷‍♂️.

which is why it provides so many ways to configure the token source so that it will work with all the various auth providers out there.

I mean, at the end of the day there are only two ways to pass secrets/keys to caddy-jwt as it exists today: env variable or file path.

My current line of thought is, I’ll disallow passing it inline from a Caddyfile, but at the JSON level the secret will always be stored inline (greatly simplifying the KeyBackend story). This has some implications I need to think hard about though. Notably, the caddy admin API allows dumping the JSON config, so it would be able to dump the secrets, which seems pretty bad... but at the same time it would allow the user to dynamically add new secrets through the caddy admin API, which seems like a genuinely useful feature.

@mholt What’s the threat model like for the admin API? Does it already leak secrets, or should we take care to avoid leaking secret information in it?

Someone now has to work across both code bases to ensure that a change in one isn't screwing up something in the other one.

I suspect the interfaces will stay stable enough that this won’t be a problem. But I understand and respect not wanting to take on that risk.

And I know people are using other providers. My primary use-case is with loginsrv, but I will try my best to make sure I don’t unintentionally break other use-cases. The idea behind the tighter coupling between loginsrv and caddy-jwt is to make it more convenient and less footgunny to use them together.


I dunno if JWT secrets need to be in plaintext, but, would hashing them be a possibility?

JWT secrets are HMAC/RSA/ECDSA keys. They can’t be hashed, because the way JWT works is by using that key to sign a string.
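To illustrate why: HS256 verification recomputes the MAC over the token's signing input using the raw key, so the key itself, not a hash of it, has to be available. A minimal sketch using only the standard library (claim parsing, alg checks, etc. elided):

package jwtauth

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/base64"
	"strings"
)

// verifyHS256 recomputes HMAC-SHA256 over "header.payload" and compares it to
// the token's signature in constant time.
func verifyHS256(token string, key []byte) bool {
	parts := strings.Split(token, ".")
	if len(parts) != 3 {
		return false
	}
	sig, err := base64.RawURLEncoding.DecodeString(parts[2])
	if err != nil {
		return false
	}
	mac := hmac.New(sha256.New, key)
	mac.Write([]byte(parts[0] + "." + parts[1]))
	return hmac.Equal(sig, mac.Sum(nil))
}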

So then, perhaps Caddy's JWT authentication module would itself be a host module for token source modules, one of which being loginsrv (or whatever it'll be called).

That’s... actually an interesting line of thought. It’d be interesting to see if it could automatically provision the secret by talking to auth backends. I’ll push it to future work for now though.

@mholt

mholt commented Feb 12, 2020

Makes sense. I wasn't sure if they were signing keys but I guess they are.

I’ll push it to future work for now though.

Alright, but try to avoid design decisions that can't be compatibly reversed later. Supporting guest modules is easy and only takes about 5 extra minutes. There's even a template at the top of this page that can get you started writing the guest module. And making your module a host module is just a matter of adding a few lines of code and a struct tag. I would encourage just doing it up front and then you have nearly unlimited flexibility from then on without awkward/breaking design changes. 👍

It’d be interesting to see if it could automatically provision the secret by talking to auth backends.

Yeah, that'd be awesome. Even if the first/only guest module is very simple (i.e. loading them from env variables), that's OK. Just make an interface you want to contract with and then go with that.
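A sketch of what that host/guest wiring could look like; the TokenSource interface, the namespace string, and the field names are all made up for illustration:

package jwtauth

import (
	"encoding/json"
	"net/http"

	"github.com/caddyserver/caddy/v2"
)

// TokenSource is the contract a guest module would implement.
type TokenSource interface {
	Token(r *http.Request) (string, error)
}

type Auth struct {
	// The caddy struct tag turns this field into a module slot; JSON (and the
	// Caddyfile adapter) can then plug in any registered token source.
	TokenSourceRaw json.RawMessage `json:"token_source,omitempty" caddy:"namespace=http.authentication.providers.jwt.sources inline_key=source"`

	tokenSource TokenSource
}

func (a *Auth) Provision(ctx caddy.Context) error {
	if a.TokenSourceRaw != nil {
		mod, err := ctx.LoadModule(a, "TokenSourceRaw")
		if err != nil {
			return err
		}
		a.tokenSource = mod.(TokenSource)
	}
	return nil
}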

@BTBurke
Owner Author

BTBurke commented Feb 12, 2020

@roblabla sounds like you have some good ideas and a much better understanding of Caddy 2 than I do. Do you want to take over this module? If so, I'll wait until C2 is out of beta and then archive this project and point them to your fork.

I really don't have the time or desire to maintain this one anymore since I rarely use JWT.

@BTBurke
Owner Author

BTBurke commented Feb 12, 2020

By the way, if you are thinking about making a module for more general secrets management in Caddy, which could be useful for a lot more than just JWT secrets (such as API keys), I think Bitnami's sealed secrets for Kubernetes is a good model. It uses asymmetric encryption such that keys are safe to include directly in configuration and can only be decrypted by a running Caddy. These could be NaCl boxes/secretboxes, which would be pretty bulletproof from a crypto standpoint.

It's not quite as straightforward in Caddy because sealed secrets runs as a persistent Kubernetes operator. You would still have a chicken/egg problem in which Caddy would have to read the decryption key on startup, but that would mean only worrying about how to protect one secret, and it could then be used by other modules that have secrets to protect. Just an idea.
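As a rough illustration of that idea with NaCl boxes (golang.org/x/crypto/nacl/box); in practice Caddy's key pair would live on disk, but everything is collapsed into one program here:

package main

import (
	"crypto/rand"
	"fmt"

	"golang.org/x/crypto/nacl/box"
)

func main() {
	// Caddy's long-lived key pair (in practice, loaded from disk at startup).
	caddyPub, caddyPriv, err := box.GenerateKey(rand.Reader)
	if err != nil {
		panic(err)
	}

	// The config author seals the secret to Caddy's public key using an
	// ephemeral key pair; the ciphertext is safe to commit to configuration.
	ephPub, ephPriv, err := box.GenerateKey(rand.Reader)
	if err != nil {
		panic(err)
	}
	var nonce [24]byte
	if _, err := rand.Read(nonce[:]); err != nil {
		panic(err)
	}
	sealed := box.Seal(nonce[:], []byte("my-jwt-signing-key"), &nonce, caddyPub, ephPriv)

	// At runtime, Caddy opens it with its private key and the sender's public key.
	plain, ok := box.Open(nil, sealed[24:], &nonce, ephPub, caddyPriv)
	fmt.Println(ok, string(plain))
}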

@derekbassett

@roblabla I'm looking at the work you've done on your fork. Are you thinking of continuing this work?

@roblabla

roblabla commented Apr 1, 2020

@derekbassett I mean, it works pretty well already for my use-cases. Here are the TODOs:

  • Write unit tests. Right now, caddy2 is a bit lacking in this area - a lot of the facilities in caddy1 to help unit testing don't seem to have an equivalent
  • Integrate the latest changes in Caddy. In particular, populating the caddyauth.User's metadata with the claims.
  • Fix token source serialization/deserialization. Right now token sources will always use the defaults because its (de)serialization is broken.
  • Update the README.

There's also quite a bit of cleanup work necessary; for instance, a lot of key backends are redundant and could probably be refactored into something a lot cleaner. I haven't done a whole lot of Go so it's possible some of this code is really backwards ^^'

@mholt

mholt commented Apr 1, 2020

Write unit tests. Right now, caddy2 is a bit lacking in this area - a lot of the facilities in caddy1 to help unit testing don't seem to have an equivalent

Oh, right -- you need a way to test caddyfile parsing right?

@roblabla

roblabla commented Apr 1, 2020

Yes, that'd be nice

@mholt

mholt commented Apr 1, 2020

@roblabla I feel like we talked about this before but can't find it, can you point me to what you'd like to see for testing? Or if we haven't gone over that yet, could you tell me what specifically would be helpful?

@roblabla

roblabla commented Apr 1, 2020

Sorry for the short answer earlier, I wanted to elaborate but something came up haha.

I feel that the main thing missing from Caddy's testing story is an "integration test" kind of thing. I talked about it here:

On the testing side, I mourn the loss of caddy.NewTestController(). It doesn't look like there are any equivalent publicly available. @mholt would it be possible to get something like that, or maybe I'm overlooking something? It seems like all the modules currently in caddy basically just test the module itself, generating a fake request, setting up a Replacer in the request context, and that's it.

That works for unit tests, but it's not really an "end-to-end" test. Caddy-jwt has some nice tests to make sure it properly works on broken paths, and I'd like to keep them, but that means I'd need some way to do a complete integration test, creating a complete caddy server from a fake Caddyfile and feeding it fake requests. Do you have any suggestions?

Basically, what I want is something that I can pass a Caddyfile as a string and that returns a caddy server object I can call ServeHTTP on with mocked requests, so I can make sure it exercises the right paths.
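Until something like that exists, one heavier but workable shape for such a test is to adapt the Caddyfile string, load it into a real Caddy instance, and hit the actual listener over HTTP. A sketch, not an official harness (the port and config are arbitrary):

package jwtauth_test

import (
	"net/http"
	"testing"

	"github.com/caddyserver/caddy/v2"
	"github.com/caddyserver/caddy/v2/caddyconfig"
	_ "github.com/caddyserver/caddy/v2/caddyconfig/httpcaddyfile" // registers the "caddyfile" adapter
	_ "github.com/caddyserver/caddy/v2/modules/standard"
)

func TestCaddyfileEndToEnd(t *testing.T) {
	config := `
	{
		admin off
	}
	http://localhost:9080 {
		respond "hello"
	}
	`
	adapter := caddyconfig.GetAdapter("caddyfile")
	cfgJSON, warnings, err := adapter.Adapt([]byte(config), nil)
	if err != nil {
		t.Fatal(err, warnings)
	}
	if err := caddy.Load(cfgJSON, true); err != nil {
		t.Fatal(err)
	}
	defer caddy.Stop()

	resp, err := http.Get("http://localhost:9080")
	if err != nil {
		t.Fatal(err)
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		t.Fatalf("unexpected status: %d", resp.StatusCode)
	}
}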

@mholt

mholt commented Apr 1, 2020

@roblabla Ah, thanks -- I think this is what caddyserver/caddy#2930 is all about, we can move the discussion about this to there.

@greenpau

@BTBurke, @roblabla, I am wrapping up work on an MVP for https://github.com/greenpau/caddy-auth-saml. Upon validation of a provided SAML assertion, it issues a JWT token.

As part of the module configuration, I set the following parameters: https://github.com/greenpau/caddy-auth-saml#jwt-token

I was thinking of quickly putting together caddy-auth-jwt, which would reuse a lot of the code here.

How far along are you with the Caddy v2 port? If it is a best-effort thing for you, then I will try putting together caddy-auth-jwt in the next couple of days. If you have something already, please let me know and I will integrate with what you already have.

@roblabla

roblabla commented Apr 19, 2020

@greenpau I have something that works for my use-case at https://github.com/roblabla/caddy-jwt/tree/caddy2, but has a few bugs:

  • Most importantly, serialization/deserialization of the token source is basically non-functional, which means if a different token source than the default is used, it won't work. That's not too hard to fix, just have to make a wrapping struct around it like I did for KeyBackends.
  • I never got around to using the caddyauth.User Metadata to store the claims.
  • And the unit tests are still not ported.

AFAIK everything else works though. The JSON configuration is maximally flexible, allowing the user to choose what to do in case of error (using a handle_error route), while still keeping the easy-to-use options in the Caddyfile.

I'm working on this very sporadically though ^^'. Mostly using this for my own use-case and I don't have much time to dedicate to anything else right now.

@greenpau

I'm working on this very sporadically though ^^'. Mostly using this for my own use-case and I don't have much time to dedicate to anything else right now.

@roblabla, thank you for the quick reply! I think I will make an attempt at jwt then.

@greenpau

Started in https://github.com/greenpau/caddy-auth-jwt. The skeleton for a configuration is here: https://github.com/greenpau/caddy-auth-jwt/blob/master/123#L23-L59
