[css-nesting-1] & representing parent elements vs parent selector #8310

Closed
tabatkins opened this issue Jan 13, 2023 · 41 comments
Labels
css-nesting-1 Current Work

Comments

@tabatkins
Member

When I wrote the Nesting spec originally, I specifically defined it so that & was an ordinary selector, that happened to match the elements matched by the parent rule. This leveraged the powers that only a live browser could have, which seemed useful. Preprocessors, unable to do the same, instead have to make their "nesting selector" more of a macro, that is substituted with the parent selector.

That is, in a rule like

.a .b {
  .c & {...}
}

In the Nesting spec, the & matches "a .b element with a .a ancestor", and the nested rule imposes an additional "...and has a .c ancestor", so it's the equivalent of .c :is(.a .b). It'll match all the following markup structures:

<div class=a>
 <div class=c>
  <span class=b></span>
 </div>
</div>

<div class=c>
 <div class=a>
  <span class=b></span>
 </div>
</div>

<div class="a c">
 <span class=b></span>
</div>

Preprocessors can't do this (or rather, couldn't do this before :is() was standardized, and still can't widely do it since support for :is() is still in the low 90%s). Expanding such a selector fully (to .a .c .b, .c .a .b, .a.c .b) would mean a combinatorial explosion in selector length, and gets drastically worse the more combinators are in either selector. (Sass has a similar situation with its @extend rule, where it uses heuristics to select only a few likely expansions, rather than all possible ones.) Instead, they substitute the selector directly, giving .c .a .b, and only matching the second of those three markup structures.
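
To restate the two behaviors side by side (a compact sketch of the selectors discussed above):

.a .b {
  .c & {...}
}

/* Nesting spec (live matching):   behaves like  .c :is(.a .b) {...} */
/* Preprocessor substitution:      expands to    .c .a .b {...}      */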

I was discussing Nesting with @nex3 a few days ago, and she expressed surprise and some dismay that Nesting had this behavior. She believes that a large amount of CSS written with Sass nesting expects and depends on the current Sass behavior, not accidentally but quite intentionally - they expect this sort of nesting pattern to be expressing "and then a .c container around the previous stuff" rather than just "and then a .c container somewhere around the specific previous elements". She believes this would actually be a significant burden to people migrating from preprocessor nesting to native nesting, and even for people who are using native Nesting fresh, believes it'll be a common point of confusion for people using this pattern.

As well, under Sass's behavior, an author can recover the current spec's behavior by wrapping the & in an :is(), like .c :is(&) (or wrapping the parent selector). However, there is no way to recover Sass's behavior from the current spec's behavior.
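
A minimal sketch of that escape hatch, assuming the Sass-style substitution were adopted:

.a .b {
  /* Substituting the parent selector gives .c :is(.a .b),
     i.e. the current spec's matching behavior */
  .c :is(&) {...}
}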

In the majority of cases it doesn't make a difference which way we go. The most common relative selector case (a nested selector like .c or + .c) is unchanged, and many non-relative selectors are also unchanged, such as & + & or :not(&). Only cases like the above, where there is a descendant combinator in the parent selector and a descendant combinator in the nested selector with an & following it, will change their behavior.

I know this is a pretty significant semantic change to be coming at this late time, but Natalie felt it was pretty important, and @mirisuzanne backed her up, and the arguments were reasonably convincing to me.

(Note: we'd still not be doing selector concatenation - .foo { &bar {...}} is still equivalent to bar.foo, not .foobar.)

@mirisuzanne
Contributor

Yes, I've had use-cases for both selectors – but the desire to 'scope the outer selector entirely' (.c .a .b in the example above) is by far the majority use-case in my experience, and what I would expect from reading the selector. I would find it fairly surprising to get the .c :is(.a .b) behavior without specifically requesting it. And I'd have a strong leaning towards the semantics that allow expressing both variations explicitly, rather than implicitly using the more broad selector without any way to clarify.

I am curious how the css-nesting polyfills in other processors have handled this. Were those polyfills designed with the current spec behavior, and has that resulted in surprising selectors for people?

(While I'm curious about that, I don't think it necessarily tells us if the pattern is safe. In a lot of cases, I would expect the DOM will only have .c .a .b matches, even when the output selector is .c :is(.a .b). That wouldn't make it safe to implement the less expected behavior, it would actually point to the idea that one case is much more common. I don't think that's something we could measure. So the question is really: are people explicitly expecting and relying on the .c :is(.a .b) pattern when writing .c & in one of those polyfills?)

@romainmenke
Member

romainmenke commented Jan 13, 2023

I am a bit surprised to see this come up here :)

I've reached out on the Sass repo to bring up exactly this difference: sass/sass#3030 (comment)

I've also brought this up each time someone advertised Option 3 as a syntax option that is like Sass and would be simple to migrate. It is not simple exactly because of this difference.


I am curious how the css-nesting polyfills in other processors have handled this. Were those polyfills designed with the current spec behavior, and has that resulted in surprising selectors for people?

In postcss-preset-env we follow the specification with postcss-nesting.

We have heuristics to mimic what :is() would match without actually using :is() for some common and simple patterns. For all others we use :is().
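
Illustrative input/output only (not necessarily postcss-nesting's exact output):

/* Simple pattern: plain substitution, no :is() needed */
.a {
  & .b {...}
}
/* → .a .b {...} */

/* Selector list in the parent: fall back to :is() */
.a, .b {
  & .c {...}
}
/* → :is(.a, .b) .c {...} */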

We've learned that people do not (yet) have a good mental model for :is() when complex selectors are involved. Nesting ultimately follows similar rules as :is(). We've had multiple bug reports because we follow the specification.

People coming from Sass especially have a harder time switching to "native nesting".
We've spent quite a bit of time educating people about the nesting specification, :is(), and how it differs from Sass.

Those without a prior mental model seem to intuitively figure out what works and how to use nesting.

@Loirooriol
Contributor

How is & + & unchanged? If SASS substitutes the selector, with a parent of .a .b, & + & becomes .a .b + .a .b, which is different than :is(.a .b) + :is(.a .b) a.k.a. .a .b + .b.

Seems to me this happens when the parent and nested selector are complex, and the latter uses & after a combinator. Descendant combinators are not needed at all: .a + .b { .c + & {} } is .c + .a + .b in SASS, and .a.c + .b in CSS.
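
Restating that sibling example as code (the :is() form follows the spec's desugaring):

.a + .b {
  .c + & {}
}

/* SASS substitution:  .c + .a + .b                               */
/* Current spec:       .c + :is(.a + .b), equivalent to .a.c + .b */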

@tabatkins
Member Author

You're right, it does change. Specifically,

.a .b , .c .d {
    & + & { color: blue; }
}

becomes

.a .b + .a .b, .a .b + .c .d, .c .d + .a .b, .c .d + .c .d {
  color: blue;
}

in Sass.

@romainmenke
Member

romainmenke commented Jan 14, 2023

In Sass it is also perfectly fine to use pseudo-elements in the parent:

::before, ::after {
    color: cyan;

    @media (min-width: 300px) {
        color: lime;
    }

    &:hover {
        color: pink;
    }
}

Not having this feature seems problematic for many.

@Loirooriol
Contributor

My thoughts are:

  • Follow this principle: https://w3ctag.github.io/design-principles/#third-party-tools. Usability takes precedence over compatibility with tools, and I'm concerned that the combinatorial explosion will make nesting unusable except in trivial tiny cases.
  • What SASS is doing seems a bad and confusing design to me, since it's not just replacing & with the parent selector:
    • Top-level selector lists are split and not handled as a single selector. This has e.g. the implication that wrapping a selector list inside :is() in order to have forgiving parsing will change the meaning in nesting
    • What about nesting .c& inside .a .b, the spec allows it, seems invalid in SASS, would it produce .c.a .b with this proposal? Regardless, .c& looks like a compound selector but it can't be freely reordered.
  • The current spec is more natural to CSS. May it be confusing to the people who are used to a tool with a different behavior? Sure, but of course they are biased.

@cdoublev
Collaborator

cdoublev commented Jan 15, 2023

As Oriol noted, a consequence of resolving & with :is() is implicit forgiving parsing, whereas parsing the top-level selector is not forgiving, which can be surprising for authors. For example, html|:hover, .class:hover {} would be invalid and entirely ignored, whereas :hover { html|&, .class& {} } is valid and applies styles to elements matching .class:hover.
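
Restating that example as code (behavior as described in this comment):

/* Invalid selector in the top-level list: the whole rule is dropped */
html|:hover, .class:hover {}

/* Nested, the list is handled forgivingly via the :is() resolution:
   the invalid html|& part is ignored and .class& still applies,
   matching elements that match .class:hover */
:hover {
  html|&, .class& {}
}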

I take this example on purpose because if & must be resolved with :is(), I think it should not be allowed to replace any symbol produced by <compound-selector>. For example, div { [title=&] {} } would also be valid. & should only be allowed to replace <simple-selector> (<type-selector> and <subclass-selector>), <pseudo-element-selector> and <pseudo-class-selector>.

On the other hand (also noted by Oriol), I am not sure whether :hover { &.class {} } should be valid if it is not resolved with :is().

@tabatkins
Member Author

Usability takes precedence over compatibility with tools, and I'm concerned that the combinatorial explosion will make nesting unusable except in trivial tiny cases.

Given that Sass has been shipping with this exact behavior for years, and they pay a larger cost than browsers would (Sass has to ship the expanded results over the wire, rather than just representing it in memory), I don't believe this fear is warranted. (It does make the need for a minimum/maximum spec more relevant, as in #2881.)

Top-level selector lists are split and not handled as a single selector. This has e.g. the implication that wrapping a selector list inside :is() in order to have forgiving parsing will change the meaning in nesting

This is true. But note that "just wrap it in :is()" is already known not to be a mere forgiving no-op; note its effects in :has(). Increasing the places where this is true isn't great, I agree.

What about nesting .c& inside .a .b, the spec allows it, seems invalid in SASS, would it produce .c.a .b with this proposal? Regardless, .c& looks like a compound selector but it can't be freely reordered.

Sass's particular parsing/emitting rules disallow .c& as invalid; you have to write it as &.c. That's not a requirement for us, tho; I imagine Sass's limitation is related to their support for token concatenation.

So no, .a .b { .c& {...}} would still effectively produce .a .b.c {...}. The rule is "the compound selector with the & is merged with the final compound selector of the parent; compounds preceding the & are placed entirely before the parent selector, while compounds following the & are placed entirely after".
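
An illustrative reading of that rule (my example, not spec text):

.a .b {
  .x .c& .y {...}
}

/* .x precedes the &-compound, so it goes entirely before the parent selector;
   .c& merges into the parent's final compound (.b);
   .y follows, so it goes entirely after: */
/* → .x .a .b.c .y {...} */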

The current spec is more natural to CSS. May it be confusing to the people who are used to a tool with a different behavior? Sure, but of course they are biased.

Right, that's the question - is the expectation formed by Sass authors the natural expectation, such that we can expect fresh CSS authors to do the same, and be surprised when they write HTML that violates that expectation but still matches? (Like markup options 1 and 3 in my original post.) I think that it's a somewhat reasonable expectation.

And again, since it's quite easy to regain the current spec's behavior (by using :is(&)) if we make this change, but impossible to gain the Sass-ish behavior if we retain the current spec, I think the threshold for reasonableness is lower than it would be if we were deciding entirely on one vs the other.

For example, div { [title=&] {} } would also be valid.

This definitely would not be. The & is still, itself, a <simple-selector> for grammar purposes.

In Sass it is also perfectly fine to use pseudo-elements in the parent:

Yes, this would make pseudo-elements in the parent usable again.

@tabatkins
Member Author

Summarizing some pro/cons of each:

The Current Spec

  • Pro: Seems more "CSS-ish", using CSS concepts and the dynamism of the browser more fully
  • Pro: Works better for sibling combinators. .a .b { .c + & {...}} does what it looks like (matches when the element's previous sibling is .c).
  • Con: .a .b { .c & {...}} matches in more cases, which can be unexpected, and there's no way to enforce ".c is an ancestor of the entire parent selector" semantics.
  • Con: Until we add some way for pseudo-elements to work in :is(), this is fundamentally incompatible with pseudo-elements in the parent selector.

The Sass-ish Proposal

  • Pro: Might match author intuition better in cases like .a .b { .c & {...}}, where it means ".c is a container around everything I was doing in the parent selector" (rather than just ".c is an ancestor of the selected element, but might be higher, lower, or on the same element as .a").
  • Pro: Works automatically with pseudo-elements, tho perhaps in a naive way - ::before { &:hover {...}} gives ::before:hover, but there's no way to apply conditions to the originating element (producing a selector like :hover::before, or something like .foo::before).
  • Pro: can regain the current spec's semantics with :is(&).
  • Con: Probably matches author intuition worse when used with the sibling combinators like in .a .b { .c + & {...}}, where it produces .c + .a .b. Usages like .a .b { & + & {...}} produce the nigh-nonsensical selector .a .b + .a .b, a complete departure from what the author appears to be saying in the nested selector (selecting an element when its previous sibling also matches the parent selector). (But as noted in the previous bullet point, :is(&) + :is(&) does do what's probably intended.)
  • Con: depending on implementation, might add an additional multiplicative factor (equal to the number of complex selectors in the selector list) to the internal selector expansion (on top of the existing depth-based multiplication).

@Loirooriol
Contributor

Given that Sass has been shipping with this exact behavior for years, and they pay a larger cost

Correct me if I remember wrong, but I recall reading comments where you said that SASS avoids the combinatorial explosion by using some heuristics and not expanding to the full list. This doesn't seem reasonable for CSS.

I don't believe this fear is warranted.

I would like to know what an implementer thinks @sesse

@romainmenke
Member

romainmenke commented Jan 16, 2023

I recall reading comments where you said that SASS avoids the combinatorial explosion by using some heuristics and not expanding to the full list.

It is also mentioned in the note under:

Why is the specificity different than non-nested rules?

https://drafts.csswg.org/css-nesting/#:~:text=Why%20is%20the%20specificity%20different%20than%20non%2Dnested%20rules%3F

Happens to be the same note I was looking for regarding specificity


Would specificity remain the same as if :is() were used?
i.e. only the matching behavior would change


:not() would behave differently from an author perspective.
I can't estimate if this would ever be an issue in real world cases.

.a .b {
  .c & {}
}
/* → .c .a .b */

.a .b {
  .c :not(&) {}
}
/* → .c :not(.a .b) */

@sesse
Contributor

sesse commented Jan 16, 2023

I would like to know what an implementer thinks @sesse

It's a bit tricky to comment on this, because I don't think the boundaries of the proposal really have been fully laid out yet. So let me first assume that we don't mean to support token pasting (e.g. .foo { &_bar { … }} becoming .foo_bar { … }).

Second, do we intend to do this expansion before or after CSSOM?

Let me give a very short introduction to how CSS selectors work in Chromium, to the point of oversimplification. We store two different representations of a stylesheet in memory, where one is roughly an array (for parsing and CSSOM purposes) and the other is roughly a hash table (for matching purposes). The former looks very much like what you'd write in a stylesheet, with the notable exception that selectors with selector lists are parsed into a tree structure and the combinator set is a bit more explicit (each and every simple selector has a combinator towards the next element). E.g., if you write .a .b:is(.c, .d), what you get is basically .a <descendant> .b <subselector> :is(<mumble>) <end-of-complex-selector-list>, where the <mumble> part is a pointer in memory to another selector list (.c <end-of-complex-selector> .d <end-of-complex-selector-list>).

The second, which we actually use for matching, uses the same complex selector representation, but is otherwise completely different. In particular, this representation rests pretty firmly on the concept of the (complex) selector as a unit; if an author writes .a, #id { … } in the former, that becomes two entirely distinct rules in the latter. We also store some information like hashes of ancestor elements (for quick rejection of rules where the subject matches but the ancestors do not) in this representation, and the original flat array is completely broken up and is scattered in bits and pieces.

Supporting the current spec is super-simple for us. Whenever we see &, we treat is nearly identical to an :is(<mumble>) selector; it's just that instead of following the pointer to a new list, we store the pointer to the parent rule instead, and when it's time to match, we just go to its selector list, just as if we had an :is. Nearly everything else falls out of that; specificity, parent rejection, and so on.

Now let's look at the case where we are to treat & as just a selector list paste, under the example in question. There are basically two options here: expanding the selectors during the conversion from the first to the second form, or not. Let's look at the second first. Consider a case such as .a { .b & .c { … }}. The inner rule would be stored as .b <descendant> & .c <end-of-complex-selector-list>. So when matching, generating specificity, descendant hashes etc., we'd need to see the &, make a sort-of goto (or gosub!) jump into the upper rule, and then when seeing the end of the complex selector list in the parent, jump back. This is possible, but it's going to be pretty slow; right now, moving along this list is extremely fast (always just a pointer add), and now we'd introduce a branch. It's feasible, but it would slow down selector handling universally for us.

However, the bigger problem here is selector lists in the parent. I can't find any usable ways of dealing with that at all. Not only would we need to know ahead of time that we'd need to start matching the inner rule twice, and then know which of the two parents to jump to; we wouldn't know where to store the rule in the first place. (We absolutely need two selectors .a and #id to be stored in different places; where would we store & { … } if the parent was .a, #id { }?)

So we need to expand; there's no way around it. I see that people think this is a lower cost for us than Sass, but it absolutely isn't. The wire cost is real, but it can be compressed, and our memory representation is going to be much larger than what you'd see on the wire even without compression—and Sass only needs to run once, whereas we'd pay the CPU costs on load (when we recreate the second form, either due to load or a stylesheet change) and the RAM cost for as long as the page is loaded. I also see people claim that both the current spec and an expansion-like form are “multiplicative”, but that is entirely wrong; the current spec is additive (n rules always cost O(n) memory, O(n) setup cost and usually O(1) match cost—O(n) at most) and expansion is indeed multiplicative, leading to exponential behavior when chained (n rules can cost O(2^n) memory, O(2^n) setup cost—match cost is still usually O(1), but now O(2^n) at most).

Authors already know next to nothing about style performance. If you give them this kind of behavior, it's just a huge footgun, and there's no way they will understand that their innocent CSS blows up completely in the browser.

We could try to put limits on this, but it's not entirely clear how that would work. Where would these limits be enforced? What would happen if they are violated; would rules stop matching silently? What happens when some of the rules get modified in CSSOM; does the modification fail, or does something else happen?

If we didn't have complex selector lists in the parent, we wouldn't have these problems. We could expand; it would cost us some memory and be a bit slower in the setup phase, but it would be restricted to rules that actually use nesting. Even without parent lists, I don't really think trying to match & directly by goto/gosub would be a very attractive option, for speed reasons (and I can imagine WebKit's JITing of selectors would also potentially run into issues here, without ever having looked much at their JIT).

The third alternative is the infamous Sass heuristics, but if so, someone would have to specify what those are if we are to discuss them. And again, remember that they'd have to be efficiently implementable very quickly (whenever a stylesheet loads or changes), so it's not a given that we can just copy Sass' implementation.

As a separate note, it is a problem to try to radically change the spec basically at the moment we are trying to ship it. This is a great way to discourage people from being the first to ship a feature.

@jakearchibald
Contributor

fwiw, I posted a couple of polls on the topic: Twitter thread, Mastodon thread.

@romainmenke
Member

WebKit has also largely implemented nesting in the past few weeks if I read the commits correctly.

Would be good to hear from them too.

@bradkemper
Contributor

bradkemper commented Jan 16, 2023

Probably matches author intuition worse when used with the sibling combinators like in .a .b { .c + & {...}}, where it produces .c + .a .b.

I've been using SASS significantly for only a year or two, and this did and does match my intuition. I think it is a LOT easier to understand than the implicit :is(&) version, which would select things I hadn't really considered.

@sesse makes some really good points though, and making a change like this now would seem to be something that will significantly delay the release and/or make it much less performant.

I could probably get used to the non-SASS way, but converting old SCSS files is going to be harder than I was imagining.

@lilles
Member

lilles commented Jan 17, 2023

WebKit has also largely implemented nesting in the past few weeks if I read the commits correctly.

Would be good to hear from them too.

Adding @anttijk

@jakearchibald
Contributor

jakearchibald commented Jan 17, 2023

Results of my mini-poll:

There were only ~350 votes per poll on Twitter, and ~150 on Mastodon. Mostly limited to folks who follow me.

Folks were pretty divided when it came to:

.a .b {
  .c + & {
    background: green;
  }
}

Twitter:

  • .c + :is(.a .b) 45%
  • .c + .a .b 40%

Mastodon:

  • .c + :is(.a .b) 36%
  • .c + .a .b 48%

Note: I didn't present these selectors as options. Instead I described the behaviour.

There was a stronger preference when it came to:

.a .b {
  .c & {
    background: green;
  }
}

Twitter:

  • .c :is(.a .b) 33%
  • .c .a .b 48%

Mastodon:

  • .c :is(.a .b) 18%
  • .c .a .b 47%

@cdoublev
Collaborator

I would have been interested to see the results for .a .b { .c& {} } and .a .b { &.c {} }.

@anttijk

anttijk commented Jan 18, 2023

To me inserting an :is() that is not visible anywhere in the selector is a surprising behavior. I wonder how many developers can correctly tell what .c :is(.a .b) actually matches.

WebKit implements this feature by desugaring the selectors (rather than having a "parent selector" visible in selector matching) so can easily adapt to changes.

@sesse
Contributor

sesse commented Jan 18, 2023

WebKit implements this feature by desugaring the selectors (rather than having a "parent selector" visible in selector matching) so can easily adapt to changes.

What do you do if there's a selector list in the parent? Do you get exponential RAM usage if many of these are nested?

@anttijk

anttijk commented Jan 18, 2023

The current implementation desugars with :is() (per spec) so it is not an issue at the moment. You are right that dealing with lists in the parent while avoiding a potential combinatorial explosion would require some specific handling (an :is()-like internal selector that doesn't escape the context?).

On the other hand I'm not sure how important this really is in practice. To get the same behavior without nesting you'd have to write out each combination by hand.

@sesse
Contributor

sesse commented Jan 18, 2023

The current implementation desugars with :is() (per spec) so it is not an issue at the moment.

Won't it still be?

.a, .b, .c, .d {
  & &, & &, & &, & & {
    & &, & &, & &, & & {
      & &, & &, & &, & & {
        & &, & &, & &, & & {
          color: red;
        }
      }
    }
  }
}

If you desugar, you'd end up with exponential :is() lists and thus RAM usage, right?

@anttijk

anttijk commented Jan 18, 2023

Yeah, you can definitely break it. There are many ways to consume all memory in the web platform if you want.

In practice there would be limits.

@bramus
Contributor

bramus commented Jan 18, 2023

(@anttijk) I wonder how many developers can correctly tell what .c :is(.a .b) actually matches.

A quick informal ask-around showed that a handful of devs I know didn't realize it behaves the way it does. They all thought it didn't really make a difference and would act the same as .a .b .c.

Simultaneously I received this feedback from Ana Tudor, a well respected voice within the developer community who’s known for pushing the capabilities of CSS to its limits:

I don't see the logic behind ever using .a :is(.b .c) (unless what you really want is to catch the .b .a .c case too), whether you manually write it like that or it's generated from the way you nest things.

@zealvurte

If it remains with an implicit :is(), and there's no way to achieve the alternative within nesting, then it doesn't support what is probably the larger use case for nesting (even if it is the less powerful one). It would mean nesting will be used less, this spec will be less useful, and an alternative solution will have to be created to fill that gap before it sees widespread use.

Changing it will allow both options to work, be more expressive, and easier to read naturally (for the simple fact :is() usage will be explicit). It will delay rollout, which would be a benefit in this scenario, as it means we won't end up with an afterthought of a solution tacked on later, that would likely add further confusion on what already will exist (authors are still playing catch-up on understanding :is() as it is).

In both cases, we can live without nesting indefinitely, by using more verbose selectors and preprocessors, so there should be no impetus to push this forward in a state that doesn't fulfil what developers need from it.

Clearly it's going to scupper current progress on implementation work, and require some new thinking on how to implement without excessive resource consumption, but this is something that will have to be achieved (as per :has()) sooner or later, and I firmly believe delaying things and changing this now will be healthier for CSS moving forward.

@sesse
Contributor

sesse commented Jan 18, 2023

Changing it will allow both options to work, be more expressive, and easier to read naturally (for the simple fact :is() usage will be explicit). It will delay rollout, which would be a benefit in this scenario, as it means we won't end up with an afterthought of a solution tacked on later, that would likely add further confusion on what already will exist (authors are still playing catch-up on understanding :is() as it is).

You are missing an important detail: Changing it will most likely make sure it isn't rolled out at all. Not delay it; stop it. It's unlikely we'll want to implement something that blows up into exponential memory use so easily and without warning, and if nothing else is acceptable to web developers, seemingly the only solution is to do nothing.

In both cases, we can live without nesting indefinitely, by using more verbose selectors and preprocessors, so there should be no impetus to push this forward in a state that doesn't fulfil what developers need from it.

Yes, it sounds increasingly like we should just declare css-nesting-1 a painful learning experience, delete it permanently and go do something else.

@lilles
Member

lilles commented Jan 18, 2023

There's a restriction in Sass, not in the current nesting spec, that the & has to be the leftmost part of a compound. That could make it a bit easier to implement it with a synthesized special :is()-like that keeps track of the element matching the leftmost compound inside the :is()-like and continues adjacent/descendant matching to the left using that element as the candidate, but that wouldn't easily make specificity match. We would have to compute the specificity dynamically by matching all sub-selectors to find the one with the highest specificity (assuming the specificity behavior also has to match current Sass).
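
For concreteness, a sketch of how specificity diverges between the two desugarings (my own example, not from the thread):

.a, #b {
  & .c {}
}

/* :is() desugaring (current spec): :is(.a, #b) .c
   → specificity (1,1,0) regardless of which branch matched          */
/* Per-branch expansion (Sass-style): .a .c → (0,2,0), #b .c → (1,1,0) */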

@Loirooriol
Contributor

as per :has()

I don't think the current spec is inconsistent with :has() if that's what you mean.
If you have .a:has(.b .c), then it's kinda like .a:has(& .b .c) [*]. So it's normal that it checks for .a .b .c.
And .a:has(:is(.b .c)) is like .a:has(& :is(.b .c)) so it checks for a .a :is(.b .c), which is different.

[*]: #7211 disallowed &/:scope in that sense, but imagine that's possible

Basically, this issue only has implications for relative selectors where the reference to the anchor element is not in the left-most compound selector, which can't happen with :has().

If it remains with an implicit :is(), and there's no way to achieve the alternative within nesting, then it doesn't support what is probably the larger use case for nesting

Most nested selectors have & at the beginning, so they are not affected by this.

easier to read naturally

The natural thing is that .a + & selects some element preceded by .a. I guess that SASS has another behavior because back then :is() didn't exist and being a preprocessor it couldn't do better.

@dbaron
Member

dbaron commented Jan 18, 2023

My general sense here is that this is proposing to put something that is very preprocessor-ish and not very CSS-ish directly into CSS. But it's been a longstanding preprocessor feature, so people are accustomed to it being preprocessor-ish.

On the flip side, doing it the preprocessor-ish way means that (using :is()) developers can have both options, if they want. However, when choosing between the options, the thing that feels to me like a more natural fit for CSS is the one with the weird syntax.

However, I agree with @sesse that the lack of an efficient in memory representation for the selectors that would result from the change proposed here would be a major problem. (We don't know for sure that it's not possible to do -- but we don't yet have a demonstration that it is possible to do -- and to do in a way that isn't unacceptably complex.) I would be extremely uncomfortable resolving to do this without an explanation of how engines are expected to implement it in a way that is efficient in both time and memory, and agreement from implementors across multiple engines that such an approach would work. I think we should avoid adding more performance cliffs into CSS where using simple-looking features can cause extremely poor performance (in speed or memory use).

@romainmenke
Member

romainmenke commented Jan 18, 2023

So to summarize:

At this time it does not seem realistic to implement nesting in a way that is so close to nesting in preprocessors that it is indistinguishable for most users/projects.

Preprocessors can change their implementation to match nesting as currently specified and deal with all the fallout from such a breaking change. This is a short term problem.

But if authors agree that the resulting feature is less useful it means that implementing nesting in browsers has taken something away from authors instead of adding a new feature.
This is a long term issue.

Preprocessors could choose to keep their implementation and always desugar to unnested CSS. But this splits the ecosystem (dev tooling, learning resources,...).

Yes, it sounds increasingly like we should just declare css-nesting-1 a painful learning experience, delete it permanently and go do something else.

Leaving it as a preprocessors only feature might be better for authors?

@anttijk

anttijk commented Jan 18, 2023

I think there is also a real-worldish performance concern in current engines with the :is() insertion approach. When you have :is(a, b) c you can't use the ancestor bloom filter to reject the selector quickly outside the a and b subtrees. The ancestor filter is often the optimization that enables decent performance with complex stylesheets.

@sesse
Contributor

sesse commented Jan 18, 2023

Nothing really stops you from doing an OR query against the bloom filter, though (though we don't do that currently, and for very complex cases, it may not be feasible).

@anttijk

anttijk commented Jan 18, 2023

Right, but that is equivalent to desugaring all the variants of the selector.

@sesse
Contributor

sesse commented Jan 18, 2023

No, it's not? It's a fuzzy query. For something like

:is(.a, .b, .c) :is(.d, .e, .f) div { … }

This will cause at most six lookups in the bloom filter, but desugaring will create at most eighteen (nine rules, each doing two checks).

@anttijk

anttijk commented Jan 18, 2023

You are right. I'm not sure how easy it is to expand these optimizations at all without affecting performance with the existing content.

(Note that if you don't generate all the same permutations the filtering performance may be substantially worse compared to the desugared case. Say if most of the tree is within .a here but others are rare.)

@sesse
Contributor

sesse commented Jan 18, 2023

I agree, and it's not something we've been spending much thought on, beyond the obvious optimization of :is() with a single selector inside (which we treat generally the same as that selector, both for bucketing and for Bloom filters). But at least we believe this is possible to do if it should become a performance problem in the future (i.e., users commonly write nesting selectors with lists in the parent selector), whereas it's not obvious at all how to implement an unbounded desugaring model without risk for blowup (in the same situation).

@zealvurte

You are missing an important detail: Changing it will most likely make sure it isn't rolled out at all. Not delay it; stop it. It's unlikely we'll want to implement something that blows up into exponential memory use so easily and without warning, and if nothing else is acceptable to web developers, seemingly the only solution is to do nothing.

Not missing it, I'm just not convinced the implementation is such a hurdle that it won't eventually be overcome under the weight of developer needs, so I don't think it's a valid argument for the simpler option to be locked in and rolled out. If implementation stops, but one day starts again, is that not a delay? (I have no doubt it is a difficult challenge that requires lots of rethinking, and potentially large architectural changes that are simply not workable for implementers currently.)

I don't think the current spec is inconsistent with :has() if that's what you mean.

No, I was just comparing this situation as a developer desire for a feature that was long thought too expensive to implement, but a way to achieve it was eventually found.

Most nested selectors have & at the beginning, so they are not affected by this.

That's fair for the most part, but I wouldn't like to see future emerging patterns and features hindered by that metric.

The natural thing is that .a + & selects some element preceded by .a. I guess that SASS has another behavior because back then :is() didn't exist and being a preprocessor it couldn't do better.

Where does this interpretation of "natural" reading come from? Is there something already in CSS I'm missing that would suggest & should be the selected element, and not the selector? I know the latter is heavily influenced by preprocessor experience, but it also feels natural for what most want nesting for (sugar for reducing multiple selectors). Functional pseudo-classes and :scope are the exceptions to the usual reading of a selector, so it seems odd to me that there would be an assumption that nesting is also an exception without it being explicitly shown which type of exception it is by using one of those.

It would seem the current behaviour is a big foot gun, requiring a large list of "gotchas" in learning resources (especially with the selectors being forgiving unlike most of CSS).

Leaving it as a preprocessors only feature might be better for authors?

Avoiding quoting your entire comment, I'm not against this, but it has lots of short and long term issues that you touched upon.

If this spec is to go ahead as it is, it serves a mostly different purpose to what preprocessors are doing (which I understand was the original intent, but this was certainly co-opted for its similarity to preprocessor nesting as a desirable feature), which means the syntax needs to clearly differentiate itself from preprocessor nesting, so that they can easily coexist and not cause confusion (which many clearly didn't know would need to be the outcome when feeding back on syntax).

I imagine that means either picking one of the less favourable syntax options, or preprocessors changing their syntax moving forward, but I still think someday we will end up in a situation where both exist in CSS itself, so may as well plan carefully for that.

To avoid going too off topic here, see my proposed solution in #8329.

@jimmyfrasche

Is it possible to define a grouping operator similar to :is except that .a :group(.b .c, .d .e) .f = .a .b .c .f, .a .d .e .f and then use that as the implicit wrapper?

@mirisuzanne
Contributor

I agree that authors familiar with preprocessor nesting will (at first) be surprised by this in the situations where it comes up - which is not the majority of nesting, but also not rare. But I'm not convinced by the arguments that one approach is more intuitive than the other outside that existing reference, or that authors would have to imagine an :is() selector that isn't there, or that the currently-specified approach is a non-starter for authors once we're used to it.

Rather, the current approach:

  • Makes good sense if we stop thinking of & as a text-insertion point, and think of it much more like :scope which already refers to a matched element, much like using :is().
  • Can be 'how CSS nesting works' in a way that helps people also understand 'how CSS :is() works' and 'how scope works' without any of them requiring you to imagine or insert the other.

In Chromium browsers, with Experimental Web Platform features turned on, all but the final selector here match the same elements (see codepen demo):

/* Similar to `.a :is(.b .c)` currently */
.b .c { 
  .a & { 
    background: cyan;
  }
}

/* A scoped selector in the same shape */
@scope (.b .c) {
  .a :scope {
    border-style: dotted;
  }
}

/* Reversing the shape, with .a as the scope */
@scope (.a) {
  /* only the right-most matched element must be in-scope */
  .b .c {
    border-color: maroon;
  }

  /* can be narrowed more explicitly */
  :scope .b .c {
    font-weight: bold;
  }
}

This would change how people use nesting, and it will take some time for authors to get used to it - with :is(), @scope, and also & - similar to the way we've had to re-think variables as custom properties. I'm not sure that needs to derail the project.

@tabatkins
Member Author

Yeah, I'm going to withdraw this suggestion, for a few reasons.

  1. While I think the behavior with .a .b { .c & {...}} is reasonable, I think .a .b { .c + & {...}}'s behavior is fairly bad under this suggestion, and .a .b { & + & {...}} completely nonsensical.
  2. The currently-specced behavior matches the behavior of other nested approaches, notably @scope, and being able to transfer rules to/from @scope with minimal editing was one of the arguments made for the current syntax in the first place.
  3. While there's a certain degree of selector-length blowup as nesting depth increases in the current spec, this suggestion makes the blowup much worse. (Still not as bad as trying to match the current spec without :is(), which causes a combinatorial blowup rather than just multiplicative, but still.)

The only thing that gives me pause is that this suggestion's behavior can't be reproduced manually (without just unnesting entirely), while if we switched to this suggestion then the current spec's behavior can be easily opted into. I think it would be valuable to pursue having a way to opt into this suggestion's behavior, and will open a fresh issue for that.

@bramus
Contributor

bramus commented Jan 19, 2023

For completeness, to complement Miriam’s post above, note that the .c in the markup below is also matched by all but the final selector of that CSS snippet:

<div class="a b">
  <div class="c">nesting & scope</div>
</div>

To make it bold, a slight adjustment to that last selector is needed: remove the descendant combinator between :scope and .b, i.e.:

@scope (.a) {
  :scope.b .c {
    font-weight: bold;
  }
}

See forked codepen demo

EDIT: Oh, looks like this just crossed Tab’s closing of this issue
