Proposal
I wonder whether it might be worth considering setting up some kind of process to fuzz (along the lines of this) any changes to the compiler that could affect either:

- The treatment of existing syntax
- The treatment of new syntax
I've noticed a pattern of longstanding compiler oddities or inconsistencies as well as of regressions and unintended features accidentally introduced alongside new syntax, both of which might have been caught earlier by fuzzing—that is, generating random programs (FsCheck? Hedgehog?) and asserting whether and how they should compile.
Even some kind of tool that could be run by hand on demand feels like it might be better than nothing.
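To make that concrete, here is a minimal sketch of the shape such a check could take, using FsCheck to generate random nested expression snippets and FSharp.Compiler.Service to parse them. This is an illustrative assumption rather than a worked-out design: the toy grammar, the property, and the names `genExpr` and `parsesCleanly` are all made up for this sketch.

```fsharp
// A rough sketch only, not an existing tool. Assumes the FsCheck (2.x-style API)
// and FSharp.Compiler.Service packages are referenced, and a compiler recent
// enough to parse `_.` dot-lambdas.
open FsCheck
open FSharp.Compiler.CodeAnalysis
open FSharp.Compiler.Text

let checker = FSharpChecker.Create()

/// Parses a snippet as a standalone script and reports whether parsing succeeded.
let parsesCleanly (source: string) =
    let parseResults =
        checker.ParseFile(
            "Fuzzed.fsx",
            SourceText.ofString source,
            { FSharpParsingOptions.Default with SourceFiles = [| "Fuzzed.fsx" |] })
        |> Async.RunSynchronously
    not parseResults.ParseHadErrors

/// Toy grammar: expression snippets that should always parse, nested to a random depth.
let rec genExpr depth =
    if depth <= 0 then
        Gen.elements [ "()"; "1"; "\"s\""; "id" ]
    else
        gen {
            let! inner = genExpr (depth - 1)
            let! wrapped =
                Gen.elements
                    [ $"({inner})"
                      $"_.Equals({inner})"
                      $"fun x -> {inner}"
                      $"match ({inner}) with _ -> ()" ]
            return wrapped
        }

/// Property: everything the toy grammar produces parses without errors.
let wellFormedSnippetsParse =
    Prop.forAll (Arb.fromGen (Gen.sized genExpr)) (fun expr ->
        parsesCleanly $"let _ = {expr}")

Check.Quick wellFormedSnippetsParse
```

A real version would presumably want a much richer grammar, negative properties (inputs that must not parse, or must parse the same way as in the previous release), and a way to minimize and archive counterexamples in the test suite.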
Problem examples
Here is a quick, non-exhaustive list of examples that fuzzing might have helped catch sooner:
- `static member` can be defined without `member` keyword, in unexpected way #16342
- `~` in expressions compiles, but shouldn't #16135
- `_.` in dot-lambdas #16136
- `new`) #16257
- `()` → `(())` required in certain scenarios #16254

A language like F# is perhaps more prone to this kind of problem than many other programming languages, since most of F#'s features are designed to be composable—the language's expression- and pattern-based nature, and the fact that expressions, patterns, and types can be nested inside of themselves and each other, mean that the language's syntagmatic surface area is already quite vast, and every new or updated construct makes it bigger still.
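As a toy illustration of that composability (the function and types here are arbitrary), even one small definition interleaves patterns, guard expressions, nested matches, and comprehensions, and each of those positions has to cope with every other construct appearing inside it:

```fsharp
// Arbitrary example: a pattern containing a pattern, guarded by an expression
// containing a match, returning a comprehension, with types woven throughout.
let describe (input: (int * string) option array) =
    match input with
    | [| Some (n, s) |] when (match n with 0 -> false | _ -> true) ->
        [ for i in 1 .. n -> sprintf "%s-%d" s i ]
    | _ -> []
```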
I think it's inevitable that we humans will fail to consider all of the potential syntactic implications of any given change to the parser or addition of a new syntactic construct.
Some questions

- How do other languages handle this kind of problem?
- Is it feasible to build and maintain such a tool, or is it known not to be a worthwhile undertaking?