proposal: spec: treat s[-1] as equivalent to s[len(s)-1] #25594
Comments
Similar proposal in the past: #20176
Also #11245, but this variant is, on the one hand, better because it doesn't accept variables, but, on the other hand, worse because why should we only accept constants?
@mvdan yes, this proposal was a follow-up to a suggestion mooted during the conversation in #20176. I should have noted that. @ianlancetaylor indeed. The best argument I see for the discrepancy is one of practicality: variables (and constant expressions) and constant literals should be treated differently because doing so affords increased readability without increasing the probability of silent bugs. But I agree that that isn't a particularly compelling argument, given the small proportion of code that matches the pattern.
How does adding an exception to an expression (-1 in some cases is to be read as len(foo)-1) improve readability? Please explain; for me it's quite the opposite.
@gopherbot please close this issue in 48 hours. (That obviously doesn't work. But I wish it did. I'll be the bot.)
@cznic the same way all syntactic sugar does: by providing a mental shortcut for common uses. Instead of having to parse out everything and confirm equivalence of the inner/outer expressions, I see the intent at a glance.
That would be more natural if the indices of the slice …
I don't think it's usually that big of a deal, but I've sometimes wanted this feature for cases where I'm indexing into a slice that isn't bound to a variable. For example, given … Not the best examples, perhaps, but I think that it could be useful in these cases.
I'd like to re-open this. I wrote it six years ago(!) only to dismiss it. But over the years, talking to gophers, this comes up over and over as a common ergonomics complaint. I propose the narrowest possible addition: special handling for the literal -1.
Now that there's a slices package we could have something like …
Or …
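The helper the commenter has in mind might look something like the following. This is a hypothetical sketch: the standard slices package does not provide such a function, and the name `Last` is an assumption.

```go
package main

import "fmt"

// Last returns the last element of s. It panics if s is empty.
// (Hypothetical helper; not part of the standard slices package.)
func Last[S ~[]E, E any](s S) E {
	return s[len(s)-1]
}

func main() {
	fmt.Println(Last([]int{1, 2, 3})) // 3
	fmt.Println(Last([]string{"x", "y"})) // y
}
```

A library function like this avoids any language change, at the cost of not helping with slicing forms such as s[:len(s)-1].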
To summarize some of my concerns with
I support a language change that allows indexing with negative literals. Restricting it to just literals avoids any possibility of arithmetic bugs, since there's no arithmetic involved. The presence of …
Copying over some of my usage analytics from #53510: Frequency of constant slice indexes relative to start or end:
Even though obtaining the last element is an order of magnitude less frequent than getting the first element, there are still ~300k occurrences. Also, getting the last element is still 1/4 as common as arbitrary indexing expressions.
I'm a bit on the fence on this. My worry is the curse of knowledge: is there a way to check that someone who has never coded in Go, looking at a codebase, can intuit what a negative slice index does? Other than that, if it's only literals as proposed, I guess that could work. But this has never truly bitten me in my own code. Or at least I don't remember.
Should slicing syntax also support negative indexes? Such as …
I believe yes. The original proposal said that:
and I would assume the same. Truncating off the last element is also a fairly common operation that I've personally needed to do (and yet another reason why …)
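The truncation operation mentioned above looks like this in current Go; the `s[:-1]` spelling is the hypothetical form this thread is debating, shown only in a comment.

```go
package main

import "fmt"

func main() {
	s := []int{1, 2, 3, 4}

	// Truncating off the last element today:
	s = s[:len(s)-1]
	fmt.Println(s) // [1 2 3]

	// Under the proposal this would hypothetically be written:
	//   s = s[:-1]
}
```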
It's familiar enough from other languages like Python, Ruby, etc., and "why doesn't the language support this?" is one of the main questions asked about languages that don't.
@doggedOwl not convinced. Searching "python negative index", the first result is a Stack Overflow question: … In this day and age of advanced code completion, we could avoid having to explain this. Then again, just like how we have …
I thought about what the semantics should be for various library functions, then realized I've written these things enough times, and threw it all in a repo so I can use it later: https://pkg.go.dev/github.com/jimmyfrasche/sidx. I wouldn't expect most of that to be in std.
Golang is supposed to be simple and concise; that was the idea as far as I know. As for Stack Overflow: I used it pretty actively for the first year or so after I discovered programming as a kid, but now there's nothing for me to ask there, not because I know everything (😅), but because I discovered Google and the search function on Stack Overflow itself. Looking at most of the questions there, it seems obvious that they are asked by people who haven't even touched any kind of a "tour" for the language they are asking about. And I doubt that, after learning something new and handy, they were disappointed about the learning curve or anything like that. Personally, when I discovered negative indices in Python, I was happy to learn about them and to use them instead of the cumbersome low-level construction of calculating length minus one.
Note: this was declined by review in #33359.
One interesting thing about the … Negative indices are therefore just the extension of that principle to lower numbers, wrapping around to the end of the slice. However, I suppose to be an adequate answer to the original use case it would also need something like:

```go
package slices

// At returns the element from s at the given index modulo the slice length.
func At[Slice ~[]E, E any](s Slice, index int) E

// SliceAt returns a subslice of s starting at start modulo the slice length and ending at
// start+length modulo the slice length.
//
// If the requested subslice would cross from the end of s back to the start of s then
// the result is a slice over a newly-allocated backing array. Otherwise the result
// shares the backing array of s.
func SliceAt[Slice ~[]E, E any](s Slice, start, length int) Slice

// (No variation for the three-operand slice, because it doesn't really make sense
// to specify capacity modulo length.)
```

Both of these functions effectively treat the slice as an infinite series containing repetitions of the slice. The quirk of … I will pre-concede that earlier comments suggest that taking the final element is considerably more common than any other variation that isn't an already in-bounds index, and so this generality might not actually be justified.
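As a rough sketch of the wrap-around semantics described for `At`, a sign-corrected modulo is one way to implement it. This is illustrative only, under the assumption that indices wrap in both directions; it is not a standard library function.

```go
package main

import "fmt"

// At returns the element of s at index modulo len(s), so negative and
// out-of-range indices wrap around to stay in bounds. Illustrative
// sketch of the proposed semantics; panics if s is empty.
func At[Slice ~[]E, E any](s Slice, index int) E {
	// Go's % operator truncates toward zero, so the remainder of a
	// negative index is negative; shift it back into range.
	i := index % len(s)
	if i < 0 {
		i += len(s)
	}
	return s[i]
}

func main() {
	s := []string{"a", "b", "c"}
	fmt.Println(At(s, -1), At(s, 4)) // c b
}
```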
Overview
This is a backwards-compatible language change proposal. It will not be as thorough as it could be, since I do not think it should be adopted. I am writing it up merely for future reference.
I propose to treat negative constant literals in slicing and indexing expressions as offsets from the end of the slice or array. For example:
s[-1] is equivalent to s[len(s)-1]
s[:-1] is equivalent to s[:len(s)-1]
s[:-2:-1] is equivalent to s[:len(s)-2:len(s)-1]
The motivation is to improve readability. Slice and index expressions like this occur commonly when treating slices as stacks.
Consider this code from the compiler:
Using the proposed syntactic sugar, it would read:
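The actual compiler snippet is not reproduced above; the following illustrative stack-pop shows the pattern the proposal targets, with the hypothetical sugared form in a comment.

```go
package main

import "fmt"

func main() {
	stack := []int{10, 20, 30}

	// The common stack idiom today: read the top element, then truncate.
	top := stack[len(stack)-1]
	stack = stack[:len(stack)-1]
	fmt.Println(top, stack) // 30 [10 20]

	// Under the proposal, the same pop would hypothetically read:
	//   top := stack[-1]
	//   stack = stack[:-1]
}
```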
Scope
The proposed sugar would only apply to negative constant literals.
The rationale for b[v] to panic at runtime when v is negative is that negative indices often arise from overflow and programmer error. The same rationale discourages allowing constant expressions such as b[c] as syntactic sugar: constant expressions can be complex, non-local, and build-tag controlled. However, a constant literal displays clear, obvious, local intent, is overflow-free, and affords little room for programmer error.
Order of evaluation and side-effects
Currently, the slice/array is evaluated before any index or slice indices. See https://play.golang.org/p/kTr9Az5HoDj.
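That ordering can be observed directly. This small illustrative program (equivalent in spirit to the linked playground snippet) logs when each operand is evaluated; per the Go spec, function calls are evaluated in lexical left-to-right order, so the slice operand is evaluated before the index.

```go
package main

import "fmt"

func slice() []int {
	fmt.Println("slice evaluated")
	return []int{1, 2, 3}
}

func index() int {
	fmt.Println("index evaluated")
	return 1
}

func main() {
	// Prints "slice evaluated" then "index evaluated".
	_ = slice()[index()]
}
```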
This allows a natural order of evaluation for the proposal. Given expr[-1] or expr[:-1], expr is evaluated exactly once, and its length is used in subsequent len(expr)-c calculations.
Data
A quick-and-dirty AST parsing program examining slice expressions suggests that such expressions occur, but not with particularly high frequency.
Running over GOROOT yields that 2.51% of slice expressions could be rewritten to use this syntactic sugar. Running over my GOPATH yields 3.17%. Searching for candidate index expressions yields 0.35% and 0.91% respectively.
This is an underestimate. In many cases, for clarity, the code will already have been rewritten like:
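The rewritten code in question is not shown above; one illustrative shape of it is hoisting the length into a local variable, which no longer matches the s[len(s)-1] pattern the analysis searched for even though it expresses the same idea.

```go
package main

import "fmt"

func main() {
	s := []int{1, 2, 3, 4, 5}

	// Equivalent to s[len(s)-1], but syntactically different, so a
	// pattern search for s[len(s)-1] would not count it:
	n := len(s)
	last := s[n-1]
	fmt.Println(last) // 5
}
```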
And all index expressions were considered candidates for this analysis, which includes map accesses and assignments.
Nevertheless, this analysis suggests that this sugar is unlikely to be transformative in how Go code is written, and therefore probably does not pull its weight.