Closes #1722
When you save an entry (or term, or asset, etc.) it updates the search indexes.
It looks through all the configured indexes and checks whether the entry belongs in each one.
It did this by getting all the items that should be in the index and checking whether that entry was in that array. Stupid in hindsight, but it was simple to write originally and I wasn't testing with enough content to notice a performance issue.
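Roughly speaking, the old check was equivalent to something like this (a sketch only — the names are made up, not the actual code):

```php
<?php

// Illustrative sketch, not the real Statamic implementation.
// Old behaviour: load every searchable the index covers, then do a membership check.
function shouldIndexOld(object $item, callable $getAllSearchables): bool
{
    // Fetches every entry/asset the index covers — this call is what gets expensive.
    $all = $getAllSearchables();

    return in_array($item, $all, true);
}
```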
When you have a butt-tonne of entries (or assets, as per the issue), it has to do a lot of work to get all of them.
The problem is compounded when your assets are on S3, which is even slower.
This check has been reworked to evaluate the item against the index's rules directly, so it no longer needs to load all the searchables.
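Conceptually, the new check does something like the following (again just a sketch — the rule shape here is hypothetical, not the actual implementation):

```php
<?php

// Illustrative sketch — rules are modelled as simple predicates for the example.
// New behaviour: test the single saved item against the index's rules.
function shouldIndexNew(object $item, array $rules): bool
{
    // The item belongs in the index only if it passes every rule.
    foreach ($rules as $rule) {
        if (! $rule($item)) {
            return false;
        }
    }

    return true;
}

// e.g. a hypothetical index that only covers blog entries:
$rules = [
    fn ($item) => $item->type === 'entry',
    fn ($item) => $item->collection === 'blog',
];

$entry = (object) ['type' => 'entry', 'collection' => 'blog'];

var_dump(shouldIndexNew($entry, $rules)); // bool(true)
```

No other searchables are touched, so the cost of the check no longer scales with the size of the index.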
Saving an entry on my 14,000-asset test site went from 7 seconds down to 150ms.
The code is not the prettiest, but I'll take that over the load time.