
prevent duplicate json entries in sync-requires.js #4679

Closed · wants to merge 1 commit

Conversation

@docwhat (Contributor) commented Mar 23, 2018

This mitigates the dreaded "deoptimised" message from Babel by
preventing the `sync-requires.js` file from creating duplicate entries in the
`exports.json` hash.

```
[BABEL] Note: The code generator has deoptimised the styling of
"/Users/docwhat/Play/WebSites/docwhat/.cache/sync-requires.js" as it
exceeds the max of "500KB".
```

This does not fix the problem with the queue of `*.js.<seconds-since-epoch>` files being created.
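
For illustration only, a minimal sketch of the kind of guard this change describes, assuming a `pages` list whose entries carry a `jsonName` field; the names and data shape here are assumptions, not the actual Gatsby patch:

```
// Sketch only -- not the actual Gatsby code; it shows the idea behind the change:
// skip entries whose jsonName has already been added, so the exports.json hash
// written into .cache/sync-requires.js contains each entry only once.
const pages = [
  { jsonName: `index.json` },
  { jsonName: `about.json` },
  { jsonName: `index.json` }, // a duplicate that would otherwise bloat the file
]

const seen = new Set()
const jsonEntries = []

pages.forEach(page => {
  if (seen.has(page.jsonName)) return // already emitted; skip the duplicate
  seen.add(page.jsonName)
  jsonEntries.push(`  "${page.jsonName}": require("./json/${page.jsonName}")`)
})

const syncRequires = `exports.json = {\n${jsonEntries.join(`,\n`)}\n}`
console.log(syncRequires)
```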
@gatsbybot (Collaborator)

Deploy preview for gatsbygram ready!

Built with commit f2c7402

https://deploy-preview-4679--gatsbygram.netlify.com

@gatsbybot (Collaborator)

Deploy preview for using-drupal ready!

Built with commit f2c7402

https://deploy-preview-4679--using-drupal.netlify.com

@KyleAMathews (Contributor)

I'm not sure why this would be necessary — each page has a unique json name already and you fixed duplicate layout jsons being added — what JSON is being duplicated?

@docwhat (Contributor, Author) commented Mar 24, 2018

In the `(a)sync-requires.js` cache files... the `exports.json` hash ends up with lots of duplicated entries on my system when it is slow...

See my gist

@KyleAMathews (Contributor)

Check if the same duplicates show up as pages? E.g. look in `.cache/redux-state.json`.

@KyleAMathews (Contributor)

We're looping over a data object and pushing onto an array — slow i/o isn't going to change anything about that.

@docwhat (Contributor, Author) commented Mar 24, 2018

> Check if the same duplicates show up as pages? E.g. look in `.cache/redux-state.json`.

I ran `grep '"id"' .cache/redux-state.json | sort | uniq -c | sort -n` and didn't find any duplicates... What am I looking for?

@KyleAMathews (Contributor)

If you have duplicate json entries, it means you have duplicate pages — pages have to have unique json file paths — and we're just looping over them and pushing the JSON name on the array there. https://github.com/docwhat/gatsby/blob/f2c7402353979034563ab3148be876a79e2168d5/packages/gatsby/src/internal-plugins/query-runner/pages-writer.js#L39

So the only way that there's duplicate json entries is if there's duplicate pages. Unless there's something weird going on there I'm missing.
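
For context, the loop being described is roughly of this shape (a paraphrase of the behaviour, not the exact pages-writer.js source); each page contributes exactly one entry, so a duplicate in the output implies a duplicate page:

```
// Paraphrase of the behaviour described above, not the exact pages-writer.js code.
// One push per page: duplicates in `json` can only come from duplicate pages.
const pages = [
  { path: `/`, jsonName: `index.json` },
  { path: `/about/`, jsonName: `about.json` },
]

const json = []
pages.forEach(page => {
  json.push({ path: page.path, jsonName: page.jsonName })
})

console.log(json)
```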

@docwhat (Contributor, Author) commented Mar 24, 2018

Closing in favor of #4681

@docwhat closed this Mar 24, 2018
@docwhat deleted the pr/duplicate-sync-requires branch May 2, 2018 20:08