
Googlebot compatible protobuf ajax request #661

Closed
robin-anil opened this issue Jan 26, 2017 · 7 comments

@robin-anil (Contributor)

protobuf.js version: any

This is more of a question. I have noticed that Googlebot runs some old, janky, or hacked-up version of the Chrome browser and is unable to crawl and parse our sites that use protobuf.js. We work around it with the prerender.io service, but that feels like a crutch to me.

Our AJAX implementation is based on https://github.com/dcodeIO/protobuf.js/wiki/How-to-read-binary-data-in-the-browser-or-under-node.js%3F and uses whatwg-fetch. It works in every browser we test, but Googlebot always has trouble with it. My guess is that Googlebot is pre-Chrome 26. Any ideas here?
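
For context, the fetch-based approach described on that wiki page boils down to roughly the following sketch (the endpoint and message type names here are made up, not our actual ones):

protobuf.load("my.proto", function(err, root) {
   if (err)
      throw err;
   // hypothetical type; substitute the actual package/type name
   var MyMessage = root.lookupType("mypackage.MyMessage");
   fetch("/api/data")                                      // whatwg-fetch polyfill where needed
      .then(function(res) { return res.arrayBuffer(); })   // requires ArrayBuffer support
      .then(function(buf) {
         var message = MyMessage.decode(new Uint8Array(buf)); // requires Uint8Array support
         console.log(message);
      });
});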


dcodeIO commented Jan 26, 2017

My first guess, based on your assumptions, would be that Googlebot might not (properly) support ArrayBuffer / Uint8Array / responseType="arraybuffer". One way to work around that (or around anything else that does not support one of these features) would be to use base64-encoded strings instead of binary data.
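
A rough sketch of that workaround, assuming the server exposes the same payload as plain base64 text at a made-up /api/data.b64 endpoint and that MyMessage has been looked up via root.lookupType(...), could use the base64 helpers that ship with protobuf.js:

var xhr = new XMLHttpRequest();
xhr.open("GET", "/api/data.b64", true);          // made-up endpoint returning base64 text
xhr.onload = function() {
   var b64    = xhr.responseText;
   var buffer = new Uint8Array(protobuf.util.base64.length(b64)); // or a plain Array of that length
   protobuf.util.base64.decode(b64, buffer, 0);  // decode the base64 string into the buffer
   console.log(MyMessage.decode(buffer));
};
xhr.send();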

dcodeIO added a commit that referenced this issue Jan 27, 2017

dcodeIO commented Jan 27, 2017

This commit tries to add binary data support to protobuf.util.fetch by specifying an options parameter, like this:

protobuf.util.fetch("my/binary/file.bin", { binary: true }, function(err, data) {
   // with data being a Uint8Array if supported, otherwise an Array of octets
});

While I found a reference on MDN describing how to prevent charset conversion, I have no idea whether this actually works. The key here is to always return a reader/writer compatible array while making sure that ancient browsers without responseType="arraybuffer" support do not perform any UTF-8 conversion whatsoever.
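
For reference, the charset-conversion trick MDN describes for old XMLHttpRequest implementations looks roughly like this; it is only a sketch of that technique, not the actual protobuf.util.fetch implementation:

var xhr = new XMLHttpRequest();
xhr.open("GET", "my/binary/file.bin", true);
if (xhr.overrideMimeType)                                      // not available everywhere
   xhr.overrideMimeType("text/plain; charset=x-user-defined"); // ask the browser not to convert the charset
xhr.onload = function() {
   var text = xhr.responseText,
       data = [];                                              // plain Array of octets
   for (var i = 0; i < text.length; ++i)
      data[i] = text.charCodeAt(i) & 0xff;                     // mask each char code down to one byte
   // data is now reader/writer compatible as described above
};
xhr.send();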


dcodeIO commented Jan 27, 2017

This probably won't be perfect yet (or ever), but it's on npm as 6.6.2 now.

@robin-anil (Contributor, Author)

Yeah, I tried this. It does not work in the Google crawler. I decided to do server-side rendering of our app and serve HTML to the Google crawler instead.

@robin-anil (Contributor, Author)

Server-side rendering works nicely.


dcodeIO commented Feb 2, 2017

Should also work with all sorts of crawlers then :)

@robin-anil (Contributor, Author)

Yeah, that is probably the right recommendation. We have seen a huge reduction in JS errors (they have almost flatlined) since crawlers started consuming HTML. I think this can be closed; the return on investment of weird JS hacks for old browsers is very small.
