
Max registration ID's #42

Closed
leearmstrong opened this issue Jan 29, 2014 · 9 comments

Comments
@leearmstrong

Will node-gcm fail if I send it more than 1,000 registration IDs in a single message, as that is all Google supports? Or do I have to split them up myself beforehand?

@ToothlessGear
Owner

You'll have to split your IDs beforehand, as node-gcm doesn't do that for you automatically yet.

@leearmstrong
Author

Perfect. Thanks!


@eladnava
Collaborator

eladnava commented Oct 4, 2015

If anyone else needs to support more than 1,000 devices, you can easily split the tokens up into batches like this:

// The caolan/async library (npm install async)
var async = require('async');

// Max devices per request
var batchLimit = 1000;

// Batches will be added to this array
var tokenBatches = [];

// Traverse tokens (your full list of registration IDs) and split them up into batches of 1,000 devices each
for (var start = 0; start < tokens.length; start += batchLimit) {
    // Get the next 1,000 tokens
    var slicedTokens = tokens.slice(start, start + batchLimit);

    // Add to batches array
    tokenBatches.push(slicedTokens);
}

// You can now send a push to each batch of devices, in parallel, using the caolan/async library
async.each(tokenBatches, function (batch, callback) {
    // Assuming you already set up the sender and message
    sender.send(message, { registrationIds: batch }, function (err, result) {
        // Push failed?
        if (err) {
            // Stop executing other batches
            return callback(err);
        }

        // Done with this batch
        callback();
    });
}, function (err) {
    // Log any error to the console
    if (err) {
        console.log(err);
    }
});
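
The snippet above assumes sender, message and tokens already exist. For completeness, a minimal setup might look roughly like this (the API key and payload are placeholders, not part of the original code):

var gcm = require('node-gcm');

// Your GCM server API key (placeholder)
var sender = new gcm.Sender('YOUR_GCM_API_KEY');

// The payload to deliver to every device (placeholder data)
var message = new gcm.Message({
    data: { hello: 'world' }
});

// Your full list of device registration IDs, e.g. loaded from your own database
var tokens = [ /* ... */ ];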

@hypesystem
Collaborator

I wrote parallel-batch which does pretty much that: https://www.npmjs.com/package/parallel-batch
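
A rough usage sketch for this thread's case (the exact call signature here is an assumption, so check the package README for the actual API):

var parallelBatch = require('parallel-batch');

// Assumed shape: parallelBatch(array, batchSize, worker, done)
parallelBatch(tokens, 1000, function (batch, callback) {
    // Send to one batch of up to 1,000 registration IDs
    sender.send(message, { registrationIds: batch }, callback);
}, function (err) {
    if (err) {
        console.log(err);
    }
});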

@eladnava
Collaborator

eladnava commented Oct 4, 2015

Nice package! Would greatly simplify the code I wrote.

@eladnava
Collaborator

eladnava commented Oct 4, 2015

@hypesystem maybe it would be a good idea to integrate your parallel-batch library into node-gcm, so that batching will be performed automagically?

@hypesystem
Collaborator

I agree, that was the original intention with parallel-batch. As it turns out, though, it's easier said than done.

Specifically, we want to return errors correctly, as if no batching had happened (if only some of the batches fail, the messages in the other batches may still have been sent), and we want to handle retries the way the user expects (which is also hard).
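
For illustration, merging the per-batch responses back into a single result (as if no batching had happened) could look roughly like the sketch below. Field names follow the standard GCM downstream response format; this is not node-gcm's actual behaviour, just an illustration of the idea.

// Merge an array of per-batch GCM responses into one combined result
function mergeResults(batchResults) {
    var combined = { success: 0, failure: 0, canonical_ids: 0, results: [] };

    batchResults.forEach(function (res) {
        combined.success += res.success;
        combined.failure += res.failure;
        combined.canonical_ids += res.canonical_ids;
        // Keep per-device results in the same order as the original token list
        combined.results = combined.results.concat(res.results);
    });

    return combined;
}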

@eladnava
Collaborator

eladnava commented Oct 4, 2015

I think it's doable -- if one of the batches fails, we'll retry it until we run out of retries. We just have to make sure that GCM doesn't deliver the push notification to some devices in the batch while erroring out -- that would cause some serious spamming. But in any case, that could already be happening today with fewer than 1,000 devices in sender.send.
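
A rough sketch of per-batch retries using async.retry (the retry count is arbitrary, and a real implementation would also have to inspect the per-device results so devices that already received the push aren't sent to again):

// Retry each failed batch up to 3 times before giving up
async.each(tokenBatches, function (batch, callback) {
    async.retry(3, function (retryCallback) {
        sender.send(message, { registrationIds: batch }, retryCallback);
    }, callback);
}, function (err) {
    if (err) {
        console.log(err);
    }
});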

@hypesystem
Collaborator

I would love to see you give it a try. First of all, though, I think we need a new issue to discuss this -- feel free to create it.

I will try to gather my thoughts on what behaviour I think we would want, and why, exactly, it is tricky :-)
