I want to have my core subscribe to certain events, say "alerts", and then execute whatever I defined in the associated event handler. This should not be bound to a coreID; I want any core to react to all events named "alerts" once it has subscribed to that event name. However, when I POST to /v1/devices/events using the name "alerts", the event never actually gets pushed to the core. After adding a lot of logger.log calls in different parts of the code, I think I've isolated the problem to node_modules/spark-protocol/clients/SparkCore.js:
try {
    if (!global.publisher) {
        logger.error('No global publisher');
        return;
    }

    // publish() returns false when the core is considered over its event limit
    if (!global.publisher.publish(isPublic, obj.name, obj.userid, obj.data, obj.ttl, obj.published_at, this.getHexCoreID())) {
        // this core is over its limit, and that message was not sent.
        this.sendReply("EventSlowdown", msg.getId());
        logger.log('EventSlowdown triggered ' + this.getHexCoreID());
    }
    else {
        this.sendReply("EventAck", msg.getId());
        logger.log("onCoreSentEvent: sent to " + this.getHexCoreID());
    }
}
// (catch clause omitted from this excerpt)
It seems to me that global.publisher.publish always hits this limit. I haven't really understood how the publisher works, I have some trouble interpreting the code, and I might just be doing something completely wrong. If anyone else has something like this working, any advice or example is welcome; otherwise it feels like a bug to me :)
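In case it helps explain what I mean by "hits this limit": I imagine publish() returning false once some per-core counter passes a quota. This is a purely hypothetical sketch of that idea, none of these names come from spark-protocol:

// Purely hypothetical sketch of a per-core publish quota -- not the actual
// spark-protocol implementation, just how I picture the "EventSlowdown" case.
function EventQuota(maxPerWindow, windowMs) {
    this.maxPerWindow = maxPerWindow;
    this.windowMs = windowMs;
    this.counters = {};                       // coreID -> { count, windowStart }
}

EventQuota.prototype.allow = function (coreID) {
    var now = Date.now();
    var entry = this.counters[coreID];
    if (!entry || (now - entry.windowStart) > this.windowMs) {
        // start a fresh counting window for this core
        entry = this.counters[coreID] = { count: 0, windowStart: now };
    }
    entry.count += 1;
    // a false return here would correspond to the "EventSlowdown" branch above
    return entry.count <= this.maxPerWindow;
};

// e.g. allow at most 60 events per core per minute
var quota = new EventQuota(60, 60 * 1000);
console.log(quota.allow('some-core-id'));     // true until the quota is exceeded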
To make it clearer: I don't want the core to subscribe or react to events from other cores, I just want the cores to subscribe to a designated channel, "alerts", and decide what to do depending on the event data. The trigger should be a simple POST through the spark-server API (as defined in the spark-server docs) so that hubot scripts or whatever else can trigger these events, roughly like the sketch below.
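This is the kind of trigger I have in mind, e.g. from a hubot script. Host, port and access token are placeholders for my local spark-server setup, and the parameter names are what I'd expect from the API docs:

// Rough sketch of triggering the "alerts" event via the spark-server API.
var http = require('http');
var querystring = require('querystring');

var body = querystring.stringify({
    name: 'alerts',                 // the event name the cores subscribe to
    data: 'disk space low',         // payload handed to the event handler
    ttl: 60,                        // time-to-live in seconds
    access_token: 'YOUR_ACCESS_TOKEN'
});

var req = http.request({
    host: 'localhost',              // spark-server host (placeholder)
    port: 8080,                     // spark-server API port (placeholder)
    path: '/v1/devices/events',
    method: 'POST',
    headers: {
        'Content-Type': 'application/x-www-form-urlencoded',
        'Content-Length': Buffer.byteLength(body)
    }
}, function (res) {
    res.on('data', function (chunk) {
        console.log(chunk.toString()); // server reply
    });
});

req.on('error', function (err) {
    console.error('request failed: ' + err.message);
});

req.write(body);
req.end();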