API Analytics Plugin #86
+1
@nijikokun @SGrondin @kennethklee @ahmadnassri, please tell us more about the ALF plugin. Where should the data be sent: HTTP, TCP? To what address? @thibaultcha will take care of the implementation. Renaming this issue to: API Analytics plugin.
All the info for the format is documented here. This ticket was not intended as the "API Analytics Plugin", but, as the original title suggested, as "default logging format in Kong should be in ALF / HAR". I see these as two different things entirely:
There is no such thing as "default logging". There is a plugin for each logging capability. ALF appears to be one of the available logging plugins, but not the default, because all plugins are equal... that is why this changed to the ALF logging plugin ticket. I knew about the ALF repo, but it does not provide info on how to send the data. Will follow up 1-1.
You have tcplog, udplog AND filelog plugins... what formats are those using? What I'm saying is: adopt a standard and follow it across all "native" logging plugins (the ones the core team builds). My suggestion is to make that standardized format HAR / ALF. If third parties want to build different logging plugins with different formats, great, but you should have a unified standard format to follow. Again, that was the point of this ticket; an API Analytics plugin is also needed, but just to clarify the original purpose.
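For reference, here is a minimal sketch of what such a standardized payload could look like, written as a Lua table. The entry fields follow the HAR 1.2 spec; the outer wrapper (`version`, `serviceToken`) is the ALF addition. The specific values and field subset shown are illustrative, not the full format:

```lua
-- Illustrative only: a minimal ALF object as a Lua table.
-- Entry fields follow the HAR 1.2 spec; values are placeholders.
local alf = {
  version = "1.0.0",
  serviceToken = "<your-apianalytics-token>",
  har = {
    log = {
      version = "1.2",
      creator = { name = "kong", version = "x.y.z" }, -- version illustrative
      entries = {
        {
          startedDateTime = "2015-06-01T12:00:00.000Z",
          time = 52, -- total elapsed ms
          request = {
            method = "GET",
            url = "http://api.example.com/users",
            httpVersion = "HTTP/1.1",
            headers = {}, queryString = {},
            headersSize = -1, bodySize = 0
          },
          response = {
            status = 200,
            statusText = "OK",
            httpVersion = "HTTP/1.1",
            headers = {},
            content = { size = 123, mimeType = "application/json" },
            headersSize = -1, bodySize = 123
          },
          timings = { send = 1, wait = 50, receive = 1 }
        }
      }
    }
  }
}
```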
I see your point. So rather than being separated by logging format, the plugins are indeed separated by the protocol they use to send the data. In that case, if I want to send data to apianalytics, can't I just use one of those and set the format to "ALF"? Or what other value would API Analytics give me, more than logging? What do we need to build now: an ALF serializer for our current logging plugins, or an API Analytics plugin? There is no clear roadmap.
Yes, the API Analytics integration can simply be the TCP plugin (potentially), or UDP, whatever... it's more about the format (and setting a token). Currently API Analytics is built on ZMQ, so TCP is certainly a preferred method. As long as the data is in ALF format and an API Analytics token is set, just send it over TCP and that's it. I could see this same setup being used for any TCP-based logging platform (Splunk), hence why I'm more interested in the format, since you already have a TCP plugin working. If it were possible for plugins to "stack" in a chain, where one plugin creates log data and another sends it over a protocol, that would be epic, actually... #IdealWorld
To recap, we want a new plugin called API Analytics that creates the ALF payload and sends it to a configurable ZMQ server. We are not changing the default logging format.
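A rough sketch of how such a plugin's log phase could look in OpenResty terms, under stated assumptions: the `serialize_to_alf` helper, the config field names, and the handler shape are all hypothetical, not Kong's actual plugin API. The one real constraint it illustrates is that cosockets are disabled in the log phase, so the send is handed off to a zero-delay timer:

```lua
local cjson = require "cjson"

local handler = {}

-- Hypothetical: build a one-entry ALF from the request context.
-- A real serializer would also collect timings and headers.
local function serialize_to_alf(conf)
  return {
    version = "1.0.0",
    serviceToken = conf.service_token,
    har = { log = { version = "1.2", entries = { {
      startedDateTime = ngx.utctime(),
      request = { method = ngx.req.get_method(), url = ngx.var.request_uri },
      response = { status = ngx.status }
    } } } }
  }
end

local function send_alf(premature, conf, alf)
  if premature then return end
  local sock = ngx.socket.tcp()
  sock:settimeout(conf.timeout or 10000)
  local ok, err = sock:connect(conf.host, conf.port)
  if not ok then
    ngx.log(ngx.ERR, "failed to connect to analytics server: ", err)
    return
  end
  sock:send(cjson.encode(alf))
  sock:setkeepalive()
end

-- Log phase: cosockets are unavailable here, so defer the
-- network I/O to a zero-delay timer where they are allowed.
function handler:log(conf)
  local alf = serialize_to_alf(conf)
  local ok, err = ngx.timer.at(0, send_alf, conf, alf)
  if not ok then
    ngx.log(ngx.ERR, "failed to create timer: ", err)
  end
end

return handler
```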
What formats are those plugins using?
That's the point, we have different ways of doing that.
What's the preferred approach?
The new plugin will create the ALF payload (only ALF) from the Lua object that Kong provides, and then send it over TCP. So it's your second option; in the future we might want the first one as well.
As stated in the ApiAnalytics docs, the server accepts both ZMQ and HTTP. For something like Kong, it's up to you to either include the ZMQ instructions in the install scripts (Homebrew, .deb, etc.) or, I think better in this case, use the batch HTTP input: simply batch ALFs 1,000 at a time into an array and send it off to the server.
You can batch both the ALFs and the entries within the ALFs, whatever makes the most sense for your agent's performance, of course.
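A minimal sketch of that batch-over-HTTP approach, assuming the lua-resty-http client is available; the endpoint URL is a placeholder, not the real apianalytics address, and `enqueue_alf` is a name invented for illustration (it would be called once per serialized ALF):

```lua
-- Illustrative batching: accumulate ALFs and POST them in groups
-- of up to 1000. Assumes the lua-resty-http library; the URL below
-- is a placeholder. Must run in a context where cosockets work
-- (e.g. inside an ngx.timer callback).
local http  = require "resty.http"
local cjson = require "cjson"

local BATCH_SIZE = 1000
local buffer = {}

local function enqueue_alf(alf)
  buffer[#buffer + 1] = alf
  if #buffer < BATCH_SIZE then return end

  local batch = buffer
  buffer = {}

  local httpc = http.new()
  local res, err = httpc:request_uri("http://analytics.example.com/alf_1.0.0", {
    method = "POST",
    body = cjson.encode(batch), -- a JSON array of ALF objects
    headers = { ["Content-Type"] = "application/json" },
  })
  if not res then
    ngx.log(ngx.ERR, "failed to send ALF batch: ", err)
  end
end
```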
Only a temporary solution before each serializer becomes a module of its own (maybe?) and each logging plugin becomes independent from Kong, requiring serializers as dependencies. Following the discussion in #86:
- almost complete ALF serialization
- basic data push to apianalytics
- entries can be queued in the serializer, and flushed anytime
- serializer can output itself in JSON, being a valid ALF object
- more tests, with fixtures
- introduce serializers for logging plugins
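A condensed sketch of the queue-and-flush serializer those bullets describe; the method names and module shape are guesses for illustration, not Kong's actual module:

```lua
-- Illustrative queueing serializer: entries accumulate until flushed,
-- and the object can output itself as a JSON ALF. Names are invented.
local cjson = require "cjson"

local ALFSerializer = {}
ALFSerializer.__index = ALFSerializer

function ALFSerializer.new(service_token)
  return setmetatable({
    version = "1.0.0",
    serviceToken = service_token,
    entries = {}
  }, ALFSerializer)
end

-- Queue one HAR entry (built elsewhere from the ngx context)
function ALFSerializer:add_entry(entry)
  self.entries[#self.entries + 1] = entry
  return #self.entries
end

-- Output the serializer itself as a valid JSON ALF object
function ALFSerializer:to_json_string()
  return cjson.encode({
    version = self.version,
    serviceToken = self.serviceToken,
    har = { log = { version = "1.2", entries = self.entries } }
  })
end

-- Flush: return the JSON payload and reset the queue
function ALFSerializer:flush()
  local payload = self:to_json_string()
  self.entries = {}
  return payload
end

return ALFSerializer
```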
* feat(dependencies) migrate build logic to separate build script
* fix(ci) clone via https
* chore(debug) debugging travis-ci
* fix(luarocks) adjust for luarocks directory name change
* chore(ci) revert the travis-ci debug commit
* fix(deb) the base image shouldn't require these packages
* fix(building) need an additional build tool in some distros
* fix(ci) rebuild the base images
* fix(ci) conditionally install realpath
* fix(centos) adjust how realpath is installed
* fix(centos) seems which isn't installed on centos by default
* fix(Makefile) the docker image might not exist, which is fine
* fix(openssl) adjust openssl installation location
* chore(debug) check different Kong version
* use system luarocks
* fix(luarocks) implement the 2.1.3 luarocks fix
* fix(luarocks) we bumped luarocks so don't need this
* chore(openresty-build-tools) pin to 0.0.1
* fix(ci) adjust the Makefile task
* Update dependency pbr to v5.3.0
Use ALF as the default format for all HTTP logging.