API Analytics Plugin #86

Closed
ahmadnassri opened this issue Mar 21, 2015 · 13 comments
Labels: idea/new plugin [legacy] those issues belong to Kong Nation, since GitHub issues are reserved for bug reports.

Comments

@ahmadnassri
Contributor

Use ALF as the default format for all HTTP logging.

@subnetmarco subnetmarco changed the title default logging should be in ALF format Default logging should be in ALF format Mar 21, 2015
@thibaultcha thibaultcha added the idea/new plugin [legacy] those issues belong to Kong Nation, since GitHub issues are reserved for bug reports. label Mar 25, 2015
@sonicaghi
Member

+1

@subnetmarco
Member

@nijikokun @SGrondin @kennethklee @ahmadnassri please tell us more info about the ALF Plugin.

Where should the data be sent to: http, tcp? What address? @thibaultcha will take care of the implementation.

Renaming this issue to: API Analytics plugin

@subnetmarco subnetmarco changed the title Default logging should be in ALF format API Analytics Plugin Apr 18, 2015
@ahmadnassri
Contributor Author

All the info for the format is documented here.

This ticket was not intended as the "API Analytics Plugin" but, as the original title suggested, "default logging format in Kong should be in ALF / HAR".

I see this as two different things entirely:

  • default internal logging format (spec / feature)
  • sending the logs to a service such as API Analytics (plugin)

@thibaultcha
Member

There is no such thing as "default logging". There is a plugin for each logging capability. ALF appears to be one of the available logging plugins, but not the default, because all plugins are equal...

That's why this changed to the ALF logging plugin ticket. I knew about the ALF repo, but it does not provide info on how to send the data. Will follow up in a 1:1.

@ahmadnassri
Contributor Author

You have tcplog, udplog and filelog plugins... what formats are those using? What I'm saying is: adopt a standard and follow it across all "native" logging plugins (the ones the core team builds). My suggestion is to make that standardized format HAR / ALF.

If third parties want to build different logging plugins with different formats, great. But you should have a unified standard format to follow.

Again, that was the point of this ticket. An API Analytics plugin is also needed, but just to clarify the original purpose.

@thibaultcha
Member

I see your point. So rather than being separated by logging format, it is true the plugins are separated by the protocol they use to send the data. In that case, if I want to send data to API Analytics, can't I just use one of those and set the format to "ALF"? Or what other value would API Analytics give me, beyond logging? What do we need to build now: an ALF serializer for our current logging plugins, or an API Analytics plugin? There is no clear roadmap.

@ahmadnassri
Contributor Author

Yes, API Analytics integration can simply be the TCP plugin (potentially), or UDP, whatever... it's more about the format (and setting a token).

Currently API Analytics is built on ZMQ, so TCP is certainly a preferred method. As long as it's in ALF format and an API Analytics token is set, just send the data over TCP and that's it.

I could see this same setup being used for any TCP-based logging platform (Splunk), hence why I'm more interested in the format, since you already have a TCP plugin working.

If it is possible for plugins to "stack" in a chain, where one plugin creates log data and another sends it over a protocol, then that would be epic actually... #IdealWorld

@subnetmarco
Member

To recap, we want a new plugin called API Analytics that creates the ALF payload and sends it to a configurable ZMQ server.

We are not changing the default logging format.
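The recap above can be illustrated with a sketch of the payload such a plugin would build. This is a hedged sketch, not the authoritative schema: ALF is based on HAR, and the specific field names below (`serviceToken`, the `har.log` wrapper, the entry fields) are assumptions for illustration only; the ALF documentation linked earlier in this thread is the source of truth.

```python
import json

# Illustrative ALF-style payload. Field names are assumptions based on ALF
# being a HAR extension plus a service token; consult the ALF spec for the
# real schema. The token value is a hypothetical placeholder.
alf = {
    "version": "1.0.0",
    "serviceToken": "<your-apianalytics-token>",
    "har": {
        "log": {
            "version": "1.2",
            "creator": {"name": "kong-api-analytics", "version": "0.1.0"},
            "entries": [
                {
                    "startedDateTime": "2015-04-20T12:00:00Z",
                    "time": 42,
                    "request": {"method": "GET", "url": "http://example.com/", "headers": []},
                    "response": {"status": 200, "headers": [], "bodySize": 0},
                }
            ],
        }
    },
}

payload = json.dumps(alf)  # what the plugin would put on the wire
```

The point of the recap is that the plugin owns serialization (building this JSON) while the transport (ZMQ, later TCP/HTTP) is a separate, configurable concern.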

@subnetmarco subnetmarco added this to the 0.2.0 milestone Apr 20, 2015
@ahmadnassri
Contributor Author

@thibaultcha
Member

we want a new plugin called API Analytics that creates the ALF payload and sends it to a configurable ZMQ server.

That's the point, we have different ways of doing that.

  1. Each logging plugin is tied to a protocol and has different formats. tcplog sends data over tcp and can have an ALF format, and if you send data to API Analytics, it can also have a key. Which is kind of strange, but totally doable. In that case, the default format would be "ALF", and another format could be "raw", i.e. whatever it currently is (no specific format).
  2. We have an API Analytics plugin, with a documentation of its own, a configuration of its own, and that plugin sends data over, say, tcp only.

What's the preferred approach?

@subnetmarco
Member

The new plugin will create the ALF payload (only ALF) from the Lua object that Kong provides, and then send it over TCP. So it's your second option.

In the future we might want tcplog and udplog to have configurable log formats, but that would be another issue. When we do that, we could unify/change the way logging works.
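The second option described above (a dedicated plugin that builds the ALF payload from each proxied request and sends it out) implies a serializer that queues entries and can flush them as one valid ALF JSON object. A minimal sketch of that queue/flush behavior, with the caveat that Kong's actual implementation is in Lua and every name and field here is illustrative:

```python
import json

class AlfSerializer:
    """Sketch of a queue-and-flush ALF serializer.

    Field names are assumptions (ALF is HAR-based plus a service token);
    see the ALF spec for the real schema.
    """

    def __init__(self, service_token):
        self.service_token = service_token
        self.entries = []

    def add_entry(self, entry):
        # Queue one HAR-style entry built from a proxied request/response.
        self.entries.append(entry)

    def flush(self):
        # Emit everything queued so far as a single ALF JSON object,
        # then reset the queue.
        alf = {
            "version": "1.0.0",
            "serviceToken": self.service_token,
            "har": {"log": {"version": "1.2", "entries": self.entries}},
        }
        self.entries = []
        return json.dumps(alf)

s = AlfSerializer("demo-token")
s.add_entry({"request": {"method": "GET"}, "response": {"status": 200}})
out = s.flush()  # one valid ALF object; the queue is now empty
```

Keeping serialization behind this small interface is what later lets the transport (TCP, ZMQ, batched HTTP) be swapped without touching the format.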

@SGrondin
Contributor

As stated in the API Analytics docs, the server accepts both ZMQ and HTTP. For something like Kong, it's up to you to either include the ZMQ instructions in the install scripts (homebrew, .deb, etc.) or, I think better in this case, use the batch HTTP input: simply batch ALFs 1,000 at a time into an array and send it off to the server.

@ahmadnassri
Contributor Author

You can batch both the ALFs and the entries within the ALFs, whatever makes the most sense for your agent's performance, of course.
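The batching idea from the two comments above can be sketched as follows. The batch size of 1,000 comes from the earlier comment; the batch endpoint URL is not given in this thread, so the sketch shows only the chunking and leaves the actual POST as a comment rather than guessing at the address.

```python
import json

def batch(items, size=1000):
    # Split a list of ALF objects into batches of at most `size`.
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Illustrative queue of ALF objects waiting to be shipped.
alfs = [{"version": "1.0.0", "har": {"log": {"entries": []}}} for _ in range(2500)]

for chunk in batch(alfs):
    body = json.dumps(chunk)  # a JSON array of ALFs, as a batch HTTP input would expect
    # POST `body` to the API Analytics batch endpoint here; the endpoint
    # is not specified in this thread, so it is deliberately omitted.
```

With 2,500 queued ALFs and a batch size of 1,000 this yields three requests (1,000 + 1,000 + 500), which is the trade-off being discussed: fewer, larger requests versus per-request payload size.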

@subnetmarco subnetmarco modified the milestones: 0.3.0, 0.2.0 Apr 24, 2015
thibaultcha added a commit that referenced this issue May 28, 2015
Only a temporary solution before each serializer to be a module of its
own (maybe?) and each logging plugin to be independent from Kong, and
require serializers as dependencies. Following the discussion in #86
thibaultcha added a commit that referenced this issue May 29, 2015
- almost complete ALF serialization
- basic data push to apianalytics
- entries can be queued in the serializer, and flushed anytime
- serializer can output itself in JSON, being a valid ALF object
- more tests, with fixtures
- introduce serializers for logging plugins

Only a temporary solution before each serializer to be a module of its
own (maybe?) and each logging plugin to be independent from Kong, and
require serializers as dependencies. Following the discussion in #86
thibaultcha added a commit that referenced this issue May 29, 2015
thibaultcha added a commit that referenced this issue May 29, 2015
thibaultcha added a commit that referenced this issue May 29, 2015
thibaultcha added a commit that referenced this issue Jun 1, 2015
thibaultcha added a commit that referenced this issue Jun 2, 2015
thibaultcha added a commit that referenced this issue Jun 3, 2015
thibaultcha added a commit that referenced this issue Jun 3, 2015
thibaultcha added a commit that referenced this issue Jun 3, 2015
thibaultcha added a commit that referenced this issue Jun 4, 2015
thibaultcha added a commit that referenced this issue Jun 4, 2015
@sonicaghi sonicaghi modified the milestones: 0.3.0, 0.4.0 Jun 6, 2015
ctranxuan pushed a commit to streamdataio/kong that referenced this issue Aug 25, 2015
gszr pushed a commit that referenced this issue Jun 10, 2021
gszr pushed a commit that referenced this issue Aug 6, 2021
hutchic added a commit that referenced this issue Jun 10, 2022
* feat(dependencies) migrate build logic to separate build script

* fix(ci) clone via https

* chore(debug) debugging travis-ci

* fix(luarocks) adjust for luarocks directory name change

* chore(ci) revert the travis-ci debug commit

* fix(deb) the base image shouldn't require these packages

* fix(building) need an additional build tool in some distros

* fix(ci) rebuild the base images

* fix(ci) conditionally install realpath

* fix(centos) adjust how realpath is installed

* fix(centos) seems which isnt installed on centos by default

* fix(Makefil) the docker image might not exist which is fine

* fix(openssl) adjust openssl installation location

* chore(debug) check different Kong version

* use system luarocks

* fix(luarocks) implement the 2.1.3 luarocks fix

* fix(luarocks) we bumped luarocks so dont need this

* chore(openresty-build-tools) pin to 0.0.1

* fix(ci) adjust the Makefile task

* Update dependency pbr to v5.3.0