
Enhance Cloud Function to send logs to Stackdriver #24

Open
gtseres-workable opened this issue Oct 23, 2019 · 4 comments
@gtseres-workable

Currently, the Cloud Function that is created writes logs directly to BigQuery. This is very useful for analysing the logs and visualising them with Data Studio.

One enhancement to the function would be the ability (e.g. via an environment variable) to push the logs to Stackdriver as well, using the Stackdriver Node.js SDK. This would make it possible to search for logs directly in Stackdriver, which can be more cost-effective, especially as log volume grows (and BigQuery analysis costs grow with it). By also setting the httpRequest field, the logs show up in a more useful form, e.g.:

[Screenshot 2019-10-23 at 11 17 35: log entries rendered in the Stackdriver Logs Viewer]
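A minimal sketch of what this could look like, assuming the `@google-cloud/logging` Node.js client; the `PUSH_TO_STACKDRIVER` variable, the log name, and the Cloudflare field names are illustrative assumptions, not part of the existing function:

```js
// Sketch: forward a parsed Cloudflare log record to Stackdriver,
// gated by a hypothetical PUSH_TO_STACKDRIVER environment variable.
const { Logging } = require('@google-cloud/logging');

const log = new Logging().log('cloudflare-logs'); // illustrative log name

async function maybePushToStackdriver(record) {
  if (process.env.PUSH_TO_STACKDRIVER !== 'true') return;
  const entry = log.entry(
    {
      resource: { type: 'cloud_function' },
      // Setting httpRequest makes the entry render as a request log in
      // the Logs Viewer, as in the screenshot above. Field names here
      // assume Cloudflare's Logpush schema.
      httpRequest: {
        requestMethod: record.ClientRequestMethod,
        requestUrl: record.ClientRequestURI,
        status: record.EdgeResponseStatus,
      },
    },
    record // full record as the structured payload
  );
  await log.write(entry);
}
```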

Please let me know your thoughts on this; we may be able to help with the implementation.

@shagamemnon
Contributor

@gtseres-workable thanks for this suggestion. If this can meaningfully reduce costs, it should be implemented, no question. Can you provide any estimate of the cost difference? I think comparing the cost of a BigQuery search with the cost of querying the same logs in Stackdriver over a 48-hour period would be a solid starting point.

@wlatic

wlatic commented Nov 21, 2019

Taking costs out of the equation, this would allow Stackdriver metrics to be used to monitor Cloudflare for problems, and would keep all logs in a single location.

@poligraph

@shagamemnon For example, we have 1 TB of logs in BQ. I ran a couple of queries that read 8 TB of data in total, and the price for that was $35 (which matches BigQuery's on-demand rate of $5 per TB scanned, with the first 1 TB each month free: 7 TB × $5 = $35).

@shagamemnon
Contributor

shagamemnon commented Dec 29, 2021

The major challenge here: memory and time limits for Cloud Functions. In order to push every log entry into Stackdriver, this function would need to decompress the batches of log files (i.e. gunzip them), and then either buffer each file into memory or read it as a stream. Right now, this task is handled by BigQuery, which has built-in utilities to ingest the .gz files and transform them into rows.
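For concreteness, the streaming variant might look roughly like the sketch below, assuming a Cloud Storage finalize trigger and newline-delimited JSON inside the .gz batches, and leaving aside whether it actually fits within the function's time limits; the log name and batch size are assumptions:

```js
const { Storage } = require('@google-cloud/storage');
const { Logging } = require('@google-cloud/logging');
const { createGunzip } = require('zlib');
const readline = require('readline');

const storage = new Storage();
const log = new Logging().log('cloudflare-logs'); // illustrative log name

// Triggered by a Cloud Storage finalize event for a .gz batch of
// newline-delimited JSON log records.
exports.streamLogsToStackdriver = async (event) => {
  // Decompress as a stream rather than buffering the whole file,
  // to stay inside the function's memory limit.
  const input = storage
    .bucket(event.bucket)
    .file(event.name)
    .createReadStream()
    .pipe(createGunzip());

  const entries = [];
  for await (const line of readline.createInterface({ input })) {
    if (!line.trim()) continue;
    entries.push(log.entry({ resource: { type: 'cloud_function' } }, JSON.parse(line)));
    // Flush in fixed-size batches to bound memory usage.
    if (entries.length >= 500) await log.write(entries.splice(0));
  }
  if (entries.length) await log.write(entries);
};
```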

I would certainly love to add this functionality. But right now, I don't see a path to doing so.
