I have an input file containing about 400k lines, one JSON entry per line, read using the `tail` input, and I am using the SQL output into Postgres. Under normal conditions all entries are inserted properly.
However, I need to make sure that a sudden stop of PostgreSQL will not cause Fluentd to lose records, and that those records will be retried. When I stopped PostgreSQL at a random moment during the insertion, about 9k entries were dumped because the plugin classified the resulting errors as "deterministic errors":
```
2023-09-13 19:09:51 +0200 [warn]: #0 Got deterministic error. Fallback to one-by-one import error_class=ActiveRecord::StatementInvalid error="PG::ConnectionBad: PQconsumeInput() FATAL: terminating connection due to administrator command\nSSL connection has been closed unexpectedly\n"
2023-09-13 19:09:51 +0200 [error]: #0 Got deterministic error again. Dump a record error_class=ActiveRecord::StatementInvalid error="PG::ConnectionBad: PQsocket() can't get socket descriptor" record=#<Fluent::Plugin::SQLOutput::BaseModel_506070496::AccessLog […]>
```
followed by 9k more `Got deterministic error again` messages, each with a dump of the record. After that, the Postgres connection error was detected and the plugin retried as it should until Postgres came back up; the remaining records were inserted, but the dumped ones were lost.
The behaviour I expect is that no records are dumped when Postgres is temporarily stopped.
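For reference, a buffer configuration along these lines (the file path and match pattern are placeholders; `@type file` and `retry_forever` are standard Fluentd buffer options) should make Fluentd retry rather than drop records, yet the dumps above happen inside the plugin's write path before the buffer's retry logic can kick in:

```
<match **>
  @type sql
  # ... connection settings ...
  <buffer>
    @type file                        # persist chunks to disk across restarts
    path /var/log/fluentd/sql-buffer  # placeholder path
    retry_forever true                # keep retrying failed chunks instead of giving up
    flush_interval 5s
  </buffer>
</match>
```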
The offending section appears to be this 'fallback' feature, which is enabled by default and can be disabled by setting `enable_fallback false` in your config.
What this feature does is, for certain types of database error, switch from batch processing to processing records one by one; if further SQL errors occur, it simply drops the record.
This should probably be disabled by default, or changed so it doesn't just dump records when it gets an odd response from Postgres, because this code path triggers if you deliberately make your database read-only, or restart the database.
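Disabling it looks roughly like this (a sketch, assuming a typical `sql` output match section; the host, database, and table names are placeholders — only `enable_fallback false` comes from the plugin itself):

```
<match **>
  @type sql
  host postgres.example.local   # placeholder
  adapter postgresql
  database logs                 # placeholder
  enable_fallback false         # never fall back to one-by-one import that dumps records
  <table>
    table access_logs           # placeholder
  </table>
</match>
```

With the fallback disabled, a failed batch insert should surface as a retryable error to the buffer instead of being split up and dumped.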
This is my configuration:
Package versions: