This repository has been archived by the owner on Apr 26, 2024. It is now read-only.

Implement a structured logging output system. #5680

Merged
61 commits merged on Aug 28, 2019
Commits
4071317
first cut of structured logging
hawkowl Jul 13, 2019
0ea4fb1
remove deprecated options, command line options, and simplify the log…
hawkowl Jul 13, 2019
af7bc1c
this should be true
hawkowl Jul 13, 2019
83bf10c
cleanup
hawkowl Jul 13, 2019
9c19dc3
remove disused arguments from workers too
hawkowl Jul 13, 2019
c19f2d5
Merge branch 'hawkowl/prune-logging-options' into hawkowl/structured-…
hawkowl Jul 13, 2019
32b9dd0
fix and push up
hawkowl Jul 13, 2019
83507eb
fix
hawkowl Jul 13, 2019
5cec494
update to use 18.9
hawkowl Jul 13, 2019
69cd070
fix sytests?
hawkowl Jul 15, 2019
51048fb
fix sytests?
hawkowl Jul 15, 2019
0b76783
make a misc
hawkowl Jul 15, 2019
9f5d825
Merge remote-tracking branch 'origin/develop' into hawkowl/structured…
hawkowl Jul 18, 2019
47820c5
fix
hawkowl Jul 18, 2019
5260a8b
fix
hawkowl Jul 18, 2019
586fd94
fix
hawkowl Jul 18, 2019
50fcd60
fix
hawkowl Jul 18, 2019
534b15d
Merge remote-tracking branch 'origin/develop' into hawkowl/structured…
hawkowl Jul 23, 2019
ae34f25
review comments
hawkowl Jul 23, 2019
34b3fce
review comments
hawkowl Jul 23, 2019
ba5b355
use native python read
hawkowl Jul 23, 2019
64c5cdb
Merge remote-tracking branch 'origin/develop' into hawkowl/structured…
hawkowl Jul 26, 2019
d1fc0ff
set the log level on the namespace too
hawkowl Jul 26, 2019
fa764c6
review comments
hawkowl Jul 26, 2019
b19156a
Merge remote-tracking branch 'origin/develop' into hawkowl/structured…
hawkowl Jul 29, 2019
6b4a540
Merge remote-tracking branch 'origin/develop' into hawkowl/structured…
hawkowl Jul 29, 2019
2fbc0e1
Merge remote-tracking branch 'origin/develop' into hawkowl/structured…
hawkowl Jul 30, 2019
aafc342
fix
hawkowl Jul 31, 2019
cdb8a72
Merge remote-tracking branch 'origin/develop' into hawkowl/structured…
hawkowl Jul 31, 2019
755d7e8
more test
hawkowl Jul 31, 2019
fb9b18a
Merge remote-tracking branch 'origin/develop' into hawkowl/structured…
hawkowl Aug 7, 2019
680fdee
fix
hawkowl Aug 7, 2019
db1b1b3
tests
hawkowl Aug 7, 2019
92289a7
fix
hawkowl Aug 7, 2019
e7c62b5
pass in a proper lang
hawkowl Aug 7, 2019
55391d2
try this to see if it'll go faster
hawkowl Aug 7, 2019
b2fa4ad
docs:
hawkowl Aug 7, 2019
9cf3c28
docs
hawkowl Aug 7, 2019
673d380
docs
hawkowl Aug 7, 2019
aae792d
more test
hawkowl Aug 7, 2019
b50d3a9
more test
hawkowl Aug 7, 2019
03ac5b4
fluentd log to export on console
hawkowl Aug 8, 2019
d10b9b8
fix
hawkowl Aug 8, 2019
51b3e81
fix
hawkowl Aug 8, 2019
232064e
cleanups
hawkowl Aug 8, 2019
d004e93
style fix
hawkowl Aug 8, 2019
5bbc8e7
docs and typing
hawkowl Aug 9, 2019
b611e53
run mypy and check for errors
hawkowl Aug 9, 2019
77f2334
run mypy and check for errors
hawkowl Aug 9, 2019
eb3c75d
add it to CI
hawkowl Aug 9, 2019
71d695a
docs
hawkowl Aug 9, 2019
589463f
Merge remote-tracking branch 'origin/develop' into hawkowl/structured…
hawkowl Aug 27, 2019
04f084c
some introductory logs
hawkowl Aug 27, 2019
8697d51
failing test
hawkowl Aug 27, 2019
4477c69
failing test
hawkowl Aug 27, 2019
2684627
Merge remote-tracking branch 'origin/develop' into hawkowl/structured…
hawkowl Aug 28, 2019
9864220
fix the test
hawkowl Aug 28, 2019
c400d15
fix the tests
hawkowl Aug 28, 2019
267501a
fix mypy
hawkowl Aug 28, 2019
518e48e
fix old tests
hawkowl Aug 28, 2019
0a0845f
fix old tests
hawkowl Aug 28, 2019
1 change: 1 addition & 0 deletions .buildkite/docker-compose.py35.pg95.yaml
@@ -6,6 +6,7 @@ services:
image: postgres:9.5
environment:
POSTGRES_PASSWORD: postgres
command: -c fsync=off

testenv:
image: python:3.5
1 change: 1 addition & 0 deletions .buildkite/docker-compose.py37.pg11.yaml
@@ -6,6 +6,7 @@ services:
image: postgres:11
environment:
POSTGRES_PASSWORD: postgres
command: -c fsync=off

testenv:
image: python:3.7
1 change: 1 addition & 0 deletions .buildkite/docker-compose.py37.pg95.yaml
@@ -6,6 +6,7 @@ services:
image: postgres:9.5
environment:
POSTGRES_PASSWORD: postgres
command: -c fsync=off

testenv:
image: python:3.7
10 changes: 9 additions & 1 deletion .buildkite/pipeline.yml
@@ -45,8 +45,15 @@ steps:
- docker#v3.0.1:
image: "python:3.6"

- wait
- command:
- "python -m pip install tox"
- "tox -e mypy"
label: ":mypy: mypy"
plugins:
- docker#v3.0.1:
image: "python:3.5"

- wait

- command:
- "apt-get update && apt-get install -y python3.5 python3.5-dev python3-pip libxml2-dev libxslt-dev zlib1g-dev"
@@ -55,6 +62,7 @@ steps:
label: ":python: 3.5 / SQLite / Old Deps"
env:
TRIAL_FLAGS: "-j 2"
LANG: "C.UTF-8"
plugins:
- docker#v3.0.1:
image: "ubuntu:xenial" # We use xenail to get an old sqlite and python
5 changes: 3 additions & 2 deletions .gitignore
@@ -19,6 +19,7 @@ _trial_temp*/
/*.signing.key
/env/
/homeserver*.yaml
/logs
/media_store/
/uploads

@@ -28,13 +29,13 @@ _trial_temp*/
/.vscode/

# build products
/.coverage*
!/.coveragerc
/.coverage*
/.mypy_cache/
/.tox
/build/
/coverage.*
/dist/
/docs/build/
/htmlcov
/pip-wheel-metadata/

1 change: 1 addition & 0 deletions changelog.d/5680.misc
@@ -0,0 +1 @@
Lay the groundwork for structured logging output.
83 changes: 83 additions & 0 deletions docs/structured_logging.md
@@ -0,0 +1,83 @@
# Structured Logging

A structured logging system can be useful when your logs are destined for a machine to parse and process. Keeping the logs machine-readable enables more efficient searching and aggregation when they are consumed by software such as the "ELK stack".

Synapse's structured logging system is configured via the file that Synapse's `log_config` config option points to. The file must be YAML and contain `structured: true`. It must also contain a list of "drains" (places where logs are sent).

A structured logging configuration looks similar to the following:

```yaml
structured: true

loggers:
    synapse:
        level: INFO
    synapse.storage.SQL:
        level: WARNING

drains:
    console:
        type: console
        location: stdout
    file:
        type: file_json
        location: homeserver.log
```

The above logging config will set Synapse to the 'INFO' logging level by default, with the SQL layer at 'WARNING', and will send logs to two drains: one to the console and one to a file, stored as JSON.

## Drain Types

Drain types can be specified by the `type` key.

### `console`

Outputs human-readable logs to the console.

Arguments:

- `location`: Either `stdout` or `stderr`.

### `console_json`

Outputs machine-readable JSON logs to the console.

Arguments:

- `location`: Either `stdout` or `stderr`.

### `console_json_terse`

Outputs machine-readable JSON logs to the console, separated by newlines. This
format is not designed to be read and re-formatted into human-readable text, but
is optimal for a logging aggregation system.

Arguments:

- `location`: Either `stdout` or `stderr`.

### `file`

Outputs human-readable logs to a file.

Arguments:

- `location`: An absolute path to the file to log to.

### `file_json`

Outputs machine-readable JSON logs to a file.

Arguments:

- `location`: An absolute path to the file to log to.

### `network_json_terse`

Delivers machine-readable JSON logs to a log aggregator over TCP. This is
compatible with LogStash's TCP input with the codec set to `json_lines`.

Arguments:

- `host`: Hostname or IP address of the log aggregator.
- `port`: Numerical port to contact on the host.
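A minimal sketch (not part of this diff) of a `network_json_terse` drain pointing at a LogStash TCP input is shown below; the hostname and port are placeholder values, and LogStash is assumed to have a `tcp` input configured with the `json_lines` codec:

```yaml
structured: true

loggers:
    synapse:
        level: INFO

drains:
    logstash:
        type: network_json_terse
        host: logstash.example.com   # placeholder aggregator hostname
        port: 5000                   # must match the LogStash tcp input port
```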
12 changes: 7 additions & 5 deletions synapse/app/_base.py
@@ -36,18 +36,20 @@

logger = logging.getLogger(__name__)

# list of tuples of function, args list, kwargs dict
_sighup_callbacks = []


def register_sighup(func):
def register_sighup(func, *args, **kwargs):
"""
Register a function to be called when a SIGHUP occurs.

Args:
func (function): Function to be called when sent a SIGHUP signal.
Will be called with a single argument, the homeserver.
Will be called with a single default argument, the homeserver.
*args, **kwargs: args and kwargs to be passed to the target function.
"""
_sighup_callbacks.append(func)
_sighup_callbacks.append((func, args, kwargs))


def start_worker_reactor(appname, config, run_command=reactor.run):
@@ -249,8 +251,8 @@ def handle_sighup(*args, **kwargs):
sd_channel = sdnotify.SystemdNotifier()
sd_channel.notify("RELOADING=1")

for i in _sighup_callbacks:
i(hs)
for i, args, kwargs in _sighup_callbacks:
i(hs, *args, **kwargs)

sd_channel.notify("READY=1")

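The new callback bookkeeping in `register_sighup` above can be illustrated with a small standalone sketch (not part of this diff; the callback name and config path are hypothetical, and only the `(func, args, kwargs)` pattern comes from the change itself):

```python
# Standalone sketch mirroring the extended register_sighup API above.

_sighup_callbacks = []  # list of tuples of (function, args list, kwargs dict)


def register_sighup(func, *args, **kwargs):
    """Register func to be called on SIGHUP as func(hs, *args, **kwargs)."""
    _sighup_callbacks.append((func, args, kwargs))


def handle_sighup(hs):
    """What the SIGHUP handler does with each registered callback."""
    for func, args, kwargs in _sighup_callbacks:
        func(hs, *args, **kwargs)


def reload_log_config(hs, config_path):
    # Hypothetical callback: receives the homeserver first, then the extra
    # arguments supplied at registration time.
    print("homeserver %r re-reading log config from %s" % (hs, config_path))


register_sighup(reload_log_config, "/etc/synapse/log_config.yaml")
handle_sighup("fake-homeserver")  # prints the reload message above
```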
4 changes: 2 additions & 2 deletions synapse/app/admin_cmd.py
@@ -227,8 +227,6 @@ def start(config_options):
config.start_pushers = False
config.send_federation = False

setup_logging(config, use_worker_options=True)

synapse.events.USE_FROZEN_DICTS = config.use_frozen_dicts

database_engine = create_engine(config.database_config)
@@ -241,6 +239,8 @@ def start(config_options):
database_engine=database_engine,
)

setup_logging(ss, config, use_worker_options=True)

ss.setup()

# We use task.react as the basic run command as it correctly handles tearing
4 changes: 2 additions & 2 deletions synapse/app/appservice.py
@@ -141,8 +141,6 @@ def start(config_options):

assert config.worker_app == "synapse.app.appservice"

setup_logging(config, use_worker_options=True)

events.USE_FROZEN_DICTS = config.use_frozen_dicts

database_engine = create_engine(config.database_config)
@@ -167,6 +165,8 @@ def start(config_options):
database_engine=database_engine,
)

setup_logging(ps, config, use_worker_options=True)

ps.setup()
reactor.addSystemEventTrigger(
"before", "startup", _base.start, ps, config.worker_listeners
4 changes: 2 additions & 2 deletions synapse/app/client_reader.py
@@ -179,8 +179,6 @@ def start(config_options):

assert config.worker_app == "synapse.app.client_reader"

setup_logging(config, use_worker_options=True)

events.USE_FROZEN_DICTS = config.use_frozen_dicts

database_engine = create_engine(config.database_config)
@@ -193,6 +191,8 @@ def start(config_options):
database_engine=database_engine,
)

setup_logging(ss, config, use_worker_options=True)

ss.setup()
reactor.addSystemEventTrigger(
"before", "startup", _base.start, ss, config.worker_listeners
4 changes: 2 additions & 2 deletions synapse/app/event_creator.py
@@ -175,8 +175,6 @@ def start(config_options):

assert config.worker_replication_http_port is not None

setup_logging(config, use_worker_options=True)

# This should only be done on the user directory worker or the master
config.update_user_directory = False

@@ -192,6 +190,8 @@ def start(config_options):
database_engine=database_engine,
)

setup_logging(ss, config, use_worker_options=True)
Member:
What happens if we log before this? I assume that for some reason we can't set up structured logging before this? I'm mainly worried that we'll end up losing any logging that happens while we create the server and set up the database.

Contributor Author:
if you are logging to the twisted logging system, they go into the globalLogBeginner, which stores logs between init/import of twisted and then outputs them when the logging system starts. if you are logging to the stdlib logger before this, they go into the void

(so we should use the twisted logger, so they go somewhere reasonable)

Member:
Oh, interesting. So if we use the stdlib config we don't get those log lines, but if we use the new structured config we don't?

Contributor Author:
if we use the structured config we will, yes


ss.setup()
reactor.addSystemEventTrigger(
"before", "startup", _base.start, ss, config.worker_listeners
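The behaviour discussed in the review thread above (Twisted log events emitted before the logging system starts are buffered and replayed, while stdlib `logging` calls made before any handler exists are dropped) can be sketched standalone; this is not part of the diff and assumes Twisted is installed:

```python
import logging
import sys

from twisted.logger import Logger, globalLogBeginner, textFileLogObserver

twisted_log = Logger(namespace="synapse.app.example")

# Buffered by the globalLogBeginner and replayed once observers are attached.
twisted_log.info("emitted before the logging system starts")

# No stdlib handler is configured yet, and INFO is below the last-resort
# handler's WARNING threshold, so this message goes into the void.
logging.getLogger("synapse.app.example").info("lost")

# When setup_logging() eventually starts the observers, the buffered Twisted
# event above is flushed to them.
globalLogBeginner.beginLoggingTo([textFileLogObserver(sys.stdout)])
```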
4 changes: 2 additions & 2 deletions synapse/app/federation_reader.py
@@ -160,8 -160,6 @@ def start(config_options):

assert config.worker_app == "synapse.app.federation_reader"

setup_logging(config, use_worker_options=True)

events.USE_FROZEN_DICTS = config.use_frozen_dicts

database_engine = create_engine(config.database_config)
@@ -174,6 +172,8 @@ def start(config_options):
database_engine=database_engine,
)

setup_logging(ss, config, use_worker_options=True)

ss.setup()
reactor.addSystemEventTrigger(
"before", "startup", _base.start, ss, config.worker_listeners
4 changes: 2 additions & 2 deletions synapse/app/federation_sender.py
@@ -171,8 +171,6 @@ def start(config_options):

assert config.worker_app == "synapse.app.federation_sender"

setup_logging(config, use_worker_options=True)

events.USE_FROZEN_DICTS = config.use_frozen_dicts

database_engine = create_engine(config.database_config)
@@ -197,6 +195,8 @@ def start(config_options):
database_engine=database_engine,
)

setup_logging(ss, config, use_worker_options=True)

ss.setup()
reactor.addSystemEventTrigger(
"before", "startup", _base.start, ss, config.worker_listeners
4 changes: 2 additions & 2 deletions synapse/app/frontend_proxy.py
@@ -232,8 +232,6 @@ def start(config_options):

assert config.worker_main_http_uri is not None

setup_logging(config, use_worker_options=True)

events.USE_FROZEN_DICTS = config.use_frozen_dicts

database_engine = create_engine(config.database_config)
@@ -246,6 +244,8 @@ def start(config_options):
database_engine=database_engine,
)

setup_logging(ss, config, use_worker_options=True)

ss.setup()
reactor.addSystemEventTrigger(
"before", "startup", _base.start, ss, config.worker_listeners
4 changes: 2 additions & 2 deletions synapse/app/homeserver.py
@@ -341,8 +341,6 @@ def setup(config_options):
# generating config files and shouldn't try to continue.
sys.exit(0)

synapse.config.logger.setup_logging(config, use_worker_options=False)

events.USE_FROZEN_DICTS = config.use_frozen_dicts

database_engine = create_engine(config.database_config)
@@ -356,6 +354,8 @@ def setup(config_options):
database_engine=database_engine,
)

synapse.config.logger.setup_logging(hs, config, use_worker_options=False)

logger.info("Preparing database: %s...", config.database_config["name"])

try:
4 changes: 2 additions & 2 deletions synapse/app/media_repository.py
@@ -146,8 +146,6 @@ def start(config_options):
"Please add ``enable_media_repo: false`` to the main config\n"
)

setup_logging(config, use_worker_options=True)

events.USE_FROZEN_DICTS = config.use_frozen_dicts

database_engine = create_engine(config.database_config)
@@ -160,6 +158,8 @@ def start(config_options):
database_engine=database_engine,
)

setup_logging(ss, config, use_worker_options=True)

ss.setup()
reactor.addSystemEventTrigger(
"before", "startup", _base.start, ss, config.worker_listeners
4 changes: 2 additions & 2 deletions synapse/app/pusher.py
@@ -184,8 +184,6 @@ def start(config_options):

assert config.worker_app == "synapse.app.pusher"

setup_logging(config, use_worker_options=True)

events.USE_FROZEN_DICTS = config.use_frozen_dicts

if config.start_pushers:
@@ -210,6 +208,8 @@ def start(config_options):
database_engine=database_engine,
)

setup_logging(ps, config, use_worker_options=True)

ps.setup()

def start():
4 changes: 2 additions & 2 deletions synapse/app/synchrotron.py
@@ -435,8 +435,6 @@ def start(config_options):

assert config.worker_app == "synapse.app.synchrotron"

setup_logging(config, use_worker_options=True)

synapse.events.USE_FROZEN_DICTS = config.use_frozen_dicts

database_engine = create_engine(config.database_config)
@@ -450,6 +448,8 @@ def start(config_options):
application_service_handler=SynchrotronApplicationService(),
)

setup_logging(ss, config, use_worker_options=True)

ss.setup()
reactor.addSystemEventTrigger(
"before", "startup", _base.start, ss, config.worker_listeners